Maybe at some point the big providers will get the message and their scrapers will finally respect robots.txt? Let's hope so.
I guess this was just a matter of time: the obsession with "just make it bigger" was making most players myopic. Now this obviously collides with geopolitics, since this time it's a Chinese company that is ahead.
This is a question worth asking... We try to reuse code, but maybe we do it too much? Some ecosystems certainly lead to hundreds of dependencies even for small features.
Finally a sane API for dealing with dates and times in JavaScript? Maybe, we'll see...
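Assuming the API in question is the Temporal proposal (my guess from context), here is a minimal sketch of the style of code it enables, using the @js-temporal/polyfill package rather than a native implementation:

```typescript
// Sketch only: assumes the linked piece is about the Temporal proposal,
// accessed here through the @js-temporal/polyfill package.
import { Temporal } from '@js-temporal/polyfill';

// Dates are immutable value objects, no more mutating Date in place.
const today = Temporal.Now.plainDateISO();
const release = Temporal.PlainDate.from('2025-06-01'); // illustrative date

// Arithmetic and differences are explicit and calendar-aware.
const inSixWeeks = today.add({ weeks: 6 });
const untilRelease = today.until(release); // a Temporal.Duration, in days by default

console.log(`Six weeks from now: ${inSixWeeks.toString()}`);
console.log(`Days until release: ${untilRelease.days}`);
```

The immutable value objects and explicit durations are the main departure from the old Date.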
Always be careful with regular expressions indeed. They can badly backfire.
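The classic way they backfire is catastrophic backtracking; a tiny illustrative sketch (my own example, not taken from the article):

```typescript
// Sketch of catastrophic backtracking (ReDoS) with an innocent-looking pattern.
const pattern = /^(a+)+$/;

// A string that almost matches forces a backtracking engine to try an
// exponential number of ways to split the 'a's between the nested '+'s.
const input = 'a'.repeat(30) + '!';

const start = Date.now();
pattern.test(input); // returns false, but only after a very long time
console.log(`Took ${Date.now() - start} ms`);
```

Linear-time engines such as RE2 avoid this class of problem, at the cost of dropping features like backreferences.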
It is good to see funds being raised for those projects. Let's hope they get wildly successful.
Nice tricks for saying no when people push to get something into a product. 😉
An interesting proposal for structuring technical documentation.
A harsh reminder that getenv is not thread safe... the pointer it returns can be invalidated by a concurrent setenv or putenv.
A nice experiment in minimalism. It's good to see that we can still build tiny systems like that.
A nice musing on how a type system can be a way to tame complexity, or at least isolate it explicitly in one place.
Very nice explorations of the different behaviours type systems can have around inference.
This makes it clear: centralized web platforms are fragile by default. They are very prone to capture, which is exactly what just happened.
A very nice editorial. It's clear that the level of trust in the technologies we depend on is low... but that's not due to the technologies themselves; it's more about the business practices around them. In the end the solution will have to be political; in the meantime we ought to support the good players.
Yet another attempt at protecting content from AI scrapers. A very different approach for this one.
Pointing out an important dilemma indeed. Which tests to keep over time? What to do with redundancies?
It's becoming clear that there are more and more reasons to go back to simpler ways of handling web frontends.
There was a time when scraping bots were well behaved... Now apparently we have to add software to actively defend against AI scrapers.
Really cool procedural environment generation.
I think this is a very welcome protest at FOSDEM. This keynote would be a stain on the conference. Unfortunately I had already planned not to attend FOSDEM this year, but if you are going: please take part in the sit-in.