Indeed, we should stop listening to such people who are basically pushing fantasies in order to raise more money.
OK, this paper piqued my curiosity. The limitations of the experiments make me wonder whether some threshold effects are being ignored. Still, this is a good indication that the question is worth pursuing further.
Interesting proposals, let's see how far they go. They could bring most of the benefits of htmx and similar libraries straight into HTML.
As it gets more adoption, people are figuring out ways to use htmx properly and not abuse what should be niche features.
How shocking! This was all hype? Not surprised since we've seen the referenced papers before, but put all together it makes things really clear.
Always happy to see a patent troll bite the dust.
The arms race is still ongoing at a furious pace. Still wondering how messy it will be when this bubble bursts.
Mozilla is clearly losing its way, and this is sad to watch. I guess the forks that remove the online advertising measures will become more popular.
Nice results. Interesting implementation too. I wonder if some of it will make its way into glibc or musl.
Doxxing will get easier and easier. Con men are likely paying attention.
Good article about the ethical implications of using AI in systems. I like the distinction between assistive and automated. It's not perfect, as it underestimates the "asleep at the wheel" effect, but it is a good starting point.
Interesting point. You likely need to be careful with fallback modes, especially in distributed systems. They might bring even more issues when the system is already under stress.
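A minimal sketch of the effect (all names here are hypothetical, just to illustrate): a naive fallback that calls a backup service on timeout doubles the outbound load exactly when the primary is saturated, whereas failing fast keeps the load bounded.

```python
calls = {"primary": 0, "backup": 0}

def slow_primary():
    # Simulate a primary service that is overloaded and times out.
    calls["primary"] += 1
    raise TimeoutError

def backup():
    calls["backup"] += 1
    return "ok"

def call_with_fallback(primary, backup):
    try:
        return primary()
    except TimeoutError:
        # Under stress, every timed-out request spawns a second call,
        # so total load is worst precisely when the system is saturated.
        return backup()

def call_with_shedding(primary):
    try:
        return primary()
    except TimeoutError:
        # Fail fast instead: callers see an error, but no extra load is added.
        return None

for _ in range(100):
    call_with_fallback(slow_primary, backup)

print(calls)  # every failed primary call triggered a backup call
```

This is why circuit breakers usually pair fallbacks with some form of load shedding: the fallback path itself must be cheap, or it just moves the overload around.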
Unexpected but definitely welcome. Let's wish them luck in this endeavor.
Excellent clip for the W3C 30th anniversary. Shows the big milestones and evolution of the WWW.
If you run the numbers, we actually can't afford this kind of generative AI arms race. It's completely unsustainable, both for training and during use...
Interesting comparison of the different choices made in Rust and the upcoming C++26 for code generation. It's fascinating how they managed to have such facilities in Rust while having no introspection. C++ going in the opposite direction will have a very different feel, both in terms of use and of implementation.
Putting things in the public domain voluntarily is indeed more difficult than it should be. The best tool we have is CC0, but it still raises (probably unwarranted) concerns for software.
This is a short article summarizing a research paper at the surface level. It is clearly the last nail in the coffin for generative AI's grand marketing claims. Of course, I recommend reading the actual research paper (link at the end), but if you prefer this very short form, here it is. It's clearly time to go back to the initial goals of the AI field: understanding cognition. The latest industrial trends confuse the map with the territory far too much.
We keep saying they're not the same. This article does a good job highlighting the differences and explaining why you need both.
Or why we shouldn't trust marketing surveys... they definitely confuse perception with actual results. Worse, they do it on purpose.