It feels like it's supercharging an old bias... We tend to confuse confidence with competence.
Real innovations come from constraints. The frugal AI movement is clearly where we will see interesting things emerging. Interestingly, those approaches are closer to what AI is about as a research field than the industrial complex which got unleashed with all its extractive power.
The price hike on RAM due to the LLM-as-a-service bubble is really killing interesting fields. Can't we have nice things? Will the arms race end soon?
Unsurprisingly, they need to find new data to feed the monster...
Not peer reviewed as far as I can tell. That said, if confirmed by other studies this feels like an important paper. The language flattening might be real, and it would have lasting cultural impacts.
I personally think this is where it'll head after the bubble pops. We should be able to recover enough material to have something viable to run locally. The question will be "where will the updated models come from?"; it might be the public sector helping there, and hopefully those models will be truly FOSS and ethical (like Apertus).
Or why this latest trend in genAI hype is a fool's errand.
Excellent piece, indeed legal is not the same as legitimate. More often than not the law lags behind, and things might be wrongly "fixed" at a later date. It's in that interval that our communities need to build their own tools to protect the commons. We're clearly reaching such an inflection point. Interestingly, I think there's a difference in reaction between people with a Free Software culture and those with an Open Source culture.
Obviously the essay from Peter Naur keeps popping up lately. It feels like an important piece, especially in the current atmosphere of vibe coding. This article lays out quite well why vibe coding is the opposite of what we should be doing.
This is concerning; hopefully the number of issues that slip through will be limited.
Very good essay on why the developer profession is not going away. On the contrary, we need to double down on essential skills and put in the work. This is long overdue anyway.
This fantasy regularly comes back. Yet the tools evolve and might improve some things, but the core difficulties of programming don't change. At each hype cycle our industry over-promises and under-delivers, and this is unnecessary.
One more example that these models should be used for NLP tasks, not knowledge-related tasks. The model makers are consuming so much data, so indiscriminately, that they can't easily comb through everything to remove the poisoned information.
The OpenClaw instances running around are really a security hazard...
This planned giant data center by Meta shows how the big players are grabbing land to satisfy their hubris. So much waste all around.
Interesting point, there are indeed different types of "debt" in the systems we build. It likely helps to be more precise about their nature, and indeed assisted coding might grow a particular kind of debt.
If you're wondering what kind of dumpster fire Facebook is now, this gives an idea. It was crap all along for sure, but they've clearly crossed another threshold.
Still a bit mysterious but could be interesting if they really deliver.
I was really waiting for someone motivated enough to publish a review of that paper. I indeed dismissed it as weak after reading it. Thanks for taking the time to write this up! This is good scientific inquiry... and it shows there were interesting findings in the paper that the authors decided to just ignore.
Another example of how much of a problem this is for some projects. Of course it is compounded by having so many projects on GitHub, which pushes people to farm activity in an attempt to make their resumes look good. This is sad.