71 private links
A long but important report in my opinion. Reading the executive summary is a must. It gives a good overview of the AI industrial complex and the type of society it's leading us into. The report also lays out a political agenda to put us on a better path.
I'm really glad about this interview. I've been thinking for a long time now that Machiavelli's texts have been misread. It's time to reconsider what "Machiavellian" should really mean.
Interesting research: it gives a few hints for building tools that bring more transparency to the ideologies pushed by models. They're not unbiased, that much we know, so characterising their biases is important.
It definitely has a point. The code output isn't really what matters. It is necessary in the end, but without the whole process it's worthless and doesn't empower anyone... It embodies many risks instead. My favourite quote from this article is this:
"We are teaching people that they are not worth to have decent, well-made things."
They were warned about this leak by GitGuardian weeks ago... and did nothing. For people handling such sensitive data, their security practices are preposterous.
A reminder that reckless political decisions can have dire consequences for quite a few FOSS projects.
A reminder of why privacy matters and why we shouldn't collectively give in to the data vultures.
Do they really believe their own lies now? More likely they're trying to manipulate clueless lawmakers at this point. They can't afford to let the circus end.
Maybe something good will come out of the political turmoil around the CVE Program. It would indeed be nice to see it become more independent.
This is a question I have been pondering for a while... what will be left when the generative AI bubble bursts? It won't be the models, as they won't age well. The conclusion of this article sent a chill down my spine: what remains will most likely be infrastructure for a bigger surveillance apparatus.
Don't confuse scenarios with predictions... Big climate improvements thanks to AI tomorrow, in exchange for accepting lots of emissions today, is just a belief. There's nothing to back up the claim that it would really happen.
This is definitely a funny hack. I wonder how long the people behind it knew about the vulnerability before the right opportunity came along to do something with it.
Sure, a filter which turns pictures into something in the Ghibli style looks cute. But make no mistake, it has clear political motives. They need a distraction from their problems, and it's yet another way to breach a boundary. Unfortunately I expect people will comply and use the feature with enthusiasm...
I guess more reviews of that book will come out. It looks like Meta and some EU politicians are even more rotten to the core than we ever suspected...
In case it wasn't clear yet that the tech industry was eminently political, this editorial drives the point home. It's also a good reminder that it's been the case for a long while.
Interesting piece: we indeed need to move beyond the "for hackers by hackers" mindset. I don't even think that was the whole extent of the political goals when the Free Software movement started. Somehow we got stuck there, though.
They really never learn... Whatever the country, politicians try to blindly fight cryptography again and again. Let's hope this one is stopped.
Maybe it'll at least be a wake-up call for governments and businesses to let go of their US cloud addiction. There are reasons why you don't want such vendor lock-in. The political drama unfolding in the United States makes obvious why you should think carefully about how dependent you are on your service and infrastructure providers.
This might be accidental, but it highlights the lack of transparency in how those models are produced. It also means we should get ready for future generations of such models to turn into very subtle propaganda machines. Even if it's accidental for now, I doubt it will stay that way much longer.
Some powerful bullies want to make editors' lives impossible. Fortunately, the foundation looks to have the right tools in store to protect those contributors.