OK, this is old so I wish it'd go beyond 2003. Still, that's quite a funny read.
Indeed it is. It's neither perfect nor the sexiest language, and yet it has some interesting properties.
This is indeed a welcome improvement in my opinion. It's nice to get a glimpse of the process of adding such features in Rust.
Very interesting pattern. The article is really in-depth and goes all the way down to language-lawyer level. It's not for everyone, I guess.
Looks like a good list of pointers to understand languages and compilers... More reading ahead!
This is very interesting research. It confirms that LLMs can't be trusted on any output they make about their own inference. The example about simple maths is particularly striking: the actual inference and what the model reports when asked about its own reasoning are completely different.
Now for the topic dearest to my heart: it looks like there's some form of concept graph hiding in there which is reused across languages. We don't know yet whether a particular language influences that graph. I don't expect the current research to explore this question, but I'm looking forward to someone tackling it.
Indeed, it's something where we lack consensus across languages and sometimes within the same ecosystem.
Translation and localisation is a complex topic too often overlooked by developers. This is a modest list of widespread misconceptions. If you get into the details, it gets complex fairly quickly.
Looks like a good resource if you're interested in natural language processing.
It's a bit niche indeed, but it has its place in some applications.
Comparing languages based on some benchmark is probably a fool's errand indeed. Too many factors can change between languages and benchmark implementations.
I definitely like the approach of having vectorisation in the RDBMS directly: one less moving part, and less complexity at the application level to keep everything in sync. In this case it's a Postgres extension; a minimal sketch of what that looks like follows.
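To make the "one less moving part" point concrete, here is a minimal sketch assuming the extension in question behaves like pgvector (the post doesn't name it) and using the psycopg driver; the table name, column sizes, and connection string are purely illustrative.

```python
# Sketch: vector search kept inside Postgres, next to the relational data.
# Assumes a pgvector-like extension is installed; "documents" is a made-up table.
import psycopg  # pip install "psycopg[binary]"

with psycopg.connect("dbname=demo") as conn:
    with conn.cursor() as cur:
        cur.execute("CREATE EXTENSION IF NOT EXISTS vector;")
        cur.execute(
            "CREATE TABLE IF NOT EXISTS documents ("
            "  id serial PRIMARY KEY,"
            "  body text,"
            "  embedding vector(3)"
            ");"
        )
        # The embedding lives in the same row as the text it describes,
        # so there is no separate vector store to keep in sync.
        cur.execute(
            "INSERT INTO documents (body, embedding) VALUES (%s, %s::vector)",
            ("hello world", "[0.1, 0.2, 0.3]"),
        )
        # Nearest-neighbour search is just SQL (<-> is pgvector's distance operator).
        cur.execute(
            "SELECT body FROM documents ORDER BY embedding <-> %s::vector LIMIT 5",
            ("[0.1, 0.2, 0.25]",),
        )
        print(cur.fetchall())
```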
This is still an important step with LLMs. Just because the models are huge doesn't mean tokenizers have disappeared or that you don't need to clean up your data.
Very interesting research. Looks like we're slowly moving away from the "language and thinking are intertwined" hypothesis. This is probably the final blow for Chomsky's theory of language. It served us well, but neuroscience suggests it's time to leave it behind.
Using the right metaphors will definitely help with the conversation in our industry around AI. This proposal is an interesting one.
Interesting take on the LSP specification: where it shines and where it falls short.
Interesting to see TypeScript and Rust slowly picking up pace. Otherwise Python, Java, JavaScript and C++ are still the big four overall. For jobs, C# and SQL are good to have in your tool belt.
If you wonder why information retrieval from natural language texts is a tough domain, here is a short article listing the important things to keep in mind.
Now this is a very good article highlighting the pros and cons of large language models for natural language processing tasks. They can help with some things but definitely shouldn't be relied on for longer-term systems.
Newer languages really have a hard time moving up in this ranking. Lots of inertia all around.