Clearly not a style that works for each and every application. Still, it's definitely a good thing to aim towards such an architecture: it brings really nice properties in terms of testability and safety.
Still young and pretty much a one-man show. This could turn into a nice tool for using C4 more productively.
This looks like an interesting way to frame problems. It can give an idea of how likely they are to be tackled successfully with LLMs. It also shows that architecture and complexity matter a great deal.
I'm not sure I fully align with this piece. The core tenet, generic design advice versus concrete design advice, makes sense though.
Interesting tool, and I like the underlying approach. I wish we had good equivalents for other ecosystems.
It's not the only factor leading to troublesome architectures, of course. Still, if state, and thus data, is handled wrongly, you're indeed on the wrong track.
This is very true. It's not that whoever produced bad code is particularly stupid; in most cases it's the surrounding context that breaks people.
A good list of characteristics to aim for. It gives clues about the quality of your software architecture.
There are indeed options for managing dependencies in more complex Rust codebases. They need to be planned for properly when designing the architecture of your components, though.
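As a rough illustration of what one such option can look like, here is a minimal sketch of trait-based dependency inversion in Rust. The names (`UserStore`, `GreetingService`, and so on) are hypothetical and not taken from the article; the point is only that the core component depends on a trait it owns rather than on a concrete implementation.

```rust
use std::collections::HashMap;

/// Abstraction owned by the core component.
trait UserStore {
    fn find_name(&self, id: u64) -> Option<String>;
}

/// Concrete implementation living in an outer layer (here, just in memory).
struct InMemoryUserStore {
    users: HashMap<u64, String>,
}

impl UserStore for InMemoryUserStore {
    fn find_name(&self, id: u64) -> Option<String> {
        self.users.get(&id).cloned()
    }
}

/// The service depends on the trait, not on a concrete store,
/// which keeps it easy to test and to rewire later.
struct GreetingService<S: UserStore> {
    store: S,
}

impl<S: UserStore> GreetingService<S> {
    fn greet(&self, id: u64) -> String {
        match self.store.find_name(id) {
            Some(name) => format!("Hello, {name}!"),
            None => "Hello, stranger!".to_string(),
        }
    }
}

fn main() {
    let store = InMemoryUserStore {
        users: HashMap::from([(1, "Ada".to_string())]),
    };
    let service = GreetingService { store };
    println!("{}", service.greet(1));
}
```

In tests, the same `GreetingService` can be wired to a stub store, which is exactly the kind of decoupling that has to be decided at architecture time rather than bolted on later.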
This is a good way to see that architecture questions are multi-layered. And yes, in enterprise contexts they go all the way up to the company strategy level.
A bit of an advertisement toward the end. That said, the evaluated constraints are completely valid: you don't want to fit your whole code base into the "cloud function" model; only a few workloads will make sense there.
Indeed, in most cases you don't need the extra complexity. It's also interesting to see that even if the application has to scale rapidly, you still have quite some time to plan the transition to something else. That makes Postgres a sane default choice.
Some food for thought about the use of bounded contexts in Domain Driven Design.
Maybe it's time to stop obsessing about scale and distributed architectures? Hardware has improved quite a bit in the right places, especially storage.
Yes, an external cache is definitely faster. That said, does your application need the extra complexity? Is in-database caching really the bottleneck? If it isn't, whether you need an external cache at all is still an open question.
I don't think it always unfolds exactly like this, but there's some truth to it. Most projects go through a "let's rewrite it in X" phase, and it's rarely the best outcome.
Databases do improve and provide more cache-like features, but dedicated caches are still needed for the time being.
A short and to-the-point reminder on how to properly manage a "layer cake" architecture.
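To make the idea concrete, here is a minimal, hypothetical sketch of such a layer cake in Rust, where each layer only calls the one directly beneath it and nothing reaches back upward. The module and function names are invented for illustration and are not from the linked article.

```rust
mod data {
    /// Bottom layer: raw storage access (stand-in for a database lookup).
    pub fn load_price_cents(_sku: &str) -> u64 {
        4_200
    }
}

mod domain {
    use super::data;

    /// Middle layer: business rules, built only on the data layer.
    pub fn price_with_tax(sku: &str) -> u64 {
        let net = data::load_price_cents(sku);
        net + net / 5 // 20% tax, as an example rule
    }
}

mod api {
    use super::domain;

    /// Top layer: presentation, built only on the domain layer.
    pub fn render_price(sku: &str) -> String {
        let cents = domain::price_with_tax(sku);
        format!("{}.{:02} EUR", cents / 100, cents % 100)
    }
}

fn main() {
    println!("{}", api::render_price("SKU-123"));
}
```

The discipline is in the `use` statements: as long as each layer only imports the one below, the cake stays easy to reason about and to replace slice by slice.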
A good reminder of this important but imperfect guide to software design. There is some ambiguity about what "simplest" actually means. Still, it helps to keep in mind that simple is rarely easy to find.
Ever wondered how Windows 3 was architected? This is an interesting read. It was really complex though; you can really tell it was in the middle of several transitions.