Comment by shoo 3 days ago

Probably works OK for a small project with a close-knit team of skilled contributors, where there's a well-defined structure and everyone understands it well enough at a high level to know which kinds of dependencies are and aren't healthy.

But unless you have some way of enforcing that access between different components happens through some kind of well-defined interfaces, the codebase may end up very tightly coupled and expensive or impractical to evolve, since shared memory makes it easy for folks to add direct dependencies between data structures of components that shouldn't be coupled.
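For illustration, here's a minimal Go sketch of what such a boundary can look like (the component names and methods are made up for this example): one component exposes a small interface, keeps its data structures unexported, and callers depend only on the interface rather than reaching into the internals.

```go
package main

import "fmt"

// InventoryView is the well-defined interface other components depend on.
// (Hypothetical names, for illustration only.)
type InventoryView interface {
	StockLevel(sku string) int
}

// inventory is the concrete component. Its map is unexported, so code in
// another package cannot couple directly to the data structure.
type inventory struct {
	stock map[string]int
}

func (inv *inventory) StockLevel(sku string) int { return inv.stock[sku] }

// placeOrder depends only on the interface, not on inventory's internals,
// so inventory can change its representation without breaking callers.
func placeOrder(v InventoryView, sku string) bool {
	return v.StockLevel(sku) > 0
}

func main() {
	inv := &inventory{stock: map[string]int{"widget": 3}}
	fmt.Println(placeOrder(inv, "widget")) // true
	fmt.Println(placeOrder(inv, "gadget")) // false
}
```

In a real codebase the interface and the concrete type would live in separate packages, so the compiler (not convention) stops other components from touching the unexported internals.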

default-kramer 3 days ago

> But unless you have some way of enforcing that access between different components happens through some kind of well-defined interfaces, the codebase may end up very tightly coupled and expensive or impractical to evolve

You are describing the "microservice architecture" that I currently loathe at my day job. Fans of microservices would accurately say "well, that's not proper microservices; that's a distributed monolith", but my point is that choosing microservices does not enforce any kind of architectural quality at all. It just means that all of your mistakes are now eternally enshrined thanks to Hyrum's Law, rather than being private/unpublished functions that are easy to refactor using "Find All References" and unit tests.

Nextgrid 3 days ago

> through some kind of well-defined interfaces

Every compiled language has the concept of "interfaces", and can even load compiled modules/assemblies if you insist on them being built separately.

The compiler will enforce interface compliance much better than hitting untyped JSON endpoints over a network.
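A small Go sketch of the contrast (the payload shape and field names are invented for this example): decoding into a declared type gives the compiler something to check, while decoding into an untyped map lets a misspelled key sail through compilation and fail silently at runtime.

```go
package main

import (
	"encoding/json"
	"fmt"
)

// Reservation is a hypothetical payload contract. Any code that touches
// r.Quantity is type-checked, and a renamed field is a compile error at
// every use site.
type Reservation struct {
	SKU      string `json:"sku"`
	Quantity int    `json:"quantity"`
}

// typedQuantity decodes through the declared contract.
func typedQuantity(payload []byte) int {
	var r Reservation
	_ = json.Unmarshal(payload, &r)
	return r.Quantity
}

// untypedQuantity mimics consuming an untyped JSON endpoint: the
// misspelled key compiles fine and just yields nil at runtime.
func untypedQuantity(payload []byte) interface{} {
	var m map[string]interface{}
	_ = json.Unmarshal(payload, &m)
	return m["quantiy"] // typo: no compile-time error, silently nil
}

func main() {
	payload := []byte(`{"sku":"widget","quantity":2}`)
	fmt.Println(typedQuantity(payload))   // 2
	fmt.Println(untypedQuantity(payload)) // <nil>
}
```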

jeltz 3 days ago

Video games are very successfully built by huge teams as monoliths. As are some big open source projects like Linux and PostgreSQL.