Posted by enether 10/29/2025
The fact that there is no common library implementing the author's strategy is a good sign that there isn't much demand for this.
I think a bigger issue is DBMSs themselves getting feature after feature and becoming bloated and unfocused. Add the thing to Postgres because it's convenient! At least Postgres has a decent plugin approach. But I think more use cases might be served by standalone products than by add-ons.
Your DB, on the other hand, is usually a well-understood part of your system, and while scaling issues like that can cause problems, they're often fairly easy to predict, just unfortunate in their timing. This means that while they'll cause disruption, they're usually solved quickly, which you can't always say for additional systems.
I think Kafka makes it easy to create an event-driven architecture. This is particularly useful when you have many teams, since they stay properly isolated from each other (sketch below).
And with many teams comes another problem: there's no guarantee that queries are gonna be properly written, and then Postgres's performance may be hindered.
Given this, I think using Kafka in companies with many teams can be useful, even if the data they move is not insanely big.
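As a rough illustration of that isolation (broker address, topic name, and payload are all made up): the producing team only publishes to a topic, and the consuming team reads it in its own consumer group, with neither touching the other's database. A minimal sketch with the confluent-kafka Python client:

    from confluent_kafka import Producer, Consumer

    # Team A's service: publishes an event, knows nothing about team B.
    producer = Producer({"bootstrap.servers": "localhost:9092"})
    producer.produce("user.signed_up", value=b'{"user_id": 42}')
    producer.flush()

    # Team B's service: its own consumer group, consuming at its own pace.
    consumer = Consumer({
        "bootstrap.servers": "localhost:9092",
        "group.id": "email-service",
        "auto.offset.reset": "earliest",
    })
    consumer.subscribe(["user.signed_up"])
    msg = consumer.poll(timeout=5.0)
    if msg is not None and msg.error() is None:
        print(msg.value())  # ... react to the event ...
    consumer.close()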
It's built on pgmq and not married to Supabase (nearly everything is in the database).
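For anyone who hasn't used pgmq: it's just SQL functions, so the whole queue lives in Postgres. A minimal round-trip sketch via psycopg2 (queue name, payload, and connection string are invented):

    import json
    import psycopg2  # assumes a Postgres with the pgmq extension installed

    conn = psycopg2.connect("dbname=app")  # hypothetical connection string
    conn.autocommit = True

    with conn.cursor() as cur:
        cur.execute("SELECT pgmq.create('orders');")  # create the queue
        cur.execute("SELECT * FROM pgmq.send('orders', %s::jsonb);",
                    (json.dumps({"order_id": 42}),))

        # Read one message; it stays invisible to other readers for 30s.
        cur.execute("SELECT msg_id, message FROM pgmq.read('orders', 30, 1);")
        row = cur.fetchone()
        if row:
            # ... process row[1] ..., then archive instead of deleting
            cur.execute("SELECT pgmq.archive('orders', %s);", (row[0],))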
Postgres is enough.
For a medium to large organization with independent programs that need to talk to each other, Kafka provides an essential capability that would be much slower and higher risk with Postgres.
Standardizing the flow of information across an organization is difficult, and Kafka is crucial for that. To achieve it in Postgres you would need either a shared database, which is inherently risky, or a custom API for access, which introduces another performance bottleneck plus build/maintenance cost and reduces development productivity. So with an API you get a double whammy of performance degradation.

And for multiple consumers operating against the same events (for example: write to storage, perform an action, send to a data lake), a database needs multiplied access: N*X queries, where N is the number of consumers and X is the queries each one runs to consume. With three consumers you're tripling your database queries, which adds up fast across topics. Now you need to start fixing indexes, creating views, and doing other work to keep performance optimal. And at some point you're just poorly recreating Kafka in a database.
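To make the N*X point concrete, here's roughly what each consumer ends up running against a shared events table (table, column, and function names are invented for illustration); every additional consumer duplicates this polling load:

    import psycopg2  # each of the N consumers runs its own copy of this loop

    conn = psycopg2.connect("dbname=app")  # hypothetical connection string

    def poll_once(consumer_name):
        # Each consumer tracks its own offset and re-scans the same table,
        # so three consumers means three of these queries per batch, per topic.
        with conn.cursor() as cur:
            cur.execute("""
                SELECT e.id, e.payload
                  FROM events e
                  JOIN consumer_offsets o ON o.consumer = %s
                 WHERE e.id > o.last_id
                 ORDER BY e.id
                 LIMIT 100;
            """, (consumer_name,))
            rows = cur.fetchall()
            for event_id, payload in rows:
                pass  # ... handle the event ...
            if rows:
                cur.execute(
                    "UPDATE consumer_offsets SET last_id = %s WHERE consumer = %s;",
                    (rows[-1][0], consumer_name))
        conn.commit()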
The common denominator in every "which is better" debate is always the use case. This article seems like it would primarily apply to small organizations or limited consumer needs. And yeah, at that point why are you using events in the first place? Use a single API or database and be done with it. This is where the buzzword thing is relevant: if you're using Kafka for your single team, single database, small organization, it's overkill.
Side note: someone mentioned Postgres as an audit log. Oh god. Done it. It was a nightmare. We ended up migrating to pub/sub with long-term storage in Mongo, which solved significant performance issues. An audit log is inherently write-once, read-many; there is no advantage to storing it in a relational database.
It's a complex piece of software that solves a complex problem, but there are many trade-offs, so only use it when you need to.