KraftyOne, 3 days ago:
Kafka is great for streaming use cases, but the big advantage of Postgres-backed queues is that they can integrate with durable workflows, providing durability guarantees for larger programs. For example, a workflow can enqueue many tasks, then wait for them to complete, with fault-tolerance guarantees both for the individual tasks and for the larger workflow.
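A minimal sketch of that pattern, under stated assumptions: the table names, schema, and single-process worker here are all hypothetical, and `sqlite3` stands in for Postgres so the snippet is self-contained. The point it illustrates is that the queue state and the workflow's own checkpoint live in the same transactional store, so a crashed workflow can be resumed rather than restarted.

```python
import sqlite3

# Hypothetical schema: a task queue and a workflow checkpoint table
# sharing one transactional database (sqlite stands in for Postgres).
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE tasks (id INTEGER PRIMARY KEY, payload TEXT, status TEXT)")
db.execute("CREATE TABLE workflow (step TEXT)")

def enqueue_tasks(payloads):
    # One transaction: every task is enqueued and the workflow's
    # checkpoint is recorded atomically. If the process dies here,
    # either all tasks exist (and the workflow knows it) or none do.
    with db:
        db.executemany(
            "INSERT INTO tasks (payload, status) VALUES (?, 'pending')",
            [(p,) for p in payloads],
        )
        db.execute("INSERT INTO workflow (step) VALUES ('enqueued')")

def run_worker():
    # A worker claims pending tasks and marks each one done in its
    # own transaction, so a crash mid-batch loses at most one task's
    # in-flight work, never its enqueued state.
    while True:
        row = db.execute(
            "SELECT id, payload FROM tasks WHERE status = 'pending' LIMIT 1"
        ).fetchone()
        if row is None:
            break
        task_id, payload = row
        _result = payload.upper()  # the "work" itself
        with db:
            db.execute("UPDATE tasks SET status = 'done' WHERE id = ?", (task_id,))

def all_complete():
    # The workflow's "wait for completion" step is just a query
    # against the same durable store.
    remaining = db.execute(
        "SELECT COUNT(*) FROM tasks WHERE status != 'done'"
    ).fetchone()[0]
    return remaining == 0

enqueue_tasks(["a", "b", "c"])
run_worker()
print(all_complete())  # True once every enqueued task is done
```

A real system (DBOS, or any Postgres-backed queue) adds worker claiming with `SELECT ... FOR UPDATE SKIP LOCKED`, retries, and resumption logic, but the durability argument is the same: task state and workflow state commit together.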
dbacar, 3 days ago:
I guess if you use different topics (queues) in Kafka, you can do all this with the help of a stream processor like Storm or Spark, routing messages between topics to form a workflow.
chatmasta, 3 days ago:
Huh? Kafka messages are durable just like Postgres commits are durable. That's why Kafka is used by tools like Debezium, which need a durable queue of CDC messages such as those captured from the Postgres WAL. There's nothing inherently different about the durability of Postgres that makes it better than Kafka for implementing durable workflows. There are many reasons it's a better choice for building a system like DBOS to implement durable workflows, ranging from ergonomics to ecosystem compatibility. But in theory you could build the same solution on Kafka, and if the company were co-founded by the Kafka creators rather than Michael Stonebraker, maybe they would have chosen that.