Streaming Huge Databases Using Logical Decoding

Practical aspects of extracting consistent data snapshots from a PostgreSQL database.

Posted on Feb 23, 2016

Zalando’s database engineering team recently spoke at FOSDEM PGDay, an event hosted by the PostgreSQL community the day before the FOSDEM conference in Brussels. I had the opportunity to share insights on Streaming Huge Databases using Logical Decoding.

Logical decoding is a feature introduced in PostgreSQL 9.4 that allows streaming database changes in a custom format. My talk explains the problems one may encounter while extracting a consistent snapshot of a big database, and approaches to mitigating them. It compares the performance of existing logical decoding output plugins, proposes a unified interface for streaming pre-existing data through a plugin's output functions, and closes with a discussion of potential performance bottlenecks and ways to address them.
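For readers who haven't tried the feature yet, here is a minimal sketch of logical decoding in action, using the built-in test_decoding plugin that ships with PostgreSQL 9.4 and later (the slot and table names are illustrative):

    -- Requires wal_level = logical and max_replication_slots > 0 in postgresql.conf
    -- Create a logical replication slot with the built-in test_decoding plugin
    SELECT * FROM pg_create_logical_replication_slot('demo_slot', 'test_decoding');

    -- Make some changes so there is something to decode
    CREATE TABLE demo (id int PRIMARY KEY, payload text);
    INSERT INTO demo VALUES (1, 'hello');

    -- Peek at the decoded changes without consuming them from the slot
    SELECT * FROM pg_logical_slot_peek_changes('demo_slot', NULL, NULL);

    -- Drop the slot when done, so it does not hold back WAL cleanup
    SELECT pg_drop_replication_slot('demo_slot');

Note that a slot only sees changes made after it was created; pre-existing rows are not streamed, which is exactly the gap the proposed unified interface is meant to close.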

Watch my talk below and let me know your thoughts:

Video: Streaming Huge Databases Using Logical Decoding, by Alexander Shulgin


We're hiring! Do you like working in an ever-evolving organization such as Zalando? Consider joining our teams as a Software Engineer!


