Transferring data from a source to a specific target in near real time usually implies a message broker that ingests and distributes the data.
The scenario that I’m going to describe here, instead, is a near real-time end-to-end solution that uses Azure Data Factory as the orchestrator.
For both solutions, though, you need a Change Data Capture (CDC) feature, or something similar depending on your sources, to track and store data changes.
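As a concrete illustration, here is a minimal sketch of enabling CDC on a SQL Server source. This assumes SQL Server is your source system; the database name `SalesDb` and table `dbo.Orders` are hypothetical placeholders.

```sql
-- Enable CDC at the database level (requires sysadmin).
USE SalesDb;
EXEC sys.sp_cdc_enable_db;

-- Enable CDC for one table (requires db_owner).
-- SQL Server will start logging inserts, updates, and deletes
-- into a change table you can query or poll from a pipeline.
EXEC sys.sp_cdc_enable_table
    @source_schema = N'dbo',
    @source_name   = N'Orders',
    @role_name     = NULL;  -- NULL: no gating role for reading changes
```

Once enabled, the captured changes can be read through the generated `cdc.fn_cdc_get_all_changes_*` functions, which is what a polling orchestrator can consume on a schedule.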
As I’ve spent some time trying to figure out how to get all the table changes on a big database without…