Does anyone have experience implementing data streaming via CDC (change data capture) between on-premises Oracle databases? I would like to understand the risks of this type of implementation. We are considering Oracle solutions like GoldenGate, as well as alternatives such as Debezium.
Hi Sergio,
My advice would be to keep it simple and use standard tooling such as GoldenGate, and only depart from that if a clear use case necessitates it. In general:
1. GoldenGate is the go-to for Oracle – solid and proven. Debezium can work too, but it brings extra infrastructure with it: a Kafka cluster and Kafka Connect to run its connectors.
2. Go log-based – it’s lighter on the source system. If that’s not an option, consider using a replica to reduce impact. Make sure logs are retained and set up properly.
3. Plan for restarts – CDC runs well until something breaks. The longer the downtime, the harder it is to recover. Build in monitoring and fault-tolerance so you don’t have to start from scratch.
4. Check your network – CDC can push a lot of data. Make sure paths are optimised and not competing with other critical traffic.
5. Filter smartly – don’t move everything. Validate and select only what you need to keep things lean.
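To make the Debezium option above concrete: the connector is registered against Kafka Connect with a JSON config along these lines. This is a sketch only – hostnames, credentials, and the table list are placeholders, and exact property names vary by Debezium version:

```json
{
  "name": "oracle-cdc-orders",
  "config": {
    "connector.class": "io.debezium.connector.oracle.OracleConnector",
    "database.hostname": "oracle-prod.example.internal",
    "database.port": "1521",
    "database.user": "c##dbzuser",
    "database.password": "********",
    "database.dbname": "ORCLCDB",
    "database.pdb.name": "ORCLPDB1",
    "topic.prefix": "oracle-prod",
    "table.include.list": "SALES.ORDERS,SALES.ORDER_ITEMS",
    "schema.history.internal.kafka.bootstrap.servers": "kafka:9092",
    "schema.history.internal.kafka.topic": "schema-changes.oracle-prod"
  }
}
```

Note the `table.include.list` – that is where point 5 (filter smartly) gets enforced, so you only stream the tables you actually need.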
I have not worked with the Oracle stack for this specifically; however, I have worked with similar features in other database engines. Here's my general advice:
This type of activity can be highly impacted by packet loss and latency. Make sure that your network latency is as low as possible. Make sure that your network path between the servers is clean and stable, i.e. no MTU mismatches, no excessive packet fragmentation, no suppression of critical ICMP messages, sufficient bandwidth for the traffic, etc.
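One quick way to spot MTU mismatches is to send non-fragmentable pings sized to the path MTU you expect (on Linux, `ping -M do -s 1472 <target>` for a 1500-byte MTU). A small helper to compute that payload size, assuming standard header sizes (no IP options):

```python
def max_ping_payload(mtu: int, ipv6: bool = False) -> int:
    """Largest ICMP echo payload that fits in one packet of the given MTU.

    Assumes a 20-byte IPv4 header (no options) or a 40-byte IPv6 header,
    plus the 8-byte ICMP/ICMPv6 echo header.
    """
    ip_header = 40 if ipv6 else 20
    icmp_header = 8
    return mtu - ip_header - icmp_header

print(max_ping_payload(1500))              # 1472 -> ping -M do -s 1472 <target>
print(max_ping_payload(1500, ipv6=True))   # 1452
print(max_ping_payload(9000))              # 8972 (jumbo frames)
```

If a ping at that size fails with "message too long" or silently drops while smaller sizes work, some hop on the path has a smaller MTU or is suppressing the ICMP errors that path MTU discovery depends on.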
On an internal network with IPv4, at a minimum, these ICMP message types must be allowed:
Echo Reply - Type 0
Destination Unreachable - Type 3
Echo - Type 8
Time Exceeded - Type 11
Parameter Problem - Type 12
Rate limiting ICMP to prevent denial of service is ok.
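On Linux hosts, the minimum IPv4 set above could be expressed with iptables roughly as follows. This is a sketch only – adapt the chains and rate limits to your own firewall policy:

```shell
# Allow the ICMP types CDC traffic and troubleshooting depend on, rate-limited
iptables -A INPUT -p icmp --icmp-type echo-reply              -m limit --limit 10/second -j ACCEPT
iptables -A INPUT -p icmp --icmp-type destination-unreachable -j ACCEPT
iptables -A INPUT -p icmp --icmp-type echo-request            -m limit --limit 10/second -j ACCEPT
iptables -A INPUT -p icmp --icmp-type time-exceeded           -j ACCEPT
iptables -A INPUT -p icmp --icmp-type parameter-problem       -j ACCEPT
```

Destination Unreachable in particular must not be rate-limited into oblivion – fragmentation-needed messages (type 3, code 4) are what path MTU discovery relies on.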
For IPv6 and ICMPv6 make sure that your networking team is familiar with the recommendations in RFC 4890.
Make sure that you take your initial snapshot either during a planned outage or in a low-use time window. Otherwise, the logs the snapshot depends on can roll off before the process completes and terminate it with errors.
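Before kicking off the snapshot, it's worth checking how far back archived redo is still available, so the log reader can catch up once the snapshot finishes. Assuming the database runs in ARCHIVELOG mode, a query along these lines gives a starting point:

```sql
-- Oldest archived redo still on disk (not yet deleted by RMAN)
SELECT MIN(first_time) AS oldest_available
FROM   v$archived_log
WHERE  deleted = 'NO';

-- The retention/deletion policy itself is governed by RMAN:
-- RMAN> SHOW ARCHIVELOG DELETION POLICY;
```

If the expected snapshot duration is longer than the window this query reports, raise retention (or schedule the snapshot differently) before starting.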
Make sure that the disk IOPS on the target machine are sufficient to commit the data stream without excessive queuing.
Research performance tuning both for your database engine and your CDC technology.
CDC solutions like Oracle GoldenGate need an agent installed on the source systems; they capture changes in near real time based on log events. We have it implemented in one of our locations.
Another way is to capture changes using ETL pipelines, but that would be more batch-oriented.
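For contrast with log-based CDC, the batch/ETL approach typically polls with a watermark column. A minimal sketch, using SQLite in place of Oracle and made-up table/column names for illustration:

```python
import sqlite3

def extract_changes(conn, last_watermark):
    """Pull rows modified since the last run; return them plus the new watermark."""
    rows = conn.execute(
        "SELECT id, amount, updated_at FROM orders "
        "WHERE updated_at > ? ORDER BY updated_at",
        (last_watermark,),
    ).fetchall()
    new_watermark = rows[-1][2] if rows else last_watermark
    return rows, new_watermark

# Demo with an in-memory database standing in for the source
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE orders (id INTEGER, amount REAL, updated_at TEXT)")
conn.executemany(
    "INSERT INTO orders VALUES (?, ?, ?)",
    [(1, 10.0, "2024-01-01T10:00"), (2, 25.0, "2024-01-01T11:00")],
)

batch, wm = extract_changes(conn, "2024-01-01T10:30")
print(batch)  # [(2, 25.0, '2024-01-01T11:00')]
print(wm)     # 2024-01-01T11:00
```

The trade-off is visible even in this toy: watermark polling misses deletes and any intermediate updates between runs, which is exactly what log-based CDC captures and batch ETL does not.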
There are Oracle solutions, like GoldenGate, and others, like Debezium.

I would dig into the use case: what are you trying to achieve?
If you need that data replicated, there are clear "Oracle" ways of doing this, and perhaps you just need a new read replica?
If you're considering streaming the data out for other use cases (as Debezium might enable), I would dig into your team's readiness to go event-driven in that way. These projects are often far more difficult than selecting the data-movement technology.