Replicating Data from Oracle Using Apache Kafka

2021-11-12 00:00:00 oracle apache-kafka

I would like to expose the data tables from my Oracle database into Apache Kafka. Is it technically possible? I also need to stream data changes from my Oracle tables and notify Kafka of them. Do you know of good documentation for this use case? Thanks

Solution

  1. You need the Kafka Connect JDBC source connector to load data from your Oracle database. Confluent provides an open-source bundled connector, packaged and tested with the rest of the Confluent Platform, including the Schema Registry. Using this connector is as easy as writing a simple connector configuration and starting a standalone Kafka Connect process or making a REST request to a Kafka Connect cluster (see the registration sketch after this list). Documentation for this connector can be found here.

  2. To move change data in real time from Oracle transactional databases to Kafka, you first need a proprietary Change Data Capture (CDC) tool that requires purchasing a commercial license, such as Oracle GoldenGate, Attunity Replicate, Dbvisit Replicate, or Striim. Then you can leverage the Kafka Connect connectors that they all provide; they are all listed here. (A consumer sketch after this list shows that the resulting change events are read like any other Kafka records.)

  3. Debezium, an open-source CDC tool from Red Hat, is planning to work on a connector that does not rely on an Oracle GoldenGate license. The related JIRA is here.
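As a rough illustration of point 1, here is a minimal sketch of registering the JDBC source connector through the Kafka Connect REST API from Python. The connector class is the actual Confluent JDBC source class, but the worker address, connection URL, credentials, table, and topic prefix are placeholders you would replace with your own values:

```python
import json
import requests

# Assumed address of a Kafka Connect worker's REST API (adjust to your cluster).
CONNECT_URL = "http://localhost:8083/connectors"

# Illustrative JDBC source connector configuration: poll an Oracle table and
# write each new row to a Kafka topic prefixed with "oracle-".
connector = {
    "name": "oracle-jdbc-source",
    "config": {
        "connector.class": "io.confluent.connect.jdbc.JdbcSourceConnector",
        "connection.url": "jdbc:oracle:thin:@//dbhost:1521/ORCLPDB1",  # hypothetical database
        "connection.user": "kafka_reader",
        "connection.password": "change-me",
        "table.whitelist": "CUSTOMERS",     # table(s) to load
        "mode": "incrementing",             # detect new rows via an increasing column
        "incrementing.column.name": "ID",
        "topic.prefix": "oracle-",          # rows land in the topic "oracle-CUSTOMERS"
        "poll.interval.ms": "5000",
    },
}

# Submit the connector to the Connect cluster; a standalone worker would instead
# read an equivalent .properties file at startup.
resp = requests.post(CONNECT_URL, json=connector, timeout=10)
resp.raise_for_status()
print(json.dumps(resp.json(), indent=2))
```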
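And for point 2, once any of those CDC connectors is running, the change events it produces are consumed like ordinary Kafka records regardless of which tool generated them. A minimal sketch with the kafka-python client, assuming a JSON converter and a hypothetical topic and broker address:

```python
import json
from kafka import KafkaConsumer  # kafka-python client, used here for illustration

# Topic name and broker address below are hypothetical; substitute your own.
consumer = KafkaConsumer(
    "oracle-CUSTOMERS",
    bootstrap_servers="localhost:9092",
    auto_offset_reset="earliest",
    value_deserializer=lambda b: json.loads(b.decode("utf-8")),
)

for record in consumer:
    # Each record holds one row change; the exact payload layout depends on the
    # connector and converter (plain JSON is assumed here, not Avro/Schema Registry).
    print(record.topic, record.offset, record.value)
```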
