Migrating from Oracle to MySQL

2021-11-20 00:00:00 · Tags: import, migration, dump, mysql, oracle

We ran into serious performance problems with our Oracle database, and we would like to try migrating it to a MySQL-based database (either MySQL directly or, preferably, Infobright).

The thing is, we need to let the old and the new system overlap for at least a few weeks, if not months, before we actually know whether all the features of the new database match our needs.

So, here is our situation:

The Oracle database consists of multiple tables, each with millions of rows. During the day there are literally thousands of statements, which we cannot stop for the migration.

Every morning, new data is imported into the Oracle database, replacing a few thousand rows. Replicating this process is not a problem, so we could, in theory, import into both databases in parallel.

But, and here lies the challenge: for this to work, we need an export from the Oracle database with a consistent state from a single day. (We cannot export some tables on Monday and others on Tuesday, etc.) This means that, at the very least, the export should finish in less than one day.
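
Purely for illustration, what we mean by "consistent" is that every table would have to be read as of the same point in time. One way to express that would be pinning all per-table reads to a single SCN via flashback query; this is just a sketch, assuming Oracle 10g+, enough undo retention to cover the export window, and placeholder schema/table names:

```sql
-- Record one SCN at the start of the export run (Oracle 10g+).
SELECT current_scn FROM v$database;

-- Then read every table as of that same SCN, so all per-table
-- exports share one snapshot (1234567890 is a placeholder value).
SELECT * FROM some_schema.orders    AS OF SCN 1234567890;
SELECT * FROM some_schema.customers AS OF SCN 1234567890;
```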

Our first thought was to dump the schema, but I wasn't able to find a tool to import an Oracle dump file into MySQL. Exporting the tables as CSV files might work, but I'm afraid it could take too long.
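
For what it's worth, the MySQL side of the CSV route looks straightforward; it is the Oracle export side that worries us. A minimal sketch of the per-table import (file path, table, and column names below are made up):

```sql
-- MySQL: bulk-load one exported CSV file per table.
-- Requires local_infile to be enabled if loading from the client.
LOAD DATA LOCAL INFILE '/data/export/orders.csv'
INTO TABLE orders
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
(order_id, customer_id, order_date, amount);
```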

So here are my questions now:

What should I do? Is there any tool to import Oracle dump files into MySQL? Does anybody have any experience with such a large-scale migration?

PS: Please don't suggest performance optimization techniques for Oracle; we already tried a lot :-)

Update: We already tried some ETL tools, only to find out that they were not fast enough: exporting just one table already took more than 4 hours...

Second update: Come on folks... has nobody ever tried to export a whole database as fast as possible and convert the data so that it can be imported into another database system?

Recommended answer

Oracle does not supply an out-of-the-box unload utility.

Keep in mind that without comprehensive info about your environment (Oracle version? server platform? how much data? what datatypes?), everything here is YMMV, and you would want to try it out on your own system for performance and timing.

My points 1-3 are just generic data movement ideas. Point 4 is a method that will reduce downtime or interruption to minutes or seconds.

1) There are 3rd-party utilities available. I have used a few of these, but it's best for you to check them out yourself for your intended purpose. A few 3rd-party products are listed here: OraFaq. Unfortunately, a lot of them run on Windows, which would slow down the data unload process, unless your DB server is on Windows and you can run the load utility directly on the server.

2) If you don't have any complex datatypes like LOBs, then you can roll your own unload with SQL*Plus. If you do it one table at a time, then you can easily parallelize it. The topic has been visited on this site probably more than once; here is an example: Linky
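
A minimal sketch of such a roll-your-own unload, one script per table so several sqlplus sessions can run in parallel (schema, table, and column names are placeholders, and you would still have to handle NULLs, embedded delimiters, and your own date formats):

```sql
-- unload_orders.sql: spool one table to a delimited file with SQL*Plus.
SET TERMOUT OFF
SET FEEDBACK OFF
SET HEADING OFF
SET PAGESIZE 0
SET LINESIZE 32767
SET TRIMSPOOL ON
SPOOL /data/export/orders.csv

SELECT order_id || ',' ||
       customer_id || ',' ||
       TO_CHAR(order_date, 'YYYY-MM-DD HH24:MI:SS') || ',' ||
       amount
  FROM some_schema.orders;

SPOOL OFF
EXIT
```

Run one such script per table, e.g. several `sqlplus -s user/pass @unload_orders.sql` sessions at once, to parallelize the unload.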

3) If you are on 10g+, then external tables might be a performant way to accomplish this task. If you create some blank external tables with the same structure as your current tables and copy the data into them, the data will be converted to the external table format (a file on disk that you can then move around). Once again, OraFAQ to the rescue.
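
A sketch of the unload side, assuming the writable ORACLE_DATAPUMP driver (10g+) and that a directory object like DATA_PUMP_DIR exists; note that this driver writes Oracle's dump-file format rather than plain text, and the schema/table names below are placeholders:

```sql
-- Unload a table through a writable external table (Oracle 10g+).
-- The resulting .dmp file is in Oracle's Data Pump format and is
-- readable by another Oracle external table, not directly by MySQL.
CREATE TABLE orders_unload
  ORGANIZATION EXTERNAL (
    TYPE ORACLE_DATAPUMP
    DEFAULT DIRECTORY data_pump_dir
    LOCATION ('orders_unload.dmp')
  )
AS SELECT * FROM some_schema.orders;
```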

4) If you must keep the systems running in parallel for days/weeks/months, then use a change data capture/apply tool for near-zero downtime. Be prepared to pay $$$. I have used Golden Gate Software's tool, which can mine the Oracle redo logs and supply insert/update statements to a MySQL database. You can migrate the bulk of the data with no downtime in the week before go-live. Then, during your go-live window, shut down the source database, have Golden Gate catch up on the last remaining transactions, and then open up access to your new target database. I have used this for upgrades, and the catch-up period was only a few minutes. We already had a site license for Golden Gate, so it wasn't anything out of pocket for us.

And I'll play the role of the cranky DBA here and say that if you can't get Oracle performing well, I would love to see a write-up of how MySQL fixed your particular issues. If you have an application where you can't touch the SQL, there are still lots of possible ways to tune Oracle. /soapbox
