How do I transfer a MySQL table to Hive?
I have a large MySQL table that I would like to transfer to a Hadoop/Hive table. Are there standard commands or techniques to transfer a simple (but large) table from MySQL to Hive? The table stores mostly analytics data.
Accepted Answer
First of all, download mysql-connector-java-5.0.8 and put the JAR in the lib and bin folders of Sqoop.
Create the table definition in Hive with exactly the same field names and types as in MySQL.
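For example, if the MySQL table had `id`, `name`, and `salary` columns (hypothetical names for illustration only), the matching Hive definition might look like the following sketch. Note that the field delimiter matches the `--fields-terminated-by ','` passed to Sqoop:

```sql
-- Hypothetical schema for illustration; mirror your actual MySQL columns.
CREATE TABLE employee (
  id     INT,
  name   STRING,
  salary DOUBLE
)
ROW FORMAT DELIMITED
FIELDS TERMINATED BY ','
STORED AS TEXTFILE;
```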
sqoop import --verbose --connect jdbc:mysql://localhost/test --table employee --hive-import --warehouse-dir /user/hive/warehouse --fields-terminated-by ',' --split-by id --hive-table employee
test - Database name
employee - Table name (present in the test database)
/user/hive/warehouse - Directory in HDFS where the data has to be imported
--split-by id - id can be the primary key of the table 'employee'
--hive-table employee - employee table whose definition is present in Hive
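Once the import finishes, a quick sanity check is to compare row counts between the two systems — a minimal sketch, assuming the database and table names above:

```sql
-- Run in the Hive shell; compare the result with
-- SELECT COUNT(*) FROM employee; executed against MySQL's test database.
SELECT COUNT(*) FROM employee;
```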
Sqoop User Guide (one of the best guides for learning Sqoop)