Using Sqoop

Source: Internet · Editor: 程序博客网 · Date: 2024/05/22 03:06
  1. list-databases: list the databases
    sqoop list-databases --connect jdbc:oracle:thin:@172.21.202.4:1521:TJZHCSDEV --username tjzhcs --password tjzhcs

  2. list-tables: list the tables in a given database
    sqoop list-tables --connect jdbc:oracle:thin:@172.21.202.4:1521:TJZHCSDEV --username tjzhcs --password tjzhcs

  3. eval: quickly run a SQL statement against the relational database. Before running an import, this lets you verify that the SQL is correct; the result is printed to the console.
    sqoop eval --connect jdbc:oracle:thin:@172.21.202.4:1521:TJZHCSDEV --username tjzhcs --password tjzhcs -e "select * from b_code"

  4. create-hive-table: create a Hive table whose schema matches the relational table
    sqoop create-hive-table --connect jdbc:oracle:thin:@172.21.202.4:1521:TJZHCSDEV --username tjzhcs --password tjzhcs --table B_CODE --hive-table h_b_code (Note: when importing from Oracle, the table name must be uppercase!)

  5. Import an Oracle table into Hive
    sqoop import --connect jdbc:oracle:thin:@172.21.202.4:1521:TJZHCSDEV --username tjzhcs --password tjzhcs --table B_CODE --hive-table h_b_code --hive-import
    Hive arguments: Argument / Description
    --hive-home <dir>           Override $HIVE_HOME
    --hive-import               Import tables into Hive (uses Hive's default delimiters if none are set)
    --hive-overwrite            Overwrite existing data in the Hive table
    --create-hive-table         If set, the job will fail if the target Hive table exists. By default this property is false.
    --hive-table <table-name>   Sets the table name to use when importing into Hive
    --hive-drop-import-delims   Drops \n, \r, and \01 from string fields when importing into Hive
    --hive-delims-replacement   Replaces \n, \r, and \01 in string fields with a user-defined string when importing into Hive
    --hive-partition-key        Name of the Hive field on which partitions are keyed
    --hive-partition-value <v>  String value that serves as the partition key for the data imported into Hive in this job
    --map-column-hive <map>     Override the default mapping from SQL type to Hive type for the configured columns
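The partition arguments above can be combined with a plain `--hive-import`. A minimal sketch reusing this post's Oracle connection; the partition column `load_date` and its value are made-up placeholders, not columns from the post:

```shell
# Build the sqoop invocation as an array so the flags can be inspected
# before running (sqoop itself needs a configured Hadoop/Hive cluster).
SQOOP_ARGS=(
  import
  --connect jdbc:oracle:thin:@172.21.202.4:1521:TJZHCSDEV
  --username tjzhcs --password tjzhcs
  --table B_CODE                     # Oracle table names must be uppercase
  --hive-import
  --hive-table h_b_code
  --hive-overwrite                   # replace existing data in the Hive table
  --hive-drop-import-delims          # strip \n, \r, \01 from string fields
  --hive-partition-key load_date     # placeholder partition column
  --hive-partition-value 2024-05-22  # placeholder partition value
)
echo "sqoop ${SQOOP_ARGS[*]}"
# On a host with Sqoop installed:  sqoop "${SQOOP_ARGS[@]}"
```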

  6. sqoop list-databases --connect jdbc:mysql://172.21.80.123:3306/ --username root --password 1234

  7. Import a MySQL table into HBase
    sqoop import --connect jdbc:mysql://172.21.80.123:3306/hadoop --username root --password 1234 --table lsemp --hbase-table lsemp --hbase-create-table --hbase-row-key empno --column-family info

    HBase arguments: Argument / Description
    --column-family <family>    Sets the target column family for the import
    --hbase-create-table        If specified, create missing HBase tables
    --hbase-row-key <col>       Specifies which input column to use as the row key. If the input table has a composite key, <col> must be a comma-separated list of the composite key attributes.
    --hbase-table <table-name>  Specifies an HBase table to use as the target instead of HDFS
    --hbase-bulkload            Enables bulk loading
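The composite-key form of `--hbase-row-key` mentioned above can be sketched as follows; the second key column `deptno` is hypothetical (only `empno` appears in this post), and `--hbase-bulkload` is added to show bulk loading:

```shell
# Sketch of an HBase import with a composite row key and bulk loading.
# The column deptno is a hypothetical example.
SQOOP_ARGS=(
  import
  --connect jdbc:mysql://172.21.80.123:3306/hadoop
  --username root --password 1234
  --table lsemp
  --hbase-table lsemp
  --hbase-create-table           # create the HBase table if it is missing
  --hbase-row-key empno,deptno   # comma-separated composite key attributes
  --column-family info
  --hbase-bulkload               # write HFiles and bulk-load them
)
echo "sqoop ${SQOOP_ARGS[*]}"
# On a host with Sqoop and HBase configured:  sqoop "${SQOOP_ARGS[@]}"
```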
MySQL-related
  1. Run SQL
    sqoop eval --connect jdbc:mysql://localhost:3306/hive --username root --password 123456 -e 'show tables'
  2. List databases
    sqoop list-databases --connect jdbc:mysql://localhost:3306/ --username root --password 123456
  3. Import MySQL data into Hive
    sqoop import --connect jdbc:mysql://localhost:3306/test --username root --password 123456 --table student --hive-table student --hive-import --hive-database test
  4. Export Hive data to MySQL
    sqoop export --connect jdbc:mysql://localhost:3306/test --username root --password 123456 --table word_count --export-dir /user/hive/warehouse/test.db/word_count --input-fields-terminated-by '\001'
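After an export like step 4 above, the row count on the MySQL side can be spot-checked with `eval`, reusing the same connection details; this is a sketch, not part of the original post:

```shell
# Sketch: verify an export by counting rows in MySQL via sqoop eval.
SQOOP_ARGS=(
  eval
  --connect jdbc:mysql://localhost:3306/test
  --username root --password 123456
  -e 'SELECT COUNT(*) FROM word_count'
)
echo "sqoop ${SQOOP_ARGS[*]}"
# On a host with Sqoop installed:  sqoop "${SQOOP_ARGS[@]}"
```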