Packages that use ImportException:

    com.cloudera.sqoop.manager
    com.cloudera.sqoop.mapreduce
    com.cloudera.sqoop.tool
Uses of ImportException in com.cloudera.sqoop.manager

Methods in com.cloudera.sqoop.manager that throw ImportException:

    void SqlManager.importTable(ImportJobContext context)
        Default implementation of importTable() is to launch a MapReduce
        job via DataDrivenImportJob to read the table with
        DataDrivenDBInputFormat.

    void PostgresqlManager.importTable(ImportJobContext context)

    abstract void ConnManager.importTable(ImportJobContext context)
        Perform an import of a table from the database into HDFS.

    void MySQLManager.importTable(ImportJobContext context)

    void DirectPostgresqlManager.importTable(ImportJobContext context)

    void DirectMySQLManager.importTable(ImportJobContext context)
        Import the table into HDFS by using mysqldump to pull the data
        out of the database and upload the files directly to HDFS.

    void OracleManager.importTable(ImportJobContext context)
Uses of ImportException in com.cloudera.sqoop.mapreduce

Methods in com.cloudera.sqoop.mapreduce that throw ImportException:

    void ImportJobBase.runImport(java.lang.String tableName,
                                 java.lang.String ormJarFile,
                                 java.lang.String splitByCol,
                                 org.apache.hadoop.conf.Configuration conf)
        Run an import job to read a table into HDFS.
Uses of ImportException in com.cloudera.sqoop.tool

Methods in com.cloudera.sqoop.tool that throw ImportException:

    protected void ImportTool.importTable(SqoopOptions options,
                                          java.lang.String tableName,
                                          HiveImport hiveImport)