Uses of HiveException in org.apache.hadoop.hive.ql.exec |
---|
Subclasses of HiveException in org.apache.hadoop.hive.ql.exec | |
---|---|
class |
AmbiguousMethodException
Exception thrown by the UDF and UDAF method resolvers in case a unique method is not found. |
class |
NoMatchingMethodException
Exception thrown by the UDF and UDAF method resolvers in case no matching method is found. |
class |
UDFArgumentException
Exception class thrown when a UDF argument is invalid. |
class |
UDFArgumentLengthException
Exception class thrown when a UDF receives the wrong number of arguments. |
class |
UDFArgumentTypeException
Exception class thrown when UDF arguments have the wrong types. |
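
These argument-checking subclasses are typically thrown from a GenericUDF's initialize method, while evaluate reports runtime failures as plain HiveException. A minimal sketch (the identity UDF itself is hypothetical, used only to show where each exception belongs):

```java
import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
import org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException;
import org.apache.hadoop.hive.ql.exec.UDFArgumentTypeException;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector.Category;

// Hypothetical UDF illustrating where the UDFArgument* subclasses are thrown.
public class GenericUDFIdentity extends GenericUDF {
  private ObjectInspector argOI;

  @Override
  public ObjectInspector initialize(ObjectInspector[] arguments) throws UDFArgumentException {
    if (arguments.length != 1) {
      // Wrong number of arguments.
      throw new UDFArgumentLengthException("identity takes exactly one argument");
    }
    if (arguments[0].getCategory() != Category.PRIMITIVE) {
      // Wrong argument type; the int is the position of the offending argument.
      throw new UDFArgumentTypeException(0, "only primitive types are supported");
    }
    argOI = arguments[0];
    return argOI;
  }

  @Override
  public Object evaluate(DeferredObject[] arguments) throws HiveException {
    // DeferredObject.get() itself throws HiveException.
    return arguments[0].get();
  }

  @Override
  public String getDisplayString(String[] children) {
    return "identity(" + children[0] + ")";
  }
}
```
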
Methods in org.apache.hadoop.hive.ql.exec that throw HiveException | |
---|---|
protected Object |
ExprNodeColumnEvaluator._evaluate(Object row,
int version)
|
protected abstract Object |
ExprNodeEvaluator._evaluate(Object row,
int version)
Evaluate the value. |
protected Object |
ExprNodeGenericFuncEvaluator._evaluate(Object row,
int version)
|
protected Object |
ExprNodeNullEvaluator._evaluate(Object row,
int version)
|
protected Object |
ExprNodeFieldEvaluator._evaluate(Object row,
int version)
|
protected Object |
ExprNodeEvaluatorRef._evaluate(Object row,
int version)
|
protected Object |
ExprNodeEvaluatorHead._evaluate(Object row,
int version)
|
protected Object |
ExprNodeConstantEvaluator._evaluate(Object row,
int version)
|
void |
FileSinkOperator.FSPaths.abortWriters(FileSystem fs,
boolean abort,
boolean delete)
|
static URI |
ArchiveUtils.addSlash(URI u)
Makes sure that the URI points to a directory by adding a slash to it. |
void |
PTFPartition.append(Object o)
|
protected void |
CommonJoinOperator.checkAndGenObject()
|
void |
Operator.cleanUpInputFileChanged()
|
void |
MapJoinOperator.cleanUpInputFileChangedOp()
|
void |
TableScanOperator.cleanUpInputFileChangedOp()
|
void |
MapOperator.cleanUpInputFileChangedOp()
|
void |
SMBMapJoinOperator.cleanUpInputFileChangedOp()
|
void |
Operator.cleanUpInputFileChangedOp()
|
void |
FetchTask.clearFetch()
Clear the Fetch Operator. |
void |
FetchOperator.clearFetchContext()
Clear the context, if anything needs to be done. |
void |
SkewJoinHandler.close(boolean abort)
|
void |
Operator.close(boolean abort)
|
void |
ScriptOperator.close(boolean abort)
|
void |
MapJoinOperator.closeOp(boolean abort)
|
void |
CommonJoinOperator.closeOp(boolean abort)
All done. |
void |
TableScanOperator.closeOp(boolean abort)
|
void |
HashTableSinkOperator.closeOp(boolean abort)
|
protected void |
PTFOperator.closeOp(boolean abort)
|
void |
MapOperator.closeOp(boolean abort)
Close extra child operators that are initialized but not executed. |
void |
SMBMapJoinOperator.closeOp(boolean abort)
|
protected void |
UDTFOperator.closeOp(boolean abort)
|
void |
HashTableDummyOperator.closeOp(boolean abort)
|
protected void |
DemuxOperator.closeOp(boolean abort)
|
void |
JoinOperator.closeOp(boolean abort)
All done. |
protected void |
Operator.closeOp(boolean abort)
Operator specific close routine. |
void |
LimitOperator.closeOp(boolean abort)
|
protected void |
ReduceSinkOperator.closeOp(boolean abort)
|
void |
GroupByOperator.closeOp(boolean abort)
We need to forward all the aggregations to children. |
void |
FileSinkOperator.closeOp(boolean abort)
|
protected void |
MuxOperator.closeOp(boolean abort)
|
void |
FileSinkOperator.FSPaths.closeWriters(boolean abort)
|
Integer |
ExprNodeGenericFuncEvaluator.compare(Object row)
If the genericUDF is a base comparison, it returns an integer based on the result of comparing the two sides of the UDF, like the compareTo method in Comparable. |
static ArrayList<Object> |
JoinUtil.computeKeys(Object row,
List<ExprNodeEvaluator> keyFields,
List<ObjectInspector> keyFieldsOI)
Return the key as a standard object. |
static MapJoinKey |
JoinUtil.computeMapJoinKeys(MapJoinKey key,
Object row,
List<ExprNodeEvaluator> keyFields,
List<ObjectInspector> keyFieldsOI)
Return the key as a standard object. |
static Object[] |
JoinUtil.computeMapJoinValues(Object row,
List<ExprNodeEvaluator> valueFields,
List<ObjectInspector> valueFieldsOI,
List<ExprNodeEvaluator> filters,
List<ObjectInspector> filtersOI,
int[] filterMap)
Return the value as a standard object. |
static ArrayList<Object> |
JoinUtil.computeValues(Object row,
List<ExprNodeEvaluator> valueFields,
List<ObjectInspector> valueFieldsOI,
boolean hasFilter)
Return the value as a standard object. |
static String |
ArchiveUtils.conflictingArchiveNameOrNull(Hive db,
Table tbl,
LinkedHashMap<String,String> partSpec)
Determines if one can insert into partition(s), or there's a conflict with archive. |
static void |
PTFOperator.connectLeadLagFunctionsToPartition(PTFDesc ptfDesc,
PTFPartition.PTFPartitionIterator<Object> pItr)
|
static PTFPartition |
PTFPartition.create(HiveConf cfg,
SerDe serDe,
StructObjectInspector inputOI,
StructObjectInspector outputOI)
|
static ArchiveUtils.PartSpecInfo |
ArchiveUtils.PartSpecInfo.create(Table tbl,
Map<String,String> partSpec)
Extract the partial prefix specification from the table and key-value map. |
PTFPartition |
PTFOperator.createFirstPartitionForChain(ObjectInspector oi,
HiveConf hiveConf,
boolean isMapSide)
Create a new Partition. |
Path |
ArchiveUtils.PartSpecInfo.createPath(Table tbl)
Creates the path where partitions matching the prefix should lie in the filesystem. |
void |
CommonJoinOperator.endGroup()
Forward a record of join results. |
void |
DemuxOperator.endGroup()
|
void |
JoinOperator.endGroup()
Forward a record of join results. |
void |
Operator.endGroup()
|
void |
GroupByOperator.endGroup()
|
void |
MuxOperator.endGroup()
|
Object |
ExprNodeEvaluator.evaluate(Object row)
|
protected Object |
ExprNodeEvaluator.evaluate(Object row,
int version)
Evaluate the expression given the row. |
void |
Operator.flush()
|
void |
GroupByOperator.flush()
Forward all aggregations to children. |
protected void |
GroupByOperator.forward(Object[] keys,
GenericUDAFEvaluator.AggregationBuffer[] aggs)
Forward a record of keys and aggregation results. |
void |
DemuxOperator.forward(Object row,
ObjectInspector rowInspector)
|
protected void |
Operator.forward(Object row,
ObjectInspector rowInspector)
|
void |
MuxOperator.forward(Object row,
ObjectInspector rowInspector)
|
void |
UDTFOperator.forwardUDTFOutput(Object o)
forwardUDTFOutput is typically called indirectly by the GenericUDTF when the GenericUDTF has generated output rows that should be passed on to the next operator(s) in the DAG. |
void |
MapJoinOperator.generateMapMetaData()
|
static ExprNodeEvaluator |
ExprNodeEvaluatorFactory.get(ExprNodeDesc desc)
|
static int |
ArchiveUtils.getArchivingLevel(Partition p)
Returns the archiving level, i.e., how many fields were set in the partial specification that ARCHIVE was run for. |
Object |
PTFPartition.getAt(int i)
|
protected ArrayList<Object> |
CommonJoinOperator.getFilteredValue(byte alias,
Object row)
|
static List<LinkedHashMap<String,String>> |
Utilities.getFullDPSpecs(Configuration conf,
DynamicPartitionCtx dpCtx)
Construct a list of full partition spec from Dynamic Partition Context and the directory names corresponding to these dynamic partitions. |
URI |
ArchiveUtils.HarPathHelper.getHarUri(URI original,
HadoopShims shim)
|
String |
ArchiveUtils.PartSpecInfo.getName()
Generates a name for the prefix partial partition specification. |
static List<ObjectInspector>[] |
JoinUtil.getObjectInspectorsFromEvaluators(List<ExprNodeEvaluator>[] exprEntries,
ObjectInspector[] inputObjInspector,
int posBigTableAlias,
int tagLen)
|
ObjectInspector |
FetchOperator.getOutputObjectInspector()
Returns the output ObjectInspector; never null. |
static String |
ArchiveUtils.getPartialName(Partition p,
int level)
Get a prefix of the given partition's string representation. |
static PartitionDesc |
Utilities.getPartitionDesc(Partition part)
|
static PartitionDesc |
Utilities.getPartitionDescFromTableDesc(TableDesc tblDesc,
Partition part)
|
static RowContainer<List<Object>> |
JoinUtil.getRowContainer(Configuration hconf,
List<ObjectInspector> structFieldObjectInspectors,
Byte alias,
int containerSize,
TableDesc[] spillTableDesc,
JoinDesc conf,
boolean noFilter,
Reporter reporter)
|
void |
SkewJoinHandler.handleSkew(int tag)
|
protected static ObjectInspector[] |
Operator.initEvaluators(ExprNodeEvaluator[] evals,
int start,
int length,
ObjectInspector rowInspector)
Initialize an array of ExprNodeEvaluator from start, for specified length and return the result ObjectInspectors. |
protected static ObjectInspector[] |
Operator.initEvaluators(ExprNodeEvaluator[] evals,
ObjectInspector rowInspector)
Initialize an array of ExprNodeEvaluator and return the result ObjectInspectors. |
protected static StructObjectInspector |
ReduceSinkOperator.initEvaluatorsAndReturnStruct(ExprNodeEvaluator[] evals,
List<List<Integer>> distinctColIndices,
List<String> outputColNames,
int length,
ObjectInspector rowInspector)
Initializes array of ExprNodeEvaluator. |
protected static StructObjectInspector |
Operator.initEvaluatorsAndReturnStruct(ExprNodeEvaluator[] evals,
List<String> outputColName,
ObjectInspector rowInspector)
Initialize an array of ExprNodeEvaluator and put the return values into a StructObjectInspector with integer field names. |
void |
Operator.initialize(Configuration hconf,
ObjectInspector[] inputOIs)
Initializes operators only if all parents have been initialized. |
protected void |
Operator.initialize(Configuration hconf,
ObjectInspector inputOI,
int parentId)
Collects all the parent's output object inspectors and calls actual initialization method. |
ObjectInspector |
ExprNodeColumnEvaluator.initialize(ObjectInspector rowInspector)
|
abstract ObjectInspector |
ExprNodeEvaluator.initialize(ObjectInspector rowInspector)
Initialize should be called once and only once. |
ObjectInspector |
ExprNodeGenericFuncEvaluator.initialize(ObjectInspector rowInspector)
|
ObjectInspector |
ExprNodeNullEvaluator.initialize(ObjectInspector rowInspector)
|
ObjectInspector |
ExprNodeFieldEvaluator.initialize(ObjectInspector rowInspector)
|
ObjectInspector |
ExprNodeEvaluatorRef.initialize(ObjectInspector rowInspector)
|
ObjectInspector |
ExprNodeEvaluatorHead.initialize(ObjectInspector rowInspector)
|
ObjectInspector |
ExprNodeConstantEvaluator.initialize(ObjectInspector rowInspector)
|
void |
MapOperator.initializeAsRoot(Configuration hconf,
MapWork mapWork)
Initializes this map op as the root of the tree. |
protected void |
DemuxOperator.initializeChildren(Configuration hconf)
|
protected void |
Operator.initializeChildren(Configuration hconf)
Calls initialize on each of the children with outputObjectInspector as the output row format. |
protected void |
MuxOperator.initializeChildren(Configuration hconf)
Calls initialize on each of the children with outputObjectInspector as the output row format. |
void |
SMBMapJoinOperator.initializeLocalWork(Configuration hconf)
|
void |
Operator.initializeLocalWork(Configuration hconf)
|
void |
SMBMapJoinOperator.initializeMapredLocalWork(MapJoinDesc mjConf,
Configuration hconf,
MapredLocalWork localWork,
org.apache.commons.logging.Log l4j)
|
protected void |
MapJoinOperator.initializeOp(Configuration hconf)
|
protected void |
CommonJoinOperator.initializeOp(Configuration hconf)
|
protected void |
LateralViewJoinOperator.initializeOp(Configuration hconf)
|
protected void |
TableScanOperator.initializeOp(Configuration hconf)
|
protected void |
HashTableSinkOperator.initializeOp(Configuration hconf)
|
protected void |
PTFOperator.initializeOp(Configuration jobConf)
|
void |
MapOperator.initializeOp(Configuration hconf)
|
protected void |
SMBMapJoinOperator.initializeOp(Configuration hconf)
|
protected void |
UDTFOperator.initializeOp(Configuration hconf)
|
protected void |
HashTableDummyOperator.initializeOp(Configuration hconf)
|
protected void |
UnionOperator.initializeOp(Configuration hconf)
UnionOperator will transform the input rows if the inputObjInspectors from different parents are different. |
protected void |
DemuxOperator.initializeOp(Configuration hconf)
|
protected void |
JoinOperator.initializeOp(Configuration hconf)
|
protected void |
Operator.initializeOp(Configuration hconf)
Operator specific initialization. |
protected void |
SelectOperator.initializeOp(Configuration hconf)
|
protected void |
ExtractOperator.initializeOp(Configuration hconf)
|
protected void |
AbstractMapJoinOperator.initializeOp(Configuration hconf)
|
protected void |
LimitOperator.initializeOp(Configuration hconf)
|
protected void |
ReduceSinkOperator.initializeOp(Configuration hconf)
|
protected void |
CollectOperator.initializeOp(Configuration hconf)
|
protected void |
ScriptOperator.initializeOp(Configuration hconf)
|
protected void |
GroupByOperator.initializeOp(Configuration hconf)
|
protected void |
FilterOperator.initializeOp(Configuration hconf)
|
protected void |
FileSinkOperator.initializeOp(Configuration hconf)
|
protected void |
MuxOperator.initializeOp(Configuration hconf)
|
protected void |
ListSinkOperator.initializeOp(Configuration hconf)
|
protected void |
DummyStoreOperator.initializeOp(Configuration hconf)
|
static Object |
FunctionRegistry.invoke(Method m,
Object thisObject,
Object... arguments)
|
protected static short |
JoinUtil.isFiltered(Object row,
List<ExprNodeEvaluator> filters,
List<ObjectInspector> ois,
int[] filterMap)
Returns a filter tag (a short bitmask) indicating which filters the row does not pass. |
PTFPartition.PTFPartitionIterator<Object> |
PTFPartition.iterator()
|
void |
Operator.jobClose(Configuration conf,
boolean success,
JobCloseFeedBack feedBack)
Unlike other operator interfaces which are called from map or reduce task, jobClose is called from the jobclient side once the job has completed. |
void |
JoinOperator.jobCloseOp(Configuration hconf,
boolean success,
JobCloseFeedBack feedBack)
|
void |
Operator.jobCloseOp(Configuration conf,
boolean success,
JobCloseFeedBack feedBack)
|
void |
FileSinkOperator.jobCloseOp(Configuration hconf,
boolean success,
JobCloseFeedBack feedBack)
|
T |
PTFPartition.PTFPartitionIterator.lag(int amt)
|
T |
PTFPartition.PTFPartitionIterator.lead(int amt)
|
static void |
Utilities.mvFileToFinalPath(String specPath,
Configuration hconf,
boolean success,
org.apache.commons.logging.Log log,
DynamicPartitionCtx dpCtx,
FileSinkDesc conf,
Reporter reporter)
|
protected GenericUDAFEvaluator.AggregationBuffer[] |
GroupByOperator.newAggregations()
|
static int |
JoinUtil.populateJoinKeyValue(List<ExprNodeEvaluator>[] outMap,
Map<Byte,List<ExprNodeDesc>> inputMap,
Byte[] order,
int posBigTableAlias)
|
static int |
JoinUtil.populateJoinKeyValue(List<ExprNodeEvaluator>[] outMap,
Map<Byte,List<ExprNodeDesc>> inputMap,
int posBigTableAlias)
|
Object |
MuxOperator.Handler.process(Object row)
|
void |
Operator.process(Object row,
int tag)
Process the row. |
void |
MapOperator.process(Writable value)
|
void |
Operator.processGroup(int tag)
|
void |
MuxOperator.processGroup(int tag)
|
protected void |
PTFOperator.processInputPartition()
|
protected void |
PTFOperator.processMapFunction()
|
void |
MapJoinOperator.processOp(Object row,
int tag)
|
void |
LateralViewJoinOperator.processOp(Object row,
int tag)
An important assumption for processOp() is that for a given row from the TS, the LVJ will first get the row from the left select operator, followed by all the corresponding rows from the UDTF operator. |
void |
TableScanOperator.processOp(Object row,
int tag)
Other than gathering statistics for the ANALYZE command, the table scan operator simply forwards the row. |
void |
HashTableSinkOperator.processOp(Object row,
int tag)
|
void |
PTFOperator.processOp(Object row,
int tag)
|
void |
MapOperator.processOp(Object row,
int tag)
|
void |
SMBMapJoinOperator.processOp(Object row,
int tag)
|
void |
UDTFOperator.processOp(Object row,
int tag)
|
void |
HashTableDummyOperator.processOp(Object row,
int tag)
|
void |
UnionOperator.processOp(Object row,
int tag)
|
void |
DemuxOperator.processOp(Object row,
int tag)
|
void |
JoinOperator.processOp(Object row,
int tag)
|
abstract void |
Operator.processOp(Object row,
int tag)
Process the row. |
void |
SelectOperator.processOp(Object row,
int tag)
|
void |
ExtractOperator.processOp(Object row,
int tag)
|
void |
LimitOperator.processOp(Object row,
int tag)
|
void |
ReduceSinkOperator.processOp(Object row,
int tag)
|
void |
LateralViewForwardOperator.processOp(Object row,
int tag)
|
void |
CollectOperator.processOp(Object row,
int tag)
|
void |
ScriptOperator.processOp(Object row,
int tag)
|
void |
GroupByOperator.processOp(Object row,
int tag)
|
void |
ForwardOperator.processOp(Object row,
int tag)
|
void |
FilterOperator.processOp(Object row,
int tag)
|
void |
FileSinkOperator.processOp(Object row,
int tag)
|
void |
MuxOperator.processOp(Object row,
int tag)
|
void |
ListSinkOperator.processOp(Object row,
int tag)
|
void |
DummyStoreOperator.processOp(Object row,
int tag)
|
boolean |
FetchOperator.pushRow()
Get the next row and push it down to the operator tree. |
protected void |
FetchOperator.pushRow(InspectableObject row)
|
protected void |
PTFOperator.reconstructQueryDef(HiveConf hiveConf)
Initialize the visitor to use the QueryDefDeserializer; use the order defined in QueryDefWalker to visit the QueryDef. |
static void |
Utilities.rename(FileSystem fs,
Path src,
Path dst)
Rename src to dst, or, if dst already exists, move the files in src to dst. |
static void |
Utilities.renameOrMoveFiles(FileSystem fs,
Path src,
Path dst)
Rename src to dst, or, if dst already exists, move the files in src to dst. |
void |
PTFPartition.reset()
|
void |
PTFPartition.PTFPartitionIterator.reset()
|
protected void |
GroupByOperator.resetAggregations(GenericUDAFEvaluator.AggregationBuffer[] aggs)
|
Object |
PTFPartition.PTFPartitionIterator.resetToIndex(int idx)
|
void |
MapOperator.setChildren(Configuration hconf)
|
protected void |
PTFOperator.setupKeysWrapper(ObjectInspector inputOI)
|
int |
DDLTask.showColumns(Hive db,
ShowColumnsDesc showCols)
|
void |
CommonJoinOperator.startGroup()
|
void |
DemuxOperator.startGroup()
|
void |
Operator.startGroup()
|
void |
GroupByOperator.startGroup()
|
void |
MuxOperator.startGroup()
|
static void |
FunctionRegistry.unregisterTemporaryUDF(String functionName)
|
protected void |
GroupByOperator.updateAggregations(GenericUDAFEvaluator.AggregationBuffer[] aggs,
Object row,
ObjectInspector rowInspector,
boolean hashAggr,
boolean newEntryForHashAggr,
Object[][] lastInvoke)
|
Constructors in org.apache.hadoop.hive.ql.exec that throw HiveException | |
---|---|
ArchiveUtils.HarPathHelper(HiveConf hconf,
URI archive,
URI originalBase)
Creates helper for archive. |
|
ExprNodeFieldEvaluator(ExprNodeFieldDesc desc)
|
|
ExprNodeGenericFuncEvaluator(ExprNodeGenericFuncDesc expr)
|
|
MuxOperator.Handler(ObjectInspector inputObjInspector,
List<ExprNodeDesc> keyCols,
List<ExprNodeDesc> valueCols,
List<String> outputKeyColumnNames,
List<String> outputValueColumnNames,
Integer tag)
|
|
PTFPartition(HiveConf cfg,
SerDe serDe,
StructObjectInspector inputOI,
StructObjectInspector outputOI)
|
|
SecureCmdDoAs(HiveConf conf)
|
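
The _evaluate/initialize/evaluate entries above share one lifecycle: ExprNodeEvaluatorFactory.get builds an evaluator from an ExprNodeDesc, initialize is called once with the input row's ObjectInspector, and evaluate is called per row and may throw HiveException. A sketch, under the assumption that a constant evaluator ignores the row inspector:

```java
import org.apache.hadoop.hive.ql.exec.ExprNodeEvaluator;
import org.apache.hadoop.hive.ql.exec.ExprNodeEvaluatorFactory;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.plan.ExprNodeConstantDesc;
import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;

public class EvaluatorSketch {
  public static void main(String[] args) throws HiveException {
    // A trivial constant expression; real plans build richer ExprNodeDesc trees.
    ExprNodeEvaluator eval = ExprNodeEvaluatorFactory.get(new ExprNodeConstantDesc(42));

    // initialize should be called once and only once (see ExprNodeEvaluator.initialize).
    // Passing null assumes a constant evaluator ignores the row inspector.
    ObjectInspector outputOI = eval.initialize(null);

    // evaluate may throw HiveException; a constant needs no actual row.
    Object value = eval.evaluate(null);
    System.out.println(outputOI.getTypeName() + " = " + value);
  }
}
```
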
Uses of HiveException in org.apache.hadoop.hive.ql.exec.mapjoin |
---|
Subclasses of HiveException in org.apache.hadoop.hive.ql.exec.mapjoin | |
---|---|
class |
MapJoinMemoryExhaustionException
|
Uses of HiveException in org.apache.hadoop.hive.ql.exec.mr |
---|
Methods in org.apache.hadoop.hive.ql.exec.mr that throw HiveException | |
---|---|
static void |
ExecDriver.main(String[] args)
|
Constructors in org.apache.hadoop.hive.ql.exec.mr that throw HiveException | |
---|---|
ExecDriver(MapredWork plan,
JobConf job,
boolean isSilent)
Constructor/Initialization for invocation as independent utility. |
|
MapredLocalTask(MapredLocalWork plan,
JobConf job,
boolean isSilent)
|
|
MapRedTask(MapredWork plan,
JobConf job,
boolean isSilent)
|
Uses of HiveException in org.apache.hadoop.hive.ql.exec.persistence |
---|
Methods in org.apache.hadoop.hive.ql.exec.persistence that throw HiveException | |
---|---|
void |
PTFRowContainer.add(Row t)
|
abstract void |
AbstractRowContainer.add(ROW t)
|
void |
RowContainer.add(ROW t)
|
abstract void |
AbstractRowContainer.clear()
Remove all elements in the RowContainer. |
void |
PTFRowContainer.clear()
|
void |
RowContainer.clear()
Remove all elements in the RowContainer. |
void |
PTFRowContainer.close()
|
protected void |
RowContainer.close()
|
void |
RowContainer.copyToDFSDirecory(FileSystem destFs,
Path destPath)
|
abstract ROW |
AbstractRowContainer.first()
|
Row |
PTFRowContainer.first()
|
ROW |
RowContainer.first()
|
Row |
PTFRowContainer.getAt(int rowIdx)
|
MapJoinTableContainer |
MapJoinTableContainerSerDe.load(ObjectInputStream in)
|
abstract ROW |
AbstractRowContainer.next()
|
Row |
PTFRowContainer.next()
|
ROW |
RowContainer.next()
|
protected boolean |
RowContainer.nextBlock(int readIntoOffset)
|
void |
MapJoinTableContainerSerDe.persist(ObjectOutputStream out,
MapJoinTableContainer tableContainer)
|
protected void |
RowContainer.setupWriter()
|
void |
TestPTFRowContainer.testLargeBlockSize()
|
void |
TestPTFRowContainer.testSmallBlockSize()
|
Constructors in org.apache.hadoop.hive.ql.exec.persistence that throw HiveException | |
---|---|
PTFRowContainer(int bs,
Configuration jc,
Reporter reporter)
|
|
RowContainer(Configuration jc,
Reporter reporter)
|
|
RowContainer(int bs,
Configuration jc,
Reporter reporter)
|
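
The add/first/next/clear entries above form the RowContainer scan protocol: append rows, then iterate from first() until next() returns null. Each call may throw HiveException because the container can spill to disk. A sketch that assumes the container stays in memory (spilling would additionally require a SerDe to be configured) and that a null Reporter is acceptable:

```java
import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.ql.exec.persistence.RowContainer;
import org.apache.hadoop.hive.ql.metadata.HiveException;

public class RowContainerSketch {
  public static void main(String[] args) throws HiveException {
    // bs is the in-memory block size; null Reporter is an assumption for standalone use.
    RowContainer<List<Object>> rc =
        new RowContainer<List<Object>>(1024, new HiveConf(), null);

    rc.add(new ArrayList<Object>(Arrays.<Object>asList("a", 1)));
    rc.add(new ArrayList<Object>(Arrays.<Object>asList("b", 2)));

    // Scan protocol: first() positions at the start, next() returns null at the end.
    for (List<Object> row = rc.first(); row != null; row = rc.next()) {
      System.out.println(row);
    }
    rc.clear(); // remove all rows (and any spill files)
  }
}
```
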
Uses of HiveException in org.apache.hadoop.hive.ql.index |
---|
Methods in org.apache.hadoop.hive.ql.index that throw HiveException | |
---|---|
void |
HiveIndexHandler.analyzeIndexDefinition(Table baseTable,
Index index,
Table indexTable)
Requests that the handler validate an index definition and fill in additional information about its stored representation. |
void |
AggregateIndexHandler.analyzeIndexDefinition(Table baseTable,
Index idx,
Table indexTable)
|
boolean |
HiveIndexResult.contains(FileSplit split)
|
List<Task<?>> |
HiveIndexHandler.generateIndexBuildTaskList(Table baseTbl,
Index index,
List<Partition> indexTblPartitions,
List<Partition> baseTblPartitions,
Table indexTbl,
Set<ReadEntity> inputs,
Set<WriteEntity> outputs)
Requests that the handler generate a plan for building the index; the plan should read the base table and write out the index representation. |
List<Task<?>> |
TableBasedIndexHandler.generateIndexBuildTaskList(Table baseTbl,
Index index,
List<Partition> indexTblPartitions,
List<Partition> baseTblPartitions,
Table indexTbl,
Set<ReadEntity> inputs,
Set<WriteEntity> outputs)
|
protected abstract Task<?> |
TableBasedIndexHandler.getIndexBuilderMapRedTask(Set<ReadEntity> inputs,
Set<WriteEntity> outputs,
List<FieldSchema> indexField,
boolean partitioned,
PartitionDesc indexTblPartDesc,
String indexTableName,
PartitionDesc baseTablePartDesc,
String baseTableName,
String dbName)
|
Constructors in org.apache.hadoop.hive.ql.index that throw HiveException | |
---|---|
HiveIndexResult(List<String> indexFiles,
JobConf conf)
|
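
A sketch of consulting HiveIndexResult from an input format, using the constructor and contains(FileSplit) signatures above; the index output path is hypothetical:

```java
import java.util.Arrays;
import java.util.List;

import org.apache.hadoop.hive.ql.index.HiveIndexResult;
import org.apache.hadoop.mapred.FileSplit;
import org.apache.hadoop.mapred.JobConf;

public class IndexResultSketch {
  // True when the split overlaps a block selected by the index query output.
  static boolean splitNeeded(FileSplit split) throws Exception {
    List<String> indexFiles = Arrays.asList("/tmp/hive-index-output"); // hypothetical path
    HiveIndexResult result = new HiveIndexResult(indexFiles, new JobConf());
    return result.contains(split);
  }
}
```
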
Uses of HiveException in org.apache.hadoop.hive.ql.index.bitmap |
---|
Methods in org.apache.hadoop.hive.ql.index.bitmap that throw HiveException | |
---|---|
void |
BitmapIndexHandler.analyzeIndexDefinition(Table baseTable,
Index index,
Table indexTable)
|
protected Task<?> |
BitmapIndexHandler.getIndexBuilderMapRedTask(Set<ReadEntity> inputs,
Set<WriteEntity> outputs,
List<FieldSchema> indexField,
boolean partitioned,
PartitionDesc indexTblPartDesc,
String indexTableName,
PartitionDesc baseTablePartDesc,
String baseTableName,
String dbName)
|
Uses of HiveException in org.apache.hadoop.hive.ql.index.compact |
---|
Methods in org.apache.hadoop.hive.ql.index.compact that throw HiveException | |
---|---|
void |
CompactIndexHandler.analyzeIndexDefinition(Table baseTable,
Index index,
Table indexTable)
|
protected Task<?> |
CompactIndexHandler.getIndexBuilderMapRedTask(Set<ReadEntity> inputs,
Set<WriteEntity> outputs,
List<FieldSchema> indexField,
boolean partitioned,
PartitionDesc indexTblPartDesc,
String indexTableName,
PartitionDesc baseTablePartDesc,
String baseTableName,
String dbName)
|
Uses of HiveException in org.apache.hadoop.hive.ql.io |
---|
Methods in org.apache.hadoop.hive.ql.io that throw HiveException | |
---|---|
static boolean |
HiveFileFormatUtils.checkInputFormat(FileSystem fs,
HiveConf conf,
Class<? extends InputFormat> inputFormatCls,
ArrayList<FileStatus> files)
Checks whether the files are in the same format as the given input format. |
static FileSinkOperator.RecordWriter |
HiveFileFormatUtils.getHiveRecordWriter(JobConf jc,
TableDesc tableInfo,
Class<? extends Writable> outputClass,
FileSinkDesc conf,
Path outPath,
Reporter reporter)
|
static FileSinkOperator.RecordWriter |
HiveFileFormatUtils.getRecordWriter(JobConf jc,
HiveOutputFormat<?,?> hiveOutputFormat,
Class<? extends Writable> valueClass,
boolean isCompressed,
Properties tableProp,
Path outPath,
Reporter reporter)
|
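
A sketch of the checkInputFormat call above, with a hypothetical data directory; the method throws HiveException when a file cannot be opened as the given format:

```java
import java.util.ArrayList;
import java.util.Arrays;

import org.apache.hadoop.fs.FileStatus;
import org.apache.hadoop.fs.FileSystem;
import org.apache.hadoop.fs.Path;
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.ql.io.HiveFileFormatUtils;
import org.apache.hadoop.mapred.TextInputFormat;

public class FormatCheckSketch {
  public static void main(String[] args) throws Exception {
    HiveConf conf = new HiveConf();
    FileSystem fs = FileSystem.get(conf);
    // Hypothetical data directory.
    ArrayList<FileStatus> files =
        new ArrayList<FileStatus>(Arrays.asList(fs.listStatus(new Path("/tmp/data"))));
    boolean ok = HiveFileFormatUtils.checkInputFormat(fs, conf, TextInputFormat.class, files);
    System.out.println("matches TextInputFormat: " + ok);
  }
}
```
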
Uses of HiveException in org.apache.hadoop.hive.ql.io.rcfile.merge |
---|
Methods in org.apache.hadoop.hive.ql.io.rcfile.merge that throw HiveException | |
---|---|
static Path |
RCFileMergeMapper.backupOutputPath(FileSystem fs,
Path outpath,
JobConf job)
|
static void |
RCFileMergeMapper.jobClose(String outputPath,
boolean success,
JobConf job,
SessionState.LogHelper console,
DynamicPartitionCtx dynPartCtx,
Reporter reporter)
|
Uses of HiveException in org.apache.hadoop.hive.ql.io.rcfile.truncate |
---|
Methods in org.apache.hadoop.hive.ql.io.rcfile.truncate that throw HiveException | |
---|---|
static Path |
ColumnTruncateMapper.backupOutputPath(FileSystem fs,
Path outpath,
JobConf job)
|
static void |
ColumnTruncateMapper.jobClose(String outputPath,
boolean success,
JobConf job,
SessionState.LogHelper console,
DynamicPartitionCtx dynPartCtx,
Reporter reporter)
|
Uses of HiveException in org.apache.hadoop.hive.ql.lockmgr |
---|
Subclasses of HiveException in org.apache.hadoop.hive.ql.lockmgr | |
---|---|
class |
LockException
Exception from lock manager. |
Uses of HiveException in org.apache.hadoop.hive.ql.metadata |
---|
Subclasses of HiveException in org.apache.hadoop.hive.ql.metadata | |
---|---|
class |
InvalidTableException
Exception thrown when the specified table is not found or is invalid. |
Methods in org.apache.hadoop.hive.ql.metadata that throw HiveException | |
---|---|
void |
Hive.alterDatabase(String dbName,
Database db)
|
void |
Hive.alterIndex(String dbName,
String baseTblName,
String idxName,
Index newIdx)
Updates the existing index metadata with the new metadata. |
void |
Hive.alterPartition(String tblName,
Partition newPart)
Updates the existing partition metadata with the new metadata. |
void |
Hive.alterPartition(String dbName,
String tblName,
Partition newPart)
Updates the existing partition metadata with the new metadata. |
void |
Hive.alterPartitions(String tblName,
List<Partition> newParts)
Updates the existing partitions' metadata with the new metadata. |
void |
Hive.alterTable(String tblName,
Table newTbl)
Updates the existing table metadata with the new metadata. |
void |
Hive.cancelDelegationToken(String tokenStrForm)
|
void |
HiveMetaStoreChecker.checkMetastore(String dbName,
String tableName,
List<? extends Map<String,String>> partitions,
CheckResult result)
Check the metastore for inconsistencies, data missing in either the metastore or on the dfs. |
void |
Table.checkValidity()
|
Table |
Table.copy()
|
protected static void |
Hive.copyFiles(HiveConf conf,
Path srcf,
Path destf,
FileSystem fs)
|
protected void |
Table.copyFiles(Path srcf)
Inserts the specified files into the table. |
void |
Hive.createDatabase(Database db)
Create a Database. |
void |
Hive.createDatabase(Database db,
boolean ifNotExist)
Create a database |
void |
Hive.createIndex(String tableName,
String indexName,
String indexHandlerClass,
List<String> indexedCols,
String indexTblName,
boolean deferredRebuild,
String inputFormat,
String outputFormat,
String serde,
String storageHandler,
String location,
Map<String,String> idxProps,
Map<String,String> tblProps,
Map<String,String> serdeProps,
String collItemDelim,
String fieldDelim,
String fieldEscape,
String lineDelim,
String mapKeyDelim,
String indexComment)
|
Partition |
Hive.createPartition(Table tbl,
Map<String,String> partSpec)
Creates a partition. |
Partition |
Hive.createPartition(Table tbl,
Map<String,String> partSpec,
Path location,
Map<String,String> partParams,
String inputFormat,
String outputFormat,
int numBuckets,
List<FieldSchema> cols,
String serializationLib,
Map<String,String> serdeParams,
List<String> bucketCols,
List<Order> sortCols)
Creates a partition |
void |
Hive.createRole(String roleName,
String ownerName)
|
void |
Hive.createTable(String tableName,
List<String> columns,
List<String> partCols,
Class<? extends InputFormat> fileInputFormat,
Class<?> fileOutputFormat)
Creates the table metadata and the directory for the table data. |
void |
Hive.createTable(String tableName,
List<String> columns,
List<String> partCols,
Class<? extends InputFormat> fileInputFormat,
Class<?> fileOutputFormat,
int bucketCount,
List<String> bucketCols)
Creates the table metadata and the directory for the table data. |
void |
Hive.createTable(Table tbl)
Creates the table with the given object. |
void |
Hive.createTable(Table tbl,
boolean ifNotExists)
Creates the table with the given object. |
boolean |
Hive.databaseExists(String dbName)
Query metadata to see if a database with the given name already exists. |
boolean |
Hive.deletePartitionColumnStatistics(String dbName,
String tableName,
String partName,
String colName)
|
boolean |
Hive.deleteTableColumnStatistics(String dbName,
String tableName,
String colName)
|
void |
Hive.dropDatabase(String name)
Drop a database. |
void |
Hive.dropDatabase(String name,
boolean deleteData,
boolean ignoreUnknownDb)
Drop a database |
void |
Hive.dropDatabase(String name,
boolean deleteData,
boolean ignoreUnknownDb,
boolean cascade)
Drop a database |
boolean |
Hive.dropIndex(String db_name,
String tbl_name,
String index_name,
boolean deleteData)
|
boolean |
Hive.dropPartition(String tblName,
List<String> part_vals,
boolean deleteData)
|
boolean |
Hive.dropPartition(String db_name,
String tbl_name,
List<String> part_vals,
boolean deleteData)
|
void |
Hive.dropRole(String roleName)
|
void |
Hive.dropTable(String tableName)
Drops table along with the data in it. |
void |
Hive.dropTable(String dbName,
String tableName)
Drops table along with the data in it. |
void |
Hive.dropTable(String dbName,
String tableName,
boolean deleteData,
boolean ignoreUnknownTab)
Drops the table. |
void |
Hive.exchangeTablePartitions(Map<String,String> partitionSpecs,
String sourceDb,
String sourceTable,
String destDb,
String destinationTableName)
|
PrincipalPrivilegeSet |
Hive.get_privilege_set(HiveObjectType objectType,
String db_name,
String table_name,
List<String> part_values,
String column_name,
String user_name,
List<String> group_names)
|
static Hive |
Hive.get()
|
static Hive |
Hive.get(HiveConf c)
Gets the Hive object for the current thread. |
static Hive |
Hive.get(HiveConf c,
boolean needsRefresh)
Get a connection to the metastore. |
List<String> |
Hive.getAllDatabases()
Get all existing database names. |
List<Index> |
Table.getAllIndexes(short max)
|
Set<Partition> |
Hive.getAllPartitionsForPruner(Table tbl)
Get all the partitions; unlike Hive.getPartitions(Table), does not include authorization information. |
List<String> |
Hive.getAllRoleNames()
Get all existing role names. |
List<String> |
Hive.getAllTables()
Get all table names for the current database. |
List<String> |
Hive.getAllTables(String dbName)
Get all table names for the specified database. |
static HiveAuthenticationProvider |
HiveUtils.getAuthenticator(Configuration conf,
HiveConf.ConfVars authenticatorConfKey)
|
HiveAuthorizationProvider |
DefaultStorageHandler.getAuthorizationProvider()
|
HiveAuthorizationProvider |
HiveStorageHandler.getAuthorizationProvider()
Returns the implementation-specific authorization provider. |
static HiveAuthorizationProvider |
HiveUtils.getAuthorizeProviderManager(Configuration conf,
HiveConf.ConfVars authorizationProviderConfKey,
HiveAuthenticationProvider authenticator)
|
Database |
Hive.getDatabase(String dbName)
Get the database by name. |
Database |
Hive.getDatabaseCurrent()
Get the Database object for the current database. |
List<String> |
Hive.getDatabasesByPattern(String databasePattern)
Get all existing databases that match the given pattern. |
String |
Hive.getDelegationToken(String owner,
String renewer)
|
static List<FieldSchema> |
Hive.getFieldsFromDeserializer(String name,
Deserializer serde)
|
Index |
Hive.getIndex(String qualifiedIndexName)
|
Index |
Hive.getIndex(String baseTableName,
String indexName)
|
Index |
Hive.getIndex(String dbName,
String baseTableName,
String indexName)
|
List<Index> |
Hive.getIndexes(String dbName,
String tblName,
short max)
|
static HiveIndexHandler |
HiveUtils.getIndexHandler(HiveConf conf,
String indexHandlerClass)
|
Class<? extends InputFormat> |
Partition.getInputFormatClass()
|
Class<? extends HiveOutputFormat> |
Partition.getOutputFormatClass()
|
Partition |
Hive.getPartition(Table tbl,
Map<String,String> partSpec,
boolean forceCreate)
|
Partition |
Hive.getPartition(Table tbl,
Map<String,String> partSpec,
boolean forceCreate,
String partPath,
boolean inheritTableSpecs)
Returns partition metadata |
ColumnStatistics |
Hive.getPartitionColumnStatistics(String dbName,
String tableName,
String partName,
String colName)
|
List<String> |
Hive.getPartitionNames(String tblName,
short max)
|
List<String> |
Hive.getPartitionNames(String dbName,
String tblName,
Map<String,String> partSpec,
short max)
|
List<String> |
Hive.getPartitionNames(String dbName,
String tblName,
short max)
|
List<Partition> |
Hive.getPartitions(Table tbl)
Get all the partitions that the table has. |
List<Partition> |
Hive.getPartitions(Table tbl,
Map<String,String> partialPartSpec)
Get all the partitions of the table that match the given partial specification. |
List<Partition> |
Hive.getPartitions(Table tbl,
Map<String,String> partialPartSpec,
short limit)
Get all the partitions of the table that match the given partial specification. |
List<Partition> |
Hive.getPartitionsByFilter(Table tbl,
String filter)
Get a list of Partitions by filter. |
List<Partition> |
Hive.getPartitionsByNames(Table tbl,
List<String> partNames)
Get all partitions of the table that matches the list of given partition names. |
List<Partition> |
Hive.getPartitionsByNames(Table tbl,
Map<String,String> partialPartSpec)
Get all the partitions of the table that match the given partial specification. |
Path[] |
Partition.getPath(Sample s)
|
static HiveStorageHandler |
HiveUtils.getStorageHandler(Configuration conf,
String className)
|
Table |
Hive.getTable(String tableName)
Returns metadata for the table named tableName |
Table |
Hive.getTable(String tableName,
boolean throwException)
Returns metadata for the table named tableName |
Table |
Hive.getTable(String dbName,
String tableName)
Returns metadata of the table |
Table |
Hive.getTable(String dbName,
String tableName,
boolean throwException)
Returns metadata of the table |
ColumnStatistics |
Hive.getTableColumnStatistics(String dbName,
String tableName,
String colName)
|
List<String> |
Hive.getTablesByPattern(String tablePattern)
Returns all existing tables from default database which match the given pattern. |
List<String> |
Hive.getTablesByPattern(String dbName,
String tablePattern)
Returns all existing tables from the specified database which match the given pattern. |
List<String> |
Hive.getTablesForDb(String database,
String tablePattern)
Returns all existing tables from the given database which match the given pattern. |
boolean |
Hive.grantPrivileges(PrivilegeBag privileges)
|
boolean |
Hive.grantRole(String roleName,
String userName,
PrincipalType principalType,
String grantor,
PrincipalType grantorType,
boolean grantOption)
|
boolean |
Table.isValidSpec(Map<String,String> spec)
|
List<Role> |
Hive.listRoles(String userName,
PrincipalType principalType)
|
ArrayList<LinkedHashMap<String,String>> |
Hive.loadDynamicPartitions(Path loadPath,
String tableName,
Map<String,String> partSpec,
boolean replace,
int numDP,
boolean holdDDLTime,
boolean listBucketingEnabled)
Given a source directory name of the load path, load all dynamically generated partitions into the specified table and return a list of strings that represent the dynamic partition paths. |
void |
Hive.loadPartition(Path loadPath,
String tableName,
Map<String,String> partSpec,
boolean replace,
boolean holdDDLTime,
boolean inheritTableSpecs,
boolean isSkewedStoreAsSubdir)
Load a directory into a Hive Table Partition - Alters existing content of the partition with the contents of loadPath. |
void |
Hive.loadTable(Path loadPath,
String tableName,
boolean replace,
boolean holdDDLTime)
Load a directory into a Hive Table. |
Table |
Hive.newTable(String tableName)
|
protected static boolean |
Hive.renameFile(HiveConf conf,
Path srcf,
Path destf,
FileSystem fs,
boolean replace)
|
void |
Hive.renamePartition(Table tbl,
Map<String,String> oldPartSpec,
Partition newPart)
Rename an old partition to a new partition. |
protected void |
Table.replaceFiles(Path srcf)
Replaces the directory corresponding to the table with srcf. |
protected static void |
Hive.replaceFiles(Path srcf,
Path destf,
Path oldPath,
HiveConf conf)
Replaces files in the partition with new data set specified by srcf. |
boolean |
Hive.revokePrivileges(PrivilegeBag privileges)
|
boolean |
Hive.revokeRole(String roleName,
String userName,
PrincipalType principalType)
|
void |
Table.setBucketCols(List<String> bucketCols)
|
void |
Table.setInputFormatClass(String name)
|
void |
Table.setOutputFormatClass(String name)
|
void |
Table.setSkewedColNames(List<String> skewedColNames)
|
void |
Table.setSkewedColValues(List<List<String>> skewedValues)
|
void |
Table.setSkewedInfo(SkewedInfo skewedInfo)
|
void |
Table.setSkewedValueLocationMap(List<String> valList,
String dirName)
|
void |
Partition.setSkewedValueLocationMap(List<String> valList,
String dirName)
|
void |
Table.setSortCols(List<Order> sortOrder)
|
void |
Table.setStoredAsSubDirectories(boolean storedAsSubDirectories)
|
void |
Partition.setValues(Map<String,String> partSpec)
Set the Partition's values. |
List<HiveObjectPrivilege> |
Hive.showPrivilegeGrant(HiveObjectType objectType,
String principalName,
PrincipalType principalType,
String dbName,
String tableName,
List<String> partValues,
String columnName)
|
List<Role> |
Hive.showRoleGrant(String principalName,
PrincipalType principalType)
|
void |
TestHiveMetaStoreChecker.testDataDeletion()
|
void |
TestHiveMetaStoreChecker.testPartitionsCheck()
|
void |
TestHiveMetaStoreChecker.testTableCheck()
|
boolean |
Hive.updatePartitionColumnStatistics(ColumnStatistics statsObj)
|
boolean |
Hive.updateTableColumnStatistics(ColumnStatistics statsObj)
|
void |
Hive.validatePartitionNameCharacters(List<String> partVals)
|
Constructors in org.apache.hadoop.hive.ql.metadata that throw HiveException | |
---|---|
DummyPartition(Table tbl,
String name)
|
|
DummyPartition(Table tbl,
String name,
Map<String,String> partSpec)
|
|
Partition(Table tbl)
create an empty partition. |
|
Partition(Table tbl,
Map<String,String> partSpec,
Path location)
Create partition object with the given info. |
|
Partition(Table tbl,
Partition tp)
|
|
Sample(int num,
int fraction,
Dimension d)
|
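
Most Hive methods in this package follow one pattern: obtain the thread-local client via Hive.get and let HiveException, or its InvalidTableException subclass, signal metastore failures. A sketch with hypothetical database and table names:

```java
import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.ql.metadata.Hive;
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.metadata.InvalidTableException;
import org.apache.hadoop.hive.ql.metadata.Table;

public class MetadataSketch {
  public static void main(String[] args) {
    try {
      Hive db = Hive.get(new HiveConf()); // thread-local metastore client
      if (db.databaseExists("sales")) {   // hypothetical database
        Table t = db.getTable("sales", "orders"); // hypothetical table
        System.out.println(t.getTableName());
      }
    } catch (InvalidTableException e) {
      System.err.println("no such table: " + e.getMessage());
    } catch (HiveException e) {
      System.err.println("metastore error: " + e.getMessage());
    }
  }
}
```
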
Uses of HiveException in org.apache.hadoop.hive.ql.metadata.formatting |
---|
Methods in org.apache.hadoop.hive.ql.metadata.formatting that throw HiveException | |
---|---|
void |
JsonMetaDataFormatter.describeTable(DataOutputStream out,
String colPath,
String tableName,
Table tbl,
Partition part,
List<FieldSchema> cols,
boolean isFormatted,
boolean isExt,
boolean isPretty)
Describe table. |
void |
MetaDataFormatter.describeTable(DataOutputStream out,
String colPath,
String tableName,
Table tbl,
Partition part,
List<FieldSchema> cols,
boolean isFormatted,
boolean isExt,
boolean isPretty)
Describe table. |
void |
JsonMetaDataFormatter.error(OutputStream out,
String msg,
int errorCode,
String sqlState)
Write an error message. |
void |
MetaDataFormatter.error(OutputStream out,
String msg,
int errorCode,
String sqlState)
Write an error message. |
void |
JsonMetaDataFormatter.error(OutputStream out,
String errorMessage,
int errorCode,
String sqlState,
String errorDetail)
|
void |
MetaDataFormatter.error(OutputStream out,
String errorMessage,
int errorCode,
String sqlState,
String errorDetail)
|
void |
JsonMetaDataFormatter.showDatabaseDescription(DataOutputStream out,
String database,
String comment,
String location,
Map<String,String> params)
Show the description of a database |
void |
MetaDataFormatter.showDatabaseDescription(DataOutputStream out,
String database,
String comment,
String location,
Map<String,String> params)
Describe a database. |
void |
JsonMetaDataFormatter.showDatabases(DataOutputStream out,
List<String> databases)
Show a list of databases |
void |
MetaDataFormatter.showDatabases(DataOutputStream out,
List<String> databases)
Show the databases |
void |
JsonMetaDataFormatter.showTablePartitons(DataOutputStream out,
List<String> parts)
Show the table partitions. |
void |
MetaDataFormatter.showTablePartitons(DataOutputStream out,
List<String> parts)
Show the table partitions. |
void |
JsonMetaDataFormatter.showTables(DataOutputStream out,
Set<String> tables)
Show a list of tables. |
void |
MetaDataFormatter.showTables(DataOutputStream out,
Set<String> tables)
Show a list of tables. |
void |
JsonMetaDataFormatter.showTableStatus(DataOutputStream out,
Hive db,
HiveConf conf,
List<Table> tbls,
Map<String,String> part,
Partition par)
|
void |
MetaDataFormatter.showTableStatus(DataOutputStream out,
Hive db,
HiveConf conf,
List<Table> tbls,
Map<String,String> part,
Partition par)
Show the table status. |
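
Both formatters implement the MetaDataFormatter interface, so callers can switch output styles while keeping the same HiveException handling. A sketch, assuming JsonMetaDataFormatter's default constructor and hypothetical database names:

```java
import java.io.DataOutputStream;
import java.util.Arrays;

import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.metadata.formatting.JsonMetaDataFormatter;
import org.apache.hadoop.hive.ql.metadata.formatting.MetaDataFormatter;

public class FormatterSketch {
  public static void main(String[] args) throws HiveException {
    MetaDataFormatter fmt = new JsonMetaDataFormatter();
    DataOutputStream out = new DataOutputStream(System.out);
    fmt.showDatabases(out, Arrays.asList("default", "sales")); // hypothetical names
  }
}
```
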
Uses of HiveException in org.apache.hadoop.hive.ql.optimizer |
---|
Methods in org.apache.hadoop.hive.ql.optimizer that throw HiveException | |
---|---|
static Set<Partition> |
IndexUtils.checkPartitionsCoveredByIndex(TableScanOperator tableScan,
ParseContext pctx,
Map<Table,List<Index>> indexes)
Check the partitions used by the table scan to make sure they also exist in the index table. |
Uses of HiveException in org.apache.hadoop.hive.ql.optimizer.ppr |
---|
Methods in org.apache.hadoop.hive.ql.optimizer.ppr that throw HiveException | |
---|---|
static Object |
PartExprEvalUtils.evalExprWithPart(ExprNodeDesc expr,
LinkedHashMap<String,String> partSpec,
List<VirtualColumn> vcs,
StructObjectInspector rowObjectInspector)
Evaluate expression with partition columns |
static Object |
PartExprEvalUtils.evaluateExprOnPart(ObjectPair<PrimitiveObjectInspector,ExprNodeEvaluator> pair,
Object partColValues)
|
static ObjectPair<PrimitiveObjectInspector,ExprNodeEvaluator> |
PartExprEvalUtils.prepareExpr(ExprNodeDesc expr,
List<String> partNames)
|
static PrunedPartitionList |
PartitionPruner.prune(TableScanOperator ts,
ParseContext parseCtx,
String alias)
Get the partition list for the TS operator that satisfies the partition pruner condition. |
static boolean |
PartitionPruner.prunePartitionNames(List<String> columnNames,
ExprNodeDesc prunerExpr,
String defaultPartitionName,
List<String> partNames)
Prunes partition names to see if they match the prune expression. |
Uses of HiveException in org.apache.hadoop.hive.ql.parse |
---|
Subclasses of HiveException in org.apache.hadoop.hive.ql.parse | |
---|---|
class |
SemanticException
Exception from SemanticAnalyzer. |
Methods in org.apache.hadoop.hive.ql.parse that throw HiveException | |
---|---|
PTFDesc.PTFExpressionDef |
PTFTranslator.buildExpressionDef(PTFDesc.ShapeDetails inpShape,
ASTNode arg)
|
List<Task<? extends Serializable>> |
IndexUpdater.generateUpdateTasks()
|
static ExprNodeEvaluator |
WindowingExprNodeEvaluatorFactory.get(PTFTranslator.LeadLagInfo llInfo,
ExprNodeDesc desc)
|
Hive |
HiveSemanticAnalyzerHookContext.getHive()
|
Hive |
HiveSemanticAnalyzerHookContextImpl.getHive()
|
PrunedPartitionList |
ParseContext.getPrunedPartitions(String alias,
TableScanOperator ts)
|
boolean |
BaseSemanticAnalyzer.isValidPrefixSpec(Table tTable,
Map<String,String> spec)
Checks whether the given specification is a proper specification for a prefix of the partition columns. For a table partitioned by ds, hr, min, valid specifications are (ds='2008-04-08'), (ds='2008-04-08', hr='12'), and (ds='2008-04-08', hr='12', min='30'); an invalid one is, for example, (ds='2008-04-08', min='30'). |
void |
WindowingExprNodeEvaluatorFactory.FindLeadLagFuncExprs.visit(ExprNodeGenericFuncDesc fnExpr)
|
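
Because SemanticException extends HiveException, analysis-time errors can share a catch clause with runtime metadata errors; a minimal sketch with a hypothetical check:

```java
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.parse.SemanticException;

public class SemanticSketch {
  static void analyze(boolean columnKnown) throws SemanticException {
    if (!columnKnown) {
      throw new SemanticException("Invalid column reference"); // hypothetical message
    }
  }

  public static void main(String[] args) {
    try {
      analyze(false);
    } catch (HiveException e) { // also catches SemanticException
      System.err.println(e.getMessage());
    }
  }
}
```
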
Uses of HiveException in org.apache.hadoop.hive.ql.plan |
---|
Methods in org.apache.hadoop.hive.ql.plan that throw HiveException | |
---|---|
protected void |
PTFDeserializer.initialize(PTFDesc.BoundaryDef def,
PTFDesc.ShapeDetails inpShape)
|
protected void |
PTFDeserializer.initialize(PTFDesc.PartitionedTableFunctionDef def)
|
protected void |
PTFDeserializer.initialize(PTFDesc.PTFExpressionDef eDef,
PTFDesc.ShapeDetails inpShape)
|
protected void |
PTFDeserializer.initialize(PTFDesc.PTFQueryInputDef def,
StructObjectInspector OI)
|
protected void |
PTFDeserializer.initialize(PTFDesc.ShapeDetails shp,
StructObjectInspector OI)
|
void |
PTFDeserializer.initializePTFChain(PTFDesc.PartitionedTableFunctionDef tblFnDef)
|
void |
PTFDeserializer.initializeWindowing(PTFDesc.WindowTableFunctionDef def)
|
Constructors in org.apache.hadoop.hive.ql.plan that throw HiveException | |
---|---|
PartitionDesc(Partition part)
|
|
PartitionDesc(Partition part,
TableDesc tblDesc)
|
Uses of HiveException in org.apache.hadoop.hive.ql.security |
---|
Methods in org.apache.hadoop.hive.ql.security that throw HiveException | |
---|---|
void |
DummyHiveMetastoreAuthorizationProvider.authorize(Database db,
Privilege[] readRequiredPriv,
Privilege[] writeRequiredPriv)
|
void |
DummyHiveMetastoreAuthorizationProvider.authorize(Partition part,
Privilege[] readRequiredPriv,
Privilege[] writeRequiredPriv)
|
void |
DummyHiveMetastoreAuthorizationProvider.authorize(Privilege[] readRequiredPriv,
Privilege[] writeRequiredPriv)
|
void |
DummyHiveMetastoreAuthorizationProvider.authorize(Table table,
Partition part,
List<String> columns,
Privilege[] readRequiredPriv,
Privilege[] writeRequiredPriv)
|
void |
DummyHiveMetastoreAuthorizationProvider.authorize(Table table,
Privilege[] readRequiredPriv,
Privilege[] writeRequiredPriv)
|
void |
InjectableDummyAuthenticator.destroy()
|
void |
DummyAuthenticator.destroy()
|
void |
HiveAuthenticationProvider.destroy()
|
void |
HadoopDefaultAuthenticator.destroy()
|
void |
DummyHiveMetastoreAuthorizationProvider.init(Configuration conf)
|
Uses of HiveException in org.apache.hadoop.hive.ql.security.authorization |
---|
Methods in org.apache.hadoop.hive.ql.security.authorization that throw HiveException | |
---|---|
void |
HiveAuthorizationProvider.authorize(Database db,
Privilege[] readRequiredPriv,
Privilege[] writeRequiredPriv)
Authorization privileges against a database object. |
void |
StorageBasedAuthorizationProvider.authorize(Database db,
Privilege[] readRequiredPriv,
Privilege[] writeRequiredPriv)
|
void |
BitSetCheckedAuthorizationProvider.authorize(Database db,
Privilege[] inputRequiredPriv,
Privilege[] outputRequiredPriv)
|
void |
HiveAuthorizationProvider.authorize(Partition part,
Privilege[] readRequiredPriv,
Privilege[] writeRequiredPriv)
Authorization privileges against a hive partition object. |
void |
StorageBasedAuthorizationProvider.authorize(Partition part,
Privilege[] readRequiredPriv,
Privilege[] writeRequiredPriv)
|
void |
BitSetCheckedAuthorizationProvider.authorize(Partition part,
Privilege[] inputRequiredPriv,
Privilege[] outputRequiredPriv)
|
void |
StorageBasedAuthorizationProvider.authorize(Path path,
Privilege[] readRequiredPriv,
Privilege[] writeRequiredPriv)
Authorization privileges against a path. |
void |
HiveAuthorizationProvider.authorize(Privilege[] readRequiredPriv,
Privilege[] writeRequiredPriv)
Authorization user level privileges. |
void |
StorageBasedAuthorizationProvider.authorize(Privilege[] readRequiredPriv,
Privilege[] writeRequiredPriv)
|
void |
BitSetCheckedAuthorizationProvider.authorize(Privilege[] inputRequiredPriv,
Privilege[] outputRequiredPriv)
|
void |
HiveAuthorizationProvider.authorize(Table table,
Partition part,
List<String> columns,
Privilege[] readRequiredPriv,
Privilege[] writeRequiredPriv)
Authorization privileges against a list of columns. |
void |
StorageBasedAuthorizationProvider.authorize(Table table,
Partition part,
List<String> columns,
Privilege[] readRequiredPriv,
Privilege[] writeRequiredPriv)
|
void |
BitSetCheckedAuthorizationProvider.authorize(Table table,
Partition part,
List<String> columns,
Privilege[] inputRequiredPriv,
Privilege[] outputRequiredPriv)
|
void |
HiveAuthorizationProvider.authorize(Table table,
Privilege[] readRequiredPriv,
Privilege[] writeRequiredPriv)
Authorization privileges against a hive table object. |
void |
StorageBasedAuthorizationProvider.authorize(Table table,
Privilege[] readRequiredPriv,
Privilege[] writeRequiredPriv)
|
void |
BitSetCheckedAuthorizationProvider.authorize(Table table,
Privilege[] inputRequiredPriv,
Privilege[] outputRequiredPriv)
|
protected boolean |
BitSetCheckedAuthorizationProvider.authorizePrivileges(PrincipalPrivilegeSet privileges,
Privilege[] inputPriv,
boolean[] inputCheck,
Privilege[] outputPriv,
boolean[] outputCheck)
|
protected boolean |
BitSetCheckedAuthorizationProvider.authorizeUserPriv(Privilege[] inputRequiredPriv,
boolean[] inputCheck,
Privilege[] outputRequiredPriv,
boolean[] outputCheck)
|
PrincipalPrivilegeSet |
HiveAuthorizationProviderBase.HiveProxy.get_privilege_set(HiveObjectType column,
String dbName,
String tableName,
List<String> partValues,
String col,
String userName,
List<String> groupNames)
|
Database |
HiveAuthorizationProviderBase.HiveProxy.getDatabase(String dbName)
|
protected Path |
StorageBasedAuthorizationProvider.getDbLocation(Database db)
|
void |
DefaultHiveAuthorizationProvider.init(Configuration conf)
|
void |
HiveAuthorizationProvider.init(Configuration conf)
|
void |
DefaultHiveMetastoreAuthorizationProvider.init(Configuration conf)
|
void |
StorageBasedAuthorizationProvider.init(Configuration conf)
|
Constructors in org.apache.hadoop.hive.ql.security.authorization that throw HiveException | |
---|---|
AuthorizationPreEventListener(Configuration config)
|
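
A sketch of invoking one of the providers above; it assumes the provider has already been initialized via init(Configuration), and passes null for the write privileges as is common when only reads are checked:

```java
import org.apache.hadoop.hive.ql.metadata.HiveException;
import org.apache.hadoop.hive.ql.metadata.Table;
import org.apache.hadoop.hive.ql.security.authorization.HiveAuthorizationProvider;
import org.apache.hadoop.hive.ql.security.authorization.Privilege;

public class AuthSketch {
  // Throws HiveException (e.g., an authorization failure) when the check does not pass.
  static void checkRead(HiveAuthorizationProvider provider, Table table) throws HiveException {
    provider.authorize(table, new Privilege[] { Privilege.SELECT }, null);
  }
}
```
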
Uses of HiveException in org.apache.hadoop.hive.ql.session |
---|
Methods in org.apache.hadoop.hive.ql.session that throw HiveException | |
---|---|
static CreateTableAutomaticGrant |
CreateTableAutomaticGrant.create(HiveConf conf)
|
Uses of HiveException in org.apache.hadoop.hive.ql.udf |
---|
Methods in org.apache.hadoop.hive.ql.udf that throw HiveException | |
---|---|
Object |
GenericUDFEncode.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFDecode.evaluate(GenericUDF.DeferredObject[] arguments)
|
void |
TestGenericUDFDecode.testDecode()
|
void |
TestGenericUDFEncode.testEncode()
|
void |
TestGenericUDFDecode.verifyDecode(String string,
String charsetName)
|
void |
TestGenericUDFEncode.verifyEncode(String string,
String charsetName)
|
Uses of HiveException in org.apache.hadoop.hive.ql.udf.generic |
---|
Methods in org.apache.hadoop.hive.ql.udf.generic that throw HiveException | |
---|---|
void |
NGramEstimator.add(ArrayList<String> ng)
Adds a new n-gram to the estimation. |
void |
GenericUDAFEvaluator.aggregate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
This function will be called by GroupByOperator when it sees a new input row. |
Object |
GenericUDFConcat.binaryEvaluate(GenericUDF.DeferredObject[] arguments)
|
void |
GenericUDTFJSONTuple.close()
|
void |
GenericUDTFParseUrlTuple.close()
|
void |
GenericUDTFInline.close()
|
void |
GenericUDTFExplode.close()
|
void |
GenericUDTFStack.close()
|
abstract void |
GenericUDTF.close()
Called to notify the UDTF that there are no more rows to process. |
void |
UDTFCollector.collect(Object input)
|
void |
Collector.collect(Object input)
Other classes will call collect() with the data that they have. |
Integer |
GenericUDFBaseCompare.compare(GenericUDF.DeferredObject[] arguments)
|
void |
GenericUDAFAverage.GenericUDAFAverageEvaluatorDouble.doReset(org.apache.hadoop.hive.ql.udf.generic.GenericUDAFAverage.AverageAggregationBuffer<Double> aggregation)
|
void |
GenericUDAFAverage.GenericUDAFAverageEvaluatorDecimal.doReset(org.apache.hadoop.hive.ql.udf.generic.GenericUDAFAverage.AverageAggregationBuffer<HiveDecimal> aggregation)
|
protected abstract void |
GenericUDAFAverage.AbstractGenericUDAFAverageEvaluator.doReset(org.apache.hadoop.hive.ql.udf.generic.GenericUDAFAverage.AverageAggregationBuffer<TYPE> aggregation)
|
Object |
GenericUDAFEvaluator.evaluate(GenericUDAFEvaluator.AggregationBuffer agg)
Called by GroupByOperator to obtain the final aggregation result. |
Object |
GenericUDFTestGetJavaBoolean.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFEvaluateNPE.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
DummyContextUDF.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFTestTranslate.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFTestGetJavaString.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFStruct.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFArrayContains.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFOPEqualOrGreaterThan.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFMapKeys.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFOPAnd.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFEWAHBitmapEmpty.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFBetween.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFLocate.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFOPNotNull.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFLower.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFOPNotEqual.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFSortArray.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFFormatNumber.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFToUnixTimeStamp.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFOPNull.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFPrintf.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFOPNot.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFOPGreaterThan.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFNvl.evaluate(GenericUDF.DeferredObject[] arguments)
|
abstract Object |
GenericUDF.evaluate(GenericUDF.DeferredObject[] arguments)
Evaluate the GenericUDF with the arguments. |
Object |
GenericUDFConcat.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFSentences.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFOPEqual.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFConcatWS.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFLeadLag.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFInFile.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFToBinary.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFArray.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFStringToMap.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFAssertTrue.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFOPEqualOrLessThan.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFHash.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFIn.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFUnion.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFCoalesce.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFSplit.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFMacro.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFMapValues.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFToDate.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFMap.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFElt.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFTimestamp.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFWhen.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFField.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFReflect.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFFromUtcTimestamp.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFOPEqualNS.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFReflect2.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFIf.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFOPLessThan.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFUpper.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFIndex.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFSize.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFTranslate.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFNamedStruct.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFToDecimal.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
AbstractGenericUDFEWAHBitmapBop.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFBridge.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFUnixTimeStamp.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFCase.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFToVarchar.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFOPOr.evaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDFInstr.evaluate(GenericUDF.DeferredObject[] arguments)
|
protected void |
GenericUDTF.forward(Object o)
Passes an output row to the collector. |
Object |
GenericUDF.DeferredObject.get()
|
Object |
GenericUDF.DeferredJavaObject.get()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFFirstValue.GenericUDAFFirstValueEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFLastValue.GenericUDAFLastValueEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFPercentileApprox.GenericUDAFPercentileApproxEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFnGrams.GenericUDAFnGramEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFCollectSet.GenericUDAFMkSetEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFMin.GenericUDAFMinEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFMax.GenericUDAFMaxEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFContextNGrams.GenericUDAFContextNGramEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFVariance.GenericUDAFVarianceEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFEWAHBitmap.GenericUDAFEWAHBitmapEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFCorrelation.GenericUDAFCorrelationEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFAverage.GenericUDAFAverageEvaluatorDouble.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFAverage.GenericUDAFAverageEvaluatorDecimal.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFCount.GenericUDAFCountEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFSum.GenericUDAFSumHiveDecimal.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFSum.GenericUDAFSumDouble.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFSum.GenericUDAFSumLong.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFNTile.GenericUDAFNTileEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFRowNumber.GenericUDAFRowNumberEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFHistogramNumeric.GenericUDAFHistogramNumericEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFLeadLag.GenericUDAFLeadLagEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFRank.GenericUDAFRankEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFComputeStats.GenericUDAFBooleanStatsEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFComputeStats.GenericUDAFLongStatsEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFComputeStats.GenericUDAFDoubleStatsEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFComputeStats.GenericUDAFStringStatsEvaluator.getNewAggregationBuffer()
|
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFComputeStats.GenericUDAFBinaryStatsEvaluator.getNewAggregationBuffer()
|
abstract GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFEvaluator.getNewAggregationBuffer()
Get a new aggregation object (the evaluator lifecycle sketch after this table shows how the buffer is created, filled, merged, and terminated). |
GenericUDAFEvaluator.AggregationBuffer |
GenericUDAFCovariance.GenericUDAFCovarianceEvaluator.getNewAggregationBuffer()
|
protected org.apache.hadoop.hive.ql.udf.generic.GenericUDAFLeadLag.LeadLagBuffer |
GenericUDAFLead.GenericUDAFLeadEvaluator.getNewLLBuffer()
|
protected org.apache.hadoop.hive.ql.udf.generic.GenericUDAFLeadLag.LeadLagBuffer |
GenericUDAFLag.GenericUDAFLagEvaluator.getNewLLBuffer()
|
protected abstract org.apache.hadoop.hive.ql.udf.generic.GenericUDAFLeadLag.LeadLagBuffer |
GenericUDAFLeadLag.GenericUDAFLeadLagEvaluator.getNewLLBuffer()
|
ArrayList<Object[]> |
NGramEstimator.getNGrams()
Returns the final top-k n-grams in a format suitable for returning to Hive. |
protected double[] |
GenericUDAFPercentileApprox.GenericUDAFPercentileApproxEvaluator.getQuantileArray(ConstantObjectInspector quantileOI)
|
protected abstract Object |
GenericUDFLeadLag.getRow(int amt)
|
protected Object |
GenericUDFLeadLag.GenericUDFLead.getRow(int amt)
|
protected Object |
GenericUDFLeadLag.GenericUDFLag.getRow(int amt)
|
ObjectInspector |
GenericUDAFFirstValue.GenericUDAFFirstValueEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFLastValue.GenericUDAFLastValueEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFPercentileApprox.GenericUDAFSinglePercentileApproxEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFPercentileApprox.GenericUDAFMultiplePercentileApproxEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFnGrams.GenericUDAFnGramEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFCollectSet.GenericUDAFMkSetEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFMin.GenericUDAFMinEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFMax.GenericUDAFMaxEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFPercentRank.GenericUDAFPercentRankEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFContextNGrams.GenericUDAFContextNGramEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFVariance.GenericUDAFVarianceEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFCumeDist.GenericUDAFCumeDistEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFEWAHBitmap.GenericUDAFEWAHBitmapEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFCorrelation.GenericUDAFCorrelationEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFAverage.AbstractGenericUDAFAverageEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFCount.GenericUDAFCountEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFBridge.GenericUDAFBridgeEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFSum.GenericUDAFSumHiveDecimal.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFSum.GenericUDAFSumDouble.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFSum.GenericUDAFSumLong.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFNTile.GenericUDAFNTileEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFRowNumber.GenericUDAFRowNumberEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFHistogramNumeric.GenericUDAFHistogramNumericEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFLeadLag.GenericUDAFLeadLagEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFRank.GenericUDAFRankEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFComputeStats.GenericUDAFBooleanStatsEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFComputeStats.GenericUDAFLongStatsEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFComputeStats.GenericUDAFDoubleStatsEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFComputeStats.GenericUDAFStringStatsEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFComputeStats.GenericUDAFBinaryStatsEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
ObjectInspector |
GenericUDAFEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
Initialize the evaluator. |
ObjectInspector |
GenericUDAFCovariance.GenericUDAFCovarianceEvaluator.init(GenericUDAFEvaluator.Mode m,
ObjectInspector[] parameters)
|
void |
NGramEstimator.initialize(int pk,
int ppf,
int pn)
Sets the 'k', 'pf' and 'n' parameters. |
void |
GenericUDAFFirstValue.GenericUDAFFirstValueEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFLastValue.GenericUDAFLastValueEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFPercentileApprox.GenericUDAFPercentileApproxEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFnGrams.GenericUDAFnGramEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFCollectSet.GenericUDAFMkSetEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFMin.GenericUDAFMinEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFMax.GenericUDAFMaxEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFContextNGrams.GenericUDAFContextNGramEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFVariance.GenericUDAFVarianceEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFEWAHBitmap.GenericUDAFEWAHBitmapEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFCorrelation.GenericUDAFCorrelationEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFAverage.AbstractGenericUDAFAverageEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer aggregation,
Object[] parameters)
|
void |
GenericUDAFCount.GenericUDAFCountEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFBridge.GenericUDAFBridgeEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFSum.GenericUDAFSumHiveDecimal.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFSum.GenericUDAFSumDouble.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFSum.GenericUDAFSumLong.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFNTile.GenericUDAFNTileEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFRowNumber.GenericUDAFRowNumberEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFHistogramNumeric.GenericUDAFHistogramNumericEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFLeadLag.GenericUDAFLeadLagEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFRank.GenericUDAFRankEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFComputeStats.GenericUDAFBooleanStatsEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFComputeStats.GenericUDAFLongStatsEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFComputeStats.GenericUDAFDoubleStatsEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFComputeStats.GenericUDAFStringStatsEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDAFComputeStats.GenericUDAFBinaryStatsEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
abstract void |
GenericUDAFEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
Iterate through original data. |
void |
GenericUDAFCovariance.GenericUDAFCovarianceEvaluator.iterate(GenericUDAFEvaluator.AggregationBuffer agg,
Object[] parameters)
|
void |
GenericUDFInFile.load(InputStream is)
Load the file from an InputStream. |
void |
GenericUDAFFirstValue.GenericUDAFFirstValueEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFLastValue.GenericUDAFLastValueEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFPercentileApprox.GenericUDAFPercentileApproxEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFnGrams.GenericUDAFnGramEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFCollectSet.GenericUDAFMkSetEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFMin.GenericUDAFMinEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFMax.GenericUDAFMaxEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFContextNGrams.GenericUDAFContextNGramEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object obj)
|
void |
GenericUDAFVariance.GenericUDAFVarianceEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFEWAHBitmap.GenericUDAFEWAHBitmapEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFCorrelation.GenericUDAFCorrelationEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFAverage.AbstractGenericUDAFAverageEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer aggregation,
Object partial)
|
void |
GenericUDAFCount.GenericUDAFCountEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFBridge.GenericUDAFBridgeEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFSum.GenericUDAFSumHiveDecimal.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFSum.GenericUDAFSumDouble.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFSum.GenericUDAFSumLong.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFNTile.GenericUDAFNTileEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFRowNumber.GenericUDAFRowNumberEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFHistogramNumeric.GenericUDAFHistogramNumericEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFLeadLag.GenericUDAFLeadLagEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFRank.GenericUDAFRankEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFComputeStats.GenericUDAFBooleanStatsEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFComputeStats.GenericUDAFLongStatsEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFComputeStats.GenericUDAFDoubleStatsEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFComputeStats.GenericUDAFStringStatsEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
GenericUDAFComputeStats.GenericUDAFBinaryStatsEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
abstract void |
GenericUDAFEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
Merge with partial aggregation result. |
void |
GenericUDAFCovariance.GenericUDAFCovarianceEvaluator.merge(GenericUDAFEvaluator.AggregationBuffer agg,
Object partial)
|
void |
NGramEstimator.merge(List<Text> other)
Takes a serialized n-gram estimator object created by the serialize() method and merges it with the current n-gram object. |
void |
GenericUDF.DeferredObject.prepare(int version)
|
void |
GenericUDF.DeferredJavaObject.prepare(int version)
|
void |
GenericUDTFJSONTuple.process(Object[] o)
|
void |
GenericUDTFParseUrlTuple.process(Object[] o)
|
void |
GenericUDTFInline.process(Object[] os)
|
void |
GenericUDTFExplode.process(Object[] o)
|
void |
GenericUDTFStack.process(Object[] args)
|
abstract void |
GenericUDTF.process(Object[] args)
Give a set of arguments for the UDTF to process (see the UDTF sketch after this table). |
void |
GenericUDAFFirstValue.GenericUDAFFirstValueEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFLastValue.GenericUDAFLastValueEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFPercentileApprox.GenericUDAFPercentileApproxEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFnGrams.GenericUDAFnGramEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFCollectSet.GenericUDAFMkSetEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFMin.GenericUDAFMinEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFMax.GenericUDAFMaxEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFContextNGrams.GenericUDAFContextNGramEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFVariance.GenericUDAFVarianceEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFEWAHBitmap.GenericUDAFEWAHBitmapEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFCorrelation.GenericUDAFCorrelationEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFAverage.AbstractGenericUDAFAverageEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer aggregation)
|
void |
GenericUDAFCount.GenericUDAFCountEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFBridge.GenericUDAFBridgeEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFSum.GenericUDAFSumHiveDecimal.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFSum.GenericUDAFSumDouble.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFSum.GenericUDAFSumLong.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFNTile.GenericUDAFNTileEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFRowNumber.GenericUDAFRowNumberEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFHistogramNumeric.GenericUDAFHistogramNumericEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFLeadLag.GenericUDAFLeadLagEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFRank.GenericUDAFRankEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFComputeStats.GenericUDAFBooleanStatsEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFComputeStats.GenericUDAFLongStatsEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFComputeStats.GenericUDAFDoubleStatsEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFComputeStats.GenericUDAFStringStatsEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
void |
GenericUDAFComputeStats.GenericUDAFBinaryStatsEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
abstract void |
GenericUDAFEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
Reset the aggregation. |
void |
GenericUDAFCovariance.GenericUDAFCovarianceEvaluator.reset(GenericUDAFEvaluator.AggregationBuffer agg)
|
ArrayList<Text> |
NGramEstimator.serialize()
In preparation for a Hive merge() call, serializes the current n-gram estimator object into an ArrayList of Text objects. |
String |
GenericUDFConcat.stringEvaluate(GenericUDF.DeferredObject[] arguments)
|
Object |
GenericUDAFFirstValue.GenericUDAFFirstValueEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFLastValue.GenericUDAFLastValueEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFPercentileApprox.GenericUDAFSinglePercentileApproxEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFPercentileApprox.GenericUDAFMultiplePercentileApproxEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFnGrams.GenericUDAFnGramEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFCollectSet.GenericUDAFMkSetEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFMin.GenericUDAFMinEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFMax.GenericUDAFMaxEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFPercentRank.GenericUDAFPercentRankEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFCovarianceSample.GenericUDAFCovarianceSampleEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFContextNGrams.GenericUDAFContextNGramEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFVariance.GenericUDAFVarianceEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFCumeDist.GenericUDAFCumeDistEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFStd.GenericUDAFStdEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFEWAHBitmap.GenericUDAFEWAHBitmapEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFStdSample.GenericUDAFStdSampleEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFCorrelation.GenericUDAFCorrelationEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFAverage.AbstractGenericUDAFAverageEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer aggregation)
|
Object |
GenericUDAFCount.GenericUDAFCountEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFBridge.GenericUDAFBridgeEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFSum.GenericUDAFSumHiveDecimal.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFSum.GenericUDAFSumDouble.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFSum.GenericUDAFSumLong.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFNTile.GenericUDAFNTileEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFRowNumber.GenericUDAFRowNumberEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFHistogramNumeric.GenericUDAFHistogramNumericEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFLeadLag.GenericUDAFLeadLagEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFRank.GenericUDAFRankEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFVarianceSample.GenericUDAFVarianceSampleEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFComputeStats.GenericUDAFBooleanStatsEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFComputeStats.GenericUDAFLongStatsEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFComputeStats.GenericUDAFDoubleStatsEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFComputeStats.GenericUDAFStringStatsEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFComputeStats.GenericUDAFBinaryStatsEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
abstract Object |
GenericUDAFEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
Get final aggregation result. |
Object |
GenericUDAFCovariance.GenericUDAFCovarianceEvaluator.terminate(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFFirstValue.GenericUDAFFirstValueEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFLastValue.GenericUDAFLastValueEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFPercentileApprox.GenericUDAFPercentileApproxEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFnGrams.GenericUDAFnGramEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFCollectSet.GenericUDAFMkSetEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFMin.GenericUDAFMinEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFMax.GenericUDAFMaxEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFContextNGrams.GenericUDAFContextNGramEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFVariance.GenericUDAFVarianceEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFEWAHBitmap.GenericUDAFEWAHBitmapEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFCorrelation.GenericUDAFCorrelationEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFAverage.AbstractGenericUDAFAverageEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer aggregation)
|
Object |
GenericUDAFCount.GenericUDAFCountEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFBridge.GenericUDAFBridgeEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFSum.GenericUDAFSumHiveDecimal.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFSum.GenericUDAFSumDouble.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFSum.GenericUDAFSumLong.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFNTile.GenericUDAFNTileEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFRowNumber.GenericUDAFRowNumberEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFHistogramNumeric.GenericUDAFHistogramNumericEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFLeadLag.GenericUDAFLeadLagEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFRank.GenericUDAFRankEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFComputeStats.GenericUDAFBooleanStatsEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFComputeStats.GenericUDAFLongStatsEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFComputeStats.GenericUDAFDoubleStatsEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFComputeStats.GenericUDAFStringStatsEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
Object |
GenericUDAFComputeStats.GenericUDAFBinaryStatsEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
abstract Object |
GenericUDAFEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
Get partial aggregation result. |
Object |
GenericUDAFCovariance.GenericUDAFCovarianceEvaluator.terminatePartial(GenericUDAFEvaluator.AggregationBuffer agg)
|
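The three abstract entry points listed above — GenericUDF.evaluate, GenericUDTF.process with forward, and the GenericUDAFEvaluator lifecycle — share a pattern: each receives deserialized arguments, may throw HiveException, and hands results back through a return value or a collector. The sketches below illustrate each contract. First, a minimal GenericUDF that returns the length of a string argument; the class GenericUDFStrLen and the package example are hypothetical, not Hive built-ins.

    package example;

    import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
    import org.apache.hadoop.hive.ql.exec.UDFArgumentLengthException;
    import org.apache.hadoop.hive.ql.metadata.HiveException;
    import org.apache.hadoop.hive.ql.udf.generic.GenericUDF;
    import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
    import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
    import org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector;
    import org.apache.hadoop.io.IntWritable;

    public class GenericUDFStrLen extends GenericUDF {
      private StringObjectInspector stringOI;
      private final IntWritable result = new IntWritable();

      @Override
      public ObjectInspector initialize(ObjectInspector[] arguments)
          throws UDFArgumentException {
        if (arguments.length != 1) {
          throw new UDFArgumentLengthException("strlen() takes exactly one argument");
        }
        // A production UDF would check the category and type before casting.
        stringOI = (StringObjectInspector) arguments[0];
        return PrimitiveObjectInspectorFactory.writableIntObjectInspector;
      }

      @Override
      public Object evaluate(DeferredObject[] arguments) throws HiveException {
        // DeferredObject.get() may itself throw HiveException; a null
        // argument yields a null result, following the usual SQL convention.
        Object arg = arguments[0].get();
        if (arg == null) {
          return null;
        }
        result.set(stringOI.getPrimitiveJavaObject(arg).length());
        return result;
      }

      @Override
      public String getDisplayString(String[] children) {
        return "strlen(" + children[0] + ")";
      }
    }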
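GenericUDTF.process and GenericUDTF.forward work as a pair: process consumes one input row and calls forward once per output row, and forward passes that row to the collector, throwing HiveException if collection fails. A minimal sketch under the same caveats, assuming a hypothetical tokenizer that splits a comma-delimited string:

    package example;

    import java.util.Arrays;
    import org.apache.hadoop.hive.ql.exec.UDFArgumentException;
    import org.apache.hadoop.hive.ql.metadata.HiveException;
    import org.apache.hadoop.hive.ql.udf.generic.GenericUDTF;
    import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
    import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspectorFactory;
    import org.apache.hadoop.hive.serde2.objectinspector.StructObjectInspector;
    import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
    import org.apache.hadoop.hive.serde2.objectinspector.primitive.StringObjectInspector;

    public class GenericUDTFTokenize extends GenericUDTF {
      private StringObjectInspector inputOI;
      private final Object[] row = new Object[1];

      @Override
      public StructObjectInspector initialize(ObjectInspector[] argOIs)
          throws UDFArgumentException {
        if (argOIs.length != 1) {
          throw new UDFArgumentException("tokenize() takes exactly one argument");
        }
        inputOI = (StringObjectInspector) argOIs[0];
        // The output is a one-column struct; each forward() call emits one row.
        return ObjectInspectorFactory.getStandardStructObjectInspector(
            Arrays.asList("token"),
            Arrays.asList((ObjectInspector)
                PrimitiveObjectInspectorFactory.javaStringObjectInspector));
      }

      @Override
      public void process(Object[] args) throws HiveException {
        String value = inputOI.getPrimitiveJavaObject(args[0]);
        if (value == null) {
          return; // no output rows for a null input
        }
        for (String token : value.split(",")) {
          row[0] = token;
          forward(row); // passes the output row to the collector
        }
      }

      @Override
      public void close() throws HiveException {
        // nothing buffered, so nothing to flush
      }
    }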
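The abstract GenericUDAFEvaluator methods above form a single lifecycle: init fixes input and output ObjectInspectors for the current Mode, getNewAggregationBuffer and reset manage per-group state, iterate folds raw rows into the buffer, terminatePartial and merge ship partial results between map and reduce stages, and terminate produces the final value. The following sketch condenses that lifecycle into a long-valued sum, loosely modeled on the pattern GenericUDAFSum.GenericUDAFSumLong follows; the class LongSumEvaluator and its field names are illustrative, and error handling is reduced to the essentials.

    package example;

    import org.apache.hadoop.hive.ql.metadata.HiveException;
    import org.apache.hadoop.hive.ql.udf.generic.GenericUDAFEvaluator;
    import org.apache.hadoop.hive.serde2.objectinspector.ObjectInspector;
    import org.apache.hadoop.hive.serde2.objectinspector.PrimitiveObjectInspector;
    import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorFactory;
    import org.apache.hadoop.hive.serde2.objectinspector.primitive.PrimitiveObjectInspectorUtils;
    import org.apache.hadoop.io.LongWritable;

    public class LongSumEvaluator extends GenericUDAFEvaluator {

      // The mutable per-group state; one buffer exists per aggregation group.
      static class SumBuffer implements AggregationBuffer {
        long sum;
        boolean empty;
      }

      private PrimitiveObjectInspector inputOI;
      private final LongWritable result = new LongWritable();

      @Override
      public ObjectInspector init(Mode m, ObjectInspector[] parameters)
          throws HiveException {
        super.init(m, parameters);
        // In PARTIAL1/COMPLETE the input is the raw column; in PARTIAL2/FINAL
        // it is the partial aggregation produced by terminatePartial(). For a
        // plain sum both are single longs, so one inspector covers every mode.
        inputOI = (PrimitiveObjectInspector) parameters[0];
        return PrimitiveObjectInspectorFactory.writableLongObjectInspector;
      }

      @Override
      public AggregationBuffer getNewAggregationBuffer() throws HiveException {
        SumBuffer buf = new SumBuffer();
        reset(buf);
        return buf;
      }

      @Override
      public void reset(AggregationBuffer agg) throws HiveException {
        SumBuffer buf = (SumBuffer) agg;
        buf.sum = 0;
        buf.empty = true;
      }

      @Override
      public void iterate(AggregationBuffer agg, Object[] parameters)
          throws HiveException {
        if (parameters[0] != null) {
          SumBuffer buf = (SumBuffer) agg;
          buf.sum += PrimitiveObjectInspectorUtils.getLong(parameters[0], inputOI);
          buf.empty = false;
        }
      }

      @Override
      public Object terminatePartial(AggregationBuffer agg) throws HiveException {
        return terminate(agg); // a long sum's partial and final forms coincide
      }

      @Override
      public void merge(AggregationBuffer agg, Object partial) throws HiveException {
        if (partial != null) {
          SumBuffer buf = (SumBuffer) agg;
          buf.sum += PrimitiveObjectInspectorUtils.getLong(partial, inputOI);
          buf.empty = false;
        }
      }

      @Override
      public Object terminate(AggregationBuffer agg) throws HiveException {
        SumBuffer buf = (SumBuffer) agg;
        if (buf.empty) {
          return null; // SQL aggregates over no rows yield NULL
        }
        result.set(buf.sum);
        return result;
      }
    }

Reusing one LongWritable for the result mirrors the built-in evaluators: it avoids a per-group allocation, at the cost that callers must copy the value before the next terminate call.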
Uses of HiveException in org.apache.hadoop.hive.ql.udf.ptf |
---|
Uses of HiveException in org.apache.hadoop.hive.ql.udf.xml |
---|
Methods in org.apache.hadoop.hive.ql.udf.xml that throw HiveException | |
---|---|
Object |
GenericUDFXPath.evaluate(GenericUDF.DeferredObject[] arguments)
|