Packages that use exprNodeDesc:

| Package |
|---|
| org.apache.hadoop.hive.ql.exec |
| org.apache.hadoop.hive.ql.parse |
| org.apache.hadoop.hive.ql.plan |
Uses of exprNodeDesc in org.apache.hadoop.hive.ql.exec

Methods in org.apache.hadoop.hive.ql.exec with parameters of type exprNodeDesc:

| Modifier and Type | Method | Description |
|---|---|---|
| static ExprNodeEvaluator | ExprNodeEvaluatorFactory.get(exprNodeDesc desc) | |
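ExprNodeEvaluatorFactory.get maps an expression descriptor to an evaluator that can compute the expression's value for a row. As a rough illustration of that factory pattern, here is a minimal self-contained sketch; all class names in it (ExprDesc, ConstantDesc, ColumnDesc, Evaluator) are hypothetical stand-ins, not Hive's real types, and the dispatch shown is an assumption about the pattern, not Hive's actual implementation.

```java
import java.util.Map;

public class Main {
    // Hypothetical stand-ins for the exprNodeDesc hierarchy.
    interface ExprDesc {}
    record ConstantDesc(Object value) implements ExprDesc {}
    record ColumnDesc(String column) implements ExprDesc {}

    // Stand-in for ExprNodeEvaluator: computes a value for one input row.
    interface Evaluator {
        Object evaluate(Map<String, Object> row);
    }

    // Stand-in for ExprNodeEvaluatorFactory.get: dispatch on the
    // descriptor's concrete type and return a matching evaluator.
    static Evaluator get(ExprDesc desc) {
        if (desc instanceof ConstantDesc c) {
            return row -> c.value();              // constants ignore the row
        } else if (desc instanceof ColumnDesc col) {
            return row -> row.get(col.column());  // columns look up the row
        }
        throw new IllegalArgumentException("unsupported descriptor: " + desc);
    }

    public static void main(String[] args) {
        Map<String, Object> row = Map.of("key", 86, "value", "val_86");
        System.out.println(get(new ColumnDesc("key")).evaluate(row));   // 86
        System.out.println(get(new ConstantDesc("x")).evaluate(row));   // x
    }
}
```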
Uses of exprNodeDesc in org.apache.hadoop.hive.ql.parse

Methods in org.apache.hadoop.hive.ql.parse that return exprNodeDesc:

| Modifier and Type | Method | Description |
|---|---|---|
| static exprNodeDesc | TypeCheckProcFactory.DefaultExprProcessor.getFuncExprNodeDesc(String name, exprNodeDesc... children) | Get the exprNodeDesc. |
| static exprNodeDesc | TypeCheckProcFactory.DefaultExprProcessor.getFuncExprNodeDesc(String udfName, List<exprNodeDesc> children) | Creates an ExprNodeDesc for a UDF function given the children (arguments). |
| static exprNodeDesc | TypeCheckProcFactory.processGByExpr(Node nd, Object procCtx) | Performs group-by subexpression elimination. |

Methods in org.apache.hadoop.hive.ql.parse with parameters of type exprNodeDesc:

| Modifier and Type | Method | Description |
|---|---|---|
| static exprNodeDesc | TypeCheckProcFactory.DefaultExprProcessor.getFuncExprNodeDesc(String name, exprNodeDesc... children) | Get the exprNodeDesc. |
| static boolean | PartitionPruner.mightBeUnknown(exprNodeDesc desc) | |

Method parameters in org.apache.hadoop.hive.ql.parse with type arguments of type exprNodeDesc:

| Modifier and Type | Method | Description |
|---|---|---|
| static exprNodeDesc | TypeCheckProcFactory.DefaultExprProcessor.getFuncExprNodeDesc(String udfName, List<exprNodeDesc> children) | Creates an ExprNodeDesc for a UDF function given the children (arguments). |
| static boolean | TypeCheckProcFactory.DefaultExprProcessor.isRedundantConversionFunction(ASTNode expr, boolean isFunction, ArrayList<exprNodeDesc> children) | |
Uses of exprNodeDesc in org.apache.hadoop.hive.ql.plan

Subclasses of exprNodeDesc in org.apache.hadoop.hive.ql.plan:

| Modifier and Type | Class | Description |
|---|---|---|
| class | exprNodeColumnDesc | |
| class | exprNodeConstantDesc | |
| class | exprNodeFieldDesc | |
| class | exprNodeFuncDesc | The reason that we have to store UDFClass as well as UDFMethod is that UDFMethod might be declared in a parent class of UDFClass. |
| class | exprNodeIndexDesc | |
| class | exprNodeNullDesc | |
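The subclasses above (column, constant, field, func, index, null) suggest a small expression-tree algebra, with exprNodeFuncDesc holding an ordered child list (see its getChildren/setChildren below). The following self-contained sketch mimics that composition for a predicate like key > 100; the types Desc, Constant, Column, and Func are hypothetical stand-ins invented for illustration, not Hive's actual classes.

```java
import java.util.List;
import java.util.Map;
import java.util.function.BiFunction;

public class ExprTree {
    // Hypothetical stand-in for the exprNodeDesc base type.
    interface Desc { Object eval(Map<String, Object> row); }

    // Mimics exprNodeConstantDesc: a literal value.
    record Constant(Object value) implements Desc {
        public Object eval(Map<String, Object> row) { return value; }
    }

    // Mimics exprNodeColumnDesc: a reference to a row column.
    record Column(String name) implements Desc {
        public Object eval(Map<String, Object> row) { return row.get(name); }
    }

    // Mimics exprNodeFuncDesc: a function plus an ordered child list.
    record Func(BiFunction<Object, Object, Object> fn, List<Desc> children) implements Desc {
        public Object eval(Map<String, Object> row) {
            return fn.apply(children.get(0).eval(row), children.get(1).eval(row));
        }
    }

    public static void main(String[] args) {
        // Roughly the tree getFuncExprNodeDesc(">", column, constant) would build.
        Desc pred = new Func((a, b) -> ((Integer) a) > ((Integer) b),
                             List.of(new Column("key"), new Constant(100)));
        Map<String, Object> hit = Map.of("key", 238);
        Map<String, Object> miss = Map.of("key", 86);
        System.out.println(pred.eval(hit));   // true
        System.out.println(pred.eval(miss));  // false
    }
}
```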
Methods in org.apache.hadoop.hive.ql.plan that return exprNodeDesc:

| Modifier and Type | Method |
|---|---|
| exprNodeDesc | extractDesc.getCol() |
| exprNodeDesc | exprNodeIndexDesc.getDesc() |
| exprNodeDesc | exprNodeFieldDesc.getDesc() |
| exprNodeDesc | exprNodeIndexDesc.getIndex() |
| exprNodeDesc | filterDesc.getPredicate() |

Methods in org.apache.hadoop.hive.ql.plan that return types with arguments of type exprNodeDesc:

| Modifier and Type | Method |
|---|---|
| ArrayList<exprNodeDesc> | exprNodeFuncDesc.getChildren() |
| ArrayList<exprNodeDesc> | selectDesc.getColList() |
| Map<Byte,ArrayList<exprNodeDesc>> | joinDesc.getExprs() |
| ArrayList<exprNodeDesc> | reduceSinkDesc.getKeyCols() |
| ArrayList<exprNodeDesc> | groupByDesc.getKeys() |
| ArrayList<exprNodeDesc> | aggregationDesc.getParameters() |
| ArrayList<exprNodeDesc> | reduceSinkDesc.getPartitionCols() |
| ArrayList<exprNodeDesc> | reduceSinkDesc.getValueCols() |

Methods in org.apache.hadoop.hive.ql.plan with parameters of type exprNodeDesc:

| Modifier and Type | Method |
|---|---|
| void | extractDesc.setCol(exprNodeDesc col) |
| void | exprNodeIndexDesc.setDesc(exprNodeDesc desc) |
| void | exprNodeFieldDesc.setDesc(exprNodeDesc desc) |
| void | exprNodeIndexDesc.setIndex(exprNodeDesc index) |
| void | filterDesc.setPredicate(exprNodeDesc predicate) |
Method parameters in org.apache.hadoop.hive.ql.plan with type arguments of type exprNodeDesc:

| Modifier and Type | Method | Description |
|---|---|---|
| static List<FieldSchema> | PlanUtils.getFieldSchemasFromColumnList(ArrayList<exprNodeDesc> cols, String fieldPrefix) | Convert the ColumnList to a FieldSchema list. |
| static reduceSinkDesc | PlanUtils.getReduceSinkDesc(ArrayList<exprNodeDesc> keyCols, ArrayList<exprNodeDesc> valueCols, int tag, ArrayList<exprNodeDesc> partitionCols, String order, int numReducers) | Create the reduce sink descriptor. |
| static reduceSinkDesc | PlanUtils.getReduceSinkDesc(ArrayList<exprNodeDesc> keyCols, ArrayList<exprNodeDesc> valueCols, int tag, int numPartitionFields, int numReducers) | Create the reduce sink descriptor. |
| void | exprNodeFuncDesc.setChildren(ArrayList<exprNodeDesc> children) | |
| void | selectDesc.setColList(ArrayList<exprNodeDesc> colList) | |
| void | joinDesc.setExprs(Map<Byte,ArrayList<exprNodeDesc>> exprs) | |
| void | reduceSinkDesc.setKeyCols(ArrayList<exprNodeDesc> keyCols) | |
| void | groupByDesc.setKeys(ArrayList<exprNodeDesc> keys) | |
| void | aggregationDesc.setParameters(ArrayList<exprNodeDesc> parameters) | |
| void | reduceSinkDesc.setPartitionCols(ArrayList<exprNodeDesc> partitionCols) | |
| void | reduceSinkDesc.setValueCols(ArrayList<exprNodeDesc> valueCols) | |
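reduceSinkDesc carries separate key, value, and partition expression lists; the partition columns decide which reducer each row is routed to. The sketch below illustrates that routing idea only: chooseReducer and its hash-modulo scheme are hypothetical, invented for this example, and are not taken from Hive's source.

```java
import java.util.List;
import java.util.Map;
import java.util.Objects;

public class PartitionSketch {
    // Hypothetical illustration of the role of reduceSinkDesc's
    // partitionCols: hash the partition-column values of a row and
    // take the result modulo the reducer count.
    static int chooseReducer(Map<String, Object> row,
                             List<String> partitionCols,
                             int numReducers) {
        Object[] vals = partitionCols.stream().map(row::get).toArray();
        return Math.floorMod(Objects.hash(vals), numReducers);
    }

    public static void main(String[] args) {
        Map<String, Object> r1 = Map.of("key", 86, "value", "val_86");
        Map<String, Object> r2 = Map.of("key", 86, "value", "other");
        // Rows with equal partition-column values land on the same reducer,
        // regardless of their other columns.
        int a = chooseReducer(r1, List.of("key"), 4);
        int b = chooseReducer(r2, List.of("key"), 4);
        System.out.println(a == b);  // true
    }
}
```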
Constructors in org.apache.hadoop.hive.ql.plan with parameters of type exprNodeDesc:

| Constructor |
|---|
| exprNodeFieldDesc(TypeInfo typeInfo, exprNodeDesc desc, String fieldName, Boolean isList) |
| exprNodeIndexDesc(exprNodeDesc desc, exprNodeDesc index) |
| exprNodeIndexDesc(TypeInfo typeInfo, exprNodeDesc desc, exprNodeDesc index) |
| extractDesc(exprNodeDesc col) |
| filterDesc(exprNodeDesc predicate) |