Uses of Class
org.apache.hadoop.hive.ql.plan.ExprNodeDesc

Packages that use ExprNodeDesc
org.apache.hadoop.hive.ql.exec Hive QL execution tasks, operators, functions and other handlers. 
org.apache.hadoop.hive.ql.index   
org.apache.hadoop.hive.ql.index.bitmap   
org.apache.hadoop.hive.ql.index.compact   
org.apache.hadoop.hive.ql.io.sarg   
org.apache.hadoop.hive.ql.metadata   
org.apache.hadoop.hive.ql.optimizer   
org.apache.hadoop.hive.ql.optimizer.correlation   
org.apache.hadoop.hive.ql.optimizer.lineage   
org.apache.hadoop.hive.ql.optimizer.listbucketingpruner   
org.apache.hadoop.hive.ql.optimizer.pcr   
org.apache.hadoop.hive.ql.optimizer.ppr   
org.apache.hadoop.hive.ql.parse   
org.apache.hadoop.hive.ql.plan   
org.apache.hadoop.hive.ql.ppd   
org.apache.hadoop.hive.ql.udf.generic Standard toolkit and framework for generic User-defined functions. 
org.apache.hadoop.hive.ql.udf.ptf   
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.exec
 

Classes in org.apache.hadoop.hive.ql.exec with type parameters of type ExprNodeDesc
 class ExprNodeEvaluator<T extends ExprNodeDesc>
          ExprNodeEvaluator.
 

Fields in org.apache.hadoop.hive.ql.exec declared as ExprNodeDesc
protected  T ExprNodeEvaluator.expr
           
 

Fields in org.apache.hadoop.hive.ql.exec with type parameters of type ExprNodeDesc
protected  Map<String,ExprNodeDesc> Operator.colExprMap
          A map from output column name to input expression.
 

Methods in org.apache.hadoop.hive.ql.exec that return ExprNodeDesc
static ExprNodeDesc Utilities.deserializeExpression(String s, Configuration conf)
           
 

Methods in org.apache.hadoop.hive.ql.exec that return types with arguments of type ExprNodeDesc
 Map<String,ExprNodeDesc> Operator.getColumnExprMap()
          Returns a map from output column name to input expression. Note that currently it returns only key columns for ReduceSink and GroupBy operators.
 

Methods in org.apache.hadoop.hive.ql.exec with parameters of type ExprNodeDesc
static String Utilities.checkJDOPushDown(Table tab, ExprNodeDesc expr, GenericUDF parent)
          Check if the partition pruning expression can be pushed down to JDO filtering.
static ExprNodeEvaluator ExprNodeEvaluatorFactory.get(ExprNodeDesc desc)
           
static boolean FunctionRegistry.isOpAnd(ExprNodeDesc desc)
          Returns whether the exprNodeDesc is a node of "and".
static boolean FunctionRegistry.isOpAndOrNot(ExprNodeDesc desc)
          Returns whether the exprNodeDesc is a node of "and", "or", "not".
static boolean FunctionRegistry.isOpNot(ExprNodeDesc desc)
          Returns whether the exprNodeDesc is a node of "not".
static boolean FunctionRegistry.isOpOr(ExprNodeDesc desc)
          Returns whether the exprNodeDesc is a node of "or".
static boolean FunctionRegistry.isOpPositive(ExprNodeDesc desc)
          Returns whether the exprNodeDesc is a node of "positive".
static boolean FunctionRegistry.isOpPreserveInputName(ExprNodeDesc desc)
          Returns whether the exprNodeDesc can recommend a name for the expression.
static void FunctionRegistry.registerTemporaryMacro(String macroName, ExprNodeDesc body, List<String> colNames, List<TypeInfo> colTypes)
          Registers a temporary macro with the given name, body expression, and column signature.
static String Utilities.serializeExpression(ExprNodeDesc expr)
           
 
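Example (sketch): how these helpers fit together. The ExprNodeColumnDesc/ExprNodeConstantDesc constructors, TypeInfoFactory, and the GenericUDFOPEqual/GenericUDFOPAnd classes used below are not listed on this page and are assumptions; the column and table names are invented.

import java.util.Arrays;
import org.apache.hadoop.conf.Configuration;
import org.apache.hadoop.hive.ql.exec.FunctionRegistry;
import org.apache.hadoop.hive.ql.exec.Utilities;
import org.apache.hadoop.hive.ql.plan.ExprNodeColumnDesc;
import org.apache.hadoop.hive.ql.plan.ExprNodeConstantDesc;
import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;
import org.apache.hadoop.hive.ql.plan.ExprNodeGenericFuncDesc;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDFOPAnd;
import org.apache.hadoop.hive.ql.udf.generic.GenericUDFOPEqual;
import org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory;

public class ExprNodeDescExample {
  public static void main(String[] args) throws Exception {
    // key = 'x' : a column reference compared with a constant (names are made up)
    ExprNodeDesc col = new ExprNodeColumnDesc(TypeInfoFactory.stringTypeInfo, "key", "t", false);
    ExprNodeDesc eq = ExprNodeGenericFuncDesc.newInstance(
        new GenericUDFOPEqual(), Arrays.<ExprNodeDesc>asList(col, new ExprNodeConstantDesc("x")));

    // AND the comparison with itself just to obtain an "and" node
    ExprNodeDesc and = ExprNodeGenericFuncDesc.newInstance(
        new GenericUDFOPAnd(), Arrays.<ExprNodeDesc>asList(eq, eq));

    System.out.println(FunctionRegistry.isOpAnd(and));      // true
    System.out.println(FunctionRegistry.isOpAndOrNot(eq));  // false: plain equality

    // Round-trip the expression through the Utilities (de)serialization listed above
    String serialized = Utilities.serializeExpression(and);
    ExprNodeDesc restored = Utilities.deserializeExpression(serialized, new Configuration());
    System.out.println(restored.getExprString());
  }
}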

Method parameters in org.apache.hadoop.hive.ql.exec with type arguments of type ExprNodeDesc
static
<T extends OperatorDesc>
Operator<T>
OperatorFactory.getAndMakeChild(T conf, RowSchema rwsch, Map<String,ExprNodeDesc> colExprMap, List<Operator<? extends OperatorDesc>> oplist)
          Returns an operator given the conf and a list of parent operators.
static
<T extends OperatorDesc>
Operator<T>
OperatorFactory.getAndMakeChild(T conf, RowSchema rwsch, Map<String,ExprNodeDesc> colExprMap, Operator... oplist)
          Returns an operator given the conf and a list of parent operators.
static int JoinUtil.populateJoinKeyValue(List<ExprNodeEvaluator>[] outMap, Map<Byte,List<ExprNodeDesc>> inputMap, Byte[] order, int posBigTableAlias)
           
static int JoinUtil.populateJoinKeyValue(List<ExprNodeEvaluator>[] outMap, Map<Byte,List<ExprNodeDesc>> inputMap, int posBigTableAlias)
           
 void Operator.setColumnExprMap(Map<String,ExprNodeDesc> colExprMap)
           
 

Constructor parameters in org.apache.hadoop.hive.ql.exec with type arguments of type ExprNodeDesc
MuxOperator.Handler(ObjectInspector inputObjInspector, List<ExprNodeDesc> keyCols, List<ExprNodeDesc> valueCols, List<String> outputKeyColumnNames, List<String> outputValueColumnNames, Integer tag)
           
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.index
 

Methods in org.apache.hadoop.hive.ql.index that return ExprNodeDesc
 ExprNodeDesc IndexPredicateAnalyzer.analyzePredicate(ExprNodeDesc predicate, List<IndexSearchCondition> searchConditions)
          Analyzes a predicate.
 ExprNodeDesc IndexSearchCondition.getComparisonExpr()
           
 ExprNodeDesc HiveIndexQueryContext.getResidualPredicate()
           
 ExprNodeDesc IndexPredicateAnalyzer.translateSearchConditions(List<IndexSearchCondition> searchConditions)
          Translates search conditions back to ExprNodeDesc form (as a left-deep conjunction).
 

Methods in org.apache.hadoop.hive.ql.index with parameters of type ExprNodeDesc
 ExprNodeDesc IndexPredicateAnalyzer.analyzePredicate(ExprNodeDesc predicate, List<IndexSearchCondition> searchConditions)
          Analyzes a predicate.
 void AbstractIndexHandler.generateIndexQuery(Index index, ExprNodeDesc predicate, ParseContext pctx, HiveIndexQueryContext queryContext)
           
 void HiveIndexHandler.generateIndexQuery(List<Index> indexes, ExprNodeDesc predicate, ParseContext pctx, HiveIndexQueryContext queryContext)
          Generate the list of tasks required to run an index optimized sub-query for the given predicate, using the given indexes.
 void IndexSearchCondition.setComparisonExpr(ExprNodeDesc comparisonExpr)
           
 void HiveIndexQueryContext.setResidualPredicate(ExprNodeDesc residualPredicate)
           
 

Constructors in org.apache.hadoop.hive.ql.index with parameters of type ExprNodeDesc
IndexSearchCondition(ExprNodeColumnDesc columnDesc, String comparisonOp, ExprNodeConstantDesc constantDesc, ExprNodeDesc comparisonExpr)
          Constructs a search condition, which takes the form: column-ref comparison-op constant-value.
 
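Example (sketch): decomposing a predicate with the analyze/translate calls above. The analyzer is assumed to be already configured with its allowed columns and comparison operators, and getExprString() is used only for printing.

import java.util.ArrayList;
import java.util.List;
import org.apache.hadoop.hive.ql.index.IndexPredicateAnalyzer;
import org.apache.hadoop.hive.ql.index.IndexSearchCondition;
import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;

public class IndexPredicateSketch {
  // Splits 'predicate' into index-answerable search conditions and a residual part.
  static ExprNodeDesc decompose(IndexPredicateAnalyzer analyzer, ExprNodeDesc predicate) {
    List<IndexSearchCondition> conditions = new ArrayList<IndexSearchCondition>();
    ExprNodeDesc residual = analyzer.analyzePredicate(predicate, conditions);
    for (IndexSearchCondition cond : conditions) {
      System.out.println("pushable: " + cond.getComparisonExpr().getExprString());
    }
    // The collected conditions can be rebuilt into a left-deep AND tree if needed.
    ExprNodeDesc pushed = analyzer.translateSearchConditions(conditions);
    System.out.println("pushed: " + (pushed == null ? "none" : pushed.getExprString()));
    return residual; // what Hive still has to evaluate itself
  }
}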

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.index.bitmap
 

Methods in org.apache.hadoop.hive.ql.index.bitmap with parameters of type ExprNodeDesc
 void BitmapIndexHandler.generateIndexQuery(List<Index> indexes, ExprNodeDesc predicate, ParseContext pctx, HiveIndexQueryContext queryContext)
           
 

Constructors in org.apache.hadoop.hive.ql.index.bitmap with parameters of type ExprNodeDesc
BitmapInnerQuery(String tableName, ExprNodeDesc predicate, String alias)
           
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.index.compact
 

Methods in org.apache.hadoop.hive.ql.index.compact with parameters of type ExprNodeDesc
 void CompactIndexHandler.generateIndexQuery(List<Index> indexes, ExprNodeDesc predicate, ParseContext pctx, HiveIndexQueryContext queryContext)
           
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.io.sarg
 

Methods in org.apache.hadoop.hive.ql.io.sarg with parameters of type ExprNodeDesc
 SearchArgument SearchArgument.Factory.create(ExprNodeDesc expression)
           
 
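Example (sketch): converting a table-scan filter expression into a SearchArgument. This assumes SearchArgument.Factory can be instantiated directly and that the filter comes from TableScanDesc.getFilterExpr() (listed below under org.apache.hadoop.hive.ql.plan).

import org.apache.hadoop.hive.ql.io.sarg.SearchArgument;
import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;

public class SargSketch {
  // Builds a SearchArgument from a pushed-down filter expression, if any.
  static SearchArgument toSearchArgument(ExprNodeDesc filterExpr) {
    if (filterExpr == null) {
      return null; // no filter was pushed to the table scan
    }
    return new SearchArgument.Factory().create(filterExpr);
  }
}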

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.metadata
 

Fields in org.apache.hadoop.hive.ql.metadata declared as ExprNodeDesc
 ExprNodeDesc HiveStoragePredicateHandler.DecomposedPredicate.pushedPredicate
          Portion of predicate to be evaluated by storage handler.
 ExprNodeDesc HiveStoragePredicateHandler.DecomposedPredicate.residualPredicate
          Portion of predicate to be post-evaluated by Hive for any rows which are returned by storage handler.
 

Methods in org.apache.hadoop.hive.ql.metadata with parameters of type ExprNodeDesc
 HiveStoragePredicateHandler.DecomposedPredicate HiveStoragePredicateHandler.decomposePredicate(JobConf jobConf, Deserializer deserializer, ExprNodeDesc predicate)
          Gives the storage handler a chance to decompose a predicate.
 
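Example (sketch): a minimal decomposePredicate implementation. This trivial handler pushes nothing and leaves the whole predicate as residual; a real storage handler would analyze the expression (for example with IndexPredicateAnalyzer above) before filling the two fields.

import org.apache.hadoop.hive.ql.metadata.HiveStoragePredicateHandler;
import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;
import org.apache.hadoop.hive.serde2.Deserializer;
import org.apache.hadoop.mapred.JobConf;

public class NoPushdownPredicateHandler implements HiveStoragePredicateHandler {
  @Override
  public DecomposedPredicate decomposePredicate(JobConf jobConf, Deserializer deserializer,
      ExprNodeDesc predicate) {
    DecomposedPredicate result = new DecomposedPredicate();
    result.pushedPredicate = null;        // nothing is evaluated by the storage handler
    result.residualPredicate = predicate; // Hive post-evaluates the entire predicate
    return result;
  }
}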

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.optimizer
 

Methods in org.apache.hadoop.hive.ql.optimizer that return ExprNodeDesc
protected abstract  ExprNodeDesc PrunerExpressionOperatorFactory.ColumnExprProcessor.processColumnDesc(NodeProcessorCtx procCtx, ExprNodeColumnDesc cd)
          Process column desc.
 

Methods in org.apache.hadoop.hive.ql.optimizer that return types with arguments of type ExprNodeDesc
 Map<Byte,List<ExprNodeDesc>> SortBucketJoinProcCtx.getKeyExprMap()
           
 

Methods in org.apache.hadoop.hive.ql.optimizer with parameters of type ExprNodeDesc
protected  void PrunerOperatorFactory.FilterPruner.addPruningPred(Map<TableScanOperator,ExprNodeDesc> opToPrunner, TableScanOperator top, ExprNodeDesc new_pruner_pred)
          Add pruning predicate.
protected  void PrunerOperatorFactory.FilterPruner.addPruningPred(Map<TableScanOperator,Map<String,ExprNodeDesc>> opToPrunner, TableScanOperator top, ExprNodeDesc new_pruner_pred, Partition part)
          Add pruning predicate.
static Map<Node,Object> PrunerUtils.walkExprTree(ExprNodeDesc pred, NodeProcessorCtx ctx, NodeProcessor colProc, NodeProcessor fieldProc, NodeProcessor genFuncProc, NodeProcessor defProc)
          Walk expression tree for pruner generation.
 

Method parameters in org.apache.hadoop.hive.ql.optimizer with type arguments of type ExprNodeDesc
protected  void PrunerOperatorFactory.FilterPruner.addPruningPred(Map<TableScanOperator,ExprNodeDesc> opToPrunner, TableScanOperator top, ExprNodeDesc new_pruner_pred)
          Add pruning predicate.
protected  void PrunerOperatorFactory.FilterPruner.addPruningPred(Map<TableScanOperator,Map<String,ExprNodeDesc>> opToPrunner, TableScanOperator top, ExprNodeDesc new_pruner_pred, Partition part)
          Add pruning predicate.
protected  boolean AbstractBucketJoinProc.checkConvertBucketMapJoin(ParseContext pGraphContext, BucketJoinProcCtx context, QBJoinTree joinCtx, Map<Byte,List<ExprNodeDesc>> keysMap, String baseBigAlias, List<String> joinAliases)
           
 void SortBucketJoinProcCtx.setKeyExprMap(Map<Byte,List<ExprNodeDesc>> keyExprMap)
           
 List<String> AbstractBucketJoinProc.toColumns(List<ExprNodeDesc> keys)
           
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.optimizer.correlation
 

Methods in org.apache.hadoop.hive.ql.optimizer.correlation with parameters of type ExprNodeDesc
protected static String CorrelationUtilities.getColumnName(Map<String,ExprNodeDesc> opColumnExprMap, ExprNodeDesc expr)
           
protected static int CorrelationUtilities.indexOf(ExprNodeDesc cexpr, ExprNodeDesc[] pexprs, Operator child, Operator[] parents, boolean[] sorted)
           
protected static boolean CorrelationUtilities.isExisted(ExprNodeDesc expr, List<ExprNodeDesc> columns)
           
 

Method parameters in org.apache.hadoop.hive.ql.optimizer.correlation with type arguments of type ExprNodeDesc
protected static String CorrelationUtilities.getColumnName(Map<String,ExprNodeDesc> opColumnExprMap, ExprNodeDesc expr)
           
protected static boolean CorrelationUtilities.isExisted(ExprNodeDesc expr, List<ExprNodeDesc> columns)
           
protected  Integer ReduceSinkDeDuplication.AbsctractReducerReducerProc.sameKeys(List<ExprNodeDesc> cexprs, List<ExprNodeDesc> pexprs, Operator<?> child, Operator<?> parent)
           
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.optimizer.lineage
 

Methods in org.apache.hadoop.hive.ql.optimizer.lineage with parameters of type ExprNodeDesc
static LineageInfo.Dependency ExprProcFactory.getExprDependency(LineageCtx lctx, Operator<? extends OperatorDesc> inpOp, ExprNodeDesc expr)
          Gets the expression dependencies for the expression.
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.optimizer.listbucketingpruner
 

Methods in org.apache.hadoop.hive.ql.optimizer.listbucketingpruner that return ExprNodeDesc
static ExprNodeDesc LBExprProcFactory.genPruner(String tabAlias, ExprNodeDesc pred, Partition part)
          Generates the list bucketing pruner for the expression tree.
protected  ExprNodeDesc LBExprProcFactory.LBPRColumnExprProcessor.processColumnDesc(NodeProcessorCtx procCtx, ExprNodeColumnDesc cd)
           
 

Methods in org.apache.hadoop.hive.ql.optimizer.listbucketingpruner that return types with arguments of type ExprNodeDesc
 Map<TableScanOperator,Map<String,ExprNodeDesc>> LBOpWalkerCtx.getOpToPartToLBPruner()
           
 

Methods in org.apache.hadoop.hive.ql.optimizer.listbucketingpruner with parameters of type ExprNodeDesc
static ExprNodeDesc LBExprProcFactory.genPruner(String tabAlias, ExprNodeDesc pred, Partition part)
          Generates the list bucketing pruner for the expression tree.
static Path[] ListBucketingPruner.prune(ParseContext ctx, Partition part, ExprNodeDesc pruner)
          Prunes to the directories which match the skewed keys in the WHERE clause.
 

Constructor parameters in org.apache.hadoop.hive.ql.optimizer.listbucketingpruner with type arguments of type ExprNodeDesc
LBOpWalkerCtx(Map<TableScanOperator,Map<String,ExprNodeDesc>> opToPartToLBPruner, Partition part)
          Constructor.
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.optimizer.pcr
 

Fields in org.apache.hadoop.hive.ql.optimizer.pcr declared as ExprNodeDesc
 ExprNodeDesc PcrExprProcFactory.NodeInfoWrapper.outExpr
           
 

Methods in org.apache.hadoop.hive.ql.optimizer.pcr with parameters of type ExprNodeDesc
static PcrExprProcFactory.NodeInfoWrapper PcrExprProcFactory.walkExprTree(String tabAlias, ArrayList<Partition> parts, List<VirtualColumn> vcs, ExprNodeDesc pred)
          Removes partition conditions when necessary from the expression tree.
 

Constructors in org.apache.hadoop.hive.ql.optimizer.pcr with parameters of type ExprNodeDesc
PcrExprProcFactory.NodeInfoWrapper(PcrExprProcFactory.WalkState state, Boolean[] resultVector, ExprNodeDesc outExpr)
           
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.optimizer.ppr
 

Methods in org.apache.hadoop.hive.ql.optimizer.ppr that return ExprNodeDesc
static ExprNodeDesc ExprProcFactory.genPruner(String tabAlias, ExprNodeDesc pred)
          Generates the partition pruner for the expression tree.
protected  ExprNodeDesc ExprProcFactory.PPRColumnExprProcessor.processColumnDesc(NodeProcessorCtx procCtx, ExprNodeColumnDesc cd)
           
 

Methods in org.apache.hadoop.hive.ql.optimizer.ppr that return types with arguments of type ExprNodeDesc
 HashMap<TableScanOperator,ExprNodeDesc> OpWalkerCtx.getOpToPartPruner()
           
 

Methods in org.apache.hadoop.hive.ql.optimizer.ppr with parameters of type ExprNodeDesc
static Object PartExprEvalUtils.evalExprWithPart(ExprNodeDesc expr, LinkedHashMap<String,String> partSpec, List<VirtualColumn> vcs, StructObjectInspector rowObjectInspector)
          Evaluates an expression with partition columns.
static ExprNodeDesc ExprProcFactory.genPruner(String tabAlias, ExprNodeDesc pred)
          Generates the partition pruner for the expression tree.
static boolean PartitionPruner.hasColumnExpr(ExprNodeDesc desc)
          Whether the expression contains a column node or not.
static boolean PartitionPruner.onlyContainsPartnCols(Table tab, ExprNodeDesc expr)
          Find out whether the condition only contains partitioned columns.
static ObjectPair<PrimitiveObjectInspector,ExprNodeEvaluator> PartExprEvalUtils.prepareExpr(ExprNodeDesc expr, List<String> partNames)
           
static boolean PartitionPruner.prunePartitionNames(List<String> columnNames, ExprNodeDesc prunerExpr, String defaultPartitionName, List<String> partNames)
          Prunes partition names to see if they match the prune expression.
 
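Example (sketch): name-based partition pruning with the methods above. The column name, partition names, and the default-partition string are invented, and prunerExpr is assumed to reference only partition columns (see onlyContainsPartnCols).

import java.util.ArrayList;
import java.util.Arrays;
import java.util.List;
import org.apache.hadoop.hive.ql.optimizer.ppr.PartitionPruner;
import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;

public class PartitionNamePruningSketch {
  static List<String> prune(ExprNodeDesc prunerExpr) throws Exception {
    List<String> columnNames = Arrays.asList("ds");
    List<String> partNames = new ArrayList<String>(
        Arrays.asList("ds=2012-01-01", "ds=2012-01-02"));
    // Drops names that cannot match the expression; the returned boolean is taken
    // here (an assumption) to report whether some names could not be decided.
    boolean hasUnknown = PartitionPruner.prunePartitionNames(
        columnNames, prunerExpr, "__HIVE_DEFAULT_PARTITION__", partNames);
    System.out.println("undecidable partitions present: " + hasUnknown);
    return partNames; // pruned in place
  }
}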

Constructor parameters in org.apache.hadoop.hive.ql.optimizer.ppr with type arguments of type ExprNodeDesc
OpWalkerCtx(HashMap<TableScanOperator,ExprNodeDesc> opToPartPruner)
          Constructor.
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.parse
 

Methods in org.apache.hadoop.hive.ql.parse that return ExprNodeDesc
 ExprNodeDesc SemanticAnalyzer.genExprNodeDesc(ASTNode expr, RowResolver input)
          Generates an expression node descriptor for the expression with a default TypeCheckCtx.
 ExprNodeDesc SemanticAnalyzer.genExprNodeDesc(ASTNode expr, RowResolver input, TypeCheckCtx tcCtx)
          Returns expression node descriptor for the expression.
static ExprNodeDesc TypeCheckProcFactory.DefaultExprProcessor.getFuncExprNodeDesc(String udfName, ExprNodeDesc... children)
           
static ExprNodeDesc TypeCheckProcFactory.processGByExpr(Node nd, Object procCtx)
          Performs group-by subexpression elimination.
 

Methods in org.apache.hadoop.hive.ql.parse that return types with arguments of type ExprNodeDesc
 Map<ASTNode,ExprNodeDesc> SemanticAnalyzer.genAllExprNodeDesc(ASTNode expr, RowResolver input)
          Generates expression node descriptors for the expression and its children with a default TypeCheckCtx.
 Map<ASTNode,ExprNodeDesc> SemanticAnalyzer.genAllExprNodeDesc(ASTNode expr, RowResolver input, TypeCheckCtx tcCtx)
          Generates all of the expression node descriptors for the expression and its children using the given TypeCheckCtx.
static Map<ASTNode,ExprNodeDesc> TypeCheckProcFactory.genExprNode(ASTNode expr, TypeCheckCtx tcCtx)
           
 HashMap<TableScanOperator,ExprNodeDesc> ParseContext.getOpToPartPruner()
           
 Map<TableScanOperator,Map<String,ExprNodeDesc>> ParseContext.getOpToPartToSkewedPruner()
           
 

Methods in org.apache.hadoop.hive.ql.parse with parameters of type ExprNodeDesc
 void PTFTranslator.LeadLagInfo.addLLFuncExprForTopExpr(ExprNodeDesc topExpr, ExprNodeGenericFuncDesc llFuncExpr)
           
static ExprNodeEvaluator WindowingExprNodeEvaluatorFactory.get(PTFTranslator.LeadLagInfo llInfo, ExprNodeDesc desc)
           
static ExprNodeDesc TypeCheckProcFactory.DefaultExprProcessor.getFuncExprNodeDesc(String udfName, ExprNodeDesc... children)
           
 List<ExprNodeGenericFuncDesc> PTFTranslator.LeadLagInfo.getLLFuncExprsInTopExpr(ExprNodeDesc topExpr)
           
 
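Example (sketch): getFuncExprNodeDesc can assemble a function-call expression by UDF name. The "concat" function and the constant children here are arbitrary examples, and the checked exception is declared broadly.

import org.apache.hadoop.hive.ql.parse.TypeCheckProcFactory;
import org.apache.hadoop.hive.ql.plan.ExprNodeConstantDesc;
import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;

public class FuncExprSketch {
  // Builds concat('hello ', 'world') as an ExprNodeDesc tree.
  static ExprNodeDesc concatExample() throws Exception {
    return TypeCheckProcFactory.DefaultExprProcessor.getFuncExprNodeDesc(
        "concat", new ExprNodeConstantDesc("hello "), new ExprNodeConstantDesc("world"));
  }
}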

Method parameters in org.apache.hadoop.hive.ql.parse with type arguments of type ExprNodeDesc
 void ParseContext.setOpPartToSkewedPruner(HashMap<TableScanOperator,Map<String,ExprNodeDesc>> opToPartToSkewedPruner)
           
 void ParseContext.setOpToPartPruner(HashMap<TableScanOperator,ExprNodeDesc> opToPartPruner)
           
 

Constructor parameters in org.apache.hadoop.hive.ql.parse with type arguments of type ExprNodeDesc
ParseContext(HiveConf conf, QB qb, ASTNode ast, HashMap<TableScanOperator,ExprNodeDesc> opToPartPruner, HashMap<TableScanOperator,PrunedPartitionList> opToPartList, HashMap<String,Operator<? extends OperatorDesc>> topOps, HashMap<String,Operator<? extends OperatorDesc>> topSelOps, LinkedHashMap<Operator<? extends OperatorDesc>,OpParseContext> opParseCtx, Map<JoinOperator,QBJoinTree> joinContext, Map<SMBMapJoinOperator,QBJoinTree> smbMapJoinContext, HashMap<TableScanOperator,Table> topToTable, HashMap<TableScanOperator,Map<String,String>> topToProps, Map<FileSinkOperator,Table> fsopToTable, List<LoadTableDesc> loadTableWork, List<LoadFileDesc> loadFileWork, Context ctx, HashMap<String,String> idToTableNameMap, int destTableId, UnionProcContext uCtx, List<AbstractMapJoinOperator<? extends MapJoinDesc>> listMapJoinOpsNoReducer, Map<GroupByOperator,Set<String>> groupOpToInputTables, Map<String,PrunedPartitionList> prunedPartitions, HashMap<TableScanOperator,FilterDesc.sampleDesc> opToSamplePruner, GlobalLimitCtx globalLimitCtx, HashMap<String,SplitSample> nameToSplitSample, HashSet<ReadEntity> semanticInputs, List<Task<? extends Serializable>> rootTasks, Map<TableScanOperator,Map<String,ExprNodeDesc>> opToPartToSkewedPruner, Map<String,ReadEntity> viewAliasToInput, List<ReduceSinkOperator> reduceSinkOperatorsAddedByEnforceBucketingSorting, QueryProperties queryProperties)
           
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.plan
 

Subclasses of ExprNodeDesc in org.apache.hadoop.hive.ql.plan
 class ExprNodeColumnDesc
          ExprNodeColumnDesc.
 class ExprNodeColumnListDesc
          Dummy desc only for populating TOK_ALLCOLREF; it should not be used outside of TypeCheckProcFactory.
 class ExprNodeConstantDesc
          A constant expression.
 class ExprNodeFieldDesc
          ExprNodeFieldDesc.
 class ExprNodeGenericFuncDesc
          Describes a GenericFunc node.
 class ExprNodeNullDesc
          ExprNodeNullDesc.
 

Methods in org.apache.hadoop.hive.ql.plan that return ExprNodeDesc
static ExprNodeDesc ExprNodeDescUtils.backtrack(ExprNodeDesc source, Operator<?> current, Operator<?> terminal)
           
 ExprNodeDesc ExprNodeConstantDesc.clone()
           
 ExprNodeDesc ExprNodeNullDesc.clone()
           
 ExprNodeDesc ExprNodeGenericFuncDesc.clone()
           
 ExprNodeDesc ExprNodeFieldDesc.clone()
           
abstract  ExprNodeDesc ExprNodeDesc.clone()
           
 ExprNodeDesc ExprNodeColumnListDesc.clone()
           
 ExprNodeDesc ExprNodeColumnDesc.clone()
           
 ExprNodeDesc CreateMacroDesc.getBody()
           
 ExprNodeDesc ExtractDesc.getCol()
           
 ExprNodeDesc ExprNodeFieldDesc.getDesc()
           
 ExprNodeDesc PTFDesc.ValueBoundaryDef.getExprNode()
           
 ExprNodeDesc PTFDesc.PTFExpressionDef.getExprNode()
           
 ExprNodeDesc ExprNodeDesc.ExprNodeDescEqualityWrapper.getExprNodeDesc()
           
 ExprNodeDesc TableScanDesc.getFilterExpr()
           
 ExprNodeDesc FilterDesc.getPredicate()
           
static ExprNodeDesc ExprNodeDescUtils.mergePredicates(ExprNodeDesc prev, ExprNodeDesc next)
          Binds two predicates with the AND operator.
static ExprNodeDesc ExprNodeDescUtils.mergePredicates(List<ExprNodeDesc> exprs)
          Binds n predicates with the AND operator.
static ExprNodeDesc ExprNodeDescUtils.replace(ExprNodeDesc origin, List<ExprNodeDesc> sources, List<ExprNodeDesc> targets)
           
 

Methods in org.apache.hadoop.hive.ql.plan that return types with arguments of type ExprNodeDesc
static ArrayList<ExprNodeDesc> ExprNodeDescUtils.backtrack(List<ExprNodeDesc> sources, Operator<?> current, Operator<?> terminal)
          Converts expressions in the current operator to those in the terminal operator, an ancestor of the current operator (or null, meaning back to the top operator).
 List<ExprNodeDesc> ExprNodeGenericFuncDesc.getChildExprs()
           
 List<ExprNodeDesc> ExprNodeGenericFuncDesc.getChildren()
           
 List<ExprNodeDesc> ExprNodeFieldDesc.getChildren()
           
 List<ExprNodeDesc> ExprNodeDesc.getChildren()
           
 List<ExprNodeDesc> ExprNodeColumnListDesc.getChildren()
           
 List<ExprNodeDesc> SelectDesc.getColList()
           
 Map<Byte,List<ExprNodeDesc>> JoinDesc.getExprs()
           
 Map<Byte,List<ExprNodeDesc>> HashTableSinkDesc.getExprs()
           
 Map<Byte,List<ExprNodeDesc>> JoinDesc.getFilters()
           
 Map<Byte,List<ExprNodeDesc>> HashTableSinkDesc.getFilters()
           
 ArrayList<ExprNodeDesc> ReduceSinkDesc.getKeyCols()
           
 ArrayList<ExprNodeDesc> GroupByDesc.getKeys()
           
 Map<Byte,List<ExprNodeDesc>> HashTableSinkDesc.getKeys()
           
 Map<Byte,List<ExprNodeDesc>> MapJoinDesc.getKeys()
           
 ArrayList<ExprNodeDesc> AggregationDesc.getParameters()
           
 List<List<ExprNodeDesc>> MuxDesc.getParentToKeyCols()
           
 List<List<ExprNodeDesc>> MuxDesc.getParentToValueCols()
           
 ArrayList<ExprNodeDesc> ReduceSinkDesc.getPartitionCols()
           
 ArrayList<ExprNodeDesc> FileSinkDesc.getPartitionCols()
           
 ArrayList<ExprNodeDesc> ReduceSinkDesc.getValueCols()
           
static List<ExprNodeDesc> ExprNodeDescUtils.split(ExprNodeDesc current)
          Splits a predicate by the AND operator.
static List<ExprNodeDesc> ExprNodeDescUtils.split(ExprNodeDesc current, List<ExprNodeDesc> splitted)
          Splits a predicate by the AND operator, appending the factors to the given list.
 

Methods in org.apache.hadoop.hive.ql.plan with parameters of type ExprNodeDesc
static ExprNodeDesc ExprNodeDescUtils.backtrack(ExprNodeDesc source, Operator<?> current, Operator<?> terminal)
           
static boolean ExprNodeDescUtils.containsPredicate(ExprNodeDesc source, ExprNodeDesc predicate)
          Returns true if the predicate is already included in the source.
static int ExprNodeDescUtils.indexOf(ExprNodeDesc origin, List<ExprNodeDesc> sources)
           
static boolean ExprNodeDescUtils.isDeterministic(ExprNodeDesc desc)
          Returns false if the expression contains any non-deterministic function.
static ExprNodeDesc ExprNodeDescUtils.mergePredicates(ExprNodeDesc prev, ExprNodeDesc next)
          Binds two predicates with the AND operator.
static String ExprNodeDescUtils.recommendInputName(ExprNodeDesc desc)
          Recommends a name for the expression.
static ExprNodeDesc ExprNodeDescUtils.replace(ExprNodeDesc origin, List<ExprNodeDesc> sources, List<ExprNodeDesc> targets)
           
 void ExtractDesc.setCol(ExprNodeDesc col)
           
 void ExprNodeFieldDesc.setDesc(ExprNodeDesc desc)
           
 void PTFDesc.PTFExpressionDef.setExprNode(ExprNodeDesc exprNode)
           
 void ExprNodeDesc.ExprNodeDescEqualityWrapper.setExprNodeDesc(ExprNodeDesc exprNodeDesc)
           
 void TableScanDesc.setFilterExpr(ExprNodeDesc filterExpr)
           
 void FilterDesc.setPredicate(ExprNodeDesc predicate)
           
static List<ExprNodeDesc> ExprNodeDescUtils.split(ExprNodeDesc current)
          Splits a predicate by the AND operator.
static List<ExprNodeDesc> ExprNodeDescUtils.split(ExprNodeDesc current, List<ExprNodeDesc> splitted)
          Splits a predicate by the AND operator, appending the factors to the given list.
 
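Example (sketch): the split/merge helpers flatten an AND tree into its factors and rebuild one; 'pred' is assumed to be a conjunctive predicate built elsewhere, and the class name is invented.

import java.util.List;
import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;
import org.apache.hadoop.hive.ql.plan.ExprNodeDescUtils;

public class PredicateSplitSketch {
  static ExprNodeDesc normalize(ExprNodeDesc pred) {
    List<ExprNodeDesc> factors = ExprNodeDescUtils.split(pred); // e.g. [a = 1, b = 2]
    for (ExprNodeDesc factor : factors) {
      // every factor should still be contained in the original conjunction
      assert ExprNodeDescUtils.containsPredicate(pred, factor);
    }
    return ExprNodeDescUtils.mergePredicates(factors); // ANDs them back together
  }
}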

Method parameters in org.apache.hadoop.hive.ql.plan with type arguments of type ExprNodeDesc
static ArrayList<ExprNodeDesc> ExprNodeDescUtils.backtrack(List<ExprNodeDesc> sources, Operator<?> current, Operator<?> terminal)
          Converts expressions in the current operator to those in the terminal operator, an ancestor of the current operator (or null, meaning back to the top operator).
static List<FieldSchema> PlanUtils.getFieldSchemasFromColumnList(List<ExprNodeDesc> cols, List<String> outputColumnNames, int start, String fieldPrefix)
          Converts the column list to a FieldSchema list.
static List<FieldSchema> PlanUtils.getFieldSchemasFromColumnList(List<ExprNodeDesc> cols, String fieldPrefix)
          Converts the column list to a FieldSchema list.
static List<FieldSchema> PlanUtils.getFieldSchemasFromColumnListWithLength(List<ExprNodeDesc> cols, List<List<Integer>> distinctColIndices, List<String> outputColumnNames, int length, String fieldPrefix)
          Converts the column list to a FieldSchema list.
static ReduceSinkDesc PlanUtils.getReduceSinkDesc(ArrayList<ExprNodeDesc> keyCols, ArrayList<ExprNodeDesc> valueCols, List<String> outputColumnNames, boolean includeKeyCols, int tag, ArrayList<ExprNodeDesc> partitionCols, String order, int numReducers)
          Create the reduce sink descriptor.
static ReduceSinkDesc PlanUtils.getReduceSinkDesc(ArrayList<ExprNodeDesc> keyCols, ArrayList<ExprNodeDesc> valueCols, List<String> outputColumnNames, boolean includeKey, int tag, int numPartitionFields, int numReducers)
          Create the reduce sink descriptor.
static ReduceSinkDesc PlanUtils.getReduceSinkDesc(ArrayList<ExprNodeDesc> keyCols, int numKeys, ArrayList<ExprNodeDesc> valueCols, List<List<Integer>> distinctColIndices, List<String> outputKeyColumnNames, List<String> outputValueColumnNames, boolean includeKeyCols, int tag, ArrayList<ExprNodeDesc> partitionCols, String order, int numReducers)
          Create the reduce sink descriptor.
static ReduceSinkDesc PlanUtils.getReduceSinkDesc(ArrayList<ExprNodeDesc> keyCols, int numKeys, ArrayList<ExprNodeDesc> valueCols, List<List<Integer>> distinctColIndices, List<String> outputKeyColumnNames, List<String> outputValueColumnNames, boolean includeKey, int tag, int numPartitionFields, int numReducers)
          Create the reduce sink descriptor.
static int ExprNodeDescUtils.indexOf(ExprNodeDesc origin, List<ExprNodeDesc> sources)
           
static ExprNodeDesc ExprNodeDescUtils.mergePredicates(List<ExprNodeDesc> exprs)
          Binds n predicates with the AND operator.
static ExprNodeGenericFuncDesc ExprNodeGenericFuncDesc.newInstance(GenericUDF genericUDF, List<ExprNodeDesc> children)
          Creates an ExprNodeGenericFuncDesc based on the given GenericUDF and children parameters.
static ExprNodeDesc ExprNodeDescUtils.replace(ExprNodeDesc origin, List<ExprNodeDesc> sources, List<ExprNodeDesc> targets)
           
 void ExprNodeGenericFuncDesc.setChildExprs(List<ExprNodeDesc> children)
           
 void SelectDesc.setColList(List<ExprNodeDesc> colList)
           
 void JoinDesc.setExprs(Map<Byte,List<ExprNodeDesc>> exprs)
           
 void HashTableSinkDesc.setExprs(Map<Byte,List<ExprNodeDesc>> exprs)
           
 void JoinDesc.setFilters(Map<Byte,List<ExprNodeDesc>> filters)
           
 void HashTableSinkDesc.setFilters(Map<Byte,List<ExprNodeDesc>> filters)
           
 void ReduceSinkDesc.setKeyCols(ArrayList<ExprNodeDesc> keyCols)
           
 void GroupByDesc.setKeys(ArrayList<ExprNodeDesc> keys)
           
 void HashTableSinkDesc.setKeys(Map<Byte,List<ExprNodeDesc>> keys)
           
 void MapJoinDesc.setKeys(Map<Byte,List<ExprNodeDesc>> keys)
           
 void AggregationDesc.setParameters(ArrayList<ExprNodeDesc> parameters)
           
 void MuxDesc.setParentToKeyCols(List<List<ExprNodeDesc>> parentToKeyCols)
           
 void MuxDesc.setParentToValueCols(List<List<ExprNodeDesc>> parentToValueCols)
           
 void ReduceSinkDesc.setPartitionCols(ArrayList<ExprNodeDesc> partitionCols)
           
 void FileSinkDesc.setPartitionCols(ArrayList<ExprNodeDesc> partitionCols)
           
 void ReduceSinkDesc.setValueCols(ArrayList<ExprNodeDesc> valueCols)
           
static List<ExprNodeDesc> ExprNodeDescUtils.split(ExprNodeDesc current, List<ExprNodeDesc> splitted)
          Splits a predicate by the AND operator, appending the factors to the given list.
 

Constructors in org.apache.hadoop.hive.ql.plan with parameters of type ExprNodeDesc
CreateMacroDesc(String macroName, List<String> colNames, List<TypeInfo> colTypes, ExprNodeDesc body)
           
ExprNodeDesc.ExprNodeDescEqualityWrapper(ExprNodeDesc exprNodeDesc)
           
ExprNodeFieldDesc(TypeInfo typeInfo, ExprNodeDesc desc, String fieldName, Boolean isList)
           
ExtractDesc(ExprNodeDesc col)
           
FilterDesc(ExprNodeDesc predicate, boolean isSamplingPred)
           
FilterDesc(ExprNodeDesc predicate, boolean isSamplingPred, FilterDesc.sampleDesc sampleDescr)
           
 
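Example (sketch): a predicate is typically wrapped in a FilterDesc when attached to a filter operator; the boolean flag mirrors the constructors listed above, and the class name is invented.

import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;
import org.apache.hadoop.hive.ql.plan.FilterDesc;

public class FilterDescSketch {
  static FilterDesc wrap(ExprNodeDesc predicate) {
    // the second argument marks whether this is a sampling predicate
    return new FilterDesc(predicate, false);
  }
}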

Constructor parameters in org.apache.hadoop.hive.ql.plan with type arguments of type ExprNodeDesc
AggregationDesc(String genericUDAFName, GenericUDAFEvaluator genericUDAFEvaluator, ArrayList<ExprNodeDesc> parameters, boolean distinct, GenericUDAFEvaluator.Mode mode)
           
ExprNodeGenericFuncDesc(ObjectInspector oi, GenericUDF genericUDF, List<ExprNodeDesc> children)
           
ExprNodeGenericFuncDesc(TypeInfo typeInfo, GenericUDF genericUDF, List<ExprNodeDesc> children)
           
FileSinkDesc(String dirName, TableDesc tableInfo, boolean compressed, int destTableId, boolean multiFileSpray, boolean canBeMerged, int numFiles, int totalFiles, ArrayList<ExprNodeDesc> partitionCols, DynamicPartitionCtx dpCtx)
           
GroupByDesc(GroupByDesc.Mode mode, ArrayList<String> outputColumnNames, ArrayList<ExprNodeDesc> keys, ArrayList<AggregationDesc> aggregators, boolean groupKeyNotReductionKey, boolean bucketGroup, float groupByMemoryUsage, float memoryThreshold, List<Integer> listGroupingSets, boolean groupingSetsPresent, int groupingSetsPosition, boolean isDistinct)
           
GroupByDesc(GroupByDesc.Mode mode, ArrayList<String> outputColumnNames, ArrayList<ExprNodeDesc> keys, ArrayList<AggregationDesc> aggregators, boolean groupKeyNotReductionKey, float groupByMemoryUsage, float memoryThreshold, List<Integer> listGroupingSets, boolean groupingSetsPresent, int groupingSetsPosition, boolean isDistinct)
           
JoinDesc(Map<Byte,List<ExprNodeDesc>> exprs, List<String> outputColumnNames)
           
JoinDesc(Map<Byte,List<ExprNodeDesc>> exprs, List<String> outputColumnNames, boolean noOuterJoin, JoinCondDesc[] conds)
           
JoinDesc(Map<Byte,List<ExprNodeDesc>> exprs, List<String> outputColumnNames, boolean noOuterJoin, JoinCondDesc[] conds, Map<Byte,List<ExprNodeDesc>> filters)
           
JoinDesc(Map<Byte,List<ExprNodeDesc>> exprs, List<String> outputColumnNames, JoinCondDesc[] conds)
           
MapJoinDesc(Map<Byte,List<ExprNodeDesc>> keys, TableDesc keyTblDesc, Map<Byte,List<ExprNodeDesc>> values, List<TableDesc> valueTblDescs, List<TableDesc> valueFilteredTblDescs, List<String> outputColumnNames, int posBigTable, JoinCondDesc[] conds, Map<Byte,List<ExprNodeDesc>> filters, boolean noOuterJoin, String dumpFilePrefix)
           
ReduceSinkDesc(ArrayList<ExprNodeDesc> keyCols, int numDistributionKeys, ArrayList<ExprNodeDesc> valueCols, ArrayList<String> outputKeyColumnNames, List<List<Integer>> distinctColumnIndices, ArrayList<String> outputValueColumnNames, int tag, ArrayList<ExprNodeDesc> partitionCols, int numReducers, TableDesc keySerializeInfo, TableDesc valueSerializeInfo)
           
SelectDesc(List<ExprNodeDesc> colList, boolean selectStar, boolean selStarNoCompute)
           
SelectDesc(List<ExprNodeDesc> colList, List<String> outputColumnNames)
           
SelectDesc(List<ExprNodeDesc> colList, List<String> outputColumnNames, boolean selectStar)
           
 

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.ppd
 

Methods in org.apache.hadoop.hive.ql.ppd that return ExprNodeDesc
 ExprNodeDesc ExprWalkerInfo.getConvertedNode(Node nd)
           
 

Methods in org.apache.hadoop.hive.ql.ppd that return types with arguments of type ExprNodeDesc
 Map<String,List<ExprNodeDesc>> ExprWalkerInfo.getFinalCandidates()
          Returns the list of pushdown expressions for each alias that appears in the current operator's RowResolver.
 Map<ExprNodeDesc,ExprNodeDesc> ExprWalkerInfo.getNewToOldExprMap()
           
 Map<String,List<ExprNodeDesc>> ExprWalkerInfo.getNonFinalCandidates()
          Returns the list of non-final candidate predicates for each alias.
 

Methods in org.apache.hadoop.hive.ql.ppd with parameters of type ExprNodeDesc
 void ExprWalkerInfo.addAlias(ExprNodeDesc expr, String alias)
          Adds the specified alias to the specified expr.
 void ExprWalkerInfo.addConvertedNode(ExprNodeDesc oldNode, ExprNodeDesc newNode)
          Adds a replacement node for this expression.
 void ExprWalkerInfo.addFinalCandidate(ExprNodeDesc expr)
          Adds the specified expr as the top-most pushdown expr (i.e., all its children can be pushed).
 void ExprWalkerInfo.addNonFinalCandidate(ExprNodeDesc expr)
          Adds the specified expr as a non-final candidate.
static ExprWalkerInfo ExprWalkerProcFactory.extractPushdownPreds(OpWalkerInfo opContext, Operator<? extends OperatorDesc> op, ExprNodeDesc pred)
           
 String ExprWalkerInfo.getAlias(ExprNodeDesc expr)
          Returns the alias of the specified expr.
 boolean ExprWalkerInfo.isCandidate(ExprNodeDesc expr)
          Returns true if the specified expression is a pushdown candidate, else false.
 void ExprWalkerInfo.setIsCandidate(ExprNodeDesc expr, boolean b)
          Marks the specified expr as a pushdown candidate or not, according to the specified value.
 

Method parameters in org.apache.hadoop.hive.ql.ppd with type arguments of type ExprNodeDesc
 void ExprWalkerInfo.addPushDowns(String alias, List<ExprNodeDesc> pushDowns)
          Adds the passed list of pushDowns for the alias.
static ExprWalkerInfo ExprWalkerProcFactory.extractPushdownPreds(OpWalkerInfo opContext, Operator<? extends OperatorDesc> op, List<ExprNodeDesc> preds)
          Extracts pushdown predicates from the given list of predicate expressions.
 
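Example (sketch): reading back the pushdown candidates collected by the walker. The 'info' argument is assumed to come from ExprWalkerProcFactory.extractPushdownPreds, and getExprString() is used only for printing.

import java.util.List;
import java.util.Map;
import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;
import org.apache.hadoop.hive.ql.ppd.ExprWalkerInfo;

public class PushdownCandidateSketch {
  static void dump(ExprWalkerInfo info) {
    Map<String, List<ExprNodeDesc>> byAlias = info.getFinalCandidates();
    for (Map.Entry<String, List<ExprNodeDesc>> entry : byAlias.entrySet()) {
      for (ExprNodeDesc expr : entry.getValue()) {
        System.out.println(entry.getKey() + " <- " + expr.getExprString()
            + " (candidate=" + info.isCandidate(expr) + ")");
      }
    }
  }
}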

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.udf.generic
 

Methods in org.apache.hadoop.hive.ql.udf.generic that return ExprNodeDesc
 ExprNodeDesc GenericUDFMacro.getBody()
           
 

Methods in org.apache.hadoop.hive.ql.udf.generic with parameters of type ExprNodeDesc
 void GenericUDFMacro.setBody(ExprNodeDesc bodyDesc)
           
 

Constructors in org.apache.hadoop.hive.ql.udf.generic with parameters of type ExprNodeDesc
GenericUDFMacro(String macroName, ExprNodeDesc bodyDesc, List<String> colNames, List<TypeInfo> colTypes)
           
 
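Example (sketch): tying this to FunctionRegistry.registerTemporaryMacro above, a macro body is just an ExprNodeDesc over the declared columns. The macro name, column, and type here are invented, and the body is assumed to be built elsewhere (for example via getFuncExprNodeDesc).

import java.util.Arrays;
import java.util.List;
import org.apache.hadoop.hive.ql.exec.FunctionRegistry;
import org.apache.hadoop.hive.ql.plan.ExprNodeDesc;
import org.apache.hadoop.hive.serde2.typeinfo.TypeInfo;
import org.apache.hadoop.hive.serde2.typeinfo.TypeInfoFactory;

public class MacroRegistrationSketch {
  // Registers SIGMOID(x) given its body expression over the single column "x".
  static void registerSigmoid(ExprNodeDesc body) throws Exception {
    List<String> colNames = Arrays.asList("x");
    List<TypeInfo> colTypes = Arrays.<TypeInfo>asList(TypeInfoFactory.doubleTypeInfo);
    FunctionRegistry.registerTemporaryMacro("sigmoid", body, colNames, colTypes);
  }
}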

Uses of ExprNodeDesc in org.apache.hadoop.hive.ql.udf.ptf
 

Methods in org.apache.hadoop.hive.ql.udf.ptf that return ExprNodeDesc
static ExprNodeDesc MatchPath.ResultExpressionParser.buildExprNode(ASTNode expr, TypeCheckCtx typeCheckCtx)
           
 

Methods in org.apache.hadoop.hive.ql.udf.ptf that return types with arguments of type ExprNodeDesc
 ArrayList<ExprNodeDesc> MatchPath.ResultExprInfo.getResultExprNodes()
           
 

Method parameters in org.apache.hadoop.hive.ql.udf.ptf with type arguments of type ExprNodeDesc
 void MatchPath.ResultExprInfo.setResultExprNodes(ArrayList<ExprNodeDesc> resultExprNodes)
           
 



Copyright © 2012 The Apache Software Foundation