Uses of Class
org.apache.hadoop.hive.ql.parse.SemanticException

Packages that use SemanticException
org.apache.hadoop.hive.ql.lib   
org.apache.hadoop.hive.ql.optimizer   
org.apache.hadoop.hive.ql.optimizer.unionproc   
org.apache.hadoop.hive.ql.parse   
org.apache.hadoop.hive.ql.tools   
 

Uses of SemanticException in org.apache.hadoop.hive.ql.lib
 

Methods in org.apache.hadoop.hive.ql.lib that throw SemanticException
 int RuleRegExp.cost(Stack<Node> stack)
          This function returns the cost of the rule for the specified stack.
 int Rule.cost(Stack<Node> stack)
           
 void DefaultGraphWalker.dispatch(Node nd, Stack<Node> ndStack)
          Dispatch the current operator.
 Object DefaultRuleDispatcher.dispatch(Node nd, Stack<Node> ndStack, Object... nodeOutputs)
          Dispatcher function.
 Object Dispatcher.dispatch(Node nd, Stack<Node> stack, Object... nodeOutputs)
          Dispatcher function.
 Object NodeProcessor.process(Node nd, Stack<Node> stack, NodeProcessorCtx procCtx, Object... nodeOutputs)
          Generic process for all ops that don't have specific implementations.
 void GraphWalker.startWalking(Collection<Node> startNodes, HashMap<Node,Object> nodeOutput)
          Starting point for walking.
 void DefaultGraphWalker.startWalking(Collection<Node> startNodes, HashMap<Node,Object> nodeOutput)
          Starting point for walking.
 void PreOrderWalker.walk(Node nd)
          Walk the current operator and its descendants.
 void DefaultGraphWalker.walk(Node nd)
          Walk the current operator and its descendants.
 
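The lib entries above form Hive's generic operator-walking framework: a GraphWalker visits nodes, a Dispatcher matches each node against Rules, and the matching NodeProcessor does the work, so a SemanticException thrown in any process() call propagates out of startWalking(). The sketch below shows the wiring; the constructor signatures of RuleRegExp, DefaultRuleDispatcher and DefaultGraphWalker and the "TS%" rule pattern are assumptions based on typical usage, since this page only confirms which methods throw SemanticException.

import java.util.ArrayList;
import java.util.LinkedHashMap;
import java.util.List;
import java.util.Map;
import java.util.Stack;

import org.apache.hadoop.hive.ql.lib.DefaultGraphWalker;
import org.apache.hadoop.hive.ql.lib.DefaultRuleDispatcher;
import org.apache.hadoop.hive.ql.lib.Dispatcher;
import org.apache.hadoop.hive.ql.lib.GraphWalker;
import org.apache.hadoop.hive.ql.lib.Node;
import org.apache.hadoop.hive.ql.lib.NodeProcessor;
import org.apache.hadoop.hive.ql.lib.NodeProcessorCtx;
import org.apache.hadoop.hive.ql.lib.Rule;
import org.apache.hadoop.hive.ql.lib.RuleRegExp;
import org.apache.hadoop.hive.ql.parse.SemanticException;

public class WalkExample {

  // A NodeProcessor whose process() rejects a node it cannot handle by
  // throwing SemanticException; the walker lets the exception propagate.
  static class MyProc implements NodeProcessor {
    public Object process(Node nd, Stack<Node> stack, NodeProcessorCtx procCtx,
        Object... nodeOutputs) throws SemanticException {
      if (nd == null) {
        throw new SemanticException("unexpected null node");
      }
      return null;
    }
  }

  // Wires the processor to a rule, builds a dispatcher and a walker, and
  // starts walking from the given roots.  Constructor signatures and the
  // "TS%" pattern are assumptions; only the listed methods are confirmed
  // to throw SemanticException.
  static void walk(List<Node> roots, NodeProcessorCtx ctx) throws SemanticException {
    Map<Rule, NodeProcessor> rules = new LinkedHashMap<Rule, NodeProcessor>();
    rules.put(new RuleRegExp("R1", "TS%"), new MyProc());

    Dispatcher disp = new DefaultRuleDispatcher(new MyProc(), rules, ctx);
    GraphWalker walker = new DefaultGraphWalker(disp);
    walker.startWalking(new ArrayList<Node>(roots), null);
  }
}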

Uses of SemanticException in org.apache.hadoop.hive.ql.optimizer
 

Methods in org.apache.hadoop.hive.ql.optimizer that throw SemanticException
 List<String> ColumnPrunerProcCtx.genColLists(Operator<? extends Serializable> curOp)
          Creates the list of internal column names (these names are used in the RowResolver and are different from the external column names) that are needed in the subtree.
static void GenMapRedUtils.initPlan(ReduceSinkOperator op, GenMRProcContext opProcCtx)
          Initialize the current plan by adding it to root tasks.
static void GenMapRedUtils.initUnionPlan(ReduceSinkOperator op, GenMRProcContext opProcCtx)
          Initialize the current union plan.
static void GenMapRedUtils.joinPlan(ReduceSinkOperator op, Task<? extends Serializable> oldTask, Task<? extends Serializable> task, GenMRProcContext opProcCtx)
          Merge the current task with the task for the current reducer.
 ParseContext Optimizer.optimize()
          Invoke all the transformations one by one, and alter the query plan.
 Object GenMRUnion1.process(Node nd, Stack<Node> stack, NodeProcessorCtx opProcCtx, Object... nodeOutputs)
          Union Operator encountered.
 Object GenMRTableScan1.process(Node nd, Stack<Node> stack, NodeProcessorCtx opProcCtx, Object... nodeOutputs)
          Table Scan encountered.
 Object GenMRRedSink3.process(Node nd, Stack<Node> stack, NodeProcessorCtx opProcCtx, Object... nodeOutputs)
          Reduce Sink encountered.
 Object GenMRRedSink2.process(Node nd, Stack<Node> stack, NodeProcessorCtx opProcCtx, Object... nodeOutputs)
          Reduce Sink encountered.
 Object GenMRRedSink1.process(Node nd, Stack<Node> stack, NodeProcessorCtx opProcCtx, Object... nodeOutputs)
          Reduce Sink encountered.
 Object GenMROperator.process(Node nd, Stack<Node> stack, NodeProcessorCtx procCtx, Object... nodeOutputs)
          Generic operator encountered.
 Object GenMRFileSink1.process(Node nd, Stack<Node> stack, NodeProcessorCtx opProcCtx, Object... nodeOutputs)
          File Sink Operator encountered.
 Object ColumnPrunerProcFactory.ColumnPrunerFilterProc.process(Node nd, Stack<Node> stack, NodeProcessorCtx ctx, Object... nodeOutputs)
           
 Object ColumnPrunerProcFactory.ColumnPrunerGroupByProc.process(Node nd, Stack<Node> stack, NodeProcessorCtx ctx, Object... nodeOutputs)
           
 Object ColumnPrunerProcFactory.ColumnPrunerDefaultProc.process(Node nd, Stack<Node> stack, NodeProcessorCtx ctx, Object... nodeOutputs)
           
 Object ColumnPrunerProcFactory.ColumnPrunerReduceSinkProc.process(Node nd, Stack<Node> stack, NodeProcessorCtx ctx, Object... nodeOutputs)
           
 Object ColumnPrunerProcFactory.ColumnPrunerSelectProc.process(Node nd, Stack<Node> stack, NodeProcessorCtx ctx, Object... nodeOutputs)
           
static void GenMapRedUtils.setTaskPlan(String alias_id, Operator<? extends Serializable> topOp, mapredWork plan, boolean local, GenMRProcContext opProcCtx)
          Set the current task in the mapredWork.
static void GenMapRedUtils.splitPlan(ReduceSinkOperator op, GenMRProcContext opProcCtx)
          Split the current plan by creating a temporary destination.
 ParseContext Transform.transform(ParseContext pctx)
          All transformation steps implement this interface.
 ParseContext ColumnPruner.transform(ParseContext pactx)
          Transform the query tree.
 void ColumnPruner.ColumnPrunerWalker.walk(Node nd)
          Walk the given operator.
 
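Every optimizer pass listed above goes through the Transform.transform(ParseContext) contract, and it signals analysis problems by throwing SemanticException rather than returning an error code. A minimal sketch of a custom pass, assuming ParseContext lives in org.apache.hadoop.hive.ql.parse:

import org.apache.hadoop.hive.ql.optimizer.Transform;
import org.apache.hadoop.hive.ql.parse.ParseContext;
import org.apache.hadoop.hive.ql.parse.SemanticException;

// A minimal Transform: validates its input and returns the (possibly
// rewritten) ParseContext.  Real passes such as ColumnPruner walk the
// operator tree here; this sketch only shows where a SemanticException
// would surface.
public class NoOpTransform implements Transform {
  public ParseContext transform(ParseContext pctx) throws SemanticException {
    if (pctx == null) {
      throw new SemanticException("ParseContext must not be null");
    }
    // ... operator-tree rewriting would go here ...
    return pctx;
  }
}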

Uses of SemanticException in org.apache.hadoop.hive.ql.optimizer.unionproc
 

Methods in org.apache.hadoop.hive.ql.optimizer.unionproc that throw SemanticException
 Object UnionProcFactory.MapRedUnion.process(Node nd, Stack<Node> stack, NodeProcessorCtx procCtx, Object... nodeOutputs)
           
 Object UnionProcFactory.MapUnion.process(Node nd, Stack<Node> stack, NodeProcessorCtx procCtx, Object... nodeOutputs)
           
 Object UnionProcFactory.UnknownUnion.process(Node nd, Stack<Node> stack, NodeProcessorCtx procCtx, Object... nodeOutputs)
           
 Object UnionProcFactory.NoUnion.process(Node nd, Stack<Node> stack, NodeProcessorCtx procCtx, Object... nodeOutputs)
           
 ParseContext UnionProcessor.transform(ParseContext pCtx)
          Transform the query tree.
 
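UnionProcessor follows the same Transform contract as the passes in org.apache.hadoop.hive.ql.optimizer, so it can be chained with them the way Optimizer.optimize() applies transformations one by one. A hedged sketch; the no-arg constructors of ColumnPruner and UnionProcessor, and the order shown, are assumptions:

import java.util.ArrayList;
import java.util.List;

import org.apache.hadoop.hive.ql.optimizer.ColumnPruner;
import org.apache.hadoop.hive.ql.optimizer.Transform;
import org.apache.hadoop.hive.ql.optimizer.unionproc.UnionProcessor;
import org.apache.hadoop.hive.ql.parse.ParseContext;
import org.apache.hadoop.hive.ql.parse.SemanticException;

public class TransformChain {
  // Applies each transformation in order, roughly what Optimizer.optimize()
  // does; any pass may abort the chain by throwing SemanticException.
  // The no-arg constructors of ColumnPruner and UnionProcessor are assumptions.
  public static ParseContext apply(ParseContext pctx) throws SemanticException {
    List<Transform> transformations = new ArrayList<Transform>();
    transformations.add(new ColumnPruner());
    transformations.add(new UnionProcessor());
    for (Transform t : transformations) {
      pctx = t.transform(pctx);
    }
    return pctx;
  }
}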

Uses of SemanticException in org.apache.hadoop.hive.ql.parse
 

Methods in org.apache.hadoop.hive.ql.parse that throw SemanticException
 void PartitionPruner.addExpression(ASTNode expr)
          Add an expression.
 void PartitionPruner.addJoinOnExpression(ASTNode expr)
          Add an expression from the JOIN condition.
 void BaseSemanticAnalyzer.analyze(ASTNode ast, Context ctx)
           
 void SemanticAnalyzer.analyzeInternal(ASTNode ast, Context ctx)
           
 void LoadSemanticAnalyzer.analyzeInternal(ASTNode ast, Context ctx)
           
 void FunctionSemanticAnalyzer.analyzeInternal(ASTNode ast, Context ctx)
           
 void ExplainSemanticAnalyzer.analyzeInternal(ASTNode ast, Context ctx)
           
 void DDLSemanticAnalyzer.analyzeInternal(ASTNode ast, Context ctx)
           
abstract  void BaseSemanticAnalyzer.analyzeInternal(ASTNode ast, Context ctx)
           
static String BaseSemanticAnalyzer.charSetString(String charSetName, String charSetString)
           
 void SemanticAnalyzer.doPhase1(ASTNode ast, QB qb, org.apache.hadoop.hive.ql.parse.SemanticAnalyzer.Phase1Ctx ctx_1)
           
 void SemanticAnalyzer.doPhase1QBExpr(ASTNode ast, QBExpr qbexpr, String id, String alias)
           
 Operator SemanticAnalyzer.genPlan(QB qb)
           
static BaseSemanticAnalyzer SemanticAnalyzerFactory.get(HiveConf conf, ASTNode tree)
           
 ColumnInfo RowResolver.get(String tab_alias, String col_alias)
          Gets the ColumnInfo for a tab_alias.col_alias column reference.
 void SemanticAnalyzer.getMetaData(QB qb)
           
 Object TypeCheckProcFactory.NullExprProcessor.process(Node nd, Stack<Node> stack, NodeProcessorCtx procCtx, Object... nodeOutputs)
           
 Object TypeCheckProcFactory.NumExprProcessor.process(Node nd, Stack<Node> stack, NodeProcessorCtx procCtx, Object... nodeOutputs)
           
 Object TypeCheckProcFactory.StrExprProcessor.process(Node nd, Stack<Node> stack, NodeProcessorCtx procCtx, Object... nodeOutputs)
           
 Object TypeCheckProcFactory.BoolExprProcessor.process(Node nd, Stack<Node> stack, NodeProcessorCtx procCtx, Object... nodeOutputs)
           
 Object TypeCheckProcFactory.ColumnExprProcessor.process(Node nd, Stack<Node> stack, NodeProcessorCtx procCtx, Object... nodeOutputs)
           
 Object TypeCheckProcFactory.DefaultExprProcessor.process(Node nd, Stack<Node> stack, NodeProcessorCtx procCtx, Object... nodeOutputs)
           
 Object PrintOpTreeProcessor.process(Node nd, Stack<Node> stack, NodeProcessorCtx ctx, Object... nodeOutputs)
           
static exprNodeDesc TypeCheckProcFactory.processGByExpr(Node nd, Object procCtx)
          Function to do groupby subexpression elimination.
 org.apache.hadoop.fs.Path[] SamplePruner.prune(Partition part)
          Prunes the partition, returning the files that satisfy the TABLESAMPLE clause.
static String BaseSemanticAnalyzer.stripQuotes(String val)
           
 void GenMapRedWalker.walk(Node nd)
          Walk the given operator.
 

Constructors in org.apache.hadoop.hive.ql.parse that throw SemanticException
BaseSemanticAnalyzer.tableSpec(Hive db, ASTNode ast, boolean forceCreatePartition)
           
BaseSemanticAnalyzer(HiveConf conf)
           
DDLSemanticAnalyzer(HiveConf conf)
           
ExplainSemanticAnalyzer(HiveConf conf)
           
FunctionSemanticAnalyzer(HiveConf conf)
           
LoadSemanticAnalyzer(HiveConf conf)
           
SemanticAnalyzer(HiveConf conf)
           
 
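Putting the parse entries together: SemanticAnalyzerFactory.get(HiveConf, ASTNode) picks the analyzer that matches the root of the AST (DDL, LOAD, EXPLAIN, function, or query), and analyze(ASTNode, Context) is where most SemanticExceptions originate. A sketch of the calling pattern, assuming HiveConf lives in org.apache.hadoop.hive.conf and Context in org.apache.hadoop.hive.ql; producing the ASTNode and the Context is outside this listing:

import org.apache.hadoop.hive.conf.HiveConf;
import org.apache.hadoop.hive.ql.Context;
import org.apache.hadoop.hive.ql.parse.ASTNode;
import org.apache.hadoop.hive.ql.parse.BaseSemanticAnalyzer;
import org.apache.hadoop.hive.ql.parse.SemanticAnalyzerFactory;
import org.apache.hadoop.hive.ql.parse.SemanticException;

public class AnalyzeExample {
  // Resolves the analyzer that matches the AST and runs semantic analysis.
  // A malformed query (unknown table, bad column reference, ...) surfaces
  // here as a SemanticException rather than at execution time.
  static void analyze(HiveConf conf, ASTNode tree, Context ctx) {
    try {
      BaseSemanticAnalyzer sem = SemanticAnalyzerFactory.get(conf, tree);
      sem.analyze(tree, ctx);
    } catch (SemanticException e) {
      System.err.println("FAILED: " + e.getMessage());
    }
  }
}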

Uses of SemanticException in org.apache.hadoop.hive.ql.tools
 

Methods in org.apache.hadoop.hive.ql.tools that throw SemanticException
 void LineageInfo.getLineageInfo(String query)
          Parses the given query and gets the lineage info.
static void LineageInfo.main(String[] args)
           
 Object LineageInfo.process(Node nd, Stack<Node> stack, NodeProcessorCtx procCtx, Object... nodeOutputs)
          Implements the process method for the NodeProcessor interface.
 
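The tools entries show the typical standalone use of this package: LineageInfo.getLineageInfo(String) parses a query and walks its AST, throwing SemanticException when the query cannot be analyzed. A minimal sketch; the no-arg constructor is an assumption, and the accessors for reading back the collected input/output tables are not part of this listing:

import org.apache.hadoop.hive.ql.parse.SemanticException;
import org.apache.hadoop.hive.ql.tools.LineageInfo;

public class LineageExample {
  public static void main(String[] args) {
    String query = "INSERT OVERWRITE TABLE dest SELECT src.key, src.value FROM src";
    try {
      LineageInfo lep = new LineageInfo();
      // Parses the query and collects lineage information; a query that
      // cannot be analyzed makes this throw SemanticException.
      lep.getLineageInfo(query);
    } catch (SemanticException e) {
      System.err.println("lineage extraction failed: " + e.getMessage());
    } catch (Exception e) {
      // Parse-level failures may surface as other exception types; the
      // exact types beyond SemanticException are not part of this listing.
      System.err.println("parse failed: " + e.getMessage());
    }
  }
}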



Copyright © 2009 The Apache Software Foundation