Class JavaScriptTarget
java.lang.Object
    org.antlr.codegen.Target
        org.antlr.codegen.JavaScriptTarget

public class JavaScriptTarget extends Target
Field Summary

Fields inherited from class org.antlr.codegen.Target
targetCharValueEscape
Constructor Summary

Constructor
JavaScriptTarget()
Method Summary

Modifier and Type     Method                                       Description
java.lang.String      encodeIntAsCharEscape(int v)                 Convert an int to a JavaScript Unicode character literal.
java.lang.String      getTarget64BitStringFromValue(long word)     Convert a long to two 32-bit numbers separated by a comma.

Methods inherited from class org.antlr.codegen.Target
genRecognizerFile, genRecognizerHeaderFile, getMaxCharValue, getTargetCharLiteralFromANTLRCharLiteral, getTargetStringLiteralFromANTLRStringLiteral, getTargetStringLiteralFromString, getTargetStringLiteralFromString, getTokenTypeAsTargetLabel, isValidActionScope, performGrammarAnalysis, postProcessAction, useBaseTemplatesForSynPredFragments
Method Detail
encodeIntAsCharEscape

public java.lang.String encodeIntAsCharEscape(int v)

Convert an int to a JavaScript Unicode character literal. The current JavaScript spec (ECMA-262) doesn't provide for octal notation in String literals, although some implementations support it. This method overrides the parent class so that characters are always encoded as Unicode literals (e.g. a \uXXXX escape).

Overrides:
encodeIntAsCharEscape in class Target
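For illustration, here is a minimal Java sketch of this kind of encoding, assuming a zero-padded four-digit \uXXXX form; the class and helper names below are hypothetical, not ANTLR's actual implementation:

    // Hypothetical sketch: encode an int character value as a JavaScript
    // Unicode escape of the form \uXXXX (never octal), as described above.
    public class UnicodeEscapeDemo {
        static String encodeAsUnicodeEscape(int v) {
            // Zero-pad to four hex digits; mask to the 16-bit BMP range.
            return "\\u" + String.format("%04x", v & 0xFFFF);
        }

        public static void main(String[] args) {
            System.out.println(encodeAsUnicodeEscape(0x11));  // \u0011
            System.out.println(encodeAsUnicodeEscape('A'));   // \u0041
        }
    }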
getTarget64BitStringFromValue

public java.lang.String getTarget64BitStringFromValue(long word)

Convert a long to two 32-bit numbers separated by a comma. JavaScript does not support 64-bit numbers, so we need to break the number into two 32-bit literals to pass to the BitSet constructor. A number like 0xHHHHHHHHLLLLLLLL is broken into the string "0xLLLLLLLL, 0xHHHHHHHH". Note that the low-order bits come first, followed by the high-order bits; this matches how the BitSet constructor works, where the bits are passed in 32-bit chunks with the low-order bits first. Note: these two methods were taken from the ActionScript target.

Overrides:
getTarget64BitStringFromValue in class Target
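To make the split concrete, here is a minimal Java sketch of the low-word-first conversion described above; the class and method names are hypothetical, not ANTLR's actual implementation:

    // Hypothetical sketch: split a 64-bit value into two 32-bit hex
    // literals, low-order word first, as described above.
    public class Long64SplitDemo {
        static String toTwo32BitLiterals(long word) {
            long low = word & 0xFFFFFFFFL;           // low-order 32 bits
            long high = (word >>> 32) & 0xFFFFFFFFL; // high-order 32 bits
            return String.format("0x%08X, 0x%08X", low, high);
        }

        public static void main(String[] args) {
            // 0x0000000100000002 -> "0x00000002, 0x00000001" (low word first)
            System.out.println(toTwo32BitLiterals(0x0000000100000002L));
        }
    }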