| Interface Summary | |
|---|---|
| BaseTokenStreamTestCase.CheckClearAttributesAttribute | Attribute that records if it was cleared or not. |

| Class Summary | |
|---|---|
| BaseTokenStreamTestCase | Base class for all Lucene unit tests that use TokenStreams. |
| BaseTokenStreamTestCase.CheckClearAttributesAttributeImpl | Attribute that records if it was cleared or not. |
| CannedTokenStream | TokenStream from a canned list of Tokens. |
| CollationTestBase | Base test class for testing Unicode collation. |
| EmptyTokenizer | Emits no tokens. |
| LookaheadTokenFilter<T extends LookaheadTokenFilter.Position> | An abstract TokenFilter to make it easier to build graph token filters requiring some lookahead. |
| LookaheadTokenFilter.Position | Holds all state for a single position; subclass this to record other state at each position. |
| MockAnalyzer | Analyzer for testing. |
| MockCharFilter | CharFilter that deliberately sends offsets out of bounds, to catch analyzers that skip correctOffset or do incorrect offset math. |
| MockFixedLengthPayloadFilter | TokenFilter that adds random fixed-length payloads. |
| MockGraphTokenFilter | Randomly inserts overlapped (posInc=0) tokens with posLength sometimes > 1. |
| MockHoleInjectingTokenFilter | Randomly injects holes (position gaps) into the token stream, similar to what a stop filter would do. |
| MockRandomLookaheadTokenFilter | Uses LookaheadTokenFilter to randomly peek at future tokens. |
| MockReaderWrapper | Wraps a Reader, and can throw random or fixed exceptions and spoon-feed the chars it returns from read. |
| MockTokenizer | Tokenizer for testing. |
| MockVariableLengthPayloadFilter | TokenFilter that adds random variable-length payloads. |
| TokenStreamToDot | Consumes a TokenStream and outputs a Graphviz dot representation of the token graph. |
| ValidatingTokenFilter | A TokenFilter that checks consistency of the tokens (e.g. that offsets are consistent with one another). |
| VocabularyAssert | Utility class for vocabulary-based stemming tests. |