Class TokenStream
- All Implemented Interfaces:
Closeable, AutoCloseable
- Direct Known Subclasses:
TokenFilter, Tokenizer
TokenStream enumerates the sequence of tokens, either from Fields of a
Document or from query text.
This is an abstract class; concrete subclasses are:
- Tokenizer, a TokenStream whose input is a Reader; and
- TokenFilter, a TokenStream whose input is another TokenStream.
TokenStream extends AttributeSource, which provides access to all of the
token Attributes for the TokenStream. Note that only one instance per AttributeImpl is created and reused for every token. This approach reduces object creation and
allows local caching of references to the AttributeImpls. See incrementToken()
for further details.
The workflow of the new TokenStream API is as follows:
- Instantiation of TokenStream/TokenFilters which add/get attributes to/from the AttributeSource.
- The consumer calls reset().
- The consumer retrieves attributes from the stream and stores local references to all attributes it wants to access.
- The consumer calls incrementToken() until it returns false, consuming the attributes after each call.
- The consumer calls end() so that any end-of-stream operations can be performed.
- The consumer calls close() to release any resources when finished using the TokenStream.
To ensure that filters and consumers know which attributes are available, the attributes must be added during instantiation. Filters and consumers are not required to check for availability of attributes in incrementToken().
You can find some example code for the new API in the analysis package level Javadoc.
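For illustration, here is a minimal consumer sketch that follows the workflow above. The class name PrintTokens, the field name "body", and the Analyzer instance are placeholders; the TokenStream and attribute calls are the standard API:

  import java.io.IOException;

  import org.apache.lucene.analysis.Analyzer;
  import org.apache.lucene.analysis.TokenStream;
  import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
  import org.apache.lucene.analysis.tokenattributes.OffsetAttribute;

  public class PrintTokens {
    // Prints every token produced by the analyzer for the given text.
    public static void print(Analyzer analyzer, String text) throws IOException {
      // Instantiation: the analyzer builds the TokenStream/TokenFilter chain.
      try (TokenStream stream = analyzer.tokenStream("body", text)) {
        // Retrieve the attributes once and keep local references to them.
        CharTermAttribute term = stream.addAttribute(CharTermAttribute.class);
        OffsetAttribute offset = stream.addAttribute(OffsetAttribute.class);

        stream.reset();                    // mandatory before incrementToken()
        while (stream.incrementToken()) {  // advance until end of stream
          System.out.println(term.toString() + " [" + offset.startOffset()
              + "-" + offset.endOffset() + "]");
        }
        stream.end();                      // end-of-stream operations
      }                                    // close() via try-with-resources
    }
  }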
Sometimes it is desirable to capture the current state of a TokenStream, e.g., for buffering purposes (see CachingTokenFilter, TeeSinkTokenFilter). For this use case, AttributeSource.captureState() and AttributeSource.restoreState(org.apache.lucene.util.AttributeSource.State) can be used.
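As a rough sketch of this pattern, the hypothetical filter below buffers all tokens of its input on the first call and then replays the captured states; end-of-stream and reset handling are omitted. Only captureState(), restoreState() and the TokenFilter plumbing are the actual API; the class itself is not part of Lucene:

  import java.io.IOException;
  import java.util.ArrayList;
  import java.util.Iterator;
  import java.util.List;

  import org.apache.lucene.analysis.TokenFilter;
  import org.apache.lucene.analysis.TokenStream;
  import org.apache.lucene.util.AttributeSource;

  public final class BufferAllFilter extends TokenFilter {
    private final List<AttributeSource.State> cache = new ArrayList<>();
    private Iterator<AttributeSource.State> replay;

    public BufferAllFilter(TokenStream input) {
      super(input);   // share the attributes of the wrapped stream
    }

    @Override
    public boolean incrementToken() throws IOException {
      if (replay == null) {
        // First pass: consume the input and snapshot each token's attributes.
        while (input.incrementToken()) {
          cache.add(captureState());
        }
        replay = cache.iterator();
      }
      if (!replay.hasNext()) {
        return false;                // end of stream
      }
      restoreState(replay.next());   // copy the captured values back
      return true;
    }
  }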
The TokenStream API in Lucene is based on the decorator pattern. Therefore, all non-abstract subclasses must be final or at least have a final implementation of incrementToken()! This is checked when Java assertions are enabled.
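A minimal conforming subclass might look like the sketch below (a hypothetical pass-through filter): the class itself is not final, so its incrementToken() implementation is.

  import java.io.IOException;

  import org.apache.lucene.analysis.TokenFilter;
  import org.apache.lucene.analysis.TokenStream;

  public class NoOpFilter extends TokenFilter {
    public NoOpFilter(TokenStream input) {
      super(input);
    }

    @Override
    public final boolean incrementToken() throws IOException {
      return input.incrementToken();   // pass tokens through unchanged
    }
  }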
-
Nested Class Summary
Nested classes/interfaces inherited from class org.apache.lucene.util.AttributeSource
AttributeSource.State
-
Field Summary
Fields
- static final AttributeFactory DEFAULT_TOKEN_ATTRIBUTE_FACTORY: Default AttributeFactory instance that should be used for TokenStreams.
-
Constructor Summary
Constructors
- protected TokenStream(): A TokenStream using the default attribute factory.
- protected TokenStream(AttributeFactory factory): A TokenStream using the supplied AttributeFactory for creating new Attribute instances.
- protected TokenStream(AttributeSource input): A TokenStream that uses the same attributes as the supplied one.
-
Method Summary
- void close(): Releases resources associated with this stream.
- void end(): This method is called by the consumer after the last token has been consumed, after incrementToken() returned false (using the new TokenStream API).
- abstract boolean incrementToken(): Consumers (i.e., IndexWriter) use this method to advance the stream to the next token.
- void reset(): This method is called by a consumer before it begins consumption using incrementToken().
Methods inherited from class org.apache.lucene.util.AttributeSource
addAttribute, addAttributeImpl, captureState, clearAttributes, cloneAttributes, copyTo, endAttributes, equals, getAttribute, getAttributeClassesIterator, getAttributeFactory, getAttributeImplsIterator, hasAttribute, hasAttributes, hashCode, reflectAsString, reflectWith, removeAllAttributes, restoreState, toString
-
Field Details
-
DEFAULT_TOKEN_ATTRIBUTE_FACTORY
static final AttributeFactory DEFAULT_TOKEN_ATTRIBUTE_FACTORY
Default AttributeFactory instance that should be used for TokenStreams.
-
-
Constructor Details
-
TokenStream
protected TokenStream()
A TokenStream using the default attribute factory.
-
TokenStream
protected TokenStream(AttributeSource input)
A TokenStream that uses the same attributes as the supplied one.
-
TokenStream
protected TokenStream(AttributeFactory factory)
A TokenStream using the supplied AttributeFactory for creating new Attribute instances.
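As an illustration of direct subclassing, the hypothetical stream below relies on the implicit call to the protected no-arg constructor, i.e. the default attribute factory; passing an AttributeFactory or an AttributeSource to super() would work the same way:

  import java.io.IOException;

  import org.apache.lucene.analysis.TokenStream;
  import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

  // Emits exactly one token containing the supplied value.
  public final class SingleTokenStream extends TokenStream {
    private final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class);
    private final String value;
    private boolean used;

    public SingleTokenStream(String value) {
      this.value = value;       // implicit super(): default attribute factory
    }

    @Override
    public boolean incrementToken() {
      if (used) {
        return false;           // end of stream
      }
      clearAttributes();
      termAtt.append(value);    // set the term text of the single token
      used = true;
      return true;
    }

    @Override
    public void reset() throws IOException {
      super.reset();
      used = false;             // allow reuse after reset()
    }
  }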
-
-
Method Details
-
incrementToken
public abstract boolean incrementToken() throws IOException
Consumers (i.e., IndexWriter) use this method to advance the stream to the next token. Implementing classes must implement this method and update the appropriate AttributeImpls with the attributes of the next token.
The producer must make no assumptions about the attributes after the method has returned: the caller may arbitrarily change it. If the producer needs to preserve the state for subsequent calls, it can use AttributeSource.captureState() to create a copy of the current attribute state.
This method is called for every token of a document, so an efficient implementation is crucial for good performance. To avoid calls to AttributeSource.addAttribute(Class) and AttributeSource.getAttribute(Class), references to all AttributeImpls that this stream uses should be retrieved during instantiation.
To ensure that filters and consumers know which attributes are available, the attributes must be added during instantiation. Filters and consumers are not required to check for availability of attributes in incrementToken().
- Returns: false for end of stream; true otherwise
- Throws: IOException
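The hypothetical filter below follows these rules: it is final, and the single attribute reference it needs is obtained during instantiation, never inside incrementToken(). (The per-character upper-casing is deliberately naive and only meant to show the pattern.)

  import java.io.IOException;

  import org.apache.lucene.analysis.TokenFilter;
  import org.apache.lucene.analysis.TokenStream;
  import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

  public final class NaiveUpperCaseFilter extends TokenFilter {
    // Attribute reference retrieved once, at construction time.
    private final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class);

    public NaiveUpperCaseFilter(TokenStream input) {
      super(input);
    }

    @Override
    public boolean incrementToken() throws IOException {
      if (!input.incrementToken()) {
        return false;                    // end of stream
      }
      char[] buffer = termAtt.buffer();
      int length = termAtt.length();
      for (int i = 0; i < length; i++) {
        buffer[i] = Character.toUpperCase(buffer[i]);   // update in place
      }
      return true;
    }
  }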
-
end
public void end() throws IOException
This method is called by the consumer after the last token has been consumed, after incrementToken() returned false (using the new TokenStream API). Streams implementing the old API should upgrade to use this feature.
This method can be used to perform any end-of-stream operations, such as setting the final offset of a stream. The final offset of a stream might differ from the offset of the last token, e.g., when one or more whitespace characters followed the last token but a WhitespaceTokenizer was used.
Additionally, any skipped positions (such as those removed by a stop filter) can be applied to the position increment, as can any adjustment of other attributes where the end-of-stream value may be important.
If you override this method, always call super.end().
- Throws: IOException - If an I/O error occurs
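For example, a token-removing filter can accumulate the position increments of dropped tokens and report any trailing skipped positions in end(). The sketch below is a hypothetical filter modeled on this pattern; note that end() calls super.end() first:

  import java.io.IOException;

  import org.apache.lucene.analysis.TokenFilter;
  import org.apache.lucene.analysis.TokenStream;
  import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;
  import org.apache.lucene.analysis.tokenattributes.PositionIncrementAttribute;

  // Removes single-character terms but preserves their position increments.
  public final class DropShortTermsFilter extends TokenFilter {
    private final CharTermAttribute termAtt = addAttribute(CharTermAttribute.class);
    private final PositionIncrementAttribute posIncrAtt =
        addAttribute(PositionIncrementAttribute.class);
    private int skippedPositions;

    public DropShortTermsFilter(TokenStream input) {
      super(input);
    }

    @Override
    public boolean incrementToken() throws IOException {
      skippedPositions = 0;
      while (input.incrementToken()) {
        if (termAtt.length() > 1) {        // keep this token
          posIncrAtt.setPositionIncrement(
              posIncrAtt.getPositionIncrement() + skippedPositions);
          return true;
        }
        skippedPositions += posIncrAtt.getPositionIncrement();   // dropped
      }
      return false;
    }

    @Override
    public void end() throws IOException {
      super.end();   // always call super.end() first
      // Account for positions skipped after the last emitted token.
      posIncrAtt.setPositionIncrement(
          posIncrAtt.getPositionIncrement() + skippedPositions);
    }

    @Override
    public void reset() throws IOException {
      super.reset();
      skippedPositions = 0;
    }
  }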
-
reset
public void reset() throws IOException
This method is called by a consumer before it begins consumption using incrementToken().
Resets this stream to a clean state. Stateful implementations must implement this method so that they can be reused, just as if they had been created fresh.
If you override this method, always call super.reset(), otherwise some internal state will not be correctly reset (e.g., Tokenizer will throw IllegalStateException on further usage).
- Throws: IOException
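Because reset() returns the stream to a clean state, a single instance can be reused for several inputs. The sketch below assumes the WhitespaceTokenizer from the analysis-common module; the helper class itself is hypothetical:

  import java.io.IOException;
  import java.io.StringReader;
  import java.util.List;

  import org.apache.lucene.analysis.core.WhitespaceTokenizer;
  import org.apache.lucene.analysis.tokenattributes.CharTermAttribute;

  public class ReuseTokenizer {
    public static void tokenizeAll(List<String> docs) throws IOException {
      WhitespaceTokenizer tokenizer = new WhitespaceTokenizer();
      CharTermAttribute term = tokenizer.addAttribute(CharTermAttribute.class);
      for (String doc : docs) {
        tokenizer.setReader(new StringReader(doc));   // supply the next input
        tokenizer.reset();                            // back to a clean state
        while (tokenizer.incrementToken()) {
          System.out.println(term.toString());
        }
        tokenizer.end();
        tokenizer.close();                            // permits setReader() again
      }
    }
  }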
-
close
public void close() throws IOException
Releases resources associated with this stream.
If you override this method, always call super.close(), otherwise some internal state will not be correctly reset (e.g., Tokenizer will throw IllegalStateException on reuse).
- Specified by: close in interface AutoCloseable
- Specified by: close in interface Closeable
- Throws: IOException
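A stream that owns an external resource would typically release it here, in addition to calling super.close(). The filter below is a hypothetical sketch; the wrapped Closeable stands in for whatever the stream actually holds:

  import java.io.Closeable;
  import java.io.IOException;

  import org.apache.lucene.analysis.TokenFilter;
  import org.apache.lucene.analysis.TokenStream;

  public final class ResourceBackedFilter extends TokenFilter {
    private final Closeable externalResource;   // e.g., a dictionary handle

    public ResourceBackedFilter(TokenStream input, Closeable externalResource) {
      super(input);
      this.externalResource = externalResource;
    }

    @Override
    public boolean incrementToken() throws IOException {
      return input.incrementToken();   // resource lookup omitted in this sketch
    }

    @Override
    public void close() throws IOException {
      try {
        externalResource.close();      // release what this filter owns
      } finally {
        super.close();                 // always call super.close() as well
      }
    }
  }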
-