org.apache.nlpcraft
Type members
Classlikes
A context containing the fully parsed data from the input query.
- See also:
- Source:
- NCContext.scala
Conversation container. A conversation contains everything that should be implicitly remembered during the active, ongoing conversation and forgotten once the conversation ends or times out. A conversation contains the following elements:
- List of entities comprising a "short-term-memory" (STM) of this conversation.
- Chronological list of previously matched intents.
- Auto-expiring user data.
Note that the conversation is unique for a given combination of user and data model.
Conversation management is based on the idea of a "short-term-memory" (STM). STM can be viewed as a condensed short-term history of the input for a given user and data model. Every submitted user request that wasn't rejected is added to the conversation STM as a list of tokens. Existing STM tokens belonging to the same group are overridden by more recent tokens from the same group. Note also that tokens in STM automatically expire (i.e. the context is "forgotten") after a certain period of time and/or based on the depth of the conversation since the last mention.
You can also maintain a user state-machine between requests using the conversation's session. The conversation's data is a mutable, thread-safe container that can hold any arbitrary user data while supporting the same expiration logic as the rest of the conversation elements (i.e. tokens and previously matched intent IDs).
Conversation expiration policy is configured by two configuration properties:
- See also:
- Source:
- NCConversation.scala
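The group-override rule described above can be sketched in a few lines of Scala. The `Entity` case class and `updateStm` function below are illustrative stand-ins, not the actual NLPCraft API:

```scala
// Illustrative sketch of the STM override rule: a newer entity evicts
// existing STM entities that belong to the same group. Names here are
// hypothetical, not the actual NLPCraft API.
case class Entity(id: String, group: String)

// Prepend the fresh entity and drop older entities sharing its group.
def updateStm(stm: List[Entity], fresh: Entity): List[Entity] =
  fresh :: stm.filterNot(_.group == fresh.group)

val stm0 = List(Entity("city:paris", "GEO"), Entity("sort:asc", "SORT"))
val stm1 = updateStm(stm0, Entity("city:tokyo", "GEO"))
// "city:tokyo" overrides "city:paris"; "sort:asc" is untouched.
```

Time- and depth-based expiration would apply the same eviction logic on a timer or per-request counter.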
A type of rejection indicating that human curation is required. Curation typically indicates that the input query is likely valid but needs a human correction such as a typo fix, slang resolution, etc.
Note that NLPCraft does not handle the curation process itself but only allows signaling the curation request by throwing this exception. Curation is a special type of rejection. The user code is responsible for the actual handling of the curation logic, e.g. giving the user an option to fix the input and resubmit the request.
- Value parameters:
- cause
Optional cause of this exception.
- msg
Curation message.
- Source:
- NCCuration.scala
An item of the dialog flow. The dialog flow is a chronologically ordered list of dialog flow items. Each item represents a snapshot of the winning intent's match and its associated data. The list of dialog flow items is passed into a custom user-defined dialog flow match method.
- See also:
- Source:
- NCDialogFlowItem.scala
An entity is a collection of one or more tokens. An entity typically has a consistent semantic meaning and usually denotes a real-world object, such as a person, location, number, date and time, organization, product, etc. - where such objects can be abstract or have a physical existence. Entities are produced by NCEntityParser. See NCPipeline for documentation on the entities' place in the overall processing pipeline.
Note that both the NCToken and NCEntity interfaces extend the NCPropertyMap trait that allows them to store custom metadata properties. Parsers, enrichers, and validators for tokens and entities use this capability to store and check properties in tokens and entities.
- See also:
- Source:
- NCEntity.scala
A pipeline component that enriches entities by setting their properties. See NCPipeline for documentation on the overall processing pipeline. Note that this is an optional component in the pipeline.
- See also:
- Source:
- NCEntityEnricher.scala
A pipeline component that maps one set of entities into another after the entities are parsed and enriched. The entity mapper is an optional component, and the pipeline can have zero or more entity mappers. Mappers are typically used for combining several existing entities into a new one without touching the entity parser or enrichers. See NCPipeline for documentation on the overall processing pipeline.
- See also:
- Source:
- NCEntityMapper.scala
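The combine-several-into-one use case can be sketched as follows; the types, entity names, and `mapEntities` function are hypothetical illustrations, not the NCEntityMapper API:

```scala
// Hypothetical sketch of an entity mapper: combine a "num" entity followed by
// a "unit" entity into a single "quantity" entity. Types and names are
// illustrative, not the NCEntityMapper API.
case class MappedEnt(typ: String, text: String)

def mapEntities(in: List[MappedEnt]): List[MappedEnt] = in match {
  case MappedEnt("num", n) :: MappedEnt("unit", u) :: rest =>
    MappedEnt("quantity", s"$n $u") :: mapEntities(rest)
  case e :: rest => e :: mapEntities(rest)
  case Nil => Nil
}

val mapped = mapEntities(
  List(MappedEnt("num", "5"), MappedEnt("unit", "kg"), MappedEnt("word", "apples")))
```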
A pipeline component that converts a list of tokens into a list of entities.
A parser instance can produce NCEntity instances of different types. Each NCEntity instance contains a list of NCToken instances, and each NCToken instance can belong to one or more NCEntity instances. The order of the resulting entity list is not important.
See NCPipeline for documentation on the overall processing pipeline. Note that the pipeline must have at least one entity parser.
- See also:
- Source:
- NCEntityParser.scala
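A toy sketch of the token-to-entity conversion: digit-only tokens become "number" entities, each wrapping its source token. The types below are illustrative stand-ins; a real parser implements the NCEntityParser trait:

```scala
// Toy sketch of converting tokens to entities: digit-only tokens become
// "number" entities, each wrapping its source token. Illustrative only.
case class Tok(text: String)
case class ParsedEnt(typ: String, tokens: List[Tok])

def parseEntities(toks: List[Tok]): List[ParsedEnt] =
  toks.collect { case t if t.text.nonEmpty && t.text.forall(_.isDigit) =>
    ParsedEnt("number", List(t))
  }

val ents = parseEntities(List(Tok("top"), Tok("5"), Tok("cities")))
```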
A pipeline component that validates the final list of parsed and enriched entities. See NCPipeline for documentation on the overall processing pipeline. Note that this is an optional component.
This component can be used to perform any kind of last-stage validation before the list of entities is passed to intent matching. For example, this can be used for content-based access control, time-based control, location-based control, etc.
- See also:
- Source:
- NCEntityValidator.scala
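The access-control use case mentioned above can be sketched like this; all names below are hypothetical illustrations of the validate-then-reject pattern, not the NCEntityValidator API:

```scala
// Sketch of a last-stage, access-control style validation: reject the request
// if any parsed entity has a type the caller may not use. Names are hypothetical.
import scala.util.Try

case class CheckedEnt(typ: String)
class ValidationRejection(msg: String) extends Exception(msg)

def validateEntities(entities: List[CheckedEnt], allowed: Set[String]): Unit =
  entities.find(e => !allowed(e.typ)).foreach { e =>
    throw new ValidationRejection(s"Access denied for entity type: ${e.typ}")
  }

val passes = Try(validateEntities(List(CheckedEnt("city")), Set("city", "date"))).isSuccess
val denied = Try(validateEntities(List(CheckedEnt("salary")), Set("city"))).isFailure
```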
Base NLPCraft exception.
- Value parameters:
- cause
Optional cause exception.
- msg
Error message.
- Source:
- NCException.scala
Descriptor of the matched intent.
- Source:
- NCIntentMatch.scala
A control flow exception to skip the current intent. This exception can be thrown by the intent callback to indicate that the current intent should be skipped (even though it was matched and its callback was called). If more than one intent matched, the next best matching intent will be selected and its callback will be called.
This exception is useful when it is hard or impossible to encode the entire matching logic using just declarative IDL. In these cases the intent definition can be relaxed, and the "last mile" of intent matching can happen inside the intent callback's user logic. If it is determined that the intent in fact does not match, throwing this exception allows trying the next best matching intent, if any.
- See also:
- Source:
- NCIntentSkip.scala
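The skip control flow can be sketched as follows: candidate callbacks are tried in order of match quality, and one that throws the skip exception yields to the next. NLPCraft performs this selection internally; the names below are illustrative:

```scala
// Sketch of the skip control flow: a callback that throws the skip exception
// yields to the next best matching intent's callback. Illustrative names only.
class SkipIntent extends Exception

def fireFirst(callbacks: List[() => String]): Option[String] =
  callbacks.view
    .map(cb => try Some(cb()) catch { case _: SkipIntent => None })
    .collectFirst { case Some(r) => r }

val result = fireFirst(List(
  () => throw new SkipIntent, // matched best, but its "last mile" check failed
  () => "handled by the next best intent"
))
```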
Lifecycle callbacks for various pipeline components.
- See also:
- Source:
- NCLifecycle.scala
A descriptor of the intent callback returned by the NCModelClient.debugAsk method. This descriptor defines the callback for the intent that was detected as the winning intent but whose callback wasn't fired, as per the NCModelClient.debugAsk method's semantics.
Using this descriptor, the user can execute the callback, if necessary.
- See also:
- Source:
- NCMatchedCallback.scala
Data model.
The data model is a key entity in NLPCraft and contains:
- Model configuration.
- Model processing pipeline.
- Life-cycle callbacks.
NLPCraft employs a model-as-a-code approach where the entire data model is an implementation of just this interface. An instance of this interface is passed to the NCModelClient class. Note that the model-as-a-code approach natively supports any software lifecycle tools and frameworks such as build tools, CI/SCM tools, IDEs, etc. You don't need any additional tools to manage aspects of your data models - your entire model and all of its components are part of your project's source code.
- See also:
- Source:
- NCModel.scala
Client API to issue requests against a given model. This is the primary way of interacting with NLPCraft from the user perspective.
- Value parameters:
- mdl
A data model to issue requests against.
- Source:
- NCModelClient.scala
Model configuration factory.
- Companion:
- class
- Source:
- NCModelConfig.scala
Model configuration container.
- See also:
- Companion:
- object
- Source:
- NCModelConfig.scala
NLP processing pipeline for the input request. The pipeline is associated with the model.
An NLP pipeline is a container for the sequence of processing components that take the input text at the beginning of the pipeline and produce the list of variants at the end. Schematically the pipeline looks like this:
Text Input
  -> Token Parser
  -> Token Enrichers
  -> Token Validators
  -> Entity Parsers
  -> Entity Enrichers
  -> Entity Validators
  -> Entity Mappers
  -> Variant Filters
  -> Variants List
The resulting variants are then passed further to intent matching. Note that only one token parser and at least one entity parser are required for the minimal pipeline.
- See also:
- Source:
- NCPipeline.scala
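The stage sequence above can be sketched as plain function composition, each stage consuming the previous stage's output. These are simplified stand-ins for the real components, not the NCPipeline API:

```scala
// Simplified pipeline sketch: text -> tokens -> entities -> variants.
// Enrichers, validators, and mappers would slot in between the same way.
type Token = String
type Entity = String
type Variant = List[Entity]

val tokenParser: String => List[Token] = _.split("\\s+").toList
val entityParser: List[Token] => List[Entity] = _.filter(_.nonEmpty)
val variantFilter: List[Variant] => List[Variant] = _.filter(_.nonEmpty)

def runPipeline(text: String): List[Variant] =
  variantFilter(List(entityParser(tokenParser(text))))

val variants = runPipeline("show sales")
```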
Convenient builder for NCPipeline instances.
- Source:
- NCPipelineBuilder.scala
Map-like container that provides support for mutable runtime-only properties or metadata.
- See also:
- Source:
- NCPropertyMap.scala
Convenient adapter for the NCPropertyMap interface. Note that this class uses ConcurrentHashMap for its implementation, making access to it thread-safe.
- Source:
- NCPropertyMapAdapter.scala
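A minimal thread-safe property map in the same spirit, backed by java.util.concurrent.ConcurrentHashMap, can look like this. The class and method names are illustrative; see NCPropertyMap for the actual interface:

```scala
// Minimal thread-safe property map backed by ConcurrentHashMap.
// Illustrative only; not the NCPropertyMapAdapter implementation.
import java.util.concurrent.ConcurrentHashMap

class SimplePropertyMap {
  private val props = new ConcurrentHashMap[String, Any]()
  def put[T](key: String, value: T): Unit = { props.put(key, value); () }
  def get[T](key: String): Option[T] =
    Option(props.get(key)).map(_.asInstanceOf[T])
}

val meta = { val m = new SimplePropertyMap; m.put("confidence", 0.97); m }
```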
An exception to indicate a rejection of the user input. This exception is either thrown automatically by the processing logic or can be thrown by the user from the intent callback.
This exception typically indicates that the user has not provided enough information in the input string to have it processed automatically, without additional curation or modification. In most cases this means that the user's input is either too short or too simple, too long or too complex, missing required context, or unrelated to the requested data model.
- Source:
- NCRejection.scala
Descriptor for the input user request. The user request descriptor can be obtained via the NCContext.getRequest method.
- See also:
- Source:
- NCRequest.scala
Convenient factory for creating NCResult instances.
- Companion:
- class
- Source:
- NCResult.scala
Represents a contiguous substring of the original input text produced by NCTokenParser. A token is the result of tokenization - the process of demarcating and classifying sections of a string of input characters. See NCPipeline for documentation on the token's place in the overall processing pipeline.
Note that both the NCToken and NCEntity interfaces extend the NCPropertyMap interface that allows them to store custom metadata properties. Parsers, enrichers, and validators for tokens and entities use this capability to store and check properties in tokens and entities.
- See also:
- Source:
- NCToken.scala
An optional pipeline component that can enrich previously parsed tokens. See NCPipeline for documentation on the token enricher's place in the overall processing pipeline.
- See also:
- Source:
- NCTokenEnricher.scala
A tokenizer that splits the given text into a list of NCToken objects. This is one of the user-defined components of the processing pipeline. See NCPipeline for documentation on the token parser's place in the overall processing pipeline.
- See also:
- Source:
- NCTokenParser.scala
A pipeline component that validates a list of tokens produced by the token parser before they are sent further down the pipeline. This is one of the user-defined components of the processing pipeline. See NCPipeline for documentation on the token validator's place in the overall processing pipeline.
- See also:
- Source:
- NCTokenValidator.scala
A parsing variant is a list of entities defining one possible parsing of the input query. Note that given user input almost always has one or more possible parsing variants. Furthermore, depending on the model configuration, user input can produce hundreds or even thousands of parsing variants.
The pipeline provides a user-defined variant filter component, NCVariantFilter, to allow programmatic filtering of the variants. Note that even a few dozen variants can significantly slow down overall NLPCraft processing.
- See also:
NCModel.onVariant()
- Source:
- NCVariant.scala
A pipeline component that filters out unnecessary parsing variants. Note that given user input almost always has one or more possible parsing variants. Furthermore, depending on the model configuration, user input can produce hundreds or even thousands of parsing variants. This is one of the user-defined components of the processing pipeline. See NCPipeline for documentation on the variant filter's place in the overall processing pipeline.
- See also:
- Source:
- NCVariantFilter.scala
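A variant filter can be sketched as a simple predicate over variants, e.g. keeping only those that contain an entity of a required type. The names below are illustrative, not the NCVariantFilter API:

```scala
// Sketch of a variant filter: keep only variants containing an entity of a
// required type, shrinking the set passed on to intent matching.
case class ParseVariant(entityTypes: List[String])

def filterVariants(vs: List[ParseVariant], required: String): List[ParseVariant] =
  vs.filter(_.entityTypes.contains(required))

val kept = filterVariants(
  List(ParseVariant(List("city", "date")), ParseVariant(List("noun"))),
  required = "city")
```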
Value members
Concrete methods
Global syntax sugar for throwing NCException.
- Value parameters:
- cause
Optional cause.
- msg
Exception message.
- Source:
- NCGlobals.scala
Extensions
Provides an equality check between two options' values. This check is true when both options are defined and have the same value.
- Value parameters:
- x
Option to compare.
- Source:
- NCGlobals.scala
Provides an equality check between an option and a value of the same type. This check is true when the option is defined and its value is equal to the given value.
- Value parameters:
- x
Value to compare the option to.
- Source:
- NCGlobals.scala
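The two checks above can be sketched as Scala 3 extension methods. The operator name `===` is an assumption based on common convention; the semantics follow the text - the check is true only when the option side is defined:

```scala
// Sketch of the two option equality checks. Operator name is assumed.
extension [T](opt: Option[T])
  def ===(other: Option[T]): Boolean =
    opt.isDefined && other.isDefined && opt == other
  def ===(v: T): Boolean = opt.contains(v)

val sameValue = Option(5) === Option(5)   // both defined, equal: true
val undefined = (None: Option[Int]) === 5 // undefined option: false
```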
Converts milliseconds int value to days.
- Source:
- NCGlobals.scala
Converts bytes int value to gigabytes.
- Source:
- NCGlobals.scala
Converts milliseconds int value to hours.
- Source:
- NCGlobals.scala
Converts bytes int value to kilobytes.
- Source:
- NCGlobals.scala
Converts bytes int value to megabytes.
- Source:
- NCGlobals.scala
Converts milliseconds int value to minutes.
- Source:
- NCGlobals.scala
Converts milliseconds int value to seconds.
- Source:
- NCGlobals.scala
Converts bytes int value to terabytes.
- Source:
- NCGlobals.scala
Converts milliseconds long value to days.
- Source:
- NCGlobals.scala
Converts bytes long value to gigabytes.
- Source:
- NCGlobals.scala
Converts milliseconds long value to hours.
- Source:
- NCGlobals.scala
Converts bytes long value to kilobytes.
- Source:
- NCGlobals.scala
Converts bytes long value to megabytes.
- Source:
- NCGlobals.scala
Converts milliseconds long value to minutes.
- Source:
- NCGlobals.scala
Converts milliseconds long value to seconds.
- Source:
- NCGlobals.scala
Converts bytes long value to terabytes.
- Source:
- NCGlobals.scala
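The conversion extensions above can be sketched as Scala 3 extension methods using integer division. The exact factors (e.g. 1024 vs 1000 for bytes) are assumptions inferred from the method names, not taken from the NLPCraft source:

```scala
// Sketch of the millisecond and byte conversion extensions. Method names and
// factors are assumptions; integer division truncates toward zero.
extension (ms: Long)
  def msToSecs: Long = ms / 1000
  def msToMins: Long = ms / (60 * 1000)
  def msToHours: Long = ms / (60 * 60 * 1000)
  def msToDays: Long = ms / (24L * 60 * 60 * 1000)

extension (bytes: Long)
  def toKb: Long = bytes / 1024
  def toMb: Long = bytes / (1024 * 1024)
  def toGb: Long = bytes / (1024L * 1024 * 1024)
  def toTb: Long = bytes / (1024L * 1024 * 1024 * 1024)
```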