NGramTokenizer Class

Definition

Tokenizes the input into n-grams of the given size(s). This tokenizer is implemented using Apache Lucene.

C#
public class NGramTokenizer : Azure.Search.Documents.Indexes.Models.LexicalTokenizer

F#
type NGramTokenizer = class
    inherit LexicalTokenizer

VB
Public Class NGramTokenizer
Inherits LexicalTokenizer
Inheritance
LexicalTokenizer → NGramTokenizer

Constructors

NGramTokenizer(String)

Initializes a new instance of NGramTokenizer.

Properties

MaxGram

The maximum n-gram length. Default is 2. Maximum is 300.

MinGram

The minimum n-gram length. Default is 1. Maximum is 300. Must be less than the value of maxGram.
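To make the effect of these two bounds concrete, here is a minimal standalone sketch of n-gram generation (it is not the Apache Lucene implementation behind this tokenizer, and the order in which tokens are emitted may differ): every substring of the input whose length falls between minGram and maxGram becomes a token.

```csharp
using System;
using System.Collections.Generic;

// Standalone sketch of n-gram generation (illustrative only, not the
// Lucene implementation): every substring whose length is between
// minGram and maxGram becomes a token.
static class NGramSketch
{
    public static List<string> Tokenize(string input, int minGram, int maxGram)
    {
        if (minGram > maxGram)
            throw new ArgumentException("minGram must not exceed maxGram");
        var tokens = new List<string>();
        for (int size = minGram; size <= maxGram; size++)
            for (int start = 0; start + size <= input.Length; start++)
                tokens.Add(input.Substring(start, size));
        return tokens;
    }

    public static void Main()
    {
        // With the defaults (MinGram = 1, MaxGram = 2), "abc" yields the
        // single characters and the bigrams.
        Console.WriteLine(string.Join(",", Tokenize("abc", 1, 2)));
        // → a,b,c,ab,bc
    }
}
```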

Name

The name of the tokenizer. It must contain only letters, digits, spaces, dashes, or underscores; it can start and end only with alphanumeric characters; and it is limited to 128 characters.

(Inherited from LexicalTokenizer)
TokenChars

Character classes to keep in the tokens.
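To illustrate how kept character classes interact with gram generation, here is a hedged standalone sketch (again, not the Lucene implementation): characters outside the kept classes act as token boundaries, and n-grams are produced only within each run of kept characters. For illustration the kept class is hard-coded to letters; the real property accepts a set of character classes.

```csharp
using System;
using System.Collections.Generic;
using System.Text.RegularExpressions;

// Standalone sketch of the TokenChars idea (illustrative only): characters
// outside the kept class (letters here) split the input into runs, and
// n-grams are generated inside each run, never across a boundary.
static class TokenCharsSketch
{
    public static List<string> Tokenize(string input, int minGram, int maxGram)
    {
        var grams = new List<string>();
        // Non-letters act as boundaries: "ab1cd" -> runs "ab" and "cd".
        foreach (var run in Regex.Split(input, "[^A-Za-z]+"))
        {
            if (run.Length == 0) continue;
            for (int size = minGram; size <= maxGram; size++)
                for (int start = 0; start + size <= run.Length; start++)
                    grams.Add(run.Substring(start, size));
        }
        return grams;
    }

    public static void Main()
    {
        // "b1c" never yields the gram "bc": the digit is a boundary.
        Console.WriteLine(string.Join(",", Tokenize("ab1cd", 1, 2)));
        // → a,b,ab,c,d,cd
    }
}
```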
