Commit 5fc4908 (parent c20bc9f)

Docs: Corrected behaviour of max_token_length in standard tokenizer

File tree

1 file changed (+1, −1)


docs/reference/analysis/tokenizers/standard-tokenizer.asciidoc

Lines changed: 1 addition & 1 deletion
@@ -13,6 +13,6 @@ type:
 |=======================================================================
 |Setting |Description
 |`max_token_length` |The maximum token length. If a token is seen that
-exceeds this length then it is discarded. Defaults to `255`.
+exceeds this length then it is split at `max_token_length` intervals. Defaults to `255`.
 |=======================================================================
 

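The corrected wording matches what the standard tokenizer actually does: a token longer than `max_token_length` is split into chunks of at most `max_token_length` characters rather than being discarded. As a minimal sketch of that behaviour (not part of this commit; the local node address is assumed, and an inline tokenizer definition in `_analyze` requires Elasticsearch 5.0 or later):

# Hypothetical request: tokenize "jumping" with the standard tokenizer
# limited to 5 characters per token.
curl -X POST "localhost:9200/_analyze?pretty" -H 'Content-Type: application/json' -d'
{
  "tokenizer": { "type": "standard", "max_token_length": 5 },
  "text": "jumping"
}'
# Expected tokens: "jumpi" and "ng"; the over-long token is split at
# max_token_length intervals, not dropped.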