.. currentmodule:: pythainlp.tokenize

.. _tokenize-doc:

pythainlp.tokenize
==================

The :mod:`pythainlp.tokenize` module contains functions for tokenizing a chunk of Thai text into desired units such as words, subwords, and sentences.

.. autofunction:: word_tokenize
.. autofunction:: dict_word_tokenize
.. autofunction:: subword_tokenize
.. autofunction:: sent_tokenize
.. autofunction:: isthai
.. autofunction:: create_custom_dict_trie
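Because Thai is written without spaces between words, word segmentation relies on a dictionary. The sketch below illustrates the general idea behind dictionary-based segmenters such as :func:`dict_word_tokenize`, using greedy longest matching. Note that this is *not* PyThaiNLP's actual implementation (its default engine uses maximal matching over a trie); it is a minimal, self-contained illustration with a hypothetical toy dictionary.

.. code-block:: python

    # Minimal sketch of greedy longest-matching dictionary tokenization.
    # This is NOT PyThaiNLP's actual algorithm; it only illustrates the
    # concept of segmenting unspaced Thai text against a word list.

    def longest_match_tokenize(text, dictionary):
        """Greedily match the longest dictionary word at each position.

        Characters not covered by the dictionary are emitted as
        single-character tokens.
        """
        # Try longer words first so the longest match wins.
        words = sorted(dictionary, key=len, reverse=True)
        tokens = []
        i = 0
        while i < len(text):
            for w in words:
                if text.startswith(w, i):
                    tokens.append(w)
                    i += len(w)
                    break
            else:
                tokens.append(text[i])  # fall back to a single character
                i += 1
        return tokens

    # A tiny toy dictionary of Thai words (for illustration only):
    # "แมว" (cat), "กิน" (eat), "ปลา" (fish).
    custom_dict = {"แมว", "กิน", "ปลา"}
    print(longest_match_tokenize("แมวกินปลา", custom_dict))
    # → ['แมว', 'กิน', 'ปลา']

In PyThaiNLP itself, the analogous workflow is to build a custom dictionary trie with :func:`create_custom_dict_trie` and pass it to :func:`dict_word_tokenize`.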