pythainlp.chat

class pythainlp.chat.ChatBotModel[source]
__init__()[source]

Chatbot based on an AI text-generation model

reset_chat()[source]

Reset the chat by clearing the conversation history (see the end of the example below)

load_model(model_name: str = 'wangchanglm', return_dict: bool = True, load_in_8bit: bool = False, device: str = 'cuda', torch_dtype=torch.float16, offload_folder: str = './', low_cpu_mem_usage: bool = True)[source]

Load model

Parameters:
  • model_name (str) – Model name (currently, only wangchanglm is supported)

  • return_dict (bool) – return model outputs as a dict instead of a plain tuple

  • load_in_8bit (bool) – load the model weights in 8-bit precision

  • device (str) – device to run the model on (cpu, cuda, or other)

  • torch_dtype (torch_dtype) – torch data type for the model weights (e.g. torch.float16)

  • offload_folder (str) – folder used for offloading model weights that do not fit in memory

  • low_cpu_mem_usage (bool) – reduce CPU memory usage while loading the model
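
As a sketch, loading on a GPU with 8-bit weights could look like the code below. The keyword arguments mirror the parameters above; whether 8-bit loading works in practice depends on your installed transformers/bitsandbytes versions, which is an assumption here.

from pythainlp.chat import ChatBotModel
import torch

chatbot = ChatBotModel()
# Load WangChanGLM on the GPU, quantized to 8-bit to reduce memory use.
chatbot.load_model(
    model_name="wangchanglm",
    device="cuda",
    load_in_8bit=True,
    torch_dtype=torch.float16,
    offload_folder="./",
    low_cpu_mem_usage=True,
)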

chat(text: str) → str[source]

Ask the chatbot; each (question, answer) pair is appended to the chat history

Parameters:

text (str) – the text to send to the chatbot.

Returns:

the chatbot's answer.

Return type:

str

Example:

from pythainlp.chat import ChatBotModel
import torch

# Create the chatbot and load WangChanGLM on the CPU with bfloat16 weights.
chatbot = ChatBotModel()
chatbot.load_model(device="cpu", torch_dtype=torch.bfloat16)

print(chatbot.chat("สวัสดี"))
# output: ยินดีที่ได้รู้จัก

# Each (question, answer) pair is kept in the chat history.
print(chatbot.history)
# output: [('สวัสดี', 'ยินดีที่ได้รู้จัก')]
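
Continuing the example above, reset_chat() clears the stored history so a new conversation starts from scratch (the empty-list output shown here is an assumption based on the list-of-tuples history above):

# Clear the conversation history to start a fresh session.
chatbot.reset_chat()

print(chatbot.history)
# output: []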