
⚙️ AI Settings

LLM Model & Temperature

LiveChatAI supports multiple Large Language Models (LLMs) for generating conversational responses.

Additionally, you can configure the "temperature" setting, which controls the randomness of the AI's responses.

Here's a detailed overview of the LLM models and temperature settings available in LiveChatAI:

LLM Models on LiveChatAI

GPT-4

[Image: choosing gpt-4 for the GPT Model on LiveChatAI]
  • Availability: Optional (not available on the Basic plan)
  • Cost: Standard message credit rate.
  • Description: GPT-4 is a highly advanced model, offering more nuanced and contextually aware responses.

GPT-3.5

[Image: choosing gpt-3.5-turbo for the GPT Model on LiveChatAI]
  • Availability: Standard
  • Cost: Standard message credit rate.
  • Description: GPT-3.5 is a robust model capable of handling a wide range of conversational tasks.

Other Options

LiveChatAI also supports the following models:

Mistral AI

Mistral AI provides models that demonstrate the potential of smaller, efficient models without the prohibitive compute costs associated with larger foundation models such as GPT-4.

Llama AI

LLaMA models are designed to perform a wide range of natural language processing tasks, from translation and summarization to question answering and more. The family includes a variant fine-tuned for chat applications, offering capabilities comparable to GPT-4 while retaining the base model's 4K-token context length.

Choosing the Right Model:

Which model to select depends on your specific requirements and budget constraints.

You can choose the model that best fits your needs from the settings panel.

Temperature Setting

  • Default Value: 0.6
  • Customization: You can change the temperature to any value you prefer.
  • Function: The temperature setting controls the randomness of the AI's responses. A higher value (closer to 1) produces more diverse and creative responses, while a lower value (closer to 0) produces more deterministic and focused answers.
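To build intuition for what the setting above does, here is a minimal sketch of temperature-scaled sampling as it is commonly implemented in language models. The logit values are made up for illustration; the exact mechanics inside any given LLM provider may differ, but the principle is the same: logits are divided by the temperature before the softmax, so low temperatures sharpen the probability distribution and high temperatures flatten it.

```python
import math

def softmax_with_temperature(logits, temperature):
    """Divide logits by the temperature, then apply softmax.

    Lower temperature -> sharper distribution (more deterministic).
    Higher temperature -> flatter distribution (more varied output).
    """
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

# Hypothetical next-token logits for three candidate words.
logits = [2.0, 1.0, 0.5]

low = softmax_with_temperature(logits, 0.2)      # sharply peaked on the top token
default = softmax_with_temperature(logits, 0.6)  # LiveChatAI's default
high = softmax_with_temperature(logits, 1.0)     # flatter, more diverse

print("T=0.2:", [round(p, 3) for p in low])
print("T=0.6:", [round(p, 3) for p in default])
print("T=1.0:", [round(p, 3) for p in high])
```

Running this shows the top candidate's probability shrinking as temperature rises, which is why low temperatures are a good fit for factual support answers and higher ones for more creative replies.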

How to Adjust Temperature:

1. Navigate to Settings.

From the dashboard, go to the AI settings section.

[Image: the settings section in LiveChatAI]

2. Select Temperature.

Locate the temperature option.

[Image: editing the Temperature value on LiveChatAI]

3. Adjust Value.

Modify the existing temperature value to your desired level.

4. Save Changes.

Click 'Save' to apply the new temperature setting.

[Image: the GPT Model, Temperature, and Save button highlighted on the LiveChatAI dashboard]

Changing the LLM model or temperature is an advanced setting. Be mindful of the associated costs and the impact on response behavior. If you are unsure about making changes, consult our support team or refer to the additional resources.

The LLM Model & Temperature settings in LiveChatAI provide flexibility in tailoring the AI's response behavior to fit your unique requirements.

Understanding these options and carefully selecting the right model and temperature can enhance your customer interaction experience.

For any further assistance or inquiries, feel free to reach out to our support team.