Create Chat Using SDK
The Tror Gen AI SDK empowers you to interact with our underlying Large Language Models (LLMs) and retrieve informative responses to your questions. This section guides you through the process of formulating and sending questions to specific LLM models within the application.
Formulating Your Question
Craft a clear and concise question that you want the LLM to answer. The more specific your question, the more focused and relevant the response will be.
Choosing the Right LLM Model (Optional)
By default, the SDK utilizes a general-purpose LLM model to answer your questions. However, you can optionally specify a particular LLM model if your question aligns better with its area of expertise.
Here's how to specify an LLM model (if desired):
```python
import trorgenai

question = "What is the capital of France?"
model_name = "gpt-4"  # Optional: specify a model name

response = trorgenai.ask_question(question, model_name)
print(response)
```
In this example, the code specifies the "gpt-4" model to answer the question about the capital of France.
- Refer to the SDK documentation for the available LLM models and their areas of expertise before selecting one.
- If you don't specify a model name, the default general-purpose model is used.
Sending the Question and Receiving the Response
The ask_question function from the SDK handles sending your question to the chosen LLM model and retrieving the response.
```python
response = trorgenai.ask_question(question)
```
The ask_question function takes two arguments:
- question: The text of the question you want the LLM to answer.
- model_name (optional): The name of the specific LLM model to use (refer to available models in the documentation).
The function returns a string containing the LLM's generated response to your question.
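Because the response arrives as a plain string, you may want to validate both the question you send and the value you get back. Below is a minimal sketch of such a wrapper; `safe_ask` and `ask_question_stub` are illustrative names, not part of the SDK, and the stub stands in for trorgenai.ask_question so the snippet runs even without the SDK installed:

```python
def ask_question_stub(question, model_name=None):
    # Stand-in for trorgenai.ask_question so this sketch runs without the SDK.
    return f"Stub answer to: {question}"

def safe_ask(question, model_name=None, ask_fn=ask_question_stub):
    # Reject empty or non-string questions before calling the LLM.
    if not isinstance(question, str) or not question.strip():
        raise ValueError("question must be a non-empty string")
    # Pass model_name only when one was chosen, mirroring the optional argument.
    if model_name:
        response = ask_fn(question, model_name)
    else:
        response = ask_fn(question)
    # The SDK is documented to return a string; fail loudly otherwise.
    if not isinstance(response, str):
        raise TypeError("expected a string response from the LLM")
    return response

print(safe_ask("What is the capital of France?"))
```

To use the real SDK, pass `ask_fn=trorgenai.ask_question` (or replace the default) once the package is available.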
Putting it All Together
Here's a complete example demonstrating how to ask a question and receive a response:
```python
import trorgenai

question = "Who wrote the novel Moby Dick?"
response = trorgenai.ask_question(question)
print(f"The LLM responded: {response}")
```
This code snippet sends the question "Who wrote the novel Moby Dick?" and prints the LLM's response.
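The same call pattern extends naturally to several questions at once. The sketch below loops over a list of questions; it again uses a hypothetical stub in place of trorgenai.ask_question so it runs standalone:

```python
def ask_question_stub(question, model_name=None):
    # Stand-in for trorgenai.ask_question so this sketch runs without the SDK.
    return f"Stub answer to: {question}"

questions = [
    "Who wrote the novel Moby Dick?",
    "What is the capital of France?",
]

# Collect each question's response; swap the stub for trorgenai.ask_question
# once the SDK is installed.
answers = {q: ask_question_stub(q) for q in questions}

for q, a in answers.items():
    print(f"Q: {q}\nA: {a}")
```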
Next Steps:
Explore the SDK documentation for further functionality, such as fine-tuning LLM models with your own data, creating custom personas, and managing your knowledge base to enhance the interaction experience.