Integration

The Tror Gen AI SDK allows you to create custom integrations that combine the power of our LLM models, personas, and knowledge base with your external tools and services. This section guides you through the process of building and deploying integrations.

Defining Your Integration

An integration consists of the following elements:

  • Name: A descriptive name to identify your integration within the application.

  • Model: Select the specific LLM model you want to leverage for the integration's functionality.

  • Persona (Optional): Assign a persona to the integration, influencing the communication style and content delivered.

  • Knowledge Base (Optional): Connect the integration to a relevant knowledge base to enhance responses with specific information.

Building the Integration

Once you've defined the elements above, use the create_integration function provided by the SDK:

import trorgenai

integration_name = "My Customer Support Integration"
model_name = "customer_service_assistant"  # Required
persona_name = "Support Specialist"  # Optional

integration = trorgenai.create_integration(integration_name, model_name, persona_name=persona_name)

This snippet creates an integration named "My Customer Support Integration" backed by the "customer_service_assistant" model and, because persona_name is supplied, assigns the "Support Specialist" persona.
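Connecting a knowledge base follows the same pattern. The sketch below assumes create_integration accepts a knowledge_base_name keyword argument, mirroring persona_name; that parameter name is an assumption, so verify it against your SDK version's signature:

import trorgenai

integration = trorgenai.create_integration(
    "My Customer Support Integration",
    "customer_service_assistant",
    persona_name="Support Specialist",
    knowledge_base_name="Order FAQ",  # Hypothetical parameter name
)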

Interaction Methods

The SDK offers two primary methods for interacting with your created integration:

  • Iframe: Generate an HTML iframe code snippet that can be embedded directly into your web application or platform. This allows users to interact with the LLM functionality within your existing interface.

  • API: Access the integration through a dedicated API endpoint. The SDK will provide the endpoint URL upon integration creation. This approach enables programmatic interaction with the LLM for more advanced use cases.
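The two methods above can be sketched as follows. The iframe src, endpoint URL, and request fields below are hypothetical placeholders standing in for the values the SDK returns at integration creation; no network call is made:

```python
import json
import urllib.request

# Hypothetical values: in practice the SDK provides these when the
# integration is created.
iframe_src = "https://app.trorgenai.example/embed/my-customer-support"
endpoint_url = "https://api.trorgenai.example/v1/integrations/my-customer-support"

# Iframe method: embed the hosted chat UI directly in your page.
iframe_snippet = (
    f'<iframe src="{iframe_src}" width="400" height="600" '
    f'title="Customer support chat"></iframe>'
)

# API method: build a JSON POST request against the integration's endpoint.
payload = json.dumps({"message": "Where is my order?"}).encode("utf-8")
request = urllib.request.Request(
    endpoint_url,
    data=payload,
    headers={"Content-Type": "application/json"},
    method="POST",
)

# The request is ready to send with urllib.request.urlopen(request).
print(iframe_snippet)
print(request.get_method())
```

The iframe suits drop-in chat widgets; the API suits programmatic flows where you post user messages and render the model's replies yourself.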