LLM Manager User Guide

The LLM Manager lets you connect and configure multiple AI providers (OpenAI, Anthropic, and Google) to power AI features throughout SimpleTranslate. Use it to set up API keys, select models, and control which engines are active for your organization.

What you can do:

  • Connect multiple AI providers with secure API key configuration
  • Switch between different models for various use cases
  • Control costs by activating/deactivating engines
  • Set default engines for consistent user experience

Getting Started

1. Access LLM Manager

Navigate to Setup → LLM Manager from the main menu. You'll need the MANAGE_SETUP permission to access this feature.

2. Connect Your First LLM

Click the Connect new LLM button to add your first AI engine (a configuration sketch follows this list):

  • Name: Give your engine a descriptive name (e.g., “Production GPT-4”)
  • AI Type: Select your provider (OpenAI, Anthropic, or Google)
  • API Key: Enter your provider's API key securely
  • Model: Choose from available models for your selected provider
  • Status: Set to Active to enable the engine
  • Default: Optionally set as the default engine
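
The fields above roughly correspond to a configuration record like the sketch below. This is only an illustration of how the values fit together; the field names are hypothetical, not SimpleTranslate's actual data model.

    # Hypothetical sketch of an engine configuration record.
    # Field names are illustrative only; SimpleTranslate's data model may differ.
    engine_config = {
        "name": "Production GPT-4",            # descriptive label shown in the engine list
        "ai_type": "OpenAI",                   # provider: OpenAI, Anthropic, or Google
        "api_key": "<your-provider-api-key>",  # stored encrypted, never shown in logs or exports
        "model": "gpt-4o",                     # one of the models offered by the selected provider
        "active": True,                        # only active engines can be used by AI features
        "is_default": False,                   # at most one engine is the default
    }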

🔐 Security Note

API keys are encrypted before storage and never exposed in logs or exports. Always use API keys with appropriate permissions and rate limits.

Configuring AI Engines

Engine List View

The main interface displays all your configured engines in a table with:

  • Engine name and AI type
  • Selected model
  • Status (Active/Inactive)
  • Default indicator
  • Creation and modification timestamps

Editing Engines

Click the menu icon (⋮) on any engine row to:

  • Edit: Update name, API key, or model
  • Delete: Remove the engine (except system defaults)
  • Set as Default: Make this the primary engine

Best Practices

1. API Key Configuration

  • Use separate API keys for development and production
  • Rotate keys regularly (quarterly recommended)
  • Set appropriate spending limits with your provider

2. Model Selection

  • Use GPT-4 for complex analysis and reasoning tasks
  • Use Claude for creative content and long-form generation
  • Use GPT-4o mini for simple, high-volume tasks
  • Consider cost vs. performance for each use case
  • We continuously test new models and add them as they prove reliable

3. Multi-Provider Strategy

  • Configure multiple providers for redundancy (a failover sketch follows this list)
  • Use different providers for different features
  • Monitor provider status pages for outages
  • Test failover procedures regularly
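
To illustrate the redundancy point above, the sketch below shows one way a caller could fail over between configured engines. This is a conceptual example, not SimpleTranslate's internal routing logic; the function names and engine labels are hypothetical.

    # Conceptual failover sketch -- not SimpleTranslate's internal routing logic.
    class EngineError(Exception):
        """Raised when a provider call fails (timeout, rate limit, outage)."""

    def call_engine(engine_name: str, prompt: str) -> str:
        # Placeholder: in practice this would call the provider's API
        # using the named engine's stored configuration.
        raise NotImplementedError

    def generate_with_failover(prompt: str, engines: list[str]) -> str:
        """Try each configured engine in order until one succeeds."""
        last_error = None
        for engine in engines:
            try:
                return call_engine(engine, prompt)
            except EngineError as exc:
                last_error = exc  # record the failure and try the next provider
        raise RuntimeError(f"All engines failed; last error: {last_error}")

    # Example: prefer the default engine, then fall back to a second provider.
    # generate_with_failover("Translate this text...", ["Production GPT-4", "Backup Claude"])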

Troubleshooting

API Key Issues

  • “Invalid API Key”: Verify the key is correct and still active with your provider (see the sketch after this list)
  • “Rate Limit Exceeded”: Check your provider dashboard for rate limits and current usage
  • “Insufficient Quota”: Add billing details or increase your quota with the provider
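
To rule out a problem on the SimpleTranslate side, you can check a key directly against the provider. The sketch below assumes an OpenAI key and calls OpenAI's public models endpoint: a 401 response means the key is invalid, while 429 usually indicates rate limiting or quota issues. Other providers offer similar endpoints.

    # Sanity-check an OpenAI API key outside SimpleTranslate (requires the `requests` package).
    import requests

    def check_openai_key(api_key: str) -> None:
        response = requests.get(
            "https://api.openai.com/v1/models",
            headers={"Authorization": f"Bearer {api_key}"},
            timeout=10,
        )
        if response.status_code == 200:
            print("Key is valid.")
        elif response.status_code == 401:
            print("Invalid API key: check for typos or a revoked key.")
        elif response.status_code == 429:
            print("Rate limit or quota issue: check your provider dashboard.")
        else:
            print(f"Unexpected response: {response.status_code} {response.text}")

    # check_openai_key("sk-...")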

Model Availability

  • Model not showing: Ensure your API key has access to the model (see the sketch after this list)
  • Model deprecated: Update the engine to a newer model version
  • Region restrictions: Some models are limited to certain geographic regions
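
To confirm which models a key can actually access, you can list them with the provider's SDK. The sketch below assumes an OpenAI key and the official openai Python package (v1 or later); other providers expose similar model-listing calls.

    # List the models an OpenAI key can access (requires the `openai` package, v1+).
    from openai import OpenAI

    client = OpenAI(api_key="sk-...")  # or rely on the OPENAI_API_KEY environment variable
    available = {model.id for model in client.models.list()}

    wanted = "gpt-4o"
    if wanted in available:
        print(f"{wanted} is available to this key.")
    else:
        print(f"{wanted} is not available; examples of accessible models: {sorted(available)[:10]}")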

💡 Pro Tip

Default system engines are available as fallbacks but have limited capabilities. Always configure your own engines for production use.

Integration with Other Features

Engines configured in LLM Manager are automatically available in:

  • Prompt Builder: Select engines when creating prompts
  • SimpleT Salesforce Components: Use configured engines in all Salesforce AI components

Remember to configure appropriate engines before using AI features. The system will use the default engine if no specific selection is made.