function get_available_models
Flask API endpoint that returns a JSON response containing the list of available LLM models and the default model configured in the application.
File: /tf/active/vicechatdev/docchat/blueprint.py
Lines: 331-336
Complexity: simple
Purpose
This endpoint acts as a configuration discovery mechanism: frontend clients call it to learn which language models are available for document chat. The UI can then populate model selection dropdowns dynamically and preselect the configured default, enabling model switching without hardcoding model names in the frontend.
Source Code
def get_available_models():
    """Get list of available LLM models"""
    return jsonify({
        'models': config.AVAILABLE_MODELS,
        'default': config.DEFAULT_MODEL
    })
Return Value
Returns a Flask JSON response object containing a dictionary with two keys: 'models' (a list of available LLM model identifiers from config.AVAILABLE_MODELS) and 'default' (a string representing the default model identifier from config.DEFAULT_MODEL). The HTTP status code is 200 on success.
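For illustration, with the hypothetical model names used in the Usage Example below, the serialized response body would look like:

{
  "models": ["gpt-4", "gpt-3.5-turbo", "claude-2"],
  "default": "gpt-3.5-turbo"
}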
Dependencies
flask, flask-login
Required Imports
from flask import Blueprint, jsonify
from flask_login import login_required
from config import config
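The last import implies a config module that exposes a module-level config object carrying both attributes. A minimal sketch of such a config.py, assuming class-based settings (the model identifiers are hypothetical and the real module likely carries other settings):

# config.py -- minimal sketch only
class Config:
    # Hypothetical model identifiers; replace with the models your deployment supports
    AVAILABLE_MODELS = ['gpt-4', 'gpt-3.5-turbo', 'claude-2']
    DEFAULT_MODEL = 'gpt-3.5-turbo'

# Imported by the blueprint as: from config import config
config = Config()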
Usage Example
# In your Flask application setup
from flask import Flask, Blueprint, jsonify
from flask_login import LoginManager, login_required

# Configuration module
class Config:
    AVAILABLE_MODELS = ['gpt-4', 'gpt-3.5-turbo', 'claude-2']
    DEFAULT_MODEL = 'gpt-3.5-turbo'

config = Config()

# Create Flask app and blueprint
app = Flask(__name__)
app.secret_key = 'your-secret-key'
docchat_bp = Blueprint('docchat', __name__)

# Setup Flask-Login
login_manager = LoginManager()
login_manager.init_app(app)

@docchat_bp.route('/api/models', methods=['GET'])
@login_required
def get_available_models():
    return jsonify({
        'models': config.AVAILABLE_MODELS,
        'default': config.DEFAULT_MODEL
    })

app.register_blueprint(docchat_bp)

# Client-side usage (JavaScript fetch example)
# fetch('/api/models')
#     .then(response => response.json())
#     .then(data => {
#         console.log('Available models:', data.models);
#         console.log('Default model:', data.default);
#     });
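The endpoint can also be exercised server-side with Flask's built-in test client. The snippet below is a sketch that reuses the app and blueprint from the example above; it relies on Flask-Login's LOGIN_DISABLED config flag (honored by recent versions) to bypass @login_required during testing, otherwise a user must be logged in first.

# Server-side sanity check using Flask's test client
app.config['LOGIN_DISABLED'] = True  # testing only: bypasses @login_required

with app.test_client() as client:
    resp = client.get('/api/models')
    assert resp.status_code == 200
    data = resp.get_json()
    assert data['default'] in data['models']  # default should be an advertised model
    print(data)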
Best Practices
- Ensure config.AVAILABLE_MODELS is populated before the application starts so the endpoint never returns empty or None values
- Validate that config.DEFAULT_MODEL exists in config.AVAILABLE_MODELS to maintain consistency
- This endpoint requires authentication via the @login_required decorator; ensure users are logged in before calling it
- Consider caching this response on the client side, since model configurations typically don't change during runtime
- Handle the potential AttributeError raised if the config module lacks AVAILABLE_MODELS or DEFAULT_MODEL attributes
- Consider adding error handling that returns an appropriate HTTP status code if the configuration is missing (see the sketch after this list)
- Document the expected format of model identifiers for consistency across the application
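A hedged sketch of what such validation and error handling could look like; this is not the project's actual implementation, only an illustration using the attribute names from the source above:

@docchat_bp.route('/api/models', methods=['GET'])
@login_required
def get_available_models():
    """Return available LLM models, guarding against missing configuration."""
    models = getattr(config, 'AVAILABLE_MODELS', None)
    default = getattr(config, 'DEFAULT_MODEL', None)

    # Missing or empty configuration -> respond with 500 and a message
    if not models or default is None:
        return jsonify({'error': 'LLM model configuration is missing'}), 500

    # Keep the advertised default consistent with the model list
    if default not in models:
        return jsonify({'error': 'DEFAULT_MODEL is not listed in AVAILABLE_MODELS'}), 500

    return jsonify({'models': models, 'default': default})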
Tags
Similar Components
Components with related functionality, ranked by AI-powered semantic similarity:
- function api_get_models (93.8% similar)
- function chat_with_text_section (57.5% similar)
- function api_send_chat_message_v1 (56.3% similar)
- class LLMClient_v1 (55.4% similar)
- function get_llm_instance (55.1% similar)