function test_config
A test function that validates the presence and correctness of all required configuration settings for a multi-model RAG (Retrieval-Augmented Generation) system.
File: /tf/active/vicechatdev/docchat/test_model_selection.py
Lines: 9-47
Complexity: simple
Purpose
This function validates the application's configuration module, ensuring that all required settings are present and correctly formed. It checks that exactly four LLM models are registered (GPT-4o, GPT-4o-mini, Claude Sonnet, and Gemini), validates each model's metadata structure ('name', 'provider', 'description'), verifies that API key attributes exist for OpenAI, Gemini, and Anthropic (presence only, not validity), confirms the expected ChromaDB host and port, and checks that the documents directory path contains 'qa_docs'. It is designed to run as part of a test suite to catch configuration errors early in development or deployment.
Source Code
def test_config():
    """Test configuration has all required settings"""
    print("=" * 60)
    print("TEST 1: Configuration")
    print("=" * 60)

    import config

    # Check AVAILABLE_MODELS
    assert hasattr(config, 'AVAILABLE_MODELS'), "Missing AVAILABLE_MODELS"
    assert len(config.AVAILABLE_MODELS) == 4, f"Expected 4 models, got {len(config.AVAILABLE_MODELS)}"

    expected_models = ['gpt-4o', 'gpt-4o-mini', 'claude-sonnet-4-5-20250929', 'gemini-2.0-flash-exp']
    for model in expected_models:
        assert model in config.AVAILABLE_MODELS, f"Missing model: {model}"
        assert 'name' in config.AVAILABLE_MODELS[model], f"Model {model} missing 'name'"
        assert 'provider' in config.AVAILABLE_MODELS[model], f"Model {model} missing 'provider'"
        assert 'description' in config.AVAILABLE_MODELS[model], f"Model {model} missing 'description'"

    # Check API keys
    assert hasattr(config, 'OPENAI_API_KEY'), "Missing OPENAI_API_KEY"
    assert hasattr(config, 'GEMINI_API_KEY'), "Missing GEMINI_API_KEY"
    assert hasattr(config, 'ANTHROPIC_API_KEY'), "Missing ANTHROPIC_API_KEY"

    # Check ChromaDB settings
    assert hasattr(config, 'CHROMA_HOST'), "Missing CHROMA_HOST"
    assert hasattr(config, 'CHROMA_PORT'), "Missing CHROMA_PORT"
    assert config.CHROMA_HOST == 'oneco_chroma', f"Expected CHROMA_HOST='oneco_chroma', got '{config.CHROMA_HOST}'"
    assert config.CHROMA_PORT == 8000, f"Expected CHROMA_PORT=8000, got {config.CHROMA_PORT}"

    # Check documents directory
    assert hasattr(config, 'DOCUMENTS_DIR'), "Missing DOCUMENTS_DIR"
    assert 'qa_docs' in str(config.DOCUMENTS_DIR), f"DOCUMENTS_DIR should contain 'qa_docs', got {config.DOCUMENTS_DIR}"

    print("✓ All configuration checks passed")
    print(f" - 4 models configured: {list(config.AVAILABLE_MODELS.keys())}")
    print(f" - ChromaDB: {config.CHROMA_HOST}:{config.CHROMA_PORT}")
    print(f" - Documents: {config.DOCUMENTS_DIR}")
    print()
Return Value
This function returns None. It performs assertions and prints status messages to stdout. If any assertion fails, it raises an AssertionError with a descriptive message; on success, it prints a summary of the validated configuration settings.
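Because failures surface as AssertionError, a standalone runner can translate them into a process exit code. A minimal sketch (this wrapper is illustrative and assumes the function is importable from test_model_selection):

import sys
from test_model_selection import test_config

try:
    test_config()
except AssertionError as exc:
    # Any failed check lands here with the descriptive message from its assert
    print(f"✗ Configuration check failed: {exc}")
    sys.exit(1)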
Dependencies
config
Required Imports
import config
Usage Example
# Ensure config.py exists with all required settings
# Example config.py structure:
# AVAILABLE_MODELS = {
# 'gpt-4o': {'name': 'GPT-4o', 'provider': 'openai', 'description': '...'},
# 'gpt-4o-mini': {'name': 'GPT-4o Mini', 'provider': 'openai', 'description': '...'},
# 'claude-sonnet-4-5-20250929': {'name': 'Claude Sonnet', 'provider': 'anthropic', 'description': '...'},
# 'gemini-2.0-flash-exp': {'name': 'Gemini 2.0', 'provider': 'google', 'description': '...'}
# }
# OPENAI_API_KEY = 'sk-...'
# GEMINI_API_KEY = 'AI...'
# ANTHROPIC_API_KEY = 'sk-ant-...'
# CHROMA_HOST = 'oneco_chroma'
# CHROMA_PORT = 8000
# DOCUMENTS_DIR = Path('/path/to/qa_docs')
from test_model_selection import test_config

# Run the test
test_config()
# Expected output on success:
# ============================================================
# TEST 1: Configuration
# ============================================================
# ✓ All configuration checks passed
# - 4 models configured: ['gpt-4o', 'gpt-4o-mini', 'claude-sonnet-4-5-20250929', 'gemini-2.0-flash-exp']
# - ChromaDB: oneco_chroma:8000
# - Documents: /path/to/qa_docs
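If a required setting is absent, the first failing assertion raises immediately with its descriptive message. For instance (illustrative output, assuming the gemini-2.0-flash-exp entry was removed from AVAILABLE_MODELS):

# Traceback (most recent call last):
#   ...
# AssertionError: Missing model: gemini-2.0-flash-exp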
Best Practices
- Run this test before starting the application to confirm the configuration is properly set up
- Include it in an automated test suite that runs in the CI/CD pipeline
- If any assertion fails, the function raises an AssertionError; handle this appropriately in your test runner
- The function prints detailed output to stdout; make sure your test environment captures it for debugging
- Update the expected_models list whenever you add or remove models from your configuration
- Consider re-running this test after any configuration change or environment update
- The function hardcodes expected values (such as CHROMA_HOST='oneco_chroma'); modify these if your deployment differs (see the sketch after this list for a less brittle variant of the structural checks)
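One way to soften the hardcoding noted above is to derive the structural checks from config itself instead of a fixed model list. A minimal sketch (a hypothetical companion test, not part of test_model_selection.py):

import config

REQUIRED_FIELDS = ('name', 'provider', 'description')

def test_model_metadata_structure():
    """Validate metadata for every registered model, whatever the model set is."""
    for model_id, meta in config.AVAILABLE_MODELS.items():
        for field in REQUIRED_FIELDS:
            assert field in meta, f"Model {model_id} missing '{field}'"

Note the trade-off: this variant no longer detects a missing or extra model, only malformed entries, so keep an explicit membership check like the one in test_config if the model set must be exact.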
Similar Components
Components with related functionality, ranked by AI-powered semantic similarity:
- function check_configuration (77.8% similar)
- function test_rag_engine (71.5% similar)
- function test_configuration_v1 (70.0% similar)
- function test_config_loading (69.5% similar)
- function test_configuration (69.3% similar)