🔍 Code Extractor

Search Components

Full-Text: Fast keyword matching | Semantic: AI-powered understanding of intent (finds similar concepts)
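
To illustrate the difference between the two modes, here is a minimal sketch assuming a ChromaDB-backed index (ChromaDB is used elsewhere in this codebase, per the verify_setup result below); the collection name, sample data, and helper functions are hypothetical and are not the Code Extractor's actual API.

    # Hypothetical sketch of the two search modes; collection name and sample
    # data are illustrative only, not the tool's real index.
    import chromadb

    client = chromadb.Client()
    collection = client.get_or_create_collection("code_components")

    # Index a couple of component summaries (hypothetical data).
    collection.add(
        ids=["get_llm_instance", "DocChatRAG"],
        documents=[
            "Factory function returning an LLM instance for OpenAI, Azure, or Anthropic.",
            "Main RAG engine with similarity search, extensive, and full-reading modes.",
        ],
    )

    def full_text_search(query: str) -> list[str]:
        """Full-text mode: fast keyword matching over the stored summaries."""
        docs = collection.get()  # returns ids and documents
        q = query.lower()
        return [i for i, d in zip(docs["ids"], docs["documents"]) if q in d.lower()]

    def semantic_search(query: str, n_results: int = 2) -> list[str]:
        """Semantic mode: embedding similarity, so a query like 'claude' can also
        surface components that mention Anthropic without using the word itself."""
        hits = collection.query(query_texts=[query], n_results=n_results)
        return hits["ids"][0]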

Search Results for "claude"

Found 16 matching component(s)

  • function check_configuration

    A comprehensive configuration verification function that checks and displays the status of all DocChat system settings, including API keys, models, ChromaDB connection, directories, and LLM initialization.

    File: /tf/active/vicechatdev/docchat/verify_setup.py

    configuration verification diagnostics setup validation
  • class DocChatRAG

    Main RAG engine with three operating modes: 1) Basic RAG (similarity search); 2) Extensive (full document retrieval with preprocessing); 3) Full Reading (process all documents).

    File: /tf/active/vicechatdev/docchat/rag_engine.py

    class docchatrag
  • function test_config

    A test function that validates the presence and correctness of all required configuration settings for a multi-model RAG (Retrieval-Augmented Generation) system.

    File: /tf/active/vicechatdev/docchat/test_model_selection.py

    testing validation configuration unit-test assertion
  • class LLMMessage

    A simple dataclass that represents a message for Large Language Model (LLM) interactions, containing only the message content as a string.

    File: /tf/active/vicechatdev/docchat/llm_factory.py

    dataclass message llm data-container language-model
  • function get_llm_instance

    Factory function that creates and returns an appropriate LLM (Large Language Model) instance based on the specified model name, automatically detecting the provider (OpenAI, Azure OpenAI, or Anthropic) and configuring it with the given parameters. A minimal sketch of this pattern follows the result list.

    File: /tf/active/vicechatdev/docchat/llm_factory.py

    llm factory-pattern openai azure anthropic
  • class LLMClient_v1

    Multi-LLM client that provides a unified interface for interacting with OpenAI GPT-4o, Azure OpenAI, Google Gemini, and Anthropic Claude models.

    File: /tf/active/vicechatdev/vice_ai/new_app.py

    llm openai azure gemini claude
  • function api_send_chat_message_v1

    Flask API endpoint that handles sending messages in a chat session, processes them through a RAG (Retrieval-Augmented Generation) engine with configurable LLM models, and returns AI-generated responses with references.

    File: /tf/active/vicechatdev/vice_ai/new_app.py

    chat api rag llm conversational-ai
  • class Config

    Configuration class that manages application-wide settings, directory structures, API keys, and operational parameters for a statistical analysis application.

    File: /tf/active/vicechatdev/vice_ai/smartstat_config.py

    configuration settings flask api-keys directory-management
  • class StatisticalAgent

    LLM-powered statistical analysis agent

    File: /tf/active/vicechatdev/vice_ai/statistical_agent.py

    class statisticalagent
  • class Config_v1

    Configuration class that centralizes all application settings including Flask configuration, directory paths, API keys, LLM model settings, and statistical analysis parameters.

    File: /tf/active/vicechatdev/full_smartstat/config.py

    configuration settings flask llm openai
  • class TwoPassSqlWorkflow

    Two-pass SQL generation workflow with iteration and error correction

    File: /tf/active/vicechatdev/full_smartstat/two_pass_sql_workflow.py

    class twopasssqlworkflow
  • class SQLQueryGenerator

    Generates SQL queries based on user requests and database schema

    File: /tf/active/vicechatdev/full_smartstat/sql_query_generator.py

    class sqlquerygenerator
  • class StatisticalAgent_v1

    LLM-powered statistical analysis agent

    File: /tf/active/vicechatdev/full_smartstat/statistical_agent.py

    class statisticalagent
  • class StatisticalAgent_v2

    LLM-powered statistical analysis agent

    File: /tf/active/vicechatdev/smartstat/statistical_agent.py

    class statisticalagent
  • class TestLLMClient

    Unit test class for the LLMClient class, providing comprehensive test coverage of initialization, text generation, structured data extraction, and error handling across multiple LLM providers (OpenAI, Anthropic, Azure, local).

    File: /tf/active/vicechatdev/invoice_extraction/tests/test_utils.py

    testing unit-test llm openai anthropic
  • class LLMClient

    A singleton client class for interacting with multiple LLM providers (OpenAI, Anthropic, Azure OpenAI, and local models) through a unified interface for text generation and structured data extraction.

    File: /tf/active/vicechatdev/invoice_extraction/utils/llm_client.py

    llm openai anthropic azure gpt
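
The get_llm_instance and LLMClient entries above describe the same underlying pattern: infer the provider from the model name, then return a configured client behind one interface. The following is a minimal sketch of that pattern, not the actual code in llm_factory.py or llm_client.py; the environment variable names and model-name prefixes are assumptions made for illustration.

    # Sketch of a provider-detecting LLM factory; env var names and model-name
    # prefixes are assumptions.
    import os

    def detect_provider(model_name: str) -> str:
        """Infer the provider from the model name."""
        name = model_name.lower()
        if name.startswith("claude"):
            return "anthropic"
        if name.startswith("gpt"):
            # Prefer Azure when an Azure endpoint is configured, else plain OpenAI.
            return "azure" if os.getenv("AZURE_OPENAI_ENDPOINT") else "openai"
        raise ValueError(f"Unknown model: {model_name}")

    def get_llm_instance(model_name: str):
        """Factory: return a configured client for whichever provider the model maps to."""
        provider = detect_provider(model_name)
        if provider == "anthropic":
            import anthropic
            return anthropic.Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
        if provider == "azure":
            from openai import AzureOpenAI
            return AzureOpenAI(
                api_key=os.environ["AZURE_OPENAI_API_KEY"],
                azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
                api_version="2024-02-01",
            )
        from openai import OpenAI
        return OpenAI(api_key=os.environ["OPENAI_API_KEY"])

Usage would look like get_llm_instance("claude-3-5-sonnet-20241022") returning an Anthropic client; the model name is only an example. Centralizing provider detection in one factory keeps the rest of the code (chat endpoints, statistical agents) free to treat the model name as plain configuration.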

Search Examples