🔍 Code Extractor

Search Components

Full-Text: fast, exact keyword matching | Semantic: AI-powered understanding of query intent (finds conceptually similar components even without exact keyword matches)
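
One common way to combine the two modes is to rank each extracted component by a blend of verbatim keyword overlap (full-text) and embedding cosine similarity (semantic). The sketch below illustrates that idea; the Component structure, the all-MiniLM-L6-v2 embedding model, and the 0.4/0.6 weights are illustrative assumptions, not the extractor's actual implementation.

```python
# Hypothetical sketch: hybrid full-text + semantic ranking over extracted components.
# Component, search_components, and the embedding model are assumptions.
from dataclasses import dataclass

from sentence_transformers import SentenceTransformer, util


@dataclass
class Component:
    name: str          # e.g. "get_llm_instance"
    description: str   # one-line summary stored by the extractor
    tags: str          # space-separated keywords


_model = SentenceTransformer("all-MiniLM-L6-v2")  # assumed embedding model


def search_components(query: str, components: list[Component], top_k: int = 10):
    """Rank components by a blend of keyword overlap and embedding similarity."""
    q_emb = _model.encode(query, convert_to_tensor=True)
    terms = query.lower().split()
    results = []
    for comp in components:
        text = f"{comp.name} {comp.description} {comp.tags}"
        # Full-text score: fraction of query terms that appear verbatim.
        keyword_score = sum(t in text.lower() for t in terms) / max(len(terms), 1)
        # Semantic score: cosine similarity between query and component text.
        c_emb = _model.encode(text, convert_to_tensor=True)
        semantic_score = util.cos_sim(q_emb, c_emb).item()
        results.append((0.4 * keyword_score + 0.6 * semantic_score, comp))
    results.sort(key=lambda pair: pair[0], reverse=True)
    return results[:top_k]
```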

Search Results for "gemini"

Found 17 matching component(s)

  • class MeetingMinutesGenerator_v1

    A class that generates professional meeting minutes from meeting transcripts using either OpenAI's GPT-4o or Google's Gemini AI models.

    File: /tf/active/vicechatdev/advanced_meeting_minutes_generator.py

    meeting-minutes transcript-processing llm gpt-4o gemini
  • function main_v15

    Command-line interface function that orchestrates the generation of meeting minutes from a transcript file using either GPT-4o or Gemini LLM models.

    File: /tf/active/vicechatdev/advanced_meeting_minutes_generator.py

    cli command-line meeting-minutes transcript-processing llm
  • class EnhancedMeetingMinutesGenerator

    A class that generates enhanced meeting minutes from meeting transcripts and PowerPoint presentations using GPT-4o, Azure GPT-4o, or Gemini models.

    File: /tf/active/vicechatdev/leexi/enhanced_meeting_minutes_generator.py

    class enhancedmeetingminutesgenerator
  • function main_v2

    Command-line interface function that orchestrates the generation of enhanced meeting minutes from transcript files and PowerPoint presentations using various LLM models (GPT-4o, Azure GPT-4o, or Gemini).

    File: /tf/active/vicechatdev/leexi/enhanced_meeting_minutes_generator.py

    cli command-line meeting-minutes llm gpt-4
  • function check_configuration

    A comprehensive configuration verification function that checks and displays the status of all DocChat system settings, including API keys, models, ChromaDB connection, directories, and LLM initialization.

    File: /tf/active/vicechatdev/docchat/verify_setup.py

    configuration verification diagnostics setup validation
  • function test_config

    A test function that validates the presence and correctness of all required configuration settings for a multi-model RAG (Retrieval-Augmented Generation) system.

    File: /tf/active/vicechatdev/docchat/test_model_selection.py

    testing validation configuration unit-test assertion
  • function test_rag_engine

    A test function that validates the RAG engine's ability to correctly instantiate different LLM models (OpenAI, Anthropic, Gemini) based on configuration settings.

    File: /tf/active/vicechatdev/docchat/test_model_selection.py

    testing rag llm model-switching validation
  • function get_llm_instance

    Factory function that creates and returns an appropriate LLM (Large Language Model) instance based on the specified model name, automatically detecting the provider (OpenAI, Azure OpenAI, or Anthropic) and configuring it with the given parameters (a sketch of this pattern appears after this list).

    File: /tf/active/vicechatdev/docchat/llm_factory.py

    llm factory-pattern openai azure anthropic
  • class LLMClient

    Multi-LLM client that provides a unified interface for interacting with OpenAI GPT-4o, Azure OpenAI, Google Gemini, and Anthropic Claude models.

    File: /tf/active/vicechatdev/vice_ai/new_app.py

    llm openai azure gemini claude
  • class SmartStatConfig

    Configuration class for SmartStat service that manages directory paths and API keys for various LLM providers integrated into Vice AI.

    File: /tf/active/vicechatdev/vice_ai/new_app.py

    configuration setup initialization directory-management api-keys
  • function chat_with_text_section

    Flask API endpoint that enables AI-powered chat conversations about a specific text section, with support for multiple LLM models and document context.

    File: /tf/active/vicechatdev/vice_ai/new_app.py

    flask api-endpoint chat ai-assistant llm
  • function api_send_chat_message_v1

    Flask API endpoint that handles sending messages in a chat session, processes them through a RAG (Retrieval-Augmented Generation) engine with configurable LLM models, and returns AI-generated responses with references (a sketch of this endpoint shape appears after this list).

    File: /tf/active/vicechatdev/vice_ai/new_app.py

    chat api rag llm conversational-ai
  • class Config

    Configuration class that manages application-wide settings, directory structures, API keys, and operational parameters for a statistical analysis application.

    File: /tf/active/vicechatdev/vice_ai/smartstat_config.py

    configuration settings flask api-keys directory-management
  • class StatisticalAgent

    LLM-powered statistical analysis agent

    File: /tf/active/vicechatdev/vice_ai/statistical_agent.py

    class statisticalagent
  • class Config_v1

    Configuration class that centralizes all application settings including Flask configuration, directory paths, API keys, LLM model settings, and statistical analysis parameters.

    File: /tf/active/vicechatdev/full_smartstat/config.py

    configuration settings flask llm openai
  • class StatisticalAgent_v1

    LLM-powered statistical analysis agent

    File: /tf/active/vicechatdev/full_smartstat/statistical_agent.py

    class statisticalagent
  • class StatisticalAgent_v2

    LLM-powered statistical analysis agent

    File: /tf/active/vicechatdev/smartstat/statistical_agent.py

    class statisticalagent
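
Several of the results above (get_llm_instance, LLMClient) describe the same factory pattern: detect the provider from the model name and hand back a configured client. A minimal sketch of that pattern, using the official openai, anthropic, and google-generativeai SDKs; the model-name prefixes, environment variables, and API version are assumptions rather than the repository's actual logic.

```python
# Hypothetical sketch of a provider-detecting LLM factory in the style of
# get_llm_instance / LLMClient above; prefixes and defaults are assumptions.
import os

import google.generativeai as genai
from anthropic import Anthropic
from openai import AzureOpenAI, OpenAI


def get_llm_client(model_name: str):
    """Return a configured SDK client based on the model name prefix."""
    if model_name.startswith("gpt-") and os.getenv("AZURE_OPENAI_ENDPOINT"):
        # Azure OpenAI: same model family, different endpoint and auth scheme.
        return AzureOpenAI(
            azure_endpoint=os.environ["AZURE_OPENAI_ENDPOINT"],
            api_key=os.environ["AZURE_OPENAI_API_KEY"],
            api_version="2024-02-01",
        )
    if model_name.startswith("gpt-"):
        return OpenAI(api_key=os.environ["OPENAI_API_KEY"])
    if model_name.startswith("claude-"):
        return Anthropic(api_key=os.environ["ANTHROPIC_API_KEY"])
    if model_name.startswith("gemini-"):
        genai.configure(api_key=os.environ["GOOGLE_API_KEY"])
        return genai.GenerativeModel(model_name)
    raise ValueError(f"Unknown model family: {model_name}")
```

Callers are left to handle each SDK's own chat interface; a unified wrapper like the LLMClient entry above would add that normalization layer on top.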
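
api_send_chat_message_v1 and chat_with_text_section are described as Flask endpoints that route a chat message through a RAG engine with a configurable LLM model. A minimal sketch of that endpoint shape; the route, payload fields, and rag_engine interface are assumptions, with a stub engine standing in for the repository's implementation.

```python
# Hypothetical sketch of a RAG-backed chat endpoint in the style of
# api_send_chat_message_v1 above; route, fields, and engine interface are assumptions.
from flask import Flask, jsonify, request

app = Flask(__name__)


class _StubRagEngine:
    """Stand-in for the repository's RAG engine (interface assumed)."""

    def query(self, session_id: str, question: str, model: str):
        # A real engine would retrieve relevant chunks and call the chosen LLM.
        return f"[{model}] answer to: {question}", []


rag_engine = _StubRagEngine()


@app.post("/api/chat/<session_id>/messages")
def send_chat_message(session_id: str):
    payload = request.get_json(force=True)
    message = payload.get("message", "")
    model = payload.get("model", "gpt-4o")  # configurable LLM model

    if not message:
        return jsonify({"error": "message is required"}), 400

    answer, references = rag_engine.query(
        session_id=session_id, question=message, model=model
    )
    return jsonify({"answer": answer, "references": references})
```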

Search Examples