🔍 Code Extractor

class LLMMessage

Maturity: 26

A simple dataclass that represents a message for Large Language Model (LLM) interactions, containing only the message content as a string.

File:
/tf/active/vicechatdev/docchat/llm_factory.py
Lines:
18 - 19
Complexity:
simple

Purpose

LLMMessage serves as a lightweight data container for storing message content in LLM-based applications. It provides a structured way to represent messages that will be sent to or received from language models like OpenAI's GPT, Azure OpenAI, or Anthropic's Claude. The dataclass decorator automatically generates __init__, __repr__, __eq__, and other special methods, making it easy to create, compare, and display message instances.
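Because LLMMessage holds only the message text, callers typically adapt it to whatever shape a provider API expects. The helper below is an illustrative sketch, not part of the source file; it assumes the common role/content dict convention used by chat-completion APIs, with the role supplied externally.

```python
from dataclasses import dataclass

@dataclass
class LLMMessage:
    content: str

def to_chat_payload(messages: list[LLMMessage], role: str = "user") -> list[dict]:
    """Convert LLMMessage instances into role/content dicts.

    The role handling here is a hypothetical convention layered on top of
    the class, which itself stores only the content string.
    """
    return [{"role": role, "content": m.content} for m in messages]

payload = to_chat_payload([LLMMessage(content="Summarize this document.")])
# payload == [{"role": "user", "content": "Summarize this document."}]
```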

Source Code

@dataclass
class LLMMessage:
    content: str

Parameters

Name Type Default Kind
content str - positional-or-keyword

Parameter Details

content: A string containing the actual message text. This is the primary data field that holds the content of the message to be processed by or received from an LLM. No constraints on length or format are enforced at the class level.

Return Value

Instantiating LLMMessage returns an instance of the class with the content attribute set to the provided string value. The dataclass decorator ensures the instance has standard comparison and representation methods.

Class Interface

Methods

__init__(self, content: str) -> None

Purpose: Initialize a new LLMMessage instance with the provided content. This method is automatically generated by the dataclass decorator.

Parameters:

  • content: A string containing the message text to be stored in this instance

Returns: None - initializes the instance

__repr__(self) -> str

Purpose: Return a string representation of the LLMMessage instance. Automatically generated by dataclass.

Returns: A string in the format LLMMessage(content='<content_value>'), with the content value shown using its repr

__eq__(self, other: object) -> bool

Purpose: Compare this LLMMessage instance with another object for equality. Automatically generated by dataclass.

Parameters:

  • other: Another object to compare with this instance

Returns: True if other is an LLMMessage with the same content value, False otherwise

Attributes

Name Type Description Scope
content str The message content as a string. This is the primary data field that stores the actual text of the message for LLM processing. instance

Dependencies

  • dataclasses
  • typing
  • logging
  • config
  • openai
  • langchain_anthropic

Required Imports

from dataclasses import dataclass

Conditional/Optional Imports

These imports appear in the source file but are not directly used by the LLMMessage class itself:

  • from typing import Optional
  • import logging
  • import config
  • from openai import OpenAI
  • from openai import AzureOpenAI
  • from langchain_anthropic import ChatAnthropic

Usage Example

from dataclasses import dataclass

@dataclass
class LLMMessage:
    content: str

# Create a message instance
message = LLMMessage(content="Hello, how can I help you today?")

# Access the content
print(message.content)  # Output: Hello, how can I help you today?

# Create another message
response = LLMMessage(content="I need help with Python programming.")

# Compare messages
print(message == response)  # Output: False

# String representation
print(message)  # Output: LLMMessage(content='Hello, how can I help you today?')

Best Practices

  • LLMMessage is a simple data container with no methods or complex logic, making it straightforward to use.
  • Since it's a dataclass, instances are mutable by default. Consider using frozen=True in the decorator if immutability is desired: @dataclass(frozen=True).
  • The class has no validation logic, so any string (including empty strings) can be passed as content.
  • This class is typically used as part of a larger conversation or message history system for LLM interactions.
  • Consider extending this class with additional fields like role (user/assistant/system), timestamp, or metadata if needed for more complex use cases.
  • The dataclass decorator automatically provides __init__, __repr__, __eq__, and __hash__ (if frozen) methods.
  • No special cleanup or resource management is needed; instances can be created and discarded freely.
  • For type safety, always pass a string to the content parameter during instantiation.
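The immutability and extension points noted above can be sketched as follows. RichLLMMessage is a hypothetical extension, not part of the source file; the role default and timestamp field are illustrative choices.

```python
from dataclasses import dataclass, field, FrozenInstanceError
from datetime import datetime, timezone

@dataclass(frozen=True)
class RichLLMMessage:
    """Hypothetical extension of LLMMessage: frozen for immutability
    (which also makes instances hashable), with extra conversation fields."""
    content: str
    role: str = "user"
    timestamp: datetime = field(default_factory=lambda: datetime.now(timezone.utc))

msg = RichLLMMessage(content="Hello")
try:
    msg.content = "changed"      # frozen instances reject attribute assignment
except FrozenInstanceError:
    print("immutable")           # prints "immutable"
```

Because frozen=True also generates __hash__, such messages can be used as dictionary keys or stored in sets, which the plain mutable LLMMessage cannot.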

Similar Components

AI-powered semantic similarity - components with related functionality:

  • class ChatMessage 67.3% similar

    A dataclass representing an individual chat message in a conversation, with support for serialization to and from dictionaries.

    From: /tf/active/vicechatdev/vice_ai/models.py
  • class LLMClient_v1 55.4% similar

    Client for interacting with LLM providers (OpenAI, Anthropic, Azure, etc.)

    From: /tf/active/vicechatdev/contract_validity_analyzer/utils/llm_client.py
  • class AzureOpenAIChatLLM 55.3% similar

    Adapter class for interacting with Azure OpenAI's Chat Completions API, providing a simplified interface for generating chat responses using Azure-hosted OpenAI models.

    From: /tf/active/vicechatdev/docchat/llm_factory.py
  • class LLMClient 55.3% similar

    Multi-LLM client that provides a unified interface for interacting with OpenAI GPT-4o, Azure OpenAI, Google Gemini, and Anthropic Claude models.

    From: /tf/active/vicechatdev/vice_ai/new_app.py
  • function get_llm_instance 55.1% similar

    Factory function that creates and returns an appropriate LLM (Large Language Model) instance based on the specified model name, automatically detecting the provider (OpenAI, Azure OpenAI, or Anthropic) and configuring it with the given parameters.

    From: /tf/active/vicechatdev/docchat/llm_factory.py