class LLMMessage
A simple dataclass that represents a message for Large Language Model (LLM) interactions, containing only the message content as a string.
File: /tf/active/vicechatdev/docchat/llm_factory.py
Lines: 18-19
Complexity: simple
Purpose
LLMMessage serves as a lightweight data container for storing message content in LLM-based applications. It provides a structured way to represent messages that will be sent to or received from language models like OpenAI's GPT, Azure OpenAI, or Anthropic's Claude. The dataclass decorator automatically generates __init__, __repr__, __eq__, and other special methods, making it easy to create, compare, and display message instances.
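As background for how such a container is typically consumed, here is a minimal sketch of assembling LLMMessage instances into a single prompt string. The build_prompt helper is hypothetical and not part of llm_factory.py; it only illustrates the kind of pipeline this dataclass fits into.
from dataclasses import dataclass

@dataclass
class LLMMessage:
    content: str

def build_prompt(messages: list) -> str:
    # Hypothetical helper: joins message contents into one prompt string.
    return "\n".join(m.content for m in messages)

history = [
    LLMMessage(content="You are a helpful assistant."),
    LLMMessage(content="Summarize the attached document."),
]
print(build_prompt(history))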
Source Code
@dataclass
class LLMMessage:
    content: str
Parameters
| Name | Type | Default | Kind |
|---|---|---|---|
| content | str | required | positional or keyword |
Parameter Details
content: A string containing the actual message text. This is the primary data field that holds the content of the message to be processed by or received from an LLM. No constraints on length or format are enforced at the class level.
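Because no validation is performed, even empty strings are accepted. If constraints are wanted, a __post_init__ hook is one option; the ValidatedLLMMessage class below is a hypothetical sketch, not behavior of the actual class.
from dataclasses import dataclass

@dataclass
class ValidatedLLMMessage:
    content: str

    def __post_init__(self) -> None:
        # Hypothetical check: the real LLMMessage enforces no constraints.
        if not isinstance(self.content, str) or not self.content.strip():
            raise ValueError("content must be a non-empty string")

ValidatedLLMMessage(content="Hello")   # OK
# ValidatedLLMMessage(content="")      # would raise ValueError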
Return Value
Instantiating LLMMessage returns an instance of the class with the content attribute set to the provided string value. The dataclass decorator ensures the instance has standard comparison and representation methods.
Class Interface
Methods
__init__(self, content: str) -> None
Purpose: Initialize a new LLMMessage instance with the provided content. This method is automatically generated by the dataclass decorator.
Parameters:
content: A string containing the message text to be stored in this instance
Returns: None - initializes the instance
__repr__(self) -> str
Purpose: Return a string representation of the LLMMessage instance. Automatically generated by dataclass.
Returns: A string of the form LLMMessage(content='<content value>')
__eq__(self, other: object) -> bool
Purpose: Compare this LLMMessage instance with another object for equality. Automatically generated by dataclass.
Parameters:
other: Another object to compare with this instance
Returns: True if other is an LLMMessage with the same content value, False otherwise
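Because LLMMessage is a plain dataclass, the standard helpers from the dataclasses module also apply to instances. A brief, self-contained illustration:
from dataclasses import dataclass, asdict, replace, fields

@dataclass
class LLMMessage:
    content: str

msg = LLMMessage(content="original text")
print(asdict(msg))                     # {'content': 'original text'}
print(replace(msg, content="edited"))  # LLMMessage(content='edited')
print([f.name for f in fields(msg)])   # ['content']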
Attributes
| Name | Type | Description | Scope |
|---|---|---|---|
| content | str | The message content as a string. This is the primary data field that stores the actual text of the message for LLM processing. | instance |
Dependencies
dataclasses, typing, logging, config, openai, langchain_anthropic
Required Imports
from dataclasses import dataclass
Conditional/Optional Imports
These imports appear in the source file but are not directly used by the LLMMessage class itself:
from typing import Optional
import logging
import config
from openai import OpenAI
from openai import AzureOpenAI
from langchain_anthropic import ChatAnthropic
Usage Example
from dataclasses import dataclass

@dataclass
class LLMMessage:
    content: str

# Create a message instance
message = LLMMessage(content="Hello, how can I help you today?")

# Access the content
print(message.content)  # Output: Hello, how can I help you today?

# Create another message
response = LLMMessage(content="I need help with Python programming.")

# Compare messages
print(message == response)  # Output: False

# String representation
print(message)  # Output: LLMMessage(content='Hello, how can I help you today?')
Best Practices
- LLMMessage is a simple data container with no methods or complex logic, making it straightforward to use.
- Since it's a dataclass, instances are mutable by default. If immutability is desired, use frozen=True in the decorator (@dataclass(frozen=True)); a sketch follows this list.
- The class has no validation logic, so any string (including empty strings) can be passed as content.
- This class is typically used as part of a larger conversation or message history system for LLM interactions.
- Consider extending this class with additional fields such as role (user/assistant/system), timestamp, or metadata if more complex use cases arise; a sketch follows this list.
- The dataclass decorator automatically provides __init__, __repr__, __eq__, and __hash__ (if frozen) methods.
- No special cleanup or resource management is needed; instances can be created and discarded freely.
- For type safety, always pass a string to the content parameter during instantiation.
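A minimal sketch of the two suggestions above: an immutable variant using frozen=True and a hypothetical extended message with role and metadata fields. Neither class exists in llm_factory.py.
from dataclasses import dataclass, field

@dataclass(frozen=True)
class FrozenLLMMessage:
    # Immutable variant: assignment after construction raises FrozenInstanceError,
    # and instances become hashable, so they can be used in sets or as dict keys.
    content: str

@dataclass
class ExtendedLLMMessage:
    # Hypothetical extension with a role and free-form metadata.
    content: str
    role: str = "user"
    metadata: dict = field(default_factory=dict)

msg = FrozenLLMMessage(content="Hello")
# msg.content = "changed"  # would raise dataclasses.FrozenInstanceError
turn = ExtendedLLMMessage(content="Hi there", role="assistant")
print(turn)  # ExtendedLLMMessage(content='Hi there', role='assistant', metadata={})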
Similar Components
AI-powered semantic similarity - components with related functionality:
- class ChatMessage (67.3% similar)
- class LLMClient_v1 (55.4% similar)
- class AzureOpenAIChatLLM (55.3% similar)
- class LLMClient (55.3% similar)
- function get_llm_instance (55.1% similar)