Class AIUtils
This class handles the construction of prompts, communication with the Ollama API, and parsing of AI responses.
Usage Example:
AIUtils.ModelResponseFormat response = AIUtils.getInstance().generateResponse(chatHistory, chatConfig, isQuizMode);
- Author:
- Justin.
-
Nested Class Summary
Nested Classes (Modifier and Type: class):
- The response format for the AI's responses (AIUtils.ModelResponseFormat).
- Format of a single option in a quiz question generated by the AI.
- A singular question in a quiz generated by the AI (AIUtils.Question).
- Structure of a quiz generated by the AI.
-
Method Summary
Methods:
- AIUtils.ModelResponseFormat generateResponse(List<Message> history, Chat chatConfig, boolean isQuizMode): Calls /api/chat on the Ollama host, generating an AI response based on chat history and configuration.
- static AIUtils getInstance(): Gets the singleton instance of AIUtils.
- boolean hasModel(): Calls /api/tags on the Ollama host to check if modelName is available.
- boolean isOllamaRunning(): Pings the Ollama host to check if it is running.
- void setVerbose(boolean verbose): Sets the verbosity of OllamaAPI output.
- static boolean validateQuizResponse(AIUtils.ModelResponseFormat response): Validates the structure and content of an AI-generated quiz response.
-
Method Details
-
getInstance
public static AIUtils getInstance()
Gets the singleton instance of AIUtils.
- Returns:
- The singleton instance.
- Throws:
- RuntimeException - if initialization fails.
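A getInstance() with this shape is commonly built on the initialization-on-demand holder idiom. The sketch below is illustrative only; the actual AIUtils implementation is not shown in this documentation, and the class and field names here are placeholders.

```java
// Sketch of a lazily initialized, thread-safe singleton via the holder idiom.
// "SingletonSketch" is a stand-in for AIUtils; real initialization (e.g.,
// configuring the Ollama client) would happen in the private constructor,
// with failures wrapped in a RuntimeException as documented above.
public class SingletonSketch {

    private SingletonSketch() {
        // Expensive one-time initialization would go here.
    }

    private static class Holder {
        // Initialized on first access to getInstance(); the JVM guarantees
        // this happens exactly once, even under concurrent access.
        static final SingletonSketch INSTANCE = new SingletonSketch();
    }

    public static SingletonSketch getInstance() {
        return Holder.INSTANCE;
    }

    public static void main(String[] args) {
        // Both calls return the same instance.
        System.out.println(getInstance() == getInstance()); // true
    }
}
```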
-
validateQuizResponse
public static boolean validateQuizResponse(AIUtils.ModelResponseFormat response)
Validates the structure and content of an AI-generated quiz response.
This method thoroughly checks the response format to ensure it contains valid quizzes, questions, and options. First, the response cannot be null or empty. If the response passes these checks, each quiz is checked for a title and a non-empty list of questions. Furthermore, each question must satisfy all the fields of the AIUtils.Question class: a valid question number, question content, and at least one option. Each question must also include at least one correct answer option; otherwise, the quiz response is considered invalid.
- Parameters:
- response - The AIUtils.ModelResponseFormat to validate.
- Returns:
- True if the quiz response is valid, false otherwise.
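The validation rules above can be sketched as a standalone check. The record types below are illustrative stand-ins for the AIUtils nested classes (the real field names and types may differ):

```java
import java.util.List;

// Standalone sketch of the documented validation rules, using placeholder
// record types in place of the AIUtils nested classes.
public class QuizValidationSketch {

    record Option(String content, boolean isCorrect) {}
    record Question(int questionNumber, String content, List<Option> options) {}
    record Quiz(String title, List<Question> questions) {}

    static boolean validateQuiz(Quiz quiz) {
        // A quiz must have a title and a non-empty list of questions.
        if (quiz == null || quiz.title() == null || quiz.title().isBlank()) return false;
        if (quiz.questions() == null || quiz.questions().isEmpty()) return false;
        for (Question q : quiz.questions()) {
            // Each question needs a valid number, content, and options.
            if (q.questionNumber() <= 0) return false;
            if (q.content() == null || q.content().isBlank()) return false;
            if (q.options() == null || q.options().isEmpty()) return false;
            // At least one option must be marked correct.
            if (q.options().stream().noneMatch(Option::isCorrect)) return false;
        }
        return true;
    }

    public static void main(String[] args) {
        Quiz valid = new Quiz("Java Basics", List.of(
                new Question(1, "What is the JVM?", List.of(
                        new Option("A virtual machine", true),
                        new Option("A compiler", false)))));
        Quiz noCorrectAnswer = new Quiz("Broken", List.of(
                new Question(1, "Pick one", List.of(new Option("A", false)))));
        System.out.println(validateQuiz(valid));           // true
        System.out.println(validateQuiz(noCorrectAnswer)); // false
    }
}
```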
-
isOllamaRunning
public boolean isOllamaRunning()Pings Ollama host to check if it is running.Ollama is by default running locally at http://127.0.0.1:11434/.
On Windows, having the Ollama host set to "localhost" will not be equivalent to 127.0.0.1
To setup ollama, `ollama serve` must be run in the terminal.
- Returns:
- True if Ollama is running, false otherwise.
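A ping like this can be sketched with the JDK's built-in HTTP client. This is not the AIUtils implementation, just a minimal illustration of the check; note the explicit 127.0.0.1 address rather than "localhost", per the Windows caveat above:

```java
import java.net.URI;
import java.net.http.HttpClient;
import java.net.http.HttpRequest;
import java.net.http.HttpResponse;
import java.time.Duration;

// Sketch of an isOllamaRunning-style ping against the default local Ollama
// host. A 200 response counts as running; connection failures (e.g., the
// server has not been started with `ollama serve`) count as not running.
public class OllamaPingSketch {

    static boolean isOllamaRunning(String host) {
        try {
            HttpClient client = HttpClient.newBuilder()
                    .connectTimeout(Duration.ofSeconds(2))
                    .build();
            HttpRequest request = HttpRequest.newBuilder(URI.create(host)).GET().build();
            HttpResponse<String> response =
                    client.send(request, HttpResponse.BodyHandlers.ofString());
            return response.statusCode() == 200;
        } catch (Exception e) {
            return false;
        }
    }

    public static void main(String[] args) {
        System.out.println("Ollama running: " + isOllamaRunning("http://127.0.0.1:11434/"));
    }
}
```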
-
hasModel
public boolean hasModel()
Calls /api/tags on the Ollama host to check if modelName is available.
- Returns:
- True if the model is available, false otherwise.
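Ollama's /api/tags endpoint returns the locally available models as JSON of the shape {"models":[{"name":"..."}, ...]}. The sketch below illustrates the check against a hardcoded sample payload using plain string matching; the real method would fetch the payload from the Ollama host, and its parsing may differ:

```java
// Minimal sketch of a hasModel-style check. The sample payload and the
// string-matching approach are illustrative only.
public class HasModelSketch {

    // Returns true if a "name" field in the tags payload matches modelName.
    static boolean hasModel(String tagsJson, String modelName) {
        return tagsJson.contains("\"name\":\"" + modelName + "\"");
    }

    public static void main(String[] args) {
        String sample = "{\"models\":[{\"name\":\"llama3:latest\"},{\"name\":\"mistral:7b\"}]}";
        System.out.println(hasModel(sample, "llama3:latest")); // true
        System.out.println(hasModel(sample, "gemma:2b"));      // false
    }
}
```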
-
getModelName
-
setVerbose
public void setVerbose(boolean verbose)
Sets the verbosity of OllamaAPI output. By default, this is false when OllamaAPI is initialized.
- Parameters:
- verbose - True to enable verbose output, false to disable.
-
generateResponse
public AIUtils.ModelResponseFormat generateResponse(List<Message> history, Chat chatConfig, boolean isQuizMode)
Calls /api/chat on the Ollama host, generating an AI response based on chat history and configuration.
The AI model is configured using OptionsBuilder with the specified temperature, number of predictions, and context size. These model parameters are highly dependent on which model is being used.
After the model generates a response, its thinking tokens are stripped out; the response is then further processed by either the processQuizResponse(String) or processChatResponse(String) method, assuming the response is in JSON format, and the result is returned in an AIUtils.ModelResponseFormat. If the model response fails, the default response is an error message.
It is expected, but not enforced, that there is a system prompt for the AI to follow. The system prompt is constructed based on the provided chat configuration and whether the response is for a quiz or general chat. System prompts are loaded via the loadPrompts() method.
- Parameters:
- history - The list of previous Message objects in the chat.
- chatConfig - The Chat configuration (e.g., personality, quiz settings).
- isQuizMode - True if a quiz response is requested, false for a general chat response.
- Returns:
- An AIUtils.ModelResponseFormat containing the AI's response.
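The thinking-token formatting step described above can be sketched as a simple regex pass. The <think>...</think> delimiter below is an assumption (the actual tag format depends on the model in use), and this is not the AIUtils implementation:

```java
// Sketch of stripping a model's "thinking" tokens before JSON processing,
// assuming reasoning is wrapped in <think>...</think> tags.
public class ThinkingTokenSketch {

    static String stripThinking(String raw) {
        // (?s) lets "." match newlines, so multi-line thinking blocks
        // are removed; the remainder is trimmed for JSON parsing.
        return raw.replaceAll("(?s)<think>.*?</think>", "").trim();
    }

    public static void main(String[] args) {
        String raw = "<think>The user wants JSON.</think>{\"response\":\"Hello!\"}";
        System.out.println(stripThinking(raw)); // {"response":"Hello!"}
    }
}
```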
-