# Chat thing
*(better name pending, probably)*
## What is it?
Well, it's basically a front end for Ollama, like Open-WebUI, but meant to run a local model and expose a reasonably easy-to-use API system for tools.
## Use cases?
While the primary goal is to provide a somewhat more modular frontend for Ollama, you can also use this system to integrate AI into your own application. For example, the AI could act as a player in a game, or interact with external systems through tools exposed via the API.
Simple examples are in the Display module, where the model is given the ability to access Python through Docker and to get the current date and time, both implemented as an `OllamaFunctionTool` through my Ollama framework.
## API
The documentation for the API is available in the Gitea wiki under API docs.
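As a rough illustration of what talking to the API looks like, here is a hypothetical sketch. The `/message` path and the JSON shape are illustrative assumptions, not the real routes; check the wiki's API docs for the actual endpoints. It assumes the server was started with `-s` on the default port 39075:

```shell
# Hypothetical example; the real endpoints are documented in the Gitea wiki.
NEUROCHAT_URL="http://localhost:39075"

# Send a query to the model (path and payload fields are assumptions):
curl -X POST "$NEUROCHAT_URL/message" \
     -H "Content-Type: application/json" \
     -d '{"message": "What time is it?"}'
```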
## How to run?
To run, you need to build the launcher module:

```shell
$ ./gradlew :launcher:shadowJar
```

This will put `launcher-1.0-all.jar` (or similar, depending on the version) in `./launcher/build/libs`. The jar can now be copied anywhere and run with:

```shell
$ java -jar launcher-1.0-all.jar
```
You can use the `-h` argument to get a help message:
```
Launch options for NeuroChat:
  -h --help     Provides this help message
  -s --server   Starts the application as API server
  -p --port     Provides the port number that the API server should use, defaults to 39075
  -o --output   Redirects the API Server output to another file
  --api         Provides API docs
```
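Putting those options together, a headless API-server launch might look like the sketch below. The jar name depends on the version your build produced, and `api-server.log` is just an example file name:

```shell
# Start NeuroChat as an API server on the default port, logging to a file.
JAR="launcher-1.0-all.jar"   # adjust to whatever :launcher:shadowJar produced
PORT=39075                   # default API port, override with -p/--port

java -jar "$JAR" --server --port "$PORT" --output api-server.log
```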
## How to build? And general information
If you only want to build a specific module, you can use the same command as listed in How to run?, replacing `launcher` with the desired module.
- API: `$ ./gradlew :API:shadowJar`
  - Depends on `Core`
- Display: `$ ./gradlew :Display:shadowJar`
  - Depends on `Core` and `MALAPITool`
- Core: `$ ./gradlew :Core:shadowJar`
  - However, this one is kinda useless on its own, unless you want to directly implement the system into your application
- MALAPITool: `$ ./gradlew :MALAPITool:shadowJar`
  - Please read the MALAPITool README.md
- Launcher: `$ ./gradlew :launcher:shadowJar`
  - Depends on `API`, `Display` and `Core`
  - This is the main module that runs the application and starts the API server
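If you want to build every module in one go, the per-module command above can simply be looped. A small convenience sketch, using the module names listed above:

```shell
# Build the shadow jar for each module listed above, one at a time.
MODULES="API Display Core MALAPITool launcher"

for module in $MODULES; do
  ./gradlew ":$module:shadowJar"
done
```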