For documentation on the MAL API, check out the [API docs](https://myanimelist.net/apiconfig/references/api/v2).

- Display:Display.java
  - Switched the AI model to `qwen3:8b`
  - Some temporary test changes
  - Switched to using static references where they are supposed to be used
- MALAPITool - Module for handling the MAL API
- MALAPITool:README.md - Some general information about the tools
- Started adding some base systems for finding API endpoints; these will later be added to the API module to support adding plugins
- .gitignore - Added so all data folders are ignored, so submodules' data folders from testing aren't added to git
- Core:OllamaObject.java - Switched the location of messages.json from the static location `./cache/` to the dynamic location `${Core.DATA_DIR}/messages.json`
- Core:OllamaPerameter.java - Added ENUM and ARRAY values. Incidentally also discovered that Ollama supports more than just STRING, INT, and BOOLEAN as parameters
- Core:Core.java - Added the day to the name of logging files, now using the format dd_HH-mm-ss
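As a quick illustration of the dd_HH-mm-ss logging file name format mentioned above, here is a minimal sketch using `java.time` (the class and method names here are illustrative, not Core's actual implementation):

```java
import java.time.LocalDateTime;
import java.time.format.DateTimeFormatter;

public class LogFileName {
    // Build a log file name like 07_13-45-09.log from a timestamp,
    // matching the dd_HH-mm-ss format described above.
    static String logFileName(LocalDateTime now) {
        return now.format(DateTimeFormatter.ofPattern("dd_HH-mm-ss")) + ".log";
    }

    public static void main(String[] args) {
        System.out.println(logFileName(LocalDateTime.of(2024, 5, 7, 13, 45, 9)));
        // prints 07_13-45-09.log
    }
}
```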
# Chat thing
(better name pending probably)
## What is it?
Well, it's basically a front end for Ollama, like Open-WebUI, but meant to run a local model and open up a somewhat easy-to-use API system for tools.
## Use cases?
While the primary goal is providing a somewhat more modular frontend for Ollama, you can also use this system to integrate AI into your own application. For example, the AI could act as a player in a game or interact with external systems through tools exposed via the API.
Simple examples are in the Display module, where I gave it the ability to access Python through Docker and to get the current date and time, both with the OllamaFunctionTool through my Ollama framework.
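As a rough sketch of how such a tool hookup works, here is a self-contained stand-in (the names `ToolSketch`, `register`, and `call` are illustrative assumptions for this README, not the project's actual OllamaFunctionTool API):

```java
import java.time.ZonedDateTime;
import java.time.format.DateTimeFormatter;
import java.util.HashMap;
import java.util.Map;
import java.util.function.Function;

// Minimal stand-in for a tool registry; the real OllamaFunctionTool
// in this project likely looks different.
public class ToolSketch {
    static final Map<String, Function<Map<String, String>, String>> TOOLS = new HashMap<>();

    // Register a named tool the model is allowed to call.
    static void register(String name, Function<Map<String, String>, String> fn) {
        TOOLS.put(name, fn);
    }

    // When the model requests a tool by name, dispatch it and return the result.
    static String call(String name, Map<String, String> args) {
        return TOOLS.get(name).apply(args);
    }

    public static void main(String[] args) {
        // A "current date and time" tool, similar in spirit to the Display example
        register("get_datetime", a ->
                ZonedDateTime.now().format(DateTimeFormatter.ISO_OFFSET_DATE_TIME));
        System.out.println(call("get_datetime", Map.of()));
    }
}
```

The point is just the shape of the flow: the model asks for a tool by name, the framework dispatches it, and the string result is fed back into the conversation.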
## API
The documentation for the API is available on the Gitea wiki under API docs.
## How to run?
To run the application, you need to build the launcher module:
```
$ ./gradlew :launcher:shadowJar
```
This will put `launcher-1.0-all.jar` (or similar, depending on the version) in `./launcher/build/libs`. The jar can then be copied anywhere and run with:
```
$ java -jar launcher-1.0-all.jar
```
You can use the `-h` argument to get a help message.
### Launch options for AI_chat
```
-h --help    Provides this help message
-s --server  Starts the application as an API server
-p --port    Provides the port number that the API server should use, defaults to 39075
-o --output  Redirects the API server output to another file
   --api     Provides API docs
```
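Putting the options together, starting the API server might look like this (the jar name depends on your build, and `server.log` is just an example file name):

```shell
# Start the application as an API server on the default port,
# redirecting the API server output to server.log
java -jar launcher-1.0-all.jar --server --port 39075 --output server.log
```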
If you only want to build the Display or API module, you can use the same command but replace `launcher` with the desired module:
- API: `./gradlew :API:shadowJar`
- Display: `./gradlew :Display:shadowJar`
- Core: `./gradlew :Core:shadowJar` (however, this one is kinda useless on its own unless you want to directly implement the system into your application)