Built a new LLM client
Runs entirely in your terminal
Supports any model; just provide your own API key
Also supports custom tools via the Model Context Protocol. So far I've given it a Python interpreter so it can write and run code before answering. Will soon add
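For illustration, here's a minimal sketch of what a Python-interpreter tool like that could look like; the function name, isolation flags, and timeout are my own assumptions, not the client's actual implementation:

```python
import subprocess
import sys

def run_python(code: str, timeout: float = 10.0) -> str:
    """Hypothetical tool: execute a Python snippet in a fresh
    subprocess and return whatever it printed."""
    # -I isolates the interpreter from the user's environment;
    # the timeout guards against runaway code.
    result = subprocess.run(
        [sys.executable, "-I", "-c", code],
        capture_output=True, text=True, timeout=timeout,
    )
    if result.returncode != 0:
        return f"error:\n{result.stderr}"
    return result.stdout

print(run_python("print(2 ** 10)"))  # → 1024
```

A real MCP server would register this as a named tool and speak JSON-RPC to the client, but the core loop is the same: the model emits code, the tool runs it, and the output goes back into the conversation before the final answer.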