Invoke - Local LLM Client

Rating 0.0 · 2.74 MB · 0 reviews · Free
Version 1.1.1 · Updated 0001-01-01 · Developer: kazuhiko sugimoto
Note: This app is intended for users who are able to set up a local LLM server (Ollama or LM Studio) within their own LAN environment. Some technical setup is required.

Chat with your local LLM! Seamlessly connect to Ollama or LM Studio for a fully offline, privacy-focused AI chat experience!

This iOS app connects to a locally hosted Large Language Model (LLM) server for natural, real-time conversations.
Compatible with both Ollama and LM Studio over HTTP, it provides streaming message delivery and intuitive chat history management.
The app operates entirely within your local network, with no internet connection required, making it ideal for users who prioritize privacy and security.
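Ollama's documented streaming chat endpoint (POST `/api/chat` on port 11434) returns one JSON object per line, with partial text in `message.content` and a final object marked `"done": true`; a client like this one assembles the reply as chunks arrive. A minimal Python sketch of that parsing step, using illustrative chunks rather than output captured from the app:

```python
import json

def parse_ollama_stream(lines):
    """Assemble an assistant reply from Ollama's NDJSON chat stream.

    Each line is a standalone JSON object; partial text lives in
    message.content, and the final object carries "done": true.
    """
    reply = []
    for line in lines:
        if not line.strip():
            continue
        chunk = json.loads(line)
        reply.append(chunk.get("message", {}).get("content", ""))
        if chunk.get("done"):
            break
    return "".join(reply)

# Illustrative chunks in the shape /api/chat emits when "stream": true
stream = [
    '{"message": {"role": "assistant", "content": "Hel"}, "done": false}',
    '{"message": {"role": "assistant", "content": "lo!"}, "done": false}',
    '{"message": {"role": "assistant", "content": ""}, "done": true}',
]
print(parse_ollama_stream(stream))  # → Hello!
```

LM Studio instead exposes an OpenAI-compatible `/v1/chat/completions` endpoint, so a client supporting both servers must handle two response shapes.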

Key Features:
- Easy connection to local LLM servers (Ollama / LM Studio)
- Natural chat UI with bubble-style layout
- Auto-saving and browsing chat history
- Server and model selection via settings screen
- Supports Dark Mode
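The server and model selection described above maps onto each server's model-listing endpoint: Ollama reports installed models at GET `/api/tags`, while LM Studio's OpenAI-compatible API lists them at GET `/v1/models`. A hedged Python sketch of normalizing the two response shapes (the sample payloads and the `extract_model_names` helper are illustrative, not taken from the app):

```python
import json

def extract_model_names(payload, server_kind):
    """Normalize a model-list response from either server type.

    Ollama's GET /api/tags returns {"models": [{"name": ...}, ...]};
    LM Studio's OpenAI-compatible GET /v1/models returns
    {"data": [{"id": ...}, ...]}.
    """
    body = json.loads(payload)
    if server_kind == "ollama":
        return [m["name"] for m in body.get("models", [])]
    return [m["id"] for m in body.get("data", [])]

# Illustrative payloads in each server's documented shape
ollama_reply = '{"models": [{"name": "llama3.2:3b"}, {"name": "qwen2.5:7b"}]}'
lmstudio_reply = '{"data": [{"id": "qwen2.5-7b-instruct"}]}'
print(extract_model_names(ollama_reply, "ollama"))      # → ['llama3.2:3b', 'qwen2.5:7b']
print(extract_model_names(lmstudio_reply, "lmstudio"))  # → ['qwen2.5-7b-instruct']
```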
Category: Developer Tools (4119) · Version: 1.1.1 · BundleId: com.gmail.89.sugi.kaz.LocalLLMClient · Developer: kazuhiko sugimoto · Last updated: 0001-01-01