O_R_I_O_N@lemm.ee to Selfhosted@lemmy.world • What can I use for an offline, selfhosted LLM client, pref with images, charts, python code execution

ChainLit is a super easy UI too. Ollama works well with Semantic Kernel (for integration with existing code) and LangChain (for agent orchestration). I'm working on building MCP interaction with ComfyUI's API; it's a pain in the ass.
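For anyone wanting to skip the framework layer entirely: a locally running Ollama server also exposes a plain REST API, so you can talk to it with nothing but the standard library. A minimal sketch below builds the JSON body for Ollama's `/api/chat` endpoint (default port 11434); the model name `llama3` and the prompt are just placeholders, swap in whatever you have pulled.

```python
import json
import urllib.request

def build_ollama_chat_request(model, messages, stream=False):
    """Build the JSON body Ollama's /api/chat endpoint expects."""
    return json.dumps({
        "model": model,          # any model you've pulled, e.g. "llama3"
        "messages": messages,    # OpenAI-style role/content dicts
        "stream": stream,        # False = single JSON response
    }).encode("utf-8")

def chat(model, prompt, host="http://localhost:11434"):
    """Send one chat turn to a local Ollama server and return the reply text."""
    body = build_ollama_chat_request(model, [{"role": "user", "content": prompt}])
    req = urllib.request.Request(
        f"{host}/api/chat", data=body,
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["message"]["content"]

# Example (requires a running Ollama server):
# print(chat("llama3", "Say hi in one word."))
```

Frameworks like LangChain or Semantic Kernel wrap this same endpoint; this is just handy for debugging or for keeping a selfhosted setup dependency-free.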