Cool VL Viewer forum

Model Context Protocol?

Joined: 2011-10-07 10:39:20
Posts: 214
Hi Henri,

I just stumbled over MCP (https://modelcontextprotocol.io/introduction); it is basically a way to connect an LLM / AI to an arbitrary program like Blender, a text editor, or others.

It's mostly a bit of JSON-RPC that can be used to control or query a program. There is a server side in the program (e.g. a Blender plugin) and a local client program connected to a cloud-hosted or locally running LLM.

I wonder how hard it would be to add an MCP server to the viewer? The Lua automation interface probably provides nearly all the useful functions already, so it would mostly just need the JSON-RPC parts. I guess on Linux this could be hacked together with the D-Bus bindings and by merging it with one of the standalone MCP servers available.
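
To illustrate what I mean (a rough sketch only, not working viewer code): the JSON-RPC part boils down to decoding a request, looking the method up in a table of exposed functions, and encoding a reply. In Lua, assuming the dkjson library is available; the inventory function is a made-up stub standing in for a real viewer call:

Code:
-- Sketch of a minimal JSON-RPC 2.0 dispatcher that could sit between an
-- MCP transport and the viewer's Lua automation functions. Assumes the
-- dkjson library; ListInventoryFolder is a hypothetical stub.
local json = require("dkjson")

local function ListInventoryFolder(folder)   -- hypothetical stub
  return { folder = folder, items = {} }
end

-- Map JSON-RPC method names to exposed functions.
local methods = {
  ["inventory/list"] = function(params)
    return ListInventoryFolder(params.folder)
  end,
}

-- Decode one request, call the matching function, encode the reply.
local function handle_request(line)
  local req = json.decode(line)
  local fn = methods[req.method]
  local resp
  if fn then
    resp = { jsonrpc = "2.0", id = req.id, result = fn(req.params) }
  else
    resp = { jsonrpc = "2.0", id = req.id,
             error = { code = -32601, message = "Method not found" } }
  end
  return json.encode(resp)
end

print(handle_request(
  '{"jsonrpc":"2.0","id":1,"method":"inventory/list","params":{"folder":"/Clothes"}}'))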

I could imagine this being pretty useful for cleaning/sorting inventories, similar tasks, or scripting stuff.


2025-05-17 11:50:13

Joined: 2009-03-17 18:42:51
Posts: 5995
While I'm indeed planning to allow using external LLMs via Lua (*), this interface/protocol seems too specific and does not even provide native Linux support.

I so far explored other ways, such as the use of simple HTTP to communicate with either local (e.g. KoboldCpp) or remote LLM servers, which is way more universal and future-proof, and does not involve introducing more dependencies to build or run the viewer.

In fact, I'll likely simply provide an HTTP request Lua function (making use of the existing LLCoreHttpUtil methods in the viewer, including the ones for dealing with JSON requests, with added Lua table/JSON conversion), with an HTTP callback... From there, you can communicate with any LLM providing an HTTP API (most, if not all, do provide one).
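
Something along these lines (just a sketch: the HttpRequest name and its signature are placeholders, nothing final; the endpoint and reply format are the ones of KoboldCpp's Kobold-compatible HTTP API):

Code:
-- Sketch only: HttpRequest and its signature are placeholders for the
-- planned Lua binding, not an actual viewer API.
HttpRequest{
  url    = "http://127.0.0.1:5001/api/v1/generate",
  method = "POST",
  -- The body is a plain Lua table, converted to JSON by the binding:
  body   = { prompt = "Summarize this notecard: ...", max_length = 200 },
  -- The callback gets the HTTP status and the JSON reply converted back
  -- to a Lua table:
  callback = function(status, reply)
    if status == 200 then
      print(reply.results[1].text)
    end
  end,
}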


(*) I did try SL AI characters but they are disappointing, and running a local LLM would provide much better results.


2025-05-19 18:46:49

Joined: 2011-10-07 10:39:20
Posts: 214
That's a good point.

MCP seems to be a bit of a hype topic at the moment, so letting things brew a little is probably not a bad idea.

Your perception that MCP does not provide Linux support is a bit off, though. There are various SDKs to build MCP servers (none for C/C++ yet, last time I looked) that do work on Linux. In the end, it's just TCP sockets, HTTP + JSON-RPC, so nothing too outlandish.

MCP is a way for a program to expose its APIs to some AI/LLM, so this is different from what you are wanting to do.
Having the option to send some prompts/commands to a local LLM and parse the replies, maybe via Lua, is the thing you talk about. That's already nice for a lot of use cases.

But if the viewer were an MCP server, you would have use cases like saying: "Hey, AI bot. Please sort my SL inventory for Kathrine Jansma. Make links for all of my shoes in the /Clothes/Shoes folder." The LLM would then use MCP to connect to the running viewer, ask for the available commands, and just do it directly by calling the appropriate Lua functions via MCP.
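
Roughly like this (a sketch with the payloads written as Lua tables, i.e. what the JSON would look like before encoding; the "tools/list" and "tools/call" method names come from the MCP spec, but the tool itself is invented for illustration):

Code:
-- What the viewer could advertise in its reply to a "tools/list" request
-- (tool name and schema invented for illustration):
local tools_list_result = {
  tools = {
    { name = "make_inventory_links",
      description = "Create links to matching inventory items in a folder",
      inputSchema = {
        type = "object",
        properties = {
          pattern = { type = "string" },   -- e.g. "shoes"
          target  = { type = "string" },   -- e.g. "/Clothes/Shoes"
        },
        required = { "pattern", "target" },
      },
    },
  },
}

-- What the LLM would send back to invoke that tool:
local tools_call_request = {
  jsonrpc = "2.0", id = 2, method = "tools/call",
  params = {
    name = "make_inventory_links",
    arguments = { pattern = "shoes", target = "/Clothes/Shoes" },
  },
}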

Or you could tell the LLM: "Hey, AI bot. Please dress my avatar in SL. I want some sexy-looking black dress and heels." It could then look at the preview pictures in the inventory, parse them, figure things out, and actually dress the avatar by wearing the items.

This of course also works bi-directionally: if the viewer were an MCP client, it could tell e.g. a locally running Ollama with MCP support to start Blender, render some meshes, then import them into SL via the viewer. For example, the Blender MCP website has a tutorial showing how this is used to create a landscape in Minecraft... (https://blender-mcp.com/).

So you kind of did not see the point yet: MCP is more like a D-Bus/COM way to expose APIs to the outside world. It is not a way to call into an LLM; it's the receiver for calls by LLMs.


2025-05-21 15:18:04

Joined: 2009-03-17 18:42:51
Posts: 5995
kathrine wrote:
There are various SDKs to build MCP servers (none for C/C++ yet, last time I looked)
That's what I meant: if you need Python at runtime, then Python becomes another dependency needed to run the viewer...
I don't want anything of that (no bloat, minimum dependencies for the host system to run the viewer, everything either statically linked to the viewer or provided in the viewer package as shared libraries).

kathrine wrote:
you would have usecases to say like: "Hey, AI bot. Please sort my SL inventory for Kathrine Jansma. Make links for all of my shoes in the /Clothes/Shoes folder."
You would be disappointed by the result... LLMs are notably unreliable and subject to many hallucinations (and when they don't know how to reply to your prompt, they make up something totally inept: current LLMs are incapable of replying "I don't know" or "I'm really not sure" to a prompt). I, for one, would never trust one to sort my inventory.

kathrine wrote:
Or you could tell the LLM "Hey, AI bot. Please dress my avatar in SL. I want some sexy looking black dress and heels." And it could then look at the preview pictures in inventory, parse them, figure out stuff and actually dress the avatar by wearing things.
  1. It would involve implementing a shitload of Lua functions to recover textures, send them to the AI (which would also need to be competent in image recognition, so not just a simple LLM), then offer Lua commands to dress up your avatar (i.e. wear/remove inventory items), and yet others to report each step's success/failure (e.g. did the outfit get fully worn, rezzed and baked, etc.).
  2. You would be limited by the context size: you would need to explain to the LLM how SL and an SL viewer work, what commands it can use, what an inventory is, how to interpret the inventory thumbnails, what is "sexy" and what is not, etc. In total (inventory list/metadata + explanations/rules prompts), it would need megabytes of context (a 10,000-item inventory at a couple hundred bytes of metadata per item is already around 2 MB, before any instructions), when current LLMs can barely deal with more than a few dozen kilobytes...

Unrealistic, in the current state of LLMs... at least if you don't want to use a supercomputer to sort your inventory or dress up your avatar. :lol:

No, for me, the one and only useful use case for LLMs in SL is role-playing (and yes, it can be fun to role-play with an LLM: some models (*) have been trained for such a purpose and are quite decent role-play partners).


(*) This one is pretty good for adult (uncensored/jail-broken) role-play with KoboldCpp.


2025-05-21 16:46:08