That's a good point.
MCP is a bit of a hype topic at the moment, so letting things brew a little is probably not a bad idea.
Your perception that MCP does not provide Linux support is a bit off, though. There are various SDKs to build MCP servers (none for C/C++ yet, last time I looked) that do work on Linux. In the end, it's just TCP sockets, HTTP + JSON-RPC, so not too outlandish.
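Just to show how unexotic that is, here is roughly what a "call this tool" request looks like on the wire: a plain JSON-RPC 2.0 message, whether it travels over stdio or HTTP. The tool name and arguments are made up for illustration, not any real viewer API.

```python
import json

# Rough shape of an MCP tools/call request: ordinary JSON-RPC 2.0.
# "link_item" and its arguments are invented examples.
request = {
    "jsonrpc": "2.0",
    "id": 1,
    "method": "tools/call",
    "params": {
        "name": "link_item",
        "arguments": {"item": "Black Heels", "folder": "/Clothes/Shoes"},
    },
}
print(json.dumps(request, indent=2))
```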
MCP is a way for a program to expose its APIs to some AI/LLM. So it is different from what you are wanting to do.
Having the option to send some prompts/commands to a local LLM and parse the response, maybe via Lua, is what you are talking about. That's already nice for a lot of use cases.
But if the viewer were an MCP server, you would have use cases like saying: "Hey, AI bot. Please sort my SL inventory for Kathrine Jansma. Make links for all of my shoes in the /Clothes/Shoes folder." The LLM would then use MCP to connect to the running viewer, ask for the available commands, and do it directly by calling the appropriate Lua functions via MCP.
Or you could tell the LLM: "Hey, AI bot. Please dress my avatar in SL. I want a sexy-looking black dress and heels." It could then look at the preview pictures in the inventory, parse them, figure things out, and actually dress the avatar by wearing the items.
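To make that a bit more concrete, here is a rough sketch (using the official Python SDK, pip install mcp) of how a small bridge could expose a viewer Lua function as an MCP tool. The Lua function name and the bridge itself are hypothetical; the real viewer scripting API would look different.

```python
from mcp.server.fastmcp import FastMCP

mcp = FastMCP("SL Viewer Bridge")

def run_viewer_lua(snippet: str) -> str:
    # Stub: a real bridge would hand this snippet to the viewer's Lua
    # scripting layer and return whatever it answers.
    return f"(would run in viewer) {snippet}"

@mcp.tool()
def link_item_to_folder(item_name: str, target_folder: str) -> str:
    """Create an inventory link for item_name inside target_folder."""
    # "link_item" is a made-up Lua function name, just for illustration.
    return run_viewer_lua(f'link_item("{item_name}", "{target_folder}")')

if __name__ == "__main__":
    mcp.run()  # stdio transport by default; an MCP-capable LLM host connects to this
```

An MCP-capable LLM host would list that tool, see its parameters from the type hints, and call it whenever your prompt asks for something matching its description.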
This of course also works in the other direction: if the viewer were an MCP client, it could tell e.g. a locally running Ollama with MCP support to start Blender, render some mesh, and then import it into SL via the viewer. For example, the Blender MCP website (https://blender-mcp.com/) has a tutorial on how this is used to create a landscape in Minecraft.
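The client side is not much more work either. A rough sketch with the same Python SDK, pointing at whatever MCP server you happen to run over stdio (the server command and tool name below are placeholders):

```python
import asyncio
from mcp import ClientSession, StdioServerParameters
from mcp.client.stdio import stdio_client

# Placeholder: launch some MCP server (Blender MCP, the bridge above, ...).
server = StdioServerParameters(command="python", args=["some_mcp_server.py"])

async def main() -> None:
    async with stdio_client(server) as (read, write):
        async with ClientSession(read, write) as session:
            await session.initialize()
            tools = await session.list_tools()  # discover what the server offers
            print([tool.name for tool in tools.tools])
            # "some_tool" is a made-up name; pick one from the list above.
            result = await session.call_tool("some_tool", {"arg": "value"})
            print(result)

asyncio.run(main())
```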
So you kind of missed the point so far: MCP is more like a DBUS/COM-style way to expose APIs to the outside world. It is not a way to call into an LLM; it is the receiving end for calls made by LLMs.