A Primordial Computing Soup
Fostering AI art scenius, creating an open planetary network of robots
In the last Obliquities column, The Fabric and the Brain, I offered a conceptual vision of how protocols and AI might work together to form stable ecologies of high-personality computing infrastructures that span the planet. The basic idea is that AI capabilities take the form of distributed populations of diverse AIs. This is the brain part. The protocol capabilities weave them together in specific ways, allowing a particular ecological personality to emerge from the varied individuals in the population. This is the fabric part, which makes the sum greater than the parts. Put many such ecologies together, and you get a particular vision of planetary computation.
In this installment, I want to provide two quick examples of how this might work at the level of individual ecologies, and sketch out how many more such ecologies might form a primordial computing soup.
AI Art Scenius with Titles
The first example is TITLES (which also has a Substack called TITLES), the generative art platform that we use to produce the artwork for Protocolized. The brain part of Titles is a pipeline for making fine-tuned models from a particular artist’s art collections. The high-personality part is that each model reflects a distinct individual artist’s style for that project.
The fabric part is a rather clever “creator studio” for composing these individual models together, to create an ecology based on “sampling” multiple models (in the sense of sampling in music) to create new artwork. The fabric accomplishes two things – combining multiple models together in a mathematically meaningful way, and keeping track of the contributions to allow for attribution and profit-sharing. The overall ecology also has a personality, similar to how music scenes can have personalities.
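To make the two jobs of the fabric concrete, here is a toy sketch of what “sampling” models with attribution tracking might look like. This is not Titles’ actual implementation; the model representation (a flat parameter vector), the blending rule, and the idea that sampling weights double as profit-sharing shares are all assumptions made for illustration.

```python
# Toy sketch of "sampling" multiple artist models: blend parameter
# vectors with normalized weights, and record the same normalized
# weights as attribution shares. All names and numbers are invented.

def sample_models(models, weights):
    """Blend per-artist parameter vectors by normalized weight."""
    total = sum(weights.values())
    shares = {artist: w / total for artist, w in weights.items()}
    dim = len(next(iter(models.values())))
    blended = [0.0] * dim
    for artist, params in models.items():
        s = shares.get(artist, 0.0)
        for i, p in enumerate(params):
            blended[i] += s * p
    # shares doubles as the attribution ledger for profit-sharing
    return blended, shares

models = {"ada": [1.0, 0.0], "bo": [0.0, 1.0]}
blended, shares = sample_models(models, {"ada": 3, "bo": 1})
# shares -> {"ada": 0.75, "bo": 0.25}; payouts can follow the same split
```

The point of the sketch is that a single mathematical operation produces both outputs the fabric needs: the composite artwork-generating model, and an auditable record of who contributed how much to it.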

An Open Planetary Network of Robots
The second example is more complex, and one I’m involved in personally – the Yak Robotics Garage (YaRG) project.
The goal of this project is to create a planet-wide network of open-source rovers and other robots (such as drones), as a stepping stone towards rover networks on the moon and Mars. The idea started with Anuraj R. (a Protocol School alum) figuring out how to teleoperate robots securely, in exchange for blockchain payments, and then generalizing the mechanism to use the ERC 8004 protocol (a sort of onchain directory and rating service for AI agents) to drive discovery of available robots for tasking.
Summer of Protocols researcher rafa then joined in the fun and prototyped an auction marketplace to allow for posting of jobs for robots, and bidding by robots able to do them. There is currently a demo marketplace going (with dummy data, and a mix of real and virtual rovers, but real prototype protocol plumbing behind it) and plans underway to test the technology in the construction sector.
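The mechanics of such a marketplace can be sketched in a few lines. This is a toy model for illustration only, not the prototype’s actual data model: the field names, the capability-subset check, and the lowest-bid award rule are all assumptions.

```python
from dataclasses import dataclass, field

# Toy auction marketplace: post a job, collect bids from robots that
# claim the required capabilities, award to the lowest bid. The award
# rule and field names are illustrative, not the real prototype's.

@dataclass
class Bid:
    robot_id: str
    price: float
    capabilities: set

@dataclass
class Job:
    task: str
    required: set
    bids: list = field(default_factory=list)

    def submit(self, bid: Bid):
        # only robots whose capabilities cover the job may bid
        if self.required <= bid.capabilities:
            self.bids.append(bid)

    def award(self):
        return min(self.bids, key=lambda b: b.price) if self.bids else None

job = Job("survey construction site", required={"camera", "gps"})
job.submit(Bid("rover-1", 40.0, {"camera", "gps", "lidar"}))
job.submit(Bid("drone-7", 55.0, {"camera", "gps"}))
job.submit(Bid("rover-2", 10.0, {"gps"}))  # lacks a camera, rejected
winner = job.award()
# winner.robot_id -> "rover-1"
```

Even this toy version shows why the mix of real and virtual rovers works for testing: the marketplace logic never needs to know whether the bidder is hardware or simulation.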
Where does AI fit in here?
Well, the problem with operating an open network of rovers in the real world is that there can be a dizzying variety of hardware types with different capabilities, owned by a large variety of actors of different levels of trustworthiness, situated in different environments. There can be all sorts of potential operators anywhere on the planet – or even on an entirely different planet – with varied skill levels.
Rather than relying on brittle, specialized command modes, you want high-intelligence robots of all sorts to expose their capabilities to potential users/customers via a flexible command surface, and high-intelligence clients commanding them using LLMs that can understand their varied technical capabilities and map them to the needs of particular tasks and missions.
So you use MCP (Model Context Protocol) to expose the capabilities, ERC 8004 (try searching for “robot”) to discover the capabilities, LLM agents to use those capabilities to get tasks done, and either traditional or blockchain rails, using the x402 protocol, to organize a marketplace for robotic services to be provisioned and procured for money.
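The shape of that flow, discovery, capability matching, tasking, and settlement, can be sketched end to end. Everything below is a stand-in: the registry entries, tool names, and payment log are invented placeholders for what the real MCP, ERC 8004, and x402 plumbing would provide.

```python
# Flow sketch: discover robots in a directory, match capabilities to a
# task, invoke an exposed command, settle payment. All interfaces here
# are stand-ins for the real MCP / ERC 8004 / x402 machinery.

registry = [  # what an onchain directory lookup might return
    {"id": "rover-1", "tools": ["drive_to", "take_photo"], "rate": 5},
    {"id": "drone-7", "tools": ["fly_to", "take_photo"], "rate": 9},
]

def discover(registry, needed_tools):
    """Filter the directory down to robots exposing every needed tool."""
    return [r for r in registry if set(needed_tools) <= set(r["tools"])]

def run_task(robot, tool, payment_log):
    """Invoke a tool and record an x402-style payment for the service."""
    payment_log.append((robot["id"], robot["rate"]))
    return f"{robot['id']}.{tool} executed"

payments = []
candidates = discover(registry, ["drive_to", "take_photo"])
result = run_task(candidates[0], "take_photo", payments)
# result -> "rover-1.take_photo executed"; payments -> [("rover-1", 5)]
```

In the real system, the LLM agent sits where the hard-coded `discover` and `run_task` calls are here, reading the exposed command surface and deciding which capabilities map to the mission at hand.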
Those are just the main moving parts in a rather complex scheme – but one in which all the complexity is mainly dealt with by AIs rather than humans. Here is an explainer video (AI generated) of the technical infrastructure behind the scheme:
Here is a simple demo video of the basic protocol in action with a real robot. And here’s another video with Anuraj and Rafa demonstrating the auction marketplace in action.
It might not seem like much compared to the spectacular robot demonstration videos you find all over social media these days, but the point is not the robots themselves, or what they do, but that it is all being orchestrated over the open internet, using mechanisms that can potentially scale planet-wide without being owned or controlled by any single entity, such as a powerful corporation or state.
In this example, the brain is distributed across multiple rovers and the LLMs that can control them. The fabric is a stack of different protocols handling various coordination needs, ranging from discovery and verification of capabilities in a variable-trust market environment, to actually enabling the teleoperation connection, to handling the auditing of results and completing any financial transactions as agreed upon. All in high-speed automated ways that still allow for case-by-case judgment and decision-making by AIs supervised by humans.
It is worth comparing this vision to a competing one: the kind promoted by vertically integrated robotics companies through jazzy demos featuring robots doing impressive acrobatics in controlled environments. These visions typically rely on highly integrated and closed products, even if they sometimes pay lip service to open-source affordances for some parts of the whole picture. They are comparable to early proprietary computing networks, or to contemporary social media platforms owned by large corporations.
An open robotics marketplace, on the other hand, would be more like the open internet – anyone with a robot of any sort (from a small hobby rover in someone’s basement to a billion-dollar rover on Mars) could potentially join, and connect with anyone else with a need for that particular robot’s capabilities and the ability to pay for them. It would be messy, janky, and glued-together. It would form a kind of tangled bank of artificial organisms competing for survival in an atomized market-like environment.
Which world would you rather live in? Yet another world of monopolistic platforms, or a cheerful anarchy of robots and their owners wheeling and dealing in an open economy?
The Primordial Soup
These are just two examples of how protocols and AI can be put together in creative ways. There are dozens of others being experimented with right now, ranging from the viral and highly visible OpenClaw ecosystem to obscure and specialized ones that are as yet only crazy ideas in the heads of teenaged hackers.
Over the next decade, we’ll probably see tens of thousands of such brain-and-fabric ecologies take shape independently. They will likely fall into loosely similar families of patterns. Some may converge, others may diverge, just like biological ecosystems.
If you think that’s a fun vision, imagine what could happen once these ecologies begin to run into each other and interconnect. Thanks to AIs, protocol systems that would have been non-interoperable in older technology paradigms will be able to automatically figure out how to talk to each other, forming squishy, oozy interfaces with each other, cobbled together by AI agents feeling each other out and inventing pidgins as they go. When AI is cheap enough, and the basic fabric capable enough, inventing a language even for just two entities to talk to each other for one short interaction becomes possible.
Take even the two examples in this essay. We can imagine photography robots in different parts of the world in the Yak network submitting photos to Titles to train individual models based on their particular image-making capabilities (such as different types of camera). Users could then sample those models to synthesize composite images – strange new images seen by wholly synthetic robotic eyes.
Imagine that sort of thing, but in a primordial soup of thousands of ecologies.
As this process unfolds over the years, and the primordial soup boils and bubbles, a planetary computational character will begin to emerge, in the form of a planet-scale distributed brain, integrated and orchestrated by an emergent world fabric.