We are entering an era in which many of the concepts created during the emergence of computing and the internet will be reused for AI.
Viewed through one lens, there are many similarities between an LLM and traditional software.
Back in the day, someone created a piece of software, and it too tended to generate value.
For example, someone made a word processor.
What is happening there? It’s some code that takes an input and generates an output for you to see.
Now, what do we have?
With AI, we have a model. We give it some input, and it generates some output for us.
The parallel between the two is that both tend to generate value from inputs.
Back in the early computer days, we had isolated applications. Then we understood that by using operating systems, we could connect different value generators inside our computer to each other.
After that, we asked: why not connect the computers themselves?
For that, we started to use local networks and began connecting one computer to another, but only those we could physically get our hands on.
Then came the internet, with DNS to name its hosts, which allowed us to connect to computers we could never have imagined existed, let alone physically reach.
I think the same thing will happen with AI.
We have LLM1, LLM2, LLM3, and we use all of them.
Some LLMs are for image generation, some for text generation, some for coding, and so on.
Right now, these are very isolated.
Soon, we will see something akin to what emerged as the operating system in the early computer days—and later, something like DNS.
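To make the analogy concrete, here is a minimal sketch of what such a connective layer might look like: a registry that resolves a task type to a model, much as DNS resolves a hostname to an address. All model names here are hypothetical placeholders, not real endpoints or any existing product's API.

```python
# A hypothetical "DNS-like" routing layer for models: given a task type,
# resolve which model should handle it. Model names are made up for
# illustration only.
MODEL_REGISTRY = {
    "image": "image-model-v1",
    "text": "text-model-v1",
    "code": "code-model-v1",
}

def resolve_model(task_type: str) -> str:
    """Resolve a task type to a model name, the way DNS resolves a hostname."""
    try:
        return MODEL_REGISTRY[task_type]
    except KeyError:
        raise ValueError(f"No model registered for task type: {task_type}")
```

The point is not the code itself but the shape of it: once a shared naming and resolution layer exists, callers no longer need to know which model they are talking to, just as browsers never needed to know a server's IP address.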
A lot of things from the early internet era are waiting to be explored and implemented in AI.
Essentially, it’s a replay of the internet’s architectural evolution—this time in the substrate of intelligence rather than computation.
Of course, the software era flourished because the outputs those systems created were deterministic and reliable.
But with the probabilistic nature of LLM outputs, there’s a much higher chance of hallucination.
That’s why we need an orientation layer that stabilizes those outputs.
And tools like Rohkun do just that.
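As a rough illustration of what "stabilizing" a probabilistic generator can mean (this is a generic pattern sketched under assumptions, not Rohkun's actual implementation), one common approach is to validate each output and retry until it passes:

```python
import json

def stabilize(generate, validate, max_retries=3):
    """Call a probabilistic generator until its output passes validation.

    `generate` is any callable returning a string; `validate` returns True
    when the output is acceptable. Both are supplied by the caller; this is
    an illustrative pattern, not a specific tool's API.
    """
    for _ in range(max_retries):
        output = generate()
        if validate(output):
            return output
    raise RuntimeError("No valid output within retry budget")

def looks_like_json(text):
    """Example validator: accept only well-formed JSON."""
    try:
        json.loads(text)
        return True
    except ValueError:
        return False

# Example: a flaky generator that succeeds on the second attempt.
_attempts = iter(["not json", '{"answer": 42}'])
result = stabilize(lambda: next(_attempts), looks_like_json)
```

The validator is where determinism is reintroduced: the generator may hallucinate, but only outputs that satisfy an explicit, checkable contract are allowed through.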