Lackluster superintelligence and the infinite data plane
There’s no doubt that LLMs are great at coding. GPT arrived what feels like only a few years ago, the pace of innovation since has been furious, and right now it feels like that pace has doubled again for software development in particular.
LLMs went roughly from great autocomplete, to chain of thought, to agentic, to recursive, and now to swarms and orchestration. It sure feels like we’re building a dialectic, or a consciousness.
Popular blog posts about things like Gastown and OpenClaw feel leaps and bounds better than where things were even six months ago.
It is primarily in the arena of coding that LLMs are getting better. There is certainly a lot of work being applied in other industries (I’m working on bringing AI effectiveness to patients in healthcare engagements); it’s just that LLMs are naturally better at coding, though perhaps that’s on purpose.
Here’s a thought, though: what if all this coding leads to a “data plane” with perfect input and recall, everywhere and all the time, but nothing else? Even if that period is short in the grand scheme, it might be long on my feeble timescale.
Imagine if data were perfect. That’s the goal of software, right? To turn the raw data of the world into information by categorizing and managing it.
Immediate and nearly free software could power this: you describe to the data what you want, and input and output systems are created instantaneously, perfectly tailored to your purpose. But… then what?
Could it be the end of the information economy?
Of course, there’s a dark possibility: privacy becomes the enemy of those with limited views of the data (due to business rules). Privacy is an artificial limitation placed on the beholder because of those rules, and those with low visibility clamor to become those with high visibility into the data (“look how many friends I have on ____; I have so much reach”). In a perfectly connected information plane, though, it would take only one bad actor to spill the secrets and remove those visibility barriers (witness the massive data leaks that happen near constantly).
Instead, something in me says (hopes?) that humans will reach for the analog. We’ll accept and use the information, but I hope its value drops drastically too, and that we rekindle relationships, community, and exploration; that we just build and see how far we can go, see what we can learn (novel information feels like it will be valuable), and see what we’re truly capable of as a species.