In short (TL;DR)
Generative AI is cheap right now because it builds on years of Big Data investments and abundant compute capacity. LLMs are not magic: they recognize patterns in vast amounts of data. Companies with rich data lakes have a clear advantage. This affordability will not last. As AI infrastructure matures and consolidates, costs will rise and access will become more exclusive.

LLMs: not magic, just patterns
Large Language Models stand on the shoulders of an earlier hype cycle: Big Data. At their core, they do one thing extremely well: predict the next token based on the previous ones.
Add too much noise or variation, and the pattern breaks down. That’s when models start hallucinating. The iron law of IT still applies: garbage in, garbage out.
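Real LLMs are vastly more sophisticated, but the core idea of "predict the next token from the previous ones" can be sketched with a toy bigram model. Everything here (the corpus, the `predict_next` helper) is illustrative, not from any actual LLM implementation:

```python
from collections import Counter, defaultdict

# A made-up miniature "training corpus".
corpus = "the cat sat on the mat the cat ate the fish".split()

# Count, for each word, how often each successor follows it.
follows = defaultdict(Counter)
for prev, nxt in zip(corpus, corpus[1:]):
    follows[prev][nxt] += 1

def predict_next(word):
    """Return the most frequently observed successor of `word`."""
    successors = follows[word]
    if not successors:
        return None  # word never seen with a successor: no pattern to predict from
    return successors.most_common(1)[0][0]

print(predict_next("the"))  # "cat": it follows "the" twice, more than "mat" or "fish"
```

The toy model also illustrates the garbage-in, garbage-out point: its predictions are only as good as the counts in its corpus, and a word it has never seen yields nothing at all.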
But there is something genuinely new here. This technology can work with unstructured data full of hidden patterns — at scale. And the rule is simple: the more data, the better.
So where does that data come from?
Exactly. From the Big Data hoarding spree of the past decade. Companies with large, well-populated data lakes now hold the strongest cards.
Don’t have one? Then now is the time to start collecting — or buying — data.
One warning: AI is cheap for now
Today, generative AI is remarkably inexpensive. Major providers are buying market share.
In about four years, once the current generation of AI chips and data centers has been economically depreciated, the landscape will change. AI will become far more exclusive.
Keep that in mind before turning your entire organization “AI-enabled.” For a deeper, more technical perspective, check out the blog series by my colleague Janna: Demystifying AI.
Or get in touch via the Contact page if you prefer a real conversation.