ChatGPT Is Not an Oracle
Drew Estes · Apr 9 · 2 min read
Updated: Apr 11
It baffles me when people say they “asked AI” and then present the output like a divine revelation.
“I asked AI to generate a strategy for my business and…”
Replace “AI” with “Google” in that sentence and it suddenly sounds silly. Yet in terms of where the information ultimately comes from, it’s practically the same statement.
We’ve spent years teaching ourselves that Google isn’t a source — it points you to sources. Wikipedia isn’t a source either, and every grade schooler gets that drilled in early. So why do we forget all of that when it comes to ChatGPT?
Despite all the hype about “reasoning models,” ChatGPT doesn’t actually reason. It predicts what text should come next based on patterns in its training data.
That’s not insight. That’s autocomplete with a better PR team.
Now, I’m not a generative AI cynic. I’m probably more bullish than most. I love automating the hell out of repetitive work. But let’s not confuse tools with truth.

The early web went through this same hype cycle.
Suddenly every business was an “internet business” with no real plan to back it up. Then the bubble popped — and only the businesses that actually understood how to use the internet survived.
Take Walgreens. They were slow to adopt the web and lost market share early. But when they finally moved, they asked the right questions:
How can we use the Web to enhance what we already do better than anyone else? How can we increase cash flow per customer visit using this tool?
No hype. No chasing trends. Just deliberate, strategic use of new tech to get better at what they already did well.
AI is no different. Don’t blindly apply it. Learn the specific things it actually excels at, and incorporate them deliberately to help your business do more of what it’s already good at.
ChatGPT won’t hand you a strategy anytime soon. It might hand you a spark. What you do with that is what matters.


