A few months ago someone told me they had found me through ChatGPT.
They were researching something adjacent to what I do. My name surfaced in the results through one of my many business articles, the context made sense to them, and they reached out.
I hadn’t paid for placement or optimized for a search engine. I hadn’t even thought about how I might appear in that environment. My business doesn’t do well with paid ads given the relationship-driven foundation it requires; most of my work comes by referral or strategic cold calling.
That moment was an eye-opener because it revealed something quiet about how these systems were functioning. Visibility was emerging from substance, not spend. At least in that instance.
This week I watched a well-produced ad from Claude. The setup was clever. A young man asks what appears to be a humanized “gym bro” ChatGPT how to get a six-pack quickly. The “AI” begins to personalize a plan, then pivots into confidence in the bedroom with a certain pill and a discount code for a trial. Dr. Dre’s song “What’s the Difference?” kicks in. The message is clear without being shouted.
Ads are coming to AI. Not here.
Competitive advertising has arrived inside the category itself. Not just product comparisons or feature lists but actual category positioning.
The speed of this shift reflects the speed of adoption. Individuals rely on these tools daily. Corporations are embedding them into workflow, compliance, diligence, and strategy. Once usage scales that quickly, differentiation becomes existential. Trust becomes the asset.
In capital markets, we see this pattern whenever infrastructure matures. The early phase is capability. The next phase is narrative. Whoever defines the risk defines the market.
In this case, the implied risk is commercialization. Answers becoming funnels. Guidance becoming product placement. The ad does not argue technical superiority. It suggests moral clarity.
That is sophisticated marketing.
It also signals that the category has entered its second phase far sooner than expected.
My earlier experience, being found organically, now sits against this backdrop differently. It raises a practical question: as these systems evolve, will discovery remain grounded in relevance, or will it gradually mirror the ad-driven ecosystems we already distrust?
Business models need to make sense, and inserting ad revenue is nothing new. Infrastructure at this scale is expensive, and there are trade-offs in every direction. You can already see issues arising from power consumption straining outdated electrical infrastructure.
What is interesting is how quickly we have moved from curiosity about AI to competition over its integrity. For professionals whose reputation depends on signal over noise, that distinction will matter.
There is another layer to this conversation that rarely receives the same level of attention. Much of the public narrative assumes AI is either the solution to everything or the beginning of widespread professional displacement. Entire fields are casually placed on the chopping block. The law and accounting professions seem to be easy rhetorical targets. The assumption is that once a model can draft, summarize, or analyze, the human layer becomes redundant.
That assumption misunderstands what clients pay for.
People are not buying words. They are buying judgment and accountability. They want someone’s experience inside situations where information is incomplete and risk is real. A model can accelerate research but it cannot sit across the negotiating table and read hesitation. It cannot carry liability or absorb the consequences of a bad decision.
The technology has improved at a remarkable pace. The early examples of AI struggling to render someone eating spaghetti already feel distant. Progress is real, and the acceleration is visible. Still, sharper tools do not dissolve professional responsibility. They increase leverage; they do not replace the judgment of the person using them.
The debate around ads and commercialization will continue, as it should. Business models matter. Infrastructure at this scale requires funding. AI will settle into workflow. It will influence how research is gathered and how early analysis is framed. Yet the work that carries consequence remains tethered to human decision-making, to the discipline of thinking clearly and standing behind the outcome.
Reputation tends to form in quieter ways. It accumulates through decisions made when information is incomplete and outcomes matter. Tools can assist in that process. They can speed it up at the margins.
But the responsibility for the decision never transfers.