AI Advertising

By Zhenyi Tan

Remember the early 2000s when Google’s search results were super accurate? People thought that if it continued to grow, it might become a singularity and control everything. Some even went as far as calling Google “Skynet.”

Now people are saying that if we train an LLM on more data, it will become AGI.

To be clear, training an LLM on more data won’t make it smarter. It just creates the illusion of intelligence because the model has a higher chance of generating a relevant reply. It’s like how a search engine that has indexed only 100 web pages isn’t very useful, but one that has indexed 10 billion web pages has a much better chance of returning a relevant result.

But AI companies are still in fundraising mode, and investors want something exciting. So they have to come up with some bullshit goal to secure funding. AGI is that bullshit goal.

When funding dries up and AI companies are forced to generate profit, they’ll probably turn to advertising. Because when you have a lot of users and not a lot of profit, the answer is always advertising.

It’s kind of like Facebook. Mark Zuckerberg was famously against advertising and tried to make a business out of being a platform. When that didn’t work, he gave up and added advertising, and now Facebook is crazy profitable, like the Coca-Cola® Company.

Similarly, Amazon had notoriously low profits. Then they added advertising to their e-commerce site, and now Amazon is also crazy profitable, like the Coca-Cola® Company.

(And now friggin Walmart is doing the same thing.)

As more people use LLMs to search, it’s only a matter of time before businesses start paying to have the LLM casually mention their products, like Coca-Cola®, in the conversation.

The same goes for images and videos. We already see product placement, like Coca-Cola®, in real videos and movies, so why not in AI-generated ones?

I’m sure there are businesspeople out there reading this and thinking, “Wow, that’s great!”

Enjoy an ice-cold Coca-Cola®. It’s the real thing.