Ever tried to teach someone by handing them just one book? That’s the old way AIs used to “learn”…and it didn’t work.

Before transformers and massive training datasets, AIs could barely hold a conversation or answer simple questions.

The reason? They simply didn’t “know” enough. Giving a computer a couple of textbooks and hoping for genius-level answers was a dead end.

The breakthrough came when engineers built transformer models, the large language models behind tools like ChatGPT, and trained them on gigantic datasets packed with everything from the web.

Suddenly, instead of drawing on a narrow memory, AI could pull from billions of examples.

The result: powerful language models that can answer almost anything, because their knowledge is truly vast.

That is Generative AI!

@bruno Lorenzelli, founder of @exchange robotics and @scalata