Initial post from LinkedIn:
https://www.linkedin.com/posts/matthewwallaceco_i-love-hearing-news-about-genai-companies-activity-7263202560990208000-Xex-?utm_source=share&utm_medium=member_desktop
I love hearing news about GenAI companies hitting a wall with foundational models. Wait, what?! Hear me out. As my co-founder Luke Norris often says: The obstacle is the way.
Foundational models have advanced so quickly that the engineering to fully exploit them lags behind, even causing some paralysis—“What if a smarter model makes this obsolete?” This bottleneck was predictable.
Experts broadly agree that current LLMs can't achieve AGI due to fundamental limits in out-of-distribution reasoning. Progress to date has been driven by scaling, data quality, and innovative training techniques, but there is plenty of evidence that those gains are asymptotic. The leap from early GPTs to models like Llama 3 highlights this: more data, smarter training, but still bounded by in-distribution constraints.
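For a sense of why the gains flatten, here is a minimal sketch using the scaling-law fit from the Chinchilla paper (Hoffmann et al., 2022). The constants are that paper's published fit; the parameter/token pairs below are hypothetical jumps chosen only to illustrate the shape of the curve.

```python
# Illustrative only: Chinchilla-style scaling-law fit (Hoffmann et al., 2022),
# L(N, D) = E + A/N^alpha + B/D^beta, with the paper's fitted constants.
E, A, B, ALPHA, BETA = 1.69, 406.4, 410.7, 0.34, 0.28

def predicted_loss(n_params: float, n_tokens: float) -> float:
    """Predicted pretraining loss for N parameters trained on D tokens."""
    return E + A / n_params**ALPHA + B / n_tokens**BETA

# Hypothetical 10x/10x jumps, starting from a Chinchilla-sized run:
for n, d in [(70e9, 1.4e12), (700e9, 14e12), (7e12, 140e12)]:
    print(f"N={n:.0e}, D={d:.0e} -> predicted loss {predicted_loss(n, d):.3f}")
# Output: ~1.937, ~1.814, ~1.752 -- each 10x/10x step buys a smaller
# improvement, while the irreducible term E never moves. Asymptotic gains.
```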
While we may yet see a model an order of magnitude larger (think 10T parameters, 150T+ tokens) perform better by feeding it mountains of curated data generated from Chain-of-Thought inference models like o1-preview, that scale is challenging. Orion is purportedly being trained on a cluster 10x the size of GPT-4's, but 10x the parameters and 10x the data requires a 100x cluster, or a 10x cluster running for 10x the time. And even then, I would not expect the resulting model to itself be smarter than o1-preview (or the "superior" o1 model; OpenAI purportedly has three tiers and has not released the top-tier model yet).
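The 100x claim falls out of the standard rule of thumb that dense-transformer pretraining costs roughly 6 * N * D FLOPs (N parameters, D tokens). A quick sanity check, with a hypothetical baseline; the absolute numbers don't matter, only the ratio:

```python
# Rule of thumb: dense-transformer pretraining compute ~= 6 * N * D FLOPs.
def train_flops(n_params: float, n_tokens: float) -> float:
    return 6 * n_params * n_tokens

baseline = train_flops(1e12, 15e12)    # hypothetical GPT-4-class run
scaled   = train_flops(10e12, 150e12)  # 10x parameters AND 10x tokens

print(scaled / baseline)  # 100.0 -- the two 10x factors multiply, not add.
# So: a 100x cluster at the same wall-clock time, or a 10x cluster running
# roughly 10x longer (ignoring parallelism limits and efficiency losses).
```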
So the good news? This potentially points back to *technique* and not just scaling. When simply "bigger, faster, more" is not enough, you have to return to true innovation.
Meanwhile, papers such as this one keep stacking up: https://lnkd.in/gRX4gzwY They show that AI is powerful not only for productivity but for innovation itself. As the paper notes:
"AI-assisted scientists discover 44% more materials. These compounds possess superior properties, revealing that the model also improves quality. This influx of materials leads to a 39% increase in patent filings and, several months later, a 17% rise in product prototypes incorporating the new compounds."
The napkin math here is that AI, as it exists today, represents trillions of dollars of improvement globally, as much as $10T+. No matter the pace of foundational models, there's a mountain of engineering work to do.
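To show where a figure like that can come from, here is the napkin math made explicit: a modest productivity lift applied to global GDP (roughly $105T in 2023, per World Bank figures). The uplift percentages are assumptions for illustration, not measurements.

```python
# Napkin math, not a forecast: a small productivity lift on world output
# already lands in the trillions per year.
GLOBAL_GDP_USD = 105e12  # ~2023 world GDP (World Bank ballpark)

for uplift in (0.01, 0.05, 0.10):  # assumed uplifts, purely illustrative
    gain_t = GLOBAL_GDP_USD * uplift / 1e12
    print(f"{uplift:.0%} productivity lift -> ~${gain_t:.1f}T per year")
# 10% -> ~$10.5T/year, which is the scale behind the "$10T+" figure.
```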