Inside the Age of Inference [Newsletter #77]
What the Next Stage Demands from Us
Hello, AI enthusiasts from around the world.
Welcome to this week’s newsletter for the AI and the Future of Work podcast.
The internet changed our lives in ways we can’t fully describe, connecting the entire planet without borders. It’s hard to believe it, but the next wave of change will be even bigger.
We’ve spent the past decade training AI and building the foundation for what comes next. Now we’re entering a new stage, and it brings fresh challenges.
The biggest one might not be the technical hurdles we usually talk about.
Today’s conversation highlights why, instead of viewing energy and cost as the main constraints, we may need to focus on humans as the real key to AI’s future.
Let’s dive into this week’s highlights! 🚀
🎙️ New Podcast Episode with Sid Sheth, CEO and co-founder of d-Matrix
AI has advanced at a pace none of us imagined, and here in 2025 we’ve entered a new phase: The Age of Inference has arrived.
AI can now draw conclusions from evidence and analysis, but there are hurdles ahead.
In the past, we focused on training AI. Now it’s time to let AI elevate human capability. We’re moving toward new levels of productivity for humanity. But with every adoption phase, there’s uncertainty.
AI’s rapid rise has created challenges. They’re real, but they’re not the ones we tend to focus on.
Energy demands for this new age are high, and today’s power grids weren’t built for them. Until now, most of that energy has gone into training. We’re shifting to a world where AI works in our favor.
That difference is what matters most, according to Sid Sheth, CEO and co-founder of d-Matrix, the AI chip company making inference efficient and scalable for data centers.
Sid’s mission is to rethink infrastructure so AI has a sustainable path forward. He points to the Jevons Paradox, where gains in efficiency increase demand. As AI becomes more efficient, usage will rise even faster.
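To make the Jevons Paradox concrete, here’s a minimal back-of-the-envelope sketch in Python. The per-query energy and usage numbers are illustrative assumptions, not d-Matrix data: even if each query becomes ten times more efficient, a big enough jump in usage still pushes total energy demand up.

```python
# Toy illustration of the Jevons Paradox dynamic Sid describes:
# cheaper, more efficient inference can drive so much more usage
# that total energy demand still rises. All numbers are hypothetical.

def total_energy(queries: float, joules_per_query: float) -> float:
    """Total energy consumed, in joules."""
    return queries * joules_per_query

# Assumed baseline: 1 billion queries at 1,000 J each.
before = total_energy(queries=1e9, joules_per_query=1_000)

# Inference gets 10x more efficient, but cheaper inference drives 25x more usage.
after = total_energy(queries=25e9, joules_per_query=100)

print(f"Energy before efficiency gain: {before:.2e} J")
print(f"Energy after efficiency gain:  {after:.2e} J")  # 2.5x higher despite 10x efficiency
```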
To meet this moment, d-Matrix is developing purpose-built accelerators to help AI reach the next step. And along the way, Sid has identified a challenge that has nothing to do with infrastructure or efficiency.
In his conversation with PeopleReign CEO Dan Turchin, he explains that the biggest bottleneck for AI’s future isn’t energy or optimized datacenters. It’s human talent. We’re entering a period of deep talent shortages, and talent isn’t something you can replace with electricity.
Today’s conversation explores this and more:
As the Age of AI Inference begins, productivity will boom and create new, unpredictable market demands.
Why smaller, more efficient models are unlocking the inference era and what this shift means for large-scale AI adoption.
Cost, time, and energy limit inference today, and d-Matrix is building chips designed to remove those constraints.
How the rise of reasoning models and agentic AI is moving demand away from generic tasks and toward abstract problem-solving.
The shortage no one talks about is talent. Thriving in this new era requires upskilling at a faster pace than anything we’ve seen before.
🎧 This week's episode of AI and the Future of Work, featuring Sid Sheth, CEO and co-founder of d-Matrix.
📖 AI Fun Fact Article
The conversation about AI’s energy demands is growing, but today’s usage is small compared to what lies ahead, according to Jackie Snow from Quartz.
Electricity needs for AI data centers are set to surge over the next decade. New Deloitte research estimates that power requirements will grow roughly thirtyfold by 2035, up from about 4 gigawatts today.
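For a rough sense of what that pace means, here’s a quick sketch of the compound annual growth rate implied by a thirtyfold increase between 2024 and 2035. This is our own arithmetic from the figures above, not a number taken from the Deloitte report.

```python
# Back-of-the-envelope: what annual growth rate does a thirtyfold
# increase over roughly 2024-2035 imply?

start_gw = 4          # approximate AI data center demand today, per the article
growth_factor = 30    # "roughly thirtyfold by 2035"
years = 2035 - 2024   # assumed 11-year window

cagr = growth_factor ** (1 / years) - 1
end_gw = start_gw * growth_factor

print(f"Implied demand by 2035: ~{end_gw} GW")
print(f"Implied compound annual growth: ~{cagr:.0%} per year")  # roughly 36%
```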
Energy is only one part of the challenge. Another is how long it takes to build the infrastructure. Data centers go up in one to two years. Power plants take a decade.
As a result, data centers have become both a pressure point and a major opportunity. The industry faces a significant labor shortage, with 63 percent of data center executives citing a lack of skilled talent as their biggest obstacle.
These gaps should have pushed data center operators and power companies to collaborate more closely. But Snow highlights that only 15 percent of data center executives and 8 percent of power company leaders describe their partnerships as “highly effective,” even though 72 percent of both groups consider power and grid capacity major challenges.
At the same time, investment is accelerating. The tech industry is expected to spend more than one trillion dollars on U.S. manufacturing of AI supercomputers, chips, and servers in the next four years. Electric and gas utilities are projected to invest more than one trillion dollars over the next five years, and hyperscalers may reach that same trillion-dollar mark in just three years.

Source: Quartz
PeopleReign CEO Dan Turchin highlights that there are too many unknowns to react strongly to even the most dire predictions. We are extrapolating future energy and computing requirements based on the inefficiency of today’s AI models, chips, and infrastructure.
We are also assuming that future AI demand will be served by current approaches. We may soon reach the useful limits of large language models (LLMs). Achieving the reasoning capabilities that foundation models claim today will likely require a more neurosymbolic representation of data, which goes far beyond next-word prediction.
We also lived through the DeepSeek moment earlier this year. We don’t yet know what new compute architectures will require or how they will perform at scale. We do know that we will need massive new capacity, especially for inference.
He urges us not to view this as a reason to panic, but as a chance to appreciate the breakthroughs ahead and, more importantly, the fulfilling, high-paying, and safe careers that this wave of innovation will create.
Listener Spotlight
Pamela in Providence, RI, is a professor at Brown.
Her favorite episode is the conversation with Matt Beane, author of The Skill Code, from season five, where he explores how to master skills and stay relevant in the age of AI.
🎧 You can listen to that excellent episode here!
As always, we love hearing from you. Want to be featured in an upcoming episode or newsletter? Share how you listen and which episode has stayed with you the most.
Your feedback helps us improve and ensures we keep bringing valuable insights to our podcast community. 👇
🎶 Worth a listen: Anniversary Episode
365 conversations, one for every day of the year. Each one has given us a chance to hear how people and technology are evolving together.
Across these episodes, individuals and companies have shown how curiosity and collaboration are shaping the next stage of AI’s evolution. In our special 365th episode, we celebrate that curiosity and revisit what “the future of work” truly means.
The excerpts in this compilation capture only a fraction of the insights and lessons shared along the way.
Thank you for listening, and here’s to the next 365 conversations.
You can listen to this special episode here.
Until Next Time: Stay Curious 🤔
We want to keep you informed about the latest developments in AI. Here are a few stories from around the world worth reading:
OpenAI has declared Code Red as Google and Anthropic gain ground. Here’s more.
YouTube’s new AI deepfake tracking tool is raising concerns among experts and creators. Here’s why.
Amazon is partnering with Nvidia to build AI chips and deploy new servers. Here’s more.
👋 That's a Wrap for This Week!
Will we need more energy? If AI keeps advancing at this pace, the answer is yes. That’s why many leaders are preparing for the next stage of AI, a stage defined less by training and more by real-world use.
As AI becomes mainstream, usage will grow and energy demands will rise with it. Meeting that challenge will require more than just better infrastructure. It will also require human insight and talent.
We hope this week’s conversation encourages you to learn, grow, and take part in the next chapter of AI. 🎙️✨
If you liked this newsletter, share it with your friends!
If this email was forwarded to you, subscribe to our LinkedIn newsletter here to get the newsletter every week.

