Inflation is climbing, but not alarmingly… a September cut is expected… is AI about to start disrupting other AI?… the companies most at risk today… where to invest to avoid the disruption
This morning’s Consumer Price Index (CPI) report came in on the warm side, but not hot enough to derail the market’s expectation for a September rate cut.
Prices in July rose 0.2% for the month and 2.7% for the year. While that monthly figure matched estimates, the 2.7% yearly number was below the 2.8% forecast.
Core CPI, which strips out volatile food and energy prices, was slightly hotter.
On the month, core CPI climbed 0.3%, translating to a yearly rise of 3.1%. Again, that 0.3% reading matched forecasts, but the 3.1% yearly reading came in just above the estimate of 3.0%.
So, did the tariff-based inflation boogeyman show up or not?
In some places, yes.
For example, furniture prices rose 0.9% in July and are 3.2% higher than a year ago. Shoe prices jumped 1.4%. And household furnishings and supplies rose 0.7%.
But while some goods prices are increasing due to tariffs, we’re not seeing widespread runaway inflation.
Overall, this is more of a slow-burn inflation, not a surge. It’s persistent enough to keep the Fed cautious, but not hot enough to shut the door on rate cuts, especially with labor market softness coming into focus.
If anything, today’s CPI data strengthened the expectation for a rate cut.
We can see this in the CME Group’s FedWatch Tool, which shows the probabilities that traders are assigning to different fed funds target rates at future dates.
Yesterday, traders put 85.9% odds on a quarter-point cut next month.
In the wake of this morning’s report, that probability has jumped to 92.2%.

While “hotter inflation” leading to “a rate cut” might seem counterintuitive, the thinking goes like this:
- Recent disappointing labor market data presents a greater risk today than this slow rise in inflation
- Today’s CPI data is going in the wrong direction, but it’s not hot enough to deter a cut
- It’s still unclear whether we’re seeing ongoing inflation or just a “one-time” bump in prices due to tariffs
So, put it all together and we remain “all systems go” for a September cut.
The new wave of creative destruction is hurtling toward us
Last week, ChatGPT-5 dropped – free to all users – and our technology expert Luke Lango calls it “a monster.”
From Luke’s Daily Notes in Innovation Investor:
It just tied for first place on the Artificial Analysis Intelligence Index – a composite score across eight separate AI intelligence evaluations – matching Grok 4’s performance.
The twist? Grok 4 needed 100 million tokens to get there. ChatGPT-5 did it with 43 million – less than half the compute.
Translation: AI isn’t just getting smarter. It’s getting faster, cheaper, and more modular.
That kind of leap could accelerate AI adoption across industries, opening the door to a wave of advanced applications.
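A quick back-of-envelope check of the token figures Luke cites above (treating token usage as a rough proxy for compute, as the quote does):

```python
# Token counts from the quote above (tokens used as a rough proxy for compute)
grok4_tokens = 100_000_000    # Grok 4's reported usage on the benchmark
chatgpt5_tokens = 43_000_000  # ChatGPT-5's reported usage

ratio = chatgpt5_tokens / grok4_tokens
print(f"ChatGPT-5 used {ratio:.0%} of Grok 4's tokens")  # 43%
assert ratio < 0.5  # "less than half the compute"
```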
Sounds great, right?
Yes and no.
While it’s great for the end users harnessing this incredible creative power, it represents destruction for a new wave of companies – including some AI companies – that are suddenly at risk of obsolescence.
Back to Luke:
When foundational models get this good, lower-tier AI products get squeezed.
If ChatGPT-5 or Grok 4 can spin up a fully functional website in minutes, what happens to companies like Wix (WIX) or GoDaddy (GDDY)?
To Luke’s point, if a general-purpose AI can generate a custom, functioning website in minutes – complete with design, copy, search engine and AI optimization, and e-commerce – what’s left for Wix and GoDaddy to offer beyond domain registration?
We’ve opened Pandora’s Box
The same creative destruction could spill over into graphic design tools (think Canva) if/when these AI “super-models” can produce agency-quality branding in seconds.
It even hits marketing automation software if/when campaigns can be strategized, written, and optimized by a few AI prompts.
Think about coding platforms like GitHub Copilot, or freelance marketplaces like Fiverr…
If a super-model like ChatGPT-5 can deliver production-ready code or creative assets with minimal human oversight, will those ecosystems shift entirely to AI-assisted quality control? At best, they shrink substantially, shedding human employees.
Even enterprise software isn’t immune. It’s possible that productivity suites, CRM platforms, or niche SaaS tools could be reduced to features within a mega-AI interface. Just consider the way the smartphone absorbed calculators, cameras, and GPS devices… not to mention the home landline.
Behind all this is one critical question for investors…
How do we separate companies that can integrate and ride the AI super-model wave – from those whose only defense is that the wave hasn’t hit them yet?
Before we can answer that effectively, we need to better understand what this wave is, and what it represents.
Stepping back, this latest evolution of large language models (LLMs) is unbelievably powerful. But now consider what happens when all that capability becomes localized.
Back to Luke to explain:
AI is leaving the cloud. Crawling off the server racks. And stepping into the physical world. And when that happens, everything changes.
Because physical AI can’t rely on 500-watt datacenter GPUs. It can’t wait 300 milliseconds for a round trip to a hyperscaler.
It needs to be:
- Always on
- Instantaneous
- Battery-powered
- Offline-capable
- Private
- Cheap
It needs SLMs: compact, fine-tuned, hyper-efficient models built for mobile-class hardware.
To make sure we’re all on the same page, LLMs are the massive, compute-hungry engines built with hundreds of billions (or trillions) of parameters. They’re powerful, but expensive to run and often overkill for specific tasks.
SLMs – or “small language models” – by contrast, are lean, task-focused models that can be fine-tuned to perform at near-LLM quality for a fraction of the cost, energy, and hardware requirements. Some can even run locally on a laptop or smartphone.
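To see why an SLM can run on a phone while an LLM can’t, here’s some back-of-envelope arithmetic. The parameter counts and quantization levels below are illustrative assumptions, not figures from Luke; the point is simply that inference memory is dominated by model weights (roughly parameters × bytes per parameter):

```python
# Rough memory-footprint arithmetic (illustrative assumptions, not measured figures).
# Inference memory is dominated by model weights: bytes ≈ parameters × bytes-per-parameter.
def model_memory_gb(params: float, bytes_per_param: float) -> float:
    return params * bytes_per_param / 1e9

# A hypothetical 3-billion-parameter SLM, quantized to 4-bit (0.5 bytes/param)
slm_gb = model_memory_gb(3e9, 0.5)    # ~1.5 GB: fits in a laptop's or phone's RAM
# A hypothetical 1-trillion-parameter LLM at 16-bit (2 bytes/param)
llm_gb = model_memory_gb(1e12, 2.0)   # ~2,000 GB: needs racks of datacenter GPUs

print(f"SLM: ~{slm_gb:.1f} GB, LLM: ~{llm_gb:,.0f} GB")
```

Under these assumptions, the small model needs roughly a thousandth of the memory, which is the whole case for running it locally.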
So, with an SLM, “AI” is no longer a cloud-based service you log into. Instead, it’s woven into the fabric of your everyday device and routine workflow.
Luke writes that SLMs are in Apple’s upgraded Siri, Meta’s new Orion smart glasses, and Tesla’s Optimus robots.
What happens when the power of LLMs meets the precision of SLMs?
If SLMs begin to dominate within the wider AI space, we’ll see a massive shift in the playing field.
Back to Luke:
SLMs do not require data centers. They do not need $30,000 accelerators. They do not consume 50 megawatts of cooling. They do not even rely on OpenAI’s API.
All they need is efficient edge compute, a battery, and a purpose. And that changes everything.
The center of gravity in AI shifts – from cloud-based GPUs and training infrastructure to edge silicon, local inference, and deployment tooling.
So, while LLMs like ChatGPT are grabbing the headlines, SLMs may be the real agents of disruption – quietly dissolving business models and spawning new ones in their wake.
The winners will be the companies that figure out how to own the distribution layer before everyone else.
An example to drive this home
Take Zoom.
Today, live transcription, meeting summaries, and language translation all happen in the cloud. This costs time, bandwidth, and money.
In tomorrow’s world of SLM dominance, those same features could run instantly on your laptop or even your phone. No internet lag time, no subscription.
Imagine opening your MacBook and finding that a built-in “AI Meeting Assistant” does everything Zoom once charged for. Meanwhile, Microsoft Teams and Google Meet integrate even better locally powered AI.
Overnight, Zoom’s core differentiators vanish. They’re replaced by default features from companies that already own the operating system or productivity suite.
In other words, an SLM turns Zoom – a billion-dollar SaaS giant – into a pre-installed checkbox.
And rather than layoffs, we get an “out of business” sign.
One way to invest in this shift right now
If SLMs represent the brains of the next wave of AI, then robotics and “physical AI” are the bodies that those brains will inhabit.
Shrinking models from sprawling, cloud-bound LLMs to nimble, on-device SLMs doesn’t just cut costs. It makes intelligence mobile.
Suddenly, you don’t need a warehouse of servers to power a robot’s decision-making. You can embed advanced reasoning directly into the machine itself.
And this unlocks a floodgate of possibilities:
- Drones that navigate without constant cloud connection…
- Warehouse bots that adapt on the fly…
- Humanoid assistants that operate efficiently in real time.
This is where digital intelligence meets mechanical capability – and it’s going to change our day-to-day world.
How do we get ahead of it?
Last week, Luke – along with Louis Navellier and Eric Fry – released their latest collaborative investment research package. It’s about how to position yourself today for the coming era of Physical AI.
Their Day Zero Portfolio holds the seven stocks they’ve identified as best-of-breed in AI-powered robotics, providing targeted exposure to the next wave of AI exponential progress.
Circling back to SLMs, I’ll give Luke the final word:
“Small models” do not make headlines. But they are what will drive profits – because they are what will scale artificial intelligence to a trillion devices and embed it into the everyday fabric of human life.
SLMs aren’t just a more efficient alternative to giant cloud-based AI. They’re the key to taking artificial intelligence off the server racks and into the real world…
In short: SLMs unlock the era of “physical AI.”
Have a good evening,
Jeff Remsburg