Chips and Quips

New Silicon and New AI Assistants

Welcome back to the AI Geekly, by Brodie Woods, brought to you by usurper.ai. This week we bring you another round of fast-paced AI developments, packaged neatly in a 5-minute(ish) read.

TL;DR: AAPL=M4; AI Assistant Upgrade; Congratulations, Twins!

This week we’re firing up the rumor mill (or are we milling the mill?) with some juicy tidbits on potential new silicon coming from our friends at Apple, possibly for a first foray into servers and cloud infrastructure. On a similarly rumor-laden but trail-blazin’ path, OpenAI is looking to take the fight right to the voice assistants, potentially offering its own Siri/Alexa/Google Assistant-like solution imbued with its best-in-class Generative AI (across text, sound, and image). Keeping to the silicon theme, we’re also having a peek at what the CHIPS Act has to say about Digital Twins, an important emerging technology in our rapidly developing convergence thesis around AI, Robotics, and Spatial Computing.

Don’t Forget to Chip Your Server
Apple’s aspirations reach the Cloud

What it is: Unique in its singular focus on the personal computing experience, Apple has stood apart from the rest of Big Tech. Amazon, Google, and Microsoft (and, more recently, Meta) have each had Cloud ambitions that drove the development of massive enterprise technology ecosystems. Amazon’s AWS, Google’s Cloud, and Microsoft’s Azure are all formidable businesses that contribute substantially to their parent companies’ bottom line.

What it means: Apple, interestingly, has eschewed this market until now. Word on the street (thanks Paul!) is that, starting with its M2 Ultra chip and, later this year, its new M4 chip, Apple will make its first foray into server development and cloud offerings (iCloud currently runs on a combination of AWS and Azure), a move that is critical for vertical integration of its AI aspirations. While it has recently released several smaller open-source models in the hopes that the developer community will enhance them, as the name implies, Large Language Models are, in fact, LARGE, and require server-scale equipment both to train and to run.
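To put some rough numbers behind “LARGE,” here’s a quick back-of-envelope sketch (the parameter counts and precisions are our own illustrative assumptions, not anything Apple or OpenAI has disclosed) of how much memory it takes just to hold a model’s weights:

```python
# Back-of-envelope: memory needed just to hold an LLM's weights in RAM/VRAM.
# Parameter counts and precisions below are illustrative assumptions only;
# real deployments also need room for activations, KV cache, and overhead.

def weights_gb(params_billions: float, bytes_per_param: float) -> float:
    """Approximate weight memory in gigabytes."""
    return params_billions * 1e9 * bytes_per_param / 1e9

for name, params in [("7B model", 7), ("70B model", 70), ("~1T-class model", 1000)]:
    fp16 = weights_gb(params, 2.0)   # 16-bit floating point weights
    int4 = weights_gb(params, 0.5)   # aggressive 4-bit quantization
    print(f"{name}: ~{fp16:,.0f} GB at fp16, ~{int4:,.0f} GB at int4")
```

Even with aggressive quantization, the bigger models blow well past the memory of any phone or laptop, which is exactly why server-scale silicon matters.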

Why it matters: We love vertical integration. Apple loves it even more (so much that it would marry it). Through its vertically integrated model, Apple was able to kick Intel to the curb a few years ago and run its internally designed silicon in its devices instead of relying on third parties. This has driven massive improvements in performance, battery life, efficiency, and, for the company itself, profit. Sweet, sweet profit. So as it looks at the increasingly AI-focused world we live in (recall that it cancelled the Apple Car only weeks ago to focus more fully on AI), it makes sense for the company to take control of its AI future by seizing its Cloud destiny.

“Hey Siri”, “Ok Google”, “Alexa”, “Uhhhh… Hi Sam?”
OpenAI to introduce new voice assistant (probably not named after CEO)

What it is: Sam Altman (CEO) and his OpenAI gang of merry misfits (read: the smartest AI minds the world has ever seen) are back at it, this time threatening the supremacy of your in-home AI assistant. Yes, Siri, Alexa, and Google Assistant (Bixby and Cortana… hahaha, ok, sorry, the last two, from Samsung and Microsoft respectively, never took off) will soon be battling it out with a new contender to ignore your significant other, play songs you didn’t ask for, and turn off the lights in adjacent rooms seemingly at random.

What it means: There have been a few recent rumors about what exactly OpenAI is working on. We’ve seen reports of GPT-5 (the successor to OAI’s current top model) spotted in the wild, which turned out to be false, and stories of a new AI-powered search engine that would threaten incumbents Google and MSFT (I thought they were friends!) as well as newcomers like the much-ballyhooed (for good reason) perplexity.ai; also, apparently, false (we still wouldn’t be surprised, despite Sam Altman’s denials on X).

Why it matters: Tech journalists are constantly trying to predict exactly what new innovation will come out of OpenAI’s secretive labs. With our finger on the pulse, the introduction of a new voice assistant seems the most likely near-term development. OpenAI already has several AI models that excel at speech recognition (its Whisper model), image generation (DALL-E 3), voice synthesis (Voice Engine), and even video (Sora). It’s not a great leap to imagine tying these elements together into a single multi-modal voice assistant, as sketched below. Interestingly, rather than compete 100% with the likes of AAPL, GOOG, and AMZN, it may also offer a white-label solution that enhances the Siris, Alexas, and Google Assistants of the world (although we suspect the current model would need to run from the cloud, as it would be too large to run on-device). Indeed, this kind of enhancement of AI assistants is long overdue: their feature set and intelligence have languished for nearly a decade.
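We’re reading tea leaves here, but the plumbing for such an assistant already exists in OpenAI’s public developer APIs. Below is a minimal, hypothetical sketch of the listen, think, speak loop (the model names, helper function, and system prompt are our own assumptions, not a look inside OpenAI’s actual product):

```python
# Hypothetical listen -> think -> speak loop stitched together from OpenAI's
# public APIs. Model names and structure are illustrative assumptions only.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

def assistant_turn(audio_path: str, reply_path: str = "reply.mp3") -> str:
    # 1) Listen: transcribe the user's speech with Whisper
    with open(audio_path, "rb") as f:
        transcript = client.audio.transcriptions.create(model="whisper-1", file=f)

    # 2) Think: generate a reply with a chat model
    chat = client.chat.completions.create(
        model="gpt-4o",
        messages=[
            {"role": "system", "content": "You are a concise home voice assistant."},
            {"role": "user", "content": transcript.text},
        ],
    )
    reply_text = chat.choices[0].message.content

    # 3) Speak: synthesize the reply to audio and save it to disk
    speech = client.audio.speech.create(model="tts-1", voice="alloy", input=reply_text)
    with open(reply_path, "wb") as out:
        out.write(speech.read())

    return reply_text
```

The point isn’t the code itself; it’s that the pieces are already shipping, so packaging them into a household assistant looks more like a product decision than a research breakthrough.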

Step Aside Schwarzenegger and DeVito, The New Twins are Digital
US CHIPS Act directs $285 mm in funding to Digital Twins

What it is: We’re contrarians here at the AI Geekly. We know that most people’s favorite Schwarzenegger film is clearly Junior. It’s scientifically accurate, uplifting, and truly the film of a generation, but to us it just doesn’t hold a candle to the Governator’s seminal masterpiece: Twins (opposite pint-sized Adonis, Danny DeVito). With this magnum opus in mind, we introduce to our readers a similar concept. Just as Danny DeVito is identical in every way to his Twin, Arnold, a “Digital Twin” is a perfect 1:1 digital recreation of an object, person, process, or facility, right down to the physics and interactions with the physical world. Digital twins are used to simulate complex factory floor environments: what happens if we put a robotic arm here? or move an assembly line there? This emerging simulation technology promises to dramatically reduce costs in fields including manufacturing, healthcare, energy, and more.
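To make those “what if” questions concrete, here’s a toy digital twin of a two-station assembly line (the scenario and numbers are entirely hypothetical, invented for illustration), showing how you can compare configurations in software before touching the real floor:

```python
import random

# Toy "digital twin" of a two-station assembly line (hypothetical numbers).
# We simulate one shift with and without an extra robotic arm at station B,
# instead of experimenting on the physical line.

def simulate_shift(arms_at_station_b: int, minutes: int = 480, seed: int = 42) -> int:
    """Return units completed in one shift for a given station-B configuration."""
    rng = random.Random(seed)
    finished = 0
    queue_b = 0  # units waiting at station B
    for _ in range(minutes):
        # Station A feeds 0-2 units per minute into station B's queue
        queue_b += rng.randint(0, 2)
        # Each arm at station B completes at most one unit per minute
        done = min(queue_b, arms_at_station_b)
        queue_b -= done
        finished += done
    return finished

baseline = simulate_shift(arms_at_station_b=1)
upgraded = simulate_shift(arms_at_station_b=2)
print(f"Baseline: {baseline} units/shift; with extra arm: {upgraded} units/shift")
```

Real digital twins model physics, sensor feeds, and thousands of variables rather than a coin-flip queue, but the workflow is the same: change the virtual factory, measure the effect, and only then spend money on the physical one.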

What it means: The US Government has dedicated $285 million of the $53 Bn CHIPS Act to digital twin research, a notable pivot in U.S. semiconductor strategy (which, until it woke up last year, was essentially non-existent), focusing not only on production but also on innovation and design. The U.S. aims to create a more resilient, efficient, and cost-effective chip manufacturing ecosystem via Digital Twin tech. This approach enables companies to simulate and optimize processes in a virtual environment, reducing the need for physical prototypes and speeding up innovation.

Why it matters: The implications of this investment are critical to the U.S. economy and national security. By reducing dependency on foreign chip manufacturing (a major oversight and weakness in U.S. policy until now), particularly manufacturing concentrated in regions with geopolitical tensions, the U.S. secures a more stable supply chain and bolsters its technological sovereignty. Furthermore, the emphasis on Digital Twins and AI integration cements their applicability (which has been questioned), potentially leading to breakthroughs that could permeate sectors well beyond semiconductors (at usurper.ai we have developed several Finance applications internally). This initiative not only supports the immediate goal of enhancing chip production capabilities but also underpins a broader vision of maintaining U.S. leadership in critical technologies.

Before you go… We have one quick question for you:

If this week's AI Geekly were a stock, would you:


About the Author: Brodie Woods

As CEO of usurper.ai, and with over 18 years of capital markets experience as a publishing equities analyst, an investment banker, a CTO, and an AI Strategist at leading North American banks and boutiques, I bring a unique perspective to the AI Geekly. This viewpoint is informed by participation in two decades of capital market cycles from the front lines; publication of in-depth research for institutional audiences based on proprietary financial models; execution of hundreds of M&A and financing transactions; leadership roles in planning, implementing, and maintaining the tech stack for a broker-dealer; and, most recently, heading the AI strategy for the Capital Markets division of the eighth-largest commercial bank in North America.