AI Geekly - First!
Buckle up for a wild 2024
Welcome back to the AI Geekly, by Brodie Woods.
Hope you all enjoyed the break: airing of grievances, feats of strength, seasonal urinal-puck-flavored Starbucks. Note that unfamiliar terms written in green are defined in the Glossary at the bottom of this note.
A More Serious Tone - Remember to Be Present
There is one piece of advice we wish we had shared at the top of the month: remember to cherish this time, the early days of the AI Age. The world is changing, but it hasn’t changed yet. It will soon. We’ve discussed the dramatic pace at which we expect AI to impact society, and it’s easy to get caught up in it. Make sure you appreciate the present and the people in it, despite their, and our own, many shortcomings. Time is of the essence.
Connectivity is Killing the Connection
Think about how much our personal relationships have shifted over recent decades with the introduction of new technologies. Holidays, evenings with loved ones, and interactions with family, friends, and strangers have all changed dramatically from 20, 30, 40+ years ago. There are more news items, distractions, targeting algorithms, emails, texts, posts, tweets, likes, shares, pings, smart reminders, calendar events, alerts, click-bait, bot content, etc. than ever, each demanding a piece of our limited mindshare.
A Kingdom Divided
Indeed, our society has reached a point, one you may have become subconsciously aware of over the past few months, where Nobody Knows What’s Happening Online Anymore: the Internet has become so large, with so much content created so rapidly, that it’s almost as if we aren’t all on the same Internet anymore; it’s ephemeral. Combine this with increasing tribalism along political and other affinities, plus the lack of a shared sense of Civics or societal obligation, and society’s increasing fragmentation is hard to ignore.
Same World, Different Realities
Another notable theme is the diametrically opposed experience of the economy depending on income level, approaching Romanovian levels of disconnect. High-earning and wealthy survey respondents see a strengthening economy and will be buoyed by a potential soft-landing from the Fed’s monetary policy response, whereas lower-income respondents have been choked by rapid inflation in food, fuel, and essentials, along with skyrocketing rents, all against effectively negative wage growth on a real/net basis. These discrepancies are widening.
E_Pluribus_Unum.exe
If left to the absolute control of society’s powerbrokers (politicians, corporations, special interest groups, etc.), AI will inevitably be used like any other asset: doling out benefits to the favored while penalizing those who oppose or disagree. The word “if” is doing a whole lot of work in that last sentence. The future, as it turns out, rests on that one word. If. If instead we are thoughtful in our development, prioritizing common goals, AI can be used to help us return to what matters: it can help us refocus on our societies, become the underlying infrastructure for a system of Civics, and bring people closer together. The only way this can be accomplished is via open-source development of AI. By supporting and expanding the current distributed AI capabilities of researchers and individuals at scale, we can democratize access and effectively create a check and balance against the technology’s negative impacts.
Turning to 2024 - Expectations
As we look to 2024, we see a few things:
Localize-It
We expect a continuation of the trend towards local inference, with continued open-source development of models akin to Llama 2, Mistral, Falcon, etc. as robust AI research communities continue to collaborate globally. We also expect GPT-4-level open-source models to drop in 2024, which will significantly up the ante, and contributions to the community to increase, particularly as more CPUs ship with dedicated AI hardware directly on-chip (AAPL pioneered it; AMD and INTC have joined the party). Local inference provides several benefits, including full control, privacy, protection from censorship, security, and more. That said, trade-offs in performance, scalability, compute cost, redundancy, etc. ensure that Cloud AI isn’t about to get gobbled up by local.
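To make the idea concrete, here is a minimal sketch of local inference using the open-source Hugging Face transformers library in Python. The specific model name is an illustrative assumption; any open-weights model you have downloaded locally would work, and many readers will instead run quantized variants through tools like llama.cpp.

```python
# A minimal local-inference sketch: generate text with an open-weights model
# on your own machine rather than through a cloud API. The model name below
# is an illustrative assumption, not a recommendation.
from transformers import pipeline

# device_map="auto" places the model on a local GPU if one is available and
# falls back to CPU otherwise. Nothing leaves the machine: prompts and
# outputs stay private, which is one of the main draws of local inference.
generator = pipeline(
    "text-generation",
    model="mistralai/Mistral-7B-Instruct-v0.2",
    device_map="auto",
)

prompt = "List three benefits of running language models locally."
result = generator(prompt, max_new_tokens=128, do_sample=False)
print(result[0]["generated_text"])
```

The main practical trade-off is memory: full-precision 7B-class models want a capable GPU, which is why quantized local runtimes (and the VRAM demand discussed below) matter so much to this trend.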
GPU Prices Will Enter the StratoMesosphere
Following on the trend towards localization, GPU prices will most likely rise next year, as consumers and smaller-scale on-prem companies snap up consumer and professional hardware for its VRAM. While it’s unlikely that Nvidia will rush to release its next-generation Blackwell consumer RTX 5000 series cards, it could be enticed to depending on whether AMD’s RDNA4 RX8000 cards exceed expectations (which AMD has a history of doing), possibly leading to an RTX 5000 launch in Q4/24.
From “Show Me” Story to Prime Time
GenAI will finally have tangible, measurable outcomes. It’s been a little over a year since AI began to dominate the collective attention. We think by now creative individuals and teams have had enough time with the technology to begin leveraging foundational elements to create needle-moving AI applications that generate measurable value: examples where companies or enterprises have invested $X mm and increased revenue by Y% or reduced costs by $Z mm/y. It’s these tangible, measurable upticks in efficiency that we expect to finally start materializing more broadly, answering the question of what real value GenAI can produce.
Robots Robots Robots
We can’t have jetpacks or flying cars, but they’ll let us have robots. That’s 1/3 on my Jetsons Bucket List, so I’ll take it. Throughout the back half of last year, we saw a plethora of announcements, largely related to incorporating advancements in AI into robotics applications. AI Geekly readers will recall how researchers have used LLMs in conjunction with traditional robotics models to dramatically accelerate unsupervised learning of tasks. Recently, we’ve seen a number of promising bipedal and quadrupedal robots announced for release in 2024, largely focused on factory, storage, and logistics applications.
Looking forward to an exciting 2024.
Before you go… We have one quick question for you:
If this week's AI Geekly were a stock, would you:
About the Author: Brodie Woods
With over 18 years of capital markets experience as a publishing equities analyst, an investment banker, a CTO, and an AI Strategist leading North American banks and boutiques, I bring a unique perspective to the AI Geekly. This viewpoint is informed by participation in two decades of capital market cycles from the front lines; publication of in-depth research for institutional audiences based on proprietary financial models; execution of hundreds of M&A and financing transactions; leadership roles in planning, implementing, and maintaining the tech stack for a broker-dealer; and, most recently, heading the AI strategy for the Capital Markets division of the eighth-largest commercial bank in North America.
Glossary
Civics (Social): Social science concerned with the rights and duties of citizens. It covers the study of the theoretical, political, and practical aspects of citizenship, as well as civil law and government operation, focusing on the role of citizens.
Soft-landing (Finance): In economics, a soft landing refers to a cyclical slowdown in economic growth that avoids recession. It typically occurs when a central bank raises interest rates just enough to prevent an economy from overheating and experiencing high inflation without causing a severe downturn.
Local Inference (AI): the process of running a trained AI model to make predictions or generate outputs directly on a user's own hardware (e.g., a laptop, workstation, or on-prem server) rather than on remote cloud servers, keeping data and computation under the user's control.
Nvidia (company): American multinational chip designer producing graphics processing units (GPUs) for the AI, gaming, and professional markets, as well as system-on-a-chip units (SoCs) for the mobile computing and automotive markets.
Nvidia Blackwell (microarchitecture): Nvidia Blackwell is an upcoming GPU architecture from Nvidia. It is expected to feature a new multi-chip module (MCM)-based shading multiprocessor design and a denoising accelerator built into the ray tracing pipeline. Blackwell represents a significant advancement for Nvidia, potentially offering a large leap in performance compared to current GPUs.
Nvidia RTX 5000 (GPU): a speculated future lineup of consumer GPUs expected to be powered by the Blackwell graphics architecture. It may use a manufacturing process as advanced as 3nm if launched in 2024.
AMD (company): an American multinational semiconductor company that develops computer processors and related technologies for business and consumer markets.
AMD RDNA4 (microarchitecture): an anticipated future graphics microarchitecture from AMD expected to be released in 2024.
AMD RX8000 (GPU): The AMD RX8000 series is speculated to be a future lineup of GPUs from AMD based on the RDNA4 architecture.