AI Geekly - August 27, 2023
I'll Have What They're Having: Two New Coding AIs, a Grand-Slam Q2 from the Biggest AI Chipmaker, and a ~$5 Bn valuation for ChatGPT's Biggest Rival
Introducing the AI Geekly, by Brodie Woods. Your curated 5-minute read on the latest developments in the rapidly evolving world of AI.
In a world clouded by clickbait journalism, incomplete information, and snap judgments, we endeavour to provide our readers with an expert perspective —excising the doom & gloom as well as the unabashed hype.
By cutting through the biases of competing narratives across social, economic, scientific, and political streams, we share a clearer picture of the real story that is playing out.
As we enter the Age of AI, staying informed and critical is key.
Let me be your guide:
AI Quote of the Week
“A new computing era has begun. The industry is simultaneously going through two platform transitions: accelerated computing and Generative AI —the race is on to adopt Generative AI”
-Jensen Huang, CEO Nvidia
TL;DR - Exec Summary
I'll Have What They're Having: Two New Coding AIs, a Grand-Slam Q2 from the Biggest AI Chipmaker, and a ~$5 Bn valuation for ChatGPT's Biggest Rival.
Summer doldrums don't seem to be having much of an impact on developments in the AI space. It was a busy week for Meta (the social network formerly known as Facebook), which shared plans to give users more control over the content fed to them by AI and released Code Llama, a promising new open-source Coding AI "copilot" model. Not content to let Meta have all the fun, OpenAI rival Hugging Face announced a new coding offering of its own: SafeCoder, an on-premises Coding AI solution that targets the lucrative and still largely untapped enterprise market. With that in mind, it's perhaps not surprising that Hugging Face this week completed a $235 mm Series D financing valuing the company at $4.5 Bn (roughly 100x revenue and >2x its mid-2022 valuation). Investor appetite for AI exposure remains decidedly frothy.

On the topic of private market darlings, OpenAI is at long last giving customers what they want: the ability to fine-tune its models (GPT-3.5 Turbo now, GPT-4 in the fall). Given that OpenAI's models remain by far the most performant and adaptable on the market, the ability to fine-tune them helps mitigate some of the current, and seemingly temporary (at this rate, anyway), limitations of its LLMs.

One of the major limitations in the AI sector is hardware bottlenecks: there simply isn't enough compute, in the form of specialized AI chips, to meet the training demands of rapidly growing data workloads and market size. The biggest beneficiary of this trend is Nvidia, as demonstrated by an absolutely gangbusters quarter on the back of tremendous demand for its AI products (did I mention that NVDA is up >220% YTD?). Those who have followed the name closely know that CEO Jensen Huang has been playing the long game. Long-term bets, whether on the company's proprietary CUDA parallel computing platform (>70% market share in AI), the introduction of dedicated Ray-Tracing and AI-focused Tensor cores, or heavy investment in its data center business, have not only paid off but have also created the toolset that enables the entire Generative AI ecosystem.
-Read on for the full story
AI News
The Blade Runner Conundrum: You'd Tell Me if I Were an AI, Right?... Right?
Team of Scientists and Philosophers Proposes Methodology to Measure AI Consciousness
What it is: A self-appointed group of theorists has published a 120-page paper proposing a 14-point checklist to assess the consciousness of AI systems, including models like ChatGPT.
What it means: The paper introduces a comprehensive, theory-driven framework for evaluating the likelihood that an AI system possesses attributes of consciousness.
Why it matters: The authors raise valid ethical considerations as part of the ongoing dialogue on AI technologies that appear to exhibit, or perhaps merely mimic, many human-like characteristics and capabilities.
Meta's Two-Front War: EU Compliance and Open Source Gambit
Meta Announces Introduction of Killswitch in EU and Releases New Coding AI, Code Llama
That must Zuck…
What it is: Meta is complying with the EU's Digital Services Act by offering non-personalized content feeds on Facebook and Instagram. The company also released Code Llama, an open-source, coding-specialized large language model (a quick usage sketch follows this item).
What it means: Meta is pursuing a delicate dual strategy of regulatory compliance in the EU and competitive positioning against its rivals, open-sourcing expensive-to-train AI models to undercut OpenAI (almost industry trolling) while adhering to EU mandates on user autonomy.
Why it matters: Meta's dual moves signal a complex landscape where regulatory compliance can coexist with aggressive market strategies.
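For readers who want to kick the tires, the sketch below shows roughly what running Code Llama locally looks like. This is a minimal, illustrative example rather than Meta's reference code: it assumes the codellama/CodeLlama-7b-hf checkpoint on the Hugging Face Hub, a transformers release recent enough to support Llama-family models, and enough GPU memory to load a 7B-parameter model; the prompt is made up for illustration.

```python
# Minimal sketch: code completion with Code Llama via Hugging Face transformers.
# Assumes the codellama/CodeLlama-7b-hf checkpoint is accessible and that the
# installed transformers + accelerate versions support Llama-family models.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "codellama/CodeLlama-7b-hf"  # 7B base variant; larger sizes also exist
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Illustrative prompt: ask the model to finish a function body.
prompt = 'def fibonacci(n):\n    """Return the n-th Fibonacci number."""\n'
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Greedy decoding keeps the example deterministic; real use would tune sampling.
output = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```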
If You Build It: They Will Code
Embracing Customization: Hugging Face Announces SafeCoder; OpenAI Announces Fine-Tuning of Models
What it is: Two of the leaders in the global LLM arms race are finally giving clients the customization they need.
What it means: SafeCoder allows enterprise clients to train custom AI coding copilots on a company's proprietary code base, locally. Fine-tuning of GPT models gives users an incremental tool with which to tweak outputs for their unique use cases (a brief sketch of the new fine-tuning workflow follows this item).
Why it matters: Overnight, coding LLMs went from a rarity to a dime a dozen. Both companies are responding to client feedback, luring in enterprises and AI developers with more flexible tools to meet their needs.
One more thing: Overall, this represents a positive trend of increased accessibility and customization of underlying models as well as responsiveness to customer concerns.
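To make the fine-tuning news concrete, here is a rough sketch of the workflow OpenAI described for GPT-3.5 Turbo: upload a JSONL file of chat-formatted examples, then launch a fine-tuning job against the base model. The file name and sample record below are illustrative assumptions, and the call style reflects the 0.x openai Python library current as of this writing.

```python
# Rough sketch of the GPT-3.5 Turbo fine-tuning flow (openai-python 0.x style).
# "train.jsonl" is a hypothetical file of chat-formatted examples, e.g. lines like:
# {"messages": [{"role": "system", "content": "You are a markets analyst."},
#               {"role": "user", "content": "Summarize NVDA's Q2."},
#               {"role": "assistant", "content": "Record data center revenue..."}]}
import openai

openai.api_key = "sk-..."  # your API key

# 1) Upload the training data.
training_file = openai.File.create(
    file=open("train.jsonl", "rb"),
    purpose="fine-tune",
)

# 2) Launch the fine-tuning job against the base model.
job = openai.FineTuningJob.create(
    training_file=training_file.id,
    model="gpt-3.5-turbo",
)

# 3) Poll the job; once complete, the resulting custom model id is usable via the Chat API.
print(job.id)
```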
Tech News
Double-Talk: AI's Own Jekyll and Hyde Story in the Copyright Universe
Google Announces Approach to YouTube Content Mediation That Lets It Have Its Cake and Eat It Too
What it is: Google plans to vigorously enforce pseudo-copyrights on YouTube to appease music industry partners. An awkward dichotomy given the tech giant's own casual brushing aside of IP concerns as it voraciously scrapes the open web to train its AI.
What it means: Google is willing to act as the sheriff of an expanded, extra-legal music copyright frontier, all while playing the role of a freebooter when it comes to AI training data. This selective adherence raises critical questions about the ethical underpinnings of AI and copyright enforcement in the digital age.
Why it matters: This isn't a mere corporate quirk. This could be the bellwether of a dangerous Big Tech trend: arbitrary enforcement of a new kind of bottom-line-driven "private copyright", paired with those same players' unfettered consumption of rich copyrighted data without remuneration for its creators.
One more thing: While this risks impeding innovation to a degree, it forces a broader conversation on two multifaceted issues society must address in the very near term: antitrust concerns around Big Tech's inordinate power and the determination of the appropriate compensation structure for the valuable data that fuels the intelligence of modern AI.
Swinging for the Fences: Nvidia's Record Earnings Driven by Insatiable AI Demand
Jaw-Dropping Q2 Results Highlight Nvidia's Sheer Dominance in AI Hardware
What it is: Headlined by a staggering $13 Bn in topline, Nvidia reported Q2 results surpassing Street expectations, driven primarily by an even larger-than-expected surge in demand for data center-class GPUs, which are the backbone of modern AI systems.
What it means: As of Friday's close, the stock is up over 220% YTD, underscoring the market's bullish sentiment. Not only did management significantly outperform expectations, it provided a solid long-term outlook and demonstrated in clear dollars and cents that it remains the undisputed leader in the AI hardware landscape for the foreseeable future.
Why it matters: Nvidia's explosive growth has ramifications beyond its soaring stock price. It sets the bar for performance in an industry increasingly dependent on specialized hardware for AI and machine learning tasks. Additionally, Nvidia's success could trigger a reevaluation of investment strategies across the board, compelling investors and competitors alike to pivot or double down on AI-centric technologies (a polite way of describing FOMO...).
One more thing: Why does it feel like the tech theme of the year, off and on for the past seven years, has been shortages of NVDA silicon?
Hugging Face's Billion-Dollar Hug: Who's Who of Tech Fuels AI Unicorn
Star-Studded Series D Raise Elevates Hugging Face to a $4.5 Billion Valuation
What it is: HF's $235 mm Series D raise included an ensemble of tech giants—Google, Amazon, Nvidia, Intel, AMD, Qualcomm, IBM, Salesforce, and Sound Ventures—valuing Hugging Face at a whopping $4.5 billion, effectively doubling the startup's valuation from mid-2022 and reportedly more than 100x revenue (feels frothy).
What it means: In the world of AI, this Series D round isn't just capital—it's a clarion call. The participation of industry behemoths underscores Hugging Face's pivotal role in the AI landscape, and the stratospheric valuation suggests the private market sees the company as a long-term leader.
Why it matters: The valuation is more than just a number; it's a testament to 1) the company's transformative impact on machine learning + integral role in the AI ecosystem, and 2) the frenzied demand for exposure to legitimate AI investment opportunities, technology, and ecosystems (reminiscent of Databricks' $1.3 Bn acquisition of MosaicML).
That's it for this week! —Feel free to share the AI Geekly with friends, colleagues, family, etc.
About the Author: Brodie Woods
With over 18 years of capital markets experience as a publishing equities analyst, an investment banker, a CTO, and an AI Strategist at leading North American banks and boutiques, I bring a unique perspective to the AI Geekly. This viewpoint is informed by participation in two decades of capital market cycles from the front lines; publication of in-depth research for institutional audiences based on proprietary financial models; execution of hundreds of M&A and financing transactions; leadership roles in planning, implementing, and maintaining the tech stack for a broker-dealer; and, most recently, heading the AI strategy for the Capital Markets division of the eighth-largest commercial bank in North America.
Glossary
Terms:
TL;DR (Too Long; Didn't Read): Common internet/email abbreviation indicating that if you don't have time to read the longer content, read this short blurb for the gist.
AI (Artificial Intelligence): A subfield of computer science focused on creating intelligent machines capable of performing tasks that require human intelligence, such as natural language understanding, decision-making, and problem-solving.
Capital Markets: Financial markets for buying and selling equity and debt instruments, often facilitating the raising of capital for companies and governments.
Generative AI: A subset of AI that can generate new content, such as text or images, based on patterns learned from large corpora of data. These models are particularly useful in a range of applications, seemingly able to mimic more complex elements of human cognition (with specific, important limitations).
LLM (Large Language Models): Machine learning models trained on massive datasets to understand and generate human-like text. These models have applications in chatbots, translation services, and content creation.
Consciousness: In the context of AI, refers to the theoretical ability of a machine to have self-awareness and subjective experiences, a topic of ongoing debate among ethicists, scientists, and philosophers.
Open Source: Software or models whose source code is made publicly available, allowing anyone to view, modify, and distribute it. Open-source projects often have community contributions and are less restrictive than proprietary software.
Autonomy: The capacity for self-governance. In the context of AI, this often refers to the ability of a system to operate and make decisions without human intervention.
Coding AI: Artificial intelligence models specifically designed to assist with or automate coding tasks, such as code completion, debugging, or even writing entire programs.
Enterprise Market: Refers to the market that sells technology solutions and services to large organizations rather than individual consumers. Products are often customized and sold in larger quantities.
Series D Financing: A late-stage venture capital funding round, often the last step before an IPO. Companies use Series D to expand market reach, acquire other businesses, or prepare for public trading.
Fine-Tuning: The process of adapting a pre-trained machine learning model for a specific task. Fine-tuning involves additional training on a smaller dataset related to the task.
Data Center: A facility used to house and manage computer systems and related components, such as telecommunications and storage systems. It generally includes backup power supplies, data communication connections, and environmental controls.
Topline Revenue: The total revenue generated by a business before any expenses are subtracted. In financial reporting, it is often the first line, or "top line," on an income statement.
Intellectual Property (IP): Legal rights (patents, copyrights, trademarks, etc.) that protect the creations, inventions, and discoveries of individuals or organizations. In the digital age, this includes software code, algorithms, and data.
Valuation: The process of determining the current worth of an asset or a company. In the context of startups, valuation is often estimated based on future potential.
FOMO (Fear Of Missing Out): An emotional response driven by the desire to stay informed and involved, often cited as a reason for quick decision-making in investments or technology adoption.
Entities
Nvidia (NVDA): A leading technology company specializing in Graphics Processing Units (GPUs) for gaming and professional markets. It's a key player in AI hardware, offering high-performance computing solutions.
Hugging Face (Private): An AI research organization known for its work in natural language processing. They offer a range of open-source software and have a strong presence in the AI community.
OpenAI (Private): A private research organization committed to ensuring artificial general intelligence benefits humanity. Known for developing models like GPT-3 and GPT-4.
Google (GOOGL): A multinational technology company specializing in internet-related services, including search, advertising, cloud computing, and AI.
Microsoft (MSFT): A multinational technology company known for its Windows operating system and its productivity and enterprise/cloud offerings. Its partnership with and investment in OpenAI have catapulted MSFT into the AI lead.
Meta (META): Previously Facebook, Meta Platforms Inc. aims to bring the world closer together through a comprehensive virtual experience known as the metaverse. Meta is a major proponent of quasi-open AI models.
Key People
Sam Altman, CEO OpenAI: focused on ensuring the safe and beneficial development of artificial general intelligence.
Jensen Huang, CEO Nvidia: has led the company's efforts in accelerated computing and AI.
Clement Delangue, CEO Hugging Face: has steered the company from fledgling start-up to unicorn over the past seven years.
Sundar Pichai, CEO Google and Alphabet: oversees all products and services, inclusive of AI.
Satya Nadella, CEO Microsoft: Responsible for all of the company’s product, engineering, marketing, and operations.
Mark Zuckerberg, CEO Meta: recently shifted the company's focus from the metaverse to AI.