A.I. Companies Are Losing A LOT Of Money

Wall Street Millennial
17 Mar 2024 · 13:16

TLDR: OpenAI, the creator of ChatGPT, has rapidly become one of the most valuable tech startups with an $80 billion valuation. By the end of 2023, its revenue reached a $2 billion run rate, but the company, like many AI firms, faces significant computational costs. AI companies are believed to be operating at a loss due to the expense of running complex AI models. For instance, Microsoft's GitHub Copilot, powered by OpenAI, is reportedly losing $20 per user per month. The high costs are not just about electricity but also the upfront expense of GPUs, with Nvidia's revenue from AI applications doubling in 2023. AI startups need to charge enough to cover these costs and generate a profit, but the question remains whether the market will be large enough to support such investments in the long term.

Takeaways

  • 🚀 OpenAI, the company behind ChatGPT, has reached an $80 billion valuation and a revenue run rate of $2 billion by the end of 2023.
  • 💰 Despite the revenue growth, AI companies, including OpenAI, are believed to be operating at a loss due to high computational costs.
  • 🔧 The increasing complexity of AI models has led to a surge in computational costs, with companies like Microsoft and Google investing heavily in AI infrastructure.
  • 🏢 OpenAI's operating expenses for 2022 were estimated at $540 million, primarily due to computing and employee costs.
  • 🔄 AI companies like OpenAI and Anthropic are privately held, making it difficult to ascertain their exact financial status from public sources.
  • 💡 The high cost of running AI models is not solely due to electricity but also the upfront cost of GPUs and ongoing data center operational costs.
  • 📈 Nvidia, a key supplier of GPUs for AI, saw a significant increase in revenue attributed to AI applications, highlighting the demand for AI computing power.
  • 💸 AI startups need to charge high prices for their services to cover the massive R&D and operational costs, which could limit widespread adoption.
  • 🌐 The success of AI companies depends on the market's willingness to pay for AI services and the overall growth of the AI market.
  • 📊 The current investment in AI is massive, but there is no guarantee that the end-user market will be able to support the high costs and justify the investments made by tech giants and startups.

Q & A

  • What was OpenAI's reported valuation in a recent share sale?

    -OpenAI reportedly achieved an $80 billion valuation in a recent share sale.

  • What was OpenAI's revenue run rate by the end of 2023?

    -By the end of 2023, OpenAI's revenue reportedly reached a run rate of $2 billion.

  • What is a major challenge faced by AI companies in terms of costs?

    -A major challenge faced by AI companies is the massive computational costs associated with running increasingly complex AI models.

  • How much did OpenAI's operating expenses reportedly amount to in 2022?

    -OpenAI's 2022 operating expenses were estimated to be $540 million.

  • What does the cost of operating an AI model primarily consist of?

    -The cost of operating an AI model primarily consists of the expenses related to training and testing the model, as well as the ongoing costs of computing power.

  • What is the estimated electricity usage per query for ChatGPT?

    -The estimated electricity usage per query for ChatGPT is between 0.001 and 0.01 kilowatt-hours.

  • How much has Anthropic raised over the past 2 years?

    -Anthropic has raised over $7 billion over the past 2 years.

  • What is the reported gross profit margin for Anthropic?

    -Anthropic's reported gross profit margin is around 50%.

  • How much does it cost to train each model according to Contexto's report on Anthropic?

    -According to Contexto's report, Anthropic incurs up to $100 million of server cost to train each model.

  • What is the estimated monthly cost for an enterprise using Anthropic's most expensive model, Opus?

    -An enterprise using Anthropic's most expensive model, Opus, would pay an estimated $40 per month.

  • What does the future of AI profitability depend on according to the script?

    -The future of AI profitability depends on the ability of AI companies to charge high enough prices for their services to cover their massive R&D and operational costs, as well as the market's willingness to pay for these services.

  • How does the script compare the early days of the internet with the current state of AI?

    -The script compares the early days of the internet with the current state of AI by noting that both were expensive and inefficient initially, but eventually became more affordable and widespread. It suggests that for AI to have a similar revolutionary impact, costs need to significantly decrease, which could take many years.

Outlines

00:00

🚀 Rapid Growth and Valuation of OpenAI

OpenAI, the creator of ChatGPT, has rapidly become one of the world's most valuable tech startups, with an $80 billion valuation after a recent share sale. By the end of 2023, the company's revenue reached a run rate of $2 billion, with internal targets aiming for up to $5 billion in 2024. However, despite the revenue growth, AI companies, including OpenAI, are believed to be operating at a loss due to the significant computational costs of running complex AI models. For instance, Microsoft's GitHub Copilot, powered by OpenAI, is estimated to incur a loss of $20 per month per user due to data center costs. The path to profitability for AI companies is a key question, as the high costs of AI are not solely due to electricity but also the upfront costs of GPUs, which are essential for AI applications.
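The Copilot figure above is simple per-user unit economics. A minimal sketch, assuming a $10/month subscription price (Copilot's individual tier was priced around that level; the video only states the $20 loss) and back-solving the implied compute cost:

```python
# Hedged sketch: per-user unit economics for a subscription AI service.
# The $20/month loss comes from the summary above; the $10/month price
# and $30/month compute cost are illustrative assumptions consistent
# with that loss, not reported figures.

def monthly_margin_per_user(subscription_price: float, compute_cost: float) -> float:
    """Gross margin per user per month (negative means a loss)."""
    return subscription_price - compute_cost

# $10/month in revenue against ~$30/month in data-center cost
# implies every subscriber produces a $20 monthly loss.
loss = monthly_margin_per_user(10.0, 30.0)
print(loss)  # -20.0
```

The point is that with consumption-driven costs and flat-rate pricing, heavier users deepen the loss rather than improving margins.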

05:02

💰 AI Companies' Financials and the High Cost of Computing

AI companies like OpenAI and Anthropic are privately held, making their financials difficult to ascertain. However, reports suggest that OpenAI's 2022 operating expenses were estimated at $540 million, including $420 million for computing costs. Anthropic, a competitor to OpenAI, has raised over $7 billion in the past two years but is burning through funds quickly, with its chatbot reaching $8 million in monthly revenue by the end of 2023. The high costs of AI are primarily due to the massive computing power required to train and operate complex models. For example, one query on ChatGPT uses significantly more electricity than a Google search, and the costs are largely driven by the upfront expense of GPUs, which saw substantial revenue growth for companies like Nvidia in 2023.
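The electricity comparison can be put in rough numbers. This sketch uses the 0.001–0.01 kWh per-query range quoted elsewhere in this summary, and an often-cited outside estimate of ~0.0003 kWh per Google search that does not come from the video, so treat the ratio as a rough order of magnitude:

```python
# Hedged back-of-envelope: ChatGPT query energy vs. a Google search.
# GOOGLE_SEARCH_KWH is an assumed, commonly cited estimate; the
# ChatGPT range comes from the summary above.

GOOGLE_SEARCH_KWH = 0.0003          # assumed figure, not from the video
CHATGPT_QUERY_KWH = (0.001, 0.01)   # range quoted in this summary

low = CHATGPT_QUERY_KWH[0] / GOOGLE_SEARCH_KWH
high = CHATGPT_QUERY_KWH[1] / GOOGLE_SEARCH_KWH
print(low, high)  # roughly 3x to 33x more energy per query
```

Even at the low end of the range, a generative query is several times more energy-intensive than a conventional search, which is the cost gap the video keeps returning to.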

10:02

🤔 The Viability of AI Investments and Market Adoption

The current growth in AI is primarily funded by venture capitalists and cloud service providers. Companies like OpenAI and Anthropic may be generating gross profits, but they are still spending heavily on R&D. The AI market needs significant adoption to justify the massive investments made in GPUs and other infrastructure. For instance, Nvidia's data center segment alone needed to generate an additional $32 billion in revenue in 2024 to cover that investment. The challenge lies in whether the market can grow large enough to support the high costs and whether end users will be willing to pay enough to make the industry viable. The comparison is made to the early days of the internet, which also started out expensive and inefficient but eventually became viable once technology improved and high-speed internet became affordable and widespread.

Keywords

💡OpenAI

OpenAI is an artificial intelligence research lab that has developed various AI technologies, including ChatGPT. In the video, OpenAI is highlighted as a significant player in the AI industry, with its valuation reaching $80 billion and a reported revenue run rate of $2 billion by the end of 2023. The company's financial success and growth exemplify the broader trend of AI becoming a big business, despite the high computational costs associated with running complex AI models.

💡Valuation

Valuation refers to the value assigned to a company, often used to determine its worth in financial terms. In the context of the video, OpenAI's valuation of $80 billion demonstrates the market's perception of its potential and the growing importance of AI technology in the tech industry. This high valuation, achieved in a short period, underscores the rapid growth and investment interest in AI startups.

💡Computational Costs

Computational costs are the expenses associated with running and maintaining computational systems, particularly data centers that power AI models. In the video, it is mentioned that the computational costs for AI companies, including OpenAI, have increased significantly due to the complexity of AI models. These costs include electricity usage, GPU purchases, and employee costs, which are critical factors in determining the path to profitability for AI companies.

💡Generative AI

Generative AI refers to AI systems that can create new content or data, such as text, images, or code. The video highlights the cost difference between generative AI chatbots and traditional search engines, with the former being approximately 10 times more expensive per query. This cost discrepancy is a significant factor in the financial viability of AI companies and their products.

💡Revenue Run Rate

A revenue run rate is a financial metric that annualizes a company's current revenue to project its annual earnings. In the video, OpenAI's revenue run rate of $2 billion indicates the company's potential annual revenue if its current revenue pace is sustained throughout the year. This metric is used to assess the growth and financial health of a company, especially in the tech industry.
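The annualization behind a run rate is straightforward. A minimal sketch, where the ~$167M monthly figure is simply OpenAI's reported $2B run rate divided back out, not an independently reported number:

```python
# Hedged sketch: a revenue run rate annualizes the latest month's
# revenue. The $2B figure comes from the summary; the implied monthly
# revenue (~$167M) is just that figure divided by 12.

def annual_run_rate(monthly_revenue: float) -> float:
    """Project annual revenue by extrapolating the latest month."""
    return monthly_revenue * 12

# A company booking ~$167M in its latest month is on a ~$2B run rate.
print(annual_run_rate(167e6))  # 2004000000.0, i.e. ~$2B
```

Because it extrapolates a single month, a run rate flatters fast-growing companies and says nothing about profitability, which is why the video pairs it with cost estimates.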

💡Data Centers

Data centers are large facilities that house computer servers and associated components, networking equipment, and storage devices. They are essential for running AI models and are known for their high energy consumption. The video emphasizes the significant upfront and ongoing costs of operating data centers, which include the purchase and maintenance of GPUs and other equipment, as well as the electricity required to power them.

💡GPUs

GPUs, or Graphics Processing Units, are specialized electronic circuits designed to rapidly manipulate and alter memory to accelerate the creation of images in a frame buffer intended for output to a display. In the context of AI, GPUs are crucial as they perform the complex calculations needed to train and operate AI models. The video notes that the demand for GPUs has driven significant revenue growth for companies like Nvidia, which in turn sells these GPUs to cloud service providers who then lease their computing power to AI companies.

💡AI Startups

AI startups are new businesses that focus on developing and commercializing artificial intelligence technologies. The video discusses the financial challenges faced by AI startups, such as OpenAI and Anthropic, which require substantial funding to cover their research and development costs, as well as the operational costs of running AI models. These companies often rely on venture capital and the hope of future market growth to justify their investment in AI technology.

💡Cloud Service Providers

Cloud service providers are companies that offer services such as cloud computing, data storage, and other IT services over the internet. They typically provide the infrastructure and platforms that enable AI startups to develop and deploy their AI models. The video highlights the role of these providers in the AI ecosystem, noting that they incur significant costs to purchase and operate the necessary hardware, like GPUs, and pass on these costs to their AI clients.

💡Electricity Usage

Electricity usage refers to the amount of electrical energy consumed by a device or system over a period of time. In the context of AI, electricity usage is a significant cost factor, as operating data centers and powering AI models require substantial electricity. The video provides estimates of the electricity used by AI queries compared to traditional Google searches, illustrating the high energy demands of AI technology.

💡Consumption-Based Pricing Model

A consumption-based pricing model is a billing system where customers pay based on the amount of a service or product they use. In the context of AI, this model is used by companies like Anthropic, where the cost of using their AI chatbot is based on the number of tokens consumed, representing characters input and output. This pricing strategy allows AI companies to generate revenue that can cover their operational costs and potentially yield a profit, depending on the scale of adoption and usage.
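Per-token billing of this kind can be sketched in a few lines. The rates below are illustrative placeholders, not Anthropic's actual price list; real APIs typically quote separate rates per million input and output tokens:

```python
# Hedged sketch of consumption-based (per-token) pricing.
# The $15 / $75 per-million-token rates are hypothetical examples,
# not quoted from the video or from any vendor's price sheet.

def query_cost(input_tokens: int, output_tokens: int,
               input_rate_per_m: float, output_rate_per_m: float) -> float:
    """Dollar cost of one request under per-token billing."""
    return (input_tokens / 1e6) * input_rate_per_m \
         + (output_tokens / 1e6) * output_rate_per_m

# e.g. 2,000 input tokens and 500 output tokens at hypothetical
# rates of $15 / $75 per million tokens:
cost = query_cost(2000, 500, 15.0, 75.0)
print(round(cost, 4))  # 0.0675
```

Under this model revenue scales directly with usage, which is what lets a vendor aim for a fixed gross margin (such as the ~50% attributed to Anthropic above) over its per-query compute cost.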

Highlights

OpenAI, the creator of ChatGPT, has become one of the world's most valuable tech startups with an $80 billion valuation.

By the end of 2023, OpenAI's revenue reportedly reached a run rate of $2 billion.

OpenAI is targeting up to $5 billion in revenue for 2024.

AI companies, including OpenAI, are believed to be losing money due to the high computational costs of running complex AI models.

Microsoft's GitHub Copilot, powered by OpenAI, is estimated to incur a $20 loss per month per user due to data center costs.

Generative AI chatbots are estimated to cost approximately 10 times more per query than a traditional Google search.

OpenAI's 2022 operating expenses were estimated at $540 million, including $420 million for computing costs.

Anthropic, a competitor to OpenAI, has raised over $7 billion in the past 2 years and is expected to seek additional financing.

Anthropic's chatbot achieved $8 million in monthly revenue by the end of 2023.

The largest cost for AI companies is the massive amount of computing power used to train and operate complex models.

One query on ChatGPT uses between 0.001 and 0.01 kWh to process, significantly more than the electricity used for a Google search.

Nvidia, a key supplier of GPUs for AI, generated $61 billion in revenue in 2023, largely due to AI applications.

Cloud service providers are spending billions on GPUs and other equipment, incurring ongoing costs to operate data centers.

AI startups rent computing power from cloud service providers, who charge a markup over their own costs.

Anthropic's gross profit margins are around 50%, with the cost of goods sold accounting for ongoing AI model operation costs.

AI companies need to charge high prices for their services to cover the massive R&D and operational expenses.

For AI to have a revolutionary impact similar to the internet, costs need to significantly decrease, which could take many years.

The current growth in AI is largely funded by venture capitalists and cloud service providers, with companies still burning through billions on R&D.

The success of AI companies hinges on the market's willingness to pay high prices for services, and the industry's ability to justify the massive investments made.