After the success of ChatGPT, OpenAI faces its toughest challenge: to survive financially

Because ChatGPT struck a vein with generative artificial intelligence, in less than a year much of the tech sector has embarked on a new gold rush. But the astonishment it spread among experts and laypeople alike has, after the initial shock, now dissipated. And massive user adoption of OpenAI's tools is only a good start, still far from its definitive consolidation as market leader.

There are many challenges ahead for OpenAI to consolidate its dominance in generative AI services. One of the most basic concerns its income statement. As a tech startup, it would be expected to invest in user adoption first, then make its offering profitable, and only later worry about cost efficiency. However, the company seems to be hastening its steps: this year it set about increasing revenue with ChatGPT Plus, and it is already considering savings to mitigate its large operating costs.

Shortly after the launch of ChatGPT, built on GPT-3.5, OpenAI CEO Sam Altman publicly acknowledged that the costs of operating the service were, in his own words, "eye-watering". At that time, the chatbot had barely hatched. A few months later came estimates of what generative AI was costing the startup: a daily spend of $700,000, somewhat more than $20 million per month. To this we would have to add the costs of DALL-E and the other tools that also consume computing resources.
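
As a quick sanity check on those figures, here is a minimal back-of-the-envelope sketch in Python (purely illustrative; the $700,000-per-day figure is the outside estimate cited above, not an official number):

```python
# Rough scaling of the reported daily spend; the figure itself is an
# outside estimate cited in the article, not an official OpenAI number.
DAILY_COST_USD = 700_000

monthly_cost = DAILY_COST_USD * 30    # ~$21 million per month
annual_cost = DAILY_COST_USD * 365    # ~$255 million per year

print(f"Monthly: ${monthly_cost:,}")  # Monthly: $21,000,000
print(f"Annual:  ${annual_cost:,}")   # Annual:  $255,500,000
```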

One of the formulas the company is reportedly considering to reduce operating costs, according to a Reuters report citing internal OpenAI sources, would be to create its own chips. With them, it could optimize how its software runs and increase energy efficiency. It is a strategy that many big tech companies have adopted in recent years.

Altman had already warned that his company could be "the most capital-intensive startup in the history of Silicon Valley." He was speaking about the future development of artificial general intelligence (AI capable of performing any intellectual task attributed to a human being), which could cost $100 billion. But it is not necessary to enter the realm of science fiction (yet) for expenses to skyrocket.


OpenAI needs to develop its products constantly and release new updates to stay at the forefront of innovation. All of this means continued investment. When Altman was asked at an MIT event whether the recent GPT-4 had cost $100 million to train (several times more than GPT-3), he replied that it had cost more than that. An answer that doesn't indicate whether it was just a little more than that amount or a lot more. The startup is now working on version 4.5, and one of the reasons it didn't jump straight to 5 is reportedly cost. Furthermore, its ambitions cover more and more areas of generative artificial intelligence, which entails new expenses.

According to The Information, OpenAI's revenue is growing exponentially and unexpectedly fast. The company was targeting a turnover of $1 billion in 2024 and could achieve it this year. With these numbers, its finances should be healthy. However, operating and personnel costs must be deducted from that avalanche of income. In 2022, with lower resource consumption but high development costs, the losses would have amounted to $540 million, according to The Information, citing sources familiar with the company's finances.

The need for capital to compete in a fierce market

OpenAI's relationship with Microsoft would seem to solve the economic question, as the tech giant has reportedly committed $10 billion this year alone, on top of another $3 billion invested earlier. But the agreement between the two, as Fortune was able to reconstruct it, squeezes the startup's margins: OpenAI would be required to hand over 75% of its profits to its partner until Microsoft recoups its investment, and once it does, it would still pay 49% of profits until Microsoft's take reaches a whopping $92 billion.
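
Those tiers are easier to follow as a rough model. The sketch below only illustrates the percentages Fortune reported; the actual contract terms, thresholds and accounting are not public, so the figures and the function itself are simplifying assumptions:

```python
# Illustrative simplification of the profit-sharing tiers reported by Fortune.
# Thresholds and figures follow the article; real contract terms are not public.
MS_INVESTMENT = 13e9   # ~$10B in 2023 plus ~$3B earlier, per the article
PAYOUT_CAP = 92e9      # reported ceiling on Microsoft's total take

def microsoft_cut(new_profit: float, already_paid: float) -> float:
    """Microsoft's share of a new chunk of OpenAI profit, given what
    has already been paid out to it (simplified: one rate per chunk)."""
    if already_paid >= PAYOUT_CAP:
        return 0.0                                    # cap reached
    rate = 0.75 if already_paid < MS_INVESTMENT else 0.49
    return min(rate * new_profit, PAYOUT_CAP - already_paid)

# Example: the first $1B of profit would send $750M to Microsoft.
print(f"${microsoft_cut(1e9, already_paid=0.0):,.0f}")  # $750,000,000
```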

OpenAI's financial structure is complex. It started as a non-profit entity with a mission to develop artificial general intelligence (AGI). But in 2019 it created a for-profit arm within its structure, and it did so with a peculiarity: instead of receiving shares in the company, investors get the right to pocket a portion of its future profits. In theory, this means they don't have that much influence over company decisions. In fact, the startup suggests it could reinvest all its profits until it reaches its goal of AGI.

ChatGPT (Emiliano Vittoriosi/Unsplash)

One might think that OpenAI could spend its entire existence without distributing a single dollar to its investors, always reinvesting what it earns. But this formula is a double-edged sword. If expenses increase, one way for a startup to cope is to raise new funding rounds, and if OpenAI never declares profits, and therefore never distributes money to its investors, it will discourage those yet to come.

Cost optimization could thus look like a gesture of goodwill towards investors, while also helping OpenAI make its products more economically sustainable. It should be borne in mind that its competition includes some of the big technology companies, which can inject a flood of capital into their projects. Google launched Bard this year and has reportedly begun making Gemini, a model able to generate text and images and work with programming languages, available to some developers.


Amazon has also entered the generative AI race. It has invested $1.25 billion in Anthropic, whose Claude service is a direct rival of ChatGPT. If certain conditions are met, that figure will rise to $4 billion. But Amazon is not the only one backing this startup: Google itself participated in a round that reached $450 million and reportedly plans to put money in again, in a new round that would reach $2 billion.

The figures are dizzying. However, they are not the only thing that matters. Just as OpenAI was integrated into Microsoft's cloud offering (Azure), Anthropic's tools are now available on AWS, Amazon's cloud. ChatGPT's competitor thus suddenly gains access to a huge number of potential clients. At the same time, some leaks indicate that Apple is working on its own generative AI, Ajax, destined to work across all its software, just as Meta AI and Emu (for images) will across Mark Zuckerberg's family of apps.

Making its own chips as a natural way forward

Reuters' report that OpenAI is considering making its own chips falls within this competitive framework. The chips would be optimized for its services and would reduce its technological dependence on Nvidia. The project would follow the trend of large technology companies designing their own processors to squeeze maximum performance out of their operations.

With its own design, a company can achieve chips with higher energy efficiency, which reduces the cost of servers in data centers. When Amazon introduced its Inferentia chips for Alexa, it announced that it had reduced the assistant's cost by 30% and its latency by 25%. The benefit becomes even more significant in highly intensive workloads such as generative artificial intelligence. That is why Google has already introduced a new processor along these lines, while Meta is working on its own.
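
To get a feel for what savings of that order would mean at OpenAI's scale, here is a hypothetical back-of-the-envelope calculation. The 30% figure is Amazon's reported result for Inferentia, not anything OpenAI has claimed, and the $700,000 daily spend is the outside estimate cited earlier:

```python
# Hypothetical: apply an Inferentia-like 30% cost reduction to the
# estimated $700,000/day that ChatGPT reportedly costs to run.
daily_cost = 700_000
reduction = 0.30          # Amazon's reported saving for Alexa, used as a stand-in

daily_saving = daily_cost * reduction
annual_saving = daily_saving * 365
print(f"Daily saving:  ${daily_saving:,.0f}")   # $210,000
print(f"Annual saving: ${annual_saving:,.0f}")  # $76,650,000
```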

Microchips (Brian Kostiuk/Unsplash)

All indications are that having its own chips would be a logical step for OpenAI. It could save part of the huge cost of running services like ChatGPT and improve the competitiveness of its software in terms of computing power.

The hard part is getting that first chip built. It requires hiring specialized staff made up of engineers who are highly sought after in the market, establishing relationships with component suppliers, dealing with logistics and, above all, spending capital. According to an estimate by the consulting firm Digits to Dollars, building a processor from scratch could cost $270 million. But the minimum GPU order the market allows runs to $3 billion.


In addition, any manufacturing agreement with a third party currently carries risks. Almost the entire supply chain has its roots, one way or another, in China or Taiwan, two poles of a three-way geopolitical conflict with the United States that is shaking up the electronics manufacturing sector. Adding to the volatility are the supply issues facing the semiconductor industry, which AI chips are exacerbating. As Taiwan's TSMC, the world's largest chipmaker, has pointed out, it will take about 18 months to fix the bottleneck in its AI processors. It knows what it is talking about: its plants are the only ones that make the Nvidia H100 and A100, exactly the chips ChatGPT relies on to function.

Designing its own chips could thus become an odyssey of technological development. However, the competitive advantages they would bring are tempting for OpenAI, which appears to hold all the cards to take on an initiative that is not without its problems.

Image | Zac Wolff

In Xataka | OpenAI was born as a non-profit, but created ChatGPT, an extremely profitable business. And that is a conflict
