Feature

OpenAI's $200 Gamble: Who Is the Audience for ChatGPT Pro?

By David Barry
With the introduction of ChatGPT Pro, OpenAI is gambling with its position in the generative AI space. Here's how it might shake out.

When OpenAI announced that its new subscription tier, ChatGPT Pro, would cost $200 a month, it raised eyebrows. The price is 10 times that of its standard $20-a-month ChatGPT Plus subscription. According to OpenAI, the plan is geared toward professionals and power users who require advanced AI capabilities for complex tasks.

With that kind of outlay, organizations are going to expect something special. OpenAI claims it has that covered: ChatGPT Pro equips users with OpenAI's most advanced AI tools, including the powerful o1 reasoning model, known for handling complex challenges with ease. For high-stakes tasks in fields like data science, programming and legal analysis, the exclusive o1 Pro Mode boosts accuracy and performance by leveraging additional computational resources. This mode also sets a new standard for reliability, cutting coding errors by 75% and rigorously validating responses to ensure precision.

OpenAI's rationale also includes covering substantial operational costs.

How to Justify the ChatGPT Pro Price Tag?

According to a September 2023 post from SemiAnalysis, an independent research and analysis firm specializing in the semiconductor and AI industries, ChatGPT was already costing a small fortune to run. “Estimating ChatGPT costs is a tricky proposition due to several unknown variables,” the post reads. “We built a cost model indicating that ChatGPT costs $694,444 per day to operate in compute hardware costs. OpenAI requires ~3,617 HGX A100 servers (28,936 GPUs) to serve ChatGPT. We estimate the cost per query to be 0.36 cents.”
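To see how those estimates fit together, here is a minimal back-of-envelope sketch in Python. The daily cost, per-query cost and server and GPU counts are SemiAnalysis's published figures; the implied query volume and GPUs-per-server ratio are derived from them purely for illustration and are only as reliable as the underlying estimates.

```python
# Back-of-envelope reconstruction of the SemiAnalysis figures quoted above.
# The cost and hardware numbers come from the post; the implied query volume
# is derived here for illustration only.

DAILY_HARDWARE_COST_USD = 694_444   # SemiAnalysis estimate, September 2023
COST_PER_QUERY_USD = 0.0036         # 0.36 cents per query
SERVERS = 3_617                     # HGX A100 servers
GPUS = 28_936                       # total A100 GPUs

implied_queries_per_day = DAILY_HARDWARE_COST_USD / COST_PER_QUERY_USD
print(f"Implied queries per day: {implied_queries_per_day:,.0f}")  # ~193 million
print(f"GPUs per server: {GPUS / SERVERS:.0f}")                    # 8, one HGX A100 chassis
```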

Critics question whether the $200 price tag can be sustained in a competitive market unless the product demonstrates clear superiority over existing offerings. The company argues that the cost may be well worth it for several groups. They include:

  • Professionals: Individuals who rely on AI for tasks that demand high levels of accuracy and efficiency.
  • Researchers and Engineers: Users who need advanced intelligence tools for data analysis, programming or complex problem-solving.
  • Enthusiasts: Those looking to leverage innovative AI technology for personal projects or exploration.

But does this stack up for organizations, or for use in the digital workplace?

Related Article: High Cost Is a Barrier for Corporate Generative AI Use, But Not for Long

ChatGPT Pro: Results vs. Costs

Organizations should be thinking about what they can do with ChatGPT Pro rather than what it costs, even if cost is an immediate concern for enterprise leaders, said Sequencr founder Matt Collette.

The release of o1 on Dec. 5, which the company describes as a complement to GPT-4o rather than a successor, offers a new paradigm that improves outputs by spending more computing power on generating answers, he noted. 

o1 is available through the standard $20/month ChatGPT Plus subscription, but with a cap of 50 messages a week. The alternative is the $200/month Pro tier, which offers unlimited access.

“However, rather than focusing solely on the price, organizations should consider the value and competitive advantage these tools deliver. After two years of experimentation, companies have reported productivity gains of 20% to 50% when teams effectively leverage AI solutions — a return that justifies the investment, even for those on tighter budgets,” Collette said. The most important consideration is not which AI model to choose or its cost, but rather what challenges you are addressing with it, he added.

He sees the $200 price point for ChatGPT Pro as a reflection of the growing demand among power users who are willing to pay a premium for broader access to advanced AI tools. OpenAI’s tiered pricing strategy is a calculated move to tap into this segment, maximizing revenue by catering to users with higher needs for capacity and functionality, Collette argues.

But Collette believes the cost of access to advanced AI models will continue to trend downward, a pattern that will persist as competition intensifies. He points to Amazon's recently introduced Amazon Nova models, which are priced significantly lower than ChatGPT, and to the variety of open-source options, such as Llama, that organizations can also choose from.

“As more players enter the field, we can expect base-level models to become increasingly affordable,” Collette said.

Related Article: AI Isn't Magic. Prepare Your Data and Your People First

Putting Advanced AI Within Reach

While ChatGPT Pro's cost may be a barrier for some research institutions and small businesses, it is important to note that its advanced AI capabilities were previously only available to large corporations with substantial budgets, Kevin Baragona, founder of DeepAI, told Reworked.

ChatGPT Pro puts that technology within reach of smaller organizations at a fraction of what it once cost, which can potentially level the playing field and promote innovation across industries, Baragona continued.

He also believes it has the potential to disrupt the current competitive landscape in the AI industry. It may attract new customers and challenge established players in the market with its advanced capabilities and affordable price points.

“This could lead to increased competition, driving innovation and pushing existing companies to improve their offerings to stay relevant,” Baragona said.

He sees potential for ChatGPT Pro across industries and applications including customer service, healthcare, finance and education, where it could improve chatbot interactions with customers, assist with medical diagnoses, automate financial analysis and decision-making, and provide personalized learning experiences for students.

“ChatGPT Pro, priced at $200 per month, targets professionals seeking an advanced, uninterrupted AI experience,” he said. “The pricing strategy of ChatGPT Pro reflects the growing demand for advanced AI capabilities and the increasing competition in the market. It also considers the cost of training and maintaining such a complex model. As AI technology continues to advance, we can expect to see similar high-cost subscription models for advanced AI services.”

Related Article: The Cost of AI Adds Up Without Proper Planning

Who Will Invest?

It's too early to say if anyone outside of very wealthy institutions and organizations will invest, but Descrybe.ai founder and OpenAI user Richard DiBona told us he will not.


“I have not (and will not) purchase the $200/month ChatGPT model. It seems suitable if you are doing the most advanced problem-solving tasks,” he said. “As a developer who uses the OpenAI API, I already have access to something called their 'Chat Playground,' in which I can select and use any of their models (other than the newest o1 Pro, released last week) on a pay-as-you-go basis.”

What this means, DiBona said, is that for just a penny or two, he can use OpenAI's latest models to answer most writing, drafting, factual or programming questions. “It’s hard to imagine using the pay-as-you-go more than $200/month worth — that would require 10,000 usages at 2 cents apiece to break even,” DiBona said.
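DiBona's break-even arithmetic is easy to reproduce. The sketch below assumes his rough figure of about 2 cents per API call; actual API pricing is metered per token and varies by model, so the numbers are illustrative rather than exact.

```python
# Rough break-even between the $200/month ChatGPT Pro plan and pay-as-you-go
# API usage, using DiBona's ballpark of ~2 cents per call. Real API pricing
# is per token and model-dependent, so treat this as an illustration only.

PRO_PLAN_MONTHLY_USD = 200.00
EST_COST_PER_CALL_USD = 0.02  # DiBona's rough per-request estimate

break_even_calls = PRO_PLAN_MONTHLY_USD / EST_COST_PER_CALL_USD
print(f"Break-even: {break_even_calls:,.0f} API calls per month")       # 10,000
print(f"Roughly {break_even_calls / 30:.0f} calls per day, every day")  # ~333
```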

The pricing is also part of a broader pattern in the AI space and the logical endgame of OpenAI's commercialization strategy, concluded Sarah Shugars, assistant professor of communication at Rutgers University.

Many AI companies have been taking a "first one is free" approach to hooking customers on this emerging technology, she said. "There will always be people willing to pay a lot if they think they are getting the 'best' version of something — even if the main benefit is to signal that they can afford it."

"As AI becomes more established and even essential, we will only see further product stratification — with options ranging from free, open-source tools to expensive 'luxury' brands. Most won’t need the high-end model for their use case, but no doubt some will be willing to pay for it," Shugars said.

About the Author
David Barry

David is a European-based journalist of 35 years who has spent the last 15 following the development of workplace technologies, from the early days of document management through enterprise content management and content services. Now, with the rise of remote and hybrid work models, he covers the evolution of technologies that enable collaboration, communication and work, and has recently spent a great deal of time exploring the far reaches of AI, generative AI and general AI.
