OpenAI is rethinking ChatGPT pricing — and 'unlimited' plans may not last, its boss says
ChatGPT's head called its subscription model "accidental" and said "there's no world in which pricing doesn't significantly evolve."
- OpenAI may drop "unlimited" ChatGPT plans as AI costs surge and usage explodes, its head said.
- "There's no world in which pricing doesn't significantly evolve," Nick Turley said.
- As AI systems grow more compute-intensive, some tech companies are rethinking pricing.
ChatGPT's current pricing model may not stick around for long.
Nick Turley, OpenAI's head of ChatGPT, said the company expects to change how it charges for its AI products — and suggested that "unlimited" subscriptions could eventually disappear.
"There's no world in which pricing doesn't significantly evolve when the technology is changing this quickly," Turley told Altimeter Partner Apoorv Agrawal on the "Bg2 Pod" podcast on Sunday.
Turley said ChatGPT was initially launched as a temporary demo that OpenAI planned to shut down after a month. But after it went viral and users loved it, he said, the company quickly realized it had a real product on its hands.
Subscriptions, he said, were introduced to manage overwhelming demand.
ChatGPT offers a free version with usage limits, as well as paid plans: Plus, which costs $20 a month and comes with higher usage limits, and Pro, which costs $200 a month and unlocks faster performance and unlimited prompts.
"We stumbled into subscriptions," Turley said, describing the model as an "accidental" solution to capacity constraints.
AI is breaking pricing models
Now, as AI capabilities rapidly improve — and become more compute-intensive — the subscription model is under pressure.
"It's possible that in the current era, having an unlimited plan is like having an unlimited electricity plan," Turley said. "It just doesn't make sense."
The shift echoes comments from OpenAI CEO Sam Altman, who said last week that AI could be sold like electricity — metered by usage — as demand for the technology surges. Major tech companies are set to spend hundreds of billions of dollars this year on compute to meet the soaring demand for AI.
As a result, OpenAI is exploring how to better align pricing with usage while still expanding access.
Turley said the company's "north star is access," and pointed to experiments like advertising as one way to reach users who may not be able to pay for subscriptions.
"Obviously I want to be really thoughtful about the way that we evolve our plans," Turley said, but would be "incredibly surprised if it didn't change, given the magnitude and profoundness of the technical breakthroughs that we've had and the product breakthroughs that follow."
The industry is rethinking pricing
The move at OpenAI echoes a wider industry rethink.
In an episode of the "Dwarkesh Podcast" published last November, Microsoft CEO Satya Nadella said the company is already talking about charging "per agent" rather than per user as AI becomes a coworker.
Meanwhile, cloud and model providers like Anthropic and Google price many services on a pay-as-you-go, per-token basis.
Even vendors in services and consulting like Globant are experimenting with token bundles and "AI Pods" — monthly subscriptions that include a token allotment — to align revenue with usage rather than hours.
Read the original article on Business Insider