
OpenAI’s ChatGPT & Whisper API Now Available For Developers


OpenAI has announced that its ChatGPT and Whisper models are now available through its API, giving developers access to AI-powered language and speech-to-text capabilities.

Through system-wide optimizations, OpenAI has managed to reduce the cost of ChatGPT by 90% since December, and it is now passing these savings on to API users.

OpenAI believes the best way to realize the full potential of AI is to allow everyone to build with it.

The changes announced today can lead to numerous applications that everyone can benefit from.

More businesses can leverage OpenAI’s language and speech-to-text capabilities to develop next-generation apps powered by ChatGPT and Whisper.

Further, OpenAI has taken into account feedback from developers and made changes to its API terms of service to suit their needs better.

ChatGPT API

OpenAI is releasing a new ChatGPT model family called gpt-3.5-turbo, priced at $0.002 per 1k tokens, making it ten times cheaper than the existing GPT-3.5 models.
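The quoted per-token pricing translates directly into request cost. The sketch below shows that arithmetic; the constant and function names are illustrative, not from the announcement.

```python
# Hedged sketch of the arithmetic behind the quoted price:
# $0.002 per 1,000 tokens for gpt-3.5-turbo.

PRICE_PER_1K_TOKENS = 0.002  # USD, as stated in the announcement

def estimate_cost(total_tokens: int) -> float:
    """Estimate the USD cost of a request from its total token count."""
    return total_tokens / 1000 * PRICE_PER_1K_TOKENS

# A 1,500-token exchange works out to about three-tenths of a cent.
print(estimate_cost(1500))
```

Note that both prompt and completion tokens count toward the total.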

This model is ideal for many non-chat use cases and is the same model used in the ChatGPT product.

While GPT models traditionally consume unstructured text represented as a sequence of tokens, ChatGPT models consume a sequence of messages together with metadata.

However, the input is rendered to the model as a sequence of tokens for the model to consume.

The gpt-3.5-turbo model uses a new format called Chat Markup Language (ChatML).
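A chat request built around that message-sequence format might look like the sketch below. The `openai` Python package and an API key are assumed, and the helper name is illustrative; the actual network call is shown in comments only.

```python
# Hedged sketch: how a chat-style request to gpt-3.5-turbo might be
# structured. Each message carries metadata (a "role") alongside its
# content; the API renders the list into a token sequence internally.

def build_messages(system_prompt: str, user_prompt: str) -> list:
    """Assemble the message list the chat endpoint consumes."""
    return [
        {"role": "system", "content": system_prompt},
        {"role": "user", "content": user_prompt},
    ]

messages = build_messages(
    "You are a helpful assistant.",
    "Explain token-based pricing in one sentence.",
)

# The actual request would look roughly like this (not executed here;
# assumes the openai package of the time and an OPENAI_API_KEY):
# import openai
# response = openai.ChatCompletion.create(
#     model="gpt-3.5-turbo",
#     messages=messages,
# )
```

Because roles are explicit, the same endpoint also serves non-chat tasks: a single user message with no prior history behaves much like a classic completion prompt.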

ChatGPT Upgrades

OpenAI continuously improves its ChatGPT models and aims to offer these upgrades to developers.

Those who use the gpt-3.5-turbo model will always receive the recommended stable model, while still being able to choose a specific version.

OpenAI is launching a new version called gpt-3.5-turbo-0301, which will receive support until at least June 1st, and a new stable release is expected in April.
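In practice, the choice between the auto-updating alias and a pinned snapshot comes down to which model string a request names. The request shape below is an illustrative sketch, not a documented schema.

```python
# Hedged sketch: choosing between the auto-updating model alias and a
# pinned snapshot, per the versioning scheme described in the article.

STABLE_ALIAS = "gpt-3.5-turbo"          # always resolves to the recommended stable model
PINNED_SNAPSHOT = "gpt-3.5-turbo-0301"  # fixed version, supported until at least June 1

def chat_request(prompt: str, pin_version: bool = False) -> dict:
    """Build request parameters, optionally pinning the model snapshot."""
    return {
        "model": PINNED_SNAPSHOT if pin_version else STABLE_ALIAS,
        "messages": [{"role": "user", "content": prompt}],
    }
```

Pinning trades automatic upgrades for reproducibility: outputs stay consistent until the snapshot is retired, at which point the code must be pointed at a newer version.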

Developers can find updates on the models page for switching to the latest version.

Dedicated Instances

OpenAI now offers dedicated instances for users who want more control over their model versions and system performance.

By default, requests are processed on shared compute infrastructure, and users pay per request.

However, with dedicated instances, developers pay for a time period to allocate compute infrastructure reserved exclusively for their requests.

Developers have complete control over the instance’s load, the option to enable longer context limits, and the ability to pin the model snapshot.

Dedicated instances can be cost-effective for developers who process beyond approximately 450M tokens per day.
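The ~450M tokens/day break-even figure can be sanity-checked against the shared-compute price quoted earlier. The announcement does not give dedicated-instance pricing, so the sketch below only computes the shared-compute side of the comparison.

```python
# Hedged sketch of the break-even arithmetic behind the ~450M tokens/day
# figure. The shared-compute rate comes from the announcement; dedicated
# instance pricing is not published here, so none is assumed.

SHARED_PRICE_PER_1K = 0.002      # USD per 1k tokens (gpt-3.5-turbo)
BREAK_EVEN_TOKENS = 450_000_000  # ~450M tokens/day, per the article

def daily_shared_cost(tokens_per_day: int) -> float:
    """Daily cost on shared infrastructure at per-token pricing."""
    return tokens_per_day / 1000 * SHARED_PRICE_PER_1K

# At the stated break-even volume, shared compute runs about $900/day,
# the ballpark a dedicated-instance commitment would need to beat.
print(daily_shared_cost(BREAK_EVEN_TOKENS))
```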

Whisper API

OpenAI introduced Whisper, a speech-to-text model, as an open-source release in September 2022.

The open-source Whisper model has garnered considerable praise from the developer community, but it can be challenging to run.

OpenAI is making the large-v2 model available through its API, providing developers with convenient on-demand access, priced at $0.006 per minute.

Additionally, OpenAI says its serving stack offers faster performance than other services. The Whisper API is accessible through transcription and translation endpoints, which can transcribe audio in its source language or translate it into English.
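The per-minute pricing makes Whisper costs easy to estimate from audio duration. The cost function below uses the quoted rate; the commented API calls mirror the shape of the `openai` package's audio endpoints at the time and are assumptions, not executed code.

```python
# Hedged sketch around the Whisper API: the $0.006/minute price is from
# the announcement; the endpoint calls below are shown as comments only.

PRICE_PER_MINUTE = 0.006  # USD, large-v2 via the API

def estimate_transcription_cost(duration_seconds: float) -> float:
    """Estimate the USD cost of transcribing audio of a given length."""
    return duration_seconds / 60 * PRICE_PER_MINUTE

# A one-hour recording works out to roughly $0.36.
print(estimate_transcription_cost(3600))

# Rough shape of actual calls (assumes the openai package and an API key):
# import openai
# with open("meeting.mp3", "rb") as audio:
#     transcript = openai.Audio.transcribe("whisper-1", audio)  # source language
# with open("meeting.mp3", "rb") as audio:
#     english = openai.Audio.translate("whisper-1", audio)      # translated to English
```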

Developer Focus

OpenAI has made specific changes after receiving developer feedback. Examples of those changes include the following:

  • Not using data submitted through the API for service improvements, including model training, unless the organization consents to it.
  • Establishing a default 30-day data retention policy, with the option for stricter retention depending on the user’s needs.
  • Improving its developer documentation.
  • Simplifying its Terms of Service and usage policies.

OpenAI recognizes that providing reliable service is necessary to guarantee AI benefits everyone. To that end, OpenAI is committed to improving its uptime over the next few months.


Featured Image: Shaheerrr/Shutterstock

Source: OpenAI




