The current state of ChatGPT
As of writing this post, the only reliable way to interact with ChatGPT is through its chat UI. Reliability is still an issue for the platform, which has expanded at an unprecedented rate, often leaving users hanging with messages like "An error occurred" and "Service unavailable." OpenAI engineers are working hard to meet that demand. To ease the load, they launched ChatGPT Plus, a $20/mo subscription that lets paid users access the service during peak traffic, get responses with lower latency, and try new features first. One of those features is the expected ChatGPT API - a programmatic way of interacting with the LLM.
What is an API
An API, or Application Programming Interface, is a set of rules and protocols that allows different software systems to communicate. Think of it as a language that various programs can speak to exchange information and perform specific tasks. With an API, you can use another company's existing software or services without having to recreate everything from scratch.
For example, let's say you want to add a map feature to your weather app. Instead of building a map system from the ground up, you can use the Google Maps API. The API provides you access to Google's mapping technology, allowing you to display maps and location information within your weather app. The API will handle all the behind-the-scenes work, freeing you up to focus on building a great weather app. Another example is the Twitter API, which allows developers to embed tweets, display timelines, and more on their websites or apps. These are just a few examples of how APIs can streamline the development process and enhance the capabilities of software applications.
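To make this concrete, here is a minimal sketch of consuming a third-party API over HTTP, using Google's Geocoding endpoint as the illustration. The API key is a placeholder, and you would check Google's documentation for the exact parameters your use case needs.

```python
import requests

API_KEY = "YOUR_GOOGLE_MAPS_API_KEY"  # placeholder - obtain a real key from Google Cloud

def geocode(address: str) -> dict:
    """Ask the Google Geocoding API for the coordinates of an address."""
    response = requests.get(
        "https://maps.googleapis.com/maps/api/geocode/json",
        params={"address": address, "key": API_KEY},
        timeout=10,
    )
    response.raise_for_status()  # surface HTTP errors instead of failing silently
    return response.json()

# Example: look up the location your weather app should display
print(geocode("1600 Amphitheatre Parkway, Mountain View, CA"))
```

The point is that a few lines of code give you Google's entire mapping stack; you never touch the map data or the servers behind it.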
How the API will change the way we interact with ChatGPT
We already see many unofficial integrations with ChatGPT in the form of browser extensions, apps, and chatbots, covering a wide range of personal and business uses. These integrations rely on bootstrapped solutions to get information in and out of the platform, often scrapers and browser automation tools like Apify, Selenium, or Puppeteer. These tools let developers extract data from the platform and use it in their own applications, even though no official API is available. The main risk is that these bootstrapped solutions are not officially supported by the platform and may break or stop working at any time. With the aggressive use of Cloudflare's Captcha ("Are you a human?") on the UI, the bar is high for building a reliable integration with ChatGPT.
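For illustration, here is a rough sketch of how such a browser-automation integration typically works, using Selenium from Python. The selectors and waits below are hypothetical - the real ChatGPT UI changes frequently and sits behind Cloudflare, which is exactly why these integrations are so fragile.

```python
# Hypothetical sketch - the selectors are guesses and will break whenever the UI changes.
from selenium import webdriver
from selenium.webdriver.common.by import By
from selenium.webdriver.common.keys import Keys
from selenium.webdriver.support.ui import WebDriverWait
from selenium.webdriver.support import expected_conditions as EC

driver = webdriver.Chrome()
driver.get("https://chat.openai.com/")  # assumes you are already logged in and past the captcha

wait = WebDriverWait(driver, 30)
prompt_box = wait.until(EC.presence_of_element_located((By.TAG_NAME, "textarea")))
prompt_box.send_keys("Summarize the plot of Hamlet in one sentence." + Keys.ENTER)

# Wait for the reply to render, then scrape it out of the DOM (class name is a guess).
reply = wait.until(EC.presence_of_element_located((By.CSS_SELECTOR, ".markdown")))
print(reply.text)
driver.quit()
```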
Enter the API. Developers can now access and integrate the functionality of other systems and services in a fast, controlled, and documented way. Building on top of an official API significantly enhances the ability to develop and deliver more sophisticated and innovative solutions.
ChatGPT API launch date
The official launch date for the public API has yet to be announced. However, with competition heating up from Google, we expect this to happen in 2023. In the meantime, OpenAI is accepting applications for the waitlist. The form can be found here: https://share.hsforms.com/1u4goaXwDRKC9-x9IvKno0A4sk30
How to integrate with the ChatGPT API
The steps to integrate with an API can vary depending on the specific API and the protocols used. APIs are programming-language agnostic and use the HTTP protocol to transmit data. Judging by the requests sent by the chat UI, this will likely be a REST API. If we open the developer console and check the Network tab, we can see that the UI calls the https://chat.openai.com/backend-api/ endpoint.
Currently, the API works by exposing an endpoint on /conversation behind Cloudflare protection. Take a simple question as input - "can you read text from links" - and it becomes the messages[0].content.parts[0] field in the request JSON data. The endpoint returns a response with the text/event-stream MIME type, i.e. a server-sent event stream. This is not great DX (developer experience), and we expect it to be replaced with a JSON endpoint.
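As a rough sketch, here is how one might call that unofficial endpoint today and read the event stream. The payload shape and headers below are assumptions based on what the browser sends (you would need a session bearer token captured from a logged-in browser), and Cloudflare will often reject plain HTTP clients outright - another reason to wait for the official API.

```python
import json
import uuid
import requests

# Assumptions: the bearer token is copied from a logged-in browser session, and the
# payload mirrors what the chat UI appears to send. Both can change at any time.
ACCESS_TOKEN = "YOUR_SESSION_TOKEN"  # placeholder

payload = {
    "action": "next",
    "messages": [{
        "id": str(uuid.uuid4()),
        "role": "user",
        "content": {"content_type": "text", "parts": ["can you read text from links"]},
    }],
    "parent_message_id": str(uuid.uuid4()),
}

with requests.post(
    "https://chat.openai.com/backend-api/conversation",
    headers={"Authorization": f"Bearer {ACCESS_TOKEN}", "Content-Type": "application/json"},
    json=payload,
    stream=True,  # the endpoint answers with text/event-stream
    timeout=60,
) as response:
    response.raise_for_status()
    for line in response.iter_lines():
        # Each server-sent event arrives as a line like: data: {...json...}
        if line and line.startswith(b"data: ") and line != b"data: [DONE]":
            event = json.loads(line[len(b"data: "):])
            print(event)
```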
The final official API will likely differ and may use a different URL, so it's essential to follow the general steps for integrating with any other API:
- Obtain API credentials: Before you can consume an API, you'll need to register for an API key or obtain other credentials from the API provider. Getting an API key will require you to have a ChatGPT Plus subscription.
- Review the API documentation: Before you start coding, review the API documentation to understand the endpoints available, how to authenticate your requests, and what data is returned in response to your requests.
- Test the API: Many APIs provide a sandbox environment or a testing suite that you can use to test your requests and responses without affecting live data.
- Write code to access the API: Using the programming language of your choice, you'll need to write code to access the API. This involves sending HTTP requests to the API endpoints and processing the responses (see the sketch after this list).
- Handle errors: APIs can return errors for various reasons, such as invalid credentials, rate limits, or invalid requests. Your code should be prepared to handle these errors and respond appropriately.
- Implement caching and rate limiting: To improve performance and avoid hitting rate limits, cache responses from the API and throttle the rate of requests you make. Given that OpenAI is already dealing with scaling issues, the API will likely have modest rate limits.
- Test and deploy: Once your integration is complete, you'll want to thoroughly test it and then deploy it to your production environment.
- Monitor and maintain: Ongoing maintenance and monitoring of your integration will help ensure it continues to work smoothly and efficiently.
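Putting several of these steps together, here is a hedged sketch of what such an integration might look like once the official API ships. The endpoint URL, request format, and response fields are placeholders - nothing official has been published yet - but the credential handling, error handling, retry-on-rate-limit, and caching pattern will apply to whatever form the API takes.

```python
import os
import time
import requests

# Placeholders: the real endpoint and payload format are unknown until OpenAI
# publishes the official documentation.
API_URL = "https://api.openai.com/v1/chat"   # hypothetical
API_KEY = os.environ["OPENAI_API_KEY"]       # credentials obtained from the provider

_cache = {}  # naive in-memory response cache

def ask_chatgpt(prompt: str, max_retries: int = 3) -> str:
    if prompt in _cache:                      # caching: reuse earlier answers
        return _cache[prompt]

    for attempt in range(max_retries):
        response = requests.post(
            API_URL,
            headers={"Authorization": f"Bearer {API_KEY}"},
            json={"prompt": prompt},          # hypothetical request shape
            timeout=30,
        )
        if response.status_code == 429:       # rate limited: back off and retry
            time.sleep(2 ** attempt)
            continue
        response.raise_for_status()           # surface other errors to the caller
        answer = response.json().get("text", "")  # hypothetical response shape
        _cache[prompt] = answer
        return answer

    raise RuntimeError("Rate limited: exhausted retries")
```

Once the official documentation is out, you would swap the placeholder URL and payload for the documented ones and plug the function into whatever caching and monitoring your production environment already uses.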
These are general steps that can be applied to many APIs, but some APIs may have specific requirements or actions that are unique to that API. After launch, it's crucial to follow OpenAI's documentation and best practices to ensure a successful integration.
ChatGPT API integration use cases
In a previous post, we discussed the many sectors where ChatGPT can be of use. LLM (Large Language Model) APIs like ChatGPT can boost a business by providing access to advanced natural language processing capabilities, allowing for the creation of innovative and sophisticated digital products and services. Companies can utilize LLM APIs through integrations with existing digital products and platforms. For example, integrations with a headless CMS like Contentful can allow businesses to create chatbots that can answer customer inquiries and provide personalized recommendations, improving the customer experience.
Another way the ChatGPT API integrations can boost a business is by automating repetitive and time-consuming tasks. For example, by integrating with a web scraping tool like Apify, companies can automate data collection and analysis, freeing up valuable time and resources for other tasks. Integrations with customer support platforms like Zendesk can also allow businesses to automate routine support inquiries and provide instant, personalized responses to customers, improving overall customer satisfaction.
Finally, such integrations can also be used to enhance sales and marketing efforts. For example, by integrating with a CRM like Pipedrive, businesses can use advanced natural language processing capabilities to analyze sales data and provide valuable insights and recommendations to sales teams, helping to improve sales effectiveness and efficiency. These integrations can also be used to create advanced chatbots that can assist with lead generation and qualification, helping to streamline the sales process and improve overall business performance.
The possibilities for integrations with existing digital platforms are vast, especially with "integrate everything" apps like Zapier and Make.com.
Custom ChatGPT API integrations
Sometimes businesses use proprietary, custom-made, or on-premise software. In those cases, a custom ChatGPT integration built by a dedicated dev team is necessary. Working with a professional agency ensures a smooth integration process: following the practices and guidelines set by the API provider, handling edge cases, and understanding how your specific use case is best implemented. Promptly Engineering is a trusted partner for ChatGPT integrations for businesses of any scale. Contact us today to get started.