Plugins

OpenAI has introduced preliminary support for plugins in ChatGPT. These plugins are tools designed specifically for language models, with safety as a core principle. They help ChatGPT retrieve up-to-date information, run computations, or use third-party services. Full plugin support is planned for the ChatGPT-5 release; for now, plugins are in Beta.

Ever since the launch of ChatGPT, users have been asking for plugins (and many developers have been experimenting with similar ideas) because of the wide range of applications they could unlock. OpenAI is starting with a small set of users and plans to roll out access at larger scale as it gathers more insights. This applies to plugin developers, ChatGPT-4 users, and, after an initial alpha period, API users who would like to integrate plugins into their own products. The team is excited to foster a community that will shape the future of human-AI interaction.

Plugin developers selected from the ChatGPT-5 Plugins Waitlist have access to documentation for building a plugin for the chatbot. The language model is then shown a list of the enabled plugins, together with documentation describing how to use each one.

The pioneering organizations developing plugins today:

  • Expedia,
  • FiscalNote,
  • Instacart,
  • KAYAK,
  • Klarna,
  • Milo,
  • OpenTable,
  • Shopify,
  • Slack,
  • Speak,
  • Wolfram,
  • Zapier.

OpenAI is also hosting two plugins itself: a web browser and a code interpreter. It has additionally open-sourced the code for a knowledge-base retrieval plugin, which any developer can self-host to augment ChatGPT-4 and the upcoming ChatGPT-5 with their own information.

Browsing Plugin

An experimental model that knows when and how to browse the internet (currently in Beta).

Drawing inspiration from previous work (including OpenAI’s own WebGPT, as well as GopherCite, BlenderBot2, LaMDA2, and others), OpenAI has enabled language models to read information from the internet, significantly expanding the range of content they can cover. This goes beyond the training corpus, allowing the models to tap into the most recent information.

Here’s an illustrative example of how browsing enhances the experience for ChatGPT users. Previously, the model would politely explain that its training data didn’t include enough information to provide an answer.

With browsing, however, ChatGPT-5 users will be able to pull in recent information, such as the latest Oscars results. The current plugin version can even combine this with the model’s well-known feats of generating poetry, illustrating how browsing can enrich the overall user experience.

Beyond the clear advantages for end-users, OpenAI believes that enabling language and chat models to conduct comprehensive and understandable research holds promising potential for scalable alignment.

Code Interpreter Plugin

An experimental ChatGPT-4 and ChatGPT-5 model that can use Python and handle file uploads and downloads (currently in Alpha).

OpenAI equips its models with a working Python interpreter that runs in a sandboxed, isolated execution environment with some temporary disk space. Code executed by the interpreter plugin is evaluated in a persistent session that stays alive for the duration of a chat conversation (up to a maximum timeout), so subsequent calls can build on one another.

The plugin also supports uploading files to the current conversation workspace and downloading the results of your work.

OpenAI aims for its models to utilize their programming prowess to offer a more intuitive gateway to the basic functions of our computers. Imagine having an enthusiastic junior programmer at your disposal, working at the speed of your typing – this could revolutionize workflows, making them more effortless and efficient. Moreover, it could introduce the perks of programming to new demographics.

In preliminary user studies, engineers pinpointed scenarios where the code interpreter is especially useful (a brief sketch follows the list):

  1. Resolving mathematical problems, both quantitative and qualitative
  2. Undertaking data analysis and visualization
  3. Transferring files between different formats
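
To make these scenarios concrete, here is a brief, hypothetical sketch of the kind of Python the interpreter might run when a user uploads a CSV file and asks for a quick summary plus a JSON copy of the data. The file name "sales.csv" and its columns are invented for illustration.

  import pandas as pd

  # Load the uploaded file from the conversation workspace (path is assumed).
  df = pd.read_csv("sales.csv")

  # Data analysis: per-region revenue totals, largest first.
  summary = df.groupby("region")["revenue"].sum().sort_values(ascending=False)
  print(summary)

  # Format transfer: write the same data out as JSON for download.
  df.to_json("sales.json", orient="records", indent=2)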

They encourage users to explore the code interpreter integration and uncover additional beneficial applications.

Retrieval Plugin

OpenAI’s open-source retrieval plugin empowers ChatGPT to tap into personal or organizational data sources (granted appropriate permissions). This allows users to extract the most pertinent snippets from their data repositories, which may include files, notes, emails, or public documentation, simply by posing questions or stating needs in natural language.

As a self-hosted, open-source solution, developers have the liberty to roll out their own iteration of the plugin and integrate it with ChatGPT. The plugin utilizes OpenAI embeddings and provides developers the flexibility to select a vector database (Milvus, Pinecone, Qdrant, Redis, Weaviate or Zilliz) for indexing and document search purposes.
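
As a rough sketch of how a self-hosted instance can be queried directly, the snippet below POSTs a natural-language question to the plugin’s /query endpoint. The URL, bearer token, and question are placeholders, and the request and response shapes shown here follow the repository’s documented schema; consult the repository for the authoritative details.

  import requests

  PLUGIN_URL = "http://localhost:8000"        # assumed self-hosted deployment
  BEARER_TOKEN = "your-plugin-bearer-token"   # placeholder credential

  response = requests.post(
      f"{PLUGIN_URL}/query",
      headers={"Authorization": f"Bearer {BEARER_TOKEN}"},
      json={"queries": [{"query": "What did we decide about the Q3 roadmap?", "top_k": 3}]},
      timeout=30,
  )
  response.raise_for_status()

  # Each query returns the most relevant document chunks with relevance scores.
  for query_result in response.json()["results"]:
      for chunk in query_result["results"]:
          print(chunk["text"][:200])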

Please visit the retrieval plugin repository for more information.

Third-party Plugins

An experimental model that knows when and how to use plugins (currently in Alpha).

Third-party plugins are described by a manifest file. This file contains a machine-readable description of the plugin’s capabilities and how to invoke them, along with user-facing documentation.
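
As an illustration, the manifest (typically served as JSON at /.well-known/ai-plugin.json) looks roughly like the sketch below, written here as a Python dictionary. Every name, URL, and email address is a placeholder, and the exact field set is defined in OpenAI’s plugin documentation.

  import json

  # Rough sketch of a plugin manifest; all values are placeholders.
  manifest = {
      "schema_version": "v1",
      "name_for_human": "Todo Plugin",
      "name_for_model": "todo",
      "description_for_human": "Manage your to-do list from the chat.",
      "description_for_model": "Plugin for adding, listing, and deleting items in a user's to-do list.",
      "auth": {"type": "none"},
      "api": {"type": "openapi", "url": "https://example.com/openapi.yaml"},
      "logo_url": "https://example.com/logo.png",
      "contact_email": "support@example.com",
      "legal_info_url": "https://example.com/legal",
  }

  # Serialize to the ai-plugin.json file that the plugin serves.
  with open("ai-plugin.json", "w") as f:
      json.dump(manifest, f, indent=2)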

Here are the steps to create a plugin:

  1. Develop an API with the endpoints you intend for a language model to access. This could be a brand-new API, an existing API, or a wrapper created around a pre-existing API specifically tailored for language models.
  2. Produce an OpenAPI specification that documents your API, and a manifest file that links to the OpenAPI spec and includes some plugin-specific metadata (see the sketch after this list).
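
As a minimal sketch of these two steps, the example below uses FastAPI (one possible framework choice, not prescribed here). The two endpoints are hypothetical, and FastAPI generates the OpenAPI specification for step 2 automatically at /openapi.json.

  from fastapi import FastAPI
  from pydantic import BaseModel

  app = FastAPI(
      title="Todo Plugin API",
      description="API the language model calls to manage a user's to-do list.",
      version="1.0.0",
  )

  class TodoItem(BaseModel):
      text: str

  todos: list[str] = []  # in-memory store, for illustration only

  @app.post("/todos", summary="Add a to-do item")
  def add_todo(item: TodoItem) -> dict:
      todos.append(item.text)
      return {"todos": todos}

  @app.get("/todos", summary="List all to-do items")
  def list_todos() -> dict:
      return {"todos": todos}

  # Run with: uvicorn main:app --reload
  # The generated spec is then served at http://localhost:8000/openapi.json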

When initiating a conversation on chat.openai.com, users have the option to select which third-party plugins they’d like to activate. Information about the activated plugins is presented to the language model as part of the conversation context, enabling the model to call upon the relevant plugin APIs as necessary to satisfy user requests.
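
The exact wire format ChatGPT uses for these calls is internal, but the general pattern can be sketched as follows: the model names a plugin operation and supplies JSON arguments, and the surrounding application turns that into an HTTP request against the plugin’s API and feeds the response back into the conversation. Everything in the snippet below (plugin names, URLs, and paths) is hypothetical.

  import requests

  # Hypothetical registry of activated plugins and their base URLs.
  PLUGIN_BASE_URLS = {"todo": "https://example.com"}

  def call_plugin(plugin_name: str, method: str, path: str, arguments: dict) -> dict:
      """Route a model-requested operation to the corresponding plugin endpoint."""
      base_url = PLUGIN_BASE_URLS[plugin_name]
      response = requests.request(method, f"{base_url}{path}", json=arguments, timeout=30)
      response.raise_for_status()
      return response.json()  # returned to the model as part of the conversation context

  # Example: the model asks the "todo" plugin to add an item.
  print(call_plugin("todo", "POST", "/todos", {"text": "Book flights"}))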

Currently, plugins are designed to call backend APIs, but OpenAI’s engineers are also considering the possibility of plugins that can access client-side APIs.
