Should we be aware of this... ChatGPT plugins/store will go public soon

(2024/01/06) - 6 minute read

 

Granted, ChatGPT plugins are not new, but the store is, and it will soon open to the public.

A lot of plugins will be available there for different purposes and functionalities.

Some of them will be useful and enhance the user experience, but most of them will not be particularly relevant or helpful.

You can look at the news and see what all the hype about this new feature is about.

 

ChatGPT has an API that can be leveraged to build your own customized chat that suits your needs and preferences.

With this, the LLM (Large Language Model), which understands your natural language (or images), can trigger your service (API) automatically and enrich the chat result with the data it returns.

 

This means that not only is your service being called by the LLM, but its result is also integrated into the answer that the LLM generates.
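As a point of reference, calling the underlying Chat Completions API is just an HTTP POST. Below is a minimal sketch in Java; the endpoint and model name follow the public API documentation, and the API key is assumed to be available as an environment variable.

    // Minimal sketch: one call to the OpenAI Chat Completions API from Java 17+.
    // Assumes the OPENAI_API_KEY environment variable is set.
    import java.net.URI;
    import java.net.http.HttpClient;
    import java.net.http.HttpRequest;
    import java.net.http.HttpResponse;

    public class ChatCall {
        public static void main(String[] args) throws Exception {
            String apiKey = System.getenv("OPENAI_API_KEY");
            String payload = """
                    {"model": "gpt-3.5-turbo",
                     "messages": [{"role": "user", "content": "Summarize what ChatGPT plugins are."}]}
                    """;
            HttpRequest request = HttpRequest.newBuilder()
                    .uri(URI.create("https://api.openai.com/v1/chat/completions"))
                    .header("Authorization", "Bearer " + apiKey)
                    .header("Content-Type", "application/json")
                    .POST(HttpRequest.BodyPublishers.ofString(payload))
                    .build();
            HttpResponse<String> response = HttpClient.newHttpClient()
                    .send(request, HttpResponse.BodyHandlers.ofString());
            // The response body is a JSON document containing the model's answer.
            System.out.println(response.body());
        }
    }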

Users have to select and activate such plugins from the store. For a software company, this opens up new possibilities to reach more customers and provide more value. Leaving rights and data protection issues aside for now: users of ChatGPT can benefit from your software and the information in it without ever leaving the chat interface.

 

It is quite easy to build such a plugin. You only need a couple of files: a manifest (ai-plugin.json) that tells the AI what your plugin is and does, and an OpenAPI specification that describes your service, its endpoints and parameters, so the model knows what to pass to your API and how to handle the response.
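A minimal sketch of such a manifest, roughly following the published ai-plugin.json format; all names, URLs and descriptions here are made-up placeholders.

    {
      "schema_version": "v1",
      "name_for_human": "Todo Plugin",
      "name_for_model": "todo",
      "description_for_human": "Manage your todo list.",
      "description_for_model": "Plugin for managing a user's todo list. Use it to add, list and remove todos.",
      "auth": { "type": "none" },
      "api": {
        "type": "openapi",
        "url": "https://example.com/openapi.yaml"
      },
      "logo_url": "https://example.com/logo.png",
      "contact_email": "support@example.com",
      "legal_info_url": "https://example.com/legal"
    }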

 

The API itself is just a REST service whose endpoints are named in that specification. These endpoints must be hosted on and accessible from the internet, so ChatGPT can call them whenever needed.
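To illustrate, here is a minimal sketch of such an endpoint in plain Java using the JDK's built-in HTTP server. The /todos path and the hard-coded response are made-up placeholders; a real plugin would also serve the manifest and OpenAPI files.

    // Minimal sketch: one endpoint that ChatGPT could call, using the JDK's built-in HTTP server.
    import com.sun.net.httpserver.HttpServer;
    import java.net.InetSocketAddress;
    import java.nio.charset.StandardCharsets;

    public class PluginApi {
        public static void main(String[] args) throws Exception {
            HttpServer server = HttpServer.create(new InetSocketAddress(8080), 0);
            // ChatGPT calls this endpoint with the parameters described in the OpenAPI spec.
            server.createContext("/todos", exchange -> {
                byte[] body = "{\"todos\": [\"write blog post\", \"build plugin\"]}"
                        .getBytes(StandardCharsets.UTF_8);
                exchange.getResponseHeaders().add("Content-Type", "application/json");
                exchange.sendResponseHeaders(200, body.length);
                exchange.getResponseBody().write(body);
                exchange.close();
            });
            server.start();
        }
    }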

 

Java will not be the first go-to language to write such an API.

Don't get me wrong, everything is possible in Java too, but for building a small cloud-hosted service that calls some API or simply bridges to your real system, Java will not be the first choice for many developers.

Even when you look at the tutorials, Java is not that prominent there. But enough ranting about Java; this post is about AI and plugins.

 

What I learned is that, for doing AI and LLM work locally, a typical development PC is neither powerful enough nor equipped with the resources (GPU, memory, VRAM) needed to handle the complex computations and data processing involved in AI and LLM tasks.

Acquiring such a PC is quite expensive and may not be worth it for occasional use.

 

Using cloud services may be a better/cheaper choice as they offer scalable and flexible solutions for AI and LLM projects.

Running inference on or training an LLM is hard; it takes a lot of resources and time, and requires plenty of expertise and fine-tuning.

 

With the plugins and the store, more possibilities open up, and you can achieve fast results by relying on the provider, in this case OpenAI, and its pre-trained and optimized LLMs.

 

There are others too: Google Vertex AI, Microsoft with Copilot, Mixtral, and so on. Google is considering making the new Gemini Ultra a paid service with similar features and extensions to OpenAI's.

 

More on ChatGPT plugins can be found here.

What do you think?

 

Renier Roth