Over the past two months, we've seen crazy hype around a new trend:
Extend your application's use cases into ChatGPT, call your features directly from the chat, and get your results there.
Why bother, some asked? We can just log in to the application itself and use what it has to offer. Well, not really; consider the following:
It’s always a question of seamless usage and interface. The easier and more seamless using your application becomes, the higher your chance of finding the right market.
Oh, and of course, while we’re talking about the market, visibility is key. Working on your own and creating an AI-native application is great.
Distributing it to an educated audience that might correspond to your Ideal Client is even better.
There are two main reasons to ship your application as a GPT plugin: easy integration that leverages the ChatGPT UI, and maximizing your impact with a low cost of acquisition (GPT plugin visibility).
Why doesn’t OpenAI take the monopoly and build all the available plugins themselves?
Well let’s dive into this:
First, a quick and boring definition of GPT plugins:
GPT plugins are tools designed specifically for language models with safety as a core principle, to provide access to up-to-date information, enable computations, or use third-party services.
These plugins can act as the "eyes and ears" of language models, providing access to information too recent, personal, or specific to be included in the training data.
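Concretely, a plugin is described to ChatGPT through a small manifest file (`ai-plugin.json`) that points to an OpenAPI spec for your API. A minimal sketch of that manifest, with hypothetical names and URLs standing in for your own:

```json
{
  "schema_version": "v1",
  "name_for_human": "Example Plugin",
  "name_for_model": "example_plugin",
  "description_for_human": "Look up example data from your app.",
  "description_for_model": "Use this plugin to fetch example records when the user asks about their data.",
  "auth": { "type": "none" },
  "api": {
    "type": "openapi",
    "url": "https://example.com/openapi.yaml"
  },
  "logo_url": "https://example.com/logo.png",
  "contact_email": "support@example.com",
  "legal_info_url": "https://example.com/legal"
}
```

The model reads `description_for_model` and the OpenAPI spec to decide when and how to call your endpoints; no plugin-specific code runs inside ChatGPT itself.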
I told you, a boring definition, but it’s essential to keep in mind. Why let builders promote their tools through the GPT plugin interface?
You may not know it, but OpenAI doesn’t intend to be a product company; their goal is to build the best LLM and streamline its usage through their API.
To make that happen, they had to answer the need for language models to interact with the latest information on the internet and perform more complex tasks.
On top of that many users and developers have been asking for plugins since the launch of ChatGPT (Argil included). OpenAI recognized the potential of such integration and decided to give access to this feature.
What does it mean in the grand scheme of things?
It’s the role of companies such as Argil to build specific use cases and streamline them through OpenAI’s GPT plugins feature.
By doing so, OpenAI can track the performance of each plugin and see what the market is currently looking for. But wait, that’s not all: GPT plugins will make their LLMs more capable, which will lead to even more plugin use cases.
That’s an amazing move from OpenAI: they gather market data, preferences, and use cases, but they don’t have to do the front-line work themselves.
Doesn’t this make you think of someone… Or something?
If you thought of Apple and the App Store, you’re a genius. Let’s develop that vision of GPT plugins.
The success of the App Store is principally due to massive adoption by developers, the seamless integration of their applications, and the massive opportunity it represented as a real breakthrough in product distribution.
At Argil, we truly believe that’s where we’re heading, an easy and simple way for builders to leverage LLMs, and test their hypothesis by building POCs in the form of GPT plugins.
Once they have, they can streamline the use case to see whether it drives traffic to their website or app and converts GPT plugin users into paid app users.
That’s for the period during which OpenAI tests the pertinence of GPT plugins, and users test how seamless the integrations are.
Once this period ends, we expect a new way to monetize your LLM-based app: getting paid by OpenAI each time your GPT plugins are used.
This is freaking exciting, and that’s why we launched our first GPT plugins: we want to be part of this history, and so do you…
But not everyone got access yet, and that’s for a specific reason.
The current user base of ChatGPT Plus includes people of every age, with no restrictions: you can use every feature it offers for $20 per month.
That’s ridiculously cheap when you see how powerful and efficient those tools are, which, as you may guess, involves risks: introducing GPT plugins to the general public means safety challenges.
Harmful content from bad actors could be streamlined just as seamlessly, which is why applications submitted to OpenAI need to be reviewed before being listed as GPT plugins.
Here are a few examples of risks OpenAI mentioned regarding the misuse of GPT plugins: sophisticated prompt injection, spam emails, and bypassing safety restrictions.
Currently, to get access to GPT plugins, you need to fill out a form and join a waiting list. In the meantime, you’ll soon get access to our GPT plugins.
For those who have never heard of us, here’s a quick introduction to what Argil is and how it can help you in your day-to-day work.
We’re currently building the Zapier of AI: one tool that enables both technical and non-technical people to build their personalized AI automations with no code.
You can train LLMs on your data by creating ‘models’, then integrate these models into a specific workflow, which may correspond to:
1/ A mini SaaS you’re trying out
2/ A daily task flow you perform at work
3/ A POC you want to streamline to test a business hypothesis
That’s why we decided to build our own ‘POCs’ of use cases enabled by the platform and streamline them as GPT plugins.
If you want to test the platform and empower your productivity, click here.
In the meantime do not hesitate to follow us on our socials: