
Are GPTs Relevant for my AI Strategy?

Nina Habicht • May 03, 2024

This article gives you a comprehensive overview of implementation strategies when adopting Generative AI.


Are GPTs Relevant for My Company's AI Strategy?


MyGPTs, or GPTs, are customized versions of ChatGPT (learn more about GPTs and alternatives in our last blog post) that OpenAI lets you build for your own purpose. They can also be published in the OpenAI GPT Store for other users. That means you can build your own GPT for your specific domain. OpenAI launched the GPT Store in ChatGPT on 10.01.2024.


With around 3 million GPTs available as of January 2024, the GPT Store is a relevant distribution channel for companies. In addition, OpenAI launched the ChatGPT Team option (from $25 per user per month), aimed at small and medium-sized enterprises that want to collaborate more easily within ChatGPT.


We researched companies and the GPT Store to see whether companies view GPTs as a major part of their AI investment areas. Our finding: most GPTs are still created by individual creators or as extensions of existing tool providers.

But why were businesses still avoiding building their own GPTs in March 2024? GPTs have significant drawbacks, such as limited control over prompt engineering and retrieval-augmented generation (RAG), which makes it harder to improve answer quality on internal data (see the sketch below). Furthermore, companies are bound to the ChatGPT ecosystem: to use a company's GPT, users need a ChatGPT Plus account. Builders who wish to share their GPTs in the store also need to follow specific guidelines and ensure their creations comply with OpenAI's usage policies.
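To illustrate the difference: when you build outside of GPTs, you control every step of the RAG pipeline yourself. The following is a minimal sketch, assuming the OpenAI Python SDK and numpy; the model names and toy documents are illustrative assumptions, not part of our research.

```python
# Minimal RAG sketch (illustrative, not a production setup): you control chunking,
# retrieval and prompting yourself instead of relying on a GPT's built-in file handling.
import numpy as np
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# Internal documents you would otherwise upload to a GPT as knowledge files.
documents = [
    "Refunds are possible within 30 days of purchase with the original receipt.",
    "Our support desk is reachable Monday to Friday, 08:00-17:00 CET.",
    "Enterprise customers have a dedicated account manager.",
]

def embed(texts):
    """Return one embedding vector per input text."""
    response = client.embeddings.create(model="text-embedding-3-small", input=texts)
    return np.array([item.embedding for item in response.data])

doc_vectors = embed(documents)

def answer(question, top_k=2):
    # Retrieval step you control: cosine similarity over your own chunks.
    q = embed([question])[0]
    scores = doc_vectors @ q / (np.linalg.norm(doc_vectors, axis=1) * np.linalg.norm(q))
    context = "\n".join(documents[i] for i in scores.argsort()[::-1][:top_k])

    # Prompting step you control: system prompt, context formatting, model choice.
    completion = client.chat.completions.create(
        model="gpt-4o-mini",
        messages=[
            {"role": "system", "content": "Answer only from the provided context."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return completion.choices[0].message.content

print(answer("Can I get my money back after two weeks?"))
```

Inside a GPT, the chunking, retrieval and prompt template behind uploaded knowledge files are handled by OpenAI, so none of these steps can be tuned in the same way.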


Three Strategies for How Companies Can Make Use of GPTs


1) GPTs as prototypes: Many corporates use GPTs as a playground or "trial and error" sandbox to develop prototypes for clients or for their own future generative AI applications.


2) Community and AI branding presence: Companies see a presence in the GPT Store from day one as a community enabler and a growth opportunity.


3) Internal support tool: Some companies feed their most important internal FAQs into a GPT as knowledge files and use it as a cost-effective internal assistant for client support. This may sound like a niche use case, but we have seen it repeatedly in our market research (see the sketch after this list).
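For such an FAQ assistant, the closest programmatic counterpart to a GPT with uploaded knowledge files is OpenAI's Assistants API with file search. The following is a minimal sketch under assumptions, not a production setup: the file name, instructions and model are hypothetical, and the Assistants API is in beta, so exact SDK method names may shift between versions.

```python
# Minimal sketch of an FAQ support assistant via the Assistants API (file search).
# File name, instructions and model are illustrative assumptions.
from openai import OpenAI

client = OpenAI()  # expects OPENAI_API_KEY in the environment

# 1. Put the internal FAQ document into a vector store for file search.
vector_store = client.beta.vector_stores.create(name="internal-support-faq")
with open("support_faq.md", "rb") as f:  # hypothetical FAQ file
    client.beta.vector_stores.file_batches.upload_and_poll(
        vector_store_id=vector_store.id, files=[f]
    )

# 2. Create the assistant and attach the FAQ as its knowledge base.
assistant = client.beta.assistants.create(
    name="Support FAQ Assistant",
    model="gpt-4o",
    instructions=(
        "Answer support questions using only the attached FAQ. "
        "If the answer is not covered, say so and refer to a human agent."
    ),
    tools=[{"type": "file_search"}],
    tool_resources={"file_search": {"vector_store_ids": [vector_store.id]}},
)

# 3. Ask a question in a new thread and poll until the run completes.
thread = client.beta.threads.create()
client.beta.threads.messages.create(
    thread_id=thread.id, role="user", content="How do I reset my password?"
)
run = client.beta.threads.runs.create_and_poll(
    thread_id=thread.id, assistant_id=assistant.id
)
if run.status == "completed":
    messages = client.beta.threads.messages.list(thread_id=thread.id)
    print(messages.data[0].content[0].text.value)  # latest assistant reply
```

Unlike a GPT in the store, this assistant does not require end users to have a ChatGPT Plus account; you expose it through your own application.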

What is the ChatGPT Team Version?


If you decide to use GPTs as part of your AI tool set, you can consider various options: ChatGPT Team is the plan that makes sense when teams want to collaborate, and ChatGPT Enterprise is the larger plan for bigger organizations. Many companies in the US seem to use these options. ChatGPT Team is comparable to the Claude Team plan from Anthropic.

ChatGPT Team vs. ChatGPT vs. Copilot

The ChatGPT Team version lets SMEs collaborate and create their own GPTs. Read more on what GPTs are and how to build them in our earlier blog post.

Need support with your Generative AI Strategy and Implementation?

🚀 AI Strategy, business and tech support 

🚀 ChatGPT, Generative AI & Conversational AI (Chatbot)

🚀 Support with AI product development

🚀 AI Tools and Automation

Get in touch