
Checklist for companies to successfully implement Generative AI

Nina Habicht • Nov 24, 2023

Companies often struggle to get started with Generative AI because there are many unknowns around costs, responsible AI and security. Using ChatGPT Plus accounts is fun, but companies also need to be able to establish their own custom environment in order to upskill people and prevent data from being sent to OpenAI. As stated in our recent article about ChatGPT and data policy - a simple guide, you need to take care when sending questions to ChatGPT, as the data may be used for training OpenAI's models or provided to third-party vendors.


How to find Generative AI Business Cases?


First, identify the use cases that genuinely lend themselves to automation with Generative AI.

Here is the Generative AI checklist that can help you scale faster. The checklist is built on hands-on AI product implementation experience (chatbots, voicebots, NLP search and more) and - unlike other sources - covers all the relevant aspects you will encounter during your AI journey. Take these steps:


1. Customer: Identify the target users and customers you serve with your Gen. AI solution (e.g. internal employees, external clients, new or specific segments?)

2. Pain Point: Write down the top problems or pain points that your target customer faces.

3. Gen. AI Suitability: Check how your product can address the identified pain point and whether Gen. AI is actually the right tool for it.

4. Data and Content Quality: Good data is the foundation of any generative AI application.


  • Which data sources does your AI product need?
  • Where do these data sources live in your company? Are there databases, Excel files, intranet pages, Confluence pages, data sources from call centers, etc.?
  • How good is the data quality itself? We often see thin text content on websites or inconsistent maintenance, such as missing titles and structure, which are key for a good chatbot or search (a minimal quality check is sketched below).
  • Is the data/content owner identified in your company, or are several owners of topics scattered around? Identify and list them, and involve these stakeholders in your Gen. AI project from the beginning!
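
As a starting point, a simple script can surface some of these quality issues automatically. The sketch below is a minimal, hypothetical Python example: it assumes your content has already been exported as a list of pages (title plus body text) and only flags missing titles and very thin text, two of the issues mentioned above.

```python
# Minimal data-quality check, assuming content has been exported
# as a list of pages with a title and a body text (hypothetical format).

from dataclasses import dataclass

@dataclass
class Page:
    source: str   # e.g. "confluence", "website", "call-center notes"
    title: str    # may be empty if maintenance was inconsistent
    text: str     # main body text of the page

MIN_WORDS = 50  # assumption: pages with fewer words are too thin for retrieval

def audit(pages: list[Page]) -> None:
    """Flag pages with missing titles or very little text content."""
    for page in pages:
        issues = []
        if not page.title.strip():
            issues.append("missing title")
        if len(page.text.split()) < MIN_WORDS:
            issues.append(f"thin content (<{MIN_WORDS} words)")
        if issues:
            print(f"[{page.source}] {page.title or '(untitled)'}: {', '.join(issues)}")

if __name__ == "__main__":
    audit([
        Page("website", "", "Contact us."),
        Page("confluence", "Onboarding guide", "Step 1 ... " * 100),
    ])
```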


5. Following AI Governance Principles: Are you adhering to AI ethics principles with your solution?

6. Unique Value Proposition: Why is your solution better than others in the market? (e.g. you can extract data faster, you can search across a larger information context, or your solution is more straightforward to use and therefore has fewer entry hurdles). Give your solution a slogan: “ChatGPT to summarize legal docs.”

7. Data Source: Outline what data you have or need to obtain to make the solution work. Define how much data you have, its quality, and who ‘owns’ it.

8. Value Streams: Describe how your concept drives value – estimate your saving potential (ROI in monthly hours of work; see the sketch below) and your revenue potential from a new market or additional service you can offer thanks to Gen. AI.
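
To make the saving potential tangible, a back-of-the-envelope calculation is often enough. The sketch below is a hypothetical Python example with made-up numbers; replace the assumptions with figures from your own organisation.

```python
# Back-of-the-envelope ROI estimate for a Gen. AI use case.
# All numbers below are assumptions for illustration only.

employees = 40            # people affected by the use case
hours_saved_per_week = 2  # time saved per person, e.g. through faster document search
hourly_cost = 80          # fully loaded cost per hour (in your currency)
monthly_run_cost = 3000   # estimated cost of running the solution per month

monthly_hours_saved = employees * hours_saved_per_week * 4.33  # ~4.33 weeks per month
monthly_saving = monthly_hours_saved * hourly_cost
net_benefit = monthly_saving - monthly_run_cost

print(f"Hours saved per month:  {monthly_hours_saved:.0f}")
print(f"Gross saving per month: {monthly_saving:,.0f}")
print(f"Net benefit per month:  {net_benefit:,.0f}")
```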

9. Cost Structure: Lay out the major costs and expenses necessary to build and run your solution. We will provide a separate in-depth article on the costs of Generative AI solutions.

10. AI Ethics and Governance: Write down any ethical, legal, or compliance hurdles you’ll need to manage.

11. Key Metrics: Outline the essential metrics you will use to track the success and growth of your solution - leading/lagging metrics that reflect the overall health of your venture. Some metrics play a special role for LLM solutions, such as:


  • How reliable is the system (hallucinations, accuracy score, LLM evaluation metrics)? This helps you build trust with your customers, which is essential with LLMs (see the sketch after this list).
  • How much efficiency can you create with your product, measured in a cost analysis?
  • How transparent is your product? Outline how your product is built and which data basis you use.
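
As an illustration of the reliability metrics above, the sketch below shows a very small evaluation harness: it runs a set of test questions with known reference answers through your own `ask_llm` function (a placeholder, not a real API) and reports a simple accuracy score. Real LLM evaluation frameworks go much further (faithfulness, groundedness, semantic similarity), but the principle is the same.

```python
# Minimal evaluation harness for an LLM-based product (sketch).
# `ask_llm` is a placeholder for your own model call; it is not a real API.

test_cases = [
    {"question": "What is our notice period?", "reference": "3 months"},
    {"question": "Who approves travel expenses?", "reference": "the line manager"},
]

def ask_llm(question: str) -> str:
    raise NotImplementedError("Replace with your model / retrieval pipeline call")

def evaluate(cases: list[dict]) -> float:
    """Return the share of answers that contain the reference answer."""
    correct = 0
    for case in cases:
        answer = ask_llm(case["question"]).lower()
        if case["reference"].lower() in answer:
            correct += 1
        else:
            print(f"Potential hallucination or miss: {case['question']!r} -> {answer!r}")
    return correct / len(cases)

# accuracy = evaluate(test_cases)  # e.g. 0.5 means half of the answers were correct
```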



Which Generative AI use cases bring high value?


Most companies currently focus on optimizing efficiency when it comes to finding relevant information. This major use case includes extracting relevant data from multiple sources like Confluence, Notion, emails, PDFs or websites (a minimal retrieval sketch follows at the end of this section). Why is this use case approached by major enterprises and SMEs around the world?


  • The data content can be extensively tested, optimized and amended so that your generative AI application works well with the underlying data sources.
  • Companies can gain first experience with Generative AI costs without exposing the solution to a larger group of clients.
  • Reputational risks due to hallucinations (e.g. incorrect generated text) can be reduced by using standard LLM evaluation metrics.


Thus, your Gen. AI product can be substantially improved before you expose it to clients.
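
To make the information-retrieval use case above more concrete, here is a minimal, hypothetical retrieval sketch in Python. It scores documents with plain keyword overlap; production systems typically use embeddings and a vector database instead, but the flow of “collect sources, retrieve the most relevant passages, let the LLM answer on top of them” is the same.

```python
# Minimal keyword-based retrieval over multiple sources (sketch).
# Real implementations usually use embeddings + a vector database;
# the document contents below are made up for illustration.

documents = [
    {"source": "confluence", "text": "Travel expenses must be approved by the line manager."},
    {"source": "pdf",        "text": "The notice period for permanent employees is three months."},
    {"source": "email",      "text": "The office closes early on December 24."},
]

def retrieve(question: str, docs: list[dict], top_k: int = 2) -> list[dict]:
    """Rank documents by how many question words they contain."""
    words = set(question.lower().split())
    scored = sorted(
        docs,
        key=lambda d: len(words & set(d["text"].lower().split())),
        reverse=True,
    )
    return scored[:top_k]

hits = retrieve("What is the notice period?", documents)
# The retrieved passages would then be passed to the LLM as context,
# so the answer is grounded in your own data instead of the model's guesses.
print(hits[0]["source"], "->", hits[0]["text"])
```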



Need support with your Generative AI Strategy and Implementation?

🚀 AI Strategy, business and tech support 

🚀 ChatGPT, Generative AI & Conversational AI (Chatbot)

🚀 Support with AI product development

🚀 AI Tools and Automation

Get in touch