
Checklist for companies to successfully implement Generative AI

Nina Habicht • Nov 24, 2023

Companies often struggle to get started with Generative AI because of the many unknowns around costs, responsible AI and security. Using ChatGPT Plus accounts is fun, but companies ultimately need their own custom environment to upskill people and prevent data from being sent to OpenAI. As stated in our recent article about ChatGPT and data policy - a simple guide, take care when sending questions to ChatGPT: the data may be used to train OpenAI's models or be shared with third-party vendors.


How to find Generative AI Business Cases?


First, identify the use cases that genuinely lend themselves to automation with Generative AI.

Here is the Generative AI checklist that can help you scale faster. It is built on hands-on AI product implementation experience (chatbots, voicebots, NLP search and more) and - unlike other sources - covers all the relevant aspects you will encounter during your AI journey. Take these steps:


1. Customer: Identify the target users and customers you serve with your Gen. AI solution (e.g. internal employees, external clients, new or specific segments).

2. Pain Point: Write down the top problems or pain points that your target customer faces.

3. Gen. AI Suitability: Check how your product can address the identified pain point and whether Gen. AI is the right tool for it.

4. Data and Content Quality: Above all, you need good data to build a generative AI application. Ask yourself:


  • Which data sources are needed for your AI product?
  • Where do these data sources live in your company? Databases, Excel files, the intranet, Confluence pages, call-center records, ...?
  • How good is the data quality itself? We often see thin text content on websites, or inconsistent maintenance such as missing titles and structure, which are key for a good chatbot or search (see the sketch after this list).
  • Is the data/content owner identified in your company, or are several topic owners scattered across teams? Identify and list them, and involve these stakeholders in your Gen. AI project from the beginning!
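
To make the data-quality question concrete, here is a minimal sketch of an automated content audit. It assumes your content has been exported as a list of records with `source`, `title` and `body` fields; those field names and the 200-character threshold are illustrative assumptions, not a fixed standard.

```python
# Minimal content-quality check - a sketch, not a full audit.
# Assumes your content export is a list of dicts with "source", "title" and "body"
# fields; adapt the field names and threshold to your own data sources.

MIN_BODY_CHARS = 200  # pages with less text than this rarely yield good answers

def audit_content(pages):
    """Flag pages that are likely to hurt chatbot or search quality."""
    issues = []
    for page in pages:
        title = (page.get("title") or "").strip()
        body = (page.get("body") or "").strip()
        if not title:
            issues.append((page.get("source", "unknown"), "missing title"))
        if len(body) < MIN_BODY_CHARS:
            issues.append((page.get("source", "unknown"), "very little text content"))
    return issues

if __name__ == "__main__":
    sample = [
        {"source": "intranet/faq.html", "title": "Travel expenses", "body": "Submit receipts within 30 days ..."},
        {"source": "intranet/old-page.html", "title": "", "body": "tbd"},
    ]
    for source, problem in audit_content(sample):
        print(f"{source}: {problem}")
```

Running such a check before you build the chatbot or search makes the content gaps visible to the data owners you identified above.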


5. AI Governance Principles: Are you adhering to AI ethics and governance principles with your solution?

6. Unique Value Proposition: Why is your solution better than others in the market? (e.g. it extracts data faster, it searches across more context, or it is more straightforward to use and therefore has fewer entry hurdles). Give your solution a slogan: “ChatGPT to summarize legal docs.”

7. Data Source: Outline what data you have or need to obtain to make the solution work. Define how much data you have, its quality, and who ‘owns’ it.

8. Value Streams: Describe how your concept drives value – estimate your saving potential (ROI in monthly hours of work saved) and your revenue potential from a new market or additional service you can offer thanks to Gen. AI.

9. Cost Structure: Lay out the major costs and expenses necessary to build and run your solution. We will provide an additional in-depth article on the costs of Generative AI solutions.

10. AI Ethics and Governance: Write down any ethical, legal, or compliance hurdles you will need to manage.

11. Key Metrics: Outline the essential metrics you will use to track the success and growth of your solution - leading and lagging metrics that reflect the overall health of your venture. Some metrics matter especially for LLM solutions, such as:


  • How reliable is the system (hallucinations, accuracy score, LLM evaluation metrics; see the sketch after this list)? Measuring this helps you build trust with your customers, which is essential with LLMs.
  • How much efficiency does your product create, measured through a cost analysis?
  • How transparent is your product? Outline how it is built and which data sources it uses.
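
As an illustration of a reliability metric, below is a minimal sketch of an evaluation harness that computes a crude accuracy score over a small, hand-labelled set of questions and reference answers. The `ask_llm` stub stands in for whatever model call you actually use; both the stub and the substring check are simplifying assumptions, and real LLM evaluation frameworks use more robust scoring.

```python
# Minimal reliability check - a sketch, assuming you maintain a small,
# hand-labelled evaluation set of questions and reference answers.

def ask_llm(question: str) -> str:
    # Stub: replace with your actual model call (OpenAI, Azure OpenAI, an open-source LLM, ...).
    return "Travel receipts must be submitted within 30 days."

EVAL_SET = [
    {"question": "Until when must travel receipts be submitted?", "reference": "within 30 days"},
    {"question": "Who approves expenses above the limit?", "reference": "line manager"},
]

def accuracy_score(eval_set) -> float:
    """Share of answers that contain the expected reference phrase (a crude proxy for accuracy)."""
    hits = sum(1 for item in eval_set if item["reference"].lower() in ask_llm(item["question"]).lower())
    return hits / len(eval_set)

if __name__ == "__main__":
    print(f"accuracy: {accuracy_score(EVAL_SET):.0%}")
```

Tracked per release, such a score gives you a leading indicator of reliability before your customers notice regressions.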



Which Generative AI use cases are bringing high value?


Most companies currently focus on making it more efficient to find relevant information. This major use case includes extracting relevant data from multiple sources such as Confluence, Notion, emails, PDFs or websites. Why are major enterprises and SMEs around the world tackling this case first?


  • The content can be extensively tested, optimized and amended so that your generative AI application works well with your underlying data sources
  • Companies can gather first experience with Generative AI costs without exposing the solution to a larger group of clients
  • Reputational risks from hallucinations (e.g. incorrectly generated text) can be reduced by using standard LLM evaluation metrics such as a groundedness check (see the sketch below).
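
One simple way to operationalize the last point is a groundedness check: compare the generated answer against the source passages the application retrieved and flag sentences that are not backed by them. The sketch below assumes your application exposes both the answer and its sources, and uses a deliberately crude word-overlap heuristic in place of more sophisticated faithfulness metrics.

```python
# Minimal groundedness check - a sketch, assuming your application returns the
# generated answer together with the source passages it retrieved.
import re

def groundedness(answer: str, sources: list[str], threshold: float = 0.5) -> list[str]:
    """Return answer sentences whose words barely appear in the sources (possible hallucinations)."""
    source_words = set(re.findall(r"\w+", " ".join(sources).lower()))
    suspicious = []
    for sentence in re.split(r"(?<=[.!?])\s+", answer.strip()):
        words = set(re.findall(r"\w+", sentence.lower()))
        if not words:
            continue
        overlap = len(words & source_words) / len(words)
        if overlap < threshold:
            suspicious.append(sentence)
    return suspicious

if __name__ == "__main__":
    sources = ["Travel receipts must be submitted within 30 days of the trip."]
    answer = "Receipts must be submitted within 30 days. Expenses above CHF 5000 need CEO approval."
    for sentence in groundedness(answer, sources):
        print("check manually:", sentence)
```

Flagged sentences can then be reviewed manually or suppressed before the answer reaches the user.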


Thus, your Gen. AI product can be substantially improved before you make it client-facing.



Need support with your Generative AI Strategy and Implementation?

🚀 AI Strategy, business and tech support 

🚀 ChatGPT, Generative AI & Conversational AI (Chatbot)

🚀 Support with AI product development

🚀 AI Tools and Automation

Get in touch