
2024: Generative AI Goes Mainstream

By Ken Jones & Dan Hughes, Claritee.AI Co-founders



The launch of ChatGPT by OpenAI in November 2022 marked a significant innovation, bringing Artificial Intelligence (AI) into the public consciousness. It shifted AI from the realm of technical specialists to something accessible to everyone. Generative AI now possesses capabilities that were once thought to be uniquely human: creativity, reasoning, and conversation.

 

Generative AI has been the ‘topic du jour’ in the media for some time now, with reporting that it will be world-changing: everything from saving humanity to destroying it. As a business leader, you have likely formed impressions about it. Perhaps you have played with ChatGPT or similar tools like Bard or Claude. Here is a sample of impressions we have heard:

 

  • “ChatGPT is great for personal use, but it doesn’t know about my corporate information.”

  • “I can see Generative AI is a disruptive technology with the potential for major impact in the workforce and workplace, but is it ready for prime-time business use?”

  • “There is so much change happening in the Generative AI marketplace, how do I find the signal in the noise? I am ready, but how do I begin?”

  • “It makes up information and cannot be trusted.”

  • “Privacy risks are too high since we don’t understand how the data is being used by the AI model.”

  • “Is it anything more than just writing emails and automating job specifications?”

 

The concerns raised are valid, yet it is important to recognize that Generative AI is here to stay and evolving quickly. As we progress into 2024, we can anticipate that many of these issues will be mitigated as the technology advances. Given that Generative AI is a transformative force impacting numerous industries and functions, the upcoming year promises significant developments and breakthroughs that may surpass our current expectations.

 

Having worked in technology, and specifically in the data world, for over 25 years, and having spent the past year learning about and working with Generative AI, we hope to offer some useful guidance on Generative AI capabilities and uses, what happened in 2023, and how to get moving. We promise to keep the technical details to a minimum.

 

Generative AI Capabilities and Use Cases


You could be forgiven for thinking that ChatGPT = Generative AI. However, ChatGPT, and chatbots in general, are just the tip of the iceberg. A chatbot is one application of Generative AI, but the technology has so many other potential applications that it can be difficult to draw a box around what is possible.

 

The unique capability of Generative AI is the ability to understand and generate language. Natural Language Understanding (NLU) enables software to interpret and comprehend human language, identify sentiments, and extract content and semantic meaning. Natural Language Generation (NLG) allows software to speak or write in a way that sounds just like humans. Together, natural language understanding and generation enable more natural and seamless human-technology interactions, as software can both understand and appropriately respond to human language.

 

At the heart of a Generative AI application is a Large Language Model (LLM). Think of an LLM as an artificial brain that “knows” about the world. An LLM is trained on vast quantities of textual data, including books, websites, scientific papers, social media conversations, and creative works like poetry and performance scripts. The training process equips the LLM with the natural language understanding and generation capabilities just discussed. Don’t lose sight of the fact that an LLM is just software that runs on computer hardware.


It is important to separate two things in your mind: the natural language capabilities of an LLM and what it “knows”. Initial impressions from using ChatGPT may lead you to think it knows only what it has been trained on, which, by the way, is very impressive. In fact, a variety of approaches are available to extend the knowledge accessible to an LLM to corporate data, including documents and databases.
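
To make this concrete, here is a minimal sketch of one such approach, often called retrieval-augmented generation (RAG): relevant passages from your own documents are retrieved first and handed to the LLM along with the question. We use the OpenAI Python client purely for illustration; the model name, the sample policy text, and the naive keyword matching are placeholders for whatever search index or vector database your own solution would use.

```python
# A minimal sketch of retrieval-augmented generation (RAG): relevant
# corporate text is retrieved first, then handed to the LLM together with
# the question. The model name, the sample documents, and the naive
# keyword retrieval are illustrative placeholders; a real system would use
# a search index or vector database.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

corporate_docs = [
    "PTO policy: employees accrue 1.5 vacation days per month of service.",
    "Remote work policy: employees may work remotely up to three days per week.",
]

def retrieve(question: str, docs: list[str], top_n: int = 2) -> list[str]:
    """Naive keyword-overlap retrieval, standing in for real search."""
    q_words = set(question.lower().split())
    ranked = sorted(docs, key=lambda d: len(q_words & set(d.lower().split())), reverse=True)
    return ranked[:top_n]

def answer(question: str) -> str:
    context = "\n".join(retrieve(question, corporate_docs))
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model name
        messages=[
            {"role": "system", "content": "Answer using only the provided context. "
                                          "If the context does not contain the answer, say so."},
            {"role": "user", "content": f"Context:\n{context}\n\nQuestion: {question}"},
        ],
    )
    return response.choices[0].message.content

print(answer("How many vacation days do employees accrue each month?"))
```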

 

Before we examine specific use cases in the HR domain, let’s begin by reviewing five general uses of Generative AI; a brief code sketch follows the list.

 

  • Writing – An LLM can assist in writing by generating coherent and contextually relevant content. It understands user prompts, gathers information, and produces well-structured content on a given topic in a desired format.

  • Summarizing - For condensing large volumes of text, an LLM can summarize information, providing concise and relevant overviews. This is useful for quickly grasping the key points of documents or meeting transcripts.

  • Translating - LLMs excel in language translation by understanding the context and nuances of phrases. They help bridge language gaps by providing accurate and natural-sounding translations between languages. This includes the ability to translate natural language into computer code.

  • Extracting - When dealing with vast amounts of textual data, an LLM can extract specific information. It understands text and retrieves relevant details, making it valuable for tasks such as research or data analysis.

  • Agents - An LLM can act as a reasoning entity that can invoke other programs to perform actions. This is particularly useful for automating repetitive tasks or interfacing with external systems.
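
As promised, here is a brief sketch showing two of these uses, summarizing and extracting, side by side. The pattern is the same: a short instruction plus the text to work on. We use the OpenAI Python client and an illustrative model name; the sample report text is made up, and the same pattern works with other LLM providers.

```python
# A brief sketch of two of the general uses above (summarizing and
# extracting) using the OpenAI Python client. The model name and the
# sample report text are illustrative; the same instruction-plus-text
# pattern works with other LLM providers.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

def ask(instruction: str, text: str) -> str:
    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model name
        messages=[
            {"role": "system", "content": instruction},
            {"role": "user", "content": text},
        ],
    )
    return response.choices[0].message.content

report = (
    "Q3 attrition rose to 14%, driven mainly by voluntary exits in engineering. "
    "Time-to-fill open roles averaged 52 days, up from 41 days in Q2."
)

# Summarizing: condense a longer text into a short overview.
print(ask("Summarize the following text in one sentence.", report))

# Extracting: pull specific details out of the text.
print(ask("List every metric mentioned in the text as 'name: value' pairs.", report))
```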

 

It's important to be aware that many of the vendors we collaborate with, such as Workday, Salesforce, Mailchimp, and Microsoft (Office), are integrating Generative AI into their core product offerings, and this integration will continue in full force over the next few years. The real-world use cases presented below are designed to inspire and guide your thinking about this emerging technology. They are not specific to any particular vendor's application; rather, they offer a broader perspective on how Generative AI can be used across platforms and scenarios, so you can adapt more effectively as these capabilities become more prevalent in the tools you use regularly.

 

  • Use Case #1 – Personal Augmentation Assistant

Consider a scenario where you, an HR leader, are navigating unfamiliar territory in your role or industry. Your responsibilities could involve anything from implementing HR software, to building an extensive HR dashboard of vital Key Performance Indicators (KPIs), to integrating complex legal changes into company policies. In these situations, Generative AI tools can be a significant asset.

 

These tools serve as your personal augmentation advisor, granting you access to in-depth knowledge that's often restricted to specialized fields like IT and Legal. For instance, you could use a platform like ChatGPT to draft in-depth project plans, define specific requirements, select the right vendors, and frame critical questions. This approach reduces your reliance on domain-specific experts, empowering you with the knowledge and insights to lead in areas beyond your traditional expertise.

 

It’s important to note, however, that while Generative AI is an invaluable tool for enhancing understanding and decision-making, it doesn't replace the need for legal counsel or professional consultation. Rather, it helps you become more informed and formulate more effective questions in your role.

 

  • Use Case #2 - Utilizing Generative AI for Meeting Insight Extraction

Generative AI is widely known for its ability to summarize meeting transcripts, but its potential extends far beyond summarization. It can extract valuable insights from any meeting, insights that are often missed or forgotten.

 

Meetings between two or more people are often full of ideas and insights, but capturing their real value can be challenging. These valuable thoughts can fade from memory or remain unknown to those not in attendance. In the flow of conversation, many ideas and comments are lost because of our limited capacity to process information. While traditional meeting minutes offer a summary, Generative AI enables a deeper exploration. It can extract subtle details and overlooked ideas.

 

Consider the context of important project meetings. These meetings often involve complex discussions and decision-making processes that are pivotal to a project's success. However, the full depth of these interactions can be challenging to capture and analyze. Generative AI offers a solution by analyzing meeting transcripts to uncover key insights that could influence the project's trajectory.

 

By applying Generative AI to these transcripts, you can gain a clearer understanding of various aspects that are critical to project management and development:

  • Communication dynamics among team members.

  • Collective attitudes towards project goals and challenges.

  • Problem-solving strategies employed during discussions.

  • Creative solutions and innovative ideas proposed.

  • Leadership and management skills displayed in the course of the meeting.

 

This analysis can provide a richer, more detailed view of the project's progress and the team's performance, which might otherwise be overlooked in traditional meeting summaries. It ensures that valuable insights from these important project meetings are recognized and utilized effectively.
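
For readers curious about the mechanics, the sketch below shows one way to ask an LLM for exactly these kinds of insights. The tiny transcript, the prompt wording, and the model name are all illustrative; in practice, the transcript would come from a note-taker tool or recording service.

```python
# A sketch of asking an LLM for the kinds of insights listed above. The
# tiny transcript, the prompt wording, and the model name are illustrative;
# a real transcript would come from a note-taker tool or recording service.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

transcript = """Alice: We're two weeks behind on the data migration.
Bob: The vendor API keeps timing out. Maybe we batch the loads overnight?
Alice: Good idea. Carol, can you own the retry logic?
Carol: Yes, I'll have a prototype by Friday."""

prompt = (
    "From the meeting transcript below, extract:\n"
    "1. Communication dynamics among participants\n"
    "2. Collective attitude toward project goals and challenges\n"
    "3. Problem-solving strategies employed\n"
    "4. Creative or innovative ideas proposed\n"
    "5. Leadership or ownership behaviors displayed\n\n"
    f"Transcript:\n{transcript}"
)

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative model name
    messages=[{"role": "user", "content": prompt}],
)
print(response.choices[0].message.content)
```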

 

  • Use Case #3 - Specialized Employee Knowledge and Know-how Transfer

Many organizations possess specialized knowledge, often concentrated in a few experts or dispersed across the workforce. This includes knowledge about processes, products, customer interactions, or technical details. A significant challenge is the potential loss of this valuable knowledge due to factors like employee turnover or retirement.

 

Generative AI introduces a novel solution to this challenge. Consider a corporate knowledge base created to preserve and expand this intellectual capital. It begins with a collection of initial documents. A chatbot, programmed to interact with employees, asks targeted questions to draw out specific knowledge. Additionally, audio recordings of process walk-throughs or expert interviews can be incorporated as sources of information. This approach enables the organization to capture and retain knowledge continuously, turning it into a cumulative, strategic asset.

 

The knowledge base evolves as the chatbot, through regular interactions with employees and contractors, compiles and synthesizes information. Over time, this process builds a comprehensive repository of the organization's collective wisdom.
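
Here is a minimal sketch of what such a capture loop could look like, assuming the OpenAI Python client: the chatbot asks a seed question, records the expert's answer, and lets the LLM propose the next targeted follow-up. The seed question, the three-turn interview, and the in-memory list standing in for a knowledge base are all illustrative choices.

```python
# A sketch of the knowledge-capture loop described above: the chatbot asks
# a seed question, records the expert's answer, and lets the LLM propose a
# targeted follow-up. The seed question, the three-turn interview, and the
# in-memory list standing in for a knowledge base are illustrative choices.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

knowledge_base = []  # a real system would persist this to a document store
question = "Walk me through how you handle an escalated customer refund."

for _ in range(3):  # a short, three-turn interview
    expert_answer = input(f"\n{question}\nExpert: ")
    knowledge_base.append({"question": question, "answer": expert_answer})

    response = client.chat.completions.create(
        model="gpt-4o",  # illustrative model name
        messages=[
            {"role": "system", "content": "You are interviewing a subject-matter expert to "
                                          "capture institutional knowledge. Ask one specific "
                                          "follow-up question that fills a gap in the notes so far."},
            {"role": "user", "content": str(knowledge_base)},
        ],
    )
    question = response.choices[0].message.content

print(f"\nCaptured {len(knowledge_base)} Q&A pairs for the knowledge base.")
```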

 

This dynamic knowledge base can then be made accessible to other employees via summary documents or another chatbot interface, revolutionizing the traditional method of knowledge sharing, which relies heavily on human-generated documents. This AI-driven method not only protects an organization's internal expertise but also makes it widely accessible to its members.

 

Regarding security and privacy, just like current practices, there will need to be protocols to ensure that sensitive information is secured and accessible only to authorized personnel. This ensures that while knowledge is shared and preserved, it is also protected from unauthorized access.


2023 in the rear view and what is next?


There is much to cover, but here are the key developments.

 

OpenAI broadened its offering, introducing the paid “Plus” and “Team” versions of ChatGPT. The Plus version adds new capabilities, including the GPT-4 LLM and the ability to upload files for Q&A and analysis of data sets. The Team version offers a collaboration platform for a team to share its use of ChatGPT. OpenAI also now offers “My GPTs”, a feature that allows users to create and manage their own custom versions of GPT. This is particularly useful for those who want to tailor the model for specific applications or augment it with their own instructions and data.

 

Competitors to OpenAI, both commercial and open source, have created many LLMs with different capabilities and cost models. OpenAI is not the only game in town. OpenAI's top LLM remains the most capable, but not every use case requires that level of capability, so there can be more cost-effective choices than what OpenAI offers. There is a lot of energy in the open-source LLM space, and we expect organizations to leverage those offerings in their Generative AI solutions.

 

Point solutions appeared in the marketplace, filling obvious product opportunities. The most obvious example is the explosion of note-taker solutions: AIs that join online meetings, transcribe the conversation, and then offer summaries of the conversation. The top-tier solutions distinguish speakers and offer the capability to ask questions about the conversation. We use one of these in all of our online meetings.

 

Low-code and no-code “talk to your documents” products offer rapid prototyping and even production solutions. Solving the same problem as OpenAI's My GPTs, they let developers and tech-savvy business users quickly prove out chatbot use cases, confirming and communicating business value to leadership.

 

Existing automation tool vendors added Generative AI capabilities to their products. Existing automated business processes can now take advantage of LLM capabilities beyond the chatbot type of use case. Lights-out business processes can use LLM capabilities to complete steps that in the past would have waited for a human to perform them.


These developments lead us to believe that 2024 will be the year of custom chatbots and AI automation. Chatbots will be created with specialized knowledge and will provide a new way to interact with corporate data and external current events.

 

Organizations will rethink existing business processes and leverage Generative AI capabilities in automated processes. Human tasks that require the understanding and synthesis of spoken or written content will be replaced by LLM capabilities. People will still play a critical role in the review of LLM outputs; keeping a human in the loop will be important to ensure quality. We expect the focus in 2024 will be on dramatically improving the efficiency of business processes. Better, faster, and cheaper.
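
As a small illustration of what an LLM step with a human in the loop might look like inside an automated process, the sketch below classifies an incoming request and routes anything the model is not confident about to a person. The categories, the confidence threshold, and the model name are assumptions for the sake of the example.

```python
# A sketch of an LLM step inside an automated process with a human in the
# loop: the model classifies an incoming request, and anything it cannot
# classify confidently is routed to a person. The categories, threshold,
# and model name are assumptions; the self-reported confidence is a simple
# convention, not a calibrated probability.
import json
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

request_text = "I was charged twice for my March invoice, please advise."

response = client.chat.completions.create(
    model="gpt-4o",  # illustrative model name
    response_format={"type": "json_object"},  # ask for JSON output
    messages=[
        {"role": "system", "content": (
            "Classify the request into one of: billing, technical_support, hr, other. "
            'Reply as JSON: {"category": "...", "confidence": 0.0}'
        )},
        {"role": "user", "content": request_text},
    ],
)

result = json.loads(response.choices[0].message.content)
if result["confidence"] >= 0.8:
    print(f"Auto-routing to the {result['category']} queue.")
else:
    print("Low confidence: sending to a human reviewer.")
```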


What is my strategy?

 

Every organization is unique: different business models, different resources (human and otherwise), and, of course, different constraints. So, what we offer next is a broad and general approach, recognizing there is no one-size-fits-all solution.

 

  • The first and most important step is to educate yourself and your personnel about Generative AI capabilities, risks, and costs. While it may seem daunting, we recommend a breadth-first approach: survey the entire scene before you go too deep. Get an understanding of the key capabilities of Generative AI so that when new tools are announced, you can identify where they fit. Consider this a core competency your organization needs to develop. Hopefully, our overview in this article will help you get started.

  • The second step is to engage with your internal IT function to gauge their attitudes about and awareness of Generative AI. Hopefully, your internal IT function is ahead of you and has already developed a good understanding of Generative AI. If they have not, it will be difficult to implement a production solution without their involvement and support.

  • The third step is to create a list of potential use cases in your organization. Identify problems and challenges that you think Generative AI could help solve, and treat this as the start of brainstorming a portfolio of potential projects. We recommend you focus on tasks that humans perform today that involve reviewing recordings or transcripts, reading documents, and researching on the web. In most cases, there will be subsequent analysis, data extraction, or synthesis of the gathered information. Define the use cases as accurately as possible; this is the most important step in creating a solution using Generative AI. No idea should be excluded during this process.

  • The fourth step is to assess the feasibility of each use case. What is the potential value? What are the risks? And finally, what is the development and ongoing operational cost? A qualitative answer for each of these questions is a fine place to start; there is no need for a detailed financial analysis at this point.

 

With a list of feasibility-scored use cases, implement a “crawl, walk, run, fly” approach to one or more of the highest-scoring use cases. The phrase describes a gradual progression through stages of development:

 

  • Crawl - Start with basic, foundational steps. Implement a pilot or prototype to understand feasibility. Gather insights and feedback to inform future stages.

  • Walk - Expand on successful elements from the crawl phase. Refine processes and address any issues identified during initial implementation. Prepare for increased scale while maintaining a controlled pace.

  • Run - Implement the initiative at its intended scale. Leverage refined processes for optimal efficiency. Establish mechanisms for continuous improvement based on real-world usage.

  • Fly - Explore cutting-edge solutions and technologies. Embrace flexibility and adaptability to changing landscapes. Foster a culture of ongoing improvement and exploration.

 

Final Thoughts

 

You probably noticed that most of our discussion paints a pretty rosy picture. We have focused mostly on the value of Generative AI for organizations. There are real challenges to adopting the technology though. Without much elaboration, here are the key challenges.

 

  • Data Quality and Availability - Garbage in, garbage out applies to Generative AI too. Quality data is a key requirement for successful use of Generative AI. Expect that curation of existing data will be required before use in a Generative AI solution.

  • Regulatory Compliance - Navigating the evolving regulatory landscape, especially concerning data privacy, is a significant challenge. Staying compliant while innovating can be a delicate balance.

  • Talent and Expertise - There is a high demand for skilled professionals in Generative AI. Attracting and retaining talent, and ensuring continuous skill development in this rapidly evolving field, is essential for successful implementation.

  • Managing Expectations - It is important to manage stakeholder expectations regarding the capabilities and limitations of Generative AI. Overpromising can lead to disappointment while underestimating the potential of the technology can result in missed opportunities.

  • Cost Management - The financial aspect of implementing generative AI, from development to deployment and maintenance, can be considerable. Balancing the cost with the expected ROI is a significant challenge.

 

Addressing these challenges requires a thoughtful approach, involving not only technological solutions but also strategic planning, policy development, stakeholder engagement, and ongoing education and training.

 

We hope that after reading this article you can see that Generative AI has many applications in your business, and that you do not need to wait to get started.


 

 
