5 Roadblocks in Your Gen AI Capabilities Development, and How to Solve Them | AI Notes #22

The development of Generative AI (Gen AI) capabilities has quickly become a top priority for many companies looking to leverage AI-powered automation and innovation. Yet, building these capabilities isn’t straightforward. As the technology landscape continues to evolve rapidly, businesses are often left grappling with knowledge gaps, operational challenges, and unclear paths to value.

In this article, we’ll outline five common roadblocks that stall Gen AI capability development and provide actionable strategies to overcome them. From addressing knowledge gaps and defining clear ROI to building robust workflows and ensuring model adaptability, this guide will help your organization unlock the full potential of Gen AI.

(1) Knowledge Gaps

Building Gen AI capabilities requires a broad understanding that goes beyond just engineering. Teams across product management, data science, and even business functions like sales and operations need to be aligned on how these technologies work, what the potential use cases are, and how they can be effectively integrated into the organization’s products and processes.

In many organizations, however, this holistic understanding is missing, which hinders progress and limits innovation. The knowledge gap can also create silos, where only a few experts understand how Gen AI solutions are developed and deployed.

How to solve: One effective way to bridge these knowledge gaps is by organizing a guided internal hackathon. A hackathon provides a collaborative environment where learning happens across functions, not just within isolated teams. It’s an opportunity to experiment with the basic building blocks of Gen AI technology — from data preparation to model training and deployment — while also fostering cross-functional collaboration.

This approach ensures that everyone, from product managers to data analysts, is familiar with how to develop and integrate Gen AI use cases. Follow this up with a post-hackathon review to share learnings and identify areas for deeper exploration.

(2) Unclear ROI of Gen AI Use Cases

One of the biggest challenges companies face is determining where Gen AI can truly add value. Without a clear return on investment (ROI), it’s difficult for management to justify further investment or dedicate resources to Gen AI projects. This uncertainty often stems from not having a clear framework for identifying high-impact use cases.

Gen AI can automate workflows, enhance customer service, or even generate new business models — but only if its potential is systematically mapped against existing business processes.

How to solve: Management should start by defining specific criteria for evaluating Gen AI use cases. This involves pinpointing areas where the company’s data flywheel intersects with workflow inefficiencies or opportunities for automation.

For example, consider which customer touchpoints generate a lot of repetitive work or where data can be better leveraged to make real-time decisions. By setting these parameters early on, you create a structured approach to explore and pilot Gen AI initiatives, reducing uncertainty around potential ROI.
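To make these criteria concrete, a lightweight scoring rubric can help teams compare candidate use cases on the same scale. The sketch below is purely illustrative: the criteria names, weights, and ratings are assumptions to be replaced with whatever dimensions your management team agrees on.

```python
# A minimal sketch of a use-case scoring rubric. The criteria and weights
# below are illustrative assumptions, not a standard framework.

CRITERIA_WEIGHTS = {
    "data_availability": 0.3,        # do we already own the data this use case needs?
    "workflow_repetitiveness": 0.3,  # how much repetitive manual work it removes
    "time_to_pilot": 0.2,            # how quickly a pilot can be shipped
    "revenue_or_cost_impact": 0.2,   # estimated impact on revenue or cost
}

def score_use_case(ratings: dict[str, int]) -> float:
    """Each criterion is rated 1-5; returns a weighted score between 1 and 5."""
    return sum(CRITERIA_WEIGHTS[name] * ratings[name] for name in CRITERIA_WEIGHTS)

# Example: a hypothetical customer-support summarization pilot
print(score_use_case({
    "data_availability": 5,
    "workflow_repetitiveness": 4,
    "time_to_pilot": 4,
    "revenue_or_cost_impact": 3,
}))  # -> 4.1
```

Scoring every candidate with the same rubric makes it easier to defend why one pilot is funded before another, and the ratings can be revisited as pilots produce real ROI data.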

(3) Lack of a Structured Path to Build and Deploy Gen AI Applications

Building Gen AI solutions often involves trial and error, especially when dealing with large language models (LLMs) prone to hallucinations or unpredictable outputs. Traditional development frameworks may not fully address the unique requirements of Gen AI, such as model training, validation, and the need for constant fine-tuning.

This can lead to delays and inefficiencies in deploying AI applications that are production-ready and reliable. Additionally, the process of incorporating user feedback and retraining models can be cumbersome without a well-defined workflow.

How to solve: Revisit your development and deployment workflow to accommodate the complexities of Gen AI projects. This means establishing a process that includes not only engineering but also product and business teams from the outset. A successful Gen AI project requires close collaboration between data scientists, developers, and business stakeholders to ensure the model is contextually relevant and aligned with business goals.

Consider introducing practices such as continuous integration and deployment (CI/CD) for model updates, as well as setting up automated guardrails to detect and mitigate potential issues (e.g., hallucinations) early in the pipeline.
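As an illustration of what such a guardrail might look like, the sketch below checks how well a generated answer is grounded in the retrieved source passages before it reaches the user. It assumes a retrieval-augmented setup; generate_answer is a hypothetical stand-in for your actual model call, and the token-overlap check is a deliberately crude proxy that a production pipeline would replace with a proper evaluation model.

```python
# A minimal sketch of an automated guardrail step, assuming a
# retrieval-augmented generation (RAG) setup. Only the standard library is used.

def grounding_score(answer: str, source_passages: list[str]) -> float:
    """Crude proxy for groundedness: the share of answer tokens that
    appear somewhere in the retrieved source passages."""
    answer_tokens = set(answer.lower().split())
    source_tokens = set(" ".join(source_passages).lower().split())
    if not answer_tokens:
        return 0.0
    return len(answer_tokens & source_tokens) / len(answer_tokens)

def guarded_answer(question: str, source_passages: list[str],
                   generate_answer, threshold: float = 0.6) -> str:
    """Run the model, then block answers that look poorly grounded."""
    answer = generate_answer(question, source_passages)
    if grounding_score(answer, source_passages) < threshold:
        # Fall back instead of surfacing a likely hallucination to the user.
        return "I'm not confident in this answer; please check the source documents."
    return answer

# Example wiring with a stubbed model call:
fake_model = lambda q, passages: "Refunds are processed within 14 days."
print(guarded_answer("What is the refund window?",
                     ["Refunds are processed within 14 days of purchase."],
                     fake_model))
```

Running a check like this on every model update inside the CI/CD pipeline, against a fixed set of test questions, turns "does the new model hallucinate more?" into a pass/fail gate rather than a manual review.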

(4) Misalignment of the Operating Model with the User Interface

Gen AI use cases are often consumed through various interfaces, including web applications, chat platforms, and third-party integrations. This means that the operating model — the way your AI solution is deployed and accessed — needs to be closely aligned with the user interface (UI). A mismatch between the two can result in a poor user experience, such as slow response times or limited access, which diminishes the perceived value of your Gen AI capabilities. For instance, if the UI relies on real-time responses, the model’s deployment environment must support fast and reliable execution.

How to solve: Align the deployment setup with how users will actually consume the solution. This often involves selecting a deployment environment that supports external accessibility and ensuring that the AI model can process queries asynchronously when needed. For example, consider using serverless architectures or microservices that can scale up and down based on demand. Additionally, ensure that your infrastructure supports multiple interaction modes, such as synchronous and asynchronous processing, depending on the complexity of the task and the UI requirements. A flexible deployment setup will ensure that users get a seamless experience regardless of how they interact with your Gen AI solution.
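One way to support both interaction modes is to put a small job queue in front of the model, so real-time UIs can call it synchronously while heavier requests are queued and polled later. The sketch below uses only Python's standard library; run_model is a hypothetical placeholder for the actual LLM call.

```python
# A minimal sketch of serving one model in both synchronous and asynchronous
# modes behind a small in-process job queue (standard library only).
import asyncio
import uuid

results: dict[str, str] = {}   # completed job_id -> model output

async def run_model(prompt: str) -> str:
    await asyncio.sleep(2)     # hypothetical placeholder for a slow LLM call
    return f"answer to: {prompt}"

async def worker(queue: asyncio.Queue) -> None:
    while True:
        job_id, prompt = await queue.get()
        results[job_id] = await run_model(prompt)
        queue.task_done()

async def submit_async(queue: asyncio.Queue, prompt: str) -> str:
    """Asynchronous mode: return a job id immediately; the UI polls for the result."""
    job_id = uuid.uuid4().hex
    await queue.put((job_id, prompt))
    return job_id

async def answer_sync(prompt: str) -> str:
    """Synchronous mode: block until the model responds (for real-time UIs)."""
    return await run_model(prompt)

async def main() -> None:
    queue: asyncio.Queue = asyncio.Queue()
    worker_task = asyncio.create_task(worker(queue))

    job_id = await submit_async(queue, "Summarize last week's support tickets")
    print("queued job:", job_id)
    print(await answer_sync("What is our refund policy?"))

    await queue.join()         # wait for the queued job to finish
    print("async result:", results[job_id])
    worker_task.cancel()

asyncio.run(main())
```

In production the in-process queue would typically be replaced by a managed message queue and the worker by autoscaling serverless functions, but the split between "answer now" and "queue and poll" paths stays the same.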

(5) Inability to Adapt to Different Language Models Over Time

The Gen AI landscape is rapidly evolving, with new foundational models being released regularly. Some models may offer better accuracy, lower latency, or unique features that make them a better fit for specific use cases. However, companies often find it challenging to switch between models due to rigid data pipelines or deeply integrated APIs, which can lock them into a single vendor or technology stack. This limits the ability to continuously improve the Gen AI solution and keep up with advancements in the field.

How to solve: The solution lies in building a flexible data platform that can easily integrate different language models over time. Consider setting up your Gen AI infrastructure in a way that abstracts model integration through a common interface, such as a standardized API layer. This approach enables you to swap in new models as needed, without having to overhaul your entire system. Additionally, plan for the ongoing monitoring and evaluation of various models to ensure you’re always using the best option available for your specific business needs. The ability to quickly integrate new models will allow your business to stay competitive and leverage cutting-edge capabilities as they emerge.
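A minimal version of such an abstraction layer is sketched below: application code depends on a single interface, and each vendor sits behind its own adapter. The provider classes are illustrative stubs, not real SDK integrations; wire in the actual calls of whichever vendors you use.

```python
# A minimal sketch of abstracting model integration behind one interface so
# providers can be swapped without touching application code.
from abc import ABC, abstractmethod

class TextModel(ABC):
    """The single interface the rest of the application depends on."""
    @abstractmethod
    def complete(self, prompt: str, max_tokens: int = 256) -> str: ...

class VendorAModel(TextModel):
    def complete(self, prompt: str, max_tokens: int = 256) -> str:
        # call vendor A's SDK here (stubbed for illustration)
        return "response from vendor A"

class VendorBModel(TextModel):
    def complete(self, prompt: str, max_tokens: int = 256) -> str:
        # call vendor B's SDK here (stubbed for illustration)
        return "response from vendor B"

MODELS: dict[str, type[TextModel]] = {
    "vendor-a": VendorAModel,
    "vendor-b": VendorBModel,
}

def get_model(name: str) -> TextModel:
    """Application code asks for a model by config name, never by SDK."""
    return MODELS[name]()

# Swapping models becomes a configuration change, not a code change:
model = get_model("vendor-b")
print(model.complete("Draft a reply to this customer email."))
```

The same registry can also log latency and quality metrics per provider, which feeds directly into the ongoing monitoring and evaluation mentioned above.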

Insights from the Fazz and AWS collaboration, which we’ve covered here as well.


Paulo Joquiño is a writer and content producer for tech companies, and co-author of the book Navigating ASEANnovation. He is currently Editor of Insignia Business Review, the official publication of Insignia Ventures Partners, and senior content strategist for the venture capital firm, where he started right after graduation. As a university student, he took up multiple work opportunities in content and marketing for startups in Asia. These included interning as an associate at G3 Partners, a Seoul-based marketing agency for tech startups, running tech community engagements at ASPACE Philippines, a coworking space and business community, and interning at workspace marketplace FlySpaces. He graduated with a BS in Management Engineering from Ateneo de Manila University in 2019.

***