
Preparing for Four Types of Enterprise AI in 2024

AI enables us to solve problems at scale that were previously unsolvable. That’s why many see 2024 as the “Year of AI”.

Companies are moving quickly to allocate budgets toward AI investments. 40% of organizations surveyed by Capgemini said they’re already funding AI initiatives, another 49% plan to do so within the next 12 months, and 74% believe the benefits will outweigh the associated concerns.

In this blog, we’ll examine four categories of AI applications and what’s required to be successful with each type. The businesses that put the right building blocks for GenAI in place early stand to reap benefits we can’t yet fully comprehend.

Four Categories of Enterprise AI Applications

Any observations in this exponentially evolving space become dated very quickly. With that in mind, our team has identified four distinct categories of AI solutions entering the market. We’ll discuss each below and recommend how an organization can best incorporate them.

  • Category 1: First-generation GenAI products
  • Category 2: Product-enhancing GenAI 
  • Category 3: AI-enhanced business workflows
  • Category 4: Enterprise-connected AI 

Categories 3 and 4 offer the most potential for organizations seeking to gain a competitive advantage through AI. We’ll discuss the details of all four types below.

Category 1: First-generation GenAI products

These products would not exist without AI. Their main purpose is to provide broad, direct access to GenAI Large Language Models (LLMs) for end users. Examples of this category include ChatGPT, Midjourney, and Bard.

These solutions come from large tech companies that have invested heavily in creating and training their own LLMs to power commercial product offerings. The barrier to entry for using these types of solutions is low. (The barrier to creating them yourself from scratch, however, is quite high!) Organizations that wish to leverage them only need to train employees to use them to boost productivity by eliminating tedious tasks.

Competitive Differentiation: Low
Complexity of Rollout: Low

Category 2: Product-enhancing GenAI

Companies are rushing to embed GenAI capabilities into existing products to capture more license revenue by adding value to capabilities customers already use. An example is Salesforce’s GPT extensions to its Sales, Service, Slack, and other portfolio products.

These solutions make existing software offerings better and are logical places to start, since those products own the screen pixels that drive user workflows. The tradeoff is that the value is siloed within those products and can’t impact the enterprise at scale.

Organizations wishing to leverage these can upgrade existing licenses (if necessary) to key software offerings from Salesforce, Adobe, or Microsoft. They then need to provide employees with training and enablement on the newly surfaced GenAI features.

Competitive Differentiation: Low
Complexity of Rollout: Low/Medium

Category 3: AI-enhanced business workflows

Savvy companies are beginning to leverage APIs from LLM providers like OpenAI, Microsoft, or Google to improve upon existing business processes in ways that were impossible without GenAI. Here are a few examples:

  • An HR department that uses GenAI to score resumes against a job posting and provide gap assessments to recruiters
  • A home insurer that automates common underwriting tasks by searching documents using context and reasoning
  • A quoting engine capable of detecting the intent of the user and aligning it to the right potential product offerings, increasing conversion, sales velocity, and CSAT (a minimal sketch of this pattern follows below)
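
To make the last example concrete, here is a minimal sketch of LLM-based intent detection in the spirit of this category. The product list, prompt wording, model choice, and helper name are illustrative assumptions, not details from any specific implementation.

```python
# Hypothetical sketch: classify a free-text quote request into one of a
# known set of product intents with a single LLM call.
from openai import OpenAI

PRODUCTS = ["Term Life", "Whole Life", "Homeowners", "Renters", "Auto"]

client = OpenAI()  # reads OPENAI_API_KEY from the environment

def detect_quote_intent(user_message: str) -> str:
    """Return the product line the user most likely wants a quote for."""
    prompt = (
        "You route insurance quote requests. Given the customer's message, "
        f"reply with exactly one product name from this list: {', '.join(PRODUCTS)}. "
        "If none apply, reply with: Unknown.\n\n"
        f"Customer message: {user_message}"
    )
    response = client.chat.completions.create(
        model="gpt-4o-mini",   # any capable chat model works here
        messages=[{"role": "user", "content": prompt}],
        temperature=0,         # deterministic routing
    )
    answer = response.choices[0].message.content.strip()
    return answer if answer in PRODUCTS else "Unknown"

# Example: "We just bought a house and need coverage before closing" would
# route to "Homeowners", letting the quoting engine pre-select the right
# offering and shorten the path to a quote.
```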

When applied to the right use cases, this category of AI solution allows a business to eliminate process bottlenecks, reduce overhead costs, increase employee effectiveness, and ultimately have a significant impact on critical revenue and profitability drivers.

Organizations serious about these solutions should take inventory of the high-impact bottlenecks that weigh on the bottom line. These include pipeline management, inventory and supply chain, customer onboarding, call center workflows, and talent acquisition processes. GenAI enables solutions to these problems that weren’t previously possible.

Competitive Differentiation: Medium/High
Complexity of Rollout: Medium

Category 4: Enterprise-connected AI 

These future-generation solutions will allow AI to perform strategic actions on behalf of an employee or customer persona, enabling interaction with the brand in ways that were impossible before GenAI.

A great example is a chatbot that can build and book a complete travel itinerary from a user’s granular requirements. In a few seconds, GenAI will perform tasks that would’ve taken a travel agent half a day.

This type of AI application promises to revolutionize how we interact with brands. It stands to cause a monumental shift in how we think of human-computer interaction. We’ll take a deeper look into this specific category down below.

Companies looking to harness this type of AI solution should start by taking a fresh look at poor user experiences driven by task complexity. GenAI promises the ability to detect the intent of the user and act on their behalf. In other words, there are now far more user-friendly ways to drive complex workflows.
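
To illustrate the “act on the user’s behalf” idea, here is a hedged sketch of the tool-calling pattern many LLM APIs support: the model reads a free-text request, chooses an enterprise action, and fills in its parameters. The tool name, its fields, and the model are assumptions for illustration; the real work in this category is building and connecting the enterprise APIs behind the tools.

```python
# Hypothetical sketch of enterprise-connected AI via tool calling.
import json
from openai import OpenAI

client = OpenAI()

# One hypothetical enterprise action the model is allowed to invoke.
tools = [{
    "type": "function",
    "function": {
        "name": "search_flights",
        "description": "Search bookable flights for a traveler.",
        "parameters": {
            "type": "object",
            "properties": {
                "origin": {"type": "string"},
                "destination": {"type": "string"},
                "depart_date": {"type": "string", "description": "YYYY-MM-DD"},
                "max_stops": {"type": "integer"},
            },
            "required": ["origin", "destination", "depart_date"],
        },
    },
}]

request = ("I need to be in Chicago the morning of March 12th, "
           "leaving from Raleigh, and I hate connections.")

response = client.chat.completions.create(
    model="gpt-4o",
    messages=[{"role": "user", "content": request}],
    tools=tools,
)

# If the model decided to act, it returns the chosen tool and its arguments.
call = response.choices[0].message.tool_calls[0]
print(call.function.name, json.loads(call.function.arguments))

# The application then executes that call against the real booking system and
# feeds the result back to the model so the conversation can continue.
```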

Competitive Differentiation: High
Complexity of Rollout: Medium/High

Differentiating with AI

Categories 3 and 4 of AI solutions unlock the most enterprise value for organizations. Innovative solutions have the potential to provide enormous competitive advantage when executed correctly. These solutions will require companies to address an age-old problem: systems integration.

Modern cloud software like Salesforce, Dynamics 365, and NetSuite powers business user workflows. As long as an implementer stays within the platform’s best practices, it’s rare to find any major issues in the SaaS tier.

The technology obstacles that cause issues – the ones that take months or even years to address – are almost always caused by a lack of interoperability with the enterprise’s hundreds or thousands of other systems and solutions. 

Unfortunately, too many organizations today brute-force these solutions to get them to “just work.” The result is systems integrations that are brittle, sloppy, and fragile. It’s a major reason why 80% of IT leaders surveyed by MuleSoft said that integration hinders digital transformation.

As MuleSoft Founder Ross Mason said, without connectivity, AI is merely a “brain in a jar.”

Next Steps

When you consider the load that AI will put on system interconnectivity, you can see why integration is the biggest hindrance to adopting GenAI at scale in the enterprise. Tune in next week for the second part of this blog series, on how to prepare your systems integration strategy with AI in mind.

Ready to learn more about riding the Generative AI wave? Download our comprehensive white paper on AI Readiness and start your AI Superhero training today.


A decade ago, my team delivered the #1 mobile e-commerce site in the big-box electronics and appliances category. How we did it can teach you about success with AI.

AI is changing technology as we know it, and it’s only going to speed up from here. This shift will create winners and losers. Leaders everywhere are scrambling to figure out how to prepare their organizations to absorb the change.

A similar disruptive situation unfolded with smartphones in the 2012–2014 timeframe. Entering those years, there were very few mobile-specific web experiences. Just about everything required “pinch/zoom” navigation, and as mobile traffic began to increase materially, brands knew they had to adapt quickly. But how to adapt, and what obstacles stood in the way of delivering mobile experiences, were still big mysteries.

With OpenAI CEO Sam Altman frequently implying we’re only at the beginning of an exponential period of innovation, and with broad agreement that AI will be a game changer, the situation is repeating itself. Brands know they need to adapt quickly to AI, even more so than they did to mobile, but how to prepare for AI remains largely a mystery to most organizations.

Read on to discover lessons learned from my history working with companies to adopt mobile and how to apply these lessons as you prepare for AI.

Look to the past as a crystal ball to the future

Think back to the days before smartphone usage hit critical mass.

Calling an Uber. Scrolling Instagram. Looking up directions on Google Maps. Tracking your steps with a smartwatch. Watching high-definition video. None of these now-everyday activities were obvious when Apple and Steve Jobs released the original iPhone in 2007.

Thinking back, I recall more skepticism than fanfare. We had to watch it unfold in the wild, gradually resetting our idea of normal and layering idea on top of idea in that new normal.

We believe the same is currently true for AI. We don’t know exactly where it’ll go. But we know its impact will be widespread, like nothing we’ve experienced since the smartphone disrupted every major business. 

40% of organizations surveyed by Capgemini said they’re already funding AI initiatives, another 49% plan to do so within the next 12 months, and 74% believe the benefits will outweigh the associated concerns. The businesses that put the right building blocks for GenAI in place early stand to realize value from AI first.

So 2024 corporate AI objectives will be well funded, but most organizations I talk to are still searching for a roadmap to success. Understanding the major hurdles will be critical to putting these budgets to wise use on high-value AI use cases.

An old, boring, and (largely) unsolved IT problem

In the summer of 2013, I had the privilege of leading a team of engineers at a major big-box electronics and appliances retailer. Like every retailer in that period, our customer was watching buyer web traffic shift to mobile; it needed a mobile site to stay competitive, and it needed it now. Our goal was to deliver this experience in a very narrow timeframe for an October go-live so that we had ample time to tune it before Black Friday, retail’s Super Bowl.

Our technical challenge was that desktop websites at the time rendered entire megabyte-heavy web pages in one shot, and browser technology was still emerging from its infancy. Mobile required a lighter-weight approach that let devices fetch new data without a full page refresh, to avoid overtaxing the limited resources of the iPhone 3G and 4. As a result, the industry shifted to multi-tiered apps: heavyweight back ends exposed through APIs, with front ends driven by newer JavaScript frameworks.

We were trailblazers at the beginning of this shift with our client. The mobile site we delivered received a lot of accolades and won the prestigious J.D. Power award for its category, beating out brands with much bigger budgets like Home Depot, Lowe’s, Costco, and Best Buy. It was praised for its user experience, its page speed, and for taking advantage of newer device and browser capabilities. Most importantly, it produced great revenue funnel metrics for our customer and scaled easily to handle retail’s huge traffic spikes.

In a world of poorly performing mobile sites, what was our secret? The same secret that will enable AI: a flexible, big-picture systems integration strategy.

None of what we achieved would have been possible without the flexibility created when our engineering team prioritized solving the integration problem as a necessity for solving everything else. Solving integration enabled our rapid delivery schedule, unlocked features to lift the buyer journey, and enabled our site to perform at scale. We knew it was a must-have for our client to be successful.

Fast forward a little over a decade, and we sit at the precipice of another huge technological shift with AI. And with AI, the integration demands will be even higher. For it to have a widespread impact, it needs to understand which systems have which data and which systems are used to trigger which business actions. 

In 2023, Shyam Sankar, CTO of Palantir, said “The popular view is that [systems integration] is a boring and solved problem, but it might be a boring and highly unsolved problem, where people are just duct taping everything together.”

Sankar goes on to link this problem to AI enablement, implying that only companies that solve it will reap the biggest benefits from AI. This belief is shared by MuleSoft Founder Ross Mason, who wrote about it in a 2018 blog post. Green Irony’s own experience building custom AI solutions for high-value use cases also backs up the need for scalable systems integration.

Unfortunately, too many organizations today brute-force integration strategies to get them to “just work.” The result is that these systems integrations are inherently brittle and sloppy, serving only the narrow needs of each integration consumer instead of being flexible enough to quickly meet the needs of ANY consumer. 

Engineering teams must prioritize unlocking key systems in a scalable way if their organizations are to be successful with AI.

Getting your house in order

Acknowledging a problem is the first step toward solving it. 

GenAI solutions, much like mobile, will presuppose that this integration problem has been solved. Savvy enterprises will learn from mobile and ensure that this presupposition is correct, paving the road to AI success for their organizations. 

Don’t be caught off guard. Educate yourself by taking a deep dive into this topic with our AI Enablement White Paper. Learn more about AI enablement and why a flexible, composable integration strategy is critical to your success.

Interested in a discussion on this topic? Reach out.


A Deep Dive into Salesforce’s AI Innovations Ahead of Dreamforce

Salesforce has been at the forefront of integrating cutting-edge artificial intelligence into its platform, making waves with its latest foray into Generative AI through Einstein GPT. With Dreamforce around the corner, it’s the perfect time to explore what this means for Salesforce users and why the future looks so promising.

  • Revolutionizing the Platform: Einstein GPT Integration
    Salesforce isn’t just tinkering around the edges. It’s embedding the generative prowess of Einstein GPT across the platform. What does this mean? Well, whether you’re in sales, service, or development, expect a productivity boost like never before.

  • Trust is Paramount: The Einstein GPT Trust Layer
    With great power comes great responsibility. Enter the Einstein GPT Trust Layer, a feature that ensures AI does not become a Pandora’s box. From controlling user access to maintaining an audit trail and ensuring no data leakage outside of Salesforce, it’s clear that safety is not an afterthought.

  • Real-World Applications: Sales & Service Clouds
    Already, Salesforce’s core clouds are beginning to feel the Einstein GPT touch:
    • Sales Cloud: The newly announced generative email feature empowers sales representatives, automating the creation of email content. The result? Faster communication without compromising on quality.
    • Service Cloud: Additions such as Service Replies and Work Summaries are tools that elevate user efficiency, streamlining the service process.

  • For the Developers: A Game-Changer
    As someone who understands the intricacies of software development, I can’t stress enough the impact of Einstein GPT on developer productivity:
    • Einstein GPT for Developers: Imagine generating Apex code or SOQL queries through VS Code commands. Forget the nitty-gritty of syntax; focus on creating.
    • Einstein GPT for Flow: This is about revolutionizing automation on Salesforce. Engage in a natural language conversation with Einstein and watch as it crafts a scaffold flow, giving developers a significant head start.

Final Thoughts

Salesforce is only just getting started. With every release, anticipate a suite of new features enhancing productivity across the board. In essence, if you’re on the Salesforce platform, there’s an AI accelerator tailored to enhance your efficiency.

Stay tuned for more updates, and here’s to a transformative Dreamforce!


AI-enabled Applicant Screening Delivered by Our MuleSoft OpenAI Connector

AI and Integration

The general release of ChatGPT in November 2022 changed everything. Novel use cases for ChatGPT poured in every day, overturning previously held beliefs about what was possible.

Since connectivity enables everything, we knew OpenAI’s API would open up even more possibilities, so we began investing in our own OpenAI Connector for MuleSoft to make that connectivity easy. We also knew that while we were building the connector, we’d come up with plenty of great ways to use it once it was ready.

And we did. Below is our first organizational use case integrating with OpenAI’s services, and it has already been incredibly valuable to Green Irony.

The Problem

Like many technology companies, Green Irony receives a huge volume of resume submissions to open job postings. 

This can be both a blessing and a curse.

It’s a blessing because we have the reach and employment brand to attract hundreds of applicants to open positions within days of posting.

It’s a curse because someone needs to go through these resumes and whittle them down into a candidate pool for phone interviews. With keyword stuffing now compounded by AI-generated resumes, properly screening applicants is getting more challenging by the day. Not only does it take a lot of person-hours to do well, it also requires deep domain knowledge and experience in the roles being screened.

In short, doing it right required a lot of internal experts’ time and effort, and these experts were already very busy. We needed to get them some help.

The Solution

Fortunately, Green Irony’s R&D team, armed with the most disruptive technology we’ve ever laid eyes on, was up to the challenge.

Using Green Irony’s OpenAI Connector for MuleSoft, our team delivered a set of APIs capable of providing a candidate rating against a job posting, the reasoning behind the rating, and a set of questions to validate the candidate and probe any gaps. This all happens in real time: a webhook fires on every inbound candidate submission, the scoring runs asynchronously, and scores and other relevant contextual information are written back into our Applicant Tracking System.

Real-time integration allows us to quickly alert our team to promising candidates, enabling us to get in front of them faster. Our solution was driven by a belief in the “human in the middle” concept, so our goal is to arm our team with the best information, not to disqualify candidates based on an AI-only recommendation.
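
To make the shape of that scoring call concrete, here is a minimal Python sketch assuming a direct call to the OpenAI chat completions API. The production solution described above runs in MuleSoft through the OpenAI Connector; the function name, rating scale, and prompt wording below are illustrative assumptions, not the actual implementation.

```python
# Hedged sketch of the scoring step: one LLM call that returns a rating,
# the reasoning behind it, and follow-up interview questions as JSON.
import json
from openai import OpenAI

client = OpenAI()

def score_candidate(job_posting: str, resume_text: str) -> dict:
    """Return a rating, the reasoning behind it, and follow-up questions."""
    prompt = (
        "You are an expert technical recruiter. Compare the resume to the "
        "job posting and respond with JSON containing: rating (1-10), "
        "reasoning (a short paragraph), and questions (a list of 3 interview "
        "questions that probe the candidate's gaps).\n\n"
        f"JOB POSTING:\n{job_posting}\n\nRESUME:\n{resume_text}"
    )
    response = client.chat.completions.create(
        model="gpt-4o",
        messages=[{"role": "user", "content": prompt}],
        response_format={"type": "json_object"},  # ask for parseable JSON
        temperature=0,
    )
    return json.loads(response.choices[0].message.content)

# In the flow described above, a webhook fires on each new application, this
# scoring runs asynchronously, and the resulting fields are written back to
# the Applicant Tracking System for a human to review.
```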

The secret sauce of the solution lies in a few key areas:

  1. Easy, connector-based connectivity to OpenAI, enabling us to focus on delivering and testing what matters for the use case
  2. Prompt engineering and A/B testing of prompt variants (sketched below), ensuring we mitigate hallucination issues and receive the most accurate feedback possible on every resume
  3. Writing an accurate job posting that is very clear about what success looks like, then parsing resumes for the key information relevant to scoring against it
  4. Fine-tuning the OpenAI model and continuously sampling results with our human experts, ensuring alignment between human and AI-generated ratings and feedback (in progress)
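
Here is a hedged sketch of what the A/B testing in point 2 can look like in practice: run each candidate prompt over resumes that human experts have already rated, and keep the variant whose ratings track the humans’ most closely. The helper names, the (job, resume, rating) sample format, and the mean-absolute-error agreement metric are all illustrative assumptions.

```python
# Compare prompt variants against human-rated samples and keep the one that
# agrees most closely with the human experts.
from statistics import mean

def evaluate_prompt(score_fn, labeled_samples):
    """labeled_samples: list of (job_posting, resume_text, human_rating)."""
    errors = []
    for job, resume, human_rating in labeled_samples:
        ai_rating = score_fn(job, resume)["rating"]
        errors.append(abs(ai_rating - human_rating))
    return mean(errors)  # lower = closer agreement with human ratings

# Run evaluate_prompt once per prompt variant (each wrapped in its own
# scoring function), keep the winner, and repeat whenever the prompt, model,
# or job postings change.
```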

The Results

  • Labor Bandwidth Savings: The AI provides a sophisticated scoring system that significantly reduces the time spent evaluating each resume, from an average of 10 minutes to mere seconds. Those savings free up critical resources who are already very busy with other work in a fast-growing startup.
  • Increased Interview Quality: The AI system offers detailed insights on each candidate’s strong matches and gaps in relation to the job description. This has led to stronger interview selections and has cut the number of necessary interviews in half, from an average of 6 to 3. Higher-quality interviews lead to higher-quality new hires.
  • Enablement of HR Resources: The contextual information delivered by our solution enables our HR resources to ask better, more technical questions and assess candidate quality earlier in our process. This results in a higher-quality candidate pool reaching our second interview stage.
  • Reduction in Recruiting Timeline: AI allows for swift identification and rejection of unsuitable candidates, reducing time spent in the recruitment process. It has led to a 5X+ reduction in the time we spend thinning 100 applicants down to 15 for phone interviews, so we’re able to talk to better candidates more quickly.

Key Takeaways for AI

If someone had proposed this solution 12 months ago, our first thought would have been that it would take millions of dollars’ worth of technology labor to deliver, given how time-consuming traditional machine learning is.

The release of GPT-3.5 shattered that point of view. We now have access to an extensible LLM capable of performing this level of expert analysis with the right prompts and tuning. We’re in uncharted territory, and it’s up to leaders to find the right tasks for AI: tasks that can be taken off the plates of overworked humans.

To us, our own resume screening process was the perfect scenario for delivering value with generative AI. We had a very tedious, time-consuming task that also required experts to perform it. The task was a scale inhibitor: not only did it demand more time than an organization like ours could keep up with, it was also very important to get right.

Generative AI has been a revelation for our applicant screening process, and it’s just the beginning.