Integrating OpenAI With Salesforce AppExchange: Formula Debugger Case Study

Formula Debugger began its journey as a straightforward tool for testing and debugging formula fields in Salesforce.

Setting up OpenAI API integration

Remote Site Settings

Before Apex can call the OpenAI API, the endpoint https://api.openai.com must be registered under Setup → Remote Site Settings (or exposed through a Named Credential) so that Salesforce permits the outbound callout.

Custom Metadata

The integration configuration is stored in a custom metadata type, ChatGPTIntegration__mdt, with the following fields:

  1. Endpoint: Stores the OpenAI API endpoint, which is https://api.openai.com/v1/chat/completions for the Chat Completions API.

  2. API_Key: Stores the API key required for authenticating with the OpenAI API.

  3. Max_Tokens: Stores an integer value that represents the maximum number of tokens allowed in the API response.

  4. Context: A string used to provide the context for the OpenAI API call.

  5. Command: A string representing the command or query that you want ChatGPT to understand and respond to.

  6. Model: Stores the name of the OpenAI model to call (referenced as Model__c in the Apex code), for example gpt-3.5-turbo.
Custom Metadata definition
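Taken together, these fields map one-to-one onto the Chat Completions request body. The sketch below shows the mapping in plain JavaScript; the values are placeholders standing in for an actual ChatGPTIntegration__mdt record, not the app's real configuration.

```javascript
// Placeholder values standing in for the ChatGPTIntegration__mdt record.
const integration = {
  endpoint: 'https://api.openai.com/v1/chat/completions',
  model: 'gpt-3.5-turbo',
  maxTokens: 256,
  context: 'You are a Salesforce Expert and Certified Advanced Administrator.',
  command: 'Describe what can be an issue with this formula field and how to fix it.'
};

// Builds the request body the integration sends: Context and Command wrap
// the formula into a single user message.
function buildPayload(formula) {
  return {
    model: integration.model,
    max_tokens: integration.maxTokens,
    messages: [
      { role: 'user', content: `${integration.context} ${formula} ${integration.command}` }
    ]
  };
}

console.log(JSON.stringify(buildPayload('TEXT(Salutation) & FirstName'), null, 2));
```

This is only a shape illustration; in the app itself the same assembly happens in Apex, as shown in the controller later in this post.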

Obtaining an API Key from OpenAI

  1. Sign up for an OpenAI account or log in to your existing account.
  2. Visit the API Keys section of your account dashboard.
  3. Generate a new API key or use an existing one.
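Before wiring the key into Salesforce, it can be useful to sanity-check it outside the platform. This sketch (plain JavaScript, Node 18+; OPENAI_API_KEY is a placeholder for your own key) builds the same request the Apex callout performs later; uncomment the fetch line to actually send it.

```javascript
// Builds the HTTP request the integration performs: POST to the chat
// completions endpoint with a Bearer token in the Authorization header.
function buildRequest(apiKey, userMessage) {
  return {
    url: 'https://api.openai.com/v1/chat/completions',
    options: {
      method: 'POST',
      headers: {
        'Content-Type': 'application/json',
        'Authorization': `Bearer ${apiKey}`
      },
      body: JSON.stringify({
        model: 'gpt-3.5-turbo',
        messages: [{ role: 'user', content: userMessage }]
      })
    }
  };
}

const req = buildRequest(process.env.OPENAI_API_KEY || 'OPENAI_API_KEY', 'Say hello');
// To actually send the request (Node 18+ has a global fetch):
// fetch(req.url, req.options).then(r => r.json()).then(console.log);
console.log(req.options.headers.Authorization.startsWith('Bearer ')); // true
```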

Apex Classes for OpenAI Integration

In order to integrate OpenAI’s ChatGPT API into the FormulaDebugger AppExchange application, we’ve created three Apex classes: FormulaInsightsController, FormulaInsightsMock, and FormulaInsightsControllerTest. Each of these classes plays a crucial role in handling API calls, mocking API responses, and testing the integration.


public with sharing class FormulaInsightsController {
    @AuraEnabled
    public static String getInsights(String formula) {
        ChatGPTIntegration__mdt integration = ChatGPTIntegration__mdt.getInstance('OpenAI');

        if (integration == null) {
            return 'Error: ChatGPTIntegration configuration not found';
        }
        if (FormulaFunctionsCtrl.getConfiguration().showFormulaInsights == false) {
            return 'Error: FMA for Insights not enabled';
        }

        Http http = new Http();
        HttpRequest request = new HttpRequest();
        HttpResponse response;
        request.setEndpoint(integration.Endpoint__c);
        request.setMethod('POST');
        request.setHeader('Content-Type', 'application/json');
        request.setHeader('Authorization', 'Bearer ' + integration.API_Key__c);
        // Set maximum timeout allowed for Apex callouts (120 seconds)
        request.setTimeout(120000);

        String content = integration.Context__c + ' '
                         + String.escapeSingleQuotes(formula) + ' '
                         + integration.Command__c;

        String payload = '{"model": "' + integration.Model__c
                         + '", "max_tokens": ' + integration.Max_Tokens__c
                         + ', "messages": [{"role": "user", "content": "'
                         + content + '"}]}';
        request.setBody(payload);

        try {
            response = http.send(request);
            if (response.getStatusCode() == 200) {
                return response.getBody();
            } else {
                return 'Error: ' + response.getStatus();
            }
        } catch (Exception e) {
            return 'Error: ' + e.getMessage();
        }
    }
}
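One caveat about the hand-built payload: String.escapeSingleQuotes escapes quotes for SOQL/SOSL, not for JSON, so a formula containing a double quote or a line break can still produce an invalid request body. The JavaScript sketch below illustrates the failure mode and the safer serializer-based alternative (in Apex the equivalent would be JSON.serialize).

```javascript
// Manual concatenation breaks when the formula itself contains a double quote.
const formula = 'IF(Name = "Acme", 1, 0)';

const manual = '{"messages": [{"role": "user", "content": "' + formula + '"}]}';
// The inner quotes terminate the JSON string early, so parsing fails.
let manualValid = true;
try { JSON.parse(manual); } catch (e) { manualValid = false; }

// Letting a JSON serializer escape the content keeps the payload well-formed.
const safe = JSON.stringify({ messages: [{ role: 'user', content: formula }] });
const roundTrip = JSON.parse(safe);

console.log(manualValid);                   // false
console.log(roundTrip.messages[0].content); // IF(Name = "Acme", 1, 0)
```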

global class FormulaInsightsMock implements HttpCalloutMock {
    global HttpResponse respond(HttpRequest req) {
        HttpResponse res = new HttpResponse();
        res.setStatusCode(200);
        res.setHeader('Content-Type', 'application/json');
        res.setBody('{"id": "chatcmpl-abcdefg123456", "object": "chat.completion", "created": 1681152783, "model": "gpt-3.5-turbo-0301", "usage": {"prompt_tokens": 0, "completion_tokens": 0, "total_tokens": 0}, "choices": [{"index": 0, "message": {"role": "assistant", "content": "This is a mock response from the ChatGPT API."}, "finish_reason": "stop"}]}');
        return res;
    }
}

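Because the mock body mirrors the shape of a real Chat Completions response, client code can extract the assistant message the same way in tests and in production. A minimal JavaScript illustration, using the same fields the mock above returns:

```javascript
// Abridged to the fields the client actually reads from the response body.
const body = '{"choices": [{"index": 0, "message": {"role": "assistant", '
           + '"content": "This is a mock response from the ChatGPT API."}, '
           + '"finish_reason": "stop"}]}';

// The assistant's text lives at choices[0].message.content.
const parsed = JSON.parse(body);
const content = parsed.choices[0].message.content;
console.log(content); // This is a mock response from the ChatGPT API.
```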

FormulaInsightsControllerTest is an Apex test class designed to test the functionality of the FormulaInsightsController. It uses the FormulaInsightsMock class to simulate API responses and test various scenarios to ensure that the controller is working as expected. ChatGPT generated this test class in its entirety, demonstrating its ability to generate comprehensive test cases that cover various scenarios.

@isTest
private class FormulaInsightsControllerTest {

    @isTest
    static void testGetInsights() {
        // Set up the mock for the API call
        Test.setMock(HttpCalloutMock.class, new FormulaInsightsMock());

        // Call the getInsights method with a sample formula
        String formula = 'IF ( ( CHANNEL_ORDERS__Renewal_Date__c - TODAY () ) <= 45 && ( ( CHANNEL_ORDERS__Renewal_Date__c - TODAY () ) > 0 ), IMAGE ("/resource/CHANNEL_ORDERS__uilib/images/warning_60_yellow.png", "warning", 16, 16) & \' Ends in \' & TEXT ( CHANNEL_ORDERS__Renewal_Date__c - TODAY ()) & \' Days\', null)';
        String insights = FormulaInsightsController.getInsights(formula);

        // Assert that the insights are not empty or an error message
        System.assertNotEquals(null, insights, 'Insights should not be null');
        System.assert(insights.length() > 0, 'Insights should not be empty');
        System.assert(!insights.startsWith('Error:'), 'Insights should not start with "Error:"');
    }
}

While ChatGPT generated the FormulaInsightsMock and FormulaInsightsControllerTest classes in their entirety, FormulaInsightsController required some minor manual adjustments. The FormulaDebugger application already uses elements like FMA Feature Flags, which let me enable or disable the OpenAI integration for specific subscriber orgs as needed. Feature flags are one of the cool technologies available to ISVs and a general best practice for rolling out new features, though they are not mandatory for this integration. Given that existing context, it was more efficient to manually align the generated code with the codebase than to teach ChatGPT the entire context required to support these features.

showFormulaInsights.featureParameterBoolean-meta.xml — the Feature Management App parameter that allows us to seamlessly enable or disable the OpenAI component for specific subscriber orgs as needed

Lightning Web Component Integration with OpenAI

Implementing getInsights

import { LightningElement, track, api } from 'lwc';
import getInsightsFromApex from '@salesforce/apex/FormulaInsightsController.getInsights';
import getConfiguration from '@salesforce/apex/FormulaFunctionsCtrl.getConfiguration';

export default class FormulaInsights extends LightningElement {
    @track insightsText = '';

    @api
    set formulaContent(value) {
        this._formulaContent = value;
    }

    get formulaContent() {
        return this._formulaContent;
    }

    async connectedCallback() {
        try {
            this.configuration = await getConfiguration();
        } catch (exception) {
            console.error('FormulaFunctions Connected Callback exception', exception);
        }
    }

    async getInsights() {
        if (this.formulaContent && this.formulaContent.trim().length > 0 && this.configuration?.showFormulaInsights) {
            try {
                const response = await getInsightsFromApex({ formula: this.formulaContent });
                const jsonResponse = JSON.parse(response);
                this.insightsText = jsonResponse.choices[0].message.content;
            } catch (error) {
                this.insightsText = '';
                console.error('Error with insights loading:', error.message);
            }
        } else {
            this.insightsText = '';
        }
    }
}

<template>
    <div class="slds-m-top_medium">
        <p class="response-text">{insightsText}</p>
    </div>
</template>

Displaying Insights on the Formula Debugger Main Screen

The formulaInsights LWC is designed to seamlessly integrate with the existing FormulaDebugger application’s main screen. When a user selects a formula, the LWC fetches insights from the OpenAI API and presents them in an intuitive and visually appealing format alongside the formula. This allows users to better understand the formula they are working with and receive valuable insights from OpenAI directly within the application.

Getting Meaningful Responses from OpenAI

Leveraging Context and Command
String content = integration.Context__c + ' '
                 + String.escapeSingleQuotes(formula) + ' '
                 + integration.Command__c;


You are a Salesforce Expert and Certified Advanced Administrator. Explain what this Salesforce formula field is doing, step by step, and how it works:

IF (ISBLANK (TEXT (Salutation)), '', TEXT (Salutation) & ' ') & FirstName & ' ' & LastName

describe what can be an issue with this formula field and how to fix it

Fine-tuning Context and Command

Summary: A Seamless OpenAI Integration with AppExchange App

Integrating OpenAI’s ChatGPT with my AppExchange application, FormulaDebugger, proved to be a smooth and efficient process. The majority of the required code was generated by ChatGPT itself, enabling me to release an updated version of the FormulaDebugger app within just a few hours. The latest version is now available for free on AppExchange for users to enjoy.

New Formula Debugger: Insights in the red box are generated by OpenAI

It’s important to note that, depending on the information stored within your application, using OpenAI’s public endpoint may not be suitable for security reasons. Alternative options are available, such as hosting a private ChatGPT instance. Deciding on the best approach requires a thorough analysis of your specific requirements.

Get ChatGPT-powered hints on your ISV App Errors: Video Demo


Case Study by Jakub Stefaniak

Vice President of Technology Strategy and Innovation at Aquiva Labs

Video Demo by Robert Sösemann

Principal ISV Architect at Aquiva Labs
