Salesforce Simplified: The new frontier of Prompt Engineering, Salesforce Copilot and all things AI

July 2, 2024 by Amber Reynolds

Enhancing Productivity with Salesforce Copilot: A Practical Guide to Prompt Engineering and Generative AI 

Generative AI has moved from hype to hands-on reality—especially inside enterprise platforms like Salesforce. In this Salesforce Simplified webinar, Derek Cassese and Josiah Nisbett break down what generative AI really is, why prompt engineering matters, and how Salesforce Copilot and Prompt Builder turn theory into practical business value. 

This article distills the key insights from the session into a clear, actionable guide for Salesforce admins, architects, and business leaders. 

What Is Generative AI—and Why Does It Matter in Salesforce? 

Generative AI refers to models—typically large language models (LLMs)—capable of creating new content: text, summaries, recommendations, and more. Unlike traditional automation or rule-based bots, generative AI allows users to interact conversationally with systems and receive contextual, human-like responses. 

In Salesforce, generative AI doesn’t exist in isolation. It is tightly integrated with CRM data, workflows, and security controls—making it usable for real business outcomes, not just experimentation. 

Key takeaway: 

Generative AI becomes truly powerful when it is grounded in your business data. 

Large Language Models (LLMs): The Foundation Behind the Scenes 

LLMs are deep-learning models trained on massive datasets. Salesforce leverages these models (such as OpenAI’s GPT models) through a controlled framework that ensures: 

  • Data grounding with CRM records 
  • Secure data handling via the Salesforce Trust Layer 
  • Predictable, repeatable outputs for business use cases 

Generative AI in Salesforce doesn’t simply “guess”—it reasons over structured data like accounts, contacts, cases, and opportunities. 

Prompt Engineering: Why the Question Matters as Much as the Answer 

A major theme of the webinar is prompt engineering—the practice of crafting precise, context-rich prompts to get useful outputs from an LLM. 

Poor Prompt Example: 

“Tell me about the car.” 

Better Prompt: 

“Tell me about the 2024 Audi A4, including key features and performance highlights.” 

The difference? Specificity and context. 

In Salesforce, prompts go beyond static text. They can dynamically inject CRM data such as: 

  • Account names 
  • Contacts 
  • Related records 
  • Opportunity details 

This enables reusable, reliable prompts that consistently generate meaningful business content. 
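The dynamic-injection idea can be sketched as a reusable template whose placeholders are merged from a CRM record. This is a minimal Python illustration only; the field names and `{placeholder}` syntax are assumptions, not Salesforce Prompt Builder's actual merge-field notation.

```python
# Minimal sketch of dynamic prompt grounding. The record fields and
# {placeholder} syntax are illustrative assumptions, not Salesforce's
# actual merge-field notation.

TEMPLATE = (
    "Draft a follow-up email to {contact_name} at {account_name} "
    "about the open opportunity '{opportunity_name}' "
    "(stage: {stage}, amount: ${amount:,.0f})."
)

def build_prompt(record: dict) -> str:
    """Merge CRM record fields into the reusable template."""
    return TEMPLATE.format(**record)

prompt = build_prompt({
    "contact_name": "Dana Lee",
    "account_name": "Acme Corp",
    "opportunity_name": "Q3 Renewal",
    "stage": "Negotiation",
    "amount": 45000,
})
print(prompt)
```

Because the template is fixed and only the record data varies, the same prompt produces consistent output shape across every account it is run against.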

Salesforce Prompt Builder: Turning Prompts into Reusable Assets 

Salesforce Prompt Builder provides a declarative, admin-friendly way to design prompts that users can consume without needing to understand AI mechanics. 

With Prompt Builder, teams can: 

  • Create standardized prompts for emails, summaries, and field generation 
  • Dynamically merge Salesforce record data into prompts 
  • Select different LLMs based on tone, verbosity, or use case 
  • Preview and refine AI outputs before deployment 

Instead of every user “guessing” how to ask AI a question, organizations define prompts once and reuse them everywhere. 
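One way to picture "define once, reuse everywhere" is a central library that pins both a template and a model choice per use case. Everything here—the library structure, the model names, the field names—is a hypothetical sketch, not a real Salesforce API.

```python
# Hedged sketch of a reusable prompt library. Each use case fixes a
# template plus a model choice (for tone or verbosity). All names are
# illustrative assumptions, not a real Salesforce API.

PROMPT_LIBRARY = {
    "case_summary": {
        "model": "fast-concise-model",   # hypothetical model name
        "template": "Summarize case {case_number} ('{subject}') in two sentences.",
    },
    "sales_email": {
        "model": "long-form-model",      # hypothetical model name
        "template": "Write a warm follow-up email to {contact_name} at {account_name}.",
    },
}

def resolve_prompt(use_case: str, **fields) -> tuple:
    """Return the (model, merged prompt) pair for a named use case."""
    entry = PROMPT_LIBRARY[use_case]
    return entry["model"], entry["template"].format(**fields)

model, prompt_text = resolve_prompt("case_summary",
                                    case_number="00012345",
                                    subject="Login failure")
print(model, "->", prompt_text)
```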

Salesforce Copilot: Conversational AI for Everyday Work 

Salesforce Copilot acts as a conversational assistant embedded directly in the CRM UI. It uses Prompt Builder, flows, and business logic behind the scenes to help users complete tasks faster. 

Examples demonstrated in the webinar include: 

  • Summarizing accounts and cases 
  • Finding high-priority open cases for a specific contact 
  • Drafting personalized sales emails 
  • Navigating records without opening multiple tabs 

Importantly, Copilot actions can call Salesforce Flows, making AI an orchestration layer—not just a chatbot. 
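The "orchestration layer" point can be sketched as a dispatcher that routes a recognized user intent to a registered action, with each action standing in for a Salesforce Flow. The intents, action names, and return strings below are all hypothetical.

```python
# Hedged sketch of intent-to-action dispatch, illustrating how a
# conversational assistant can invoke workflows rather than just chat.
# Each function is a stand-in for a Salesforce Flow; all names are
# illustrative assumptions.

def find_high_priority_cases(contact: str) -> str:
    # Stand-in for a Flow that queries open cases for a contact.
    return f"2 high-priority open cases found for {contact}"

def draft_sales_email(contact: str) -> str:
    # Stand-in for a Flow that drafts a personalized email.
    return f"Draft email prepared for {contact}"

ACTIONS = {
    "find_cases": find_high_priority_cases,
    "draft_email": draft_sales_email,
}

def handle(intent: str, contact: str) -> str:
    """Dispatch a recognized intent to its action, or fall back."""
    action = ACTIONS.get(intent)
    if action is None:
        return "Sorry, I can't do that yet."
    return action(contact)

print(handle("find_cases", "Dana Lee"))
```

The design point is that the assistant's value comes from the actions it can call, not from the conversation itself.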

Internal vs. External AI Use Cases 

One important distinction highlighted in the discussion is internal versus external use of generative AI. 

You don’t have to start by sending AI-generated content to customers. Many organizations begin with: 

  • Internal summaries 
  • Faster data consumption 
  • Sales and service productivity improvements 

This lowers risk while still delivering immediate ROI. 

The Einstein One Platform and Trust Layer 

Salesforce positions all of this under the Einstein One Platform, which unifies AI, automation, analytics, and CRM data. 

A critical component is the Salesforce Trust Layer, which ensures: 

  • Customer data is not used to train public AI models 
  • Sensitive fields can be masked 
  • AI responses are grounded only in authorized data 

This addresses one of the biggest concerns organizations have about adopting generative AI: data security and trust. 
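The field-masking idea can be sketched as scrubbing sensitive values from a record before any prompt is assembled, so they never reach the model. The field list and mask token below are assumptions for illustration, not Salesforce's actual Trust Layer behavior.

```python
# Hedged sketch of pre-prompt data masking, illustrating the Trust
# Layer idea that sensitive fields never reach the model. The field
# list and mask token are illustrative assumptions.

SENSITIVE_FIELDS = {"ssn", "credit_card", "date_of_birth"}

def mask_record(record: dict) -> dict:
    """Replace sensitive field values with a mask token."""
    return {
        key: "***MASKED***" if key in SENSITIVE_FIELDS else value
        for key, value in record.items()
    }

safe = mask_record({
    "name": "Dana Lee",
    "ssn": "123-45-6789",
    "account": "Acme Corp",
})
print(safe)
```

Masking happens before prompt assembly, which is the key property: the model can still ground its answer in the non-sensitive fields without ever seeing the masked ones.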

Why This Matters Now 

Salesforce Copilot and Prompt Builder are not futuristic concepts—they are available today through Einstein One editions, add-ons, and trial orgs. 

The real shift is not just AI inside Salesforce, but: 

  • AI that understands intent 
  • AI that executes workflows 
  • AI that reduces friction across sales, service, and operations 

As Derek notes, many of these tasks previously took too long to bother with. AI makes them fast enough to actually happen. 

Final Thoughts 

Salesforce’s approach to generative AI is pragmatic: 

  • Build trust first 
  • Ground AI in real data 
  • Empower admins with declarative tools 
  • Deliver value through everyday workflows 

Prompt engineering and Copilot are not about replacing users—they’re about removing busywork and amplifying human decision-making. 
