integration-patterns
Purpose
This document answers two questions: How can the Claude Agent SDK integrate with Salesforce to read uploaded files? And can the Agent SDK run inside a Salesforce instance, or does it need an external API endpoint?
Executive Summary
The Claude Agent SDK CANNOT run directly inside Salesforce. It requires external hosting (Python 3.10+ or Node.js 18+) in a sandboxed container environment. Integration happens through:
- External hosting - Heroku, AWS, Azure, or other cloud platforms
- API bridge - REST APIs to access Salesforce ContentVersion for files
- MCP servers - Optional Model Context Protocol servers for tool integration
- Authentication - OAuth 2.0 flows for secure access
Claude Agent SDK: Architecture & Requirements
What Is Claude Agent SDK?
The Claude Agent SDK (formerly Claude Code SDK) is an open-source framework from Anthropic that enables building autonomous AI agents powered by Claude. The SDK provides:
- Autonomous tool orchestration
- Session management
- File reading/writing capabilities
- Command execution
- Web search integration
- Integrated agent loop for context gathering and verification
Available in Python and TypeScript/Node.js.
Runtime Requirements
Software Prerequisites:
- Python 3.10+ OR Node.js 18+
- Sandboxed container environment (Docker, gVisor, Firecracker, or Vercel Sandbox)
- Anthropic API key OR cloud provider credentials (AWS Bedrock, Google Vertex AI, Azure)
Why It Can’t Run in Salesforce:
- Salesforce runs Apex (Java-like language) and Lightning Web Components (JavaScript)
- No native Python or Node.js runtime inside Salesforce org
- Agent SDK needs filesystem access, command execution, and container isolation
- Salesforce Functions (serverless Node.js) was retired in early 2025 and never supported full Agent SDK requirements
Conclusion: Agent SDK must run externally and integrate via APIs.
Salesforce File Access Architecture
Core Objects for File Management
Salesforce file handling uses four interconnected objects:
- ContentVersion - Stores actual file content and metadata
- ContentDocument - Represents the logical file (independent of versions)
- ContentDocumentLink - Shares files with users, records, and groups
- ContentDistribution - Creates public/authenticated download URLs
Key Fields:
- ContentLocation: "S" (inside Salesforce), "E" (external), "L" (social network)
- VersionData: Binary file content (BLOB)
- PathOnClient: Original filename
- Title: Display name
- ContentDocumentId: Reference to parent document
REST API for File Access
Upload Files (Multipart POST):

```http
POST /services/data/v66.0/sobjects/ContentVersion
Content-Type: multipart/form-data

--boundary
Content-Disposition: form-data; name="entity_content"
Content-Type: application/json

{ "PathOnClient": "example.pdf", "Title": "Example Document" }
--boundary
Content-Disposition: form-data; name="VersionData"; filename="example.pdf"
Content-Type: application/pdf

[binary data]
--boundary--
```

Download Files:

```http
GET /services/data/v66.0/sobjects/ContentVersion/{id}/VersionData
Authorization: Bearer {access_token}
```

Download Multiple Files as ZIP:

```http
GET /sfc/servlet.shepherd/version/download/{version1}/{version2}/...
```

Query for Files:

```sql
SELECT Id, Title, FileExtension, ContentSize, ContentDocumentId, VersionData
FROM ContentVersion
WHERE ContentDocumentId = '{documentId}'
ORDER BY CreatedDate DESC
LIMIT 1
```

Maximum File Size:
- ContentVersion: 2 GB
- Other objects: 500 MB
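The multipart upload layout shown above can be generated programmatically. Below is a minimal Python sketch (standard library only) that builds the request body and Content-Type header; the part names (`entity_content`, `VersionData`) follow the Salesforce REST API examples above, while the function name itself is illustrative:

```python
import json
import uuid

def build_contentversion_upload(title: str, path_on_client: str, file_bytes: bytes,
                                content_type: str = "application/pdf"):
    """Build a multipart/form-data body for a ContentVersion upload.

    Returns (body_bytes, content_type_header): a JSON part named
    'entity_content' followed by a binary part named 'VersionData'.
    """
    boundary = uuid.uuid4().hex
    entity = json.dumps({"PathOnClient": path_on_client, "Title": title})
    json_part = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="entity_content"\r\n'
        f"Content-Type: application/json\r\n\r\n{entity}\r\n"
    )
    file_part_header = (
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="VersionData"; filename="{path_on_client}"\r\n'
        f"Content-Type: {content_type}\r\n\r\n"
    )
    body = (json_part.encode() + file_part_header.encode()
            + file_bytes + f"\r\n--{boundary}--\r\n".encode())
    return body, f"multipart/form-data; boundary={boundary}"

# Usage: POST the body to /services/data/v66.0/sobjects/ContentVersion
body, header = build_contentversion_upload("Example Document", "example.pdf", b"%PDF-1.4")
```

In practice an HTTP client such as `requests` can build the multipart body for you; constructing it by hand, as here, makes the wire format explicit.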
Integration Pattern 1: Heroku + Heroku Connect
Overview
Host the Claude Agent SDK on Heroku (Node.js or Python dyno) and sync Salesforce data bidirectionally using Heroku Connect.
Architecture
```
Salesforce Org
  ↕ (Heroku Connect - bidirectional sync)
Heroku Postgres Database
  ↕ (native Postgres connection)
Node.js/Python App with Claude Agent SDK
  → (Anthropic API)
Claude Model
```

How It Works
- Heroku Connect continuously syncs Salesforce objects (including ContentVersion) to Heroku Postgres
- Claude Agent SDK queries Postgres directly for metadata
- For file content, SDK makes REST API calls to Salesforce ContentVersion endpoint
- Agent processes files and can write results back to Salesforce via Postgres sync
Implementation Steps
- Deploy Node.js or Python app to Heroku
- Add Heroku Connect add-on
- Configure sync for ContentVersion, ContentDocument, ContentDocumentLink
- Install JSforce (Node.js) or Simple-Salesforce (Python) for REST API calls
- Implement MCP server (optional) to expose Salesforce file tools to Agent SDK
- Configure OAuth 2.0 Connected App in Salesforce
Benefits
- Low latency - Postgres queries are fast
- Protects Salesforce API limits - Heroku Connect handles bulk sync
- Real-time sync - Changes flow bidirectionally
- Native Node.js/Python - Full Agent SDK support
Costs
- Heroku dyno: ~$25/month (Eco/Basic)
- Heroku Postgres: $50/month
- Heroku Connect: $100/month (handles sync)
- Container runtime: ~$36/month (for 24/7)
Sources:
- Heroku Connect Overview
- Salesforce Heroku Integration Guide
- Integrating Heroku and Salesforce Platform
Integration Pattern 2: Heroku AppLink (2025-2026)
Overview
Heroku AppLink is a new add-on (announced 2025) that exposes Heroku apps as native API services in Salesforce, usable in Flow, Apex, and Agentforce.
Architecture
```
Salesforce Flow/Apex/Agentforce
  ↓ (AppLink - automatic auth & discovery)
Heroku Node.js/Python Service
  ↓ (contains Claude Agent SDK)
Claude Model
```

How It Works
- Deploy Node.js/Python service with Claude Agent SDK to Heroku
- Add Heroku AppLink add-on
- AppLink auto-generates Salesforce Actions from your service endpoints
- Salesforce calls your service with automatic OAuth validation
- Service uses Agent SDK to process requests and access files via ContentVersion API
Key Features
- Automatic authentication - No stored credentials, short-lived tokens
- Service mesh - Managed bridge between Salesforce and Heroku
- Request validation - Built-in security
- SDK support - Primary focus on Node.js and Python
- Native Salesforce actions - Use in Flow, Apex, Agentforce
Implementation
- Create Heroku app with Node.js/Python + Agent SDK
- Define REST endpoints that accept Salesforce payloads
- Add AppLink add-on to expose endpoints
- Configure Connected App in Salesforce
- Use generated actions in Salesforce Flow or Apex
Benefits
- Native Salesforce integration - No custom API management
- Secure by default - Managed credentials
- Agentforce compatible - Can power Salesforce’s native agents
- Zero stored credentials - Short-lived tokens only
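Since AppLink handles authentication, the service's main job is validating the payload Salesforce sends. A Python sketch of that validation step; the field names (`contentVersionId`, `requestedAnalysis`) are hypothetical examples rather than an AppLink contract, though the `068` key prefix is the standard ContentVersion record Id prefix:

```python
def validate_invocation(payload: dict) -> dict:
    """Validate the JSON body a Salesforce action sends to this service.

    Field names here are illustrative, not an AppLink contract.
    Raises ValueError on a malformed request.
    """
    required = ("contentVersionId", "requestedAnalysis")
    missing = [k for k in required if not payload.get(k)]
    if missing:
        raise ValueError(f"missing fields: {', '.join(missing)}")
    # ContentVersion record Ids start with the '068' key prefix
    if not payload["contentVersionId"].startswith("068"):
        raise ValueError("contentVersionId must be a ContentVersion Id (prefix 068)")
    return payload
```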
Integration Pattern 3: Self-Hosted + Direct REST API
Overview
Host Claude Agent SDK on any cloud platform (AWS, Azure, GCP, self-hosted) and integrate via direct REST API calls to Salesforce.
Architecture
```
Docker Container (AWS/Azure/GCP/Self-Hosted)
  → Python/Node.js with Claude Agent SDK
  → REST API calls to Salesforce
     • OAuth 2.0 authentication
     • ContentVersion API for files
     • SOQL queries for metadata
```

How It Works
- Deploy containerized Agent SDK app (Docker, gVisor, Firecracker)
- Configure Salesforce Connected App with OAuth 2.0
- Implement authentication flow (Username-Password or JWT Bearer)
- Query files using SOQL and download via ContentVersion API
- Process with Agent SDK
- Write results back via Salesforce REST API
Implementation (Node.js Example)
```js
const jsforce = require('jsforce');
const { ClaudeAgent } = require('@anthropic-ai/agent-sdk');

// Authenticate with Salesforce
const conn = new jsforce.Connection({
  oauth2: {
    clientId: process.env.SF_CLIENT_ID,
    clientSecret: process.env.SF_CLIENT_SECRET,
    redirectUri: process.env.SF_REDIRECT_URI
  }
});

await conn.login(username, password);

// Query for files
const files = await conn.query(`
  SELECT Id, Title, FileExtension, ContentDocumentId
  FROM ContentVersion
  WHERE Title LIKE 'Contract%'
  ORDER BY CreatedDate DESC
`);

// Download file content
const fileData = await conn.request(
  `/services/data/v66.0/sobjects/ContentVersion/${files.records[0].Id}/VersionData`
);

// Process with Claude Agent SDK
const agent = new ClaudeAgent({
  apiKey: process.env.ANTHROPIC_API_KEY,
  mcpServers: { salesforce: salesforceMcpServer }
});

const result = await agent.query({
  prompt: `Analyze this contract: ${fileData}`,
  maxTurns: 5
});
```

Implementation (Python Example)
```python
import os

import requests
from simple_salesforce import Salesforce
from anthropic_agent_sdk import ClaudeAgent

# Authenticate
sf = Salesforce(
    username=os.getenv('SF_USERNAME'),
    password=os.getenv('SF_PASSWORD'),
    security_token=os.getenv('SF_SECURITY_TOKEN')
)

# Query files
files = sf.query("""
    SELECT Id, Title, FileExtension, ContentDocumentId
    FROM ContentVersion
    WHERE Title LIKE 'Contract%'
    ORDER BY CreatedDate DESC
    LIMIT 10
""")

# Download file
file_id = files['records'][0]['Id']
file_url = f"{sf.base_url}/sobjects/ContentVersion/{file_id}/VersionData"
headers = {'Authorization': f'Bearer {sf.session_id}'}
file_content = requests.get(file_url, headers=headers).content

# Process with Agent SDK
agent = ClaudeAgent(api_key=os.getenv('ANTHROPIC_API_KEY'))
result = agent.query(prompt=f"Analyze this file: {file_content}")
```

Benefits
- Full control - Any hosting provider
- Cost flexibility - Choose your infrastructure
- No vendor lock-in - Platform agnostic
- Custom security - Implement your own controls
Challenges
- Manual authentication - Must implement OAuth flow
- API limit management - Need to track Salesforce API usage
- No automatic sync - Must poll or use webhooks
Official Salesforce MCP Servers
Salesforce DX MCP Server (Official)
Repository: salesforcecli/mcp
The official Salesforce DX MCP Server provides 60+ tools across toolsets for developer workflows:
| Toolset | Purpose |
|---|---|
| orgs | Manage authorized orgs |
| metadata | Deploy/retrieve metadata |
| data | Run SOQL queries |
| users | Assign permission sets |
| testing | Run Apex/agent tests |
| lwc-experts | LWC development & testing |
| code-analysis | Static code analysis |
| mobile | Mobile LWC development |
File Access Capability: The DX MCP Server does NOT support ContentVersion file downloads or binary file access. The data toolset can run SOQL queries against ContentVersion (metadata only), but there is no dedicated file download tool.
Salesforce Hosted MCP Servers (Beta → GA Feb 2026)
Salesforce is rolling out platform-level hosted MCP servers:
- Zero code - Exposes Salesforce APIs as MCP tools without writing code
- Enterprise security - Centralized registry, policy enforcement, rate limiting
- AgentExchange - Curated catalog of vetted MCP servers
- Agentforce native MCP client - In pilot, enables agents to connect to any MCP server
GA targeted for February 2026. File access capabilities at GA are TBD.
MuleSoft MCP Server
Convert any API or existing Mule application to an MCP server via Anypoint Platform. This could wrap the ContentVersion REST API for file access.
Third-Party MCP Servers with File Access
| Server | ContentVersion Support | Notes |
|---|---|---|
| CData MCP Server | Yes | Explicit ContentVersion support including file versioning |
| AiondaDotCom MCP | Yes | Full CRUD + file backup/download of ContentVersions, Attachments, Documents |
| tsmztech MCP | Yes | Generic CRUD, can query/modify any Salesforce object |
Bottom Line: For file access, use a third-party MCP server or build a lightweight custom one (~50 lines of code wrapping ContentVersion /VersionData).
Sources:
- Official Salesforce DX MCP Server
- Introducing MCP Support Across Salesforce
- Salesforce Hosted MCP Servers Beta
- Agentforce MCP Support
Integration Pattern 4: MCP Server as Bridge
Overview
Build a custom Model Context Protocol (MCP) server that acts as a bridge between Claude Agent SDK and Salesforce.
Architecture
```
Claude Agent SDK
  ↓ (MCP protocol)
Custom MCP Server
  ↓ (REST API / JSforce)
Salesforce ContentVersion API
```

How It Works
- Create MCP server with tools like salesforce_list_files, salesforce_read_file, salesforce_upload_file
- MCP server handles Salesforce authentication (OAuth)
- Agent SDK calls MCP tools by name: mcp__salesforce__read_file
- MCP server translates to Salesforce API calls
- Returns file content to agent
MCP Server Implementation (Conceptual)
```js
import { MCPServer, Tool } from '@anthropic-ai/mcp-sdk';
import jsforce from 'jsforce';

const server = new MCPServer({
  name: 'salesforce',
  version: '1.0.0',
  tools: [
    {
      name: 'list_files',
      description: 'List files in Salesforce',
      inputSchema: {
        type: 'object',
        properties: { query: { type: 'string' } }
      },
      execute: async (input) => {
        const conn = await authenticateWithSalesforce();
        const result = await conn.query(`
          SELECT Id, Title, FileExtension, ContentSize
          FROM ContentVersion
          WHERE Title LIKE '%${input.query}%'
        `);
        return result.records;
      }
    },
    {
      name: 'read_file',
      description: 'Read file content from Salesforce',
      inputSchema: {
        type: 'object',
        properties: { fileId: { type: 'string' } },
        required: ['fileId']
      },
      execute: async (input) => {
        const conn = await authenticateWithSalesforce();
        const url = `/services/data/v66.0/sobjects/ContentVersion/${input.fileId}/VersionData`;
        const content = await conn.request(url);
        return { content };
      }
    }
  ]
});

server.listen();
```

Agent SDK Configuration
```python
import os

from anthropic_agent_sdk import ClaudeAgent

agent = ClaudeAgent(
    api_key=os.getenv('ANTHROPIC_API_KEY'),
    mcp_servers={
        'salesforce': {
            'type': 'stdio',
            'command': 'node',
            'args': ['mcp-salesforce-server.js']
        }
    }
)

# Agent can now use Salesforce tools
result = agent.query(
    prompt="List all contract files in Salesforce and summarize the latest one",
    allow_tools=['mcp__salesforce__list_files', 'mcp__salesforce__read_file']
)
```

Benefits
- Abstraction - Agent doesn’t need Salesforce knowledge
- Reusable - MCP server can be used by multiple agents
- Permission control - Fine-grained tool permissions
- Authentication isolation - Credentials managed in MCP server only
MCP Server Hosting Options
- In-process SDK server - Run MCP server in same Python/Node.js process
- External subprocess - Separate process communicating via stdio
- Remote HTTP/SSE server - Deploy MCP server separately, access via HTTP
Integration Pattern 5: Event-Driven with Platform Events
Overview
Use Salesforce Platform Events for event-driven integration where Salesforce triggers the Agent SDK.
Architecture
```
Salesforce Org
  → Platform Event published (e.g., FileUploadedEvent)
  → External subscriber (Node.js/Python listener)
  → Triggers Claude Agent SDK
  → Processes file
  → Writes results back to Salesforce
```

How It Works
- Define Platform Event in Salesforce (e.g., File_Uploaded__e)
- Create Apex trigger that publishes event when file is uploaded
- External service subscribes to event via Streaming API or CometD
- When event received, trigger Agent SDK workflow
- Agent fetches file via ContentVersion API and processes
- Write results back to Salesforce via REST API
Implementation (Node.js)
```js
const jsforce = require('jsforce');
const { ClaudeAgent } = require('@anthropic-ai/agent-sdk');

const conn = new jsforce.Connection({ /* auth */ });

// Subscribe to Platform Event
conn.streaming.topic('/event/File_Uploaded__e').subscribe((message) => {
  const fileId = message.payload.File_Id__c;

  // Trigger agent workflow
  processFileWithAgent(fileId);
});

async function processFileWithAgent(fileId) {
  const fileData = await conn.request(
    `/services/data/v66.0/sobjects/ContentVersion/${fileId}/VersionData`
  );

  const agent = new ClaudeAgent({ apiKey: process.env.ANTHROPIC_API_KEY });
  const result = await agent.query({ prompt: `Process this file: ${fileData}` });

  // Write result back to Salesforce
  await conn.sobject('File_Analysis__c').create({
    File_Id__c: fileId,
    Analysis_Result__c: result.output
  });
}
```

Benefits
- Real-time - Immediate response to file uploads
- Decoupled - Salesforce and Agent SDK don’t need tight coupling
- Scalable - Handle high-volume events
- Asynchronous - Non-blocking operations
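Reliability in this pattern hinges on replay: the Streaming API lets a subscriber resume from the last event it processed (replayId -1 means "new events only", -2 means "all retained events"; platform events are retained for roughly 72 hours). A minimal Python sketch of the replayId bookkeeping a durable subscriber needs; `ReplayTracker` is an illustrative helper, not part of any SDK:

```python
class ReplayTracker:
    """Track the last-seen replayId per Platform Event channel.

    A restarted subscriber can resume from the stored replayId to
    recover events published while it was down, instead of starting
    with -1 (new events only) and missing them.
    """

    def __init__(self):
        self._last = {}  # channel -> highest replayId seen

    def resume_from(self, channel: str) -> int:
        # -1 = new events only (the default when nothing is stored)
        return self._last.get(channel, -1)

    def record(self, channel: str, replay_id: int) -> None:
        # Events may arrive out of order after a reconnect; keep the max
        if replay_id > self._last.get(channel, -1):
            self._last[channel] = replay_id
```

In production this state would live in durable storage (Postgres, Redis, a file), so the subscriber survives pod restarts.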
Integration Pattern 6: AWS/EKS Self-Hosted with Salesforce File Push
Overview
Host the Claude Agent SDK on AWS EKS (or any self-hosted Kubernetes) and expose a REST API endpoint that Salesforce calls directly, optionally pushing the file content in the request body.
Can Salesforce Push the File Directly?
Yes, but with limitations. Salesforce can send file binary data to an external endpoint via Apex callout using setBodyAsBlob(). However, there are critical governor limits:
| Constraint | Limit |
|---|---|
| Heap size (synchronous) | 6 MB |
| Heap size (asynchronous) | 12 MB |
| Single callout timeout | 120 seconds |
| Total callout timeout per transaction | 120 seconds |
| Max callouts per transaction | 100 |
| Max request/response size | 12 MB (async) |
This means files > ~6 MB cannot be pushed directly from Salesforce. For larger files, use the hybrid push-notification + pull pattern.
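The governor limits above reduce to a simple routing rule at upload time. A Python sketch of that decision; the halving headroom factor is an assumption (the Blob plus request overhead must fit within the Apex heap), and `choose_transfer_mode` is an illustrative name:

```python
SYNC_HEAP_LIMIT = 6_000_000    # ~6 MB synchronous Apex heap
ASYNC_HEAP_LIMIT = 12_000_000  # ~12 MB asynchronous Apex heap

def choose_transfer_mode(content_size: int, async_context: bool = True) -> str:
    """Decide how a file should reach the external Agent SDK endpoint.

    'push'        -> Apex callout sends the binary via setBodyAsBlob()
    'notify-pull' -> Apex sends a JSON notification; the endpoint pulls
                     the file from the ContentVersion REST API
    """
    limit = ASYNC_HEAP_LIMIT if async_context else SYNC_HEAP_LIMIT
    # Headroom assumption: the Blob plus request overhead must fit in
    # the heap, so only push files well under the limit
    if content_size < limit // 2:
        return "push"
    return "notify-pull"
```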
Architecture Option A: Push Model (Small Files < 6 MB)
```
Salesforce Org
  → Apex Trigger on ContentVersion insert
  → Apex Callout (async @future or Queueable)
  → POST https://your-eks-endpoint.com/api/process-file
     → Body: binary file via setBodyAsBlob()
     → Headers: file metadata (title, extension, record ID)

AWS EKS
  → REST API receives file + metadata
  → Passes to Claude Agent SDK
  → Returns result
  → (Optional) Writes back to Salesforce via REST API
```

Apex Implementation (Push Model)
```apex
// Trigger: fires when a new file is uploaded
trigger ContentVersionTrigger on ContentVersion (after insert) {
    for (ContentVersion cv : Trigger.new) {
        // Call async to avoid "uncommitted work" error
        FileProcessorService.processFileAsync(cv.Id);
    }
}

// Async service
public class FileProcessorService {
    @future(callout=true)
    public static void processFileAsync(Id contentVersionId) {
        ContentVersion cv = [
            SELECT Id, Title, FileExtension, ContentSize, VersionData, ContentDocumentId
            FROM ContentVersion
            WHERE Id = :contentVersionId
            LIMIT 1
        ];

        // Guard: only push files under 6 MB
        if (cv.ContentSize > 6000000) {
            // Fall back to notification-only (let external service pull)
            sendNotificationOnly(cv);
            return;
        }

        HttpRequest req = new HttpRequest();
        req.setEndpoint('callout:Claude_Agent_Endpoint/api/process-file');
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/octet-stream');
        req.setHeader('X-File-Title', cv.Title);
        req.setHeader('X-File-Extension', cv.FileExtension);
        req.setHeader('X-ContentVersion-Id', cv.Id);
        req.setHeader('X-ContentDocument-Id', cv.ContentDocumentId);
        req.setBodyAsBlob(cv.VersionData);
        req.setTimeout(120000);

        Http http = new Http();
        HttpResponse res = http.send(req);

        if (res.getStatusCode() == 200) {
            // Parse response and create analysis record
            Map<String, Object> result =
                (Map<String, Object>) JSON.deserializeUntyped(res.getBody());
            File_Analysis__c analysis = new File_Analysis__c(
                ContentVersion_Id__c = cv.Id,
                Analysis_Result__c = (String) result.get('analysis')
            );
            insert analysis;
        }
    }

    private static void sendNotificationOnly(ContentVersion cv) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('callout:Claude_Agent_Endpoint/api/process-file-notification');
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/json');
        req.setBody(JSON.serialize(new Map<String, String>{
            'contentVersionId' => cv.Id,
            'contentDocumentId' => cv.ContentDocumentId,
            'title' => cv.Title,
            'extension' => cv.FileExtension,
            'size' => String.valueOf(cv.ContentSize)
        }));
        new Http().send(req);
    }
}
```

Architecture Option B: Hybrid Push-Notification + Pull (Any File Size)
Recommended for production. Salesforce sends a lightweight notification with file metadata, and the AWS endpoint pulls the file directly from Salesforce.
```
Salesforce Org
  → Apex Trigger / Flow on ContentVersion insert
  → POST notification to AWS endpoint
     → JSON: { contentVersionId, title, size, extension }
     → NO binary file in body

AWS EKS
  → REST API receives notification
  → Authenticates to Salesforce (OAuth 2.0 JWT Bearer)
  → GET /services/data/v66.0/sobjects/ContentVersion/{id}/VersionData
  → Downloads file (no size limit via REST API)
  → Passes to Claude Agent SDK
  → Returns result to Salesforce via REST API
```

AWS Endpoint Implementation (Node.js)
```js
const express = require('express');
const jsforce = require('jsforce');
const { ClaudeAgent } = require('@anthropic-ai/agent-sdk');

const app = express();

// Option A: Receive file directly (small files)
app.post('/api/process-file', express.raw({ limit: '12mb', type: '*/*' }), async (req, res) => {
  const fileBuffer = req.body;
  const title = req.headers['x-file-title'];
  const extension = req.headers['x-file-extension'];
  const cvId = req.headers['x-contentversion-id'];

  const agent = new ClaudeAgent({ apiKey: process.env.ANTHROPIC_API_KEY });
  const result = await agent.query({
    prompt: `Analyze this ${extension} file titled "${title}": ${fileBuffer.toString('base64')}`,
    maxTurns: 5
  });

  res.json({ analysis: result.output, contentVersionId: cvId });
});

// Option B: Receive notification, pull file from Salesforce
app.post('/api/process-file-notification', express.json(), async (req, res) => {
  const { contentVersionId, title, extension, size } = req.body;

  // Acknowledge immediately (Salesforce has callout timeout limits)
  res.json({ status: 'accepted', contentVersionId });

  // Process asynchronously
  setImmediate(async () => {
    // Authenticate to Salesforce
    const conn = new jsforce.Connection({ /* OAuth config */ });
    await conn.login(username, password);

    // Pull file from Salesforce (no size limit)
    const fileData = await conn.request(
      `/services/data/v66.0/sobjects/ContentVersion/${contentVersionId}/VersionData`
    );

    // Process with Agent SDK
    const agent = new ClaudeAgent({ apiKey: process.env.ANTHROPIC_API_KEY });
    const result = await agent.query({
      prompt: `Analyze this ${extension} file: ${fileData}`,
      maxTurns: 5
    });

    // Write results back to Salesforce
    await conn.sobject('File_Analysis__c').create({
      ContentVersion_Id__c: contentVersionId,
      Analysis_Result__c: result.output
    });
  });
});

app.listen(3000);
```

Architecture Option C: Flow-Driven (No-Code Trigger)
Use Salesforce Flow instead of Apex for simpler setups:
```
Salesforce Flow Builder
  → Record-Triggered Flow on ContentDocumentLink (after insert)
  → HTTP Callout Action (or Outbound Message)
  → POST to AWS endpoint with file metadata

AWS EKS
  → Receives notification
  → Pulls file via REST API
  → Processes with Agent SDK
```

Note: Outbound Messages use SOAP format. For REST/JSON, use Apex HTTP Callout or Streamscript for Flow.
Push vs Pull: Decision Matrix
| Factor | Push (Apex sends file) | Hybrid (notify + pull) | Pull (agent polls/subscribes) |
|---|---|---|---|
| File size limit | 6 MB sync / 12 MB async | Unlimited (REST API) | Unlimited (REST API) |
| Latency | Lowest (single hop) | Medium (2 hops) | Highest (polling interval) |
| SF API usage | 1 callout | 1 callout + 1 inbound REST | 1+ inbound REST per poll |
| Complexity | Simple | Medium | Medium-High |
| Reliability | Dependent on callout success | Can retry pull independently | Depends on polling strategy |
| Best for | Small text files, metadata | Production workloads, any file size | Batch processing, scheduled jobs |
Recommendation for AWS/EKS
Use the Hybrid model (Option B):
- Salesforce sends lightweight JSON notification on file upload
- AWS endpoint acknowledges immediately (within SF callout timeout)
- AWS pulls file from Salesforce asynchronously (no size limit)
- Agent SDK processes file
- Results written back to Salesforce
This gives you:
- No file size constraints
- Decoupled processing (can queue and retry)
- Minimal Salesforce governor limit impact
- Full control over processing pipeline on EKS
AWS/EKS Specific Considerations
- Ingress: Use AWS ALB Ingress Controller for HTTPS termination
- Authentication: Validate Salesforce callout identity via Named Credentials + client certificate or shared secret
- Scaling: EKS HPA can scale pods based on request volume
- Storage: Use EBS or EFS for temporary file storage during processing
- Secrets: Store Salesforce OAuth credentials in AWS Secrets Manager
- Networking: Whitelist Salesforce IP ranges or use AWS PrivateLink
Sources:
- Post File From Salesforce Apex to External HTTP Webservices
- Apex REST Callouts (Trailhead)
- Salesforce Governor Limits for Files
- Sending Outbound Messages from Flow
Recommended Architecture
For AWS/EKS: Hybrid Notification + Pull with Agent SDK
```
┌─────────────────────────────────────────────────────────┐
│ Salesforce Org                                          │
│  • ContentVersion (files stored)                        │
│  • Apex Trigger on ContentVersion insert                │
│     → Small files (<6MB): Push binary via callout       │
│     → Large files (>6MB): Send JSON notification only   │
└──────────────────┬──────────────────────────────────────┘
                   │ HTTPS (Named Credentials / OAuth)
                   ▼
┌─────────────────────────────────────────────────────────┐
│ AWS EKS Cluster                                         │
│  ┌─────────────────────────────────────────────────┐    │
│  │ ALB Ingress (HTTPS termination)                 │    │
│  └──────────┬──────────────────────────────────────┘    │
│             ▼                                           │
│  ┌─────────────────────────────────────────────────┐    │
│  │ Node.js/Python Service (Pod)                    │    │
│  │  • POST /api/process-file (receives push)       │    │
│  │  • POST /api/process-notification (hybrid)      │    │
│  │                                                 │    │
│  │  ├→ Pull file from SF ContentVersion API ───────┼──→ Salesforce REST API
│  │  │   (OAuth 2.0 JWT Bearer, no size limit)      │    │
│  │  ├→ Claude Agent SDK                            │    │
│  │  │   • Process file content                     │    │
│  │  │   • (Optional) MCP server for SF tools       │    │
│  │  └→ Write results back to Salesforce ───────────┼──→ Salesforce REST API
│  └─────────────────────────────────────────────────┘    │
│                                                         │
│  AWS Secrets Manager (SF OAuth creds, API keys)         │
│  CloudWatch (logging + monitoring)                      │
└─────────────────────────────────────────────────────────┘
```

Why This Architecture?
- Hybrid push/pull - Small files arrive instantly; large files pulled without SF governor limits
- AWS/EKS - Full control, existing infrastructure, auto-scaling via HPA
- No Heroku dependency - Self-hosted, platform-agnostic approach
- MCP Server (optional) - Can run in-process for additional Salesforce tool access
- OAuth 2.0 JWT Bearer - Server-to-server auth, no stored passwords
- ContentVersion API - Direct file access with 2GB limit (pull model)
For Heroku Users: AppLink + MCP Server
```
┌─────────────────────────────────────────────────────────┐
│ Salesforce Org                                          │
│  • Flow / Apex / Agentforce                             │
│  • ContentVersion (files stored)                        │
└──────────────────┬──────────────────────────────────────┘
                   │ Heroku AppLink
                   │ (auto auth, service discovery)
                   ▼
┌─────────────────────────────────────────────────────────┐
│ Heroku Dyno (Node.js/Python)                            │
│  ┌────────────────────────────────────────────────┐     │
│  │ Claude Agent SDK                               │     │
│  │  • Session management                          │     │
│  │  • Tool orchestration                          │     │
│  └──────────┬─────────────────────────────────────┘     │
│             │ MCP Protocol                              │
│             ▼                                           │
│  ┌────────────────────────────────────────────────┐     │
│  │ Custom MCP Server (Salesforce Bridge)          │     │
│  │  • salesforce_list_files                       │     │
│  │  • salesforce_read_file                        │     │
│  │  • salesforce_upload_file                      │     │
│  │  • salesforce_query                            │     │
│  └──────────┬─────────────────────────────────────┘     │
└─────────────┼───────────────────────────────────────────┘
              │ REST API (OAuth 2.0)
              ▼
┌─────────────────────────────────────────────────────────┐
│ Salesforce REST API                                     │
│  • /services/data/v66.0/sobjects/ContentVersion/...     │
│  • /services/data/v66.0/query?q=SELECT...               │
└─────────────────────────────────────────────────────────┘
```

Authentication Patterns
OAuth 2.0 Username-Password Flow
IMPORTANT: SOAP login() API is deprecated and will be retired in Summer ‘27. Use OAuth 2.0.
```js
const jsforce = require('jsforce');

const conn = new jsforce.Connection({
  oauth2: {
    clientId: process.env.SF_CLIENT_ID,
    clientSecret: process.env.SF_CLIENT_SECRET,
    redirectUri: 'http://localhost:3000/oauth/callback'
  }
});

await conn.login(username, password + securityToken);
```

OAuth 2.0 JWT Bearer Flow
For server-to-server integration:
```python
import time

import jwt
import requests

# Create JWT
payload = {
    'iss': client_id,
    'sub': username,
    'aud': 'https://login.salesforce.com',
    'exp': int(time.time()) + 300
}

encoded_jwt = jwt.encode(payload, private_key, algorithm='RS256')

# Exchange for access token
response = requests.post('https://login.salesforce.com/services/oauth2/token', data={
    'grant_type': 'urn:ietf:params:oauth:grant-type:jwt-bearer',
    'assertion': encoded_jwt
})

access_token = response.json()['access_token']
```
- ✅ Use OAuth 2.0 (not SOAP login)
- ✅ Implement short-lived tokens (AppLink provides this)
- ✅ Run Agent SDK in sandboxed containers (Docker, gVisor, Firecracker)
- ✅ Use ephemeral containers for one-off tasks
- ✅ Implement network controls - restrict outbound connections
- ✅ Use MCP servers for credential isolation
- ✅ Set maxTurns to prevent infinite loops
- ✅ Monitor token costs - dominant expense
- ✅ Implement logging for audit trails
- ✅ Use ContentDistribution with expiry for public URLs
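For the last item above, a Python sketch of the sObject payload used to create a ContentDistribution with an expiry date (POST it to the ContentDistribution sObject endpoint). The field names follow the standard ContentDistribution object; the helper itself is illustrative, and the preference flags you need may differ:

```python
from datetime import datetime, timedelta, timezone

def content_distribution_payload(content_version_id: str, days_valid: int = 7) -> dict:
    """Build the sObject payload for a time-limited public download link.

    POST the returned dict to
    /services/data/vXX.X/sobjects/ContentDistribution (version per your org).
    """
    expiry = datetime.now(timezone.utc) + timedelta(days=days_valid)
    return {
        "Name": f"Share {content_version_id}",
        "ContentVersionId": content_version_id,
        # Expire the link instead of leaving it public forever
        "PreferencesExpires": True,
        "ExpiryDate": expiry.strftime("%Y-%m-%dT%H:%M:%SZ"),
        "PreferencesAllowViewInBrowser": True,
        "PreferencesLinkLatestVersion": True,
        "PreferencesNotifyOnVisit": False,
    }
```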
Performance Considerations
File Size Limits
- ContentVersion: 2 GB max
- Other objects: 500 MB max
API Limits
- Salesforce enforces API call limits (varies by edition)
- Use Heroku Connect for bulk operations to preserve limits
- Implement exponential backoff for rate limiting
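The backoff recommendation above can be sketched as a small retry wrapper. This is a generic illustration; in real code, catch only rate-limit failures (HTTP 429 or Salesforce's REQUEST_LIMIT_EXCEEDED error code) rather than every exception:

```python
import random
import time

def call_with_backoff(fn, max_retries=5, base_delay=0.5, sleep=time.sleep):
    """Retry fn() with exponential backoff plus jitter.

    Delay doubles each attempt (base, 2x, 4x, ...) with random jitter to
    avoid synchronized retries. `sleep` is injectable for testing.
    """
    for attempt in range(max_retries):
        try:
            return fn()
        except Exception:
            if attempt == max_retries - 1:
                raise  # out of retries; surface the error
            delay = base_delay * (2 ** attempt) + random.uniform(0, base_delay)
            sleep(delay)
```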
Latency
- Heroku Connect: Low latency (Postgres local)
- Direct REST API: ~100-300ms per call
- Bulk API: Single call for large datasets
Token Budgets
- Claude Agent SDK uses cl100k_base token counting
- Monitor context window consumption with MCP tools
- Keep MCP tools under 80 total (under 10 servers recommended)
Cost Analysis
| Component | Monthly Cost |
|---|---|
| Heroku Eco Dyno | $7 |
| Heroku Postgres (Mini) | $9 |
| Heroku Connect | $100 |
| Container runtime (24/7) | ~$36 |
| Infrastructure Total | ~$152 |
| Claude API (Sonnet) | Variable (usage-based) |
| Salesforce API calls | Included in edition |
Note: Token costs typically dominate infrastructure costs.
Deployment Checklist
- Choose hosting platform (Heroku recommended)
- Set up Salesforce Connected App with OAuth 2.0
- Configure OAuth credentials (client ID, secret, redirect URI)
- Deploy Agent SDK application (Python or Node.js)
- Implement MCP server for Salesforce integration (optional but recommended)
- Configure Heroku AppLink (if using Heroku)
- Test file upload to ContentVersion
- Test file download via REST API
- Test agent workflow end-to-end
- Set up monitoring and logging
- Configure maxTurns and timeout limits
- Implement error handling and retries
- Document API usage patterns
- Set up security controls (network restrictions, etc.)
Example: Complete Workflow
User uploads file to Salesforce
- File stored as ContentVersion object
- Platform Event published (optional)
Agent SDK triggered (via AppLink Action or API call)
- Heroku receives request from Salesforce
- Agent SDK initialized with MCP Salesforce server
- Agent calls
mcp__salesforce__list_filestool - MCP server authenticates and queries ContentVersion
- Agent calls
mcp__salesforce__read_filewith file ID - MCP server downloads file via REST API
- Agent processes file content
- Agent returns analysis to Heroku app
- Heroku app writes results back to Salesforce via REST API
Results visible in Salesforce
- Custom object record created with analysis
- User sees results in Salesforce UI
Conclusion
While Claude Agent SDK cannot run directly inside Salesforce, there are robust integration patterns available:
Best for 2026: Heroku AppLink + MCP Server
- Native Salesforce integration
- Automatic authentication
- Works with Agentforce
- Full Agent SDK capabilities
Alternative: Self-hosted + Direct REST API
- Platform agnostic
- Full control
- Custom security
Both patterns enable Claude Agent SDK to access Salesforce files via the ContentVersion API, providing powerful AI capabilities for Salesforce users.
Sources
Claude Agent SDK
- Agent SDK Overview
- Hosting the Agent SDK
- Secure Deployment Guide
- MCP in the Agent SDK
- Building Agents with Claude Agent SDK
- Enterprise MCP Deployment Guide
Salesforce Integration
- Salesforce File Upload Architecture
- ContentVersion Object Reference
- How to Upload Files via REST API
Heroku Integration
- Heroku Connect Overview
- Integrating Heroku and Salesforce
- Heroku AppLink Announcement
- Securing Salesforce with AppLink
- Salesforce Heroku Integration (Trailhead)
Official MCP Servers
- Official Salesforce DX MCP Server
- Introducing MCP Support Across Salesforce
- Salesforce Hosted MCP Servers Beta
- Agentforce MCP Support
- CData MCP Server - ContentVersion
- AiondaDotCom MCP Salesforce