Purpose

This document answers: How can the Claude Agent SDK integrate with Salesforce to read uploaded files? Can the Agent SDK run inside a Salesforce instance, or does it need an external API endpoint?

Executive Summary

The Claude Agent SDK CANNOT run directly inside Salesforce. It requires external hosting (Python 3.10+ or Node.js 18+) in a sandboxed container environment. Integration happens through:

  1. External hosting - Heroku, AWS, Azure, or other cloud platforms
  2. API bridge - REST APIs to access Salesforce ContentVersion for files
  3. MCP servers - Optional Model Context Protocol servers for tool integration
  4. Authentication - OAuth 2.0 flows for secure access

Claude Agent SDK: Architecture & Requirements

What Is Claude Agent SDK?

The Claude Agent SDK (formerly Claude Code SDK) is an open-source framework from Anthropic that enables building autonomous AI agents powered by Claude. The SDK provides:

  • Autonomous tool orchestration
  • Session management
  • File reading/writing capabilities
  • Command execution
  • Web search integration
  • Integrated agent loop for context gathering and verification

Available in Python and TypeScript/Node.js.

Runtime Requirements

Software Prerequisites:

  • Python 3.10+ OR Node.js 18+
  • Sandboxed container environment (Docker, gVisor, Firecracker, or Vercel Sandbox)
  • Anthropic API key OR cloud provider credentials (AWS Bedrock, Google Vertex AI, Azure)

Why It Can’t Run in Salesforce:

  • Salesforce runs Apex (Java-like language) and Lightning Web Components (JavaScript)
  • No native Python or Node.js runtime inside Salesforce org
  • Agent SDK needs filesystem access, command execution, and container isolation
  • Salesforce Functions (serverless Node.js, now retired) was limited and did not support full Agent SDK requirements

Conclusion: Agent SDK must run externally and integrate via APIs.

Salesforce File Access Architecture

Core Objects for File Management

Salesforce file handling uses four interconnected objects:

  1. ContentVersion - Stores actual file content and metadata
  2. ContentDocument - Represents the logical file (independent of versions)
  3. ContentDocumentLink - Shares files with users, records, and groups
  4. ContentDistribution - Creates public/authenticated download URLs

Key Fields:

  • ContentLocation: "S" (inside Salesforce), "E" (external), "L" (social network)
  • VersionData: Binary file content (BLOB)
  • PathOnClient: Original filename
  • Title: Display name
  • ContentDocumentId: Reference to parent document

REST API for File Access

Upload Files (Multipart POST):

POST /services/data/v66.0/sobjects/ContentVersion
Content-Type: multipart/form-data
--boundary
Content-Disposition: form-data; name="entity_content"
Content-Type: application/json
{
"PathOnClient": "example.pdf",
"Title": "Example Document"
}
--boundary
Content-Disposition: form-data; name="VersionData"; filename="example.pdf"
Content-Type: application/pdf
[binary data]
--boundary--
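For reference, the multipart body above can be assembled with the Python standard library alone. A minimal sketch (the function name and fixed boundary are illustrative; nothing is sent over the network):

```python
import json

def build_contentversion_multipart(title, path_on_client, file_bytes,
                                   content_type="application/pdf",
                                   boundary="boundary"):
    """Assemble a multipart/form-data body for POST /sobjects/ContentVersion,
    matching the entity_content + VersionData layout shown above."""
    entity = json.dumps({"PathOnClient": path_on_client, "Title": title})
    head = (
        f"--{boundary}\r\n"
        'Content-Disposition: form-data; name="entity_content"\r\n'
        "Content-Type: application/json\r\n\r\n"
        f"{entity}\r\n"
        f"--{boundary}\r\n"
        f'Content-Disposition: form-data; name="VersionData"; filename="{path_on_client}"\r\n'
        f"Content-Type: {content_type}\r\n\r\n"
    )
    body = head.encode("utf-8") + file_bytes + f"\r\n--{boundary}--\r\n".encode("utf-8")
    headers = {"Content-Type": f"multipart/form-data; boundary={boundary}"}
    return headers, body
```

The returned headers and body can then be handed to any HTTP client pointed at the ContentVersion endpoint.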

Download Files:

GET /services/data/v66.0/sobjects/ContentVersion/{id}/VersionData
Authorization: Bearer {access_token}

Download Multiple Files as ZIP:

GET /sfc/servlet.shepherd/version/download/{version1}/{version2}/...

Query for Files:

SELECT Id, Title, FileExtension, ContentSize, ContentDocumentId, VersionData
FROM ContentVersion
WHERE ContentDocumentId = '{documentId}'
ORDER BY CreatedDate DESC
LIMIT 1

Maximum File Size:

  • ContentVersion: 2 GB
  • Other objects: 500 MB

Integration Pattern 1: Heroku + Heroku Connect

Overview

Host the Claude Agent SDK on Heroku (Node.js or Python dyno) and sync Salesforce data bidirectionally using Heroku Connect.

Architecture

Salesforce Org
↕ (Heroku Connect - bidirectional sync)
Heroku Postgres Database
↕ (native Postgres connection)
Node.js/Python App with Claude Agent SDK
→ (Anthropic API)
Claude Model

How It Works

  1. Heroku Connect continuously syncs Salesforce objects (including ContentVersion) to Heroku Postgres
  2. Claude Agent SDK queries Postgres directly for metadata
  3. For file content, SDK makes REST API calls to Salesforce ContentVersion endpoint
  4. Agent processes files and can write results back to Salesforce via Postgres sync
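In step 3, the SDK pulls binary content from Salesforce even though metadata lives in Postgres. A small helper sketch for composing that REST call from a synced ContentVersion Id (instance URL and token acquisition are assumed to happen elsewhere; the API version mirrors the examples in this document):

```python
def version_data_request(instance_url, content_version_id,
                         access_token, api_version="v66.0"):
    """Build the GET request (URL + headers) that pulls the binary
    VersionData for a ContentVersion row found in the synced database."""
    url = (f"{instance_url}/services/data/{api_version}"
           f"/sobjects/ContentVersion/{content_version_id}/VersionData")
    headers = {"Authorization": f"Bearer {access_token}"}
    return url, headers
```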

Implementation Steps

  1. Deploy Node.js or Python app to Heroku
  2. Add Heroku Connect add-on
  3. Configure sync for ContentVersion, ContentDocument, ContentDocumentLink
  4. Install JSforce (Node.js) or Simple-Salesforce (Python) for REST API calls
  5. Implement MCP server (optional) to expose Salesforce file tools to Agent SDK
  6. Configure OAuth 2.0 Connected App in Salesforce

Benefits

  • Low latency - Postgres queries are fast
  • Protects Salesforce API limits - Heroku Connect handles bulk sync
  • Real-time sync - Changes flow bidirectionally
  • Native Node.js/Python - Full Agent SDK support

Costs

  • Heroku dyno: ~$25/month (Eco/Basic)
  • Heroku Postgres: $50/month
  • Heroku Connect: $100/month (handles sync)
  • Container runtime: ~$36/month (24/7)

Sources:

Integration Pattern 2: Heroku AppLink

Overview

Heroku AppLink is a new add-on (announced 2025) that exposes Heroku apps as native API services in Salesforce, usable in Flow, Apex, and Agentforce.

Architecture

Salesforce Flow/Apex/Agentforce
↓ (AppLink - automatic auth & discovery)
Heroku Node.js/Python Service
↓ (contains Claude Agent SDK)
Claude Model

How It Works

  1. Deploy Node.js/Python service with Claude Agent SDK to Heroku
  2. Add Heroku AppLink add-on
  3. AppLink auto-generates Salesforce Actions from your service endpoints
  4. Salesforce calls your service with automatic OAuth validation
  5. Service uses Agent SDK to process requests and access files via ContentVersion API

Key Features

  • Automatic authentication - No stored credentials, short-lived tokens
  • Service mesh - Managed bridge between Salesforce and Heroku
  • Request validation - Built-in security
  • SDK support - Primary focus on Node.js and Python
  • Native Salesforce actions - Use in Flow, Apex, Agentforce

Implementation

  1. Create Heroku app with Node.js/Python + Agent SDK
  2. Define REST endpoints that accept Salesforce payloads
  3. Add AppLink add-on to expose endpoints
  4. Configure Connected App in Salesforce
  5. Use generated actions in Salesforce Flow or Apex
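Steps 2-5 hinge on your endpoints accepting and returning JSON that AppLink maps to Salesforce Actions. A minimal, framework-free sketch of such a handler (the payload field recordId and the response shape are illustrative assumptions, not the AppLink specification):

```python
def handle_applink_action(payload):
    """Validate a hypothetical AppLink action payload and shape the
    response a Flow/Apex caller would receive. Field names here are
    illustrative; check the AppLink docs for the real contract."""
    record_id = payload.get("recordId")
    if not record_id:
        return {"status": "error", "message": "recordId is required"}
    # In a real service, this is where the Agent SDK would be invoked
    # with the file fetched via the ContentVersion API.
    return {"status": "accepted", "recordId": record_id}
```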

Benefits

  • Native Salesforce integration - No custom API management
  • Secure by default - Managed credentials
  • Agentforce compatible - Can power Salesforce’s native agents
  • Zero stored credentials - Short-lived tokens only

Sources:

Integration Pattern 3: Self-Hosted + Direct REST API

Overview

Host Claude Agent SDK on any cloud platform (AWS, Azure, GCP, self-hosted) and integrate via direct REST API calls to Salesforce.

Architecture

Docker Container (AWS/Azure/GCP/Self-Hosted)
→ Python/Node.js with Claude Agent SDK
→ REST API calls to Salesforce
• OAuth 2.0 authentication
• ContentVersion API for files
• SOQL queries for metadata

How It Works

  1. Deploy containerized Agent SDK app (Docker, gVisor, Firecracker)
  2. Configure Salesforce Connected App with OAuth 2.0
  3. Implement authentication flow (Username-Password or JWT Bearer)
  4. Query files using SOQL and download via ContentVersion API
  5. Process with Agent SDK
  6. Write results back via Salesforce REST API

Implementation (Node.js Example)

const jsforce = require('jsforce');
const { ClaudeAgent } = require('@anthropic-ai/agent-sdk');

// Authenticate with Salesforce
const conn = new jsforce.Connection({
  oauth2: {
    clientId: process.env.SF_CLIENT_ID,
    clientSecret: process.env.SF_CLIENT_SECRET,
    redirectUri: process.env.SF_REDIRECT_URI
  }
});
await conn.login(username, password);

// Query for files
const files = await conn.query(`
  SELECT Id, Title, FileExtension, ContentDocumentId
  FROM ContentVersion
  WHERE Title LIKE 'Contract%'
  ORDER BY CreatedDate DESC
`);

// Download file content
const fileData = await conn.request(`/services/data/v66.0/sobjects/ContentVersion/${files.records[0].Id}/VersionData`);

// Process with Claude Agent SDK
const agent = new ClaudeAgent({
  apiKey: process.env.ANTHROPIC_API_KEY,
  mcpServers: {
    salesforce: salesforceMcpServer
  }
});
const result = await agent.query({
  prompt: `Analyze this contract: ${fileData}`,
  maxTurns: 5
});

Implementation (Python Example)

import os

import requests
from simple_salesforce import Salesforce
from anthropic_agent_sdk import ClaudeAgent

# Authenticate
sf = Salesforce(
    username=os.getenv('SF_USERNAME'),
    password=os.getenv('SF_PASSWORD'),
    security_token=os.getenv('SF_SECURITY_TOKEN')
)

# Query files
files = sf.query("""
    SELECT Id, Title, FileExtension, ContentDocumentId
    FROM ContentVersion
    WHERE Title LIKE 'Contract%'
    ORDER BY CreatedDate DESC
    LIMIT 10
""")

# Download file
file_id = files['records'][0]['Id']
file_url = f"{sf.base_url}/sobjects/ContentVersion/{file_id}/VersionData"
headers = {'Authorization': f'Bearer {sf.session_id}'}
file_content = requests.get(file_url, headers=headers).content

# Process with Agent SDK
agent = ClaudeAgent(api_key=os.getenv('ANTHROPIC_API_KEY'))
result = agent.query(prompt=f"Analyze this file: {file_content}")

Benefits

  • Full control - Any hosting provider
  • Cost flexibility - Choose your infrastructure
  • No vendor lock-in - Platform agnostic
  • Custom security - Implement your own controls

Challenges

  • Manual authentication - Must implement OAuth flow
  • API limit management - Need to track Salesforce API usage
  • No automatic sync - Must poll or use webhooks

Sources:

Official Salesforce MCP Servers

Salesforce DX MCP Server (Official)

Repository: salesforcecli/mcp

The official Salesforce DX MCP Server provides 60+ tools across toolsets for developer workflows:

Toolset         Purpose
orgs            Manage authorized orgs
metadata        Deploy/retrieve metadata
data            Run SOQL queries
users           Assign permission sets
testing         Run Apex/agent tests
lwc-experts     LWC development & testing
code-analysis   Static code analysis
mobile          Mobile LWC development

File Access Capability: The DX MCP Server does NOT support ContentVersion file downloads or binary file access. The data toolset can run SOQL queries against ContentVersion (metadata only), but there is no dedicated file download tool.

Salesforce Hosted MCP Servers (Beta → GA Feb 2026)

Salesforce is rolling out platform-level hosted MCP servers:

  • Zero code - Exposes Salesforce APIs as MCP tools without writing code
  • Enterprise security - Centralized registry, policy enforcement, rate limiting
  • AgentExchange - Curated catalog of vetted MCP servers
  • Agentforce native MCP client - In pilot, enables agents to connect to any MCP server

GA targeted for February 2026. File access capabilities at GA are TBD.

MuleSoft MCP Server

Convert any API or existing Mule application to an MCP server via Anypoint Platform. This could wrap the ContentVersion REST API for file access.

Third-Party MCP Servers with File Access

Server            ContentVersion Support   Notes
CData MCP Server  Yes                      Explicit ContentVersion support including file versioning
AiondaDotCom MCP  Yes                      Full CRUD + file backup/download of ContentVersions, Attachments, Documents
tsmztech MCP      Yes                      Generic CRUD, can query/modify any Salesforce object

Bottom Line: For file access, use a third-party MCP server or build a lightweight custom one (~50 lines of code wrapping ContentVersion /VersionData).

Sources:

Integration Pattern 4: MCP Server as Bridge

Overview

Build a custom Model Context Protocol (MCP) server that acts as a bridge between Claude Agent SDK and Salesforce.

Architecture

Claude Agent SDK
↓ (MCP protocol)
Custom MCP Server
↓ (REST API / JSforce)
Salesforce ContentVersion API

How It Works

  1. Create MCP server with tools like salesforce_list_files, salesforce_read_file, salesforce_upload_file
  2. MCP server handles Salesforce authentication (OAuth)
  3. Agent SDK calls MCP tools by name: mcp__salesforce__read_file
  4. MCP server translates to Salesforce API calls
  5. Returns file content to agent

MCP Server Implementation (Conceptual)

// mcp-salesforce-server.ts
import { MCPServer, Tool } from '@anthropic-ai/mcp-sdk';
import jsforce from 'jsforce';

const server = new MCPServer({
  name: 'salesforce',
  version: '1.0.0',
  tools: [
    {
      name: 'list_files',
      description: 'List files in Salesforce',
      inputSchema: {
        type: 'object',
        properties: {
          query: { type: 'string' }
        }
      },
      execute: async (input) => {
        const conn = await authenticateWithSalesforce();
        const result = await conn.query(`
          SELECT Id, Title, FileExtension, ContentSize
          FROM ContentVersion
          WHERE Title LIKE '%${input.query}%'
        `);
        return result.records;
      }
    },
    {
      name: 'read_file',
      description: 'Read file content from Salesforce',
      inputSchema: {
        type: 'object',
        properties: {
          fileId: { type: 'string' }
        },
        required: ['fileId']
      },
      execute: async (input) => {
        const conn = await authenticateWithSalesforce();
        const url = `/services/data/v66.0/sobjects/ContentVersion/${input.fileId}/VersionData`;
        const content = await conn.request(url);
        return { content };
      }
    }
  ]
});

server.listen();

Agent SDK Configuration

import os

from anthropic_agent_sdk import ClaudeAgent

agent = ClaudeAgent(
    api_key=os.getenv('ANTHROPIC_API_KEY'),
    mcp_servers={
        'salesforce': {
            'type': 'stdio',
            'command': 'node',
            'args': ['mcp-salesforce-server.js']
        }
    }
)

# Agent can now use Salesforce tools
result = agent.query(
    prompt="List all contract files in Salesforce and summarize the latest one",
    allow_tools=['mcp__salesforce__list_files', 'mcp__salesforce__read_file']
)

Benefits

  • Abstraction - Agent doesn’t need Salesforce knowledge
  • Reusable - MCP server can be used by multiple agents
  • Permission control - Fine-grained tool permissions
  • Authentication isolation - Credentials managed in MCP server only

MCP Server Hosting Options

  1. In-process SDK server - Run MCP server in same Python/Node.js process
  2. External subprocess - Separate process communicating via stdio
  3. Remote HTTP/SSE server - Deploy MCP server separately, access via HTTP
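Expressed as configuration, the three hosting options differ mainly in the mcp_servers entry handed to the SDK. A Python sketch of each shape (key names for the stdio case follow the example earlier in this document; the remote-HTTP shape is an assumption to verify against the SDK docs):

```python
def mcp_server_config(mode, **kw):
    """Return an mcp_servers entry for the chosen hosting option.
    'stdio' covers in-process and external-subprocess servers;
    'http' is the remote HTTP/SSE case (shape assumed, not verified)."""
    if mode == "stdio":
        return {"type": "stdio", "command": kw["command"], "args": kw.get("args", [])}
    if mode == "http":
        return {"type": "http", "url": kw["url"]}
    raise ValueError(f"unknown mode: {mode}")
```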

Sources:

Integration Pattern 5: Event-Driven with Platform Events

Overview

Use Salesforce Platform Events for event-driven integration where Salesforce triggers the Agent SDK.

Architecture

Salesforce Org
→ Platform Event published (e.g., FileUploadedEvent)
→ External subscriber (Node.js/Python listener)
→ Triggers Claude Agent SDK
→ Processes file
→ Writes results back to Salesforce

How It Works

  1. Define Platform Event in Salesforce (e.g., File_Uploaded__e)
  2. Create Apex trigger that publishes event when file is uploaded
  3. External service subscribes to event via Streaming API or CometD
  4. When event received, trigger Agent SDK workflow
  5. Agent fetches file via ContentVersion API and processes
  6. Write results back to Salesforce via REST API

Implementation (Node.js)

const jsforce = require('jsforce');
const { ClaudeAgent } = require('@anthropic-ai/agent-sdk');

const conn = new jsforce.Connection({ /* auth */ });

// Subscribe to Platform Event
conn.streaming.topic('/event/File_Uploaded__e').subscribe((message) => {
  const fileId = message.payload.File_Id__c;
  // Trigger agent workflow
  processFileWithAgent(fileId);
});

async function processFileWithAgent(fileId) {
  const fileData = await conn.request(`/services/data/v66.0/sobjects/ContentVersion/${fileId}/VersionData`);
  const agent = new ClaudeAgent({ apiKey: process.env.ANTHROPIC_API_KEY });
  const result = await agent.query({ prompt: `Process this file: ${fileData}` });

  // Write result back to Salesforce
  await conn.sobject('File_Analysis__c').create({
    File_Id__c: fileId,
    Analysis_Result__c: result.output
  });
}

Benefits

  • Real-time - Immediate response to file uploads
  • Decoupled - Salesforce and Agent SDK don’t need tight coupling
  • Scalable - Handle high-volume events
  • Asynchronous - Non-blocking operations

Sources:

Integration Pattern 6: AWS/EKS Self-Hosted with Salesforce File Push

Overview

Host the Claude Agent SDK on AWS EKS (or any self-hosted Kubernetes) and expose a REST API endpoint that Salesforce calls directly, optionally pushing the file content in the request body.

Can Salesforce Push the File Directly?

Yes, but with limitations. Salesforce can send file binary data to an external endpoint via Apex callout using setBodyAsBlob(). However, there are critical governor limits:

Constraint                              Limit
Heap size (synchronous)                 6 MB
Heap size (asynchronous)                12 MB
Single callout timeout                  120 seconds
Total callout timeout per transaction   120 seconds
Max callouts per transaction            100
Max request/response size               12 MB (async)

This means files larger than roughly 6 MB (synchronous Apex) or 12 MB (asynchronous Apex) cannot be pushed directly from Salesforce, and the blob must share heap space with the rest of the transaction. For larger files, use the hybrid push-notification + pull pattern.
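The cutoff can be encoded as a small routing helper; a Python sketch using the heap limits from the table above (function name is illustrative, and real code should also leave headroom for the rest of the Apex transaction):

```python
# Apex heap limits from the governor-limit table (bytes)
SYNC_HEAP_LIMIT = 6 * 1024 * 1024    # 6 MB synchronous
ASYNC_HEAP_LIMIT = 12 * 1024 * 1024  # 12 MB asynchronous

def transfer_strategy(file_size_bytes, async_apex=True):
    """Decide whether Salesforce can push the file in the callout body
    ('push') or must fall back to notify-then-pull ('hybrid')."""
    limit = ASYNC_HEAP_LIMIT if async_apex else SYNC_HEAP_LIMIT
    return "push" if file_size_bytes < limit else "hybrid"
```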

Architecture Option A: Push Model (Small Files < 6 MB)

Salesforce Org
→ Apex Trigger on ContentVersion insert
→ Apex Callout (async @future or Queueable)
→ POST https://your-eks-endpoint.com/api/process-file
→ Body: binary file via setBodyAsBlob()
→ Headers: file metadata (title, extension, record ID)
AWS EKS
→ REST API receives file + metadata
→ Passes to Claude Agent SDK
→ Returns result
→ (Optional) Writes back to Salesforce via REST API

Apex Implementation (Push Model)

// Trigger: fires when a new file is uploaded
trigger ContentVersionTrigger on ContentVersion (after insert) {
    for (ContentVersion cv : Trigger.new) {
        // Call async to avoid "uncommitted work" error
        FileProcessorService.processFileAsync(cv.Id);
    }
}

// Async service
public class FileProcessorService {
    @future(callout=true)
    public static void processFileAsync(Id contentVersionId) {
        ContentVersion cv = [
            SELECT Id, Title, FileExtension, ContentSize, VersionData, ContentDocumentId
            FROM ContentVersion
            WHERE Id = :contentVersionId
            LIMIT 1
        ];

        // Guard: only push files under 6 MB
        if (cv.ContentSize > 6000000) {
            // Fall back to notification-only (let external service pull)
            sendNotificationOnly(cv);
            return;
        }

        HttpRequest req = new HttpRequest();
        req.setEndpoint('callout:Claude_Agent_Endpoint/api/process-file');
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/octet-stream');
        req.setHeader('X-File-Title', cv.Title);
        req.setHeader('X-File-Extension', cv.FileExtension);
        req.setHeader('X-ContentVersion-Id', cv.Id);
        req.setHeader('X-ContentDocument-Id', cv.ContentDocumentId);
        req.setBodyAsBlob(cv.VersionData);
        req.setTimeout(120000);

        Http http = new Http();
        HttpResponse res = http.send(req);

        if (res.getStatusCode() == 200) {
            // Parse response and create analysis record
            Map<String, Object> result = (Map<String, Object>) JSON.deserializeUntyped(res.getBody());
            File_Analysis__c analysis = new File_Analysis__c(
                ContentVersion_Id__c = cv.Id,
                Analysis_Result__c = (String) result.get('analysis')
            );
            insert analysis;
        }
    }

    private static void sendNotificationOnly(ContentVersion cv) {
        HttpRequest req = new HttpRequest();
        req.setEndpoint('callout:Claude_Agent_Endpoint/api/process-file-notification');
        req.setMethod('POST');
        req.setHeader('Content-Type', 'application/json');
        req.setBody(JSON.serialize(new Map<String, String>{
            'contentVersionId' => cv.Id,
            'contentDocumentId' => cv.ContentDocumentId,
            'title' => cv.Title,
            'extension' => cv.FileExtension,
            'size' => String.valueOf(cv.ContentSize)
        }));
        new Http().send(req);
    }
}

Architecture Option B: Hybrid Push-Notification + Pull (Any File Size)

Recommended for production. Salesforce sends a lightweight notification with file metadata, and the AWS endpoint pulls the file directly from Salesforce.

Salesforce Org
→ Apex Trigger / Flow on ContentVersion insert
→ POST notification to AWS endpoint
→ JSON: { contentVersionId, title, size, extension }
→ NO binary file in body
AWS EKS
→ REST API receives notification
→ Authenticates to Salesforce (OAuth 2.0 JWT Bearer)
→ GET /services/data/v66.0/sobjects/ContentVersion/{id}/VersionData
→ Downloads file (no size limit via REST API)
→ Passes to Claude Agent SDK
→ Returns result to Salesforce via REST API

AWS Endpoint Implementation (Node.js)

const express = require('express');
const jsforce = require('jsforce');
const { ClaudeAgent } = require('@anthropic-ai/agent-sdk');

const app = express();

// Option A: Receive file directly (small files)
app.post('/api/process-file', express.raw({ limit: '12mb', type: '*/*' }), async (req, res) => {
  const fileBuffer = req.body;
  const title = req.headers['x-file-title'];
  const extension = req.headers['x-file-extension'];
  const cvId = req.headers['x-contentversion-id'];

  const agent = new ClaudeAgent({ apiKey: process.env.ANTHROPIC_API_KEY });
  const result = await agent.query({
    prompt: `Analyze this ${extension} file titled "${title}": ${fileBuffer.toString('base64')}`,
    maxTurns: 5
  });

  res.json({ analysis: result.output, contentVersionId: cvId });
});

// Option B: Receive notification, pull file from Salesforce
app.post('/api/process-file-notification', express.json(), async (req, res) => {
  const { contentVersionId, title, extension, size } = req.body;

  // Acknowledge immediately (Salesforce has callout timeout limits)
  res.json({ status: 'accepted', contentVersionId });

  // Process asynchronously
  setImmediate(async () => {
    // Authenticate to Salesforce
    const conn = new jsforce.Connection({ /* OAuth config */ });
    await conn.login(username, password);

    // Pull file from Salesforce (no size limit)
    const fileData = await conn.request(
      `/services/data/v66.0/sobjects/ContentVersion/${contentVersionId}/VersionData`
    );

    // Process with Agent SDK
    const agent = new ClaudeAgent({ apiKey: process.env.ANTHROPIC_API_KEY });
    const result = await agent.query({
      prompt: `Analyze this ${extension} file: ${fileData}`,
      maxTurns: 5
    });

    // Write results back to Salesforce
    await conn.sobject('File_Analysis__c').create({
      ContentVersion_Id__c: contentVersionId,
      Analysis_Result__c: result.output
    });
  });
});

app.listen(3000);

Architecture Option C: Flow-Driven (No-Code Trigger)

Use Salesforce Flow instead of Apex for simpler setups:

Salesforce Flow Builder
→ Record-Triggered Flow on ContentDocumentLink (after insert)
→ HTTP Callout Action (or Outbound Message)
→ POST to AWS endpoint with file metadata
AWS EKS
→ Receives notification
→ Pulls file via REST API
→ Processes with Agent SDK

Note: Outbound Messages use SOAP format. For REST/JSON, use Apex HTTP Callout or Streamscript for Flow.

Push vs Pull: Decision Matrix

Factor           Push (Apex sends file)        Hybrid (notify + pull)               Pull (agent polls/subscribes)
File size limit  6 MB sync / 12 MB async       Unlimited (REST API)                 Unlimited (REST API)
Latency          Lowest (single hop)           Medium (2 hops)                      Highest (polling interval)
SF API usage     1 callout                     1 callout + 1 inbound REST           1+ inbound REST per poll
Complexity       Simple                        Medium                               Medium-High
Reliability      Dependent on callout success  Can retry pull independently         Depends on polling strategy
Best for         Small text files, metadata    Production workloads, any file size  Batch processing, scheduled jobs

Recommendation for AWS/EKS

Use the Hybrid model (Option B):

  1. Salesforce sends lightweight JSON notification on file upload
  2. AWS endpoint acknowledges immediately (within SF callout timeout)
  3. AWS pulls file from Salesforce asynchronously (no size limit)
  4. Agent SDK processes file
  5. Results written back to Salesforce

This gives you:

  • No file size constraints
  • Decoupled processing (can queue and retry)
  • Minimal Salesforce governor limit impact
  • Full control over processing pipeline on EKS

AWS/EKS Specific Considerations

  • Ingress: Use AWS ALB Ingress Controller for HTTPS termination
  • Authentication: Validate Salesforce callout identity via Named Credentials + client certificate or shared secret
  • Scaling: EKS HPA can scale pods based on request volume
  • Storage: Use EBS or EFS for temporary file storage during processing
  • Secrets: Store Salesforce OAuth credentials in AWS Secrets Manager
  • Networking: Whitelist Salesforce IP ranges or use AWS PrivateLink

Sources:

For AWS/EKS: Hybrid Notification + Pull with Agent SDK

┌─────────────────────────────────────────────────────────┐
│ Salesforce Org │
│ • ContentVersion (files stored) │
│ • Apex Trigger on ContentVersion insert │
│ │ → Small files (<6MB): Push binary via callout │
│ │ → Large files (>6MB): Send JSON notification only │
└──────────────────┬──────────────────────────────────────┘
│ HTTPS (Named Credentials / OAuth)
┌─────────────────────────────────────────────────────────┐
│ AWS EKS Cluster │
│ ┌─────────────────────────────────────────────────┐ │
│ │ ALB Ingress (HTTPS termination) │ │
│ └──────────┬──────────────────────────────────────┘ │
│ ▼ │
│ ┌─────────────────────────────────────────────────┐ │
│ │ Node.js/Python Service (Pod) │ │
│ │ • POST /api/process-file (receives push) │ │
│ │ • POST /api/process-notification (hybrid) │ │
│ │ │ │ │
│ │ ├→ Pull file from SF ContentVersion API ────────┼───┼→ Salesforce REST API
│ │ │ (OAuth 2.0 JWT Bearer, no size limit) │ │
│ │ │ │ │
│ │ ├→ Claude Agent SDK │ │
│ │ │ • Process file content │ │
│ │ │ • (Optional) MCP server for SF tools │ │
│ │ │ │ │
│ │ └→ Write results back to Salesforce ────────────┼───┼→ Salesforce REST API
│ └─────────────────────────────────────────────────┘ │
│ │
│ AWS Secrets Manager (SF OAuth creds, API keys) │
│ CloudWatch (logging + monitoring) │
└─────────────────────────────────────────────────────────┘

Why This Architecture?

  1. Hybrid push/pull - Small files arrive instantly; large files pulled without SF governor limits
  2. AWS/EKS - Full control, existing infrastructure, auto-scaling via HPA
  3. No Heroku dependency - Self-hosted, platform-agnostic approach
  4. MCP Server (optional) - Can run in-process for additional Salesforce tool access
  5. OAuth 2.0 JWT Bearer - Server-to-server auth, no stored passwords
  6. ContentVersion API - Direct file access with 2GB limit (pull model)
For Heroku: AppLink + MCP Server

┌─────────────────────────────────────────────────────────┐
│ Salesforce Org │
│ • Flow / Apex / Agentforce │
│ • ContentVersion (files stored) │
└──────────────────┬──────────────────────────────────────┘
│ Heroku AppLink
│ (auto auth, service discovery)
┌─────────────────────────────────────────────────────────┐
│ Heroku Dyno (Node.js/Python) │
│ ┌────────────────────────────────────────────────┐ │
│ │ Claude Agent SDK │ │
│ │ • Session management │ │
│ │ • Tool orchestration │ │
│ └──────────┬─────────────────────────────────────┘ │
│ │ │
│ │ MCP Protocol │
│ ▼ │
│ ┌────────────────────────────────────────────────┐ │
│ │ Custom MCP Server (Salesforce Bridge) │ │
│ │ • salesforce_list_files │ │
│ │ • salesforce_read_file │ │
│ │ • salesforce_upload_file │ │
│ │ • salesforce_query │ │
│ └──────────┬─────────────────────────────────────┘ │
└─────────────┼──────────────────────────────────────────┘
│ REST API (OAuth 2.0)
┌─────────────────────────────────────────────────────────┐
│ Salesforce REST API │
│ • /services/data/v66.0/sobjects/ContentVersion/... │
│ • /services/data/v66.0/query?q=SELECT... │
└─────────────────────────────────────────────────────────┘

Authentication Patterns

OAuth 2.0 Username-Password Flow

IMPORTANT: The SOAP login() API is deprecated and will be retired in Summer '27. Use OAuth 2.0 instead.

const jsforce = require('jsforce');

const conn = new jsforce.Connection({
  oauth2: {
    clientId: process.env.SF_CLIENT_ID,
    clientSecret: process.env.SF_CLIENT_SECRET,
    redirectUri: 'http://localhost:3000/oauth/callback'
  }
});

await conn.login(username, password + securityToken);

OAuth 2.0 JWT Bearer Flow

For server-to-server integration:

import time

import jwt
import requests

# Create JWT
payload = {
    'iss': client_id,
    'sub': username,
    'aud': 'https://login.salesforce.com',
    'exp': int(time.time()) + 300
}
encoded_jwt = jwt.encode(payload, private_key, algorithm='RS256')

# Exchange for access token
response = requests.post('https://login.salesforce.com/services/oauth2/token', data={
    'grant_type': 'urn:ietf:params:oauth:grant-type:jwt-bearer',
    'assertion': encoded_jwt
})
access_token = response.json()['access_token']

Security Best Practices (2026)

  1. ✅ Use OAuth 2.0 (not SOAP login)
  2. ✅ Implement short-lived tokens (AppLink provides this)
  3. ✅ Run Agent SDK in sandboxed containers (Docker, gVisor, Firecracker)
  4. ✅ Use ephemeral containers for one-off tasks
  5. ✅ Implement network controls - restrict outbound connections
  6. ✅ Use MCP servers for credential isolation
  7. ✅ Set maxTurns to prevent infinite loops
  8. ✅ Monitor token costs - dominant expense
  9. ✅ Implement logging for audit trails
  10. ✅ Use ContentDistribution with expiry for public URLs

Performance Considerations

File Size Limits

  • ContentVersion: 2 GB max
  • Other objects: 500 MB max

API Limits

  • Salesforce enforces API call limits (varies by edition)
  • Use Heroku Connect for bulk operations to preserve limits
  • Implement exponential backoff for rate limiting
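Exponential backoff for rate-limited Salesforce REST calls can be as simple as a capped doubling schedule; a sketch (parameter values are illustrative; the optional full jitter follows common retry guidance):

```python
import random

def backoff_delays(max_retries=5, base=0.5, cap=30.0, jitter=False):
    """Yield sleep intervals (seconds) for retrying a rate-limited call:
    base * 2**attempt, capped at `cap`, with optional full jitter."""
    for attempt in range(max_retries):
        delay = min(cap, base * (2 ** attempt))
        yield random.uniform(0, delay) if jitter else delay
```

The caller sleeps for each yielded interval between retries, giving up after max_retries attempts.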

Latency

  • Heroku Connect: Low latency (Postgres local)
  • Direct REST API: ~100-300ms per call
  • Bulk API: Single call for large datasets

Token Budgets

  • Monitor token usage per session (the SDK reports usage with each result)
  • Monitor context window consumption with MCP tools
  • Keep MCP tools under 80 total (under 10 servers recommended)

Cost Analysis

Component                 Monthly Cost
Heroku Eco Dyno           $7
Heroku Postgres (Mini)    $9
Heroku Connect            $100
Container runtime (24/7)  ~$36
Infrastructure Total      ~$152
Claude API (Sonnet)       Variable (~$15/M output tokens)
Salesforce API calls      Included in edition (or ~$2 per 1K extra)

Note: Token costs typically dominate infrastructure costs.

Deployment Checklist

  • Choose hosting platform (Heroku recommended)
  • Set up Salesforce Connected App with OAuth 2.0
  • Configure OAuth credentials (client ID, secret, redirect URI)
  • Deploy Agent SDK application (Python or Node.js)
  • Implement MCP server for Salesforce integration (optional but recommended)
  • Configure Heroku AppLink (if using Heroku)
  • Test file upload to ContentVersion
  • Test file download via REST API
  • Test agent workflow end-to-end
  • Set up monitoring and logging
  • Configure maxTurns and timeout limits
  • Implement error handling and retries
  • Document API usage patterns
  • Set up security controls (network restrictions, etc.)

Example: Complete Workflow

User uploads file to Salesforce

  1. File stored as ContentVersion object
  2. Platform Event published (optional)

Agent workflow runs

  1. Heroku receives request from Salesforce
  2. Agent SDK initialized with MCP Salesforce server
  3. Agent calls mcp__salesforce__list_files tool
  4. MCP server authenticates and queries ContentVersion
  5. Agent calls mcp__salesforce__read_file with file ID
  6. MCP server downloads file via REST API
  7. Agent processes file content
  8. Agent returns analysis to Heroku app
  9. Heroku app writes results back to Salesforce via REST API

Results visible in Salesforce

  1. Custom object record created with analysis
  2. User sees results in Salesforce UI

Conclusion

While Claude Agent SDK cannot run directly inside Salesforce, there are robust integration patterns available:

Best for 2026: Heroku AppLink + MCP Server

  • Native Salesforce integration
  • Automatic authentication
  • Works with Agentforce
  • Full Agent SDK capabilities

Alternative: Self-hosted + Direct REST API

  • Platform agnostic
  • Full control
  • Custom security

Both patterns enable Claude Agent SDK to access Salesforce files via the ContentVersion API, providing powerful AI capabilities for Salesforce users.

Sources

Claude Agent SDK

  1. Agent SDK Overview
  2. Hosting the Agent SDK
  3. Secure Deployment Guide
  4. MCP in the Agent SDK
  5. Building Agents with Claude Agent SDK
  6. Enterprise MCP Deployment Guide

Salesforce Integration

  1. Salesforce File Upload Architecture
  2. ContentVersion Object Reference
  3. How to Upload Files via REST API

Heroku Integration

  1. Heroku Connect Overview
  2. Integrating Heroku and Salesforce
  3. Heroku AppLink Announcement
  4. Securing Salesforce with AppLink
  5. Salesforce Heroku Integration (Trailhead)

Node.js Integration

  1. JSforce Documentation
  2. Salesforce Integration with Node.js
  3. Node.js Salesforce Integration Guide

Official MCP Servers

  1. Official Salesforce DX MCP Server
  2. Introducing MCP Support Across Salesforce
  3. Salesforce Hosted MCP Servers Beta
  4. Agentforce MCP Support
  5. CData MCP Server - ContentVersion
  6. AiondaDotCom MCP Salesforce

Salesforce Apex Callouts & File Push

  1. Post File From Apex to External HTTP Webservices
  2. Apex REST Callouts (Trailhead)
  3. Bypass Salesforce Governor Limits for Files
  4. Sending Outbound Messages from Flow
  5. Salesforce Insert/Update Blob Data via REST