The Support Squad: Multi-Agent Customer Support
Build a team of specialized AI agents that collaborate seamlessly to deliver exceptional customer support - where shared memory is the secret weapon.
The $2M Problem
TechCorp's customer support was bleeding money. Customers were frustrated, agents were overwhelmed, and the company was losing $2M annually to churn caused by poor support experiences.
The Solution: The Support Squad
A team of specialized AI agents with shared memory - each agent knows everything every other agent knows. When a customer talks to any agent, it's like talking to someone who's been with them from day one.
Meet The Support Squad
Alex - First Contact
Handles initial inquiries, gathers context, and routes to specialists. Remembers every customer interaction.
Taylor - Technical Expert
Solves complex technical issues. Knows the customer's tech stack, past problems, and preferences.
Sam - Account Manager
Handles billing, upgrades, and account issues. Remembers payment history and preferences.
The Secret: Shared Memory
All three agents access the same memory system. When Alex learns something about a customer, Taylor and Sam instantly know it too. No information is ever lost in handoffs.
A Day in the Life: Sarah's Journey
Monday 9 AM - First Contact with Alex
Initial inquiry about API integration
Sarah: "Hi, I'm trying to integrate your API into our Python application but getting authentication errors."
Alex: "Hi Sarah! I can help you with that. Let me gather some details. What's your tech stack?"
What Alex Stores in Memory:
- Customer: Sarah, Software Engineer
- Tech Stack: Python application
- Issue: API authentication errors
- Communication Style: Technical, direct
Monday 9:15 AM - Handoff to Taylor
Technical specialist takes over
Taylor: "Hi Sarah! I see you're working on Python API integration and hitting auth errors. I've reviewed your setup - let's get this fixed."
Sarah: "Wow, you already know what I'm working on?"
What Taylor Knows (from shared memory):
- Everything Alex learned about Sarah
- Sarah prefers technical explanations
- Current issue context
- No need to ask Sarah to repeat herself!
What Taylor Adds to Memory:
- Solution: API key configuration fix
- Technical level: Advanced
- Issue resolved: Authentication
Tuesday 2 PM - Sam Reaches Out
Proactive account management
Sam: "Hi Sarah! I noticed you successfully integrated our API yesterday. Since you're building with Python, you might benefit from our Enterprise plan which includes advanced Python SDK features and priority support. Would you like to hear more?"
Sarah: "This is incredible - you know exactly what I need!"
What Sam Knows (from shared memory):
- Sarah's tech stack (Python)
- Recent successful integration
- Technical skill level (advanced)
- Perfect timing for upgrade conversation
The Result
Sarah never had to repeat herself. Each agent knew her context, preferences, and history. What could have been a frustrating multi-day ordeal became a seamless experience that led to an upgrade.
Building Your Support Squad
Step 1: The Shared Memory Foundation
First, create the shared memory system that all agents will use:
from memorystack import MemoryStackClient
from openai import OpenAI
import os
from datetime import datetime
from typing import Dict, List, Optional
class SharedMemory:
"""Shared memory system for all support agents"""
def __init__(self, customer_id: str):
self.customer_id = customer_id
self.memory = MemoryStackClient(
api_key=os.getenv("MEMORYSTACK_API_KEY"),
user_id=customer_id # All agents share the same customer memory
)
def store_interaction(self, agent_name: str, content: str,
interaction_type: str, metadata: Dict = None):
"""Store any interaction in shared memory"""
self.memory.create_memory(
content=f"[{agent_name}] {content}",
memory_type=interaction_type,
metadata={
"agent": agent_name,
"timestamp": datetime.now().isoformat(),
**(metadata or {})
}
)
def get_customer_context(self, query: str = "", limit: int = 10) -> List[Dict]:
"""Retrieve relevant customer context"""
memories = self.memory.search_memories(
query=query if query else "customer",
limit=limit
)
return memories.get('results', [])
def get_full_history(self) -> List[Dict]:
"""Get complete interaction history"""
return self.memory.search_memories(
query="",
limit=100
).get('results', [])
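Before wiring up the agents, it helps to see the core idea in isolation. Here is a minimal sketch (it assumes MEMORYSTACK_API_KEY is set; the customer ID sarah_123 is just an example):
# Two agents pointing at the same customer_id share one memory store
alex_memory = SharedMemory(customer_id="sarah_123")
taylor_memory = SharedMemory(customer_id="sarah_123")

# Alex records a fact during the first conversation
alex_memory.store_interaction(
    agent_name="Alex",
    content="Customer tech stack: Python",
    interaction_type="fact"
)

# Taylor can immediately retrieve it - no handoff document needed
for mem in taylor_memory.get_customer_context("Python"):
    print(mem.get('content'))
Everything that follows builds on that one property: any agent can read what any other agent wrote.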
Step 2: Alex - The First Contact Agent
Alex handles initial inquiries and gathers crucial context:
class AlexAgent:
"""First contact agent - gathers context and routes"""
def __init__(self, customer_id: str, customer_name: str):
self.name = "Alex"
self.customer_id = customer_id
self.customer_name = customer_name
self.memory = SharedMemory(customer_id)
self.ai = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
def handle_inquiry(self, message: str) -> Dict:
"""Handle initial customer inquiry"""
# Get existing context
context = self.memory.get_customer_context(message)
context_text = "\n".join([m.get('content', '') for m in context])
# Analyze and respond
response = self.ai.chat.completions.create(
model="gpt-4",
messages=[
{
"role": "system",
"content": f"""You are Alex, a friendly first-contact support agent.
Customer: {self.customer_name}
Previous context:
{context_text}
Your job:
1. Greet warmly (use their name if returning customer)
2. Understand their issue
3. Gather technical details
4. Determine if you can help or need to route to specialist
5. Extract key information for memory"""
},
{"role": "user", "content": message}
]
)
bot_response = response.choices[0].message.content
# Extract and store key information
self._extract_and_store(message, bot_response)
# Store interaction
self.memory.store_interaction(
agent_name=self.name,
content=f"Customer: {message}\nAlex: {bot_response}",
interaction_type="conversation"
)
# Determine if routing needed
needs_specialist = self._needs_specialist(message, bot_response)
return {
"response": bot_response,
"route_to": needs_specialist,
"context_gathered": True
}
def _extract_and_store(self, customer_message: str, response: str):
"""Extract key information and store in memory"""
# Use AI to extract structured information
extraction = self.ai.chat.completions.create(
model="gpt-4",
messages=[{
"role": "system",
"content": """Extract key information from this conversation.
Return JSON with: tech_stack, issue_type, customer_role, communication_style, urgency"""
}, {
"role": "user",
"content": f"Customer: {customer_message}\nAgent: {response}"
}]
)
try:
import json
info = json.loads(extraction.choices[0].message.content)
# Store each piece of information
if info.get('tech_stack'):
self.memory.store_interaction(
agent_name=self.name,
content=f"Customer tech stack: {info['tech_stack']}",
interaction_type="fact",
metadata={"category": "technical"}
)
if info.get('issue_type'):
self.memory.store_interaction(
agent_name=self.name,
content=f"Issue type: {info['issue_type']}",
interaction_type="issue",
metadata={"status": "open"}
)
if info.get('communication_style'):
self.memory.store_interaction(
agent_name=self.name,
content=f"Communication preference: {info['communication_style']}",
interaction_type="preference"
)
except:
pass
def _needs_specialist(self, message: str, response: str) -> Optional[str]:
"""Determine if specialist is needed"""
keywords = {
"technical": ["api", "integration", "error", "bug", "code"],
"billing": ["payment", "invoice", "upgrade", "subscription", "billing"]
}
message_lower = message.lower()
for specialist, terms in keywords.items():
if any(term in message_lower for term in terms):
return specialist
return None
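A quick way to exercise Alex on his own (a sketch; the message and IDs are illustrative):
alex = AlexAgent(customer_id="sarah_123", customer_name="Sarah")

result = alex.handle_inquiry(
    "Hi, I'm getting authentication errors when calling your API from Python."
)
print(result["response"])

# route_to is "technical", "billing", or None
if result["route_to"]:
    print(f"Routing to the {result['route_to']} specialist...")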
Step 3: Taylor - The Technical Specialist
Taylor handles complex technical issues with full context:
class TaylorAgent:
"""Technical specialist with access to all customer context"""
def __init__(self, customer_id: str, customer_name: str):
self.name = "Taylor"
self.customer_id = customer_id
self.customer_name = customer_name
self.memory = SharedMemory(customer_id)
self.ai = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
def handle_technical_issue(self, message: str) -> str:
"""Handle technical support with full context"""
# Get ALL relevant context (from Alex and previous interactions)
context = self.memory.get_customer_context(message, limit=20)
# Build comprehensive context
tech_stack = []
past_issues = []
preferences = []
for mem in context:
content = mem.get('content', '')
mem_type = mem.get('memory_type', '')
if 'tech stack' in content.lower():
tech_stack.append(content)
elif mem_type == 'issue':
past_issues.append(content)
elif mem_type == 'preference':
preferences.append(content)
context_summary = f"""
CUSTOMER: {self.customer_name}
TECH STACK:
{chr(10).join(tech_stack) if tech_stack else 'Not yet documented'}
PAST ISSUES:
{chr(10).join(past_issues[-3:]) if past_issues else 'No previous issues'}
COMMUNICATION PREFERENCES:
{chr(10).join(preferences) if preferences else 'Standard communication'}
RECENT INTERACTIONS:
{chr(10).join([m.get('content', '')[:100] for m in context[:5]])}
"""
# Generate technical response
response = self.ai.chat.completions.create(
model="gpt-4",
messages=[
{
"role": "system",
"content": f"""You are Taylor, a senior technical support specialist.
{context_summary}
Your approach:
1. Acknowledge you have context (don't make customer repeat)
2. Provide technical solutions matching their skill level
3. Reference past issues if relevant
4. Be thorough but respect their communication style
5. Document the solution for future reference"""
},
{"role": "user", "content": message}
]
)
bot_response = response.choices[0].message.content
# Store solution in memory
self.memory.store_interaction(
agent_name=self.name,
content=f"Technical solution provided: {bot_response}",
interaction_type="solution",
metadata={
"issue_type": "technical",
"resolved": True
}
)
# Store the conversation
self.memory.store_interaction(
agent_name=self.name,
content=f"Customer: {message}\nTaylor: {bot_response}",
interaction_type="conversation"
)
return bot_response
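Because Taylor is constructed with the same customer_id, the handoff needs nothing beyond instantiating the agent (a sketch continuing the example above):
# Taylor inherits everything Alex stored simply by sharing the customer_id
taylor = TaylorAgent(customer_id="sarah_123", customer_name="Sarah")

reply = taylor.handle_technical_issue(
    "The error says 'Invalid API key'. I'm using Python 3.9 with requests."
)
print(reply)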
Step 4: Sam - The Account Manager
Sam handles billing and proactively identifies opportunities:
class SamAgent:
"""Account manager with full customer intelligence"""
def __init__(self, customer_id: str, customer_name: str):
self.name = "Sam"
self.customer_id = customer_id
self.customer_name = customer_name
self.memory = SharedMemory(customer_id)
self.ai = OpenAI(api_key=os.getenv("OPENAI_API_KEY"))
def proactive_outreach(self) -> Optional[str]:
"""Identify opportunities for proactive outreach"""
# Get full customer history
history = self.memory.get_full_history()
# Analyze for opportunities
analysis = self.ai.chat.completions.create(
model="gpt-4",
messages=[{
"role": "system",
"content": """Analyze customer history for upgrade opportunities.
Look for:
- Successful technical integrations (ready for advanced features)
- Multiple support interactions (needs priority support)
- Growing usage patterns
- Feature requests that match higher tiers
Return JSON: {"opportunity": "description", "timing": "good/excellent/poor"}"""
}, {
"role": "user",
"content": str(history)
}]
)
try:
import json
opportunity = json.loads(analysis.choices[0].message.content)
if opportunity.get('timing') in ['good', 'excellent']:
return self._create_outreach_message(opportunity)
except:
pass
return None
def _create_outreach_message(self, opportunity: Dict) -> str:
"""Create personalized outreach message"""
context = self.memory.get_customer_context(limit=10)
context_text = "\n".join([m.get('content', '') for m in context])
message = self.ai.chat.completions.create(
model="gpt-4",
messages=[{
"role": "system",
"content": f"""You are Sam, an account manager.
Customer context:
{context_text}
Opportunity:
{opportunity.get('opportunity', '')}
Create a warm, personalized message that:
1. References their recent success/activity
2. Suggests relevant upgrade
3. Explains specific benefits for THEIR use case
4. Feels helpful, not salesy"""
}]
)
return message.choices[0].message.content
def handle_billing_inquiry(self, message: str) -> str:
"""Handle billing questions with context"""
context = self.memory.get_customer_context(message)
context_text = "\n".join([m.get('content', '') for m in context])
response = self.ai.chat.completions.create(
model="gpt-4",
messages=[
{
"role": "system",
"content": f"""You are Sam, an account manager.
Customer context:
{context_text}
Handle billing inquiries professionally, referencing their history."""
},
{"role": "user", "content": message}
]
)
bot_response = response.choices[0].message.content
# Store interaction
self.memory.store_interaction(
agent_name=self.name,
content=f"Customer: {message}\nSam: {bot_response}",
interaction_type="conversation",
metadata={"category": "billing"}
)
return bot_response
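Sam can be used reactively for billing questions or on a schedule for proactive checks (a sketch; how often you run the proactive pass is up to you):
sam = SamAgent(customer_id="sarah_123", customer_name="Sarah")

# Reactive: answer a billing question with full customer context
print(sam.handle_billing_inquiry("Can I switch to annual billing?"))

# Proactive: returns an outreach message only when the timing looks right
outreach = sam.proactive_outreach()
if outreach:
    print(outreach)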
Step 5: The Squad Orchestrator
Bring it all together with seamless agent coordination:
class SupportSquad:
"""Orchestrates the entire support squad"""
def __init__(self, customer_id: str, customer_name: str):
self.customer_id = customer_id
self.customer_name = customer_name
# Initialize all agents with shared memory
self.alex = AlexAgent(customer_id, customer_name)
self.taylor = TaylorAgent(customer_id, customer_name)
self.sam = SamAgent(customer_id, customer_name)
self.memory = SharedMemory(customer_id)
def handle_message(self, message: str, current_agent: str = "alex") -> Dict:
"""Route message to appropriate agent"""
if current_agent == "alex":
result = self.alex.handle_inquiry(message)
# Check if routing needed
if result['route_to'] == 'technical':
return {
"response": result['response'],
"next_message": "I'm connecting you with Taylor, our technical specialist. They'll have all the context from our conversation.",
"next_agent": "taylor"
}
elif result['route_to'] == 'billing':
return {
"response": result['response'],
"next_message": "Let me connect you with Sam, our account manager. They'll have all your details.",
"next_agent": "sam"
}
return {"response": result['response'], "next_agent": "alex"}
elif current_agent == "taylor":
response = self.taylor.handle_technical_issue(message)
return {"response": response, "next_agent": "taylor"}
elif current_agent == "sam":
response = self.sam.handle_billing_inquiry(message)
return {"response": response, "next_agent": "sam"}
def run_proactive_outreach(self):
"""Check for proactive outreach opportunities"""
message = self.sam.proactive_outreach()
if message:
print(f"š§ Proactive outreach opportunity:\n{message}")
return message
return None
def get_customer_summary(self) -> Dict:
"""Get comprehensive customer summary"""
history = self.memory.get_full_history()
summary = {
"total_interactions": len(history),
"agents_involved": set(),
"issues_resolved": 0,
"tech_stack": [],
"preferences": []
}
for mem in history:
agent = mem.get('metadata', {}).get('agent')
if agent:
summary['agents_involved'].add(agent)
if mem.get('memory_type') == 'solution':
summary['issues_resolved'] += 1
content = mem.get('content', '')
if 'tech stack' in content.lower():
summary['tech_stack'].append(content)
elif mem.get('memory_type') == 'preference':
summary['preferences'].append(content)
return summary
# USAGE EXAMPLE: Complete Customer Journey
def demo_support_squad():
"""Demonstrate the complete support squad in action"""
# Initialize squad for customer
squad = SupportSquad(
customer_id="sarah_123",
customer_name="Sarah"
)
print("=" * 60)
print("SUPPORT SQUAD DEMO: Sarah's Journey")
print("=" * 60)
# Day 1: Initial contact with Alex
print("\nš
MONDAY 9:00 AM - First Contact")
print("-" * 60)
result1 = squad.handle_message(
"Hi, I'm trying to integrate your API into our Python application but getting authentication errors.",
current_agent="alex"
)
print(f"Alex: {result1['response']}")
# Handoff to Taylor
if result1.get('next_agent') == 'taylor':
print(f"\n{result1.get('next_message')}")
print("\nš
MONDAY 9:15 AM - Technical Specialist")
print("-" * 60)
result2 = squad.handle_message(
"Yes, I'm using Python 3.9 with the requests library. The error says 'Invalid API key'.",
current_agent="taylor"
)
print(f"Taylor: {result2['response']}")
# Day 2: Proactive outreach from Sam
print("\nš
TUESDAY 2:00 PM - Proactive Outreach")
print("-" * 60)
outreach = squad.run_proactive_outreach()
if outreach:
print(f"Sam: {outreach}")
# Summary
print("\nš CUSTOMER SUMMARY")
print("-" * 60)
summary = squad.get_customer_summary()
print(f"Total Interactions: {summary['total_interactions']}")
print(f"Agents Involved: {', '.join(summary['agents_involved'])}")
print(f"Issues Resolved: {summary['issues_resolved']}")
print(f"Tech Stack: {summary['tech_stack']}")
print("\nā
Result: Seamless experience with zero repetition!")
if __name__ == "__main__":
demo_support_squad()
The Business Impact
Before: Traditional Support
After: Support Squad with Memory
ROI: $2M+ Saved Annually
Why Memory Makes This Possible
1. Zero Context Loss in Handoffs
When Taylor takes over from Alex, they don't start from scratch. They have every detail Alex learned, making the transition seamless.
2. Proactive Intelligence
Sam can identify upgrade opportunities because they know the customer's technical success, usage patterns, and preferences - all stored in shared memory.
3. Personalization at Scale
Each agent adapts their communication style, technical level, and approach based on learned preferences - impossible without persistent memory.
4. Continuous Learning
Every interaction makes the squad smarter. Past solutions inform future responses. Customer preferences guide every interaction.
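For example, before tackling a new ticket an agent can pull every solution the squad has already recorded for similar issues (a sketch reusing the SharedMemory helper from Step 1; the query string is illustrative):
memory = SharedMemory(customer_id="sarah_123")

# Surface past fixes for similar problems so agents never rediscover a solution
past_solutions = [
    m.get('content')
    for m in memory.get_customer_context("authentication error", limit=20)
    if m.get('memory_type') == 'solution'
]
print(past_solutions)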