Semantic Search
Search by meaning, not just keywords. Semantic search understands context and finds relevant memories even when exact words don't match.
The Problem with Traditional Search
Traditional Keyword Search
Traditional databases use keyword matching. If the exact word isn't in the text, it won't be found.
Semantic Search
Semantic search understands meaning. It finds relevant content even when the exact words don't match.
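A toy comparison makes the gap concrete. The snippet below is illustrative only (plain Python, not the MemoryStack API): a literal substring match misses a memory that is clearly relevant to the query, which is exactly the failure semantic search avoids.

```python
# Toy keyword search: a simple substring match misses related wording.
memories = [
    "User prefers Python for backend work",
    "Enjoys writing unit tests",
]
query = "coding languages"

keyword_hits = [m for m in memories if query.lower() in m.lower()]
print(keyword_hits)  # [] : no memory contains the literal phrase,
# even though the first memory is clearly about a coding language.
```

A semantic search over the same memories would surface the first one, because "Python" and "coding languages" are close in meaning even though they share no words.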
How Semantic Search Works
Semantic search uses embeddings: mathematical representations of meaning. Similar concepts have similar embeddings, even if they use different words.
Text → Embedding
When you store a memory, the text is converted into a vector (array of numbers) that represents its meaning.
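This conversion is performed by an embedding model, and MemoryStack handles it for you when you store a memory. As a rough illustration of the idea only, here is a toy stand-in (`toy_embed` is made up for this sketch, not a real model or part of the API): it hashes each word into a bucket and normalizes the counts, producing a fixed-length vector from variable-length text.

```python
import hashlib
import math

def toy_embed(text: str, dim: int = 16) -> list[float]:
    """Toy stand-in for a real embedding model: hash each word into one
    of `dim` buckets, count hits, then L2-normalize. A production system
    uses a learned model that actually captures meaning."""
    vec = [0.0] * dim
    for word in text.lower().split():
        bucket = int(hashlib.md5(word.encode()).hexdigest(), 16) % dim
        vec[bucket] += 1.0
    norm = math.sqrt(sum(x * x for x in vec)) or 1.0
    return [x / norm for x in vec]

vec = toy_embed("I love programming in Python")
print(len(vec))  # 16: any input text becomes a fixed-length vector
```

The key property, shared with real embedding models, is that every text maps to a vector of the same dimensionality, so any two texts can be compared numerically.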
Query → Embedding
Your search query is also converted into an embedding.
Similarity Matching
The system finds memories with embeddings similar to your query embedding.
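Similarity between embeddings is commonly measured with cosine similarity (the cosine of the angle between the two vectors). A minimal sketch using hand-picked 3-dimensional toy vectors rather than real embeddings:

```python
import math

def cosine_similarity(a: list[float], b: list[float]) -> float:
    """Cosine of the angle between two vectors: 1.0 = same direction."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(y * y for y in b))
    return dot / (na * nb) if na and nb else 0.0

# Rank stored embeddings against a query embedding.
stored = {
    "mem_1": [0.9, 0.1, 0.0],
    "mem_2": [0.1, 0.9, 0.1],
    "mem_3": [0.8, 0.2, 0.1],
}
query = [1.0, 0.0, 0.0]

ranked = sorted(stored, key=lambda k: cosine_similarity(stored[k], query),
                reverse=True)
print(ranked)  # ['mem_1', 'mem_3', 'mem_2']
```

`mem_1` and `mem_3` point in nearly the same direction as the query, so they rank first; this ranking by vector similarity is what produces the `relevance_score` you see in search results.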
The Magic of Embeddings
Embeddings capture semantic relationships. Words like "Python", "programming language", and "coding" have similar embeddings because they're conceptually related. This is why semantic search works: it finds meaning, not just matching text.
Real-World Examples
Example 1: Customer Support
Example 2: Personal Assistant
Using Semantic Search
Basic Search
    from memorystack import MemoryStackClient

    memory = MemoryStackClient(
        api_key="your_api_key",
        user_id="user_123"
    )

    # Simple semantic search
    results = memory.search_memories(
        query="programming preferences",
        limit=5
    )

    # Process results
    for result in results['results']:
        print(f"Score: {result['relevance_score']}")
        print(f"Content: {result['content']}")
        print("---")

Advanced Filtering
    # Combine semantic search with filters
    results = memory.search_memories(
        query="authentication issues",
        memory_type="conversation",  # Only conversations
        limit=10,
        metadata_filter={
            "severity": "high",
            "resolved": False
        }
    )

    # Get only highly relevant results
    high_confidence = [
        r for r in results['results']
        if r['relevance_score'] > 0.85
    ]

Building Context for AI
    def build_ai_context(user_message: str) -> str:
        """Build rich context from semantic search"""
        # Search for relevant memories
        results = memory.search_memories(
            query=user_message,
            limit=10
        )

        # Filter by relevance
        relevant = [
            r for r in results['results']
            if r['relevance_score'] > 0.75
        ]

        # Build context string
        context_parts = []
        for mem in relevant:
            context_parts.append(
                f"[{mem['memory_type']}] {mem['content']}"
            )
        return "\n".join(context_parts)

    # Use in AI prompt
    user_msg = "Help me with authentication"
    context = build_ai_context(user_msg)
    prompt = f"""
    Context about the user:
    {context}

    User's question: {user_msg}

    Provide a helpful response based on the context.
    """

Search Strategies
Broad Search
Use general queries with a higher result cap (e.g., limit=20) to find a wide range of related memories.
Focused Search
Use specific queries with a small result cap (e.g., limit=5) to find exact information.
Multi-Query Search
Search multiple times with different queries (e.g., "settings", "config") to get comprehensive results.
Contextual Search
Include context in your query for better results (e.g., adding "in production environment" to the query).
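The multi-query strategy can be sketched as a small merge helper. This is an illustrative wrapper, not part of the MemoryStack SDK; it assumes the `{'results': [...]}` response shape shown above, plus a per-memory `'id'` field for de-duplication (that field name is an assumption). `fake_search` stands in for `memory.search_memories` so the example is self-contained.

```python
def multi_query_search(search_fn, queries, limit_per_query=5, min_score=0.75):
    """Run several semantic searches and merge the results,
    de-duplicating by memory id and keeping each memory's best score."""
    best = {}
    for q in queries:
        for r in search_fn(q, limit_per_query)['results']:
            if r['relevance_score'] < min_score:
                continue
            seen = best.get(r['id'])
            if seen is None or r['relevance_score'] > seen['relevance_score']:
                best[r['id']] = r
    return sorted(best.values(),
                  key=lambda r: r['relevance_score'], reverse=True)

# Stub standing in for memory.search_memories in this sketch.
def fake_search(query, limit):
    data = {
        "settings": [
            {'id': 'm1', 'content': 'prefers dark mode',
             'relevance_score': 0.91},
        ],
        "config": [
            {'id': 'm1', 'content': 'prefers dark mode',
             'relevance_score': 0.84},
            {'id': 'm2', 'content': 'uses YAML configs',
             'relevance_score': 0.80},
        ],
    }
    return {'results': data.get(query, [])}

merged = multi_query_search(fake_search, ["settings", "config"])
print([r['id'] for r in merged])  # ['m1', 'm2'], m1 kept once at 0.91
```

Keeping the best score per memory means a memory surfaced by several queries isn't double-counted, while still ranking ahead of memories matched only weakly.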
Best Practices
✅ Do
- Use natural language queries
- Include context in your search terms
- Filter by relevance_score (> 0.75 is good)
- Combine with memory_type filters
- Test different query phrasings
❌ Don't
- Use single-word queries (too broad)
- Expect exact keyword matching
- Ignore relevance scores
- Search without a clear intent
- Over-filter (let semantics work)
