
Example Workflows

This document provides a comprehensive collection of AgentMap workflow patterns with downloadable examples, interactive demonstrations, and detailed explanations. Each workflow includes step-by-step breakdowns and troubleshooting guidance.

Try the Interactive Playground

🎮 Want to experiment with these workflows? Use our Interactive Playground to edit, test, and visualize these workflows in real-time!

Quick Reference

| Pattern             | Complexity | Use Case              | Key Features                          |
|---------------------|------------|-----------------------|---------------------------------------|
| Linear Workflow     | ⭐         | Sequential processing | Error handling, basic flow            |
| Branching Workflow  | ⭐⭐       | Conditional logic     | Validation, error recovery            |
| Conversational Bot  | ⭐⭐⭐     | Chat applications     | Memory, continuous interaction        |
| Document Processing | ⭐⭐⭐     | File automation       | PDF processing, summarization         |
| Data Pipeline       | ⭐⭐⭐⭐   | Analytics workflows   | CSV processing, analysis              |
| API Integration     | ⭐⭐⭐⭐   | Service integration   | REST APIs, external data              |
| Parallel Processing | ⭐⭐⭐⭐⭐ | High-performance      | Concurrent execution, synchronization |

Simple Linear Workflow

Pattern: Sequential node execution with basic error handling
Best For: Learning AgentMap basics, simple automation tasks
Estimated Time: 5 minutes to implement

Overview

A basic workflow where nodes execute in sequence, perfect for understanding AgentMap fundamentals.

workflows/linear_workflow.csv
GraphName,Node,Edge,Context,AgentType,Success_Next,Failure_Next,Input_Fields,Output_Field,Prompt
LinearFlow,Start,,Starting node,echo,Process,ErrorHandler,input,initial_data,
LinearFlow,Process,,Processing node,default,End,ErrorHandler,initial_data,processed_data,Processing: {initial_data}
LinearFlow,End,,Final node,echo,,ErrorHandler,processed_data,final_output,
LinearFlow,ErrorHandler,,Handle errors,echo,,,error,error_message,Error occurred: {error}
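The Process node's prompt, `Processing: {initial_data}`, is a template filled from the node's Input_Fields. As a rough sketch (assuming simple `str.format`-style substitution; AgentMap's actual template engine may behave differently), the idea is:

```python
# Hedged sketch: fill {field} placeholders in a node prompt from workflow state.
# The substitution rule here is an assumption for illustration.
def render_prompt(template: str, state: dict) -> str:
    """Substitute {field} placeholders with values from the workflow state."""
    return template.format(**state)

state = {"initial_data": "hello world"}
print(render_prompt("Processing: {initial_data}", state))
# Processing: hello world
```

Any field named in a node's Input_Fields column can appear as a placeholder in that node's Prompt column.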

Branching Workflow with Error Handling

Pattern: Conditional execution with comprehensive error recovery
Best For: Validation workflows, quality control processes
Estimated Time: 10 minutes to implement

Overview

This workflow demonstrates conditional branching based on validation results, with robust error handling patterns.

workflows/branching_workflow.csv
GraphName,Node,Edge,Context,AgentType,Success_Next,Failure_Next,Input_Fields,Output_Field,Prompt
BranchFlow,Start,,Starting node,echo,Validate,ErrorHandler,input,initial_data,
BranchFlow,Validate,,{"validation_rules": ["required", "min_length:5"]},default,Process,HandleInvalid,initial_data,validation_result,Validating: {initial_data}
BranchFlow,Process,,Process valid data,default,End,ErrorHandler,initial_data,processed_data,Processing valid data: {initial_data}
BranchFlow,HandleInvalid,,Handle invalid input,default,RequestNew,ErrorHandler,initial_data,error_context,Invalid input: {initial_data}
BranchFlow,RequestNew,,Request new input,input,Validate,ErrorHandler,,new_input,Please provide valid input (min 5 characters):
BranchFlow,End,,Completion node,echo,,ErrorHandler,processed_data,final_output,
BranchFlow,ErrorHandler,,Handle system errors,echo,End,,error,error_message,System error: {error}
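The Validate node's context declares `{"validation_rules": ["required", "min_length:5"]}`, and the routing sends passing input to Process and failing input to HandleInvalid. A hypothetical interpreter for those rule strings might look like the following (the rule names come from the CSV above, but the semantics here are assumptions, not the actual AgentMap validator):

```python
# Illustrative validator for rule strings like "required" and "min_length:5".
def validate(value: str, rules: list[str]) -> bool:
    for rule in rules:
        name, _, arg = rule.partition(":")  # split "min_length:5" into name/arg
        if name == "required" and not value:
            return False
        if name == "min_length" and len(value) < int(arg):
            return False
    return True

rules = ["required", "min_length:5"]
print(validate("hello", rules))  # True  -> route to Process
print(validate("hi", rules))     # False -> route to HandleInvalid
```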

LLM Chain with Memory

Pattern: Conversational AI with persistent memory
Best For: Chatbots, interactive assistants, customer support
Estimated Time: 15 minutes to implement

Overview

A sophisticated conversational workflow that maintains context across multiple interactions using AgentMap's memory system.

workflows/chat_workflow.csv
GraphName,Node,Edge,Context,AgentType,Success_Next,Failure_Next,Input_Fields,Output_Field,Prompt
ChatFlow,UserInput,,{"memory_key":"chat_memory","max_memory_messages":10},input,Process,ErrorHandler,,user_message,💬 You:
ChatFlow,Process,,{"preserve_context": true},echo,Respond,ErrorHandler,user_message|chat_memory,processed_input,
ChatFlow,Respond,,{"provider": "openai", "model": "gpt-4", "temperature": 0.7},llm,UpdateMemory,ErrorHandler,processed_input|chat_memory,ai_response,You are a helpful and friendly assistant. Previous conversation: {chat_memory}\n\nUser: {processed_input}
ChatFlow,UpdateMemory,,{"memory_key":"chat_memory","append_format":"User: {user_message}\nAssistant: {ai_response}"},memory,Format,ErrorHandler,user_message|ai_response|chat_memory,updated_memory,
ChatFlow,Format,,Format the response,default,UserInput,ErrorHandler,ai_response|updated_memory,formatted_response,🤖 Assistant: {ai_response}
ChatFlow,ErrorHandler,,Handle conversation errors,echo,UserInput,,error,error_message,❌ Error: {error}
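The UpdateMemory node appends each exchange using the configured `append_format` and the memory buffer is capped at `max_memory_messages`. A simplified sketch of that behavior, assuming one string per turn (AgentMap's real memory system manages this internally):

```python
# Sketch of the UpdateMemory step: append the latest exchange, then trim the
# buffer to the most recent max_messages turns.
def update_memory(memory: list[str], user_message: str, ai_response: str,
                  max_messages: int = 10) -> list[str]:
    memory = memory + [f"User: {user_message}\nAssistant: {ai_response}"]
    return memory[-max_messages:]  # keep only the most recent turns

mem = []
mem = update_memory(mem, "Hi", "Hello! How can I help?")
print(len(mem))  # 1
```

Capping the buffer keeps prompt size bounded, which matters for both latency and token cost in long-running conversations.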

Document Processing Workflow

Pattern: Automated document analysis and summarization
Best For: Content processing, research assistance, report generation
Estimated Time: 20 minutes to implement

Overview

A comprehensive workflow for processing documents with AI-powered analysis, supporting multiple file formats and output options.

workflows/document_processor.csv
GraphName,Node,Edge,Context,AgentType,Success_Next,Failure_Next,Input_Fields,Output_Field,Prompt
DocFlow,GetFile,,Get document path,input,ValidateFile,ErrorHandler,,file_path,📄 Enter the path to your document:
DocFlow,ValidateFile,,{"allowed_types": ["pdf", "docx", "txt", "md"]},file_validator,ReadFile,ErrorHandler,file_path,validation_result,
DocFlow,ReadFile,,{"should_split": true, "chunk_size": 1000, "overlap": 200},file_reader,ExtractMetadata,ErrorHandler,file_path,documents,
DocFlow,ExtractMetadata,,Extract document metadata,default,AnalyzeContent,ErrorHandler,documents,metadata,Extracting metadata from {documents}
DocFlow,AnalyzeContent,,{"provider": "anthropic", "model": "claude-3-sonnet-20240229", "temperature": 0.3},llm,GenerateSummary,ErrorHandler,documents|metadata,content_analysis,Analyze this document content for key themes, main arguments, and important information: {documents}
DocFlow,GenerateSummary,,{"provider": "openai", "model": "gpt-4", "temperature": 0.5},llm,CreateReport,ErrorHandler,documents|content_analysis,executive_summary,Create a comprehensive executive summary based on this analysis: {content_analysis}. Include key findings, recommendations, and action items.
DocFlow,CreateReport,,{"template": "document_report", "format": "markdown"},report_generator,SaveReport,ErrorHandler,executive_summary|content_analysis|metadata,formatted_report,
DocFlow,SaveReport,,{"output_dir": "outputs/reports", "timestamp": true},file_writer,NotifyComplete,ErrorHandler,formatted_report|file_path,save_result,
DocFlow,NotifyComplete,,Notify completion,echo,End,ErrorHandler,save_result,completion_message,✅ Document analysis complete! Report saved to: {save_result}
DocFlow,End,,Workflow complete,echo,,,completion_message,final_output,
DocFlow,ErrorHandler,,Handle processing errors,echo,End,,error,error_message,❌ Error processing document: {error}
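The ReadFile node splits documents with `{"should_split": true, "chunk_size": 1000, "overlap": 200}`. The overlap keeps context that straddles a chunk boundary visible in both chunks. A minimal character-based sketch (the actual file_reader agent may split on token or sentence boundaries instead):

```python
# Illustrative chunker: fixed-size windows that advance by (chunk_size - overlap).
def split_text(text: str, chunk_size: int = 1000, overlap: int = 200) -> list[str]:
    chunks, step = [], chunk_size - overlap
    for start in range(0, len(text), step):
        chunks.append(text[start:start + chunk_size])
        if start + chunk_size >= len(text):
            break  # the last window already covers the end of the text
    return chunks

chunks = split_text("a" * 2500, chunk_size=1000, overlap=200)
print([len(c) for c in chunks])  # [1000, 1000, 900]
```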

Data Processing Pipeline

Pattern: Extract, Transform, Load (ETL) workflows
Best For: Business intelligence, data analytics, reporting automation
Estimated Time: 25 minutes to implement

Overview

A sophisticated data processing pipeline that handles CSV data extraction, transformation, analysis, and automated reporting.

workflows/data_pipeline.csv
GraphName,Node,Edge,Context,AgentType,Success_Next,Failure_Next,Input_Fields,Output_Field,Prompt
DataFlow,LoadData,,{"file_type": "csv", "headers": true, "encoding": "utf-8"},file_reader,ValidateSchema,ErrorHandler,file_path,raw_data,
DataFlow,ValidateSchema,,{"required_columns": ["id", "date", "value"], "data_types": {"id": "int", "value": "float"}},schema_validator,CleanData,ErrorHandler,raw_data,validation_result,
DataFlow,CleanData,,{"operations": ["remove_nulls", "standardize_dates", "normalize_values"]},data_cleaner,TransformData,ErrorHandler,raw_data,clean_data,
DataFlow,TransformData,,{"aggregations": ["sum", "mean", "count"], "group_by": ["category", "date"]},data_transformer,AnalyzeData,ErrorHandler,clean_data,transformed_data,
DataFlow,AnalyzeData,,{"provider": "openai", "model": "gpt-4", "temperature": 0.2},llm,GenerateInsights,ErrorHandler,transformed_data,statistical_analysis,Analyze this dataset and provide statistical insights, trends, and anomalies: {transformed_data}
DataFlow,GenerateInsights,,{"provider": "anthropic", "model": "claude-3-sonnet-20240229", "temperature": 0.4},llm,CreateVisualizations,ErrorHandler,statistical_analysis|transformed_data,business_insights,Based on this statistical analysis: {statistical_analysis}, generate actionable business insights and recommendations for the data: {transformed_data}
DataFlow,CreateVisualizations,,{"chart_types": ["line", "bar", "scatter"], "output_format": "png"},chart_generator,GenerateReport,ErrorHandler,transformed_data|business_insights,charts,
DataFlow,GenerateReport,,{"template": "business_report", "include_charts": true},report_generator,SaveResults,ErrorHandler,business_insights|statistical_analysis|charts,final_report,
DataFlow,SaveResults,,{"formats": ["csv", "excel", "pdf"], "output_dir": "outputs/analytics"},multi_format_writer,NotifyStakeholders,ErrorHandler,final_report|transformed_data,save_results,
DataFlow,NotifyStakeholders,,{"recipients": ["analytics@company.com"], "template": "data_pipeline_complete"},email_notifier,End,ErrorHandler,save_results|final_report,notification_result,
DataFlow,End,,Pipeline complete,echo,,,notification_result,completion_status,
DataFlow,ErrorHandler,,Handle processing errors,echo,End,,error,error_message,Pipeline failed: {error}
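The CleanData node lists its operations as `["remove_nulls", "standardize_dates", "normalize_values"]`. The operation names come from the CSV above; the implementations below are assumptions, sketched over rows of dicts matching the required columns (`id`, `date`, `value`):

```python
from datetime import datetime

# Minimal sketch of the three cleaning operations, in order.
def clean_rows(rows: list[dict]) -> list[dict]:
    # remove_nulls: drop rows with any missing value
    rows = [r for r in rows if all(v is not None for v in r.values())]
    # standardize_dates: coerce to ISO 8601 (assumes MM/DD/YYYY input)
    for r in rows:
        r["date"] = datetime.strptime(r["date"], "%m/%d/%Y").date().isoformat()
    # normalize_values: min-max scale into [0, 1]
    values = [r["value"] for r in rows]
    lo, hi = min(values), max(values)
    for r in rows:
        r["value"] = (r["value"] - lo) / (hi - lo) if hi > lo else 0.0
    return rows

rows = [
    {"id": 1, "date": "01/15/2024", "value": 10.0},
    {"id": 2, "date": "02/20/2024", "value": 30.0},
    {"id": 3, "date": None, "value": None},
]
print(clean_rows(rows))
```

Removing nulls first matters: the date and value operations would otherwise fail on the incomplete third row.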

Integration with External Services

Pattern: API-driven workflows with external service integration
Best For: Third-party integrations, data synchronization, automated workflows
Estimated Time: 30 minutes to implement

workflows/api_integration.csv
GraphName,Node,Edge,Context,AgentType,Success_Next,Failure_Next,Input_Fields,Output_Field,Prompt
APIFlow,GetQuery,,Get search query,input,PrepareRequest,ErrorHandler,,search_query,🔍 Enter your search query:
APIFlow,PrepareRequest,,{"api_version": "v1", "timeout": 30},request_builder,MakeRequest,ErrorHandler,search_query,api_params,
APIFlow,MakeRequest,,{"url": "https://api.example.com/search", "method": "GET", "headers": {"Authorization": "Bearer ${API_TOKEN}"}},rest_client,ProcessResponse,RetryRequest,api_params,api_response,
APIFlow,RetryRequest,,{"max_retries": 3, "backoff": "exponential"},retry_handler,MakeRequest,ErrorHandler,api_params,retry_context,
APIFlow,ProcessResponse,,Extract and validate response,response_processor,EnrichData,ErrorHandler,api_response,processed_data,
APIFlow,EnrichData,,{"provider": "openai", "temperature": 0.3},llm,FormatResults,ErrorHandler,processed_data,enriched_data,Analyze and enhance this API response data with additional insights: {processed_data}
APIFlow,FormatResults,,{"output_format": "structured", "include_metadata": true},formatter,CacheResults,ErrorHandler,enriched_data,formatted_results,
APIFlow,CacheResults,,{"ttl": 3600, "key_prefix": "api_results"},cache_writer,NotifyWebhook,ErrorHandler,formatted_results|search_query,cache_result,
APIFlow,NotifyWebhook,,{"webhook_url": "${WEBHOOK_URL}", "method": "POST"},webhook_client,End,ErrorHandler,formatted_results,notification_result,
APIFlow,End,,Display final results,echo,,,formatted_results,display_output,
APIFlow,ErrorHandler,,Handle API errors,echo,End,,error,error_message,🚨 API Error: {error}
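The MakeRequest/RetryRequest pair implements `{"max_retries": 3, "backoff": "exponential"}`: failed calls loop back through the retry handler with a growing delay before giving up to ErrorHandler. A plain-Python sketch of that loop (the delays and structure are illustrative; the retry_handler agent manages this internally):

```python
import time

# Retry a callable with exponential backoff: delays of base, 2*base, 4*base, ...
def call_with_retry(request_fn, max_retries: int = 3, base_delay: float = 1.0):
    for attempt in range(max_retries + 1):
        try:
            return request_fn()
        except Exception:
            if attempt == max_retries:
                raise  # retries exhausted: route to ErrorHandler
            time.sleep(base_delay * (2 ** attempt))

attempts = {"n": 0}
def flaky():
    # Hypothetical request that fails twice, then succeeds.
    attempts["n"] += 1
    if attempts["n"] < 3:
        raise ConnectionError("transient failure")
    return {"status": 200}

print(call_with_retry(flaky, max_retries=3, base_delay=0.01))  # {'status': 200}
```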

Parallel Processing with Join

Pattern: High-performance concurrent execution with synchronization
Best For: Performance-critical workflows, bulk processing, independent task execution
Estimated Time: 35 minutes to implement

workflows/parallel_processing.csv
GraphName,Node,Edge,Context,AgentType,Success_Next,Failure_Next,Input_Fields,Output_Field,Prompt
ParallelFlow,Start,,Initialize workflow,echo,SplitTasks,ErrorHandler,input,initial_data,
ParallelFlow,SplitTasks,,{"split_strategy": "round_robin", "worker_count": 3},task_splitter,ProcessA|ProcessB|ProcessC,ErrorHandler,initial_data,task_batches,
ParallelFlow,ProcessA,,{"worker_id": "A", "timeout": 60},parallel_processor,CollectA,ErrorHandler,task_batches,result_a,
ParallelFlow,ProcessB,,{"worker_id": "B", "timeout": 60},parallel_processor,CollectB,ErrorHandler,task_batches,result_b,
ParallelFlow,ProcessC,,{"worker_id": "C", "timeout": 60},parallel_processor,CollectC,ErrorHandler,task_batches,result_c,
ParallelFlow,CollectA,,Collect results from A,result_collector,WaitForAll,ErrorHandler,result_a,collected_a,
ParallelFlow,CollectB,,Collect results from B,result_collector,WaitForAll,ErrorHandler,result_b,collected_b,
ParallelFlow,CollectC,,Collect results from C,result_collector,WaitForAll,ErrorHandler,result_c,collected_c,
ParallelFlow,WaitForAll,,{"timeout": 300, "required_results": ["collected_a", "collected_b", "collected_c"]},synchronizer,MergeResults,ErrorHandler,collected_a|collected_b|collected_c,sync_result,
ParallelFlow,MergeResults,,{"merge_strategy": "concatenate", "preserve_order": true},result_merger,AnalyzeCombined,ErrorHandler,sync_result,merged_results,
ParallelFlow,AnalyzeCombined,,{"provider": "openai", "temperature": 0.2},llm,GenerateReport,ErrorHandler,merged_results,analysis_result,Analyze these parallel processing results and provide insights on performance and outcomes: {merged_results}
ParallelFlow,GenerateReport,,{"template": "parallel_execution_report"},report_generator,End,ErrorHandler,analysis_result|merged_results,final_report,
ParallelFlow,End,,Workflow complete,echo,,,final_report,completion_status,
ParallelFlow,ErrorHandler,,Handle processing errors,echo,End,,error,error_message,Parallel processing failed: {error}
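The split/process/join shape above can be sketched with `concurrent.futures`: round-robin batching mirrors the SplitTasks context, and `pool.map` plays the role of the synchronizer by blocking until every worker finishes. This is a simplification; the actual task_splitter and synchronizer agents add timeouts and error routing.

```python
from concurrent.futures import ThreadPoolExecutor

# Round-robin split: item i goes to worker i % worker_count.
def split_round_robin(items: list, worker_count: int = 3) -> list[list]:
    return [items[i::worker_count] for i in range(worker_count)]

def process_batch(batch: list) -> list:
    return [item * 2 for item in batch]  # stand-in for real per-item work

def run_parallel(items: list, worker_count: int = 3) -> list:
    batches = split_round_robin(items, worker_count)
    with ThreadPoolExecutor(max_workers=worker_count) as pool:
        results = list(pool.map(process_batch, batches))  # join: waits for all
    return [x for batch in results for x in batch]  # merge_strategy: concatenate

print(run_parallel([1, 2, 3, 4, 5, 6]))  # [2, 8, 4, 10, 6, 12]
```

Note the merged order reflects the round-robin split, not the original input order; use a different split or merge strategy if ordering matters.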

Best Practices and Troubleshooting

Design Principles

  1. Error Handling: Always include error handling nodes with descriptive messages
  2. Logging: Use structured logging with appropriate levels (DEBUG, INFO, ERROR)
  3. Modularity: Design reusable components that can be combined in different workflows
  4. Testing: Test individual nodes before combining into complex workflows
  5. Documentation: Use Context fields to document node purposes and requirements

Common Patterns

Error recovery patterns
# Retry Pattern
Flow,Process,,{"max_retries": 3, "backoff": "exponential"},processor,Success,RetryHandler,data,result,
Flow,RetryHandler,,Check retry count,retry_checker,Process,FinalError,data|retry_count,retry_decision,
Flow,FinalError,,Handle final failure,error_handler,End,,error,final_error,

# Circuit Breaker Pattern
Flow,CheckHealth,,{"failure_threshold": 5, "timeout": 60},circuit_breaker,Process,Fallback,data,health_status,
Flow,Process,,Main processing,processor,Success,Fallback,data,result,
Flow,Fallback,,{"strategy": "cached_response"},fallback_handler,Success,,data,fallback_result,

# Graceful Degradation
Flow,TryPrimary,,Primary service call,primary_service,Success,TrySecondary,data,primary_result,
Flow,TrySecondary,,Secondary service call,secondary_service,Success,BasicResponse,data,secondary_result,
Flow,BasicResponse,,Basic fallback response,basic_handler,Success,,data,basic_result,
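The graceful-degradation chain above reads naturally as plain Python: try each service in order and fall through on failure, with the basic handler as a last resort that never fails. The service functions here are hypothetical stand-ins.

```python
# Try handlers in order; the last one is the guaranteed fallback.
def with_fallbacks(data, handlers):
    for handler in handlers[:-1]:
        try:
            return handler(data)
        except Exception:
            continue  # degrade to the next handler
    return handlers[-1](data)

def primary(d): raise TimeoutError("primary down")     # hypothetical outage
def secondary(d): return f"secondary:{d}"
def basic(d): return f"basic:{d}"                      # never fails

print(with_fallbacks("query", [primary, secondary, basic]))  # secondary:query
```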

Common Issues and Solutions

Issue: Workflow stops unexpectedly

Symptoms: Workflow execution halts without clear error message

Solutions:

  1. Enable debug logging: agentmap run --log-level DEBUG
  2. Add echo agents between nodes to trace data flow
  3. Check that all nodes have valid Success_Next/Failure_Next routing
  4. Verify agent types are correctly specified
# Debug command with verbose output
agentmap run --graph MyFlow --csv my_workflow.csv --log-level DEBUG --verbose

Issue: Agent not found error

Symptoms: "Agent type 'custom:MyAgent' not found"

Solutions:

  1. Verify custom agent is in the configured directory
  2. Check agent class name matches filename (CamelCase)
  3. Ensure agent inherits from BaseAgent
  4. Validate agent registration in config
# Correct agent structure: CamelCase class name, inherits from BaseAgent
class MyCustomAgent(BaseAgent):
    def process(self, inputs):
        # Your processing logic; return the node's output value
        result = inputs  # replace with real processing
        return result

Issue: Memory leaks in long-running workflows

Symptoms: Increasing memory usage over time

Solutions:

  1. Limit memory buffer size in LLM agents
  2. Clear large data objects after processing
  3. Use streaming for large file processing
  4. Monitor memory usage with system agents
# Memory-optimized LLM configuration
Flow,Process,,{"memory_key": "chat", "max_memory_messages": 10, "clear_after": 100},llm,Next,,input,output,

Download Examples

All the workflows in this guide are available as downloadable files:

Individual Workflows

Complete Workflow Package


Next Steps

After working through these examples, explore these advanced topics:

Get Help

💬 Need assistance? Join our Discord community or check the troubleshooting guide for common issues and solutions.