
Introduction

Function-as-a-Service (FaaS) is a compelling cloud computing model that brings the "serverless" vision closer: developers deploy, run, and manage granular code fragments and services without worrying about infrastructure details, and are charged per use. Applications assembled using a serverless architecture enjoy numerous advantages over traditional architectures, such as scalability, fine-grained billing, and low operational costs. However, serverless adoption is sometimes slow due to the complexity of orchestrating FaaS and the risk of "vendor lock-in", as porting an application between vendors requires considerable effort and cost.


A study performed by the authors ("Pattern-based Serverless Data Processing Pipeline using Function-as-a-Service (FaaS) Orchestration Systems") proposes a methodology for composing serverless applications with FaaS orchestration systems using enterprise integration and workflow patterns, producing reusable software design solutions that standardize the integration process, reduce development costs, and improve code quality. This website lists a detailed description of the 20 identified patterns using a problem-first approach. The patterns are classified into three categories: construct patterns define the essential components necessary for constructing the workflow, control flow patterns are responsible for workflow traversal, and function-specific patterns are implemented inside the functions themselves.


We encourage users to evaluate the proposed methodology and contribute to the list of patterns. Currently, we have listed the corresponding patterns for three diverse state-of-the-art FaaS orchestration systems: AWS Step Functions (ASF), Zeebe, and Azure Durable Functions (ADF).

Patterns

1. Process Manager

Figure: Process Manager

Problem

How does a serverless workflow consisting of multiple functions and conditions determine the path a message should flow through?


Decision

The Process Manager acts as a central processing component for the system. As workflows are influenced by each step's output message, execution states need to be maintained, and, based on the result, the succeeding component is invoked.


Source

[Hohpe and Woolf 2004]


Pattern

Enterprise Integration Pattern


Type

Construct


Synonyms

-


Mapping

AWS Step Functions

Figure: Process Manager
States can be orchestrated using an ASF State Machine.

ASF snippet:
                
{
   "Comment":"ASF Template",
   "StartAt":"Function",
   "States":{
      "Function":{
         "Type":"Pass",
         "End":true
      }
   }
}
                
              

Zeebe

Figure: Process Manager
The "Process Manager" pattern for Zeebe is the broker coordinating the various tasks in the workflow. Here the various tasks are associated with their corresponding hosted function.

Azure Durable Functions

Figure: Process Manager
Here the message-routing "Process Manager" pattern for ADF is presented: the various functions are orchestrated by the primary Orchestration Function.
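
As an illustration, the below minimal sketch (the activity function names and the isValid flag are hypothetical, not part of the study) shows how the Orchestration Function maintains the execution state and invokes the succeeding function based on the previous step's result.

ADF code snippet:

const df = require("durable-functions");

module.exports = df.orchestrator(function* (context) {
    // The orchestrator acts as the Process Manager: it maintains the
    // execution state and routes to the next function based on the output.
    const result = yield context.df.callActivity("function1", context.df.getInput());
    if (result.isValid) {
        return yield context.df.callActivity("function2", result);
    }
    return yield context.df.callActivity("function3", result);
});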

2. Event and Document Message

Figure: Event and Document Message

Problem

How can the serverless workflow and its involved functions be executed/triggered?


Decision

External services or clients can invoke the serverless data processing workflow with an Event Message. Furthermore, Event Messages can be used to invoke other workflows or services. As functions are considered black boxes, the Document Message, which carries the structured data, is the optimal choice for communication between internal states/functions.


Source

[Hohpe and Woolf 2004]


Pattern

Enterprise Integration Pattern


Type

Construct


Synonyms

-


Mapping

AWS Step Functions

Figure: Event Document Message
ASF can be triggered by an event message via the API Gateway. The various states in ASF are traversed using a document message, which is a JSON-structured message.
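
For illustration, the below sketch shows how a client-side event message could start an ASF execution directly with the AWS SDK; in practice, the API Gateway integration typically issues the equivalent StartExecution call on the client's behalf. The payload fields are hypothetical.

AWS SDK code snippet:

const AWS = require("aws-sdk");
const stepFunctions = new AWS.StepFunctions();

const params = {
    stateMachineArn: "arn:aws:states:REGION:ACCOUNT_ID:stateMachine:STATE_MACHINE_NAME",
    // The input acts as the document message traversing the states
    input: JSON.stringify({ orderId: "1234" })
};

// The StartExecution call is the event message that triggers the workflow
stepFunctions.startExecution(params).promise()
    .then((data) => console.log("Execution started:", data.executionArn))
    .catch(console.error);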

Zeebe

Figure: Event Document Message
In Zeebe, the Event and Document Message constructs invoke the workflow and handle the internal communication between elements, respectively. A client can invoke the intermediary Zeebe client, which in turn starts the BPMN 2.0 Zeebe workflow via gRPC. Internally, the workflow uses variables and JSON messages to interact with the states.
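
A minimal sketch of such a client, assuming the community zeebe-node library of this era (the process id and variables are hypothetical):

Zeebe client snippet:

const ZB = require("zeebe-node");

(async () => {
    const zbc = new ZB.ZBClient(); // connects to the Zeebe broker via gRPC
    // Event message: start an instance of the deployed BPMN workflow;
    // the variables act as the document message for the internal states
    const result = await zbc.createWorkflowInstance("Zeebe_Process", {
        payload: "document message",
    });
    console.log("Started workflow instance:", result.workflowInstanceKey);
    await zbc.close();
})();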

Azure Durable Functions

Figure: Event Document Message
In ADF, the Event message construct invokes the orchestration function, and the Document message handles the internal message communication between the functions, as depicted by the figure.
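
As a sketch (the orchestrator name is hypothetical), the below HTTP-triggered starter function shows how an incoming request acts as the event message that launches the orchestration, while the request body becomes the document message.

ADF code snippet:

const df = require("durable-functions");

module.exports = async function (context, req) {
    const client = df.getClient(context);
    // Event message: the HTTP request starts the orchestration;
    // the request body is passed on as the document message
    const instanceId = await client.startNew("OrchestratorFunction", undefined, req.body);
    context.log(`Started orchestration with ID = '${instanceId}'.`);
    return client.createCheckStatusResponse(context.bindingData.req, instanceId);
};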

3. Message Endpoint

Figure: Message Endpoint

Problem

How are functions represented in a serverless workflow, and how are they connected?


Decision

With the Message Endpoint construct, the various functions do not need to be aware of the message formats, channels, or other functions present in the serverless workflow. A function only needs to know that it will receive requests; it processes them and sends the acknowledgment/response back to the system.


Source

[Hohpe and Woolf 2004]


Pattern

Enterprise Integration Pattern


Type

Construct


Synonyms

-


Mapping

AWS Step Functions

In ASF, Message Endpoints are mapped to states of the "Task" type. The Task state has the following required field:
  • Resource: ARN that uniquely identifies the specific AWS Lambda to execute.


ASF snippet:
              
{
   "State":{
      "Type":"Task",
      "Resource":"arn:aws:states:::lambda:invoke",
      "Parameters":{
         "FunctionName":"arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME",
         "Payload":{
            "Input.$":"$"
         }
      },
      "Next":"NEXT_STATE"
   }
}
              
            

Zeebe

The Message Endpoint construct, which accepts and processes messages, is mapped to the "Service Task" with Type = "lambda" (depending on the FaaS provider). The below figure illustrates how this construct can be used in the BPMN 2.0 Zeebe Modeler.
Figure: Message Endpoint

Zeebe BPMN snippet:
              
<?xml version="1.0" encoding="UTF-8"?>
<bpmn:definitions xmlns:bpmn="http://www.omg.org/spec/BPMN/20100524/MODEL" xmlns:bpmndi="http://www.omg.org/spec/BPMN/20100524/DI" xmlns:dc="http://www.omg.org/spec/DD/20100524/DC" xmlns:zeebe="http://camunda.org/schema/zeebe/1.0" id="Definitions_0dmi4p0" targetNamespace="http://bpmn.io/schema/bpmn" exporter="Zeebe Modeler" exporterVersion="0.11.0">
<bpmn:process id="Zeebe_Process" name="Zeebe Model" isExecutable="true">
    <bpmn:serviceTask id="ServiceTask_Lambda" name="Service Task">
    <bpmn:extensionElements>
        <zeebe:taskDefinition type="lambda" />
    </bpmn:extensionElements>
    </bpmn:serviceTask>
</bpmn:process>
<bpmndi:BPMNDiagram id="BPMNDiagram_1">
    <bpmndi:BPMNPlane id="BPMNPlane_1" bpmnElement="Zeebe_Process">
    <bpmndi:BPMNShape id="Activity_079frpn_di" bpmnElement="ServiceTask_Lambda">
        <dc:Bounds x="160" y="80" width="100" height="80" />
    </bpmndi:BPMNShape>
    </bpmndi:BPMNPlane>
</bpmndi:BPMNDiagram>
</bpmn:definitions>
              
            

Azure Durable Functions

The Message Endpoint construct, which receives and processes messages, is realized by the Activity Function. The functions must be idempotent, as ADF follows an at-least-once execution strategy. The below code snippet illustrates how this construct can be used in ADF.

ADF code snippet:
              
                const result = yield context.df.callActivity("ActivityFunction", "Payload")
              
            

4. Pipes and Filters

Figure: Pipes and Filters

Problem

How to decompose a task that performs complex processing into a series of separate elements that can be reused?


Decision

Pipes and Filters helps implement complex processing in a granular, independent, resilient, and sequential manner. Moreover, the fundamental building blocks of serverless workflows are functions, and each function in the pipeline is generally responsible for a small transaction, making this pattern a natural fit.


Source

[Hohpe and Woolf 2004]


Pattern

Enterprise Integration Pattern


Type

Construct


Synonyms

Sequence, Sequential routing, Serial Routing


Mapping

AWS Step Functions

ASF externalizes the Pipes and Filters pattern, as represented in the below figure and code snippet, by extracting the coordination out of the filter implementations into a state machine that orchestrates the sequence of events.
Figure: Pipes and Filters

ASF code snippet:
             
{
  "Comment":"Pipes And Filter Pattern",
  "StartAt":"State 1",
  "States":{
     "State 1":{
        "Type":"Task",
        "Resource":"arn:aws:states:::lambda:invoke",
        "Parameters":{
           "FunctionName":"arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME"
        },
        "Next":"State 2"
     },
     "State 2":{
        "Type":"Task",
        "Resource":"arn:aws:states:::lambda:invoke",
        "Parameters":{
           "FunctionName":"arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME"
        },
        "End":true
     }
  }
}
             
           

Zeebe

The below figure shows how each Service Task (filter) performs only one distinct operation, while the pipes, realized as sequence/message flows, coordinate the various tasks.
Figure: Pipes and Filters

Azure Durable Functions

Pipes and Filters plays an essential role in standardizing workflow execution; in ADF this is referred to as the Function Chaining pattern. The below code snippet shows how each Activity Function (filter) performs one distinct operation, while the pipes, realized as JSON messages, coordinate the various functions.

ADF code snippet:
                
import * as df from "durable-functions"

module.exports = df.orchestrator(function* (context) {
    try {
        const function1Result = yield context.df.callActivity("function1", context.df.getInput())
        const function2Result = yield context.df.callActivity("function2", function1Result)
        const function3Result = yield context.df.callActivity("function3", function2Result)
        return function3Result;
    }
    catch (error) {
        console.error(error)
    }
});
                
              

5. Multicast

Figure: Multicast

Problem

How will the serverless workflow route the same message to several endpoints and process them differently?


Decision

A Multicast pattern is used to model the execution of parallel flows/concurrency by sending a copy of the same message to multiple recipients without checking any conditions. Here all outgoing flows are executed at the same time.


Source

[Ibsen and Anstey 2010]


Pattern

Enterprise Integration Pattern


Type

Control Flow


Synonyms

Parallel Split, AND-Split, Parallel Routing, Fork


Mapping

AWS Step Functions

The depicted figure and code snippet show how the Multicast pattern maps to the Parallel state offered by ASF. The mandatory field when configuring this state is Branches. Branches supports one to multiple paths, with each path consisting of one or many state transitions. In the Parallel state, each branch is provided with a copy of the input data. The pattern does not require each branch to produce outputs with a homogeneous structure; however, for a data processing pipeline's simplicity and ease of operations, each branch should produce outputs complying with a uniform format.
Figure: Multicast

ASF code snippet:
 
{
  "Comment":"Multicast Pattern",
  "StartAt":"MulticastState",
  "States":{
     "MulticastState":{
        "Type":"Parallel",
        "Branches":[
           {
              "StartAt":"Function 1",
              "States":{
                 "Function 1":{
                    "Type":"Pass",
                    "End":true
                 }
              }
           },
           {
              "StartAt":"Function 2",
              "States":{
                 "Function 2":{
                    "Type":"Pass",
                    "End":true
                 }
              }
           }
        ],
        "End":true
     }
  }
}
 

Zeebe

With the "Parallel Gateway", the Multicast pattern can be implemented using BPMN 2.0 Zeebe Modeler. The below figure shows how the same message is transferred to two functions parallelly.
Figure: Multicast

Azure Durable Functions

The Multicast pattern is implemented in ADF as shown in the below code snippet. In this implementation, the same data is sent to multiple Activity Functions, which are executed simultaneously.

ADF code snippet:
    
const df = require("durable-functions");

module.exports = df.orchestrator(function* (context) {
    const parallelTasks = [];

    // Get input
    const data = context.df.getInput()

    // Perform parallel processing
    parallelTasks.push(context.df.callActivity("function1", data));
    parallelTasks.push(context.df.callActivity("function2", data));

    const arrayParallelTasksResult = yield context.df.Task.all(parallelTasks);

    return arrayParallelTasksResult
});
    
  

6. Content-based Router

Figure: Content-based Router

Problem

Functions must be orchestrated to adhere to a process flow to generate an error-free/desired output. How can the messages be routed to the correct workflow execution path within the workflow based on the message content?


Decision

A Content-based Router helps in controlling the workflow based on the message content. Each outgoing flow connected from the router corresponds to a condition, and the flow with the satisfied condition is traversed. Based on the condition, one or many flows can be traversed. In this pattern, the router examines the message content using numerous criteria like fields, values, and conditions before routing to the appropriate path.


Source

[Hohpe and Woolf 2004]


Pattern

Enterprise Integration Pattern


Type

Control Flow


Synonyms

Exclusive Choice, XOR-Split, Conditional Routing, Switch, Decision, Selection and OR-Split


Mapping

AWS Step Functions

ASF offers a Choice state, shown in the below figure and code snippet, which is equivalent to the Content-based Router. ASF parses the document message and, based on the defined rules, chooses the first path whose condition matches. Additionally, if none of the criteria are met, the Choice state offers a Default path.
Figure: Content-based Router

ASF code snippet:
 
  {
    "Comment":"Content-based Router",
    "StartAt":"ChoiceState",
    "States":{
       "ChoiceState":{
          "Type":"Choice",
          "Choices":[
             {
                "Variable":"$.variable",
                "BooleanEquals":true,
                "Next":"Choice 1"
             },
             {
                "Variable":"$.variable",
                "BooleanEquals":false,
                "Next":"Choice 2"
             }
          ]
       },
       "Choice 1":{
          "Type":"Task",
          "Resource":"arn:aws:states:::lambda:invoke",
          "Parameters":{
             "FunctionName":"arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME",
             "Payload":{
                "Input.$":"$"
             }
          },
          "End":true
       },
       "Choice 2":{
          "Type":"Task",
          "Resource":"arn:aws:states:::lambda:invoke",
          "Parameters":{
             "FunctionName":"arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME",
             "Payload":{
                "Input.$":"$"
             }
          },
          "End":true
       }
    }
 }
 

Zeebe

"Exclusive and Inclusive Gateway" simulates the Content-based Router operations by controlling the message flow between the various branches. The former gateway can deliver the message to exactly one branch while the latter can deliver the message to one or more branches based on the condition expression, as shown in the below figure.
Figure: Content-based Router - Exclusive Gateway

Figure: Content-based Router - Inclusive Gateway

Azure Durable Functions

With the below code snippet, a Content-based Router is realized in ADF by using conditionals to control the orchestration flow.

ADF code snippet:
    
const df = require("durable-functions");

module.exports = df.orchestrator(function* (context) {
    var result
    // Get input
    const data = context.df.getInput()

    // Route based on the message content
    if (data.isFunction1) {
        result = yield context.df.callActivity("function1", data)
    } else {
        result = yield context.df.callActivity("function2", data)
    }

    return result
});
    
  

7. Loop

Figure: Loop

Problem

In a serverless workflow, certain functions have to be executed multiple times to produce the desired outcome. How can the workflow orchestrate a function so that it is reused when it needs to be triggered repeatedly?


Decision

The Loop pattern is used to execute a function repeatedly until a condition is met.


Source

[Ibsen and Anstey 2010]


Pattern

Enterprise Integration Pattern


Type

Control Flow


Synonyms

Arbitrary Cycles, Iteration, Cycle


Mapping

AWS Step Functions

ASF does not natively support a Loop construct. However, it can be achieved by orchestrating multiple states. The below figure and code snippet represent the pattern assembled using "While" loop logic.
Figure: While Loop

ASF code snippet:
 
{
  "Comment":"Loop",
  "StartAt":"ChoiceState",
  "States":{
     "State 1":{
        "Type":"Task",
        "Resource":"arn:aws:states:::lambda:invoke",
        "Parameters":{
           "FunctionName":"arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME",
           "Payload":{
              "Input.$":"$"
           }
        },
        "Next":"ChoiceState"
     },
     "ChoiceState":{
        "Type":"Choice",
        "Choices":[
           {
              "Variable":"$.variable",
              "BooleanEquals":true,
              "Next":"CompletedState"
           }
        ],
        "Default":"State 1"
     },
     "CompletedState":{
        "Type":"Pass",
        "End":true
     }
  }
}
 

In contrast, the below figure and code snippet show how the same Loop construct can be realized using "Do-While" logic.
Figure: Do-While Loop

ASF code snippet:
   
{
  "Comment":"Loop",
  "StartAt":"State 1",
  "States":{
     "State 1":{
        "Type":"Task",
        "Resource":"arn:aws:states:::lambda:invoke",
        "Parameters":{
           "FunctionName":"arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME",
           "Payload":{
              "Input.$":"$"
           }
        },
        "Next":"ChoiceState"
     },
     "ChoiceState":{
        "Type":"Choice",
        "Choices":[
           {
              "Variable":"$.variable",
              "BooleanEquals":true,
              "Next":"PassState"
           }
        ],
        "Default":"State 1"
     },
     "PassState":{
        "Type":"Pass",
        "End":true
     }
  }
}
   
  

Zeebe

The Zeebe Modeler does not provide an out-of-the-box implementation of loops, even though BPMN 2.0 has a loop task element. However, the below figure shows how the Loop pattern can be implemented using a combination of "Service Task", "Exclusive Gateway", and sequence/message flow connectors.
Figure: Loop

Azure Durable Functions

In ADF, looping over functions can be implemented using entry/exit-controlled loops. The below code snippet shows how the Loop pattern is implemented using a while loop.

ADF code snippet:
    
const df = require("durable-functions");

module.exports = df.orchestrator(function* (context) {
    var result
    // Get input
    const data = context.df.getInput()

    // Loop until the condition is false
    while (data.loopCondition) {
        result = yield context.df.callActivity("function1", data)
        data = result // assumes function1 returns the updated loop state, including loopCondition
    }

    return result
});  
    
  

8. Delay

Figure: Delay

Problem

There are situations during a workflow execution when it needs to be paused or delayed to wait for a response/acknowledgment from an external system. How can the workflow incorporate a delay or wait?


Decision

The Delay pattern helps in waiting or delaying a function from executing. The delay/wait can be configured by setting a time/period.


Source

[Ibsen and Anstey 2010]


Pattern

Enterprise Integration Pattern


Type

Control Flow


Synonyms


Mapping

AWS Step Functions

The Delay construct maps to the ASF Wait state, as shown by the below figure and code snippet. ASF offers the option to delay/pause the execution for a number of seconds or until a given date-time value. The Delay pattern is ideal for ASF Standard workflows. Although ASF Express also offers a Wait state, users need to be aware that an Express workflow execution is capped at 5 minutes in total, making the Wait state an anti-pattern in Express workflows.
Figure: Delay

ASF code snippet:
 
{
  "Comment":"Wait",
  "StartAt":"State 1",
  "States":{
     "State 1":{
        "Type":"Task",
        "Resource":"arn:aws:states:::lambda:invoke",
        "Parameters":{
           "FunctionName":"arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME",
           "Payload":{
              "Input.$":"$"
           }
        },
        "Next":"WaitState"
     },
     "WaitState":{
        "Type":"Wait",
        "Seconds":10,
        "End":true
     }
  }
}
 

Zeebe

The below figure depicts how the BPMN 2.0 "Timer" event is used to achieve the Delay pattern.
Figure: Delay

Azure Durable Functions

ADF provides durable timers for orchestrator functions to implement delays or set up timeouts on async actions. The below code snippet depicts how the Delay pattern is used in ADF.

ADF code snippet:
    
const df = require("durable-functions");
const moment = require("moment");

module.exports = df.orchestrator(function* (context) {

    const function1Result = yield context.df.callActivity("function1", context.df.getInput())

    // Perform delay operation
    const delay = moment.utc(context.df.currentUtcDateTime).add(30, "s");
    yield context.df.createTimer(delay.toDate())

    const function2Result = yield context.df.callActivity("function2", function1Result)

    return function2Result
});
    
  

9. Gateway

Figure: Gateway

Problem

How can business logic be decoupled from operational/implementation logic so that the core business logic remains simple?


Decision

The ingestion and output logic are encapsulated in separate functions with the help of the Messaging Gateway pattern, which also helps separate messaging-specific implementation from the business logic code.


Source

[Hohpe and Woolf 2004]


Pattern

Enterprise Integration Pattern


Type

Function Specific


Synonyms


Mapping

AWS Step Functions

The Messaging Gateway is a function-specific design pattern. In ASF, this pattern is accomplished by encapsulating the input/output messaging-specific method calls in a separate package used by AWS Lambda, as shown in the below code snippet. This enables developers to focus on the business logic present in AWS Lambda without worrying about handling the input/output of data.

ASF code snippet:
 
// Import Gateway Logic
const lambdaGateway = require("/opt/utility/lambda_gateway.js");

exports.lambdaHandler = async (event, context, callback) => {
// Input Gateway logic
const data = lambdaGateway.inputGateway(event, context);

// Start : Business logic
// End : Business logic

// Output Gateway logic
lambdaGateway.outputGateway(JSON.stringify(data), callback);
};
 

Zeebe

The Gateway pattern is function-specific, and the pattern mapping is equivalent to the pattern defined for AWS Step Functions.

Azure Durable Functions

The Gateway pattern is function-specific, and the pattern mapping is equivalent to the pattern defined for AWS Step Functions.

10. Content Filter

Figure: Content Filter

Problem

How can the workflow simplify dealing with large messages and transmit only the essential data to the required functions?


Decision

The Content Filter pattern simplifies the structure of the messages by removing irrelevant data.


Source

[Hohpe and Woolf 2004]


Pattern

Enterprise Integration Pattern


Type

Function Specific


Synonyms


Mapping

AWS Step Functions

ASF offers InputPath and ResultSelector to filter, via JSON Paths, the input passed to a state and the result it produces, respectively. The below figure shows how the Content Filter pattern can be accomplished using InputPath or ResultSelector. Additionally, ASF provides OutputPath, which enables users to select a portion of the state output.
Figure: Content Filter
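
As a minimal sketch (the field names are hypothetical), the below state filters its input with InputPath and its result with ResultSelector:

ASF snippet:

{
   "State":{
      "Type":"Task",
      "Resource":"arn:aws:states:::lambda:invoke",
      "InputPath":"$.order",
      "Parameters":{
         "FunctionName":"arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME",
         "Payload":{
            "Input.$":"$"
         }
      },
      "ResultSelector":{
         "status.$":"$.Payload.status"
      },
      "Next":"NEXT_STATE"
   }
}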

Users also have the option of performing the entire filtering, or code-specific filtering, with custom-developed filters inside AWS Lambdas, as shown by the below code snippet.

ASF code snippet:
 
// Import utils and gateway logic
const utils = require("/opt/utility/utils.js");
const lambdaGateway = require("/opt/utility/lambda_gateway.js");

exports.lambdaHandler = async (event, context, callback) => {
// Input Gateway logic
const data = lambdaGateway.inputGateway(event, context);

// Start : Business logic
// Content Filter logic - Users can follow custom logic
const transformedData = utils.removeField(data, ["field_1", "field_2"]);
// End : Business logic

// Output Gateway logic
lambdaGateway.outputGateway(JSON.stringify(transformedData), callback);
};
 

Zeebe

The Content Filter pattern is function-specific, and the pattern mapping is equivalent to the ASF Content Filter pattern. The pattern can also be realized as shown in the below figure: the Input/Output Variable mappings modify the payload sent to the functions. Furthermore, with the help of Expressions and FEEL (Friendly Enough Expression Language), variables can be accessed and calculated dynamically.
Figure: Content Filter

Azure Durable Functions

The Content Filter pattern is function-specific, and the pattern mapping is equivalent to the code snippet presented for ASF. The pattern can also be realized in the Orchestration Function, as in the below code snippet, by filtering data before sending it as a payload to the subsequent function call.

ADF code snippet:
  
const df = require("durable-functions");
const utils = require("../utility/utils.js");

module.exports = df.orchestrator(function* (context) {
    var function1Result = yield context.df.callActivity("function1", context.df.getInput())
    // Start : Filter result
    function1Result = utils.removeField(function1Result,'parameter1')
    // End : Filter result
    const function2Result = yield context.df.callActivity("function2", function1Result)
    return function2Result
});
  
 

11. Content Enricher

Figure: Content Enricher

Problem

How can the workflow fetch additional data required by the functions to process the message?


Decision

The Content Enricher pattern accesses an external data source and augments the original message with the missing information.


Source

[Hohpe and Woolf 2004]


Pattern

Enterprise Integration Pattern


Type

Function Specific


Synonyms


Mapping

AWS Step Functions

ASF offers a feature called Parameters that lets users augment the message with missing information using static key-value pairs or values selected from the input with Paths.
Figure: Content Enricher
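
A minimal sketch (the static and copied fields are hypothetical) of enriching the payload with Parameters:

ASF snippet:

{
   "State":{
      "Type":"Task",
      "Resource":"arn:aws:states:::lambda:invoke",
      "Parameters":{
         "FunctionName":"arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME",
         "Payload":{
            "Input.$":"$",
            "static_field":"static value",
            "copied_field.$":"$.existing_field"
         }
      },
      "Next":"NEXT_STATE"
   }
}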

For complex Content Enricher scenarios, the below code snippet presents how the user can implement custom fetch-and-enrich logic with AWS S3.

ASF code snippet:
 
// Import required libraries
const utils = require("/opt/utility/utils.js");
const lambdaGateway = require("/opt/utility/lambda_gateway.js");
const s3Operation = require("/opt/utility/aws_s3_service.js");

exports.lambdaHandler = async (event, context, callback) => {
// Input Gateway logic
const data = lambdaGateway.inputGateway(event, context);

// Start : Business logic

// Content Enricher logic - Users can follow custom logic
// Fetch data from data source
const new_data = await s3Operation.getPayload("key", "bucketName");
// Append data
const transformedData = utils.addNewField(data, {
    new_field: new_data,
});
// End : Business logic

// Output Gateway logic
lambdaGateway.outputGateway(JSON.stringify(transformedData), callback);
};
 

Zeebe

The Content Enricher pattern is function-specific, and the pattern mapping is equivalent to the ASF Content Enricher pattern. With the help of Expressions and FEEL (Friendly Enough Expression Language), as shown in the below figure, the Input/Output Variables can be altered and augmented with the necessary information.
Figure: Content Enricher

Azure Durable Functions

The Content Enricher pattern is function-specific, and the pattern mapping is equivalent to the ASF Content Enricher pattern. The pattern can also be performed in the orchestration function, following the below template: one function's result enriches another function's result, which is then used as a payload for a subsequent Activity Function.

ADF code snippet:
  
const df = require("durable-functions");
const utils = require("../utility/utils.js");

module.exports = df.orchestrator(function* (context) {
    var function1Result = yield context.df.callActivity("function1", context.df.getInput())
    var function2Result = yield context.df.callActivity("function2", context.df.getInput())
    // Start : Enrich result
    function1Result = utils.addNewField(function1Result, function2Result)
    // End : Enrich result
    const function3Result = yield context.df.callActivity("function3", function1Result)
    return function3Result
});
  
 

12. Claim Check

Figure: Claim Check

Problem

Functions that pass large payloads of data within the workflow can be terminated due to size limitations. How will the communication between functions be handled when large messages need to be passed within the workflow?


Decision

Large fields are temporarily filtered in the source function and enriched in the destination function using the Claim Check pattern. The payload is stored in a persistent store, and a Claim Check is passed to the target component. Internally, Claim Check uses the Content Filter and Content Enricher pattern. The Content Filter pattern removes insignificant data from an output message leaving only essential information, thus simplifying its structure. The target function then uses the Content Enricher pattern to augment the received message with the missing information, usually with the help of an external data source.


Source

[Hohpe and Woolf 2004]


Pattern

Enterprise Integration Pattern


Type

Function Specific


Synonyms


Mapping

AWS Step Functions

ASF limits the payload size that can be passed to the workflow and between states to 256 KB, and any execution that passes a larger payload is terminated. This payload restriction can be overcome by employing the Claim Check pattern, as shown in the below figure. In ASF, the pattern is implemented inside AWS Lambdas: the Content Filter first stores the large payload in Amazon Simple Storage Service (Amazon S3) persistent storage and replaces it with a reference consisting of the bucket name and key value. The filtered message is then sent to the following states, and any state that needs the data uses the Content Enricher pattern to recover it.
Figure: Claim Check
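
The below sketch illustrates the two halves of the pattern in AWS Lambda, assuming hypothetical bucket, key, and field names:

AWS Lambda code snippet:

const AWS = require("aws-sdk");
const s3 = new AWS.S3();

// Check-in (Content Filter): store the large payload in S3 and
// replace it with a claim check reference
exports.checkIn = async (event) => {
    const key = `payloads/${Date.now()}.json`;
    await s3.putObject({
        Bucket: "BUCKET_NAME",
        Key: key,
        Body: JSON.stringify(event.largeField),
    }).promise();
    const { largeField, ...rest } = event;
    return { ...rest, claimCheck: { bucket: "BUCKET_NAME", key } };
};

// Check-out (Content Enricher): use the claim check to recover the payload
exports.checkOut = async (event) => {
    const object = await s3.getObject({
        Bucket: event.claimCheck.bucket,
        Key: event.claimCheck.key,
    }).promise();
    return { ...event, largeField: JSON.parse(object.Body.toString()) };
};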

Zeebe

The Claim Check pattern internally uses the Content Filter and Content Enricher patterns. Hence, the mapping is similar to the AWS Step Functions Claim Check pattern.

Azure Durable Functions

The Claim Check pattern internally uses the Content Filter and Content Enricher patterns. Hence, the mapping is similar to the AWS Step Functions Claim Check pattern.

13. Normalizer

Figure: Normalizer

Problem

How can the output from each terminal function in the workflow branches be normalized, which otherwise would require having an additional normalization function?


Decision

The Normalizer pattern helps solve this problem by ensuring that the messages produced by any branch conform to a standard format that is understandable by the recipient component. In this pattern, each message is passed through a custom message translator so that the resulting messages match a standard format. Hence, this pattern avoids creating and invoking additional functions to handle this scenario.


Source

[Hohpe and Woolf 2004]


Pattern

Enterprise Integration Pattern


Type

Function Specific


Synonyms


Mapping

AWS Step Functions

In ASF, the Normalizer pattern is achieved using the ResultSelector, ResultPath, or OutputPath output-processing constructs. These constructs can also be combined to achieve complex normalization of the message. The transformation can also be performed inside AWS Lambdas using the Content Filter or Content Enricher patterns.
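
A minimal sketch (the state and field names are hypothetical) that normalizes a task result into a standard format using ResultSelector and ResultPath:

ASF snippet:

{
   "State":{
      "Type":"Task",
      "Resource":"arn:aws:states:::lambda:invoke",
      "Parameters":{
         "FunctionName":"arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME"
      },
      "ResultSelector":{
         "status.$":"$.Payload.status",
         "data.$":"$.Payload.body"
      },
      "ResultPath":"$.normalized",
      "Next":"NEXT_STATE"
   }
}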

Zeebe

The Normalizer pattern is function-specific, and the pattern can be achieved using the Content Filter or Content Enricher patterns.

Azure Durable Functions

The Normalizer pattern is function-specific, and the pattern can be achieved using the Content Filter or Content Enricher patterns.

14. Message History

Figure: Message History

Problem

How can we effectively analyze and debug the flow of messages in a loosely coupled and granular system?


Decision

The primary purpose of employing a serverless paradigm is to build loosely coupled and granular systems. However, such systems add complexity to debugging and traceability, as the flow of a message is not intuitively comprehensible. This problem can be solved with the Message History pattern, in which the system maintains the history of each message. Thus, when a message fails to be processed, the developer can trace back the steps and provide quick feedback and a solution.


Source

[Hohpe and Woolf 2004]


Pattern

Enterprise Integration Pattern


Type

Construct/Function Specific


Synonyms


Mapping

AWS Step Functions

The Message History pattern is available only for ASF Standard executions, as their states are persisted to disk; Express executions lack this pattern due to their in-memory processing strategy. Furthermore, ASF Standard executions provide developers with a visual representation of the path that a message has traversed, along with information such as input/output payload, response, execution time, and execution status.

Zeebe

The message history of the Zeebe workflow is maintained using variables as shown in the below figure.
Figure: Message History

Azure Durable Functions

The Message History of an Azure Durable Orchestration function is maintained using the execution History Table, as shown in the below figure. When yield is invoked, the Activity Function result is stored in the History Table. In an ADF orchestration execution, Activity Functions follow an at-least-once policy, making the History Table crucial for checking whether a function has already been executed. The History Table provides the input and result for each function.
Figure: Message History

15. Splitter and Aggregator

Figure: Splitter and Aggregator

Problem

How can the serverless workflow process multiple homogeneous records concurrently that are part of a single payload?


Decision

A Splitter pattern helps split a single message into a sequence of sub-messages that can be processed individually. The Aggregator pattern performs the contrary by collecting a complete set of related messages. Combining the two patterns resembles a MapReduce implementation, which can be used to split an array payload into smaller chunks that can be processed in parallel and, more importantly, to avoid payload limit issues.


Source

[Hohpe and Woolf 2004]


Pattern

Enterprise Integration Pattern


Type

Function Specific


Synonyms

Fan-out, Fan-in


Mapping

AWS Step Functions

ASF supports the Splitter and Aggregator pattern with the help of the dynamic-parallelism Map state, as shown by the below figure and code snippet. The Map state executes the same steps for each entry of an array in the state input. The key fields for the Map state are:
  • Iterator: The object that defines a state machine that will process each element of the array
  • ItemsPath: Path to the array in the input
  • MaxConcurrency: Upper bound on how many invocations of the Iterator may run in parallel

Figure: Splitter and Aggregator


ASF snippet:

{
  "Comment":"Callback",
  "StartAt":"MapState",
  "States":{
     "MapState":{
        "Type":"Map",
        "ItemsPath":"$.array",
        "MaxConcurrency":0,
        "Iterator":{
           "StartAt":"State 1",
           "States":{
              "State 1":{
                 "Type":"Pass",
                 "Result":"Done!",
                 "End":true
              }
           }
        },
        "ResultPath":"$.output",
        "End":true
     }
  }
}


Zeebe

The below figure presents how the Splitter and Aggregator pattern is implemented using BPMN 2.0. Here "Function 2" is configured as a parallel multi-instance task, which takes an input collection and processes its elements with dynamic parallelism.
Figure: Splitter and Aggregator
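
The corresponding multi-instance configuration could look like the below sketch, assuming Zeebe's loop characteristics extension with FEEL expressions and hypothetical variable names:

Zeebe BPMN snippet:

<bpmn:serviceTask id="ServiceTask_Function2" name="Function 2">
  <bpmn:extensionElements>
    <zeebe:taskDefinition type="lambda" />
  </bpmn:extensionElements>
  <bpmn:multiInstanceLoopCharacteristics>
    <bpmn:extensionElements>
      <!-- Splitter: one instance per element of "items";
           Aggregator: instance results collected into "results" -->
      <zeebe:loopCharacteristics inputCollection="= items" inputElement="item"
                                 outputCollection="results" outputElement="= result" />
    </bpmn:extensionElements>
  </bpmn:multiInstanceLoopCharacteristics>
</bpmn:serviceTask>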

Azure Durable Functions

The presented code snippet shows how the Splitter and Aggregator pattern is implemented using ADF, where it is also referred to as the Fan-out/Fan-in pattern. Similar to Multicast, the data is processed using a parallel construct. In ADF, the pattern is implemented by first splitting the data into batches; each batch is then processed by the same function in parallel, and the results of all branches are aggregated by another Activity Function.

ADF code snippet:
  
const df = require("durable-functions");

module.exports = df.orchestrator(function* (context) {
    const mapTasks = [];

    // Get a list of batches to process in parallel
    const batch = yield context.df.callActivity("function1");

    // Perform parallel processing of the batches (Map)
    for (let i = 0; i < batch.length; i++) {
        mapTasks.push(context.df.callActivity("function2", batch[i]));
    }
    const arrayParallelTasksResult = yield context.df.Task.all(mapTasks);

    // Aggregate the results (Reduce)
    yield context.df.callActivity("function3", arrayParallelTasksResult);
});
  

16. Implicit Termination

Figure: Implicit Termination

Problem

How to terminate the workflow when no execution steps are remaining?


Decision

The Implicit Termination pattern states that if there is no task to be performed, stop the workflow.


Source

[Russell et al. 2006a], [van der Aalst et al. 2003]


Pattern

Workflow Control-Flow Pattern


Type

Control Flow


Synonyms


Mapping

AWS Step Functions

ASF provides three options to terminate an execution. If the user wants to stop the workflow, the below terminal states/fields can be used:
  • Succeed: Terminate the execution with a Succeeded status
  • End: Mark a state as terminal, stopping the workflow via the normal flow
  • Fail: Terminate the execution with a Failed status

The below figure and code snippet present an example of how the Implicit Termination pattern is achieved in ASF.
Figure: Implicit Termination


ASF snippet:

{
  "Comment":"Implicit Termination",
  "StartAt":"State 1",
  "States":{
     "State 1":{
        "Type":"Task",
        "Resource":"arn:aws:states:::lambda:invoke",
        "Parameters":{
           "FunctionName":"arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME",
           "Payload":{
              "Input.$":"$"
           }
        },
        "Next":"ChoiceState"
     },
     "ChoiceState":{
        "Type":"Choice",
        "Choices":[
           {
              "Variable":"$.variable",
              "BooleanEquals":true,
              "Next":"SuccessState"
           },
           {
              "Variable":"$.variable",
              "BooleanEquals":false,
              "Next":"OtherState"
           }
        ]
     },
     "SuccessState":{
        "Type":"Succeed"
     },
     "OtherState":{
        "Type":"Task",
        "Resource":"arn:aws:states:::lambda:invoke",
        "Parameters":{
           "FunctionName":"arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME",
           "Payload":{
              "Input.$":"$"
           }
        },
        "End":true
     }
  }
}


Zeebe

The Implicit Termination pattern in BPMN 2.0 is implemented using the End Event. The below figure illustrates how the End Event stops the execution of the workflow. Here, both branches terminate: the first branch ends after one function's execution, while the other ends after two.
Figure: Implicit Termination

Azure Durable Functions

The Implicit Termination pattern in ADF occurs when the Orchestration Function reaches its last statement or a "return" statement. The below code snippet depicts how the orchestration function terminates when the return statement is reached.

ADF code snippet:
  
const df = require("durable-functions");

module.exports = df.orchestrator(function* (context) {
    const function1Result = yield context.df.callActivity("function1", context.df.getInput())
    return function1Result
});
  

17. Nested Workflows

Figure: Nested Workflows

Problem

If some tasks are alike, how do we abstract and represent them as a hierarchical and reusable model?


Decision

The Nested Workflows pattern facilitates reusable workflows, abstraction of complex logic, effective communication, and hierarchical, modular modeling.


Source

[Russell et al. 2006a], [van der Aalst et al. 2003]


Pattern

Workflow Control-Flow Pattern


Type

Control Flow


Synonyms


Mapping

AWS Step Functions

ASF can execute another ASF workflow by using a state of the Task type with the target workflow's ARN, as shown by the below figure and code snippet. ASF additionally allows the user to pass a payload when executing the nested workflow.
Figure: Nested Workflow


ASF snippet:

{
  "Comment":"Nested Workflow",
  "StartAt":"Start state machine execution",
  "States":{
     "Start state machine execution":{
        "Type":"Task",
        "Resource":"arn:aws:states:::states:startExecution",
        "Parameters":{
           "StateMachineArn":"arn:aws:states:REGION:ACCOUNT_ID:stateMachine:STATE_MACHINE_NAME",
           "Input":{
              "StatePayload":"Hello from Step Functions!",
              "AWS_STEP_FUNCTIONS_STARTED_BY_EXECUTION_ID.$":"$$.Execution.Id"
           }
        },
        "End":true
     }
  }
}


Zeebe

Using BPMN 2.0 with the Zeebe Modeler, a Nested Workflow is constructed using the SubProcess element. The below figure presents how another workflow, consisting of "Function 2", can be invoked from the parent workflow.
Figure: Nested Workflow

Azure Durable Functions

In ADF, a nested workflow pattern is constructed when another durable orchestration function is invoked from the parent orchestration function. The below code snippet shows how the sub orchestration function can be triggered using the callSubOrchestrator function call.

ADF code snippet:
  
const df = require("durable-functions");

module.exports = df.orchestrator(function* (context) {
    const result = yield context.df.callSubOrchestrator("subOrchestration", context.df.getInput())
    return result
});
  

18. Callback

Figure: Callback

Problem

How can the serverless workflow handle external invocations from a service or a human-performed activity?


Decision

In the Callback pattern, the workflow pauses execution and waits until an appropriate response is received to proceed. The response can come from a human, a service, or an external process.


Source

[Russell et al. 2006a], [van der Aalst et al. 2003]


Pattern

Workflow Control-Flow Pattern


Type

Control Flow / Function Specific


Synonyms


Mapping

AWS Step Functions

ASF provides a means to suspend a workflow until a task token is returned. A task might need to wait for human approval, integrate with a third party, or call a legacy system, pausing ASF indefinitely while an external process or workflow completes. For these situations, the Callback task, shown in the below figure and code snippet, passes a task token to the targeted integrated service and waits until the task token is returned.
Figure: Callback


ASF snippet:

{
  "Comment":"Callback",
  "StartAt":"Step 1",
  "States":{
     "Step 1":{
        "Type":"Task",
        "Resource":"arn:aws:states:::lambda:invoke.waitForTaskToken",
        "Parameters":{
           "FunctionName":"arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME",
           "Payload":{
              "Input.$":"$",
              "TaskToken.$":"$$.Task.Token"
           }
        },
        "End":true
     }
  }
}


Zeebe

The below figures present two ways the Callback pattern can be realized. In the former, the callback is invoked using variables, expression conditions, and Service Tasks. In the latter, a user performs the operation and manually completes the workflow's task for it to progress to the end.
Figure: Callback

Figure: User Callback

Azure Durable Functions

The callback pattern can be implemented in ADF using waitForExternalEvent, allowing an orchestrator to wait and listen for an external event asynchronously. The below code snippet presents how the pattern is implemented using ADF.

ADF code snippet:
  
const df = require("durable-functions");

module.exports = df.orchestrator(function* (context) {
    const token = yield context.df.waitForExternalEvent("externalFunction");
    if (token) {
        // token received from external and continue processing
    } else {
        // token failed
    }
});
  

19. Error Handling

Figure: Error Handling

Problem

How can the system handle error exceptions that might occur in the workflow and manage them gracefully?


Decision

The Error Handling pattern helps handle exceptions due to abnormal input or conditions and can retry the processing when needed.


Source

[Russell et al. 2006a], [van der Aalst et al. 2003]


Pattern

Workflow Control-Flow Pattern


Type

Control Flow / Function Specific


Synonyms


Mapping

AWS Step Functions

The Error Handling pattern is crucial for any system, and ASF provides error-handling features for various errors such as state machine definition issues, task failures, and transient issues. In addition, ASF offers the Catch and Retry fields, which help reprocess a state in case of failure. The below figure and code snippet show how ASF handles exceptions with a retry option.
Figure: Error Handling


ASF snippet:

{
  "Comment":"Nested Workflow",
  "StartAt":"Step 1",
  "States":{
     "Step 1":{
        "Type":"Task",
        "Resource":"arn:aws:states:::lambda:invoke",
        "Parameters":{
           "FunctionName":"arn:aws:lambda:REGION:ACCOUNT_ID:function:FUNCTION_NAME",
           "Payload":{
              "Input.$":"$"
           }
        },
        "Catch":[
           {
              "ErrorEquals":[
                 "States.TaskFailed"
              ],
              "Next":"NotifyError"
           }
        ],
        "Retry":[
           {
              "ErrorEquals":[
                 "States.Timeout"
              ],
              "IntervalSeconds":3,
              "MaxAttempts":2,
              "BackoffRate":1.5
           }
        ],
        "End":true
     },
     "NotifyError":{
        "Type":"Fail",
        "Cause":"Invalid response.",
        "Error":"ErrorA"
     }
  }
}


Zeebe

The below figure presents how error handling can be implemented using Zeebe. Here an Error boundary event is placed on the Service Task; if an error occurs in the function, the error-handling function is triggered. Furthermore, Service Tasks have a retries property that reinvokes the task in case of failure.
Figure: Error Handling

Azure Durable Functions

Error handling in ADF is implemented using the programming language's built-in error-handling features (try-catch), as shown in the code snippet. Exceptions thrown in an Activity Function are directed back to the orchestrator function and thrown as a FunctionFailedException.

ADF code snippet:
  
const df = require("durable-functions");

module.exports = df.orchestrator(function* (context) {
    try {
        const function1 = yield context.df.callActivity("function1", context.df.getInput())
        return function1;
    }
    catch (error) {
        console.error(error)
    }
});
  

20. Workflow Data

Figure: Workflow Data

Problem

How can common data, utilities, and libraries required by multiple functions in the workflow be shared without duplicating them in every function?

Decision

The Workflow Data pattern states that the data required by the whole workflow is available to all functions. In this pattern, shared libraries and packages are placed in an appropriate directory or distributed via vendor-specific offerings.


Source

[Russell et al. 2005]


Pattern

Workflow Data Pattern


Type

Function Specific


Synonyms


Mapping

AWS Step Functions

AWS Lambda Layers help keep the deployment package granular and make development more manageable, which can help avoid errors when installing package dependencies. All the utility functions, such as accessing AWS Secrets Manager, performing AWS S3 operations, and accessing external services, can be deployed in a layer; attaching this layer to the Lambdas provides all of those functionalities.
Figure: Workflow Data

Zeebe

The Workflow Data pattern is function-specific, and the pattern mapping is equivalent to AWS Step Functions.

Azure Durable Functions

The Workflow Data pattern is function-specific, and sharing utilities, libraries, and helper code can be done by placing all these compiled files in a folder at the root level of the functions.

Figure: Workflow Data
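
As a sketch (the folder layout and helper names are hypothetical), functions can then require the shared code relative to the function app root:

ADF code snippet:

const df = require("durable-functions");
// Shared helpers placed in a "utility" folder at the function app root
const utils = require("../utility/utils.js");

module.exports = df.orchestrator(function* (context) {
    // Every function in the workflow can require the same shared helpers
    const data = utils.normalize(context.df.getInput());
    return yield context.df.callActivity("function1", data);
});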

References

  1. Hohpe, G. and Woolf, B., 2004. Enterprise integration patterns: Designing, building, and deploying messaging solutions. Addison-Wesley Professional.
  2. Ibsen, C. and Anstey, J., 2010. Camel in Action. Manning Publications.
  3. Russell, N., Ter Hofstede, A.H., Van Der Aalst, W.M. and Mulyar, N., 2006. Workflow control-flow patterns: A revised view. BPM Center Report BPM-06-22, BPMcenter.org.
  4. van Der Aalst, W.M., Ter Hofstede, A.H., Kiepuszewski, B. and Barros, A.P., 2003. Workflow patterns. Distributed and parallel databases, 14(1), pp.5-51.
  5. Russell, N., Ter Hofstede, A.H., Edmond, D. and Van der Aalst, W.M., 2005, October. Workflow data patterns: Identification, representation and tool support. In International Conference on Conceptual Modeling (pp. 353-368). Springer, Berlin, Heidelberg.

Contributors

The constructed artifacts available on this website are part of a study performed by the University of Groningen & Researchable B.V.