A workflow describes the flow and transformation of incoming data into outgoing data. It all begins with an inbound connector and an inbound data processor.
Between the inbound data processors and the outbound data processors there is a data routing element.
After the routing element there can be 0 to N outbound data processors and it all ends with outbound connectors.
This need not be the end of the chain, though: workflows can be connected to other workflows.
A connector is the actual data transport interface from/to an external system. A connector is also used to transport data between workflows on the same platform.
A processor is a data processing element in which the main logic of the workflow is executed.
An Inbound Connector creates a message object called the “payload” for the workflow. This payload goes through the workflow and each workflow step can manipulate the payload.
No matter what data an Inbound Connector creates, the data is converted internally to JSON.
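To make the idea concrete, here is a minimal sketch of a payload and a workflow step. The payload structure and the step function are purely illustrative, not Lomnido's actual API; the point is that every step receives the JSON payload, may change it, and passes it on.

```python
import json

# Hypothetical payload as an Inbound Connector might create it: whatever the
# source format was, it is represented internally as JSON.
payload = json.loads('{"ticket": {"id": 42, "status": "new"}}')

# A hypothetical workflow step: takes the payload, manipulates it, returns it.
def uppercase_status(payload: dict) -> dict:
    payload["ticket"]["status"] = payload["ticket"]["status"].upper()
    return payload

payload = uppercase_status(payload)
```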
In the screenshot you can see an Inbound Processing element. The orange and blue circles are Workflow Steps. All these steps are executed sequentially and can manipulate the payload.
The routing between an Inbound and an Outbound Processor can be done with a routing element. These settings are available:
- Disabled – no message takes this route
- Always – no matter what happens during Inbound Processing – always take this route
- Always on OK – take this route if the Inbound Processing does not fail
- Always on Error – take this route if the Inbound Processing fails
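The four settings above boil down to one decision per route: given whether Inbound Processing succeeded, does a message take this route? A minimal sketch (the function name and string values are illustrative, not the platform's API):

```python
# Decide whether a message takes a route, based on the routing setting and
# the outcome of Inbound Processing.
def takes_route(setting: str, inbound_ok: bool) -> bool:
    if setting == "Disabled":
        return False                # no message ever takes this route
    if setting == "Always":
        return True                 # taken regardless of the outcome
    if setting == "Always on OK":
        return inbound_ok           # taken only if processing succeeded
    if setting == "Always on Error":
        return not inbound_ok       # taken only if processing failed
    raise ValueError(f"unknown routing setting: {setting}")
```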
As you can see in the screenshot of the Routing element, we can split messages. By defining a "Split Count", the Inbound Message is split into multiple Outbound Messages.
Imagine the Inbound Message is an array of 105 records. Defining a Split Count of 1 would generate 105 Outbound Messages, each with a single record.
Defining a Split Count of 20 would generate six Outbound Messages: five holding 20 records each and the last one holding the remaining 5 records.
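The splitting behaviour described above is plain chunking. A short sketch (function name is illustrative) that reproduces the 105-record example:

```python
# Split an inbound array of records into outbound chunks of at most
# split_count records each; the last chunk holds whatever remains.
def split_message(records: list, split_count: int) -> list:
    return [records[i:i + split_count]
            for i in range(0, len(records), split_count)]

chunks = split_message(list(range(105)), 20)
# six chunks: five with 20 records, the last with the remaining 5
```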
If both the Inbound and the Outbound Processor define a Dataformat, the data is validated against the Dataformat and you can use a drag-and-drop mapper.
If a Dataformat is defined in the Inbound Processor, the incoming data is validated against this Dataformat. A Dataformat can define a complex structure such as JSON or XML, or a flat structure as used in SQL or CSV.
Each attribute in the Dataformat can have different settings, such as mandatory, empty allowed, datatype, and so on. These settings are used to check the incoming data.
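To illustrate what such per-attribute checks amount to, here is a hypothetical sketch. In the real platform these settings are configured in the Dataformat, not written as code; the attribute names and rule keys below are made up for the example.

```python
# Hypothetical Dataformat: per-attribute settings for mandatory,
# empty-allowed, and datatype.
dataformat = {
    "id":     {"mandatory": True,  "empty_allowed": False, "datatype": int},
    "status": {"mandatory": True,  "empty_allowed": False, "datatype": str},
    "note":   {"mandatory": False, "empty_allowed": True,  "datatype": str},
}

# Check one incoming record against the Dataformat; return a list of errors.
def validate(record: dict, fmt: dict) -> list:
    errors = []
    for name, rules in fmt.items():
        if name not in record:
            if rules["mandatory"]:
                errors.append(f"{name}: missing mandatory attribute")
            continue
        value = record[name]
        if not isinstance(value, rules["datatype"]):
            errors.append(f"{name}: expected {rules['datatype'].__name__}")
        elif value == "" and not rules["empty_allowed"]:
            errors.append(f"{name}: empty value not allowed")
    return errors
```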
Dataformat Date/Time Normalization
The real challenges when transforming data from one system into another are always timestamps and time zones. Lomnido helps you normalize these. You can define the expected date and time format and the time zone, and Lomnido normalizes your data. Internally, Lomnido works with UTC: incoming timestamps are transformed to UTC, and outgoing timestamps are transformed to the format and time zone that the connected system requires.
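The normalization idea can be sketched with Python's standard library: parse the incoming timestamp using the format and time zone the source system declares, keep it in UTC internally, and render it in whatever format and zone the target system expects. The function names and the sample formats are assumptions for illustration only.

```python
from datetime import datetime
from zoneinfo import ZoneInfo

UTC = ZoneInfo("UTC")

# Inbound: parse a raw timestamp with its declared format/zone, store as UTC.
def to_utc(raw: str, fmt: str, tz: str) -> datetime:
    return datetime.strptime(raw, fmt).replace(tzinfo=ZoneInfo(tz)).astimezone(UTC)

# Outbound: render the internal UTC timestamp in the target system's
# format and time zone.
def from_utc(ts: datetime, fmt: str, tz: str) -> str:
    return ts.astimezone(ZoneInfo(tz)).strftime(fmt)

# 14:30 in Vienna (CET, UTC+1 on this date) becomes 13:30 UTC internally.
utc_ts = to_utc("2024-03-01 14:30", "%Y-%m-%d %H:%M", "Europe/Vienna")
```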
Of course, a workflow can also save data for later use. For this, Lomnido provides Datatables. Technically speaking, a Datatable is a MongoDB collection. Each workflow step can save data to a Datatable or load data from one.
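As a runnable stand-in for the MongoDB collection backing a Datatable, here is a tiny in-memory version with the two operations a workflow step uses: save a document and load documents back by query. The class and method names are illustrative, not Lomnido's interface.

```python
# In-memory stand-in for a Datatable (really a MongoDB collection):
# documents go in via save(), and come back out via a simple equality query.
class Datatable:
    def __init__(self):
        self._docs = []

    def save(self, doc: dict) -> None:
        self._docs.append(dict(doc))

    def load(self, **query) -> list:
        return [d for d in self._docs
                if all(d.get(k) == v for k, v in query.items())]

# One workflow step saves a lookup value; a later step loads it again.
lookup = Datatable()
lookup.save({"key": "priority", "value": "high"})
```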