Hook

A hook pattern enables an event-driven integration model in which data is pushed into a system. This can be a webhook accepting HTTP/S requests or a messaging bus accepting messages. In some cases, this can also be implemented as a file listener depending on the operating system.

The primary advantages of this pattern are that it further decouples the origination of the source data from the integration platform and that it helps trigger real-time or near-real-time downstream processes.

While some source systems support sending HTTP/S requests for near-real-time integration, some implementations require additional complex processing steps or a callback for acknowledgement; in such cases, a polling approach may be sufficient and simpler.
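For comparison, the sketch below shows what a simple polling approach could look like. It is a generic illustration, not part of DataZen; the source URL, query parameter, and interval are placeholder assumptions.

import time
import requests

SOURCE_URL = "https://source.example.com/api/orders"  # placeholder source endpoint
POLL_INTERVAL_SECONDS = 60

last_seen_id = None

while True:
    # Pull the latest records from the source system on a fixed schedule.
    response = requests.get(SOURCE_URL, params={"since_id": last_seen_id}, timeout=30)
    response.raise_for_status()

    for record in response.json():
        # Hand each record to the integration platform (details omitted).
        last_seen_id = record["id"]

    time.sleep(POLL_INTERVAL_SECONDS)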

Pattern Overview

This pattern describes an event-driven data integration system in which data is pushed into the integration platform. As a result, the platform continuously waits for incoming data and could receive a payload at any time. Depending on the source system sending the data, an acknowledgement that the message was received may be required.
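As a conceptual illustration of the receiving side, the sketch below uses Python's standard http.server module to wait for pushed payloads and return an acknowledgement. It is not DataZen code; the port and response format are assumptions.

import json
from http.server import BaseHTTPRequestHandler, HTTPServer

class HookHandler(BaseHTTPRequestHandler):
    def do_POST(self):
        # Read the pushed payload from the request body.
        length = int(self.headers.get("Content-Length", 0))
        payload = self.rfile.read(length)

        # Hand the payload to downstream processing here (omitted).

        # Return an acknowledgement so the source system knows the push succeeded.
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.end_headers()
        self.wfile.write(json.dumps({"received": True}).encode("utf-8"))

if __name__ == "__main__":
    # Listen indefinitely for incoming pushes on port 8080 (placeholder).
    HTTPServer(("0.0.0.0", 8080), HookHandler).serve_forever()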

DataZen Implementation

DataZen supports two kinds of hooks: webhooks and message consumers.

Hooks are only available with Self-Hosted DataZen agents.

Hook                | Implementation                    | Comments
Webhooks            | Uses a Job Writer (or Direct Job) | The URI needs to specify the Job GUID or Job Key
Messaging Consumers | Uses a Messaging Consumer Job     | Messaging Consumers can process XML and JSON payloads directly

Webhooks

A webhook implementation in DataZen requires the source system to issue a POST operation with the payload representing the data. The HTTP/S request sent by the source system should generally comply with the following example:

POST https://YourDataZenURL/job/push?jobguid=123456789
Authorization: Token ....

{
    "data": "a sample JSON document to send; could be xml, CSV, raw data"
}

Note that when this operation is sent to the Job Reader of a Direct Job (using the Job Reader GUID or Job Key), both the Source and Target Data Pipelines are also executed, if any; however, the CDC settings are not observed.

If the Job GUID or Job Key represents a Job Writer (or the Job Writer of a Direct Job), the Source Data Pipeline is bypassed. However, the Target Data Pipeline will be executed if one has been defined.
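A source system can issue this POST with any HTTP client. For illustration, the sketch below uses Python's requests library; the URL, job GUID, token, and payload are the placeholders from the example request above.

import requests

# Placeholder values taken from the example request above.
DATAZEN_URL = "https://YourDataZenURL/job/push"
JOB_GUID = "123456789"
API_TOKEN = "...."

payload = {"data": "a sample JSON document to send; could be xml, CSV, raw data"}

# Push the payload to the DataZen webhook endpoint and check the response.
response = requests.post(
    DATAZEN_URL,
    params={"jobguid": JOB_GUID},
    headers={"Authorization": f"Token {API_TOKEN}"},
    json=payload,
    timeout=30,
)
response.raise_for_status()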

Although native CDC is not available for hooks, you can apply a custom CDC component as part of the Source or Target Data Pipeline to implement change capture.
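The sketch below illustrates the general idea of such change capture by comparing incoming records against previously seen versions. It is a conceptual example only and does not reflect DataZen's pipeline component API; the key field name is an assumption.

import hashlib
import json

def detect_changes(incoming_records, previous_hashes, key_field="id"):
    """Return changed records and an updated hash map (conceptual change capture)."""
    changed = []
    current_hashes = dict(previous_hashes)

    for record in incoming_records:
        key = record[key_field]
        # Hash the full record so any field change is detected.
        record_hash = hashlib.sha256(
            json.dumps(record, sort_keys=True).encode("utf-8")
        ).hexdigest()

        if previous_hashes.get(key) != record_hash:
            changed.append(record)
            current_hashes[key] = record_hash

    return changed, current_hashes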

Messaging Consumers

Messaging consumers are created as a specific type of job and can optionally apply JSON or XML transformations. Unlike webhooks, messaging consumers implement specific protocols that carry additional metadata; this metadata is also available and can be forwarded to other messaging consumers or sent to any other system.
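As a generic illustration of this kind of consumer (not DataZen's implementation), the sketch below uses the pika library to read messages from a RabbitMQ queue and capture the protocol metadata that accompanies each payload. The host and queue names are assumptions.

import pika

# Placeholder connection details.
connection = pika.BlockingConnection(pika.ConnectionParameters(host="localhost"))
channel = connection.channel()
channel.queue_declare(queue="incoming-data", durable=True)

def on_message(ch, method, properties, body):
    # The payload itself (JSON, XML, or raw data).
    payload = body.decode("utf-8")

    # Protocol-level metadata carried alongside the payload.
    metadata = {
        "content_type": properties.content_type,
        "headers": properties.headers,
        "timestamp": properties.timestamp,
    }

    # Process or forward the payload and metadata here (omitted),
    # then acknowledge the message.
    ch.basic_ack(delivery_tag=method.delivery_tag)

channel.basic_consume(queue="incoming-data", on_message_callback=on_message)
channel.start_consuming()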