Monday, November 21, 2022

Kafka Integration

 

Kafka Integration with ServiceNow

Prerequisites:

·       MID Server up and running

·       Credentials to connect to Kafka

·       Topic details to pull the data from

 

1)    Install IntegrationHub Enterprise Plugin

Installs the IntegrationHub enterprise pack to automate human resources, customer management, enterprise resource planning, and more. It includes the IntegrationHub professional pack, the Microsoft SCCM spoke, and Data Stream actions.

Plugin Name: ServiceNow IntegrationHub Enterprise Pack Installer [com.glide.hub.integrations.enterprise]

For the Kafka integration to work, we need the following two plugins as well.

·       IntegrationHub Runtime (com.glide.hub.integration.runtime)

·       ServiceNow IntegrationHub Action Step - RTE (com.glide.hub.action_step.rte)

 

2)    Install Confluent Kafka REST Proxy spoke

The Confluent Kafka REST Proxy spoke provides actions to automate tasks when events occur in your ServiceNow instance (a sketch of the underlying REST Proxy calls follows the list below).

Available actions include:

·       Assign Consumer Instance to Topic Partition

·       Commit Consumer Offsets

·       Create a Consumer Instance within a Consumer Group

·       Delete Consumer Instance

·       Get Kafka Partition Details for A Topic

·       Get Messages

·       Publish Message
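Under the hood, these actions map onto the Confluent REST Proxy v2 API. The following Python sketch shows the raw consumer lifecycle the spoke automates (create an instance, subscribe, poll, delete); the host, consumer group, instance name, and credentials are placeholders for illustration only:

import requests

BASE = "http://kafka-rest-proxy.example.com:8082"  # placeholder proxy URL
AUTH = ("kafka_user", "kafka_password")            # Basic Auth, as in step 4
TOPIC = "DATA-MKT-FOUNDATION-CUSTOMER-EVENT-CHANGE-SUBSCRIBER"
V2 = "application/vnd.kafka.v2+json"

# Create a consumer instance within a consumer group
r = requests.post(f"{BASE}/consumers/servicenow_group", auth=AUTH,
                  headers={"Content-Type": V2},
                  json={"name": "sn_consumer_1", "format": "json",
                        "auto.offset.reset": "earliest"})
base_uri = r.json()["base_uri"]  # URI of the new consumer instance

# Assign/subscribe the instance to the topic
requests.post(f"{base_uri}/subscription", auth=AUTH,
              headers={"Content-Type": V2}, json={"topics": [TOPIC]})

# Get messages
msgs = requests.get(f"{base_uri}/records", auth=AUTH,
                    headers={"Accept": "application/vnd.kafka.json.v2+json"})
print(msgs.json())

# Delete the consumer instance when done
requests.delete(base_uri, auth=AUTH, headers={"Content-Type": V2})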


3)    Install ServiceNow Kafka Consumer

Integrates your ServiceNow instance with a Kafka consumer and stores the data in ServiceNow tables.

Note:

·       This spoke requires the IntegrationHub Enterprise plugin to be installed first.

·       This spoke was built for Confluent REST Proxy API v2



4)    Set up the Confluent Kafka REST Proxy spoke

a)     Create a credential record for the Confluent Kafka REST Proxy spoke

·       Connections & Credentials > Credentials

·       New > Basic Auth Credentials


 

b)     Create a connection record for the Confluent Kafka REST Proxy spoke

i)       Connections & Credentials > Connection & Credential Aliases

ii)     Select the OOB “Kafka REST” record

iii)    From the Connections tab, click New

iv)    While creating the connection from Connection & Credential Aliases, don’t forget to select a MID Server, since Kafka is on premises and ServiceNow is in the cloud.





 

5)    Create the Robust Transform Engine along with Entities, Entity Fields, and Entity Operations

The Robust Transform Engine is used in place of a transform map in ServiceNow.

Transform Maps

Transform maps define the mapping from imported data stored in a staging table to a single target table in the Now Platform. A transform map performs both the transform and processing functions.

Features / Limitations

Ø  One-to-one mapping

Ø  Staging table to target table

Ø  Multiple transform maps are needed to map to multiple tables

 

Robust Import Set Transformers

The Robust Transform Engine (RTE) and the robust import set transformer separate the transform and processing functions, providing a more flexible alternative to transform maps. You can transform the data as desired and then load it into one or more target tables (see the conceptual sketch after the feature list below). Records are processed in batches to enhance performance.

Features / Limitations

Ø  One-to-many mapping

Ø  No staging table is mandatory

Ø  A single transform map can map to multiple tables
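To make the one-to-many idea concrete, here is a conceptual Python sketch; this is not ServiceNow's actual RTE implementation, and the message shape, field names, and table names are hypothetical:

# Conceptual illustration only: a single inbound Kafka message (one source
# record) is transformed into rows for two different target tables.
message = {                      # hypothetical Kafka message payload
    "customer_id": "C-1001",
    "email": "jane@example.com",
    "subscription": "PREMIUM",
}

customer_row = {                 # row for a hypothetical customer table
    "number": message["customer_id"],
    "email": message["email"],
}
subscription_row = {             # row for a hypothetical subscription table
    "customer": message["customer_id"],
    "plan": message["subscription"],
}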

 

a)     Create ETL Definition



b)     ETL Entities and their fields



c)     Entity Fields



d)     Entity Operations, if required







e)     RTE Entity Mapping





 

 

6)    Create the Kafka Consumer

a)     Navigate to Process Automation > Kafka Consumer

b)     Click New and fill in all the fields

c)     Remember: when you start the listener, use the Earliest offset first; from the second time on, change it to Latest.

d)     The Partition per group field on the Kafka Consumer form divides the topic’s partitions into partition groups.

i)       If there are three partitions (0, 1, 2), it’s better to set “Partition per group” to 3 so that only one record is created in the partition group table (see the sketch below).
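A minimal sketch of the arithmetic behind this, assuming the grouping is a simple ceiling division (the actual logic inside ServiceNow may differ):

import math

def partition_group_count(num_partitions: int, partitions_per_group: int) -> int:
    # Number of partition group records created for the consumer
    return math.ceil(num_partitions / partitions_per_group)

print(partition_group_count(3, 3))  # -> 1: partitions 0, 1, 2 in one group
print(partition_group_count(3, 1))  # -> 3: one group per partition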



Reference

https://docs.servicenow.com/bundle/quebec-servicenow-platform/page/administer/integrationhub-store-spokes/concept/kafka-consumer.html

7)    Change Schedule from on-demand to interval

This is the schedule that triggers the OOB subflow to fetch the data and pass it to the Robust Transform Engine.



8)    Start Kafka Consumer

This fetches the partition details from Kafka and passes them into the “Kafka Partition Group Listener States” table.

Remember: when you start the listener, use the Earliest offset first; from the second time on, change it to Latest.





 

Troubleshooting:

1.      Check the MID Server → it should be up and running

2.      Test the credentials → the connection should be successful (a quick check is sketched after this list)

3.      Check the MID Server logs → navigate to the ECC Queue and check for the input records

4.      Open the flow executions to see the received message → navigate to today’s executions under the flow

5.      Open the Robust Transform Map attached to the Kafka Consumer and check whether the data operations are working as expected.

6.      While pushing data from Postman, you should be on the client network, or push it from the MID Server
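For items 1, 2, and 6, a quick way to verify reachability and credentials from the MID Server host is to list the topics on the REST Proxy. A minimal Python sketch, with placeholder host and credentials:

import requests

BASE = "http://kafka-rest-proxy.example.com:8082"  # placeholder proxy URL
AUTH = ("kafka_user", "kafka_password")            # placeholder credentials

# Listing topics is a cheap check: a 200 response means the proxy is
# reachable and the Basic Auth credentials are accepted.
resp = requests.get(f"{BASE}/topics", auth=AUTH, timeout=10)
print(resp.status_code)
print(resp.json())  # list of topic names; your topic should appear here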

 

Possible Errors

Fetching Customer Data

 

1.     Scenario 1:

Topic: DATA-MKT-FOUNDATION-CUSTOMER-EVENT-CHANGE-SUBSCRIBER

Offset: Earliest

Format: JSON



 

Postman:

Message published in Kafka successfully
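For reference, here is a rough Python equivalent of the Postman publish, using the REST Proxy v2 json embedded format; the host, credentials, and payload below are placeholders:

import requests

BASE = "http://kafka-rest-proxy.example.com:8082"  # placeholder proxy URL
AUTH = ("kafka_user", "kafka_password")            # placeholder credentials
TOPIC = "DATA-MKT-FOUNDATION-CUSTOMER-EVENT-CHANGE-SUBSCRIBER"

# With the json embedded format, key and value are serialized as JSON.
resp = requests.post(
    f"{BASE}/topics/{TOPIC}", auth=AUTH,
    headers={"Content-Type": "application/vnd.kafka.json.v2+json"},
    json={"records": [{"key": {"id": "C-1001"},
                       "value": {"customer_id": "C-1001", "event": "CHANGE"}}]})
print(resp.json())  # contains the partition and offset of the published record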



Result on the ServiceNow side while pulling data from Kafka

Status: Error with no data → this kind of error occurs when ServiceNow, while fetching data from the Kafka topic, receives a key from the producer side (9eb90fea-4694-489b-993c-b7e2247f54a4, see the error below) that is not in JSON format.

Flow Designer: Kafka Consumer [ Kafka Consumer ] - com.fasterxml.jackson.core.JsonParseException: Unexpected character ('b' (code 98)) in numeric value: Exponent indicator not followed by a digit
at [Source: (byte[])"9eb90fea-4694-489b-993c-b7e2247f54a4"; line: 1, column: 4] with error code 50002



For the second partition:

Flow Designer: Kafka Consumer [ Kafka Consumer ] - com.fasterxml.jackson.core.JsonParseException: Unexpected character ('-' (code 45)): Expected space separating root-level values
at [Source: (byte[])"3-603072"; line: 1, column: 3] with error code 50002
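One way to inspect such records outside ServiceNow is to create a REST Proxy consumer with the binary embedded format instead of json: the proxy then returns keys and values base64-encoded rather than trying to parse them as JSON. A debugging sketch, with placeholder host, group, and credentials:

import base64
import requests

BASE = "http://kafka-rest-proxy.example.com:8082"  # placeholder proxy URL
AUTH = ("kafka_user", "kafka_password")            # placeholder credentials
V2 = "application/vnd.kafka.v2+json"
TOPIC = "DATA-MKT-FOUNDATION-CUSTOMER-EVENT-CHANGE-SUBSCRIBER"

# Create a binary-format consumer so non-JSON keys don't break deserialization
r = requests.post(f"{BASE}/consumers/sn_debug_group", auth=AUTH,
                  headers={"Content-Type": V2},
                  json={"name": "sn_debug_1", "format": "binary",
                        "auto.offset.reset": "earliest"})
base_uri = r.json()["base_uri"]

requests.post(f"{base_uri}/subscription", auth=AUTH,
              headers={"Content-Type": V2}, json={"topics": [TOPIC]})

records = requests.get(f"{base_uri}/records", auth=AUTH,
                       headers={"Accept": "application/vnd.kafka.binary.v2+json"})
for rec in records.json():
    key = base64.b64decode(rec["key"]).decode() if rec["key"] else None
    print(key)  # e.g. "9eb90fea-4694-489b-993c-b7e2247f54a4"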

 



2.     Scenario 2:

Topic: DATA-MKT-FOUNDATION-CUSTOMER-EVENT-CHANGE-SUBSCRIBER

Offset: Latest

Format: JSON



Result: No error, but no data either → this happens when ServiceNow is not able to pull the data from Kafka, or there is no data in the Kafka topic.

{
    "Messages Response": {
        "number_of_messages": 0,
        "status": "success"
    }
}

 

 

3.     Scenario 3: Using Avro as the format

Result: Error → these kinds of errors occur when ServiceNow is not able to move the consumer past an offset in Kafka to reach the next one (e.g., offset 46309).

{
    "Messages Response": {
        "error_code": "50002",
        "error_message": "Error deserializing key/value for partition DATA-MKT-FOUNDATION-CUSTOMER-EVENT-CHANGE-SUBSCRIBER-0 at offset 46309. If needed, please seek past the record to continue consumption.",
        "status": "error"
    }
}

{
    "Messages Response": {
        "error_code": "50002",
        "error_message": "Error deserializing key/value for partition DATA-MKT-FOUNDATION-CUSTOMER-EVENT-CHANGE-SUBSCRIBER-0 at offset 46381. If needed, please seek past the record to continue consumption.",
        "status": "error"
    }
}

{
    "Messages Response": {
        "error_code": "50002",
        "error_message": "Error deserializing key/value for partition DATA-MKT-FOUNDATION-CUSTOMER-EVENT-CHANGE-SUBSCRIBER-0 at offset 46264. If needed, please seek past the record to continue consumption.",
        "status": "error"
    }
}
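The error message itself suggests the fix: seek the consumer past the bad record. With the REST Proxy v2 API, this is a POST to the consumer instance’s /positions endpoint; the base_uri and credentials below are placeholders for the instance created earlier:

import requests

AUTH = ("kafka_user", "kafka_password")  # placeholder credentials
base_uri = ("http://kafka-rest-proxy.example.com:8082"
            "/consumers/servicenow_group/instances/sn_consumer_1")

# Move the consumer to offset 46310, one past the failing offset 46309;
# the next GET .../records call resumes from there.
requests.post(
    f"{base_uri}/positions", auth=AUTH,
    headers={"Content-Type": "application/vnd.kafka.v2+json"},
    json={"offsets": [{
        "topic": "DATA-MKT-FOUNDATION-CUSTOMER-EVENT-CHANGE-SUBSCRIBER",
        "partition": 0,
        "offset": 46310,
    }]})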

 

 

 
