Loading Kafka Configuration dynamically for Memsql Pipeline

Hi,

I’m able to create a pipeline that connects to a Kafka endpoint and loads the data. Now I’m trying to pass the Kafka details dynamically. Is there any way to do this without hardcoding the Kafka parameters in the MemSQL pipeline?

CREATE PIPELINE mypipeline AS
LOAD DATA KAFKA '127.0.0.1/my-topic'
INTO TABLE my_table;

START PIPELINE mypipeline;

Thanks
Arun

Hi Arun

I am not 100% sure what you’re asking – do you mean, how you could use code to dynamically change/assign the my-topic name?

Not within MemSQL directly. You could use a bash script with environment variables and substitute them in programmatically; see the attached mini-example:

create_pipeline_template.sql

use db;

CREATE OR REPLACE PIPELINE test_pipeline
AS LOAD DATA KAFKA '127.0.0.1/KAFKA_TOPIC'
INTO TABLE my_table;

start pipeline test_pipeline; 

update_pipeline.sh

#!/bin/bash

CREATE_PIPELINE_TEMPLATE_PATH=$(readlink -f create_pipeline_template.sql)

# use your logic here to set topic name
export KAFKA_TOPIC=mytopic

# Use perl rather than sed to avoid problems with special characters,
# e.g. '/', in replacement text.
#
cat "$CREATE_PIPELINE_TEMPLATE_PATH" | \
    perl -pe 's/KAFKA_TOPIC/$ENV{KAFKA_TOPIC}/g;' \
    | tee /dev/stderr \
    | mysql -u root -h 0 -P 3306

Then you would just run "update_pipeline.sh".
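To see just the substitution step in isolation (no MemSQL connection needed), here is a minimal sketch using the same perl trick; the topic name `orders` and the inline template are placeholder assumptions, not values from your setup:

```shell
#!/bin/bash
# Sketch of the env-variable substitution step only.
# "orders" is a placeholder topic name for illustration.
export KAFKA_TOPIC=orders

# Inline copy of the template, standing in for create_pipeline_template.sql.
TEMPLATE="CREATE OR REPLACE PIPELINE test_pipeline
AS LOAD DATA KAFKA '127.0.0.1/KAFKA_TOPIC'
INTO TABLE my_table;"

# Replace the KAFKA_TOPIC token with the environment variable's value;
# perl avoids sed's trouble with special characters like '/' in the
# replacement text.
RESULT=$(printf '%s\n' "$TEMPLATE" | perl -pe 's/KAFKA_TOPIC/$ENV{KAFKA_TOPIC}/g;')

echo "$RESULT"
```

Piping `$RESULT` to the `mysql` client, as in the script above, would then create the pipeline with the substituted topic.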

Thanks, Alec. Let me try it out.

Regards
Arun