Strings are enclosed in double quotes when we insert data into a table using a stored procedure pipeline

{"ip": "10.77.44.94", "org": "52f95eb3-350c-4512-ba42-e0938a66c9eb", "time_zone": "IST", "node_id": "000C29E54018", "name": "Analytics-94"}

We receive the above string from a Kafka topic.

We insert it into a table through a stored procedure using a Kafka pipeline.

We are able to insert data into the table successfully, but the values in the string columns are enclosed in double quotes.

How can we rectify this?

Could you paste the output of SHOW CREATE PIPELINE YOUR_PIPELINE_NAME and SHOW CREATE PROCEDURE YOUR_PROCEDURE_NAME, as well as a sample of the problematic inserted data, e.g. via SELECT * FROM your_table LIMIT 1? This is generally unexpected, though it depends on how exactly the procedure performs the insert.

Are you potentially hitting the issue mentioned towards the end of Unable to ingest JSON data with PIPELINE INTO PROCEDURE - #4 by sasha - JSON - SingleStore Forums?

The takeaway there is that you should use ::$ or JSON_EXTRACT_STRING() instead of :: to extract subobjects if you want them assigned as SQL strings, rather than as JSON string values (which keep their escapes and enclosing quotes).
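For illustration, here is a minimal sketch of that difference inside a stored procedure; the table name (node_info), procedure name (proc_insert_nodes), and QUERY parameter shape are assumptions for the example, not your actual schema:

CREATE OR REPLACE PROCEDURE proc_insert_nodes(batch QUERY(payload JSON)) AS
BEGIN
    INSERT INTO node_info (ip, org, name)
    SELECT
        payload::$ip,                          -- ::$ yields a SQL string, no enclosing quotes
        JSON_EXTRACT_STRING(payload, 'org'),   -- equivalent builtin form
        payload::$name
    FROM batch;
    -- Using payload::ip here instead would store the JSON string value verbatim,
    -- i.e. "10.77.44.94" with the double quotes included.
END;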

Also, since I know that it’s a common point of confusion: CREATE PIPELINE ... FORMAT JSON can handle extracting subobjects of each incoming JSON and assigning them to columns, with automatic type detection/conversion. If the stored procedure is just doing that, it’ll be more efficient to use that functionality and remove the stored procedure entirely.
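As a sketch of that approach, a pipeline along these lines could do the extraction without any procedure; the broker address, topic, table, and column names below are placeholders based on your sample document, so adjust them to your setup:

CREATE PIPELINE nodes_pipeline AS
LOAD DATA KAFKA 'your-broker:9092/your-topic'
INTO TABLE node_info
FORMAT JSON
(
    -- each col <- path mapping extracts that subvalue from every incoming JSON
    -- document and converts it to the column's SQL type, so strings land
    -- without enclosing quotes
    ip <- ip,
    org <- org,
    time_zone <- time_zone,
    node_id <- node_id,
    name <- name
);

START PIPELINE nodes_pipeline;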

Thank you very much for the solution.