Spark-connector run error

I use the spark-connector like this in a Spark Streaming application:

kafkaDirectStream.foreachRDD(rdd => {
  try {
    if (!rdd.isEmpty) {
      // Transform each Kafka record into a Row
      val rowRdds = rdd.mapPartitions(partitions => {
        partitions.map(x => {
          ......
        })
      })
      val schema = StructType(Seq(StructField("Name", StringType, false),
                                  StructField("Age", IntegerType, false)))

      // Build the DataFrame from the transformed rows, not the raw Kafka RDD
      val df = sqlContext.createDataFrame(rowRdds, schema)
      df
        .write
        .format("com.memsql.spark.connector")
        // SaveMode "error" (ErrorIfExists) fails if the target table already exists
        .mode("error")
        .save("people.students")
    }
  } catch {
    case e: Exception => throw new RuntimeException(e)
  }
})
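For context, the 2.x connector used above reads its MemSQL connection settings from the Spark configuration. A minimal sketch with placeholder values (the host, credentials, and database here are illustrative, not taken from the original job):

import org.apache.spark.SparkConf

// Placeholder connection settings for the 2.x connector; the connector
// falls back to 127.0.0.1:3306 as root when these are not set.
val conf = new SparkConf()
  .set("spark.memsql.host", "memsql-master")     // master aggregator host
  .set("spark.memsql.port", "3306")
  .set("spark.memsql.user", "root")
  .set("spark.memsql.password", "")
  .set("spark.memsql.defaultDatabase", "people")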

The result is that only a small amount of data gets written into MemSQL, and the write speed is very slow. I then killed the application and found some error logs in the YARN application logs, as follows. How can I resolve this issue?

Hello Juoon,

Sorry for the late response. Can you see if our current doc on Spark is helpful? Here is the link: SingleStoreDB Cloud · SingleStore Documentation

Our engineering team has also released a new beta version of the Spark connector on GitHub. Here is that link: https://github.com/memsql/memsql-spark-connector/tree/3.0.0-beta

Here is the doc link for the above connector
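To give a flavor of the new connector's API, here is a minimal write sketch based on the 3.0.0-beta README (the endpoint, credentials, and the use of SaveMode.Append are illustrative assumptions, not taken from your job):

import org.apache.spark.sql.{SaveMode, SparkSession}

val spark = SparkSession.builder()
  .appName("memsql-write-example")
  // Hypothetical endpoint and credentials; replace with your cluster's values
  .config("spark.datasource.memsql.ddlEndpoint", "memsql-master:3306")
  .config("spark.datasource.memsql.user", "root")
  .config("spark.datasource.memsql.password", "")
  .getOrCreate()

// df stands in for the DataFrame built in your streaming job
df.write
  .format("memsql")        // the 3.0 connector registers under the short name "memsql"
  .mode(SaveMode.Append)   // Append avoids failing once the table exists, unlike mode("error")
  .save("people.students") // database.table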

Hope this is helpful. Let us know how you ultimately implemented your solution.

Thanks,