Spark-connector run error

I use the Spark connector like this in a Spark Streaming application:

kafkaDirectStream.foreachRDD(rdd => {
      try {
        if (!rdd.isEmpty) {
          val schema = StructType(Seq(
            StructField("Name", StringType, false),
            StructField("Age", IntegerType, false)))
          val rowRdds = rdd.mapPartitions(partition =>
            partition.map(record => Row(/* parse record into Name, Age */)))
          val df = sqlContext.createDataFrame(rowRdds, schema)
          // write df to MemSQL here
        }
      } catch {
        case e: Exception =>
          throw new RuntimeException(e)
      }
    })
The result is that only a small amount of data has been written into MemSQL, and the write speed is very slow. I then killed the application and found some error logs in the YARN application logs, as follows. How can I resolve this issue?
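In case it helps while debugging: "only a little data written, very slowly" often points to per-record writes rather than batched ones. A connector-independent workaround is to open one JDBC connection per partition and batch the inserts, since MemSQL speaks the MySQL wire protocol. This is only a sketch under assumptions: the host `memsql-master`, database `mydb`, table `people`, and credentials below are hypothetical placeholders, and `rowRdds` is the RDD of `Row`s built in the snippet above.

```scala
import java.sql.DriverManager

// Hypothetical endpoint and credentials; adjust to your cluster.
val url = "jdbc:mysql://memsql-master:3306/mydb"

rowRdds.foreachPartition { partition =>
  // One connection and one prepared statement per partition,
  // not per record, so network round-trips are amortized.
  val conn = DriverManager.getConnection(url, "root", "")
  try {
    conn.setAutoCommit(false)
    val stmt = conn.prepareStatement(
      "INSERT INTO people (Name, Age) VALUES (?, ?)")
    partition.foreach { row =>
      stmt.setString(1, row.getString(0))
      stmt.setInt(2, row.getInt(1))
      stmt.addBatch()  // queue the row instead of sending it immediately
    }
    stmt.executeBatch()  // flush the whole partition in one batch
    conn.commit()
  } finally {
    conn.close()
  }
}
```

The key design point is that connection setup and statement parsing happen once per partition, and `addBatch`/`executeBatch` lets the driver send many rows per round-trip.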

Hello Juoon,

Sorry for the late response. Can you see if our current doc on Spark is helpful? Here is the link

Our engineering team has also released a new connector for Spark on GitHub (beta version). Here is that link

Here is the doc link for the above connector

Hope this is helpful. Let us know how you ultimately implemented your solution.
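For anyone landing on this thread later: with the connector installed, the write step that was missing from the original snippet typically reduces to a one-line DataFrame save. A rough sketch, assuming the 1.x-era package name and treating the database and table names as placeholders (check the linked docs for the exact API and configuration):

```scala
import com.memsql.spark.connector._  // assumed connector package; see the docs link above

// Inside foreachRDD, after building df as in the question.
// saveToMemSQL is the connector's DataFrame write helper;
// "mydb" and "people" are illustrative names.
df.saveToMemSQL("mydb", "people")
```

This keeps the write distributed across executors rather than collecting rows to the driver.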