Friday, August 12, 2016

MySQL Driver Error in Apache Spark

I was following the Spark example for loading data from a MySQL database. See "http://spark.apache.org/examples.html"

Executing the example threw the following error:
org.apache.spark.SparkException: Job aborted due to stage failure: Task 0 in stage 20.0 failed 4 times, most recent failure: Lost task 0.3 in stage 20.0 (TID 233, ip-172-22-11-249.ap-southeast-1.compute.internal): java.lang.IllegalStateException: Did not find registered driver with class com.mysql.jdbc.Driver

Spark's JDBC data source can fail to find the driver on the executors unless the driver class is named explicitly. To force Spark to load "com.mysql.jdbc.Driver", set the "driver" option:

val df = sqlContext
  .read
  .format("jdbc")
  .option("url", url)
  .option("dbtable", "people")
  .option("driver", "com.mysql.jdbc.Driver")  // force-register the MySQL driver
  .load()
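Setting the "driver" option only helps if the MySQL Connector/J JAR is actually on the classpath of both the driver and the executors. A common way to do that is to pass the JAR to spark-submit; this is a sketch, and the JAR path, version, and application name below are assumptions — substitute the connector JAR you actually have:

```shell
# Ship the MySQL connector JAR to both the driver and the executors.
# JAR path/version and app name are placeholders, not from the original post.
spark-submit \
  --jars /path/to/mysql-connector-java-5.1.39-bin.jar \
  --driver-class-path /path/to/mysql-connector-java-5.1.39-bin.jar \
  my-spark-app.jar
```

--jars distributes the JAR to the executors, while --driver-class-path puts it on the driver's own classpath, so the class is visible on both sides.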
