
      Zeppelin: spark.executor.extraClassPath and SPARK_CLASSPATH

       飲茶仙人 2018-04-03

      The error reported is as follows:


      WARN [2017-06-27 15:47:59,777] ({pool-2-thread-2} Logging.scala[logWarning]:66) - SPARK_CLASSPATH was detected (set to '/home/raini/spark/lib/mysql-connector-java-5.1.38-bin.jar:').
      This is deprecated in Spark 1.0+.

      Please instead use:
       - ./spark-submit with --driver-class-path to augment the driver classpath
       - spark.executor.extraClassPath to augment the executor classpath
              

       WARN [2017-06-27 15:47:59,778] ({pool-2-thread-2} Logging.scala[logWarning]:66) - Setting 'spark.executor.extraClassPath' to '/home/raini/spark/lib/mysql-connector-java-5.1.38-bin.jar:' as a work-around.
      ERROR [2017-06-27 15:47:59,780] ({pool-2-thread-2} Logging.scala[logError]:91) - Error initializing SparkContext.
      org.apache.spark.SparkException: Found both spark.driver.extraClassPath and SPARK_CLASSPATH. Use only the former.
      at org.apache.spark.SparkConf$$anonfun$validateSettings$7$$anonfun$apply$8.apply(SparkConf.scala:543)
      at org.apache.spark.SparkConf$$anonfun$validateSettings$7$$anonfun$apply$8.apply(SparkConf.scala:541)
      at scala.collection.immutable.List.foreach(List.scala:381)
      at org.apache.spark.SparkConf$$anonfun$validateSettings$7.apply(SparkConf.scala:541)
      at org.apache.spark.SparkConf$$anonfun$validateSettings$7.apply(SparkConf.scala:529)


      Solution:
      1. Set the driver classpath when submitting the job by adding the --driver-class-path parameter:

      $  bin/spark-submit --master local[2]  --driver-class-path lib/mysql-connector-java-5.1.35.jar --class  spark.SparkToJDBC ./spark-test_2.10-1.0.jar
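
      The warning above also recommends spark.executor.extraClassPath for the executor side. If the executors open their own JDBC connections, the connector JAR can be passed to them as well via --conf. This is only a sketch extending the command above, reusing the JAR path from this post:

      $ bin/spark-submit --master local[2] \
          --driver-class-path lib/mysql-connector-java-5.1.35.jar \
          --conf spark.executor.extraClassPath=lib/mysql-connector-java-5.1.35.jar \
          --class spark.SparkToJDBC ./spark-test_2.10-1.0.jar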


      2. Alternatively, set the driver classpath through the SPARK_CLASSPATH environment variable in conf/spark-env.sh of the Spark installation, like so:

      export SPARK_CLASSPATH=$SPARK_CLASSPATH:/iteblog/com/mysql-connector-java-5.1.35.jar


      This also resolves the exception above. However, you cannot configure SPARK_CLASSPATH in conf/spark-env.sh and pass the --driver-class-path parameter at submit time simultaneously; doing both raises exactly the exception shown above.
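
      Since the log marks SPARK_CLASSPATH itself as deprecated, a third option (a sketch, not from the original post) is to set the two properties named in the error message directly in conf/spark-defaults.conf, replacing both the environment variable and the command-line flag:

      spark.driver.extraClassPath   /home/raini/spark/lib/mysql-connector-java-5.1.38-bin.jar
      spark.executor.extraClassPath /home/raini/spark/lib/mysql-connector-java-5.1.38-bin.jar

      The same restriction applies: combining SPARK_CLASSPATH with either of the other two mechanisms triggers the "Found both spark.driver.extraClassPath and SPARK_CLASSPATH" error.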


      So the fix is to keep only one of these settings and delete the rest. Here, the spark-env.sh entry was removed: export SPARK_CLASSPATH=...
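
      To confirm that the driver actually picked up the connector after the change, one quick check (an assumption, not from the original post) is to load the driver class in spark-shell; com.mysql.jdbc.Driver is the driver class shipped in Connector/J 5.1:

      $ bin/spark-shell --driver-class-path lib/mysql-connector-java-5.1.35.jar
      scala> Class.forName("com.mysql.jdbc.Driver")   // returns the Class object; a ClassNotFoundException means the JAR is not on the driver classpath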



