Spark-Scala XGBoost mismatch

Hi everyone,

My project uses XGBoost with Spark and Scala: house price prediction using XGBoost.

    <dependency>
        <groupId>ml.dmlc</groupId>
        <artifactId>xgboost4j_${scala.binary.version}</artifactId>
        <version>LATEST</version>
    </dependency>

    <dependency>
        <groupId>ml.dmlc</groupId>
        <artifactId>xgboost4j-spark_${scala.binary.version}</artifactId>
        <version>LATEST</version>
    </dependency>

    <dependency>
        <groupId>com.microsoft.ml.spark</groupId>
        <artifactId>mmlspark_2.11</artifactId>
        <version>LATEST</version>
    </dependency>
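
As a side note, `LATEST` is deprecated in Maven 3 and makes the build non-reproducible, so pinning explicit versions is safer. A sketch of a pinned XGBoost dependency (the version here is an assumption to verify against Maven Central; if I recall correctly, the 0.x releases that targeted Scala 2.11 used a plain artifactId without a Scala suffix, while the suffixed artifacts such as `xgboost4j-spark_2.12` only appeared from XGBoost 1.0 onward):

```xml
    <!-- Pinned version instead of LATEST; no Scala suffix on the
         0.x artifacts (assumption: 0.90 matches a Scala 2.11 setup) -->
    <dependency>
        <groupId>ml.dmlc</groupId>
        <artifactId>xgboost4j-spark</artifactId>
        <version>0.90</version>
    </dependency>
```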

This is my error

    Exception in thread "main" java.lang.NoSuchMethodError: scala.Predef$.refArrayOps([Ljava/lang/Object;)Lscala/collection/mutable/ArrayOps;
      at ml.dmlc.xgboost4j.scala.spark.DataUtils$.convertDataFrameToXGBLabeledPointRDDs(DataUtils.scala:161)
      at ml.dmlc.xgboost4j.scala.spark.XGBoostRegressor.train(XGBoostRegressor.scala:175)
      at ml.dmlc.xgboost4j.scala.spark.XGBoostRegressor.train(XGBoostRegressor.scala:44)
      at org.apache.spark.ml.Predictor.fit(Predictor.scala:118)
      at sahibinden.XgbootRegression$.main(XgbootRegression.scala:105)
      at sahibinden.XgbootRegression.main(XgbootRegression.scala

**My main questions: how should I set these dependencies, and why am I getting this error? Which Scala-compatible versions are you using?**

Are you using Spark 2.4.3 and Scala 2.11?


Yes.

  <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-core -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.12</artifactId>
        <version>2.4.3</version>
    </dependency>


    <!-- https://mvnrepository.com/artifact/org.scala-lang/scala-library -->
    <dependency>
        <groupId>org.scala-lang</groupId>
        <artifactId>scala-library</artifactId>
        <version>2.11.12</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-sql -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.12</artifactId>
        <version>2.4.3</version>
    </dependency>

    <!-- https://mvnrepository.com/artifact/org.apache.spark/spark-mllib -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.12</artifactId>
        <version>2.4.3</version>
    </dependency>

What is the problem?

The `spark-core_2.12` artifact is the problem: your POM declares `scala-library` 2.11.12, but the Spark artifacts (`spark-core_2.12`, `spark-sql_2.12`, `spark-mllib_2.12`) are built for Scala 2.12. Scala minor versions are not binary compatible, which is exactly what the `NoSuchMethodError` on `scala.Predef$.refArrayOps` indicates. Please use a Spark installation and Spark dependencies built for Scala 2.11.
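
If Spark is pulled in via these Maven dependencies rather than a separate installation, switching the artifact suffixes to `_2.11` so they match the declared `scala-library` 2.11.12 should resolve the mismatch. A sketch (Spark 2.4.3 was published for Scala 2.11, but verify the coordinates on Maven Central):

```xml
    <!-- Spark artifacts built for Scala 2.11, matching scala-library 2.11.12 -->
    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-core_2.11</artifactId>
        <version>2.4.3</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-sql_2.11</artifactId>
        <version>2.4.3</version>
    </dependency>

    <dependency>
        <groupId>org.apache.spark</groupId>
        <artifactId>spark-mllib_2.11</artifactId>
        <version>2.4.3</version>
    </dependency>
```

Whichever route you take, every Scala-suffixed artifact in the POM must share the same suffix, and it must match the Scala version of the cluster's Spark runtime.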
