Fetch parameters in custom objective defined in param map

If I want to use certain parameters defined in the param map inside my customized objective function, how do I do that?

The example code does not use any parameters.

How can a customized objective function support parameters? Can anyone help?

In the example, I think you can add the param map as a field of your CustomObjective class. You will have to set that field manually.
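For reference, here is a minimal sketch of what that could look like, loosely based on the custom log-loss objective in the xgboost4j-example module. The class name `WeightedLogLoss` and the parameter `scalePos` are purely illustrative, not part of the library:

```scala
import ml.dmlc.xgboost4j.scala.{DMatrix, ObjectiveTrait}

// Illustrative custom objective that carries its parameter as a constructor
// field; `scalePos` up-weights the positive class (hypothetical example).
class WeightedLogLoss(scalePos: Float) extends ObjectiveTrait {

  private def sigmoid(x: Float): Float = (1.0 / (1.0 + math.exp(-x))).toFloat

  override def getGradient(predicts: Array[Array[Float]],
                           dtrain: DMatrix): List[Array[Float]] = {
    val labels = dtrain.getLabel
    val grad = new Array[Float](predicts.length)
    val hess = new Array[Float](predicts.length)
    for (i <- predicts.indices) {
      val p = sigmoid(predicts(i)(0))
      val w = if (labels(i) > 0.5f) scalePos else 1.0f // the parameter field is used here
      grad(i) = w * (p - labels(i))              // first-order gradient
      hess(i) = w * p * (1.0f - p)               // second-order gradient
    }
    List(grad, hess)
  }
}
```

You would then pass an instance (e.g. `new WeightedLogLoss(2.0f)`) as the custom objective when constructing the estimator, so the parameter travels with the object rather than being looked up from the param map at training time.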

@hcho3
I added the field and succeeded in training the model, but after I save the pipeline and load it in the production environment, it fails with the following error:

19/06/11 10:29:35 ERROR yarn.ApplicationMaster: User class threw exception: org.json4s.package$MappingException: No constructor for type EvalTrait, JObject(List())
org.json4s.package$MappingException: No constructor for type EvalTrait, JObject(List())
	at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$constructor(Extraction.scala:417)
	at org.json4s.Extraction$ClassInstanceBuilder.org$json4s$Extraction$ClassInstanceBuilder$$instantiate(Extraction.scala:468)
	at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$result$6.apply(Extraction.scala:515)
	at org.json4s.Extraction$ClassInstanceBuilder$$anonfun$result$6.apply(Extraction.scala:512)
	at org.json4s.Extraction$.org$json4s$Extraction$$customOrElse(Extraction.scala:524)
	at org.json4s.Extraction$ClassInstanceBuilder.result(Extraction.scala:512)
	at org.json4s.Extraction$.extract(Extraction.scala:351)
	at org.json4s.Extraction$.extract(Extraction.scala:42)
	at org.json4s.ExtractableJsonAstNode.extract(ExtractableJsonAstNode.scala:21)
	at ml.dmlc.xgboost4j.scala.spark.params.CustomEvalParam.jsonDecode(CustomParams.scala:43)
	at ml.dmlc.xgboost4j.scala.spark.params.CustomEvalParam.jsonDecode(CustomParams.scala:27)
	at ml.dmlc.xgboost4j.scala.spark.params.DefaultXGBoostParamsReader$$anonfun$getAndSetParams$1.apply(DefaultXGBoostParamsReader.scala:117)
	at ml.dmlc.xgboost4j.scala.spark.params.DefaultXGBoostParamsReader$$anonfun$getAndSetParams$1.apply(DefaultXGBoostParamsReader.scala:115)
	at scala.collection.immutable.List.foreach(List.scala:381)
	at ml.dmlc.xgboost4j.scala.spark.params.DefaultXGBoostParamsReader$.getAndSetParams(DefaultXGBoostParamsReader.scala:115)
	at ml.dmlc.xgboost4j.scala.spark.XGBoostClassificationModel$XGBoostClassificationModelReader.load(XGBoostClassifier.scala:523)
	at ml.dmlc.xgboost4j.scala.spark.XGBoostClassificationModel$XGBoostClassificationModelReader.load(XGBoostClassifier.scala:505)
	at org.apache.spark.ml.util.DefaultParamsReader$.loadParamsInstance(ReadWrite.scala:438)
	at org.apache.spark.ml.Pipeline$SharedReadWrite$$anonfun$4.apply(Pipeline.scala:273)
	at org.apache.spark.ml.Pipeline$SharedReadWrite$$anonfun$4.apply(Pipeline.scala:271)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
	at scala.collection.TraversableLike$$anonfun$map$1.apply(TraversableLike.scala:234)
	at scala.collection.IndexedSeqOptimized$class.foreach(IndexedSeqOptimized.scala:33)
	at scala.collection.mutable.ArrayOps$ofRef.foreach(ArrayOps.scala:186)
	at scala.collection.TraversableLike$class.map(TraversableLike.scala:234)
	at scala.collection.mutable.ArrayOps$ofRef.map(ArrayOps.scala:186)
	at org.apache.spark.ml.Pipeline$SharedReadWrite$.load(Pipeline.scala:271)
	at org.apache.spark.ml.PipelineModel$PipelineModelReader.load(Pipeline.scala:347)
	at org.apache.spark.ml.PipelineModel$PipelineModelReader.load(Pipeline.scala:341)
	at com.tencent.wx.security.script.CommonXGBPredictor$.main(CommonXGBPredictor.scala:56)
	at com.tencent.wx.security.script.CommonXGBPredictor.main(CommonXGBPredictor.scala)
	at sun.reflect.NativeMethodAccessorImpl.invoke0(Native Method)
	at sun.reflect.NativeMethodAccessorImpl.invoke(NativeMethodAccessorImpl.java:62)
	at sun.reflect.DelegatingMethodAccessorImpl.invoke(DelegatingMethodAccessorImpl.java:43)
	at java.lang.reflect.Method.invoke(Method.java:498)
	at org.apache.spark.deploy.yarn.ApplicationMaster$$anon$2.run(ApplicationMaster.scala:637)

The error for ObjectiveTrait is the same as the one above.

So could you be so kind as to give a small working example of adding parameters to the loss function?

I’m not an expert in Scala, so I don’t know what’s going on. My guess is that there is an issue with serializing the custom Scala class. Can you make your class Kryo-serializable?
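I can’t promise this is the fix, but here is a minimal sketch of what following that suggestion could look like on the Spark side: switching to the Kryo serializer and registering the custom class. `WeightedLogLoss` is the placeholder class from the earlier sketch; substitute your own objective (and eval) classes:

```scala
import org.apache.spark.SparkConf
import org.apache.spark.sql.SparkSession

// Sketch: use the Kryo serializer and register the custom objective class
// (placeholder name) so executors can deserialize it. Register your custom
// eval class here as well if you use one.
val conf = new SparkConf()
  .set("spark.serializer", "org.apache.spark.serializer.KryoSerializer")
  .registerKryoClasses(Array(classOf[WeightedLogLoss]))

val spark = SparkSession.builder()
  .appName("xgboost-custom-objective")
  .config(conf)
  .getOrCreate()
```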