Error during installation of the R GPU win64 package: 'R' is not recognized as an internal or external command, operable program or batch file

I am trying to install the XGBoost R GPU win64 package from:

https://s3-us-west-2.amazonaws.com/xgboost-nightly-builds/release_1.5.0/xgboost_r_gpu_win64_508a0b0dbd674909e8ec53881e17f3363fd9b508.tar.gz

However, when I enter: R CMD INSTALL ./xgboost_r_gpu_linux.tar.gz

I receive this error:
‘R’ is not recognized as an internal or external command,
operable program or batch file.

This is my setup:
R version 4.1.0 (2021-05-18) – “Camp Pontanezen”
Rtools installed.
Windows 10 64 bit.

I found these suggested solutions for the error:

However, my command above does not contain spaces, so I suppose the root cause is different in my case.

Any idea what could be causing the error?

Thank you!

Please check your PATH environment variable. https://support.shotgunsoftware.com/hc/en-us/articles/114094235653-Setting-global-environment-variables-on-Windows

These are my current variables:

I see that the Path variable is related to the NVIDIA GPU Computing Toolkit. If I remember correctly, I set this during the installation of Keras for deep learning in R.

Do you know if I have to replace this value?

Thank you!

@marboe123 You should add the folder that contains the R executable. For example, if the R executable is found in C:\Program Files\R\R-4.1.0\bin, then PATH should be revised to include C:\Program Files\R\R-4.1.0\bin. Do not delete the other folders that are already in PATH; it is sufficient to add the R folder.
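
If you are not sure which folder that is, a quick way to find it (a small sketch, run from an existing R session such as RGui or RStudio; the example path below is only an illustration) is:

    # Print the directory that contains the R executables; add this folder
    # (or its parent "bin" folder) to the Windows PATH, then open a new
    # Command Prompt so the change takes effect.
    normalizePath(R.home("bin"))
    # e.g. "C:\\Program Files\\R\\R-4.1.0\\bin\\x64"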

@hcho3

Thank you very much!

I did succeed in installing it, although during installation a few lines end with [ FAIL ]. I have copied the output I received during installation below.

I also receive an error if I try to run xgb.train. See my output at the end of this post.

C:\xgboost_gpu_file>R CMD INSTALL ./xgboost_r_gpu_win64_508a0b0dbd674909e8ec53881e17f3363fd9b508.tar.gz

    * installing to library ‘C:/Users/boers/Documents/R/win-library/4.1’
    * installing source package ‘xgboost’ ...
    ** using staged installation
    ** libs
    running ‘src/Makefile.win’ …
    make: Nothing to be done for ‘all’.
    installing to C:/Users/boers/Documents/R/win-library/4.1/00LOCK-xgboost/00new/xgboost/libs/x64
    ** R
    ** data
    ** demo
    ** inst
    ** byte-compile and prepare package for lazy loading
    ** help
    *** installing help indices
    converting help for package ‘xgboost’
    finding HTML links … done
    a-compatibility-note-for-saveRDS-save html
    agaricus.test html
    agaricus.train html
    callbacks html
    cb.cv.predict html
    cb.early.stop html
    cb.evaluation.log html
    cb.gblinear.history html
    cb.print.evaluation html
    cb.reset.parameters html
    cb.save.model html
    dim.xgb.DMatrix html
    dimnames.xgb.DMatrix html
    REDIRECT:topic dimnames<-.xgb.DMatrix -> dimnames.xgb.DMatrix.html [ FAIL ]
    getinfo html
    normalize html
    predict.xgb.Booster html
    prepare.ggplot.shap.data html
    print.xgb.Booster html
    print.xgb.DMatrix html
    print.xgb.cv html
    setinfo html
    slice.xgb.DMatrix html
    xgb.Booster.complete html
    xgb.DMatrix html
    xgb.DMatrix.save html
    xgb.attr html
    REDIRECT:topic xgb.attr<- -> xgb.attr.html [ FAIL ]
    REDIRECT:topic xgb.attributes<- -> xgb.attr.html [ FAIL ]
    xgb.config html
    REDIRECT:topic xgb.config<- -> xgb.config.html [ FAIL ]
    xgb.create.features html
    xgb.cv html
    xgb.dump html
    xgb.gblinear.history html
    xgb.importance html
    xgb.load html
    xgb.load.raw html
    xgb.model.dt.tree html
    xgb.parameters html
    REDIRECT:topic xgb.parameters<- -> xgb.parameters.html [ FAIL ]
    xgb.plot.deepness html
    xgb.plot.importance html
    xgb.plot.multi.trees html
    xgb.plot.shap html
    xgb.plot.shap.summary html
    xgb.plot.tree html
    xgb.save html
    xgb.save.raw html
    xgb.serialize html
    xgb.shap.data html
    xgb.train html
    xgb.unserialize html
    xgbConfig html
    xgboost-deprecated html
    ** building package indices
    ** installing vignettes
    ** testing if installed package can be loaded from temporary location
    ** testing if installed package can be loaded from final location
    ** testing if installed package keeps a record of temporary installation path
    * DONE (xgboost)
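
As a quick sanity check after the install (a minimal sketch, nothing GPU-specific yet), the freshly installed package can be loaded and its version printed from R:

    # Confirm that the newly installed package loads and report its version.
    library(xgboost)
    packageVersion("xgboost")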

Next, I added tree_method = gpu_hist to xgb.train, as in my code below, but this gives the error:

Error in xgb.train(data = xgb_trainval, tree_method = gpu_hist, booster = "gbtree", :
object ‘gpu_hist’ not found

  model_n <- xgb.train(data = xgb_trainval,
                     tree_method = gpu_hist,
                     booster = "gbtree",
                     objective = "binary:logistic",
                     max_depth = parameters_df$max_depth[row],
                     eta = parameters_df$eta[row],
                     subsample = parameters_df$subsample[row],
                     colsample_bytree = parameters_df$colsample_bytree[row],
                     min_child_weight = parameters_df$min_child_weight[row],
                     nrounds = 300,
                     eval_metric = "auc",
                     early_stopping_rounds = 30,
                     print_every_n = 100,
                     watchlist = list(train = xgb_trainval, val = xgb_val))

I found the cause of the error "object ‘gpu_hist’ not found": I had forgotten to use quotes in:

tree_method = "gpu_hist"
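
For reference, here is a minimal self-contained call with the quoted parameter (a sketch using the agaricus demo data bundled with xgboost; it assumes the GPU build installed above and a CUDA-capable GPU):

    library(xgboost)
    # Small demo dataset that ships with xgboost.
    data(agaricus.train, package = "xgboost")
    dtrain <- xgb.DMatrix(agaricus.train$data, label = agaricus.train$label)
    # Note the quotes: "gpu_hist" is a character string, not an R object.
    params <- list(tree_method = "gpu_hist",
                   booster = "gbtree",
                   objective = "binary:logistic",
                   eval_metric = "auc")
    bst <- xgb.train(params = params, data = dtrain, nrounds = 10,
                     watchlist = list(train = dtrain))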

Great, it is running now!


I have compared the outcomes of the same code (which calculates the AUC for several hyperparameter settings) on CPU and GPU, setting the same seed just before

xgb.train

I notice that the outcomes show a high correlation, although they are not exactly identical.

I suppose this is acceptable and due to some randomness in the process, but I just want to report it in case the results should be exactly identical.

The processing time improved from 167.4 minutes to 26.0 minutes (a 6.44x speedup), which is great in my opinion.
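
For anyone who wants to reproduce this kind of comparison, a rough sketch (using the bundled agaricus data instead of my own dataset, and assuming a CUDA-capable GPU) could look like this:

    library(xgboost)
    data(agaricus.train, package = "xgboost")
    dtrain <- xgb.DMatrix(agaricus.train$data, label = agaricus.train$label)

    # Train with a given tree method and return the final training AUC.
    train_auc <- function(method) {
      set.seed(42)  # same seed before each run
      bst <- xgb.train(params = list(tree_method = method,
                                     objective = "binary:logistic",
                                     eval_metric = "auc"),
                       data = dtrain, nrounds = 50,
                       watchlist = list(train = dtrain), verbose = 0)
      tail(bst$evaluation_log$train_auc, 1)
    }

    # "hist" runs on the CPU, "gpu_hist" on the GPU; the two AUC values are
    # typically very close but need not be bit-for-bit identical.
    c(cpu = train_auc("hist"), gpu = train_auc("gpu_hist"))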

No, there is no expectation that CPU and GPU algorithms would produce exactly identical results.