Edit: sorry, it seems like the issue is actually with n_jobs=-1
Hello,
I recently upgraded xgboost to 1.1.1. After doing so, I noticed that training uses only one core at a time for XGBRanker, XGBClassifier, xgboost.train({'objective':'binary:logistic'}) and xgboost.train({'objective':'rank:ndcg'}), even when I set n_jobs=-1. However, multiple cores are used when I set n_jobs=3, n_jobs=5, etc.
I was under the impression that n_jobs=-1 meant using all available cores? At least that was the case when I used 0.78 and 0.90. I just tried upgrading to 1.2.1 with the same result: n_jobs=-1 still only uses 1 core. I've tried it inside a Docker container as well as natively, both running Ubuntu 20.04.
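For now, a possible workaround on my side (just a sketch, assuming os.cpu_count() reports the number of cores you actually want to use) is to resolve the core count in Python and pass it explicitly, since explicit values behave as expected:

```python
import os
from xgboost import XGBClassifier

# Workaround sketch: resolve the core count ourselves instead of relying on n_jobs=-1.
# Assumes os.cpu_count() returns the number of cores we want xgboost to use.
n_cores = os.cpu_count() or 1
model = XGBClassifier(verbosity=2, n_jobs=n_cores, n_estimators=10)
```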
Commands:
model = XGBRanker(objective='rank:ndcg', verbosity=2, n_jobs=-1, n_estimators=10)
uses 1 core only
model = XGBClassifier(verbosity=2, n_jobs=-1, n_estimators=10)
uses 1 core only
model_ = xgb.train({'objective':'rank:ndcg', 'n_jobs':-1, 'verbosity':2, 'num_boost_round':10}, dtrain=dmat)
uses 1 core only
model = XGBRFClassifier(verbosity=2, n_jobs=3, n_estimators=10)
uses 3 cores as expected
model = XGBRFClassifier(verbosity=2, n_jobs=-1, n_estimators=10)
uses 1 core only
model = XGBRFClassifier(verbosity=2, n_jobs=5, n_estimators=10)
uses 5 cores
model = XGBClassifier(verbosity=2, n_jobs=5, n_estimators=10)
uses 5 cores
model = XGBRanker(objective='rank:ndcg', verbosity=2, n_jobs=8, n_estimators=10)
uses 8 cores
Specifying no n_jobs argument, model = XGBRFClassifier(verbosity=2, n_estimators=10)
uses 8 cores, which is all I have.
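In case it helps, here is a self-contained version of what I'm running (a sketch with random dummy data; my real training data obviously differs):

```python
# Reproduction sketch with synthetic data (the dummy data here is not my real dataset).
# Watch CPU usage (e.g. with htop) while each fit runs.
import numpy as np
from xgboost import XGBClassifier

rng = np.random.RandomState(0)
X = rng.rand(200_000, 50)
y = rng.randint(0, 2, size=200_000)

# n_jobs=-1: only one core is used on 1.1.1 / 1.2.1
XGBClassifier(verbosity=2, n_jobs=-1, n_estimators=10).fit(X, y)

# explicit n_jobs: the requested number of cores is used
XGBClassifier(verbosity=2, n_jobs=5, n_estimators=10).fit(X, y)
```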