XGBoost for Swift

Hi all,

I have been working on a Swift wrapper for XGBoost for some time, and I think it is ready to meet its first users. The goal of the wrapper is to provide convenient usage for those who are familiar with the Python API, but also to take advantage of Swift's great interoperability with C to provide even more powerful features in the near future.

Currently, supported features are:

  • Training with an internally calculated objective function
  • Training with a custom objective function
  • Saving / Loading
  • Attributes / Parameters for Booster
  • Callbacks
  • Evaluation using an internal function
  • Evaluation using custom functions
  • Plotting feature importance

In short, all the core functionality needed to train, evaluate, and use your model. A rough sketch of what usage can look like is below.
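To give an idea of the flow, here is a minimal sketch of training, prediction, and saving. Please treat the type and method names (DMatrix, Booster, train, predict, save) as my shorthand rather than the authoritative API, and check the examples in the repository for the exact signatures:

// Minimal sketch only; names and signatures are assumptions,
// see the repository's examples for the real API.
import XGBoost

// Wrap an in-memory float matrix (3 rows x 2 features, flattened) into a DMatrix.
let trainingData = try DMatrix(
    name: "train",
    from: [0.1, 0.2, 0.3, 0.4, 0.5, 0.6],
    shape: Shape(3, 2)
)
try trainingData.set(label: [0, 1, 0])

// Create a booster over the data and train for a few iterations.
let booster = try Booster(with: [trainingData])
try booster.train(iterations: 10, trainingData: trainingData)

// Predict on the same data and persist the model to disk.
let predictions = try booster.predict(from: trainingData)
try booster.save(to: "model.xgboost")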

If you would like to try it out, the code is hosted at https://github.com/kongzii/SwiftXGBoost along with an example of usage. Just add the package dependency:

.package(url: "https://github.com/kongzii/SwiftXGBoost.git", .branch("master")),

and install the XGBoost library if you do not have it yet.
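For context, a minimal Package.swift could look roughly like the sketch below. The product name "XGBoost" and the target name are assumptions on my side, so please check the repository's README for the exact names:

// swift-tools-version:5.2
// Sketch of a manifest using the package; the product name "XGBoost"
// is an assumption, check the SwiftXGBoost README for the exact one.
import PackageDescription

let package = Package(
    name: "MyXGBoostProject",
    dependencies: [
        .package(url: "https://github.com/kongzii/SwiftXGBoost.git", .branch("master")),
    ],
    targets: [
        .target(
            name: "MyXGBoostProject",
            dependencies: [.product(name: "XGBoost", package: "SwiftXGBoost")]
        ),
    ]
)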

The code is documented at https://github.com/kongzii/SwiftXGBoost/wiki.

If you find anything missing or not working properly, please file an issue. Functionality is currently tested against expected behavior or directly against the Python implementation.

Hey all,

I just wanted to post a small update. I have added several new features, and they are showcased with new usage examples in the repository: https://github.com/kongzii/SwiftXGBoost/tree/master/Examples. For example, you can now use Python's NumPy to initialize a DMatrix or use it for prediction with a Booster.
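As a rough illustration of the NumPy interop, something along these lines is now possible. The exact initializer accepting a NumPy array is an assumption on my part, so the Examples directory remains the authoritative reference:

// Hypothetical sketch: create a NumPy array via PythonKit and feed it to a DMatrix.
// The DMatrix(name:from:) overload taking a PythonObject is an assumption,
// see Examples/ in the repository for the real API.
import PythonKit
import XGBoost

let np = Python.import("numpy")
let array = np.random.rand(10, 4).astype("float32")

let data = try DMatrix(name: "numpyData", from: array)
let booster = try Booster(with: [data])
try booster.train(iterations: 5, trainingData: data)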

Currently, I want to focus on better test coverage, to be sure that everything works as expected. It would be super helpful if anyone could try it out and catch things that I perhaps missed. I am currently using this library in one ML project, but it is always possible to miss something :frowning:

Thank you! :blush: