Add support for custom metrics by desilinguist · Pull Request #612 · EducationalTestingService/skll

added 30 commits on May 14, 2020
- Add a private set called `_CUSTOM_METRICS` that will hold the names of any custom metrics.
- Add a new function called `register_custom_metric()` that allows registering custom metric functions and making them available in SKLL.
- Modify `get_acceptable_classification_metrics()` and `get_acceptable_regression_metrics()` to treat all registered custom metrics as acceptable unconditionally.
- Add a new keyword argument to `_parse_and_validate_metrics()` so that it automatically tries to register non-built-in metrics as custom metrics.
- Use this new keyword argument while parsing and validating the `metrics` and `objective` fields in a configuration file.
- Check for conflicts in custom metric module names as well as custom metric function names.

- Add another config test.
- Use `assert_raises_regex()` instead of `assert_raises()` in all tests, so that the expected error messages are also checked.

Reviewers: aoifecahill, mulhod

- The custom metric registration now happens inside `_classify_featureset()`, and the metric function is also added to `globals()` so that it gets serialized properly for gridmap.
- Unfortunately, this means that `_parse_and_validate_metrics()` can no longer detect invalid metrics at config parsing time; the user now finds out that a metric is invalid only once the experiment starts running.
- It now returns the metric function, just like the custom learner loader.
- Since we are no longer attaching the metric to `skll.metrics`, we can remove one of the checks.
- Add a new custom metric test for a conflicting filename, which is now allowed.
- Update the regex in one of the tests to match the new invalid-metric detection code.
- Refactor `_cleanup_custom_metrics()` to make it cleaner.
- Make sure all `run_configuration()` calls are local.