
Releases: automl/SMAC3

v2.2.0

24 Jul 14:38
9d19475

Features

  • Add example to specify total budget (fidelity units) instead of n_trials for multi-fidelity/Hyperband (#1121)
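
A rough sketch of the arithmetic behind this: the function below estimates how many trials fit into a total fidelity budget using textbook Hyperband bracket accounting. It is illustrative only; the function name is made up, and the helper shipped with SMAC in #1121 (whose exact bookkeeping may differ) should be preferred.

```python
import math


def estimate_n_trials(total_budget: float, min_budget: float, max_budget: float, eta: int = 3) -> int:
    """Estimate how many trials fit into a total fidelity budget under
    standard Hyperband bracket arithmetic (illustrative, not SMAC's API)."""
    s_max = int(math.log(max_budget / min_budget, eta))
    trials_per_cycle = 0
    budget_per_cycle = 0.0

    for s in range(s_max, -1, -1):
        n = math.ceil((s_max + 1) / (s + 1) * eta**s)  # configs entering bracket s
        r = max_budget * eta**-s                       # starting fidelity in bracket s
        for i in range(s + 1):
            n_i = math.floor(n * eta**-i)              # configs evaluated in round i
            r_i = r * eta**i                           # fidelity per config in round i
            trials_per_cycle += n_i
            budget_per_cycle += n_i * r_i

    return int(trials_per_cycle * total_budget / budget_per_cycle)


# e.g. 10000 epochs in total, with per-trial budgets between 2 and 50 epochs:
print(estimate_n_trials(total_budget=10_000, min_budget=2, max_budget=50, eta=3))
```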

Dependencies

  • Update numpy NaN usage (#1122) and restrict the numpy version
  • Upgrade to ConfigSpace 1.x.x (#1124)

What's Changed

Full Changelog: v2.1.0...v2.2.0

Version 2.1.0

16 May 13:20
937eb2c


Improvements

  • The surrogate model is now retrained after every iteration by default in the case of blackbox optimization
    (#1106). See the sketch after this list.
  • Integrate LocalAndSortedPriorRandomSearch functionality into LocalAndSortedRandomSearch (#1106).
  • Change the way LocalAndSortedRandomSearch works such that the incumbent is always a starting point and
    random configurations are sampled as the basis of the local search, not in addition to it (#1106).
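
For reference, a minimal sketch of setting the retraining frequency explicitly through the config selector. It assumes the 2.x facade API (a get_config_selector static method with a retrain_after keyword and a config_selector facade argument); treat the exact names as assumptions and check the documentation of your installed version.

```python
from ConfigSpace import ConfigurationSpace
from smac import BlackBoxFacade, Scenario


def quadratic(config, seed: int = 0) -> float:
    return (config["x"] - 2.0) ** 2


cs = ConfigurationSpace({"x": (-5.0, 5.0)})
scenario = Scenario(cs, n_trials=50)

# retrain_after=1 retrains the surrogate after every trial, which (per the note
# above) is now the default for blackbox optimization; larger values retrain less often.
config_selector = BlackBoxFacade.get_config_selector(scenario, retrain_after=1)

smac = BlackBoxFacade(scenario, quadratic, config_selector=config_selector, overwrite=True)
incumbent = smac.optimize()
```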

Bugfixes

  • Fix path for dask scheduler file (#1055).
  • Add OrdinalHyperparameter for random forest imputer (#1065).
  • Don't use mutable default argument (#1067).
  • Propagate the Scenario random seed to get_random_design (#1066).
  • Configurations that fail to become incumbents will be added to the rejected lists (#1069).
  • SMAC's RandomForest no longer crashes when an np.integer is used, e.g. as generated by an np.random.RandomState (#1084).
  • Fix the handling of n_points/challengers in the acquisition maximizers, such that this number now acts as the
    number of points that are sampled from the acquisition function to find the next challengers. The config
    selector is no longer restricted to n_retrain points for finding the maximum; instead, the defaults defined
    via facades/scenarios are used (#1106).

Misc

  • ci: Update action version (#1072).

Minor

  • When a custom dask client is provided, the warning that the n_workers parameter is ignored is only emitted if n_workers deviates from its default value of 1 (#1071).

What's Changed

Full Changelog: v2.0.2...v2.1.0

Version 2.0.2

01 Aug 13:38
541ee7e


Improvements

  • Raise an error when an empty dict is passed as data_to_scatter, as a precaution against an internal Dask error.
  • Add experimental instructions for installing SMAC on Windows via WSL.
  • More detailed documentation regarding continuing runs.

Bugfixes

  • Fix bug in the incumbent selection in the case that multi-fidelity is combined with multi-objective (#1019).
  • Fix callback order (#1040).
  • Handle the configspace as a dictionary in the MLP and ParEGO examples.
  • Adapt the SGD loss to the newest scikit-learn version.

Version 2.0.1

23 May 12:56
d5d5456


Improvements

  • Callback registration is now a public method of the optimizer and allows callbacks to be inserted at a specific position.
  • Adapt developer install instructions to include pre-commit installation.
  • Add option to pass a dask client to the facade, e.g. to enable running on an HPC cluster (#983).
  • Added the scenario.use_default_config argument/attribute (default: False). If set to True, the user's configspace
    default configuration is added as an additional_config to the initial design. This adds one configuration on top
    of the configs originating from the initial design; since n_trials is still respected, this results in one fewer
    BO step. See the sketch after this list.
  • Add example for using a callback to log run metadata to a file (#996).
  • Move base callback and metadata callback files to own callback directory.
  • Add a workaround to be able to pass a dataset via dask.scatter so that serialization/deserialization in Dask becomes much quicker (#993).
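
A minimal sketch combining several of the additions above: passing a Dask client to the facade, scattering a dataset via data_to_scatter, and including the default configuration in the initial design. The dataset and target function are hypothetical, and the assumption that scattered entries are forwarded to the target function as keyword arguments should be verified against the example from #993.

```python
import numpy as np
from ConfigSpace import ConfigurationSpace
from dask.distributed import Client
from smac import HyperparameterOptimizationFacade, Scenario

# Hypothetical dataset; scattering it once avoids re-serializing it for every trial.
data = {"X": np.random.rand(10_000, 20), "y": np.random.randint(0, 2, 10_000)}


def train(config, seed: int = 0, X=None, y=None) -> float:
    # Assumption: the scattered entries arrive here as keyword arguments.
    # A real target function would train a model on X, y and return a validation loss.
    return float(config["alpha"])


cs = ConfigurationSpace({"alpha": (1e-5, 1.0)})

# use_default_config=True adds the configspace default to the initial design
# as an additional config (still counted against n_trials).
scenario = Scenario(cs, n_trials=100, use_default_config=True)

client = Client(n_workers=4)  # e.g. a client connected to an HPC cluster
smac = HyperparameterOptimizationFacade(scenario, train, dask_client=client)
incumbent = smac.optimize(data_to_scatter=data)
```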

Bugfixes

  • The ISB-pair differences over the incumbent's configurations are computed correctly now (#956).
  • Adjust the number of configurations in the different stages of hyperband brackets to conform to the original paper.
  • Fix validation in smbo to use the seed from the scenario.
  • Change the order of callbacks; the intensifier callback for incumbent selection is now the first callback.
  • intensifier.get_state() now checks whether the configurations contained in the queue are stored in the runhistory (#997).

Version 2.0.0

03 Mar 10:29
731854e


Improvements

  • Clarify origin of configurations (#908).
  • Random forest with instances predicts the marginalized costs by using a C++ implementation in pyrfr, which is much faster (#903).
  • Add the version to the Makefile to install the correct test release version.

Bugfixes

  • Continue run when setting incumbent selection to highest budget when using Successive Halving (#907).
  • If integer features are used, they are automatically converted to strings.

Workflows

  • Added workflow to update pre-commit versions (#874).

Misc

  • Added benchmarking procedure to compare to previous releases.

Version 2.0.0b1

07 Jan 10:44
0a1dff4
Pre-release
  • Completely reimplemented the intensifiers (including Successive Halving and Hyperband): all intensifiers now natively support multi-fidelity, multi-objective, and multi-threading.
  • Ensured expected behaviour of the ask-and-tell interface (also for Successive Halving).
  • Continuing a run is now fully supported.
  • Added more examples.
  • Updated documentation based on new implementation.
  • Added benchmark to compare different versions.

Version 2.0.0a2

26 Oct 12:01
dd249cd
Pre-release

Bugfixes

  • Fixed random weight (re-)generation in multi-objective algorithms: previously, the weights were regenerated on every call to build_matrix; now they are only regenerated once per iteration.
  • Optimization could get stuck due to deep copying an iterator for a callback: we removed the configuration call from on_next_configurations_end.

Minor

  • Removed the example badge in the README.
  • Added SMAC logo to README.

Version 2.0.0a1

12 Oct 07:13
1b75cec
Pre-release

Big Changes

  • We redesigned the scenario class completely. The scenario is implemented as a dataclass now and holds only environment variables (like limitations or save directory). Everything else was moved to the components directly.
  • We removed runtime optimization completely (no adaptive capping or imputing anymore).
  • We removed the command-line interface and restructured everything accordingly. Since SMAC was built upon the command-line interface (especially in combination with the scenario), it was complicated to understand the behavior or find specific implementations. With the removal, we rewrote everything in Python and re-implemented the feature of using scripts as target functions.
  • Introducing trials: Each config/seed/budget/instance calculation is a trial.
  • The configuration chooser is integrated into the SMBO object now. Therefore, SMBO finally implements an ask-and-tell interface (see the sketch after this list).
  • Facades are redesigned so that they accept instantiated components directly. If a component is not passed, a default component is used, which is specified for each facade individually in the form of static methods. You can use those static methods directly to adapt a component to your choice.
  • A lot of API changes and renamings (e.g., RandomConfigurationChooser -> RandomDesign, Runhistory2EPM -> RunHistoryEncoder).
  • Ambiguous variables are renamed and unified across files.
  • Dependencies of modules are reduced drastically.
  • We incorporated Pynisher 1.0, which ensures limitations cross-platform.
  • We incorporated ConfigSpace 0.6, which simplified our examples.
  • Examples and documentation are completely reworked. Examples use the new ConfigSpace, and the documentation is adapted to version 2.0.
  • Transparent target function signatures: SMAC checks now explicitly if an argument is available (the required arguments are now specified in the intensifier). If there are more arguments that are not passed by SMAC, a warning is raised.
  • Components now implement a meta property, which together describe the initial state of SMAC. The facade collects all metadata and saves the initial state of the scenario.
  • Improved multi-objective support in general: both RunHistory and RunHistoryEncoder incorporate the multi-objective algorithm now. In other words, if the multi-objective algorithm changes the output, it directly affects the optimization process.
  • The configspace is saved in JSON only.
  • StatusType is saved as an integer and not as a dict anymore.
  • We changed the behavior of continuing a run:
    • SMAC automatically checks if a scenario was saved earlier. If there exists a scenario and the initial state is the same, SMAC automatically loads the previous data. However, continuing from that run is not possible yet.
    • If there was a scenario earlier but the initial state is different, the user is asked to either overwrite the run or continue it although the state differs (note that this can only happen if the name specified in the scenario is the same). Alternatively, the suffix -old is appended to the old run (e.g., if the name was test, it becomes test-old).
    • The initial state of the SMAC run also determines the name (if no name is specified in the scenario). If the user changes something in the code base or in the scenario, the name, and therefore the save location, changes automatically.
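
A minimal sketch of the ask-and-tell interface and of swapping a single facade component via the static methods mentioned above. Names such as TrialValue and get_acquisition_function reflect the 2.0 API as understood here and should be checked against the documentation.

```python
from ConfigSpace import ConfigurationSpace
from smac import HyperparameterOptimizationFacade, Scenario
from smac.runhistory.dataclasses import TrialValue


def quadratic(config, seed: int = 0) -> float:
    return (config["x"] - 1.0) ** 2


cs = ConfigurationSpace({"x": (-5.0, 5.0)})
scenario = Scenario(cs, n_trials=30)

# Facades expose their defaults as static methods, so a single component can be
# swapped without re-specifying everything else.
acquisition_function = HyperparameterOptimizationFacade.get_acquisition_function(scenario, xi=0.01)

smac = HyperparameterOptimizationFacade(
    scenario,
    quadratic,
    acquisition_function=acquisition_function,
    overwrite=True,
)

# Ask-and-tell: SMAC proposes a trial, we evaluate it ourselves and report back.
for _ in range(10):
    info = smac.ask()
    cost = quadratic(info.config, seed=info.seed)
    smac.tell(info, TrialValue(cost=cost))

incumbent = smac.optimize()  # optionally hand the remaining trials back to SMAC
```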

New Features

  • Added a new termination feature: Use terminate_cost_threshold in the scenario to stop the optimization after a configuration was evaluated with a cost lower than the threshold (see the sketch after this list).
  • Callbacks are completely redesigned. Callbacks added to the facade are called at different positions in the Bayesian optimization loop.
  • The multi-objective algorithm MeanAggregationStrategy supports objective weights now.
  • RunHistory got more methods like get_incumbent or get_pareto_front.
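
A short sketch of the cost-threshold termination. These notes spell the scenario field terminate_cost_threshold, while later releases appear to use termination_cost_threshold; the spelling below is an assumption, so check your installed version.

```python
from ConfigSpace import ConfigurationSpace
from smac import HyperparameterOptimizationFacade, Scenario


def quadratic(config, seed: int = 0) -> float:
    return (config["x"] - 1.0) ** 2


cs = ConfigurationSpace({"x": (-5.0, 5.0)})

# Stop the optimization as soon as a trial reports a cost below the threshold.
# Field name assumed; see the note above about its spelling.
scenario = Scenario(cs, n_trials=200, termination_cost_threshold=1e-3)

smac = HyperparameterOptimizationFacade(scenario, quadratic, overwrite=True)
incumbent = smac.optimize()
```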

Fixes

  • You ever noticed that the third configuration has no origin? It's fixed now.
  • We fixed ParEGO (it updates every time training is performed now).

Optimization Changes

  • Changed initial design behavior
    • You can add additional configurations now.
    • max_ratio limits both n_configs and n_configs_per_hyperparameter, but not additional configurations.
    • Reduced default max_ratio to 0.1.
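
A sketch of how additional configurations and max_ratio interact in the initial design. The get_initial_design signature and the additional_configs parameter are assumptions based on the 2.0 facade API; the warm-start configuration is hypothetical.

```python
from ConfigSpace import Configuration, ConfigurationSpace
from smac import HyperparameterOptimizationFacade, Scenario


def target(config, seed: int = 0) -> float:
    return config["x"] ** 2 + config["y"] ** 2


cs = ConfigurationSpace({"x": (-5.0, 5.0), "y": (-5.0, 5.0)})
scenario = Scenario(cs, n_trials=100)

# A hand-picked warm-start configuration to evaluate in the initial design.
warm_start = Configuration(cs, values={"x": 0.5, "y": -1.0})

# max_ratio caps the number of sampled initial configurations relative to
# n_trials, but additional_configs are always included on top of that cap.
initial_design = HyperparameterOptimizationFacade.get_initial_design(
    scenario,
    n_configs_per_hyperparameter=10,
    max_ratio=0.1,
    additional_configs=[warm_start],
)

smac = HyperparameterOptimizationFacade(scenario, target, initial_design=initial_design, overwrite=True)
incumbent = smac.optimize()
```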

Code Related

  • Converted all unittests to pytests.
  • Instances, seeds, and budgets can be set to None now. However, mixing None and non-None values will throw an exception.

Version 1.4.0

14 Jul 08:05
83a9bbe

Features

  • BOinG: A two-stage Bayesian optimization approach to allow the
    optimizer to focus on the most promising regions.
  • TurBO: Reimplementation of the TurBO-1 algorithm.
  • Updated pSMAC: Can pass arbitrary SMAC facades now. Added example and fixed tests.

Improvements

  • Enabled caching for multi-objectives (#872). Costs are now normalized in get_cost
    or optionally in average_cost/sum_cost/min_cost to return a single float value. Therefore,
    the cached cost values do not need to be updated every time a new entry is added to the runhistory.

Interface changes

  • We changed the location of Gaussian processes and random forests. They are in the folders
    epm/gaussian_process and epm/random_forest now.
  • Also, we restructured the optimizer folder, which changed the location of the acquisition functions
    and the configuration chooser.
  • Multi-objective functions are located in the folder multi_objective.
  • pSMAC facade was moved to the facade directory.

Version 1.3.4

23 Jun 10:12
99d1129
  • Added reference to JMLR paper.
  • Fixed typos in the documentation.
  • Made the code more readable by importing all typings at the beginning of the file.
  • Updated stale bot options.