<!-- Please refer to [ray.tune](https://docs.ray.io/en/latest/tune/api_docs/search_space.html#overview) for a more comprehensive introduction about possible choices of the domain. -->
#### Hierarchical search space

A hierarchical (or conditional) search space allows you to define hyperparameters that depend on the value of other hyperparameters. This is useful when different choices for a categorical hyperparameter require different sets of hyperparameters.

For example, you might be tuning a machine learning pipeline in which different models require different hyperparameters, or a setup in which the choice of optimizer determines which optimizer-specific hyperparameters are relevant.

**Syntax**: To create a hierarchical search space, use `tune.choice()` with a list where some elements are dictionaries containing nested hyperparameter definitions.

**Example 1: Model selection with model-specific hyperparameters**

In this example, we have two model types (linear and tree-based), each with its own specific hyperparameters:

```python
from flaml import tune

search_space = {
    "model": tune.choice(
        [
            {
                "model_type": "linear",
                "learning_rate": tune.loguniform(1e-4, 1e-1),
                "regularization": tune.uniform(0, 1),
            },
            {
                "model_type": "tree",
                "n_estimators": tune.randint(10, 100),
                "max_depth": tune.randint(3, 10),
            },
        ]
    ),
    # Common hyperparameters for all models
    "batch_size": tune.choice([32, 64, 128]),
}


def evaluate_config(config):
    model_config = config["model"]
    if model_config["model_type"] == "linear":
        # Use learning_rate and regularization
        # train_linear_model() is a placeholder for your actual training code
        score = train_linear_model(
            lr=model_config["learning_rate"],
            reg=model_config["regularization"],
            batch_size=config["batch_size"],
        )
    else:  # tree
        # Use n_estimators and max_depth
        # train_tree_model() is a placeholder for your actual training code
        score = train_tree_model(
            n_est=model_config["n_estimators"],
            depth=model_config["max_depth"],
            batch_size=config["batch_size"],
        )
    return {"score": score}


# Run tuning
analysis = tune.run(
    evaluate_config,
    config=search_space,
    metric="score",
    mode="min",
    num_samples=20,
)
```

**Example 2: Mixed choices with constants and nested spaces**

You can also mix constant values with nested hyperparameter spaces in `tune.choice()`:
- When a configuration is sampled, only the selected branch of the hierarchical space will be active.
- The evaluation function should check which choice was selected and use the corresponding nested hyperparameters.
- Hierarchical search spaces work with all FLAML search algorithms (CFO, BlendSearch).
- You can specify `low_cost_partial_config` for hierarchical spaces as well by providing the path to the nested parameters.

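To illustrate the last point with the Example 1 space above, a low-cost starting configuration might look like the following sketch; the exact nesting convention FLAML expects for hierarchical spaces is an assumption here and should be checked against your installed version:

```python
# Low-cost starting values mirroring the structure of the "model" choice
# in Example 1: pick the cheap branch (few, shallow trees) as the start.
# (The nested-dict form is an assumption; verify against your FLAML version.)
low_cost_partial_config = {
    "model": {
        "model_type": "tree",
        "n_estimators": 10,  # lower bound: fewest trees
        "max_depth": 3,      # lower bound: shallowest trees
    },
    "batch_size": 32,
}

# It would then be passed to tune.run alongside the search space, e.g.:
# analysis = tune.run(
#     evaluate_config,
#     config=search_space,
#     metric="score",
#     mode="min",
#     low_cost_partial_config=low_cost_partial_config,
#     num_samples=20,
# )
```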
#### Cost-related hyperparameters
Cost-related hyperparameters are the subset of hyperparameters that directly affect the computational cost of evaluating a configuration. For example, the number of estimators (`n_estimators`) and the maximum number of leaves (`max_leaves`) are known to affect the training cost of tree-based learners, so they are cost-related hyperparameters for those learners.