@@ -25,10 +25,6 @@ to see examples of how to use ``adaptive`` or visit the
 
 .. summary-end
 
-**WARNING: adaptive is still in a beta development stage**
-
-.. not-in-documentation-start
-
 Implemented algorithms
 ----------------------
 
@@ -44,6 +40,8 @@ but the details of the adaptive sampling are completely customizable.
 
 The following learners are implemented:
 
+.. not-in-documentation-start
+
 - ``Learner1D``, for 1D functions ``f: ℝ → ℝ^N``,
 - ``Learner2D``, for 2D functions ``f: ℝ^2 → ℝ^N``,
 - ``LearnerND``, for ND functions ``f: ℝ^N → ℝ^M``,
@@ -52,10 +50,16 @@ The following learners are implemented:
 - ``AverageLearner1D``, for stochastic 1D functions where you want to
   estimate the mean value of the function at each point,
 - ``IntegratorLearner``, for
-  when you want to intergrate a 1D function ``f: ℝ → ℝ``,
+  when you want to integrate a 1D function ``f: ℝ → ℝ``.
 - ``BalancingLearner``, for when you want to run several learners at once,
   selecting the “best” one each time you get more points.
 
+Meta-learners (to be used with other learners):
+
+- ``BalancingLearner``, for when you want to run several learners at once,
+  selecting the “best” one each time you get more points,
+- ``DataSaver``, for when your function doesn't just return a scalar or a vector.
+
 In addition to the learners, ``adaptive`` also provides primitives for
 running the sampling across several cores and even several machines,
 with built-in support for
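The learners above all share one idea: pick the next sample point where it is expected to reduce a "loss" the most, so that points cluster where the function varies quickly. As a toy sketch of that idea only (this is not ``adaptive``'s actual ``Learner1D`` algorithm or API; the function name and loss choice here are illustrative), a greedy 1D sampler might look like:

```python
import math

def adaptive_sample_1d(f, a, b, n_points=30):
    """Toy adaptive sampler: repeatedly bisect the interval whose graph
    segment is longest, so points cluster where f changes quickly.
    Illustrative only, not the algorithm used by ``adaptive``."""
    xs = [a, b]
    ys = [f(a), f(b)]
    while len(xs) < n_points:
        # "Loss" of each interval: Euclidean length of its graph segment.
        losses = [math.hypot(xs[i + 1] - xs[i], ys[i + 1] - ys[i])
                  for i in range(len(xs) - 1)]
        i = losses.index(max(losses))     # interval with the largest loss
        x_mid = (xs[i] + xs[i + 1]) / 2   # bisect it
        xs.insert(i + 1, x_mid)
        ys.insert(i + 1, f(x_mid))
    return xs, ys

# A step-like function: almost all variation happens near x = 0,
# so the sampler should concentrate its points there.
xs, ys = adaptive_sample_1d(lambda x: math.tanh(20 * x), -1.0, 1.0)
near_step = sum(1 for x in xs if abs(x) < 0.25)
print(f"{near_step} of {len(xs)} points lie within 0.25 of the step")
```

With a uniform grid, only about a quarter of the points would land in that window; the greedy loss-based refinement places roughly half of them there instead.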