@@ -27,10 +27,6 @@ to see examples of how to use ``adaptive`` or visit the
 
 .. summary-end
 
-**WARNING: adaptive is still in a beta development stage**
-
-.. not-in-documentation-start
-
 Implemented algorithms
 ----------------------
 
@@ -46,6 +42,8 @@ but the details of the adaptive sampling are completely customizable.
 
 The following learners are implemented:
 
+.. not-in-documentation-start
+
 - ``Learner1D``, for 1D functions ``f: ℝ → ℝ^N``,
 - ``Learner2D``, for 2D functions ``f: ℝ^2 → ℝ^N``,
 - ``LearnerND``, for ND functions ``f: ℝ^N → ℝ^M``,
@@ -54,10 +52,16 @@ The following learners are implemented:
 - ``AverageLearner1D``, for stochastic 1D functions where you want to
   estimate the mean value of the function at each point,
 - ``IntegratorLearner``, for
-  when you want to intergrate a 1D function ``f: ℝ → ℝ``,
+  when you want to integrate a 1D function ``f: ℝ → ℝ``.
 - ``BalancingLearner``, for when you want to run several learners at once,
   selecting the “best” one each time you get more points.
 
+Meta-learners (to be used with other learners):
+
+- ``BalancingLearner``, for when you want to run several learners at once,
+  selecting the “best” one each time you get more points,
+- ``DataSaver``, for when your function doesn't just return a scalar or a vector.
+
 In addition to the learners, ``adaptive`` also provides primitives for
 running the sampling across several cores and even several machines,
 with built-in support for
@@ -67,7 +71,6 @@ with built-in support for
 `ipyparallel <https://ipyparallel.readthedocs.io/en/latest/>`_ and
 `distributed <https://distributed.readthedocs.io/en/latest/>`_.
 
-
 Examples
 --------
 
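The adaptive-sampling idea behind these learners (ask for the most informative next point, evaluate, update) can be sketched in plain Python. The snippet below is a toy illustration only, not ``adaptive``'s actual algorithm or API: a crude interval "loss" (here ``|Δy| + |Δx|``) decides where to evaluate next, so points cluster where the function varies most.

```python
# Toy sketch of 1D adaptive sampling, in the spirit of Learner1D.
# NOT adaptive's real implementation: losses, data structures, and the
# point-selection rule are all simplified assumptions for illustration.

def adaptive_sample(f, a, b, n):
    """Sample f on [a, b] with n points, refining the 'worst' interval."""
    xs = [a, b]
    ys = [f(a), f(b)]
    while len(xs) < n:
        # Loss of each interval: |Δy| + |Δx|, a crude proxy for how
        # badly a straight line approximates f on that interval.
        losses = [abs(ys[i + 1] - ys[i]) + abs(xs[i + 1] - xs[i])
                  for i in range(len(xs) - 1)]
        i = max(range(len(losses)), key=losses.__getitem__)
        # Bisect the highest-loss interval and evaluate f there.
        xm = (xs[i] + xs[i + 1]) / 2
        xs.insert(i + 1, xm)
        ys.insert(i + 1, f(xm))
    return xs, ys

xs, ys = adaptive_sample(lambda x: x ** 3, -1.0, 1.0, 20)
```

In the real library a learner only *suggests* points, and a separate runner evaluates them (possibly in parallel via the executors listed above), which is what makes the sampling easy to distribute across cores.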