Tutorial `~adaptive.AverageLearner1D`
-------------------------------------

.. note::
    Because this documentation consists of static html, the ``live_plot``
    and ``live_info`` widgets are not live. Download the notebook in order
    to see the real behaviour.
    import adaptive
    adaptive.notebook_extension()

    import holoviews as hv
    import numpy as np
    from functools import partial

General use
...........

First, we define the (noisy) function to be sampled. Note that the parameter
``sigma`` sets the standard deviation of the Gaussian noise added to each sample.

.. jupyter-execute::

    def f(x, sigma=0, peak_width=0.05, offset=-0.5):
        y = x ** 3 - x + 3 * peak_width ** 2 / (peak_width ** 2 + (x - offset) ** 2)
        noise = np.random.normal(0, sigma)
        return y + noise

This is how the function looks in the absence of noise:

.. jupyter-execute::

    xs = np.linspace(-2, 2, 500)
    ys = f(xs, sigma=0)
    hv.Path((xs, ys))

And an example of a single realization of the noisy function:

.. jupyter-execute::

    ys = [f(x, sigma=1) for x in xs]
    hv.Path((xs, ys))

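Averaging repeated samples at one point recovers the underlying value. A minimal plain-Python sketch of this (``f_noisy`` is a hypothetical stand-in for ``f`` above, keeping only the cubic term plus Gaussian noise):

```python
import random
import statistics

def f_noisy(x, sigma=1.0):
    # Hypothetical stand-in for f: cubic term plus Gaussian noise, no peak.
    return x ** 3 - x + random.gauss(0, sigma)

random.seed(0)  # fixed seed, for reproducibility
x = 0.5
samples = [f_noisy(x) for _ in range(10_000)]
mean = statistics.fmean(samples)
# The standard error of the mean shrinks like sigma / sqrt(n).
sem = statistics.stdev(samples) / len(samples) ** 0.5
```

With 10,000 samples the estimate lands within a few multiples of ``sem`` (about 0.01 here) of the noise-free value ``0.5 ** 3 - 0.5``. This is exactly the averaging the learner automates, on top of choosing where to sample.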
To obtain an estimate of the mean value of the function at each point ``x``, we
take many samples at ``x`` and calculate the sample mean.

We start by initializing a 1D average learner:

.. jupyter-execute::

    learner = adaptive.AverageLearner1D(partial(f, sigma=1), bounds=(-2, 2))

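Note that ``partial(f, sigma=1)`` only pre-binds the noise level, so the learner receives an ordinary one-argument function of ``x``. A small self-contained illustration (``g`` is a hypothetical stand-in mirroring ``f``'s signature):

```python
from functools import partial
import random

def g(x, sigma=0):
    # Hypothetical stand-in mirroring f's signature: value plus optional noise.
    return x ** 3 - x + random.gauss(0, sigma)

noisy_g = partial(g, sigma=1)  # a one-argument function of x, as the learner sees it
exact_g = partial(g, sigma=0)  # sigma=0 makes it deterministic

value = exact_g(0.5)  # equals 0.5 ** 3 - 0.5
```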
As with other types of learners, we need to initialize a runner with a certain
goal to run our learner. In this case, we set 10000 samples as the goal (the
second condition ensures that we have at least 20 samples at each point):

.. jupyter-execute::

    def goal(total_samples):
        def _goal(learner):
            min_samples = min(learner._number_samples.values())
            return learner.total_samples >= total_samples and min_samples >= 20
        return _goal

    runner = adaptive.Runner(learner, goal=goal(10_000))

.. jupyter-execute::
    :hide-code:

    await runner.task  # This is not needed in a notebook environment!

.. jupyter-execute::

    runner.live_info()
    runner.live_plot(update_interval=0.1)

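The ``goal`` factory above is an ordinary closure: the runner simply calls the returned predicate repeatedly and stops once it returns ``True``. That contract can be sketched with a stand-in learner (``FakeLearner`` is invented for illustration; it only mimics the two attributes the predicate reads):

```python
def goal(total_samples, min_per_point=20):
    def _goal(learner):
        min_samples = min(learner._number_samples.values())
        return learner.total_samples >= total_samples and min_samples >= min_per_point
    return _goal

class FakeLearner:
    # Stand-in exposing the two attributes the predicate reads.
    def __init__(self):
        self.total_samples = 0
        self._number_samples = {0.0: 0}

    def sample(self, x=0.0):
        self.total_samples += 1
        self._number_samples[x] = self._number_samples.get(x, 0) + 1

fake = FakeLearner()
done = goal(100)
while not done(fake):  # what the runner does, in spirit
    fake.sample()
```

Since ``_number_samples`` is a private attribute, a stopping condition written this way may need updating across adaptive versions.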
Fine tuning
...........

In some cases, the default configuration of the 1D average learner can be
sub-optimal, and one can then tune the learner's internal parameters. For
instance, we can decrease ``delta`` and set ``min_error`` if we do not require
accuracy beyond this value:

.. jupyter-execute::

    learner.delta = 0.1
    learner.min_error = 0.05
    runner = adaptive.Runner(learner, goal=goal(20_000))

.. jupyter-execute::
    :hide-code:

    await runner.task  # This is not needed in a notebook environment!

.. jupyter-execute::

    runner.live_info()
    runner.live_plot(update_interval=0.1)

On the contrary, if we want to push forward the "exploration", we can set a
larger ``delta`` and limit the maximum number of samples taken at each point:

.. jupyter-execute::

    learner.delta = 0.3
    learner.max_samples = 1000

    runner = adaptive.Runner(learner, goal=goal(25_000))

.. jupyter-execute::
    :hide-code:

    await runner.task  # This is not needed in a notebook environment!

.. jupyter-execute::

    runner.live_info()
    runner.live_plot(update_interval=0.1)
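As a back-of-the-envelope check on this trade-off: lowering the per-point sample budget (what a small ``max_samples`` does under a fixed total budget) makes each point's mean noisier. A plain-Python Monte-Carlo sketch, independent of adaptive:

```python
import random
import statistics

random.seed(1)  # fixed seed, for reproducibility

def mean_error(samples_per_point, sigma=1.0, trials=2000):
    # Empirical standard error of a sample mean built from n noisy samples.
    means = [
        statistics.fmean(random.gauss(0, sigma) for _ in range(samples_per_point))
        for _ in range(trials)
    ]
    return statistics.stdev(means)

few = mean_error(samples_per_point=20)    # exploration: roughly sigma / sqrt(20)
many = mean_error(samples_per_point=200)  # exploitation: roughly sigma / sqrt(200)
```

More samples per point gives more accurate means at fewer points; fewer samples per point covers more of the domain with noisier estimates.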