Add nbQA for notebook and docs linting #361

Merged
merged 9 commits on Apr 7, 2023
Changes from 6 commits
10 changes: 9 additions & 1 deletion .pre-commit-config.yaml
@@ -11,9 +11,17 @@ repos:
  - repo: https://github.com/psf/black
    rev: 23.3.0
    hooks:
-     - id: black
+     - id: black-jupyter
  - repo: https://github.com/charliermarsh/ruff-pre-commit
    rev: "v0.0.261"
    hooks:
      - id: ruff
        args: ["--fix"]
+ - repo: https://github.com/nbQA-dev/nbQA
+   rev: 1.7.0
+   hooks:
+     - id: nbqa-black
+       additional_dependencies: [jupytext, black]
+     - id: nbqa
+       args: ["ruff", "--fix", "--ignore=E402,B018"]
+       additional_dependencies: [jupytext, ruff]
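Note on the new hooks: nbQA uses jupytext to expose each notebook or MyST cell as plain Python source so black and ruff can process it, and `--ignore=E402,B018` skips two rules that notebooks violate by design: E402 (import not at top of file) and B018 (useless expression, which is how notebooks display values). A minimal sketch for exercising the new hooks locally, assuming the `pre-commit` CLI is installed; the hook ids come from the config above:

```python
# Sketch: run each newly configured hook over the whole repo, as CI would.
# Assumes the pre-commit CLI is installed in the current environment.
import subprocess

for hook in ("black-jupyter", "nbqa-black", "nbqa"):
    subprocess.run(["pre-commit", "run", hook, "--all-files"], check=False)
```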
18 changes: 7 additions & 11 deletions docs/source/algorithms_and_examples.md
@@ -1,15 +1,13 @@
---
-kernelspec:
-  name: python3
-  display_name: python3
jupytext:
  text_representation:
    extension: .md
    format_name: myst
-    format_version: '0.13'
-    jupytext_version: 1.13.8
-execution:
-  timeout: 300
+    format_version: 0.13
+    jupytext_version: 1.14.5
+kernelspec:
+  display_name: python3
+  name: python3
---

```{include} ../../README.md
@@ -102,7 +100,7 @@ def plot_loss_interval(learner):


def plot(learner, npoints):
-    adaptive.runner.simple(learner, npoints_goal= npoints)
+    adaptive.runner.simple(learner, npoints_goal=npoints)
    return (learner.plot() * plot_loss_interval(learner))[:, -1.1:1.1]


@@ -111,6 +109,7 @@ def get_hm(loss_per_interval, N=101):
    plots = {n: plot(learner, n) for n in range(N)}
    return hv.HoloMap(plots, kdims=["npoints"])


plot_homo = get_hm(uniform_loss).relabel("homogeneous sampling")
plot_adaptive = get_hm(default_loss).relabel("with adaptive")
layout = plot_homo + plot_adaptive
@@ -122,7 +121,6 @@ layout.opts(toolbar=None)
```{code-cell} ipython3
:tags: [hide-input]


def ring(xy):
    import numpy as np

@@ -155,7 +153,6 @@ hv.HoloMap(plots, kdims=["npoints"]).collate()
```{code-cell} ipython3
:tags: [hide-input]


def g(n):
    import random

@@ -181,7 +178,6 @@ hv.HoloMap(plots, kdims=["npoints"])
```{code-cell} ipython3
:tags: [hide-input]


def sphere(xyz):
    import numpy as np

7 changes: 2 additions & 5 deletions docs/source/logo.md
@@ -4,19 +4,16 @@ jupytext:
    extension: .md
    format_name: myst
    format_version: 0.13
-    jupytext_version: 1.14.1
+    jupytext_version: 1.14.5
kernelspec:
  display_name: Python 3 (ipykernel)
  language: python
  name: python3
-execution:
-  timeout: 300
---

```{code-cell} ipython3
:tags: [remove-input]

-import os
import functools
import subprocess
import tempfile
@@ -75,7 +72,7 @@ def remove_rounded_corners(fname):

def learner_till(till, learner, data):
    new_learner = adaptive.Learner2D(None, bounds=learner.bounds)
-    new_learner.data = {k: v for k, v in data[:till]}
+    new_learner.data = dict(data[:till])
    for x, y in learner._bounds_points:
        # always include the bounds
        new_learner.tell((x, y), learner.data[x, y])
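The `learner_till` change above is behavior-preserving: `dict` accepts any iterable of key-value pairs, so the identity comprehension was redundant. A quick illustration with made-up points:

```python
# dict() consumes an iterable of (key, value) pairs directly, so
# {k: v for k, v in pairs} and dict(pairs) build the same mapping.
pairs = [((0.0, 0.0), 1.0), ((0.5, 0.5), 2.0)]
assert {k: v for k, v in pairs} == dict(pairs)
```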
16 changes: 9 additions & 7 deletions docs/source/tutorial/tutorial.BalancingLearner.md
@@ -1,14 +1,15 @@
---
-kernelspec:
-  name: python3
-  display_name: python3
jupytext:
  text_representation:
    extension: .md
    format_name: myst
-    format_version: '0.13'
-    jupytext_version: 1.13.8
+    format_version: 0.13
+    jupytext_version: 1.14.5
+kernelspec:
+  display_name: python3
+  name: python3
---

# Tutorial {class}`~adaptive.BalancingLearner`

```{note}
@@ -60,7 +61,8 @@ runner.live_info()
```

```{code-cell} ipython3
-plotter = lambda learner: hv.Overlay([L.plot() for L in learner.learners])
+def plotter(learner):
+    return hv.Overlay([L.plot() for L in learner.learners])
runner.live_plot(plotter=plotter, update_interval=0.1)
```

@@ -83,7 +85,7 @@ combos = {
}

learner = adaptive.BalancingLearner.from_product(
-    jacobi, adaptive.Learner1D, dict(bounds=(0, 1)), combos
+    jacobi, adaptive.Learner1D, {"bounds": (0, 1)}, combos
)

runner = adaptive.BlockingRunner(learner, loss_goal=0.01)
15 changes: 7 additions & 8 deletions docs/source/tutorial/tutorial.IntegratorLearner.md
@@ -1,14 +1,15 @@
---
-kernelspec:
-  name: python3
-  display_name: python3
jupytext:
  text_representation:
    extension: .md
    format_name: myst
-    format_version: '0.13'
-    jupytext_version: 1.13.8
+    format_version: 0.13
+    jupytext_version: 1.14.5
+kernelspec:
+  display_name: python3
+  name: python3
---

# Tutorial {class}`~adaptive.IntegratorLearner`

```{note}
@@ -60,9 +61,7 @@ learner = adaptive.IntegratorLearner(f24, bounds=(0, 3), tol=1e-8)
# We use a SequentialExecutor, which runs the function to be learned in
# *this* process only. This means we don't pay
# the overhead of evaluating the function in another process.
-runner = adaptive.Runner(
-    learner, executor=SequentialExecutor()
-)
+runner = adaptive.Runner(learner, executor=SequentialExecutor())
```
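For context on the comment in the cell above: a sequential executor satisfies the `concurrent.futures.Executor` interface the runner expects, but evaluates every submitted call synchronously in the calling process. A rough sketch of the idea (illustrative only, not adaptive's actual implementation):

```python
from concurrent.futures import Executor, Future


class TrivialSequentialExecutor(Executor):
    """Evaluate each submitted call immediately, in this process."""

    def submit(self, fn, *args, **kwargs):
        future = Future()
        try:
            future.set_result(fn(*args, **kwargs))
        except BaseException as exc:
            future.set_exception(exc)
        return future
```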

```{code-cell} ipython3
19 changes: 13 additions & 6 deletions docs/source/tutorial/tutorial.Learner1D.md
@@ -1,14 +1,15 @@
---
-kernelspec:
-  name: python3
-  display_name: python3
jupytext:
  text_representation:
    extension: .md
    format_name: myst
-    format_version: '0.13'
-    jupytext_version: 1.13.8
+    format_version: 0.13
+    jupytext_version: 1.14.5
+kernelspec:
+  display_name: python3
+  name: python3
---

(TutorialLearner1D)=
# Tutorial {class}`~adaptive.Learner1D`

@@ -112,6 +113,8 @@ random.seed(0)
offsets = [random.uniform(-0.8, 0.8) for _ in range(3)]

# sharp peaks at random locations in the domain


def f_levels(x, offsets=offsets):
    a = 0.01
    return np.array(
@@ -124,7 +127,9 @@ The `Learner1D` can be used for such functions:

```{code-cell} ipython3
learner = adaptive.Learner1D(f_levels, bounds=(-1, 1))
-runner = adaptive.Runner(learner, loss_goal=0.01) # continue until `learner.loss()<=0.01`
+runner = adaptive.Runner(
+    learner, loss_goal=0.01
+)  # continue until `learner.loss()<=0.01`
```

```{code-cell} ipython3
@@ -211,12 +216,14 @@ learner.to_numpy()
```

If Pandas is installed (optional dependency), you can also run

```{code-cell} ipython3
df = learner.to_dataframe()
df
```

and load that data into a new learner with

```{code-cell} ipython3
new_learner = adaptive.Learner1D(learner.function, (-1, 1)) # create an empty learner
new_learner.load_dataframe(df) # load the pandas.DataFrame's data
12 changes: 7 additions & 5 deletions docs/source/tutorial/tutorial.Learner2D.md
@@ -1,14 +1,15 @@
---
-kernelspec:
-  name: python3
-  display_name: python3
jupytext:
  text_representation:
    extension: .md
    format_name: myst
-    format_version: '0.13'
-    jupytext_version: 1.13.8
+    format_version: 0.13
+    jupytext_version: 1.14.5
+kernelspec:
+  display_name: python3
+  name: python3
---

# Tutorial {class}`~adaptive.Learner2D`

```{note}
@@ -24,6 +25,7 @@ import holoviews as hv
import numpy as np

from functools import partial

adaptive.notebook_extension()
```

14 changes: 7 additions & 7 deletions docs/source/tutorial/tutorial.LearnerND.md
@@ -1,16 +1,15 @@
---
-kernelspec:
-  name: python3
-  display_name: python3
jupytext:
  text_representation:
    extension: .md
    format_name: myst
-    format_version: '0.13'
-    jupytext_version: 1.13.8
-execution:
-  timeout: 300
+    format_version: 0.13
+    jupytext_version: 1.14.5
+kernelspec:
+  display_name: python3
+  name: python3
---

# Tutorial {class}`~adaptive.LearnerND`

```{note}
@@ -111,6 +110,7 @@ You could use the following code as an example:
```{code-cell} ipython3
import scipy


def f(xyz):
    x, y, z = xyz
    return x**4 + y**4 + z**4 - (x**2 + y**2 + z**2) ** 2
24 changes: 13 additions & 11 deletions docs/source/tutorial/tutorial.advanced-topics.md
@@ -1,14 +1,15 @@
---
-kernelspec:
-  name: python3
-  display_name: python3
jupytext:
  text_representation:
    extension: .md
    format_name: myst
-    format_version: '0.13'
-    jupytext_version: 1.13.8
+    format_version: 0.13
+    jupytext_version: 1.14.5
+kernelspec:
+  display_name: python3
+  name: python3
---

# Advanced Topics

```{note}
@@ -24,7 +25,6 @@ import adaptive
adaptive.notebook_extension()

import asyncio
-from functools import partial
import random

offset = random.uniform(-0.5, 0.5)
@@ -92,7 +92,7 @@ def slow_f(x):
learner = adaptive.Learner1D(slow_f, bounds=[0, 1])
runner = adaptive.Runner(learner, npoints_goal=100)
runner.start_periodic_saving(
-    save_kwargs=dict(fname="data/periodic_example.p"), interval=6
+    save_kwargs={"fname": "data/periodic_example.p"}, interval=6
)
```
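As a usage note for the cell above (not part of this diff): the periodically saved pickle can later be restored into a fresh learner. A sketch, assuming the `save`/`load` methods that adaptive learners provide:

```python
# Sketch: rebuild a learner from the periodically saved file.
# Assumes adaptive's learner save/load API.
new_learner = adaptive.Learner1D(slow_f, bounds=[0, 1])
new_learner.load("data/periodic_example.p")
```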

@@ -168,9 +168,7 @@ If you want to enable determinism, want to continue using the non-blocking {clas
from adaptive.runner import SequentialExecutor

learner = adaptive.Learner1D(f, bounds=(-1, 1))
-runner = adaptive.Runner(
-    learner, executor=SequentialExecutor(), loss_goal=0.01
-)
+runner = adaptive.Runner(learner, executor=SequentialExecutor(), loss_goal=0.01)
```

```{code-cell} ipython3
@@ -275,6 +273,7 @@ If the runner stopped due to an exception then asking for the result will raise

```{code-cell} ipython3
:tags: [raises-exception]

runner.task.result()
```

@@ -380,6 +379,7 @@ a slow part `g` which can be reused by multiple inputs and shared across functio
```{code-cell} ipython3
import time


def f(x):
    """
    Integer part of `x` repeats and should be reused
@@ -407,9 +407,10 @@ from dask import delayed
# Convert g and h to dask.Delayed objects
g, h = delayed(g), delayed(h)


@delayed
def f(x, y):
-    return (x + y)**2
+    return (x + y) ** 2
```

Next we define a computation using coroutines such that it reuses previously submitted tasks.
Expand All @@ -421,6 +422,7 @@ client = await Client(asynchronous=True)

g_futures = {}


async def f_parallel(x):
    # Get or submit the slow function future
    if (g_future := g_futures.get(int(x))) is None:
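The last hunk is cut off in this view, but the pattern it begins is simple to sketch with plain asyncio: cache the slow part's task keyed on `int(x)` so inputs with the same integer part share one computation. Names and timings below are illustrative, not the tutorial's exact code:

```python
import asyncio

g_tasks: dict[int, asyncio.Task] = {}


async def g(n):
    await asyncio.sleep(1)  # stand-in for the slow, reusable part
    return n**2


async def f_parallel(x):
    # Get or submit the slow task; equal int(x) inputs share one computation.
    if (task := g_tasks.get(int(x))) is None:
        task = g_tasks[int(x)] = asyncio.ensure_future(g(int(x)))
    return await task


async def main():
    # 0.0 and 0.5 share int(x) == 0, so g runs only twice for three inputs.
    return await asyncio.gather(*(f_parallel(x) for x in (0.0, 0.5, 1.5)))


print(asyncio.run(main()))
```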