Commit d2588cf

Update model_benchmarking.md repo name
1 parent 2305135 commit d2588cf

1 file changed: +5 -5 lines changed

docs/model_benchmarking.md

Lines changed: 5 additions & 5 deletions
@@ -2,11 +2,11 @@
Here we will describe a scenario in which users submit different models to be applied to common data and compare the results. For this we will leverage GitHub's core features to facilitate code versioning and collaborative development, and we will set up a GitHub Actions configuration which triggers the evaluation when a user creates a `pull request` with a new version of the model and updates a table with the user's results and the corresponding commit number.

-We will use a simple approach to approximate the number of ships passing during a time window by counting the number of peaks that appear above a threshold in the broadband plot. The threshold is set in the [`model_benchmarking.py`](https://github.com/uwescience/SciPy2024-GitHubActionsTutorial/blob/main/ambient_sound_analysis/model_benchmarking.py) script.
+We will use a simple approach to approximate the number of ships passing during a time window by counting the number of peaks that appear above a threshold in the broadband plot. The threshold is set in the [`model_benchmarking.py`](https://github.com/uwescience/GitHubActionsTutorial-USRSE24/blob/main/ambient_sound_analysis/model_benchmarking.py) script.

## Model Versioning Workflow
-The workflow which triggers the model evaluation is in [`model_benchmarking.yml`](https://github.com/uwescience/SciPy2024-GitHubActionsTutorial/blob/main/.github/workflows/model_benchmarking.yml). It consists of the following steps:
+The workflow which triggers the model evaluation is in [`model_benchmarking.yml`](https://github.com/uwescience/GitHubActionsTutorial-USRSE24/blob/main/.github/workflows/model_benchmarking.yml). It consists of the following steps:

1. it gets triggered on `pull_request`
   * `synchronize` type ensures it gets triggered when somebody updates an existing pull request
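As a rough illustration of the peak-counting approach described in the first hunk above, here is a minimal sketch; it is not the code in `model_benchmarking.py`, and the function name, input array, and use of `scipy.signal.find_peaks` are assumptions.

```python
# Hypothetical sketch of the peak-counting idea; the real logic lives in
# ambient_sound_analysis/model_benchmarking.py and may differ.
import numpy as np
from scipy.signal import find_peaks

def estimate_ship_count(broadband_db: np.ndarray, threshold: float) -> int:
    """Approximate the number of ships passing during a time window by
    counting the peaks in the broadband level series that rise above threshold."""
    peaks, _ = find_peaks(broadband_db, height=threshold)
    return len(peaks)

# Example: a noisy series with three clear peaks above the threshold.
rng = np.random.default_rng(0)
series = rng.normal(0.0, 0.1, size=300)
series[[50, 150, 250]] += 5.0
print(estimate_ship_count(series, threshold=3.0))  # expected: 3
```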
@@ -18,7 +18,7 @@ The workflow which triggers the model evaluation is in [`model_benchmarking.yml`
## Model Benchmarking Workflow

-The next workflow follows the steps of the `create_website_spectrogram` workflow, which converts a notebook, [`display_benchmarks`](https://github.com/uwescience/SciPy2024-GitHubActionsTutorial/blob/main/ambient_sound_analysis/display_benchmarks.ipynb), to a website. In this case, we have a very simple notebook which reads all `score_[SHA].csv` files and displays a "benchmark table" with the individual entries. This notebook is converted to a webpage ([https://uwescience.github.io/SciPy2024-GitHubActionsTutorial/display_benchmarks.html](https://uwescience.github.io/SciPy2024-GitHubActionsTutorial/display_benchmarks.html/)).
+The next workflow follows the steps of the `create_website_spectrogram` workflow, which converts a notebook, [`display_benchmarks`](https://github.com/uwescience/GitHubActionsTutorial-USRSE24/blob/main/ambient_sound_analysis/display_benchmarks.ipynb), to a website. In this case, we have a very simple notebook which reads all `score_[SHA].csv` files and displays a "benchmark table" with the individual entries. This notebook is converted to a webpage ([https://uwescience.github.io/GitHubActionsTutorial-USRSE24/display_benchmarks.html](https://uwescience.github.io/GitHubActionsTutorial-USRSE24/display_benchmarks.html/)).

### Exercise

@@ -29,7 +29,7 @@ Create a branch and update the `model_versioning.py` file with a different thres
threshold = ??
```

-Submit a pull request from this branch to main and monitor the execution of the workflows. Check out the generated website at [https://uwescience.github.io/SciPy2024-GitHubActionsTutorial/display_benchmarks.html](https://uwescience.github.io/SciPy2024-GitHubActionsTutorial/display_benchmarks.html/).
+Submit a pull request from this branch to main and monitor the execution of the workflows. Check out the generated website at [https://uwescience.github.io/GitHubActionsTutorial-USRSE24/display_benchmarks.html](https://uwescience.github.io/GitHubActionsTutorial-USRSE24/display_benchmarks.html/).

@@ -42,4 +42,4 @@ Submit a pull request from this branch to main and monitor the execution of the
-
+
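For context on the "benchmark table" mentioned in the diff above, here is a minimal sketch of what a notebook like `display_benchmarks` could do to gather the per-commit scores. The column handling, file locations, and use of `pandas` are assumptions rather than details taken from the repository.

```python
# Hypothetical sketch: collect every score_[SHA].csv into one benchmark table.
# Assumes the score files sit in the working directory and at least one exists.
from pathlib import Path
import pandas as pd

rows = []
for csv_path in sorted(Path(".").glob("score_*.csv")):
    scores = pd.read_csv(csv_path)
    scores["commit"] = csv_path.stem.removeprefix("score_")  # the [SHA] part of the file name
    rows.append(scores)

# Concatenate into a single table; in the notebook this would simply be
# displayed as the cell output and then rendered on the website.
benchmarks = pd.concat(rows, ignore_index=True)
print(benchmarks)
```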