
Commit 9d9421e

Merge branch 'main' into boland

- move boland enhancement what's new entry from v0.9.0 to v0.9.5
- move boland sphinx doc entry from api.rst (deleted) to reference/irradiance/decomposition

2 parents: 2faf9ab + f4d7c6e

243 files changed: +25,928 lines added, -6,626 lines removed


.coveragerc

Lines changed: 1 addition & 1 deletion

@@ -1,2 +1,2 @@
 [run]
-omit = pvlib/_version.py
+omit = pvlib/version.py

.gitattributes

Lines changed: 0 additions & 2 deletions

@@ -1,4 +1,2 @@
-pvlib/version.py export-subst
-
 # reduce the number of merge conflicts
 docs/sphinx/source/whatsnew/* merge=union

.github/PULL_REQUEST_TEMPLATE.md

Lines changed: 3 additions & 3 deletions

@@ -3,10 +3,10 @@
 - [ ] Closes #xxxx
 - [ ] I am familiar with the [contributing guidelines](https://pvlib-python.readthedocs.io/en/latest/contributing.html)
 - [ ] Tests added
-- [ ] Updates entries to [`docs/sphinx/source/api.rst`](https://github.com/pvlib/pvlib-python/blob/master/docs/sphinx/source/api.rst) for API changes.
-- [ ] Adds description and name entries in the appropriate "what's new" file in [`docs/sphinx/source/whatsnew`](https://github.com/pvlib/pvlib-python/tree/master/docs/sphinx/source/whatsnew) for all changes. Includes link to the GitHub Issue with `` :issue:`num` `` or this Pull Request with `` :pull:`num` ``. Includes contributor name and/or GitHub username (link with `` :ghuser:`user` ``).
+- [ ] Updates entries in [`docs/sphinx/source/reference`](https://github.com/pvlib/pvlib-python/blob/main/docs/sphinx/source/reference) for API changes.
+- [ ] Adds description and name entries in the appropriate "what's new" file in [`docs/sphinx/source/whatsnew`](https://github.com/pvlib/pvlib-python/tree/main/docs/sphinx/source/whatsnew) for all changes. Includes link to the GitHub Issue with `` :issue:`num` `` or this Pull Request with `` :pull:`num` ``. Includes contributor name and/or GitHub username (link with `` :ghuser:`user` ``).
 - [ ] New code is fully documented. Includes [numpydoc](https://numpydoc.readthedocs.io/en/latest/format.html) compliant docstrings, examples, and comments where necessary.
 - [ ] Pull request is nearly complete and ready for detailed review.
-- [ ] Maintainer: Appropriate GitHub Labels and Milestone are assigned to the Pull Request and linked Issue.
+- [ ] Maintainer: Appropriate GitHub Labels (including `remote-data`) and Milestone are assigned to the Pull Request and linked Issue.

 <!-- Brief description of the problem and proposed solution (if not already fully described in the issue linked to above): -->

.github/workflows/asv_check.yml

Lines changed: 35 additions & 0 deletions (new file)

@@ -0,0 +1,35 @@
name: asv

# CI ASV CHECK is aimed to verify that the benchmarks execute without error.
on:
  push:
    branches:
      - main
  pull_request:


jobs:
  quick-benchmarks:
    runs-on: ubuntu-latest
    defaults:
      run:
        shell: bash -el {0}

    steps:
      - uses: actions/checkout@v3
        with:
          fetch-depth: 0

      - name: Install Python
        uses: actions/setup-python@v3
        with:
          python-version: '3.9'

      - name: Install asv
        run: pip install asv==0.4.2

      - name: Run asv benchmarks
        run: |
          cd benchmarks
          asv machine --yes
          asv run HEAD^! --quick --dry-run --show-stderr
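
For contributors, the same quick check can be reproduced locally before pushing. The following is a minimal sketch, assumed to be run from the repository root with the benchmarks/ directory present; the commands mirror the workflow steps above rather than adding anything new:

    # install the asv version pinned by the workflow
    pip install asv==0.4.2

    # record machine info non-interactively, then do a quick single-commit run
    cd benchmarks
    asv machine --yes
    asv run HEAD^! --quick --dry-run --show-stderr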

.github/workflows/publish.yml

Lines changed: 9 additions & 6 deletions

@@ -1,11 +1,12 @@
 name: Publish distributions to PyPI

-# if this workflow is modified to be a generic CI workflow then
-# add an if statement to the publish step so it only runs on tags.
 on:
+  pull_request:
   push:
+    branches:
+      - main
     tags:
-    - "v*"
+      - "v*"

 jobs:
   build-n-publish:
@@ -26,13 +27,15 @@ jobs:
       - name: Install build tools
        run: |
           python -m pip install --upgrade pip
-          python -m pip install --upgrade setuptools wheel
+          python -m pip install build

       - name: Build packages
-        run: python setup.py sdist bdist_wheel
+        run: python -m build

+      # only publish distribution to PyPI for tagged commits
       - name: Publish distribution to PyPI
-        uses: pypa/gh-action-pypi-publish@master
+        if: startsWith(github.ref, 'refs/tags/v')
+        uses: pypa/gh-action-pypi-publish@release/v1
         with:
           user: __token__
           password: ${{ secrets.pypi_password }}
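
The switch from "setup.py sdist bdist_wheel" to the build front end can be exercised locally the same way the workflow does it. A minimal sketch, assuming a clean checkout and a recent pip:

    # install the PEP 517 build front end used by the workflow
    python -m pip install --upgrade pip
    python -m pip install build

    # build the sdist and wheel into ./dist, as the "Build packages" step does
    python -m build

With the new triggers and the startsWith(github.ref, 'refs/tags/v') guard, the build itself runs on every push and pull request, but the upload to PyPI only executes for tag pushes such as v0.9.5.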

Lines changed: 111 additions & 0 deletions (new file)

@@ -0,0 +1,111 @@
# A secondary test job that only runs the iotools tests if explicitly requested
# (for pull requests) or on a push to the main branch.
# Because the iotools tests require GitHub secrets, we need to be careful about
# malicious PRs accessing the secrets and exposing them externally.
#
# We prevent this by only running this workflow when a maintainer has looked
# over the PR's diff and verified that nothing malicious seems to be going on.
# The maintainer then adds the "remote-data" label to the PR, which will then
# trigger this workflow via the combination of the "on: ... types:"
# and "if:" sections below.  The first restricts the workflow to only run when
# a label is added to the PR and the second requires one of the PR's labels
# is the "remote-data" label.  Technically this is slightly different from
# triggering when the "remote-data" label is added, since it will also trigger
# when "remote-data" is added, then later some other label is added.  Maybe
# there's a better way to do this.
#
# But wait, you say!  Can't a malicious PR get around this by modifying
# this workflow file and removing the label requirement?  I think the answer
# is "no" as long as we trigger the workflow on "pull_request_target" instead
# of the usual "pull_request".  The difference is what context the workflow
# runs inside: "pull_request" runs in the context of the fork, where changes
# to the workflow definition will take immediate effect, while "pull_request_target"
# runs in the context of the main pvlib repository, where the original (non-fork)
# workflow definition is used instead.  Of course by switching away from the fork's
# context to keep our original workflow definitions, we're also keeping all the
# original code, so the tests won't be run against the PR's new code.  To fix this
# we explicitly check out the PR's code as the first step of the workflow.
# This allows the job to run modified pvlib & pytest code, but only ever via
# the original workflow file.
# So long as a maintainer always verifies that the PR's code is not malicious prior to
# adding the label and triggering this workflow, I think this should not present
# a security risk.
#
# Note that this workflow can be triggered again by removing and re-adding the
# "remote-data" label to the PR.
#
# Note also that "pull_request_target" is also the only way for the secrets
# to be accessible in the first place.
#
# Further reading:
# - https://securitylab.github.com/research/github-actions-preventing-pwn-requests/
# - https://github.community/t/can-workflow-changes-be-used-with-pull-request-target/178626/7

name: pytest-remote-data

on:
  pull_request_target:
    types: [labeled]
  push:
    branches:
      - main

jobs:
  test:

    strategy:
      fail-fast: false  # don't cancel other matrix jobs when one fails
      matrix:
        python-version: [3.7, 3.8, 3.9, "3.10", "3.11"]
        suffix: ['']  # the alternative to "-min"
        include:
          - python-version: 3.7
            suffix: -min

    runs-on: ubuntu-latest
    if: (github.event_name == 'pull_request_target' && contains(github.event.pull_request.labels.*.name, 'remote-data')) || (github.event_name == 'push')

    steps:
      - uses: actions/checkout@v3
        if: github.event_name == 'pull_request_target'
        # pull_request_target runs in the context of the target branch (pvlib/main),
        # but what we need is the hypothetical merge commit from the PR:
        with:
          ref: "refs/pull/${{ github.event.number }}/merge"

      - uses: actions/checkout@v2
        if: github.event_name == 'push'

      - name: Set up conda environment
        uses: conda-incubator/setup-miniconda@v2
        with:
          activate-environment: test_env
          environment-file: ${{ env.REQUIREMENTS }}
          python-version: ${{ matrix.python-version }}
          auto-activate-base: false
        env:
          # build requirement filename. First replacement is for the python
          # version, second is to add "-min" if needed
          REQUIREMENTS: ci/requirements-py${{ matrix.python-version }}${{ matrix.suffix }}.yml

      - name: List installed package versions
        shell: bash -l {0}  # necessary for conda env to be active
        run: conda list

      - name: Run tests
        shell: bash -l {0}  # necessary for conda env to be active
        env:
          # copy GitHub Secrets into environment variables for the tests to access
          NREL_API_KEY: ${{ secrets.NRELAPIKEY }}
          SOLARANYWHERE_API_KEY: ${{ secrets.SOLARANYWHERE_API_KEY }}
          BSRN_FTP_USERNAME: ${{ secrets.BSRN_FTP_USERNAME }}
          BSRN_FTP_PASSWORD: ${{ secrets.BSRN_FTP_PASSWORD }}
        run: pytest pvlib/tests/iotools pvlib/tests/test_forecast.py --cov=./ --cov-report=xml --remote-data

      - name: Upload coverage to Codecov
        if: matrix.python-version == 3.7 && matrix.suffix == ''
        uses: codecov/codecov-action@v3
        with:
          fail_ci_if_error: true
          verbose: true
          flags: remote-data  # flags are configured in codecov.yml
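
Running these tests outside CI requires the same credentials that the workflow copies out of GitHub Secrets. A rough local equivalent, assuming you supply your own API keys and FTP credentials (the values below are placeholders, not real secrets):

    # export the same environment variables the workflow sets from GitHub Secrets
    export NREL_API_KEY=your-nrel-api-key
    export SOLARANYWHERE_API_KEY=your-solaranywhere-api-key
    export BSRN_FTP_USERNAME=your-bsrn-username
    export BSRN_FTP_PASSWORD=your-bsrn-password

    # run only the network-dependent tests, as the "Run tests" step does
    pytest pvlib/tests/iotools pvlib/tests/test_forecast.py --remote-data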

.github/workflows/pytest.yml

Lines changed: 89 additions & 0 deletions (new file)

@@ -0,0 +1,89 @@
name: pytest

on:
  pull_request:
  push:
    branches:
      - main

jobs:
  test:
    strategy:
      fail-fast: false  # don't cancel other matrix jobs when one fails
      matrix:
        os: [ubuntu-latest, macos-latest, windows-latest]
        python-version: [3.7, 3.8, 3.9, "3.10", "3.11"]
        environment-type: [conda, bare]
        suffix: ['']  # placeholder as an alternative to "-min"
        include:
          - os: ubuntu-latest
            python-version: 3.7
            environment-type: conda
            suffix: -min
        exclude:
          - os: macos-latest
            environment-type: conda
          - os: windows-latest
            environment-type: bare

    runs-on: ${{ matrix.os }}

    steps:
      # We check out only a limited depth and then pull tags to save time
      - name: Checkout source
        uses: actions/checkout@v3
        with:
          fetch-depth: 100

      - name: Get tags
        run: git fetch --depth=1 origin +refs/tags/*:refs/tags/*

      - name: Install Conda environment with Micromamba
        if: matrix.environment-type == 'conda'
        uses: mamba-org/provision-with-micromamba@v14
        with:
          environment-file: ${{ env.REQUIREMENTS }}
          cache-downloads: true
          extra-specs: |
            python=${{ matrix.python-version }}
          channel-priority: flexible
        env:
          # build requirement filename. First replacement is for the python
          # version, second is to add "-min" if needed
          REQUIREMENTS: ci/requirements-py${{ matrix.python-version }}${{ matrix.suffix }}.yml

      - name: List installed package versions (conda)
        if: matrix.environment-type == 'conda'
        shell: bash -l {0}  # necessary for conda env to be active
        run: micromamba list

      - name: Install bare Python ${{ matrix.python-version }}${{ matrix.suffix }}
        if: matrix.environment-type == 'bare'
        uses: actions/setup-python@v4
        with:
          python-version: ${{ matrix.python-version }}

      - name: Install pvlib
        if: matrix.environment-type == 'conda'
        shell: bash -l {0}
        run: python -m pip install --no-deps .

      - name: Set up bare environment
        if: matrix.environment-type == 'bare'
        run: |
          pip install .[test]
          pip freeze

      - name: Run tests
        shell: bash -l {0}  # necessary for conda env to be active
        run: |
          # ignore iotools & forecast; those tests are run in a separate workflow
          pytest pvlib --cov=./ --cov-report=xml --ignore=pvlib/tests/iotools --ignore=pvlib/tests/test_forecast.py

      - name: Upload coverage to Codecov
        if: matrix.python-version == 3.7 && matrix.suffix == '' && matrix.os == 'ubuntu-latest' && matrix.environment-type == 'conda'
        uses: codecov/codecov-action@v3
        with:
          fail_ci_if_error: true
          verbose: true
          flags: core  # flags are configured in codecov.yml
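
The bare-environment leg of the matrix translates directly to a local run. A minimal sketch, assuming a fresh virtual environment inside a clone of the repository:

    # install pvlib plus its test extras, as the "Set up bare environment" step does
    pip install .[test]
    pip freeze

    # run the core suite, skipping the network-dependent tests covered by the
    # pytest-remote-data workflow above
    pytest pvlib --ignore=pvlib/tests/iotools --ignore=pvlib/tests/test_forecast.py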

.gitignore

Lines changed: 3 additions & 2 deletions

@@ -39,9 +39,10 @@ pvlib/spa_c_files/spa.h
 pvlib/spa_c_files/spa_tester.c

 # generated documentation
-docs/sphinx/source/generated
+docs/sphinx/source/reference/generated
+docs/sphinx/source/reference/*/generated
 docs/sphinx/source/savefig
-docs/sphinx/source/auto_examples
+docs/sphinx/source/gallery

 # Installer logs
 pip-log.txt

.landscape.yml

Lines changed: 0 additions & 11 deletions
This file was deleted.

.lgtm.yml

Lines changed: 0 additions & 6 deletions
This file was deleted.

.stickler.yml

Lines changed: 1 addition & 1 deletion

@@ -5,4 +5,4 @@ linters:
     ignore: E201,E241,E226,W503,W504
 files:
   ignore:
-    - 'pvlib/_version.py'
+    - 'pvlib/version.py'

MANIFEST.in

Lines changed: 10 additions & 15 deletions

@@ -1,20 +1,7 @@
-include MANIFEST.in
-include AUTHORS.md
-include LICENSE
-include README.md
-include setup.py
-
-# include most everything under pvlib by default
-# better to package too much than not enough
-graft pvlib
-
-# we included pvlib files needed to compile NREL SPA code in graft above,
-# now we exclude the NREL code itself to comply with their license
 global-exclude */spa.c
 global-exclude */spa.h
 prune pvlib/spa_c_files/build

-graft docs
 prune docs/sphinx/build
 prune docs/sphinx/source/generated
 # all doc figures created by doc build
@@ -31,5 +18,13 @@ global-exclude .git*
 global-exclude \#*
 global-exclude .ipynb_checkpoints

-include versioneer.py
-include pvlib/_version.py
+exclude .coveragerc
+exclude .stickler.yml
+exclude codecov.yml
+exclude readthedocs.yml
+exclude CODE_OF_CONDUCT.md
+
+prune paper
+prune .github
+prune benchmarks
+prune ci
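
One way to sanity-check the tightened MANIFEST rules is to build a source distribution and list what actually gets packaged. A sketch, assuming the build tool installed for the publish workflow above:

    # build only the sdist, then inspect its contents
    python -m build --sdist
    tar -tzf dist/*.tar.gz

Files matched by the new exclude and prune rules (for example .coveragerc, paper/, benchmarks/) should no longer appear in the listing.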
