Unverified commit c33b31bf authored by David Hoese, committed by GitHub

Add initial basic github actions (#321)

* Add initial attempt at using mamba-based github actions

* Update sphinx html generation in github actions

mambaforge container doesn't have Make installed

* Try activating mamba environment in github action CI

* Fix conda activation in github actions CI

* Add conda init bash to github CI

* Try activating conda environment in the next job steps

* Try using setup-miniconda with mambaforge

* Remove miniconda version specifier in setup-miniconda

* Add docs extras to CI pip install

* Add missing doc dependencies to CI install

* Add numba to CI environment

* Use SPHINXOPTS to fail on CI if sphinx has warnings

* Fix docstring issues causing warnings in sphinx documentation generation

* Add _static directory to doc

* Fix type annotation in document

* Fix type annotation throughout SIFT

* Add pytest runs to github action

Move travis config so it no longer runs

* Fix default shell usage in github action

* Use newer miniforge options with setup-miniconda

* Add OpenGL installation to CI and remove windows/osx testing

* Add missing scikit-image dependency in CI

* Add initial sdist and bundle deployment CI

* Allow deploy job on PR branch for testing

* Fix bad deploy CI config

* Fix bad deploy CI config

* Try providing SSH known hosts as a secret

* Attempt to add some error checking to conda pack CI scripts

* Add some debug to conda pack CI scripts

* Re-add -k flag to curl commands; host checking doesn't seem to work with SSEC FTP

* Remove temporary PR branch trigger now that upload has been tested

* Add satpy to unstable dep updates
parent 132e6467
name: CI
# https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions#concurrency
# https://docs.github.com/en/developers/webhooks-and-events/events/github-event-types#pullrequestevent
concurrency:
  group: ${{ github.workflow }}-${{ github.event.number }}-${{ github.event.type }}
  cancel-in-progress: true

on: [push, pull_request]

jobs:
  lint:
    name: lint and style checks
    runs-on: ubuntu-latest
    steps:
      - name: Checkout source
        uses: actions/checkout@v2
      - name: Set up Python
        uses: actions/setup-python@v2
        with:
          python-version: 3.9
      - name: Install dependencies
        run: |
          python -m pip install --upgrade pip
          pip install flake8 flake8-docstrings flake8-debugger flake8-bugbear pytest
      - name: Run linting
        run: |
          flake8 uwsift/

  website:
    name: build website
    runs-on: ubuntu-latest
    defaults:
      run:
        shell: "bash -l {0}"
    steps:
      - name: Checkout source
        uses: actions/checkout@v2
        with:
          fetch-depth: 0
      - name: Setup Conda Environment
        uses: conda-incubator/setup-miniconda@v2
        with:
          miniforge-variant: Mambaforge
          miniforge-version: latest
          use-mamba: true
          environment-file: continuous_integration/environment.yaml
          activate-environment: test-environment
      - name: Install SIFT
        shell: bash -l {0}
        run: |
          pip install sphinx sphinx_rtd_theme sphinxcontrib-apidoc sphinxcontrib-seqdiag sphinxcontrib-blockdiag blockdiag
          pip install --no-deps -e .
      - name: Run Sphinx Build
        shell: bash -l {0}
        run: |
          cd doc
          make html SPHINXOPTS="-W"
  test:
    runs-on: ${{ matrix.os }}
    defaults:
      run:
        shell: "bash -l {0}"
    continue-on-error: ${{ matrix.experimental }}
    needs: [lint]
    strategy:
      fail-fast: true
      matrix:
        # XXX: We don't currently have OpenGL installation on other platforms
        #os: ["windows-latest", "ubuntu-latest", "macos-latest"]
        os: ["ubuntu-latest"]
        python-version: ["3.7", "3.9"]
        experimental: [false]
        include:
          - python-version: "3.9"
            os: "ubuntu-latest"
            experimental: true
    env:
      PYTHON_VERSION: ${{ matrix.python-version }}
      OS: ${{ matrix.os }}
      UNSTABLE: ${{ matrix.experimental }}
      ACTIONS_ALLOW_UNSECURE_COMMANDS: true
    steps:
      - name: Checkout source
        uses: actions/checkout@v2
      - name: Prepare System Environment
        run: |
          # opengl system libraries
          sudo apt-get update
          cat continuous_integration/linux_full_deps_apt.txt | xargs sudo apt-get -y install
          # Start xvfb daemon
          export DISPLAY=:99.0
          /sbin/start-stop-daemon --start --quiet --pidfile /tmp/custom_xvfb_99.pid --make-pidfile --background --exec /usr/bin/Xvfb -- :99 -screen 0 1400x900x24 -ac +extension GLX +render
          sleep 5
          # export python_version
          PY_VER=${{ matrix.python-version }}
          echo ::set-output name=python-version::${PY_VER//.}
      - name: Setup Conda Environment
        uses: conda-incubator/setup-miniconda@v2
        with:
          miniforge-variant: Mambaforge
          miniforge-version: latest
          use-mamba: true
          environment-file: continuous_integration/environment.yaml
          activate-environment: test-environment
      - name: Install unstable dependencies
        if: matrix.experimental == true
        run: |
          python -m pip install \
            --index-url https://pypi.anaconda.org/scipy-wheels-nightly/simple/ \
            --trusted-host pypi.anaconda.org \
            --no-deps --pre --upgrade \
            matplotlib \
            numpy \
            pandas \
            scipy; \
          python -m pip install \
            --no-deps --upgrade \
            git+https://github.com/dask/dask \
            git+https://github.com/dask/distributed \
            git+https://github.com/zarr-developers/zarr \
            git+https://github.com/Unidata/cftime \
            git+https://github.com/mapbox/rasterio \
            git+https://github.com/pydata/bottleneck \
            git+https://github.com/pydata/xarray \
            git+https://github.com/pytroll/satpy;
      - name: Install uwsift
        run: |
          pip install --no-deps -e .
      - name: Run unit tests
        run: |
          export DISPLAY=:99.0
          pytest --cov=uwsift uwsift/tests
      - name: Coveralls Parallel
        uses: AndreMiras/coveralls-python-action@develop
        with:
          flag-name: run-${{ matrix.test_number }}
          parallel: true
        if: runner.os == 'Linux'

  coveralls:
    needs: [test]
    runs-on: ubuntu-latest
    steps:
      - name: Coveralls Finished
        uses: AndreMiras/coveralls-python-action@develop
        with:
          parallel-finished: true
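For local debugging, a rough equivalent of the test job on a Debian/Ubuntu machine; this is a sketch that assumes the conda environment from continuous_integration/environment.yaml is already activated, and uses `xvfb-run` in place of the hand-rolled Xvfb daemon above:

```bash
# Install the same OpenGL system libraries the workflow installs, then run
# the suite under a virtual X server so Qt/VisPy can create a GL context.
sudo apt-get update
xargs -a continuous_integration/linux_full_deps_apt.txt sudo apt-get -y install
pip install --no-deps -e .
xvfb-run -a pytest --cov=uwsift uwsift/tests
```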
name: Deploy
# https://docs.github.com/en/actions/reference/workflow-syntax-for-github-actions#concurrency
# https://docs.github.com/en/developers/webhooks-and-events/events/github-event-types#pullrequestevent
concurrency:
  group: ${{ github.workflow }}-${{ github.event.number }}-${{ github.event.type }}
  cancel-in-progress: true

on:
  push:
    branches:
      - master
  release:
    types:
      - published

jobs:
  sdist:
    runs-on: ubuntu-latest
    steps:
      - name: Checkout source
        uses: actions/checkout@v2
      - name: Create sdist
        shell: bash -l {0}
        run: python setup.py sdist
      - uses: actions/upload-artifact@v2
        with:
          name: dist
          path: dist/*.tar.gz

  bundle:
    runs-on: ${{ matrix.os }}
    defaults:
      run:
        shell: "bash -l {0}"
    continue-on-error: false
    strategy:
      fail-fast: true
      matrix:
        os: ["windows-latest", "ubuntu-latest", "macos-latest"]
        python-version: ["3.9"]
    env:
      PYTHON_VERSION: ${{ matrix.python-version }}
      OS: ${{ matrix.os }}
      ACTIONS_ALLOW_UNSECURE_COMMANDS: true
    steps:
      - name: Checkout source
        uses: actions/checkout@v2
      - name: Install SSH Key
        uses: shimataro/ssh-key-action@v2
        with:
          key: ${{ secrets.SIFT_SFTP_UPLOAD_KEY }}
          name: id_rsa_sftp
          known_hosts: ${{ secrets.SIFT_SFTP_UPLOAD_KNOWN_HOSTS }}
      - name: Setup Conda Environment
        uses: conda-incubator/setup-miniconda@v2
        with:
          miniforge-variant: Mambaforge
          miniforge-version: latest
          use-mamba: true
          environment-file: continuous_integration/environment.yaml
          activate-environment: test-environment
      - name: Install conda pack
        run: |
          conda install conda-pack
      - name: Install uwsift
        run: |
          pip install --no-deps .
      - name: Build bundle
        run: |
          continuous_integration/build_conda_pack.sh
      # It would be better to only upload if all platforms had succeeded, but
      # that would require uploading the bundles as artifacts, which would
      # quickly hit our temporary storage limits.
      - name: Upload bundle
        run: |
          continuous_integration/upload_conda_pack.sh

  upload-sdist:
    runs-on: ubuntu-latest
    needs: [sdist, bundle]  # don't deploy unless sdist and bundle building succeeded
    # publish when a GitHub Release is created
    if: github.event_name == 'release' && github.event.action == 'published'
    steps:
      - uses: actions/download-artifact@v2
        with:
          name: dist
          path: dist
      - uses: pypa/gh-action-pypi-publish@master
        with:
          user: __token__
          password: ${{ secrets.UWSIFT_PYPI_TOKEN }}
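Note that the upload-sdist job only runs for a published GitHub Release. A hypothetical sketch of cutting a release with the GitHub CLI (the version number is illustrative, not a real tag):

```bash
# Tag the commit and publish a Release; the "published" release event
# triggers this Deploy workflow with the PyPI upload step enabled.
git tag -a 1.0.0 -m "Version 1.0.0"
git push origin 1.0.0
gh release create 1.0.0 --title "Version 1.0.0" --notes "See changelog"
```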
#!/usr/bin/env bash
set -ex

GIT_TAG="${GITHUB_REF##*/}"
# Escape the dots so the pattern matches version tags like "1.0.3" rather
# than any character between the digit groups.
if [[ $GIT_TAG =~ [0-9]+\.[0-9]+\.[0-9]+ ]]; then
    # valid tag (use default script options)
    oflag=""
else
    # master branch
    version=$(python -c "from uwsift import __version__; print(__version__)")
    if [[ "${OS}" == "windows-latest" ]]; then
        ext="zip"
        platform="windows"
    else
        ext="tar.gz"
        if [[ "${OS}" == "macos-latest" ]]; then
            platform="darwin"
        else
            platform="linux"
        fi
    fi
    oflag="-o SIFT_${version}dev_${platform}_$(date +%Y%m%d_%H%M%S).${ext}"
fi

python build_conda_pack.py -j -1 $oflag
ls -l
set +ex
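To exercise the script outside CI, a sketch under these assumptions: the repository root is the working directory, conda-pack is installed in the active environment, and the GITHUB_REF/OS variables that Actions normally provides are set by hand:

```bash
# Simulate the variables the workflow exports, then build a dev bundle;
# a non-tag ref takes the "master branch" path above and names the output
# SIFT_<version>dev_linux_<timestamp>.tar.gz.
export GITHUB_REF="refs/heads/master"
export OS="ubuntu-latest"
continuous_integration/build_conda_pack.sh
```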
name: test-environment
channels:
  - conda-forge
dependencies:
  - appdirs
  - Cython
  - imageio
  - imageio-ffmpeg
  - matplotlib
  - numba
  - numpy
  - pyproj
  - pyshp
  - pyqt
  - pyqtgraph
  - satpy
  - scikit-image
  - shapely
  - sqlalchemy
  - vispy
  ### Satpy-only Optional Deps
  - bottleneck
  - dask
  - donfig
  # 2.19.1 seems to cause library linking issues
  - eccodes>=2.20
  - fsspec
  - h5netcdf
  - h5py
  - netcdf4
  - pillow
  - pooch
  - pyhdf
  - pyresample
  - python-eccodes
  - python-geotiepoints
  - pyyaml
  - rasterio
  - rioxarray
  - xarray
  - zarr
  ### Development/Test dependencies
  - coveralls
  - coverage
  - codecov
  - pytest
  - pytest-cov
  - pytest-mock
  - pytest-qt
  - pip
  - sphinx
  ### Pip Dependencies
  - pip:
      - trollsift
      - trollimage
      - pyspectral
      - pyorbital
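For local development, a sketch of reproducing the CI environment (assuming mamba is installed, as in the Mambaforge-based jobs above; plain conda works the same way):

```bash
# Create and activate the environment the CI jobs use, then install
# SIFT into it the same way the workflow does.
mamba env create -f continuous_integration/environment.yaml
conda activate test-environment
pip install --no-deps -e .
```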
libglu1-mesa-dev
libgl1-mesa-dev
libxi-dev
libglfw3-dev
libgles2-mesa-dev
libsdl2-2.0-0
mesa-utils
#!/usr/bin/env bash
set -ex

GIT_TAG="${GITHUB_REF##*/}"
# Escape the dots so the pattern matches version tags like "1.0.3".
if [[ $GIT_TAG =~ [0-9]+\.[0-9]+\.[0-9]+ ]]; then
    # valid tag
    odir=""
else
    # master branch
    odir="experimental/"
fi

# Upload the new bundle.
# -k skips host checking; it doesn't seem to work with the SSEC FTP server.
curl -k --ftp-create-dirs -T SIFT_*.*.*_*.* --key $HOME/.ssh/id_rsa_sftp sftp://sift@ftp.ssec.wisc.edu/${odir}

set +e
# Delete any old experimental bundles once a real release has been uploaded.
if [[ $GIT_TAG =~ [0-9]+\.[0-9]+\.[0-9]+ ]]; then
    curl -k -l --key $HOME/.ssh/id_rsa_sftp sftp://sift@ftp.ssec.wisc.edu/experimental/ | grep SIFT_*.*.*_*.* | xargs -I{} -- curl -k -v --key $HOME/.ssh/id_rsa_sftp sftp://sift@ftp.ssec.wisc.edu/experimental/ -Q "RM experimental/{}"
    if [ $? -ne 0 ]; then
        echo "Failed to delete old experimental SIFT tarballs from FTP server"
    fi
fi
set +x
@@ -55,7 +55,7 @@ and is often used as shorthand between subsystems. Document rarely deals directl
 :copyright: 2015 by University of Wisconsin Regents, see AUTHORS for more details
 :license: GPLv3, see LICENSE for more details
 """
-from uwsift.model.layer import Mixing, DocLayer, DocBasicLayer, DocRGBLayer, DocCompositeLayer
+from __future__ import annotations

 __author__ = 'rayg'
 __docformat__ = 'reStructuredText'
@@ -78,6 +78,7 @@ from uwsift.queue import TaskQueue
 from uwsift.workspace import Workspace
 from uwsift.util.default_paths import DOCUMENT_SETTINGS_DIR
 from uwsift.model.composite_recipes import RecipeManager, CompositeRecipe
+from uwsift.model.layer import Mixing, DocLayer, DocBasicLayer, DocRGBLayer, DocCompositeLayer
 from uwsift.view.colormap import COLORMAP_MANAGER, PyQtGraphColormap, SITE_CATEGORY, USER_CATEGORY
 from uwsift.queue import TASK_PROGRESS, TASK_DOING
 from PyQt5.QtCore import QObject, pyqtSignal
@@ -447,7 +448,7 @@ class DocumentAsLayerStack(DocumentAsContextBase):
         """
         raise NotImplementedError("need to consult mdb to get product info dictionary under playhead")

-    def get_info(self, dex: [int, UUID]):
+    def get_info(self, dex: typ.Union[int, UUID]):
         """return info dictionary with top z-order at 0, going downward
         """
         if isinstance(dex, UUID):
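This and the following hunks apply the same fix: `[int, UUID]` is a list literal, not a type, so Sphinx's autodoc and type checkers reject it; PEP 484 spells it `typing.Union` (or `typing.Optional` for the `[X, None]` case). A minimal illustration with a hypothetical function:

```python
from typing import Optional, Union
from uuid import UUID


def get_info(dex: Union[int, UUID]) -> Optional[dict]:
    """Accept either a z-order index or a dataset UUID.

    The old spelling ``dex: [int, UUID]`` evaluates to a list of types,
    which is not a valid annotation; ``Union[int, UUID]`` is.
    """
    if isinstance(dex, UUID):
        return {"uuid": dex}
    return {"index": int(dex)}
```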
@@ -128,8 +128,8 @@ class CoordTransform(QObject):
         self._time_unit = time_unit
         self._track_height = track_height or GFXC.track_height

-    def calc_time_duration(self, scene_x: [float, None], scene_w: [float, None]) -> Tuple[
-            Optional[datetime], Optional[timedelta]]:
+    def calc_time_duration(self, scene_x: Optional[float], scene_w: Optional[float]
+                           ) -> Tuple[Optional[datetime], Optional[timedelta]]:
         """Calculate time and duration given scene X coordinate and width (from QRectF typically)

         Args:
@@ -151,8 +151,8 @@
         """
         return self._track_height * (0.5 + float(self._max_z - z))

-    def calc_scene_rect(self, ztd: ztdtup = None, z: int = None, t: datetime = None, d: timedelta = None) -> [QRectF,
-                                                                                                              None]:
+    def calc_scene_rect(self, ztd: ztdtup = None, z: int = None, t: datetime = None, d: timedelta = None
+                        ) -> Optional[QRectF]:
         """
         calculate scene coordinates given time and Z

         Args:
@@ -66,7 +66,7 @@ https://stackoverflow.com/questions/4216139/python-object-in-qmimedata
 :license: GPLv3, see LICENSE for more details
 """
 import logging
-from typing import Mapping, Any, Tuple
+from typing import Mapping, Any, Tuple, Union
 from uuid import UUID
 from weakref import ref
 import pickle as pkl
@@ -131,7 +131,7 @@ class QTrackItem(QGraphicsObject):
     _metadata: Mapping = None
     _tooltip: str = None
     _state: Flags = None  # VisualState Flags determine how it's being presented
-    _colormap: [QGradient, QImage, QPixmap] = None
+    _colormap: Union[QGradient, QImage, QPixmap] = None
     _min: float = None
     _max: float = None
     _dragging: bool = False  # whether or not a drag is in progress across this item
@@ -146,7 +146,7 @@
     def __init__(self, scene, scale: CoordTransform, track: str, z: int,
                  title: str, subtitle: str = None, icon: QIcon = None, metadata: dict = None,
-                 tooltip: str = None, state: Flags = None, colormap: [QGradient, QImage] = None,
+                 tooltip: str = None, state: Flags = None, colormap: Union[QGradient, QImage, QPixmap] = None,
                  min: float = None, max: float = None):
         """Create a track and connect it to its Scene
         """
@@ -15,7 +15,7 @@ import logging
 import sys
 import unittest
 from datetime import datetime, timedelta
-from typing import Tuple, Optional, Mapping, List, Callable, Set, Iterable, Sequence, Any
+from typing import Tuple, Optional, Mapping, List, Callable, Set, Iterable, Sequence, Any, Union
 from uuid import UUID

 from PyQt5.QtCore import QRectF, Qt, pyqtSignal
@@ -421,7 +421,7 @@ class QFramesInTracksScene(QGraphicsScene):
         """Yield series of track information tuples which will be used to generate/update QTrackItems
         """

-    def get(self, item: [UUID, str]) -> [QTrackItem, QFrameItem, None]:
+    def get(self, item: Union[UUID, str]) -> Union[QTrackItem, QFrameItem, None]:
         if isinstance(item, UUID):
             z = self._frame_items.get(item)
         elif isinstance(item, str):
@@ -469,7 +469,7 @@
         LOG.warning("using base class menu_for_track which does nothing")
         return None

-    def update(self, changed_tracks: [Set[str], None] = None, changed_frame_uuids: [Set[UUID], None] = None):
+    def update(self, changed_tracks: Optional[Set[str]] = None, changed_frame_uuids: Optional[Set[UUID]] = None):
         """Populate or update scene, returning number of items changed in scene

         Does not add new items for tracks and frames already present
         Parameters serve only as hints
@@ -485,7 +485,9 @@ class TestScene(QFramesInTracksScene):
         super(TestScene, self).__init__(*args, **kwargs)
         # assert(hasattr(self, '_track_order'))

-    def update(self, changed_tracks: [Set[str], None] = None, changed_frame_uuids: [Set[UUID], None] = None) -> int:
+    def update(self,
+               changed_tracks: Optional[Set[str]] = None,
+               changed_frame_uuids: Optional[Set[UUID]] = None) -> int:
         if self._did_populate:
             return 0
         self._test_populate()
@@ -655,7 +655,7 @@ class MultiChannelImageVisual(ImageVisual):
         they represent the color limits of each channel array.
     gamma : float | list
         Gamma to use during colormap lookup. Final value will be computed
-        ``val**gamma` for each RGB channel array. If provided as a float then
+        ``val**gamma`` for each RGB channel array. If provided as a float then
         it will be used for each channel. If provided as a 3-element tuple
         then each value is used for the separate channel arrays. Default is
         1.0 for each channel.
@@ -24,7 +24,7 @@ import os
 import sys
 import unittest
 from datetime import datetime
-from typing import List, Iterable, Mapping
+from typing import List, Iterable, Mapping, Union

 from PyQt5.QtCore import QObject
@@ -130,7 +130,7 @@ class ResourceSearchPathCollector(QObject):
             os.utime(self._timestamp_path)
         return mtime

-    def __init__(self, ws: [Workspace, _workspace_test_proxy]):
+    def __init__(self, ws: Union[Workspace, _workspace_test_proxy]):
         super(ResourceSearchPathCollector, self).__init__()
         self._ws = ws
         self._paths = []
@@ -41,12 +41,12 @@ class Guidebook(object):
         return None, None

     def time_siblings(self, uuid, infos):
-        """
-        determine the time siblings of a given dataset
+        """Determine the time siblings of a given dataset.

         :param uuid: uuid of the dataset we're interested in
         :param infos: datasetinfo_dict sequence, available datasets
         :return: (list,offset:int): list of [uuid,uuid,uuid] for siblings in order;
-                 offset of where the input is found in list
+            offset of where the input is found in list
         """
         return None, None
@@ -11,19 +11,17 @@ Some states have UUIDs and therefore data
 Search directories and create index of what data is where
 Used by Workspace to respond to adjacency queries / product matrix requests

-USAGE
-
-dm = DataAdjacencyMatrix('/data', recurse=True)
-# search through files
-for _ in dm.finditer():
-    pass
-ds = dm.ix['myproduct', 0]
-if ds.state!=state.CACHED:
-    for _ in ds.loaditer(my_workspace):
-        pass
-uuid = ds.uuid
+USAGE::
+
+    dm = DataAdjacencyMatrix('/data', recurse=True)
+    # search through files
+    for _ in dm.finditer():
+        pass
+    ds = dm.ix['myproduct', 0]
+    if ds.state!=state.CACHED:
+        for _ in ds.loaditer(my_workspace):
+            pass
+    uuid = ds.uuid

 REFERENCES
@@ -10,15 +10,17 @@ SQLAlchemy database tables of metadata used by Workspace to manage its local cac
 OVERVIEW

-Resource : a file containing products, somewhere in the filesystem,
- |         or a resource on a remote system we can access (openDAP etc)
- |_ Product* : product stored in a resource
-    |_ Content* : workspace cache content corresponding to a product,
-    |  |          may be one of many available views (e.g. projections)
-    |  |_ ContentKeyValue* : additional information on content
-    |_ ProductKeyValue* : additional information on product
-    |_ SymbolKeyValue* : if product is derived from other products,
-                         symbol table for that expression is in this kv table
+::
+
+    Resource : a file containing products, somewhere in the filesystem,
+     |         or a resource on a remote system we can access (openDAP etc)
+     |_ Product* : product stored in a resource
+        |_ Content* : workspace cache content corresponding to a product,
+        |  |          may be one of many available views (e.g. projections)
+        |  |_ ContentKeyValue* : additional information on content
+        |_ ProductKeyValue* : additional information on product
+        |_ SymbolKeyValue* : if product is derived from other products,
+                             symbol table for that expression is in this kv table

 A typical baseline product will have two content: and overview (lod==0) and a native resolution (lod>0)
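The last two hunks silence Sphinx warnings by turning indented usage/ASCII blocks in module docstrings into proper reStructuredText literal blocks; since the website job builds with SPHINXOPTS="-W", those warnings now fail CI. A minimal illustration with a hypothetical docstring:

```python
def finditer():
    """Search through files.

    USAGE::

        for _ in finditer():
            pass

    The ``::`` plus a consistently indented, blank-line-delimited block
    marks a literal block; without it docutils reports "unexpected
    indentation", which the -W build treats as an error.
    """
    return iter(())
```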