Changes from all commits (30 commits)
3980504
add Docker and environment-associated files
JunAishima Aug 19, 2025
c9f8eb3
add precommit file
JunAishima Aug 19, 2025
447385e
add pixi files
JunAishima Aug 21, 2025
025b25d
update prefect.yaml for new system and repos
JunAishima Aug 21, 2025
d16e41e
update dependencies after finding a problem with mass
JunAishima Aug 21, 2025
4c98dfc
consistent naming for deployment name
JunAishima Aug 21, 2025
f020e6a
add script for running deployment
JunAishima Aug 21, 2025
cacc674
update version number
JunAishima Aug 21, 2025
ec2b2d4
update Dockerfile for Podman
JunAishima Sep 2, 2025
8c41d33
multiple settings
JunAishima Sep 2, 2025
5932bf0
add config file directory
JunAishima Sep 4, 2025
c512d97
update code location to NSLS-II-SST
JunAishima Sep 7, 2025
5764d16
build/push image from github action
JunAishima Sep 16, 2025
98efa0a
update branch name, update repo name
JunAishima Sep 16, 2025
70c2469
use --locked for pixi
JunAishima Sep 16, 2025
fa0b53e
remove Prefect run-post-link option
JunAishima Sep 17, 2025
5b01ab1
auto_remove of containers after running set to true
JunAishima Sep 17, 2025
93f1026
remove commented-out line
JunAishima Sep 17, 2025
4246d34
fix time zone name
JunAishima Sep 17, 2025
84309ce
update pre-commit-related actions
JunAishima Oct 2, 2025
041a9b1
lint: pre-commit fixes
JunAishima Oct 2, 2025
694c5ff
prevent pixi files from being modified
JunAishima Oct 2, 2025
59ec2b6
lint: fix pre-commit issues
JunAishima Oct 2, 2025
170de38
remove file not usable by beamline staff
JunAishima Oct 2, 2025
a12fe92
remove unnecessary line
JunAishima Oct 2, 2025
639bb06
do not use ENV field, do not pass in Tiled API key
JunAishima Oct 6, 2025
5d45c6b
refactor to use common get_tiled_client() function
JunAishima Oct 6, 2025
3fc90d2
add fix to problem during deployment
JunAishima Oct 6, 2025
b065057
ensure latest image is pulled
JunAishima Oct 6, 2025
5976d44
lint: pre-commit fixes
JunAishima Oct 6, 2025
Files changed
6 changes: 3 additions & 3 deletions .github/workflows/linting.yml
@@ -9,6 +9,6 @@ jobs:
pre-commit:
runs-on: ubuntu-latest
steps:
- uses: actions/checkout@v2
- uses: actions/setup-python@v2
- uses: pre-commit/action@v2.0.3
- uses: actions/checkout@v4.3.0
- uses: actions/setup-python@v5.6.0
- uses: pre-commit/action@v3.0.1
59 changes: 59 additions & 0 deletions .github/workflows/publish-ghcr.yml
@@ -0,0 +1,59 @@
#
name: Create and publish a Docker image

# Configures this workflow to run every time a change is pushed to the branch called `release`.
on:
push:
branches: ["main"]

# Defines two custom environment variables for the workflow. These are used for the Container registry domain, and a name for the Docker image that this workflow builds.
env:
REGISTRY: ghcr.io
IMAGE_NAME: ${{ github.repository }}

# There is a single job in this workflow. It's configured to run on the latest available version of Ubuntu.
jobs:
build-and-push-image:
runs-on: ubuntu-latest
# Sets the permissions granted to the `GITHUB_TOKEN` for the actions in this job.
permissions:
contents: read
packages: write
attestations: write
id-token: write
#
steps:
- name: Checkout repository
uses: actions/checkout@v5
# Uses the `docker/login-action` action to log in to the Container registry registry using the account and password that will publish the packages. Once published, the packages are scoped to the account defined here.
- name: Log in to the Container registry
uses: docker/login-action@65b78e6e13532edd9afa3aa52ac7964289d1a9c1
with:
registry: ${{ env.REGISTRY }}
username: ${{ github.actor }}
password: ${{ secrets.GITHUB_TOKEN }}
# This step uses [docker/metadata-action](https://github.com/docker/metadata-action#about) to extract tags and labels that will be applied to the specified image. The `id` "meta" allows the output of this step to be referenced in a subsequent step. The `images` value provides the base name for the tags and labels.
- name: Extract metadata (tags, labels) for Docker
id: meta
uses: docker/metadata-action@9ec57ed1fcdbf14dcef7dfbe97b2010124a938b7
with:
images: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME }}
# This step uses the `docker/build-push-action` action to build the image, based on your repository's `Dockerfile`. If the build succeeds, it pushes the image to GitHub Packages.
# It uses the `context` parameter to define the build's context as the set of files located in the specified path. For more information, see [Usage](https://github.com/docker/build-push-action#usage) in the README of the `docker/build-push-action` repository.
# It uses the `tags` and `labels` parameters to tag and label the image with the output from the "meta" step.
- name: Build and push Docker image
id: push
uses: docker/build-push-action@f2a1d5e99d037542a71f64918e516c093c6f3fc4
with:
context: .
push: true
tags: ${{ steps.meta.outputs.tags }}
labels: ${{ steps.meta.outputs.labels }}

# This step generates an artifact attestation for the image, which is an unforgeable statement about where and how it was built. It increases supply chain security for people who consume the image. For more information, see [Using artifact attestations to establish provenance for builds](/actions/security-guides/using-artifact-attestations-to-establish-provenance-for-builds).
- name: Generate artifact attestation
uses: actions/attest-build-provenance@v3
with:
subject-name: ${{ env.REGISTRY }}/${{ env.IMAGE_NAME}}
subject-digest: ${{ steps.push.outputs.digest }}
push-to-registry: true
92 changes: 80 additions & 12 deletions .pre-commit-config.yaml
@@ -1,24 +1,92 @@
# See https://pre-commit.com for more information
# See https://pre-commit.com/hooks.html for more hooks
ci:
autoupdate_commit_msg: "chore: update pre-commit hooks"
autofix_commit_msg: "style: pre-commit fixes"

exclude: ^.cruft.json|.copier-answers.yml$

repos:
- repo: https://github.com/adamchainz/blacken-docs
rev: "1.19.1"
hooks:
- id: blacken-docs
additional_dependencies: [black==24.*]

- repo: https://github.com/pre-commit/pre-commit-hooks
rev: v4.0.1
rev: "v5.0.0"
hooks:
- id: trailing-whitespace
- id: end-of-file-fixer
- id: check-ast
- id: check-added-large-files
- id: check-case-conflict
- id: check-merge-conflict
- id: check-symlinks
- id: check-yaml
- id: debug-statements
- id: end-of-file-fixer
- id: mixed-line-ending
- id: name-tests-test
args: ["--pytest-test-first"]
- id: requirements-txt-fixer
- id: trailing-whitespace

- repo: https://github.com/pre-commit/pygrep-hooks
rev: "v1.10.0"
hooks:
- id: rst-backticks
- id: rst-directive-colons
- id: rst-inline-touching-normal

- repo: https://github.com/rbubley/mirrors-prettier
rev: "v3.5.3"
hooks:
- id: prettier
types_or: [yaml, markdown, html, css, scss, javascript, json]
args: [--prose-wrap=always]

- repo: https://github.com/astral-sh/ruff-pre-commit
rev: "v0.11.11"
hooks:
- id: ruff
args: ["--fix", "--show-fixes"]
- id: ruff-format

- repo: https://github.com/pre-commit/mirrors-mypy
rev: "v1.15.0"
hooks:
- id: mypy
files: src|tests
args: []
additional_dependencies:
- pytest

- repo: https://github.com/codespell-project/codespell
rev: "v2.4.1"
hooks:
- id: codespell
additional_dependencies:
- tomli; python_version<'3.11'
exclude: "pixi.*"

- repo: https://github.com/shellcheck-py/shellcheck-py
rev: "v0.10.0.1"
hooks:
- id: shellcheck

- repo: local
hooks:
- id: disallow-caps
name: Disallow improper capitalization
language: pygrep
entry: PyBind|Numpy|Cmake|CCache|Github|PyTest
exclude: .pre-commit-config.yaml

- repo: https://github.com/timothycrosley/isort
rev: 5.9.3
- repo: https://github.com/abravalheri/validate-pyproject
rev: "v0.24.1"
hooks:
- id: isort
- id: validate-pyproject
additional_dependencies: ["validate-pyproject-schema-store[all]"]

- repo: https://github.com/psf/black
rev: 22.3.0
- repo: https://github.com/python-jsonschema/check-jsonschema
rev: "0.33.0"
hooks:
- id: black
- id: check-dependabot
- id: check-github-workflows
- id: check-readthedocs
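
The local `disallow-caps` hook above uses pre-commit's pygrep language, which fails the check whenever its `entry` regex matches a line in the repository. A minimal Python sketch of the same check, with invented sample lines:

```python
import re

# Same alternation pattern the pygrep hook uses as its `entry`.
PATTERN = re.compile(r"PyBind|Numpy|Cmake|CCache|Github|PyTest")

# Invented sample lines: only the improper capitalizations are flagged.
for line in ["Use pybind11 and NumPy arrays.", "Run CMake from Github Actions."]:
    print(line, "->", "flagged" if PATTERN.search(line) else "ok")
```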
27 changes: 27 additions & 0 deletions Dockerfile
@@ -0,0 +1,27 @@
FROM ghcr.io/prefix-dev/pixi:latest

ENV TZ="America/New_York"

RUN apt-get -y update && \
apt-get -y install git

COPY pixi.toml .
COPY pixi.lock .
# use `--locked` to ensure the lockfile is up to date with pixi.toml
RUN pixi install --locked
# create the shell-hook bash script to activate the environment
RUN pixi shell-hook -s bash > /shell-hook

ENV PYTHONUNBUFFERED=1

COPY test.py .

RUN mkdir /etc/tiled
RUN mkdir /.prefect -m 0777
RUN mkdir /repo -m 0777

RUN /bin/bash /shell-hook

#now reapply deployment to push the image that is being created
ENTRYPOINT ["pixi", "run"]
CMD ["python", "-m", "test", "arg"]
3 changes: 2 additions & 1 deletion README.md
@@ -1,3 +1,4 @@
# ucal Workflows

Repository of Prefect workflows for the micro calorimetry endstation at the SST beamline.
Repository of Prefect workflows for the micro calorimetry endstation at the SST
beamline.
4 changes: 3 additions & 1 deletion end_of_run_export.py
@@ -10,7 +10,9 @@
def get_export_path(run):
proposal_path = get_proposal_path(run)

visit_date = datetime.datetime.fromisoformat(run.start.get("start_datetime", datetime.datetime.today().isoformat()))
visit_date = datetime.datetime.fromisoformat(
run.start.get("start_datetime", datetime.datetime.today().isoformat())
)
visit_dir = visit_date.strftime("%Y%m%d_export")

export_path = join(proposal_path, visit_dir)
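
The reflowed call above parses the run's `start_datetime` and falls back to today's date when the start document omits that key. A standalone sketch of the same fallback, with a stand-in dict in place of `run.start`:

```python
import datetime

start_doc = {}  # stand-in for run.start; may omit "start_datetime"

visit_date = datetime.datetime.fromisoformat(
    start_doc.get("start_datetime", datetime.datetime.today().isoformat())
)
print(visit_date.strftime("%Y%m%d_export"))  # e.g. 20251006_export
```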
12 changes: 5 additions & 7 deletions export_to_athena.py
@@ -1,5 +1,5 @@
import numpy as np
from os.path import exists, join
from os.path import join
from export_tools import add_comment_to_lines, get_header_and_data
from prefect import get_run_logger

@@ -12,12 +12,12 @@ def exportToAthena(
c2="",
headerUpdates={},
strict=False,
verbose=True
verbose=True,
):
"""Exports to Graham's ASCII SSRL data format

:param folder: Export folder (filename will be auto-generated)
:param data: Numpy array with data of dimensions (npts, ncols)
:param data: NumPy array with data of dimensions (npts, ncols)
:param header: Dictionary with 'scaninfo', 'motors', 'channelinfo' sub-dictionaries
:param namefmt: Python format string that will be filled with info from 'scaninfo' dictionary
:param c1: Comment string 1
@@ -59,7 +59,7 @@ def exportToAthena(
metadata["c1"] = c1
metadata["c2"] = c2

headerstring = """NSLS
headerstring = """NSLS
{date}
PTS:{npts:11d} COLS: {ncols:11d}
Sample: {sample} loadid: {loadid}
@@ -71,9 +71,7 @@
{c1}
{c2}
-------------------------------------------------------------------------------
{cols}""".format(
**metadata, **motors
)
{cols}""".format(**metadata, **motors)
headerstring = add_comment_to_lines(headerstring, "#")
logger.info(f"Writing Athena to {filename}")
with open(filename, "w") as f:
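
The edit above only moves `.format(**metadata, **motors)` onto the template's closing line; the rendered header is unchanged. A toy example of the same fill-in pattern, with abbreviated field names and invented values:

```python
# Toy subset of the SSRL-style header; real keys come from metadata/motors.
template = """NSLS
{date}
PTS:{npts:11d} COLS: {ncols:11d}
Sample: {sample} loadid: {loadid}"""

metadata = {"date": "2025-10-06", "npts": 500, "ncols": 4}
motors = {"sample": "Fe foil", "loadid": "demo"}
print(template.format(**metadata, **motors))
```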
8 changes: 6 additions & 2 deletions export_to_hdf5.py
@@ -14,14 +14,18 @@ def exportToHDF5(folder, run, header_updates={}):
"""

if "primary" not in run:
print(f"HDF5 Export does not support streams other than Primary, skipping {run.start['scan_id']}")
print(
f"HDF5 Export does not support streams other than Primary, skipping {run.start['scan_id']}"
)
return False
metadata = get_xdi_run_header(run, header_updates)
print("Got XDI Metadata")
filename = make_filename(folder, metadata, "hdf5")
print(f"Exporting HDF5 to {filename}")

columns, run_data, metadata = get_xdi_normalized_data(run, metadata, omit_array_keys=False)
columns, run_data, metadata = get_xdi_normalized_data(
run, metadata, omit_array_keys=False
)

with h5py.File(filename, "w") as f:
for name, data in zip(columns, run_data):
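
The loop that follows the reflowed call is collapsed in this view, but its header shows each column name paired with its data array. A self-contained sketch of that zip-and-write pattern with h5py, using made-up columns:

```python
import h5py
import numpy as np

# Made-up stand-ins for the (columns, run_data) pair that
# get_xdi_normalized_data() returns.
columns = ["energy", "i0", "tey"]
run_data = [np.linspace(840, 860, 5), np.ones(5), np.zeros(5)]

with h5py.File("demo_export.hdf5", "w") as f:
    for name, data in zip(columns, run_data):
        f.create_dataset(name, data=data)  # one dataset per column
```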
15 changes: 11 additions & 4 deletions export_to_tiled.py
@@ -1,4 +1,4 @@
from export_to_xdi import get_xdi_normalized_data, get_xdi_run_header, make_filename
from export_to_xdi import get_xdi_normalized_data, get_xdi_run_header
import xarray as xr


Expand Down Expand Up @@ -28,20 +28,27 @@ def export_to_tiled(run, header_updates={}):
"""

if "primary" not in run:
print(f"Tiled Export does not support streams other than Primary, skipping {run.start['scan_id']}")
print(
f"Tiled Export does not support streams other than Primary, skipping {run.start['scan_id']}"
)
return False
metadata = get_xdi_run_header(run, header_updates)
print("Got XDI Metadata")

columns, run_data, metadata = get_xdi_normalized_data(run, metadata, omit_array_keys=False)
columns, run_data, metadata = get_xdi_normalized_data(
run, metadata, omit_array_keys=False
)

da_dict = {}
for name, data in zip(columns, run_data):
if name == "rixs":
if len(data) == 3:
counts, mono_grid, energy_grid = data
rixs = xr.DataArray(
counts.T, coords={"emission": energy_grid[:, 0]}, dims=("time", "emission"), name=name
counts.T,
coords={"emission": energy_grid[:, 0]},
dims=("time", "emission"),
name=name,
)
else:
rixs = xr.DataArray(data, dims=("time", "emission"), name=name)
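
For the three-element `rixs` case above, `counts` arrives as (emission, time) and is transposed so that time leads, with the first column of `energy_grid` supplying the emission coordinate. A small sketch with dummy shapes consistent with that transpose:

```python
import numpy as np
import xarray as xr

counts = np.arange(12).reshape(4, 3)  # dummy (emission, time) grid
energy_grid = np.tile(np.linspace(520, 530, 4)[:, None], (1, 3))

rixs = xr.DataArray(
    counts.T,  # -> (time, emission)
    coords={"emission": energy_grid[:, 0]},
    dims=("time", "emission"),
    name="rixs",
)
print(rixs.sizes)  # time: 3, emission: 4
```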