
AstroPT: a Large Observation (foundation) Model for astronomy 🔭

Welcome to our simple repository for training astronomical large observation models. This repository began its life as Andrej Karpathy's nanoGPT, and has been altered so that it is usable for astronomical observation data. Within train.py you will find a ~300-line boilerplate training loop and within model.py you will find a ~300-line GPT model definition with an MLP tokeniser and a regressive loss.

Check out the UniverseTBD Discord for updates: discord.gg/MNEVegvfJq

Read the docs here: astropt.readthedocs.io

How does AstroPT work?

AstroPT is an autoregressive transformer under the hood.

Just as a language model predicts the next word in a sentence, AstroPT processes sequences of astronomical data chunks and predicts what comes next.

The intuition here is that this next-token-prediction task requires the model to internalise some understanding of the physical processes underlying the training data.

This is just like how a text GPT needs to have some knowledge of geography to guess a country's capital given a description of that country, or some knowledge of coding to write compilable Fortran.

Below we can see this principle applied to a galaxy image, where we split the image into chunks and pass them into an AstroPT model:

[Figure: a galaxy image split into chunks, which the AstroPT architecture consumes as a token sequence]
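In code, the idea looks roughly like this. Note this is a toy sketch rather than the actual AstroPT implementation: the chunk size, token width, and the stand-in linear model are all made up for illustration:

import torch
import torch.nn.functional as F

# Toy illustration (not the real AstroPT code): carve a galaxy image into
# patch "tokens" and regress each next chunk from the chunks before it.
image = torch.randn(3, 64, 64)                       # fake galaxy image (C, H, W)
patches = image.unfold(1, 16, 16).unfold(2, 16, 16)  # split into 16x16 chunks
tokens = patches.permute(1, 2, 0, 3, 4).reshape(-1, 3 * 16 * 16)  # (16, 768)

# Stand-in for the causal transformer: any autoregressive sequence model fits.
model = torch.nn.Linear(768, 768)

inputs, targets = tokens[:-1], tokens[1:]  # shift by one: predict the next chunk
loss = F.mse_loss(model(inputs), targets)  # regressive loss on pixel values
loss.backward()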

Because next-token prediction makes no assumptions about what the tokens represent, we can apply this task across many modalities.

Check out our work on Euclid data for an example, where we chain galaxy image tokens with spectral energy distribution (SED) data and pass them into a single, unified AstroPT model.
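A hedged sketch of this chaining idea (all shapes here are made up for illustration; the real embedding and ordering logic lives in the AstroPT codebase):

import torch

# Embed image chunks and SED measurements to the same width, then chain
# them into one token stream that a single model consumes autoregressively.
image_tokens = torch.randn(16, 768)  # 16 embedded galaxy image chunks
sed_tokens = torch.randn(8, 768)     # 8 embedded SED data points
sequence = torch.cat([image_tokens, sed_tokens], dim=0)  # one unified stream
print(sequence.shape)  # torch.Size([24, 768])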

I just want to run it! 🗣️

Okay I hear you! First you need to install the model:

Install

You can install via pip from PyPI:

pip install astropt

Or, to install from source, clone the repository and sync its dependencies with uv:

git clone https://github.com/Smith42/astroPT.git
cd astroPT
uv sync
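If the install succeeded, the package should import cleanly. A quick smoke test (prefix with uv run if you installed via uv):

python -c "import astropt"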

Load a pre-trained model

To load and run a pre-trained AstroPT model from HuggingFace you can use the load_astropt function:

from astropt.model_utils import load_astropt

model = load_astropt(
    repo_id="smith42/astropt_v2.0",
    path="astropt/095M",
    weights_filename="ckpt.pt",
)
model = model.to("cuda")

where repo_id is the HuggingFace repository ID, and path is the path within the repository that contains the AstroPT model checkpoint.
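Inference then follows the usual PyTorch pattern. The snippet below is only a hedged sketch: the input shape and the forward call are assumptions for illustration, and the real tokenisation and embedding routines live in the scripts directory (see scripts/linear_probe.py):

import torch

# All shapes and the call pattern here are assumptions for illustration;
# the actual preprocessing pipeline lives in scripts/.
galaxy_tokens = torch.randn(1, 16, 768, device="cuda")  # hypothetical input

model.eval()
with torch.no_grad():
    out = model(galaxy_tokens)  # assumed forward signature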

Pre-trained models

Below are some pre-trained models you can load with the code snippet above. Please make sure that you are using the correct version of AstroPT to load these!

| Survey | Modalities | AstroPT version | Model weights | Dataset | Paper |
|---|---|---|---|---|---|
| DESI Legacy Survey | JPG galaxy imagery | v1.0.0 | AstroPT | Galaxies Dataset | arXiv:2405.14930 |
| Euclid | FITS VIS, NISP galaxy imagery and SED data | v1.0.2 | AstroPT-Euclid | Euclid Training Dataset | arXiv:2503.15312 |
| DESI Legacy Survey | JPG galaxy imagery | v2.0.5 | AstroPT v2.0 | Galaxies Dataset v2.0 | arXiv:2405.14930 |

Scripts for pre-training and processing data

Check out the scripts directory for all the scripts we used to produce the results in these papers; scripts/train.py is an example boilerplate script for pre-training your own AstroPT, and config contains example user configurations for pre-training.

scripts/linear_probe.py has an example script for inferring embeddings from a pre-trained model and running a finetuning routine on them 🌝.

And finally scripts/finetune.py has an example LoRA finetune routine.

Contributors

  • Ryan Roberts: 💻 🤔 🖋
  • Mike Smith: 💻 🤔 🖋 🔣
  • mhuertascompany: 🤔 🖋
  • Malgorzata Siudek: 🤔 🖋 💻 🔣
  • gimarso: 🤔 💻
  • Víctor Alonso: 🐛
Add your contributions
