This is the HowToGalaxy tutorial lecture series for PyAutoGalaxy, a Python library for galaxy morphology modeling. Tutorials teach new users how to model galaxy light from first principles.
- `scripts/` — runnable Python tutorial scripts
  - `chapter_1_introduction/` — grids, light profiles, galaxies, data, fitting
  - `chapter_2_modeling/` — non-linear searches, Bayesian inference, galaxy modeling
  - `chapter_3_search_chaining/` — search chaining, prior passing, automated pipelines
  - `chapter_4_pixelizations/` — pixelized galaxy reconstruction, inversions, regularization
  - `chapter_optional/` — alternative non-linear searches and advanced topics
  - `simulators/` — simulator scripts that generate the tutorial datasets at runtime
- `notebooks/` — Jupyter notebook versions of the scripts (generated from `scripts/`, do not edit directly)
- `config/` — PyAutoGalaxy configuration YAML files
- `dataset/` — empty in the repo; tutorial datasets are written here at runtime by the simulator scripts
- `output/` — model-fit results (generated at runtime, not committed)
Scripts are run from the repository root so relative paths to `dataset/` and `output/` resolve correctly:

```
python scripts/chapter_1_introduction/tutorial_1_grids_and_galaxies.py
```

Tutorials in chapters 1 and 2 that need a dataset invoke the relevant script in `scripts/simulators/` via subprocess if the dataset folder does not already exist — there is no manual simulate-then-run step.
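The auto-simulation check can be sketched as below. `ensure_dataset` is a hypothetical helper name (the tutorials inline this pattern rather than share a function), and it assumes each simulator script writes its own dataset folder when run:

```python
import subprocess
import sys
from pathlib import Path


def ensure_dataset(dataset_path: str, simulator_script: str) -> bool:
    """Run the simulator only if the dataset folder does not exist yet.

    Returns True if the simulator was invoked, False if the dataset
    was already present.
    """
    if Path(dataset_path).exists():
        return False
    # Use the same interpreter that runs the tutorial itself.
    subprocess.run([sys.executable, simulator_script], check=True)
    return True
```

Because the check is idempotent, re-running a tutorial never regenerates data that is already on disk.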
Integration testing / fast mode: set `PYAUTO_TEST_MODE=1` to skip non-linear search sampling:

```
PYAUTO_TEST_MODE=1 python scripts/chapter_2_modeling/tutorial_1_non_linear_search.py
```

Fast smoke tests: combine test mode with the skip flags:

```
PYAUTO_TEST_MODE=1 PYAUTO_SKIP_FIT_OUTPUT=1 PYAUTO_SKIP_VISUALIZATION=1 PYAUTO_SKIP_CHECKS=1 PYAUTO_FAST_PLOTS=1 python scripts/chapter_1_introduction/tutorial_5_summary.py
```

Note: `PYAUTO_SMALL_DATASETS` is deliberately not used in HowToGalaxy. Tutorials assume the full-resolution simulated datasets that the simulator scripts produce.
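Inside a tutorial script these switches are plain environment variables. A minimal sketch of reading them (the helper name `flag` is illustrative, not part of the real scripts):

```python
import os


def flag(name: str) -> bool:
    """Return True when a PYAUTO_* switch is set to "1"."""
    return os.environ.get(name) == "1"


# Illustrative use: guard the expensive steps a tutorial performs.
if not flag("PYAUTO_SKIP_VISUALIZATION"):
    pass  # produce plots here
if not flag("PYAUTO_TEST_MODE"):
    pass  # run the full non-linear search here
```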
Codex / sandboxed runs: set writable cache directories so numba and matplotlib do not fail on unwritable home paths:

```
NUMBA_CACHE_DIR=/tmp/numba_cache MPLCONFIGDIR=/tmp/matplotlib python scripts/chapter_1_introduction/tutorial_1_grids_and_galaxies.py
```

Imports used throughout the tutorials:
```
import autofit as af
import autogalaxy as ag
import autogalaxy.plot as aplt
```

Notebooks in `notebooks/` are generated from the `.py` files in `scripts/` using `generate.py` from the PyAutoBuild repo. Always edit the `.py` scripts, never the notebooks directly. The `# %%` marker alternates between code and markdown cells.
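The alternating-cell convention can be illustrated with a minimal splitter. This is a sketch of the `# %%` percent format only, not the actual logic in PyAutoBuild's `generate.py`:

```python
def split_cells(source: str) -> list[str]:
    """Split a percent-format script into cells on "# %%" markers.

    Each marker starts a new cell; consecutive cells alternate
    between markdown (docstring) and code content.
    """
    cells, current = [], []
    for line in source.splitlines():
        if line.strip().startswith("# %%"):
            if current:
                cells.append("\n".join(current))
            current = []
        else:
            current.append(line)
    if current:
        cells.append("\n".join(current))
    return cells
```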
Run from the workspace root:

```
PYTHONPATH=../PyAutoBuild/autobuild python3 ../PyAutoBuild/autobuild/generate.py howtogalaxy
```

The `howtogalaxy` project target needs to be added to `PyAutoBuild/autobuild/config.yaml`. This is a known follow-up from the initial bootstrap.
HowToGalaxy is the teaching companion to autogalaxy_workspace. Many tutorials (particularly in chapters 2–4) point users to autogalaxy_workspace scripts (e.g. `scripts/imaging/modeling.py`, `scripts/guides/...`) as the next destination after the relevant concept has been introduced. Those cross-references use absolute paths like `autogalaxy_workspace/scripts/...` and refer to the separate autogalaxy_workspace repository — not to anything inside HowToGalaxy.
- PyAutoGalaxy source: `../PyAutoGalaxy`
- autogalaxy_workspace: `../autogalaxy_workspace` — main user-facing workspace
- PyAutoBuild: `../PyAutoBuild` — notebook generation and CI/CD tooling
NEVER perform these operations on any repo with a remote:
- `git init` in a directory already tracked by git
- `rm -rf .git && git init`
- Commit with subject "Initial commit", "Fresh start", "Start fresh", "Reset for AI workflow", or any equivalent message on a branch with a remote
- `git push --force` to `main` (or any branch tracked as `origin/HEAD`)
- `git filter-repo` / `git filter-branch` on shared branches
- `git rebase -i` rewriting commits already pushed to a shared branch
If the working tree needs a clean state, the only correct sequence is:
```
git fetch origin
git reset --hard origin/main
git clean -fd
```
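An agent can apply this sequence defensively; the sketch below uses a hypothetical helper name, `safe_reset_to_origin_main`, and the only added behaviour is a guard that refuses to run the destructive commands when no `origin` remote is configured:

```python
import subprocess


def safe_reset_to_origin_main() -> None:
    """Run the approved clean-up sequence, refusing without a remote.

    Hard-resetting is only acceptable against an `origin` remote,
    so bail out if one is not configured in the current directory.
    """
    try:
        probe = subprocess.run(
            ["git", "remote", "get-url", "origin"],
            capture_output=True,
        )
        has_origin = probe.returncode == 0
    except FileNotFoundError:  # git itself is not installed
        has_origin = False
    if not has_origin:
        raise RuntimeError("no 'origin' remote here; refusing to hard-reset")
    for cmd in (
        ["git", "fetch", "origin"],
        ["git", "reset", "--hard", "origin/main"],
        ["git", "clean", "-fd"],
    ):
        subprocess.run(cmd, check=True)
```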
This applies equally to humans, local Claude Code, cloud Claude agents, Codex, and any other agent. The "Initial commit — fresh start for AI workflow" pattern that appeared independently on origin and local for three workspace repos is exactly what this rule prevents — it costs ~40 commits of redundant local work every time it happens.