39 commits
0a60877
add lensing and basic Spherical painting functionality
ASKabalan Sep 23, 2025
efcfcf2
Allow Cosmo Cache deactivation
ASKabalan Sep 23, 2025
8ce412d
add Symplectic ODE terms for fast PM integration
ASKabalan Sep 23, 2025
e21ebe4
add test against glass convergence maps
ASKabalan Sep 23, 2025
1c036ec
format
ASKabalan Sep 23, 2025
6f9b02b
add glass[examples] to requirements-test.txt
ASKabalan Sep 23, 2025
f5507ae
format
ASKabalan Sep 23, 2025
36ce358
format
ASKabalan Sep 24, 2025
e8264cc
add Jax healpy as a dependency
ASKabalan Sep 24, 2025
c692638
Roll back growth.py cache
ASKabalan Oct 7, 2025
8f60be6
Update testing dependencies
ASKabalan Oct 7, 2025
b746fe8
update lensing.py and test_convergence_vs_glass to fix jax-cosmo cach…
ASKabalan Oct 7, 2025
70d661e
Revert the fastpm odes to use Growth functions
ASKabalan Oct 7, 2025
72647d9
Add safe division by zero in lensing.py
ASKabalan Oct 7, 2025
5ca4caa
add save division by zero in spherical.py
ASKabalan Oct 7, 2025
e56ccca
Allow deactivating the usage of growth factors in ode.py in the FastP…
ASKabalan Oct 7, 2025
5336b36
Merge branch 'main' into 41-spherical-lensing
ASKabalan Oct 9, 2025
491e0ae
Fix flat sky lensing issues and update the painting method to use the…
ASKabalan Oct 14, 2025
ada5187
merge remote-tracking branch 'origin/main' into 41-spherical-lensing
ASKabalan Oct 14, 2025
0e41e9a
Merge remote-tracking branch 'origin/main' into 41-spherical-lensing
ASKabalan Oct 14, 2025
0bf8059
Add an analytical function to compute visibility mask from observer p…
ASKabalan Oct 22, 2025
e99502c
Merge branch 'main' into 41-spherical-lensing
ASKabalan Oct 22, 2025
0d4e2f0
format
ASKabalan Oct 24, 2025
a129acd
Update jax.experimental.shard_map to jax.shard_map and bump jax version
ASKabalan Oct 28, 2025
7351fc9
format
ASKabalan Oct 28, 2025
ae7c04e
Update shardmap signature
ASKabalan Oct 28, 2025
d0d29c9
Update JAX dependencies and migrate from experimental shard_map API (…
Copilot Oct 28, 2025
ec1af7d
Merge remote-tracking branch 'origin/update-shardmap-import' into 41-…
ASKabalan Oct 30, 2025
a430f4e
Update test workflow install order
ASKabalan Oct 30, 2025
b9eff44
Merge remote-tracking branch 'origin/update-shardmap-import' into upd…
ASKabalan Oct 30, 2025
9ff2eae
Add Set type import in distributed.py
ASKabalan Oct 30, 2025
81cd08a
Merge remote-tracking branch 'origin/update-shardmap-import' into 41-…
ASKabalan Oct 30, 2025
76b7713
add
ASKabalan Nov 6, 2025
9a672ad
temp commit
ASKabalan Nov 6, 2025
abd66a5
Commit
ASKabalan Nov 25, 2025
1622e9f
update jax-cosmo usage to match update in ASKabalan:add_growth_second…
ASKabalan Nov 27, 2025
21ba798
Simplify spherical painting (flatten later to use more devices)
ASKabalan Dec 15, 2025
3aecf0b
small fix
ASKabalan Dec 15, 2025
7017b33
remove print statements used for debugging
ASKabalan Dec 15, 2025
7 changes: 3 additions & 4 deletions .github/workflows/tests.yml
@@ -51,12 +51,11 @@ jobs:
python -m pip install --upgrade pip setuptools wheel
# Install JAX first as it's a key dependency
pip install jax
# Install build dependencies
pip install setuptools cython mpi4py
# Install test requirements with no-build-isolation for faster builds
pip install -r requirements-test.txt --no-build-isolation
# Install packages with test dependencies
pip install -e .[test]
# Install test requirements with no-build-isolation for PFFT
pip install -r requirements-test.txt --no-build-isolation
# Install packages with test dependencies
Copilot AI Nov 4, 2025

The comment 'Install packages with test dependencies' appears twice (lines 54 and 62) with different actual operations, which is confusing. The second occurrence at line 62 should be removed, as it is followed by a simple echo statement rather than another pip install command.

Suggested change:
# Install packages with test dependencies
echo "numpy version installed:"
python -c "import numpy; print(numpy.__version__)"

21 changes: 21 additions & 0 deletions external/LICENSE
@@ -0,0 +1,21 @@
MIT License

Copyright (c) 2025 Francois Lanusse

Permission is hereby granted, free of charge, to any person obtaining a copy
of this software and associated documentation files (the "Software"), to deal
in the Software without restriction, including without limitation the rights
to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
copies of the Software, and to permit persons to whom the Software is
furnished to do so, subject to the following conditions:

The above copyright notice and this permission notice shall be included in all
copies or substantial portions of the Software.

THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
SOFTWARE.
75 changes: 75 additions & 0 deletions external/README.md
@@ -0,0 +1,75 @@
# Ray Tracing Tests

A comparison of gravitational lensing ray-tracing methods: the Born approximation versus full ray tracing including lens-lens coupling effects. We use the Dorian code (https://arxiv.org/abs/2406.08540) and the GLASS code (https://arxiv.org/abs/2302.01942) for these comparisons.

## Overview

This repository tests the impact of different ray tracing strategies on weak lensing forward modeling. We compare:

- **Born Approximation**: Traditional approach that accumulates the lensing signal along the unperturbed line of sight
- **Full Ray-tracing**: Complete treatment including lens-lens coupling and deflection accumulation

The analysis uses high-resolution N-body simulation data to generate convergence maps and compare their statistical properties through power spectrum analysis.
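The Born approximation amounts to a lensing-efficiency-weighted sum of the per-shell convergence maps. A minimal NumPy sketch of that weighting (shell maps, distances, and the source distance below are made-up placeholders; the real pipeline uses HEALPix maps from the simulation):

```python
import numpy as np

# Hypothetical inputs: 3 mass shells of 12 pixels each (placeholder values),
# their comoving distances d_k, and the source comoving distance d_s.
rng = np.random.default_rng(0)
shells = rng.normal(size=(3, 12))                    # convergence per shell, per pixel
shell_distances = np.array([500.0, 1000.0, 1500.0])  # Mpc
d_s = 2000.0                                         # Mpc, source plane

# Born approximation: sum each shell weighted by the lensing
# efficiency (d_s - d_k) / d_s, as in the ray-tracing code.
weights = (d_s - shell_distances) / d_s
kappa_born = np.sum(weights[:, None] * shells, axis=0)
print(kappa_born.shape)  # one convergence value per pixel
```

Full ray tracing replaces this single weighted sum with a plane-by-plane propagation of ray positions and distortion matrices, which is what the Dorian code below implements.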

## Quick Start

### 1. Setup Environment

```bash
# Clone the repository
git clone <repository-url>
cd RayTracingTests

# Create and activate virtual environment
python -m venv raytracing_env
source raytracing_env/bin/activate # Linux/Mac
# or
raytracing_env\Scripts\activate # Windows

# Install dependencies
pip install -r requirements.txt
```

### 2. Download Simulation Data

Download the first simulation from the Gower Street Simulations:

```bash
# Create data directory
mkdir -p data/sim00001

# Download simulation data (example for first simulation)
# Visit: http://star.ucl.ac.uk/GowerStreetSims/simulations/
# Download the lightcone files and place them in data/sim00001/

# Required files structure:
# data/sim00001/
# ├── control.par # Simulation parameters
# ├── z_values.txt # Redshift information
# ├── run.00024.lightcone.npy # Lightcone data files
# ├── run.00025.lightcone.npy
# └── ... # Additional lightcone files
```

**Note**: The simulation data files are large (~GB each). Download only the lightcone files you need for your redshift range of interest.

### 3. Run Analysis

```bash
# Generate convergence maps using both methods
python run_dorian_full.py

# View results in Jupyter notebook
jupyter notebook notebooks/analysis.ipynb
```
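The comparison itself reduces to summary statistics of the two convergence maps. A minimal NumPy sketch (the maps below are synthetic placeholders; the notebook uses the actual Born and ray-traced outputs):

```python
import numpy as np

# Placeholder convergence maps; in practice these come from the Born
# and full ray-tracing runs above.
rng = np.random.default_rng(1)
kappa_born = rng.normal(size=10000)
kappa_raytraced = kappa_born + 0.01 * rng.normal(size=10000)  # small perturbation

# Pixel-level agreement: RMS of the difference and map-to-map correlation.
diff = kappa_raytraced - kappa_born
rms_diff = np.sqrt(np.mean(diff**2))
corr = np.corrcoef(kappa_born, kappa_raytraced)[0, 1]
print(f"RMS difference: {rms_diff:.4f}, correlation: {corr:.4f}")
```

The power spectrum comparison in the notebook plays the same role in harmonic space, exposing the scale dependence of the Born vs. ray-traced differences.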

## Results Notebook

**👉 See [`results.ipynb`](results.ipynb) for complete analysis and visualizations**

The results notebook contains:
- **Convergence Maps**: Full-sky and zoomed visualizations at nside=256 resolution
- **Difference Analysis**: Detailed comparison between Born and ray-traced methods
- **Power Spectrum Comparison**: Statistical analysis showing scale-dependent differences
- **Theory Validation**: Comparison with theoretical predictions

220 changes: 220 additions & 0 deletions external/dorian_raytracing.py
@@ -0,0 +1,220 @@
from dorian.constants import M_sun_cgs, Mpc2cm, c_cgs, G_cgs
from dorian.cosmology import d_c
from dorian.parallel_transport import get_rotation_angle_array, rotate_tensor_array
from dorian.misc import print_logo
from dorian.raytracing import get_val, get_val_nufft, check_theta_poles
import numpy as np
import healpy as hp
import time, h5py


def raytrace(
shells: list,
z_s: float,
omega_m: float,
omega_l: float,
nside: int,
shell_redshifts: list,
shell_distances: list,
interp: str = "ngp",
lmax: int = 0,
parallel_transport: bool = True,
nthreads: int = 1,
):
"""Routine for performing a ray tracing simulation.

Parameters
----------
shells : list
List of HEALPix maps for each shell (mass/density maps).
z_s : float
Source redshift.
omega_m : float
Matter density parameter.
omega_l : float
Dark energy density parameter.
nside : int
HEALPix NSIDE parameter.
shell_redshifts : list
Redshift of each shell.
shell_distances : list
Comoving distance to each shell (Mpc).
interp : str
Interpolation scheme to use: one of "ngp", "bilinear", or "nufft".
lmax : int
Maximum multipole for SHT computations (0 means use the default of 3 * nside).
parallel_transport : bool
Whether to parallel-transport the distortion matrix to the updated
angular position of the ray at each plane. Setting this to True is
recommended.
nthreads : int
Number of OpenMP threads to use. Currently only used with nufft
interpolation.
"""
t_begin = time.time()

print_logo()

# Define factor to give physical units to the convergence
kappa_fac = (1e10 * M_sun_cgs) * (1 / Mpc2cm) * 4 * np.pi * G_cgs / (c_cgs**2)

# Filter shells that contribute to lensing (z < z_s)
contributing_shells = []
for i, (shell, z_k, d_k) in enumerate(zip(shells, shell_redshifts, shell_distances)):
if z_k < z_s:
contributing_shells.append({
'shell_data': shell,
'redshift': z_k,
'distance': d_k,
'index': i
})

print(f"Using {len(contributing_shells)} shells out of {len(shells)} total")

if len(contributing_shells) == 0:
raise ValueError(f"No shells found with z < z_s ({z_s}). Check your shell redshifts.")

print(f"Shell redshift range: {contributing_shells[0]['redshift']:.3f} to {contributing_shells[-1]['redshift']:.3f}")

# Set up map parameters
npix = hp.nside2npix(nside)

# Compute comoving distance of the source
d_s = d_c(z=z_s, Omega_M=omega_m, Omega_L=omega_l)

# Angular position of the rays when they reach the observer
# We shoot one ray for every pixel center
theta = np.array(hp.pix2ang(nside, np.arange(npix)))
nrays = theta.shape[1] # total number of rays
# Angular position of the rays, dimensions are:
# [k-th plane (previous, current), rows of beta (theta, phi), ray index]
beta = np.zeros([2, 2, nrays])
# Distortion matrix, dimensions are:
# [k-th plane (previous, current), rows of A, columns of A, ray index]
A = np.zeros([2, 2, 2, nrays])
# Convergence field in the Born approximation
kappa_born = np.zeros([nrays])

# Initialize quantities for the first lens plane
beta[0] = theta
beta[1] = theta
for i in range(2):
for j in range(2):
A[0][i][j] = 1 if i == j else 0
A[1][i][j] = 1 if i == j else 0
sh_start = 0

# Define some constants to be used later in the SHT
if lmax == 0:
lmax = 3 * nside
ell = np.arange(0, lmax + 1)

# Iterate over the lens planes
for k in range(sh_start, len(contributing_shells)):
t0 = time.time()
# Get shell information
shell_info = contributing_shells[k]
z_k = shell_info['redshift']
d_k = shell_info['distance']
shell_data = shell_info['shell_data']

# Load the mass and compute Sigma ((1e10 M_sun/h)/sr)
Sigma = shell_data / (4 * np.pi / npix)
# physical smoothing of 100 kpc
# Sigma = hp.smoothing(Sigma, sigma=np.radians(1 / 60))
Sigma_mean = np.mean(Sigma)

# Compute convergence at the single lens plane
kappa = kappa_fac * (1 + z_k) * (1 / d_k) * (Sigma - Sigma_mean)

# Compute quantities in spherical harmonics domain
kappa_lm = hp.map2alm(kappa, pol=False, lmax=lmax)
# Guard the ell = 0 monopole, where both harmonic filters are singular
# (it carries no lensing signal, so zero it explicitly).
ell_safe = np.where(ell > 0, ell, 1).astype(float)
alpha_lm = hp.almxfl(kappa_lm, np.where(ell > 0, -2 / np.sqrt(ell_safe * (ell_safe + 1)), 0.0))
f_l = np.where(ell > 0, -np.sqrt((ell_safe + 2.0) * (ell_safe - 1.0) / (ell_safe * (ell_safe + 1.0))), 0.0)
g_lm_E = hp.almxfl(kappa_lm, f_l)

if interp in ["ngp", "bilinear"]:
alpha = hp.alm2map_spin(
[alpha_lm, np.zeros_like(alpha_lm)], nside=nside, spin=1, lmax=lmax
)
alpha = get_val(alpha, beta[1][0], beta[1][1], interp=interp)

g1, g2 = hp.alm2map_spin(
[g_lm_E, np.zeros_like(g_lm_E)], nside=nside, spin=2, lmax=lmax
)
U = np.zeros([2, 2, nrays])
U[0][0] = kappa + g1
U[1][0] = g2
U[0][1] = U[1][0]
U[1][1] = kappa - g1
U[0, 0], U[0, 1], U[1, 1] = get_val(
[U[0, 0], U[0, 1], U[1, 1]], beta[1][0], beta[1][1], interp=interp
)
U[1, 0] = U[0, 1]

elif interp == "nufft":
alpha = get_val_nufft(
alpha_lm, beta[1][0], beta[1][1], spin=1, lmax=lmax, nthreads=nthreads
)
g1, g2 = get_val_nufft(
g_lm_E, beta[1][0], beta[1][1], spin=2, lmax=lmax, nthreads=nthreads
)
kappa_nufft = get_val_nufft(
kappa_lm, beta[1][0], beta[1][1], spin=0, lmax=lmax, nthreads=nthreads
)[0]

U = np.zeros([2, 2, nrays])
U[0][0] = kappa_nufft + g1
U[1][0] = g2
U[0][1] = U[1][0]
U[1][1] = kappa_nufft - g1

# Compute distance of previous and next shell
d_km1 = 0 if k==0 else contributing_shells[k-1]['distance']
d_kp1 = d_s if k == len(contributing_shells) - 1 else contributing_shells[k+1]['distance']
# Compute distance-weighting pre-factors
fac1 = d_k/d_kp1 * (d_kp1-d_km1)/(d_k-d_km1)
fac2 = (d_kp1-d_k)/d_kp1

for i in range(2):
beta[0][i] = (1 - fac1) * beta[0][i] + fac1 * beta[1][i] - fac2 * alpha[i]

# Update angular positions
beta[[0, 1], ...] = beta[[1, 0], ...]

# Make sure that all theta of beta[1] are in range [0, pi]
# (only the poles need to be checked)
check_theta_poles(beta[1])
# Make sure that all phi of beta[1] are in range [0, 2*pi]
beta[1][1] %= 2 * np.pi

for i in range(2):
for j in range(2):
A[0][i][j] = (
(1 - fac1) * A[0][i][j]
+ fac1 * A[1][i][j]
- fac2 * (U[i][0] * A[1][0][j] + U[i][1] * A[1][1][j])
)

# Update distortion matrix
A[[0, 1], ...] = A[[1, 0], ...]

# Parallel transport distortion matrix
if parallel_transport:

cospsi, sinpsi = get_rotation_angle_array(
beta[0][0][:], beta[0][1][:], beta[1][0][:], beta[1][1][:]
)
A[0, :, :, :] = rotate_tensor_array(A[0, :, :, :], cospsi, sinpsi)
A[1, :, :, :] = rotate_tensor_array(A[1, :, :, :], cospsi, sinpsi)

# Compute Born approximation convergence
kappa_born += ((d_s - d_k) / d_s) * kappa

# Report timing and return results
print("*" * 73, flush=True)
print(f"Total time: {round(time.time() - t_begin)} s")
print("Ray tracing finished, bye.")
print("*" * 73, flush=True)
return kappa_born, A[1], beta[1], theta
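As a standalone sanity check on the multi-plane recurrence used in the loop above: with zero deflection (alpha = 0) and unchanged previous/current positions, the update beta_new = (1 - fac1) * beta_prev + fac1 * beta_cur - fac2 * alpha must leave the rays on straight lines. A NumPy sketch (the distances are made-up placeholders; the fac1/fac2 formulas are copied from the loop body):

```python
import numpy as np

# Made-up comoving distances for three consecutive lens planes and a source.
d_km1, d_k, d_kp1 = 400.0, 800.0, 1200.0

# Distance-weighting pre-factors, exactly as in raytrace()
fac1 = d_k / d_kp1 * (d_kp1 - d_km1) / (d_k - d_km1)
fac2 = (d_kp1 - d_k) / d_kp1

beta_prev = np.array([0.1, 0.2, 0.3])  # ray positions at plane k-1
beta_cur = beta_prev.copy()            # unperturbed: identical at plane k
alpha = np.zeros(3)                    # zero deflection at this plane

beta_next = (1 - fac1) * beta_prev + fac1 * beta_cur - fac2 * alpha
# With no deflection and equal previous/current positions, rays go straight:
print(np.allclose(beta_next, beta_cur))  # True
```

Note that fac1 can exceed 1 (here it is 4/3); only the combination (1 - fac1) * beta_prev + fac1 * beta_cur is constrained, reducing to the identity when the two positions coincide.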
