In [1]:
%pylab inline
#plt.style.use('dark_background')
from IPython.display import Image
Populating the interactive namespace from numpy and matplotlib

Py-SPHViewer

by Alejandro Benitez-Llambay

In [2]:
Image("imgs/frame.png",width=200, height=200)
Out[2]:

Py-SPHViewer is a Python framework for rendering particle distributions using the Smoothed Particle Hydrodynamics (SPH) scheme.

  • Parallel and fast: expensive calculations are performed in C using OpenMP
  • Interactive: can be used from a Python shell, IPython, or a Jupyter notebook
  • Code agnostic: does not care where the particle distribution comes from
  • User friendly: modules (or tools) can be built on top of its core routines

Getting started

In [4]:
import h5py
from sphviewer.tools import QuickView, cmaps

with h5py.File('data/darkmatter_box.h5py','r') as f:
    pdrk = f['PartType1/Coordinates'].value.astype(np.float64)
    
qv = QuickView(pdrk, r='infinity', plot=False)
In [6]:
fig = plt.figure(1, figsize=(15,7))
ax1 = fig.add_subplot(121)
ax2 = fig.add_subplot(122)

ax1.plot(pdrk[:,0], pdrk[:,1], 'k.', ms=0.1)
ax2.imshow(qv.get_image(), extent=qv.get_extent(),
           origin='lower', cmap=cmaps.twilight(), 
           vmin=-0.5)
Out[6]:
<matplotlib.image.AxesImage at 0x7f5b0f37c668>

Behind the Scenes

The SPH approximation

Under the SPH approximation, the projected density field at any pixel is the sum of the contributions of all particles whose “projected” kernel overlaps that pixel.

$\rho(x) = \displaystyle\sum_j m_j W(|x-x_j|, h_j)$

$\Sigma(x,y) = \displaystyle\int \rho(x,y,z') dz' = \displaystyle\sum_j m_j \displaystyle\int W(|x-x_j|, h_j) dz'$
$\Sigma(x,y) = \displaystyle\sum_j m_j \tilde W(R_{j}, h_j) $

We only need to decide which kernel to use. For simplicity, we use a kernel with an analytic integral.

$W(r,h) = \alpha \left [ 1 - \left ( \displaystyle\frac{r}{h}\right )^2 \right ]$
$\tilde W(R,h) = \alpha \displaystyle\frac{2}{3} \left [ 1 - \left ( \displaystyle\frac{R}{h}\right )^2 \right ]^{3/2}$

where $\alpha$ is the normalization constant of the kernel.
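To make the bookkeeping concrete, here is a minimal, pure-NumPy sketch of the sum the C core evaluates for a single pixel. The prefactor of the projected kernel is chosen here so that it integrates to unity over the disc of radius $h$; this is an illustrative choice and may differ from the coefficients used internally by Py-SPHViewer.

In [ ]:
def projected_kernel(R, h):
    # ~ [1 - (R/h)^2]^(3/2) inside R < h, zero outside.
    # 5/(2 pi h^2) normalizes the 2D integral to 1 (illustrative choice).
    u = np.clip(1.0 - (R / h)**2, 0.0, None)
    return 5.0 / (2.0 * np.pi * h**2) * u**1.5

def surface_density(px, py, pos, mass, hsml):
    # Brute-force Sigma(px, py): sum the projected-kernel contribution
    # of every particle at the pixel centre (px, py).
    R = np.hypot(pos[:, 0] - px, pos[:, 1] - py)
    return np.sum(mass * projected_kernel(R, hsml))

# toy data: 1000 random particles of equal mass in a unit box
rng = np.random.default_rng(0)
toy_pos  = rng.random((1000, 3))
toy_mass = np.full(1000, 1.0 / 1000)
toy_hsml = np.full(1000, 0.05)
print(surface_density(0.5, 0.5, toy_pos, toy_mass, toy_hsml))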

The API

Py-SPHViewer is structured around four main classes:

  • PARTICLES -> relies on parallel code
  • CAMERA
  • SCENE -> relies heavily on C parallel code
  • RENDER -> relies heavily on C parallel code

The interface between Python and C is implemented using the Python C API.

Py-SPHViewer scales very well on shared-memory systems. For example, on a cosma7 login node we obtain:

In [7]:
Image("imgs/strong_scaling.png", width=400, height=400)
Out[7]:
In [8]:
Image("imgs/scaling_npixels.png", width=400, height=400)
Out[8]:
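
Since the rendering loops are parallelized with OpenMP, the number of threads can be chosen per run. A minimal sketch, assuming the C extensions honour the standard OMP_NUM_THREADS environment variable (set it before the first render so the thread pool is created with the requested size):

In [ ]:
import os
os.environ['OMP_NUM_THREADS'] = '8'   # request 8 OpenMP threads for the C core

import sphviewer as sph
from sphviewer.tools import QuickView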

Using the API

Rendering EAGLE L012N0376 (53 x 10^6 particles)

In [9]:
# We first load the particle data
pos  = np.load('data/pdrk_L0012N0376.npy') #positions
h    = np.load('data/hsml_L0012N0376.npy') #smoothing lengths
sigma= np.load('data/sigma_L0012N0376.npy') #local velocity dispersion
mass = np.ones(len(pos)) #masses

hlittle = 0.6777
lbox    = 12.5 * hlittle

Define the Particles class

In [10]:
import sphviewer as sph
%time Particles = sph.Particles(pos, mass, h)
CPU times: user 18 µs, sys: 1e+03 ns, total: 19 µs
Wall time: 24.1 µs

Define the Camera class (Regular users do not need to do this)

In [11]:
%time Camera = sph.Camera(r='infinity', t=0, p=0, roll=0, xsize=500, ysize=500, x=lbox/2., y=lbox/2., z=lbox/2., extent=[-lbox/2.,lbox/2.,-lbox/2.,lbox/2.])
CPU times: user 28 µs, sys: 2 µs, total: 30 µs
Wall time: 36.5 µs
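
The parameters stored by the camera can be inspected with get_params(), the same accessor used below (through Scene.Camera) to shift the image extent. The exact set of keys returned is an assumption here, but it is expected to mirror the keyword arguments passed above:

In [ ]:
print(Camera.get_params())   # e.g. x, y, z, r, t, p, roll, xsize, ysize, extent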

Define the Scene class (This depends on the Particles and the Camera)

In [12]:
%time Scene = sph.Scene(Particles, Camera)
CPU times: user 3.56 s, sys: 1.32 s, total: 4.88 s
Wall time: 1.82 s

Rendering the Scene (Define the Render class)

In [13]:
%time Render = sph.Render(Scene)
CPU times: user 47.4 s, sys: 153 ms, total: 47.5 s
Wall time: 12.8 s
In [14]:
extent = Render.get_extent()

lpixel = (extent[1]-extent[0])/500.  # physical size of one pixel (the image is 500 pixels across)

fig = plt.figure(1,figsize=(10,10))
# dividing by the pixel area converts mass per pixel into a surface density
imshow(np.log10(Render.get_image())-np.log10(lpixel**2),
       extent=[extent[0]+Scene.Camera.get_params()['x'],
               extent[1]+Scene.Camera.get_params()['x'],
               extent[2]+Scene.Camera.get_params()['y'],
               extent[3]+Scene.Camera.get_params()['y']],
       vmin=4, vmax=9, cmap=cmaps.twilight(), origin='lower')
Out[14]:
<matplotlib.image.AxesImage at 0x7f5b0f277ba8>

We can focus on a particular object

In [15]:
Scene.update_camera(x=2.319, y=4.481, z=5.141, extent=[-1,1,-1,1])
Render = sph.Render(Scene)
In [16]:
extent = Render.get_extent()
lpixel  = (extent[1]-extent[0])/500.

fig = plt.figure(1,figsize=(10,10))
imshow(np.log10(Render.get_image())-np.log10(lpixel**2), extent=Render.get_extent(),
      vmin=4, vmax=9, cmap=cmaps.twilight(), origin='lower')
Out[16]:
<matplotlib.image.AxesImage at 0x7f5b0f209fd0>
In [17]:
Scene.update_camera(x=2.319, y=4.481, z=5.141, extent=[-0.1,0.1,-0.1,0.1])
Render = sph.Render(Scene)
In [18]:
extent = Render.get_extent()
lpixel  = (extent[1]-extent[0])/500.

fig = plt.figure(1,figsize=(10,10))
imshow(np.log10(Render.get_image())-np.log10(lpixel**2), extent=Render.get_extent(),
      vmin=5, vmax=9, cmap=cmaps.twilight(), origin='lower')
Out[18]:
<matplotlib.image.AxesImage at 0x7f5b0f1220f0>

Doing fancy stuff with the camera

In [19]:
from IPython.lib.display import YouTubeVideo
YouTubeVideo('9ju8Ukfz_ug', width=1280/2, height=720/2)
Out[19]:

Measuring other properties traced by the particles, e.g., the velocity dispersion

$\langle F(x,y)\rangle = \displaystyle\frac{\displaystyle\int F({\bf x}) \ \rho({\bf x}) \ dz'}{\displaystyle\int \rho({\bf x}) \ dz'}$

$\langle F(x,y)\rangle = \displaystyle\frac{\displaystyle\sum_j m_j \ F_j \ \tilde W(R_j,h_j)}{\displaystyle\sum_j m_j \ \tilde W(R_j,h_j)} = \displaystyle\frac{I_{1}}{I_{2}}$

In [20]:
qv1 = QuickView(pos, mass, h, r='infinity', plot=False, logscale=False)
qv2 = QuickView(pos, mass*sigma, h, r='infinity', plot=False, logscale=False)

img1 = qv1.get_image()
img2 = qv2.get_image()

extent = qv2.get_extent()
del qv1, qv2   # release the QuickView objects; only their images are needed below
In [21]:
fig = plt.figure(1,figsize=(15,7))
ax1 = fig.add_subplot(121)
ax2 = fig.add_subplot(122)

ax1.imshow(np.log10(img1), extent=extent,
           vmin=0, vmax=5, cmap=cmaps.twilight())
ax2.imshow(np.log10(img2/img1), extent=extent,
           vmin=0, vmax=3, cmap=cmaps.twilight())
Out[21]:
<matplotlib.image.AxesImage at 0x7f5b08f41048>

We can do more fancy stuff

For example, calculate the gas mean density-weighted temperature in various slices of the EAGLE (50 Mpc)^3 simulation, run with 752^3 particles

In [22]:
from IPython.lib.display import YouTubeVideo
YouTubeVideo('s0cLeRTfYHU', width=1280/2, height=720/2)
Out[22]:
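
Each slice in the video amounts to the same two-pass trick used above for the velocity dispersion, applied to the gas particles falling inside the slice. A minimal sketch for a single slice, where pgas, mgas, tgas and hsml are hypothetical arrays holding the gas coordinates, masses, temperatures and smoothing lengths of the snapshot:

In [ ]:
# density-weighted temperature of the gas between z0 and z0 + dz
z0, dz = 20.0, 5.0
inslice = (pgas[:, 2] >= z0) & (pgas[:, 2] < z0 + dz)

qv_m  = QuickView(pgas[inslice], mgas[inslice], hsml[inslice],
                  r='infinity', plot=False, logscale=False)
qv_mt = QuickView(pgas[inslice], mgas[inslice] * tgas[inslice], hsml[inslice],
                  r='infinity', plot=False, logscale=False)

# <T> = I1 / I2 per pixel; pixels containing no gas give nan
tmap = qv_mt.get_image() / qv_m.get_image()
imshow(np.log10(tmap), extent=qv_m.get_extent(), origin='lower', cmap=cmaps.twilight())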

We can also use fancy visualization techniques that encode multiple properties in the same image, such as density and velocity dispersion

In [23]:
from sphviewer.tools import hsv_tools
from sphviewer_cbar import colorbar

fig = plt.figure(1,figsize=(10,10))

hmin = 180./360.
hmax = 1.0

rgb = hsv_tools.image_from_hsv(h=np.log10(img2/img1), v=np.log10(img1), hmin=hmin, 
                               hmax=hmax, img_hmin=0.5, img_hmax=2, img_vmin=1, img_vmax=4)
imshow(rgb)

mycbar = colorbar()
mycbar.make_colorbar( pos=[0.95,0.15], size=0.2, extent=[1,4,0.5,2], hmin=hmin, hmax=hmax)

We also have tools for creating smooth camera trajectories

In [24]:
from IPython.lib.display import YouTubeVideo
YouTubeVideo('4ZIgVbNlDU4', width=1280/2, height=720/2)
Out[24]:
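
A rough version of the same effect can be produced by hand with the classes used above: sweep a camera angle, re-render, and write one frame per step. A minimal sketch, assuming update_camera accepts the same keyword arguments as Camera and leaves unspecified parameters unchanged (here only the azimuthal angle p varies):

In [ ]:
import os
os.makedirs('frames', exist_ok=True)

for i, p in enumerate(np.linspace(0, 360, 90, endpoint=False)):
    Scene.update_camera(p=p)                     # rotate the viewing direction
    frame = sph.Render(Scene).get_image()
    plt.imsave('frames/frame_%03d.png' % i,
               np.log10(frame + 1e-10),          # avoid log of empty pixels
               vmin=4, vmax=9, cmap=cmaps.twilight())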

Tools for rendering stars

In [25]:
from IPython.display import Image
Image("imgs/auriga.png", width=600, height=600)
Out[25]:

Py-SPHViewer with MPI: Producing images is an embarrassingly parallel problem.

In [ ]:
from mpi4py import MPI
import glob
import h5py
import matplotlib.pyplot as plt
from sphviewer.tools import QuickView
import numpy as np

comm = MPI.COMM_WORLD
rank = comm.Get_rank()
size = comm.Get_size()

path = '/cosma5/data/Eagle/ScienceRuns/Planck1/L0050N0752/PE/Z0p10_W1p00_E_3p0_0p3_ALPHA1p0e6_rhogas1_reposlim3p0soft_100mgas/data/snapshot_028_z000p000/'

files = glob.glob(path+'*.hdf5')

local_imgs = []

for i in files[rank::size]:
    print(rank, i)
    # Reading files
    #-----------------------
    with h5py.File(i,'r') as f:
        pgas   = f['PartType0/Coordinates'].value
        mgas   = f['PartType0/Mass'].value*1e10
        rhogas = f['PartType0/Density'].value*1e10
        tgas   = f['PartType0/Temperature'].value
        hsml   = f['PartType0/SmoothingLength'].value
        lbox   = f['Header'].attrs['BoxSize']
    #-----------------------

    #-----------------------
    #Constructing local images
    xc = yc = zc = lbox/2.0
    extent = [-lbox/2.0,lbox/2.,-lbox/2.,lbox/2.]
    r = 'infinity'
    xsize = ysize = 500
    
    qv = QuickView(pgas, mgas, hsml, r=r, extent=extent, 
                   x=xc, y=yc, z=zc, plot=False, 
                   xsize=xsize, ysize=ysize, logscale=False)
    
    local_imgs.append(qv.get_image())
    #-----------------------

#---------------------
#Collecting local images for final result and saving
all_images = comm.gather(local_imgs, root=0)

final_image = np.zeros([ysize, xsize], dtype=np.float32)

if(rank == 0):
    for imgs in all_images:
        for img in imgs:
            print(np.shape(img))
            final_image += img

    np.save('EAGLE_0050N0752.npy', final_image)
#---------------------