Proper scaling for non-cubic data #35

Open
npyoung opened this issue Jul 14, 2017 · 25 comments


@npyoung

npyoung commented Jul 14, 2017

All volshow volumes seem to render their data into a cube. This causes distortion when the data provided lies in a box. For example, I am using ipyvolume to render fluorescence images of cultured cells. The volumes are typically hundreds of microns wide, but only tens of microns deep. Rendering in ipyvolume stretches these significantly along the depth direction.

Note that this is related to #11, but not the same problem. Even if you resample your data to isotropic, if the overall volume is not a perfect cube, it gets distorted.

A quick and dirty fix is to pad the data with zeros so that it is cubic, but the overextended cube outline is retained. Instead, the axis outlines should show whatever box the data came in.

Perhaps a nice fix would be a way to control the render volume dimensions in addition to the canvas dimensions that can already be controlled through figure.
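
A minimal sketch of that zero-padding workaround (the shapes are just examples; note the padded cube still inflates the outline, which is exactly the drawback described above):

import numpy as np

# Example stack: wide but shallow (a smaller stand-in for a 1024x1024x100 volume).
data = np.random.random((256, 256, 32))

# Quick-and-dirty workaround: zero-pad each axis up to the largest dimension
# so the array becomes cubic and is no longer stretched by the renderer.
target = max(data.shape)
pad = [(0, target - s) for s in data.shape]
cube = np.pad(data, pad, mode='constant', constant_values=0)
print(cube.shape)  # (256, 256, 256)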

@chrisjsewell
Contributor

Hey, if I understand what you are after, I think I have already implemented something like your 'quick and dirty' fix (although it wasn't so quick lol) here:
https://stackoverflow.com/questions/44762445/change-basis-of-3d-numpy-array-fractional-to-cartesian-coordinates

and this is what the output looks like:
https://chrisjsewell.github.io/ipyvolume_si_ech3

@maartenbreddels
Collaborator

@npyoung following up on our discussion on gitter: you are right that pylab.xlim (and ylim, zlim) are where you should start, I think, to see how this gets fed back to the JavaScript side. But on top of that we'd need something similar to matplotlib's extent argument, which would give the limits of the volume that needs to be rendered.

On the other hand, it seems that @chrisjsewell did it a bit more generally by using a matrix transformation, right? I think it would be nice to have both, so you could do:

pylab.volshow(data, extent=[[xmin, xmax], [ymin, ymax], [zmin, zmax]])
# or
pylab.volshow(data, extent=some_3by3_or_4by4_matrix)

@chrisjsewell looks really nice 👍, I think I see some rendering artifacts from the volume rendering; maybe that needs a closer look someday to make it look even better!

@npyoung
Author

npyoung commented Jul 21, 2017

@chrisjsewell's solution is not quite what I'm thinking of. My problem isn't rendering arbitrary affine-transformed cubes; that's more general than what I'm talking about. I'm just talking about cubes vs. boxes, i.e. equal vs. unequal data dimensions. The example @chrisjsewell provides still renders into a 15x15x15 cube.

My situation is that I have a M x N x P data array where M != N != P. Currently this data gets rendered into a Q x Q x Q cube, stretching the data out anisotropically.

extent isn't quite what I'm trying to emulate either. If I understand it correctly, it's basically a convenience for setting all the limits at once. What I'm really after is more like aspect from mpl, e.g.:

import matplotlib.pyplot as plt
plt.imshow(data2d, aspect='equal')

@maartenbreddels from what I can tell, messing with {x,y,z}lim just affects how much of the data is shown in the 3D cube, but the data still renders into a perfect cube. So I could use the lim functions to crop my data down to a cube, but since my arrays are typically 1024x1024x100, I'd be missing out on a lot of my data.

If it helps, check out this guy's question. He has the opposite problem: his data is non-square, but represents a square, so he needs to set aspect=<not 1> because the default behavior of imshow is to assume isotropic pixels. volshow in this package makes the opposite assumption: that your data always represents a cube even if your data dimensions are not cubic.

@maartenbreddels
Collaborator

I do think it is the solution. If the 'bounding box' of the volume rendering is transformed into, say, a flat box in which it renders an ndarray of 100x100x10, it would give you something flat, as you want.
We could (maybe even by default) use the aspect keyword to set the xlim/ylim/zlim or the transformation matrix to proper default values.
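
As a rough sketch of that idea (set_equal_aspect is a hypothetical helper, not an existing ipyvolume function): giving every axis the same span means equal data units map to equal screen lengths inside the cubic viewport, which is effectively aspect='equal'.

import ipyvolume as ipv

def set_equal_aspect(x, y, z):
    # Centre each axis on its data and give all three the same half-width,
    # so the cubic viewport shows the same number of data units per axis.
    half = max(a.max() - a.min() for a in (x, y, z)) / 2
    for a, lim in zip((x, y, z), (ipv.xlim, ipv.ylim, ipv.zlim)):
        c = (a.min() + a.max()) / 2
        lim(c - half, c + half)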

@choldgraf
Contributor

I'm +1 on an aspect keyword. For an example of how the current behavior is a little wonky, here's a little gif I put together to recreate it:

[gif: demo]

What I'd expect is that the ratio of the data itself (e.g. the axes of the brain) would not change as I rotate the image.

@eldad-a

eldad-a commented Jan 8, 2018

@maartenbreddels please allow me to join those who praise ipyvolume and its devs, as it addresses much-needed features.

Like @npyoung and @choldgraf, I find aspect (the way it is used in matplotlib) to be the most intuitive syntax.

@maartenbreddels
Collaborator

Hi Eldad,

thanks for your kind words! Yes, I recognise this is quite a limitation. The volume rendering needs a bit of refactoring anyway; I plan to attack this all in one go.

cheers,

Maarten

@maartenbreddels
Collaborator

A new option in ipyvolume is the extent option. It is not the same as aspect, but I think you can get the same effect; see the attached image. It contains a non-square volume, and it shows as non-square. If extent is None, the viewport will be used (the previous behaviour).

[image]
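
A minimal usage sketch, assuming extent takes per-axis (min, max) pairs as in the earlier proposal and that the array is indexed (z, y, x) as with image stacks (shape and bounds here are illustrative):

import numpy as np
import ipyvolume as ipv

# A flat, non-cubic example volume: 10 z-slices of 100 x 100 voxels.
data = np.random.random((10, 100, 100))
# extent declares the bounds of the box the volume is rendered into,
# as [[xmin, xmax], [ymin, ymax], [zmin, zmax]].
ipv.volshow(data, extent=[[0, 100], [0, 100], [0, 10]])
ipv.show()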

@choldgraf
Contributor

this is awesome!!!

@maartenbreddels
Collaborator

Let me know if this is still limited, but I think this should cover all the uses described here. I'll wait a while before closing this; reopen in the future if needed.

@choldgraf
Contributor

cc @kingjr who was playing around with ipyvolume for brain rendering

when I come up for air I will take a stab at this... @maartenbreddels maybe you wanna play around with a viz w/ me when we're both in Paris in 2 weeks? :-)

@maartenbreddels
Collaborator

Absolutely, I'll arrive late Sunday, leaving sometime Wednesday.

@mpu-creare
Contributor

Could you add the same functionality to the ScatterView as you've implemented here: e53f70b?

@maartenbreddels
Collaborator

What is the use case? Scatter plots have x, y, z positions, which are relative to xlim/ylim/zlim, right?

@mpu-creare
Contributor

mpu-creare commented Mar 21, 2018

Right now when I render a 1x2x3 box, the aspect ratio is off. The axes drawn are for a 1x1x1 cube, and all the points are rendered into that cube.
[image]

The denser portion of that image is actually a 1x1x1 cube of points.

@wdeback

wdeback commented Jul 4, 2018

I have the same issue as @mpu-creare: the aspect ratio of non-cubic scatter plots seems incorrect.

To illustrate the problem, I create randomly distributed points in a flat, non-cubic box of size (100x100x5). I would expect the z axis to be 20x shorter than the x and y axes. However, when rendered with quickscatter, all axes show up the same size, regardless of setting zlim or squarelim.

@maartenbreddels: Any suggestion? Great project, by the way 👍!

import numpy as np
import ipyvolume as ipv

# generate random points (0,1)
n_points = 100
points = np.random.random(size=(n_points,3))

# transform points to flat (noncubic) box: (100x100x5)
points *= (100,100,5) #<<<<<
print(f'Example points:\n{points[:4]}')

x, y, z = points.T
#ipv.xlim(0,x.max())
#ipv.ylim(0,y.max())
#ipv.zlim(0,z.max())
#ipv.squarelim()

ipv.quickscatter(x,y,z,marker='box')
ipv.show()

# animate rotation 
def set_view(fig, framenr, fraction):
    fig.anglex = 0.0
    fig.angley = fraction*np.pi

ipv.movie('non-cubic.gif', set_view, fps=16, 
          frames=96, endpoint=False)

[gif: non-cubic]

@maartenbreddels
Collaborator

Hi Walker,

do the xlim/ylim/zlim + squarelim after quickscatter (better to use scatter, btw; I plan to deprecate quickscatter), and it should work. Maybe this is a bug... so keeping it open. Feel free to dig into it!

cheers,

Maarten

@wdeback

wdeback commented Jul 4, 2018

do the xlim/ylim/zlim+squarelim after scatter

Yes, thanks! Now works as expected:

import numpy as np
import ipyvolume as ipv

# generate random points (0,1)
n_points = 100
points = np.random.random(size=(n_points,3))

# transform points to flat (noncubic) box: (100x100x5)
points *= (100,100,5)
print(f'Example points:\n{points[:4]}')

x, y, z = points.T
ipv.scatter(x,y,z,marker='box')

# put x/y/zlim and squarelim after scatter
ipv.xlim(0,x.max()) 
ipv.ylim(0,y.max())
ipv.zlim(0,z.max())
ipv.squarelim()

ipv.show()

[gif: non-cubic-2]

@wdeback

wdeback commented Jul 4, 2018

BTW: Is there an option to fit the box tightly to the x/y/zlim?

@maartenbreddels
Collaborator

No, not at the moment. Maybe that would be easier to support once more integration with bqplot/ipyscales happens.

@buzmakov
Contributor

Hi, it seems that the latest ipyvolume build on conda-forge (https://anaconda.org/conda-forge/ipyvolume/0.4.6/download/noarch/ipyvolume-0.4.6-py_1.tar.bz2) does not include the extent parameter for ipv.volshow.
ipv.__version__ == '0.4.6', but there is no extent in the source code. Maybe this is because the "Latest release" on GitHub (https://github.com/maartenbreddels/ipyvolume/releases/latest) is linked to version 0.4.5?

Thank you for nice tool!

@maartenbreddels
Collaborator

maartenbreddels commented Sep 12, 2018

Thanks!
No, I'm preparing a new release (0.5), which should have that feature.

@buzmakov
Contributor

buzmakov commented Sep 12, 2018 via email

@maartenbreddels
Collaborator

You can try installing it from pip, and let me know if it works:

pip install --pre ipyvolume

@mdylan2

mdylan2 commented Jan 26, 2020

I have a question about the extent functionality. I have a (640, 640, 36) [pixel] 3D image, assuming the last number in the tuple is the z-slice count. The actual size of my image is (318, 318, 68) [microns]. How would the extent functionality interpolate between the slices? Is it linear interpolation or nearest-neighbor interpolation?
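
For concreteness, a sketch of how that physical size could be declared via extent (the array is a random stand-in, assuming (z, y, x) ordering; extent only sets the bounds of the rendered box, it does not resample the array itself, so any smoothing between slices comes from how the volume is sampled during rendering):

import numpy as np
import ipyvolume as ipv

# Stand-in for a 640 x 640 x 36 voxel image covering 318 x 318 x 68 microns.
data = np.random.random((36, 640, 640))
ipv.volshow(data, extent=[[0, 318], [0, 318], [0, 68]])
ipv.show()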
