STIR FAQ
Frequently asked questions about how to use STIR
All info on publications is on the "publications" section of the STIR web-site, currently at http://stir.sourceforge.net/links/publications.shtml
You can use list_projdata_info. Run it without arguments first to see what options are available.
STIR comes with very basic utilities called manip_projdata and display_projdata that can display sinograms or viewgrams, but this is really only useful for a quick check (or to see if STIR did read your data as expected). Your best option is to use an external display program.
To do this, use the extract_segments utility. This will write each segment in the projection data to a separate file as a 3D volume, currently in a version of the Interfile format. See the #Image related FAQs section for how to display a volume.
Please check the STIR glossary, available via the STIR website for information.
What is the difference between default bin size and effective central bin size (e.g. in the .hs Interfile header)?
The "default bin size" is only used when doing arc-correction ("geometric correction" in GE language). This is normally only used when you use FBP (or when you use arc-correction with correct_projdata).
The "effective central bin size" is really only used by STIR for arc-corrected data. It is then the bin-size that will be used by the projectors, so should be the "true" bin size. (Of course, if you've asked STIR to do the arc-correction, it will normally be equal to the "default bin size", but you can ask STIR to use a different bin size for arc-correction).
For non-arccorrected data, the "effective central bin size" is usually a bit larger than the "default", but this is manufacturer dependent. In any case, for non-arccorrected data, its value is ignored by STIR.
You can use list_image_info. There's also list_image_values. manip_image also allows you to get info per plane. As always, run these without arguments first to see what options are available.
STIR comes with a very basic utility called manip_image that displays image slices, but it is really only useful for a quick check (or to see if STIR did read your image as expected). stir_write_pgm can be used to write a slice as a single PGM bitmap. However, normally, your best option is to use an external display program.
By default, STIR uses a version of the Interfile format, although only a subset of keywords is implemented (see below for output in other file formats, and the STIR web-site for more links on Interfile).
Amide and xmedcon read STIR .hv files without trouble as long as the data-offset is zero (e.g. files which are written by STIR). Other packages might ignore the scale factor (but STIR by default writes as floats with scale factor 1). And other packages might insist on using the official Interfile 3.3 standard. The .ahv files written by STIR are closer to that, but they have a tweak to let Analyze (from the Mayo) read them correctly (as Analyze misinterprets the z-spacing). Open one of the .ahv files in your text editor and read the comments.
STIR currently uses a home-grown way to specify the image origin. No other program supports this convention as far as we know (as Interfile currently does not have relevant keywords). STIR currently completely ignores patient orientation etc. So if you have an image of the same object written by a different program, and the display program tries to interpret coordinate systems, it's unlikely the 2 objects will be displayed in the same location.
If the file is a 3D image (no timing information), the "import raw" feature of ImageJ (or any other application with the ability to read binary data) will do the job. The proper size information can be found in the respective header file, and the precision should be set to "float - 32 bit" (do check the relevant Interfile keyword).
Reconstruction programs and generate_image have a parameter for setting the output file format which can be set in the .par file. See the example generate_image.par, and others in your examples/samples directory.
Other programs can be passed an explicit output file format parameter file. Probably most convenient is stir_math, e.g.
stir_math --output-format stir_math_ITK_output_file_format.par output.nii input_filename_in_any_file_format
will copy the data and write it in Nifti (if you built STIR with ITK support). See examples/samples/stir_math_ITK_output_file_format.par
Details on orientation etc are given in the STIR Developer's guide. Here we attempt to give some more info on the origin and how this works for images. This is somewhat complicated because the relation with the scanner needs to be considered.
STIR coordinates are currently related to the scanner, i.e. not to the patient (as in DICOM). A peculiarity is that STIR coordinates are ordered (z,y,x) (with z along the scanner axis, y vertical and x horizontal).
Currently, STIR does not support rotated coordinate systems. Therefore, we only need to give the location of the origin. Inside STIR, this origin can be shifted around using the offset. However, we will first assume that you did not do this ("zero offset").
For an image with zero offset, the origin is assumed to coincide with the centre of the first plane.
Let's say you want to use generate_image to create a cylinder in the centre of the (3D) image (and we use zero offsets for the image and an odd number of pixels in x,y, see below). Then we need to compute the STIR coordinates of the centre. Given that (0,0,0) is at the centre of the first plane, the centre of the last plane is at ((num_planes-1)*z_voxel_size,0,0). Therefore, the middle of the image is at ((num_planes-1)*z_voxel_size/2,0,0).
There's a small complication. When you have an odd number of voxels in x (or y), the 0 coordinate is indeed in the centre of the image. But if you have an even number, it's actually half a pixel off. This convention is probably different from other programs, therefore
We recommend to use an odd number of voxels in all 3 directions.
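The arithmetic above can be sketched in a few lines (plain Python for illustration; this is not a STIR API, and the image dimensions are hypothetical):

```python
def centre_z(num_planes, z_voxel_size):
    # (0,0,0) is the centre of the first plane, so the image centre lies
    # halfway towards the centre of the last plane along z
    return (num_planes - 1) * z_voxel_size / 2.0

def xy_centre_offset(num_voxels, voxel_size):
    # odd number of voxels: coordinate 0 coincides with the image centre;
    # even number: the centre is half a voxel away from coordinate 0
    return 0.0 if num_voxels % 2 == 1 else voxel_size / 2.0

# e.g. 63 planes of 2 mm: the centre is at z = 62 mm, i.e. (62, 0, 0)
print(centre_z(63, 2.0))
# with 128 voxels of 2 mm in x, coordinate 0 misses the centre by 1 mm
print(xy_centre_offset(128, 2.0))
```

With an odd number of voxels the x/y offset vanishes, which is exactly why the recommendation above is to use odd image sizes.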
For an existing image, you can use the list_image_info utility to get some information on geometry.
When developing code in STIR, you want to use functions such as DiscretisedDensity::get_physical_coordinates_for_indices() to find out what the location of the centre of a particular voxel is.
This coordinate system is currently never exposed to the user, only to developers. Functions like ProjDataInfo::get_m() use a coordinate system where (0,0,0) is the centre of the (gantry of the) scanner.
There is unfortunately also a set of obsolete functions (which will be removed) that use a coordinate system where (0,0,0) is the centre of the first ring of the scanner (e.g. find_cartesian_coordinates_of_detection).
For all "segments" (see the STIR glossary), the data is assumed to be centred with respect to the scanner (taking their average ring difference into account).
When you are forward projecting an image or reconstructing projection data, you need to know what the relation between the 2 conventions is. Unfortunately, STIR does not support different "bed positions" yet. For backwards compatibility, the following convention is used.
For images with zero offset, the middle of the scanner is assumed to coincide with the middle plane of the image.
So, for the generate_image example above, the cylinder would be located in the centre of the scanner.
WARNING: the combination of these conventions means that if you change the number of planes in the image, you also have to change the "origin" of the shape such that it would forward project into the same projection data.
You can see this clearly from the formulas used in the generate_image example above.
However, this convention is confusing and therefore might be changed in the future. Together with a current limitation of the STIR projectors, this leads us to the following:
We recommend that you use images which have 2*num_rings-1 planes with z-voxel-size equal to ring_spacing/2.
It is possible to create images where a different origin is used. How you do this depends on the file format, but for Interfile, this can be done by changing the first pixel offset (mm) keywords. This is of course not recommended.
The only STIR utility that creates images with non-zero offset is zoom_image. It is designed such that if you specify zero offsets and all image sizes are odd, the object will remain in the same physical location compared to the scanner. Its usage is
zoom_image <output filename> <input filename> \
sizexy [zoomxy [offset_in_mm_x [offset_in_mm_y \
[sizez [zoomz [offset_in_mm_z]]]]]]
If you need to zoom an image (e.g. for estimate_scatter), it is therefore highly recommended to use zoom_image (as opposed to trying to figure out yourself where the objects will go).
As a developer, you can create images with non-zero offset using the VoxelsOnCartesianGrid constructor with a non-zero origin. This is however not recommended for future compatibility. The stir::zoom_image function should be safe.
ERROR: DataSymmetriesForBins_PET_CartesianGrid can currently only support z-grid spacing equal to the ring spacing of the scanner divided by an integer. Sorry
You can see this error when using forward projection of an image (e.g. when computing attenuation correction factors), or backprojection to an image. It happens because the projectors try to save memory (and time) by using a symmetry in the axial direction. To avoid this error, you have to use a z-spacing for the image which is half the ring-spacing of the scanner. (Although the symmetries could be used in other cases as well, there seem to be some problems with the projectors in such cases in the current version of STIR).
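A quick way to check whether a given image z-spacing satisfies this constraint (a plain Python sketch, not STIR code; the spacings used are hypothetical):

```python
def z_spacing_supported(ring_spacing, z_voxel_size, tol=1e-6):
    # the axial symmetry used by the projectors requires
    # ring_spacing / z_voxel_size to be (very close to) an integer
    ratio = ring_spacing / z_voxel_size
    return abs(ratio - round(ratio)) < tol

print(z_spacing_supported(4.25, 2.125))  # half the ring spacing: True
print(z_spacing_supported(4.25, 2.0))    # arbitrary spacing: False
```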
Unless you're building STIR on some exotic or very recent system, the building process should be straightforward (after reading the STIR_UsersGuide!).
Before building, on Ubuntu and Debian, you should do the following

apt-get install gcc g++ make libncurses-dev libX11-dev libboost-dev tcsh

(prefix with sudo on Ubuntu). The User's Guide contains details for other systems.
"make all" (or indeed "make") will only compile. "make install" will copy executables and a few scripts to ${INSTALL_PREFIX}/bin.
The "install" target "depends" on "all", i.e. "make install" will implicitly update "all" first. That's a complicated way to say you only need "install".
By the way, with current STIR Makefiles, "make test" will first check if the library is compiled and up-to-date, and if not, build it. In contrast, for the Makefiles generated by 'CMake', "make test" will run the compiled test executables without checking if they are up-to-date.
A good practice on multi-core systems is to add the option "-jX", where "X" is the number of the system's cores (e.g. "make -j4"). This option will speed up the process dramatically.
For instance
gcc -O3 -DNDEBUG -ffast-math -I/opt/STIR/include -DHAVE_SYSTEM_GETOPT -DSTIR_SIMPLE_BITMAPS
-DSC_XWINDOWS -o opt/display/gen.o -MMD -MP -c display/gen.c ;
In file included from display/gen.c:24:0:
display/gen.h:114:48: fatal error: curses.h: No such file or directory
compilation terminated.
This message is telling you that you need the curses.h file (it is used in STIR when selecting GRAPHICS=X). See #Prerequisites (and the User's Guide) for installing extra packages.
You do have boost, but are getting a compilation error as follows
./include/boost/config/compiler/gcc.hpp:92:7: warning: #warning "Unknown
compiler version - please run the configure tests and report the results"
You need to upgrade your boost version. Your compiler version is more recent than the boost version that you have.
STIR 2.1 (and earlier) cannot be compiled with very recent compilers such as gcc 4.6.x or CLang++. This problem is fixed in STIR 2.2. See the stir-users mailing list for more info. Example error:
/opt/STIR/include/stir/numerics/BSplines_weights.inl:344:50:
error: uninitialized const ‘stir::BSpline::near_n_BSpline_function’
[-fpermissive]
/opt/stir_2.1/include/stir/numerics/BSplines_weights.inl:78:9: note:
‘const class stir::BSpline::BSplineFunction<(stir::BSpline::BSplineType)0u,
double>’ has no user-provided default constructor
These errors occur late in the build process, as all "object files" are compiled first before the executables are built by linking the object files and the libraries. The errors would say something like a missing library.
You are probably missing some development libraries. See the installation guide.
Including the ecat library on some Linux distributions leads to linking errors such as:
/opt/ecat/libecat.a(rts_cmd.o): In function `initAcqTcpClient': rts_cmd.c:(.text+0x9c):
undefined reference to `clnttcp_create'
rts_cmd.c:(.text+0xd2): undefined reference to `clnt_pcreateerror'
/opt/ecat/libecat.a(rts_cmd.o): In function `doAcsAcqCommand':rts_cmd.c:(.text+0x16b): undefined reference to `clnt_perror'
/opt/ecat/libecat.a(rts_cmd.o): In function `rts_rmhd':rts_cmd.c:(.text+0x1ad): undefined reference to `xdr_wrapstring'
/opt/ecat/libecat.a(rts_cmd.o): In function `rts_wmhd':rts_cmd.c:(.text+0x264): undefined reference to `xdr_int'
/opt/ecat/libecat.a(rts_cmd.o): In function `rts_wshd':rts_cmd.c:(.text+0x3c8): undefined reference to `xdr_int'
/opt/ecat/libecat.a(rts_cmd.o): In function `rts_wdat':rts_cmd.c:(.text+0x4e1): undefined reference to `xdr_int'
rts_cmd.c:(.text+0x512): undefined reference to `cfree'
/opt/ecat/libecat.a(rts_cmd.o): In function `rtsWblk': rts_cmd.c:(.text+0x650): undefined reference to `xdr_int'
/opt/ecat/libecat.a(matrix_xdr.o): In function `xdr_MATRIX_FUNCTIONS': matrix_xdr.c:(.text+0x5): undefined reference to `xdr_enum'
/opt/ecat/libecat.a(matrix_xdr.o): In function `xdr_XMAIN_HEAD': matrix_xdr.c:(.text+0x28): undefined reference to `xdr_opaque'
matrix_xdr.c:(.text+0x44): undefined reference to `xdr_opaque'
matrix_xdr.c:(.text+0x54): undefined reference to `xdr_short'
matrix_xdr.c:(.text+0x64): undefined reference to `xdr_short'
matrix_xdr.c:(.text+0x74): undefined reference to `xdr_short'
This can be fixed in two ways: either remove rts support from ecat (recommended), or link STIR to libtirpc.so (installed independently on your system).
- For the first option, one would have to apply the patch to the ecat directory (before compilation), by following this procedure:
- Move the patch file to the directory of the original folder
- This folder will get modified, so make a backup of it somewhere
- Run:
patch -s -p0 < stir_ecat_no_rts_no_xdr.patch
At this point, the original folder contains the modified content.
- The second option is achieved by applying the following modification to buildblock/CMakeLists.txt or IO/CMakeLists.txt (one is enough):
if (LLN_FOUND)
target_link_libraries(buildblock ${LLN_LIBRARIES} -ltirpc)
endif()
You get many messages like
run_tests.sh: 138: run_tests.sh: generate_image: not found
You need to install all STIR executables into a directory and either add this to your path (see #Why can't I run any of the STIR executables and/or scripts?), or pass this directory to the test script (see recon_test_pack/README.txt).
Most likely this is due to a bug in the "incremental backprojector" which crops up depending on compiler/optimisation settings etc. An extensive discussion of this bug appeared on the STIR users mailing list.
For example, a known system that has this problem is Ubuntu 64bit using gcc 4.5 in 64 bit mode (and -O3).
There are several ways to check:
- run recon_test_pack/run_tests.sh with the --nointbp flag to remove tests that use this projector
- display the sensitivity image that you're getting. Most likely you'll see a hot spot in the middle or some 45 degree lines
You could always compile as 32-bit even on 64-bit Linux of course (use EXTRA_CFLAGS=-m32 EXTRA_LINKFLAGS=-m32, and don't forget to either change DEST or make clean). However, we do not recommend using the incremental backprojector for the iterative algorithms anyway, as there is no corresponding forward projector.
For STIR 2.2 (and earlier) there is a known problem with the test for OSSPS in the recon_test_pack on cygwin with gcc 4.5.3. OSSPS is fine, but the test is too optimistic.
If it (only) fails on the ECAT6 data, this is because of a known problem in the LLN Matrix library that appears on more modern systems (e.g. 64 bit). Search the stir-users mailing list for some info. If you really need ECAT6 support, maybe you can disable optimisation when compiling the LLN Matrix library.
STIR comes as a set of executables and scripts. For normal usage, these need to be in the path of your shell. If this isn't the case, you will see something like
$ generate_image mygreatimage.par
-bash: generate_image: command not found
The exact message you would see depends on your environment.
The recommended way to solve this is to install the STIR executables and scripts into a directory and then add this directory to your path. For example on Linux (or Cygwin) when using the handcrafted Makefiles:
$ cd /where/ever/is/STIR
$ mkdir /where/ever/you/want/it
$ make install INSTALL_PREFIX=/where/ever/you/want/it
$ PATH=$PATH:/where/ever/you/want/it/bin
$ LD_LIBRARY_PATH=$LD_LIBRARY_PATH:/where/ever/you/want/it/lib
$ export LD_LIBRARY_PATH
When using CMake, you normally set the installation location when running CMake, but you still need to do make install. When using csh as your shell, you need to replace the last 3 lines with csh syntax:
$ set path=( $path /where/ever/you/want/it/bin )
$ setenv LD_LIBRARY_PATH $LD_LIBRARY_PATH:/where/ever/you/want/it/lib
If you get 'permission denied' messages when creating the installation directory (or when doing the installation), it is probably because you chose a location for which you do not have write permission (such as /usr/local). Unless you want to make STIR available to all users on a machine, it is recommended to use a subdirectory of your home directory.
You might want to make sure that STIR is always in your path. To do that, you should copy the last 3 lines into your startup file. Depending on your default shell, this will be a file called ~/.bash_profile, ~/.bashrc, ~/.profile or ~/.cshrc on Unix/Linux/Cygwin. On Windows you will need to set the Path environment variable via the System Control Panel.
STIR uses a set of plug-ins to read (or write) data. Which file formats are supported is therefore determined when building STIR with CMake. You might get an error such as the following (this one is from reading a ROOT listmode file):
Available input file formats:
ECAT966
ECAT962
ERROR: no file format found that can read file 'TB_header.hroot'
terminate called after throwing an instance of 'std::__cxx11::basic_string<char, std::char_traits<char>, std::allocator<char> >'
Aborted (core dumped)
The list of supported file formats depends of course on what you are reading, i.e. it is different for images (hint: installing STIR with ITK expands IO capabilities dramatically), projection data and listmode data (hint: you might want ROOT for GATE output, and HDF5 for GE data). See also the User's Guide.
Note that all utilities/Python functions etc use the same set of plug-ins. Therefore, if you can read a listmode file with lm_to_projdata, you will be able to read it with list_lm_events and any reconstruction that uses listmode data as input.
Note that since STIR 3.0, there is an example script to do your own analytic simulations with STIR in examples/PET_simulation.
STIR comes with utilities to read SimSET sinograms, check the SimSET subdirectory and its README.txt
See the howto on this wiki
- if you have a ROOT file: STIR now comes with ROOT support out of the box, at least if CMake found ROOT during configuration. You need to create a small text header pointing to the ROOT file to tell STIR more about the simulation (e.g. scanner characteristics). Unfortunately, this needs to be done manually. There are 2 example .hroot files in examples/samples. After that, lm_to_projdata and any list mode reconstruction should work.
- if you are still using STIR 3.0, try http://sourceforge.net/projects/gatepet2stir/. See also Nikolaos' presentation at the STIR 2013 User's Meeting
- if you output as raw, try the STIR 3.0 utility conv_GATE_raw_ECAT_projdata_to_interfile
- if you output as ECAT7, run ifheaders_for_ecat7 (you will need to have built STIR with support for the LLN ECAT Matrix library), and then manually edit the .hs file to reflect the correct scanner size. A lot of the info should be alright, but the scanner radius and scanner name are wrong (set the latter to unknown). The radius you should know as you ran the simulation! Block info etc is currently ignored by STIR so don't bother filling it in.
- Almost working code exists to read LMF data via STIR. You can find it in the `experimental/listmode` directory. Sadly, nobody has ever finalised this.
Most likely, you are running out of memory. Check with your system monitor (on Linux etc, you can use htop or top). There are 2 solutions:
- buy more memory! (recommended)
- if you are using a projector that uses a matrix (such as the default ray-tracing
matrix), the default setting is to "cache" some rows of the matrix in memory. This
can become too large though. Currently, the only solution is to switch caching off.
When using a .par file, add the line

disable caching := 1

in the settings for the projector.
See previous entry.
When you need more memory than is "physically" available, your system is likely swapping memory to/from disk. That lets it continue, but it is also very slow. See also the entries above.
This is unfortunately a very hard question to answer. Look for some advice on the mailing lists. Note that the STIR defaults are not optimal, and neither are the sample .par files (and most definitely not the .par files in the recon_test_pack). A few things that seem pretty clear:
For STIR 2.x, the default for the iterative reconstructions was to use a ray tracing forward projector and an incremental interpolating backprojector. This turned out to be a bad choice as these projectors are not matched, which creates problems (even if the incremental backprojector does work on your system, see the known problems). It's probably best to use matched projectors, and the easiest way to do that is to use a "matrix". The fastest is the ray tracing matrix. This is the default projector since STIR 3.0.
However, this is still not ideal as this defaults to using only 1 ray per bin in the sinogram. This can create discretisation artefacts in high count situations. For the iterative algorithms, you should therefore probably use something like this in your .par file:
projector pair type:= Matrix
Projector Pair Using Matrix Parameters:=
Matrix type:= Ray Tracing
Ray tracing matrix parameters:=
number of rays in tangential direction to trace for each bin:= 10
End Ray tracing matrix parameters:=
End Projector Pair Using Matrix Parameters:=
For the analytic algorithms, using a ray tracer as backprojector can create (other) discretisation artefacts. You can alleviate this by using more rays, but in 3D, the ray tracing matrix doesn't take a Jacobian into account (as you don't have to for iterative reconstructions). Either stick to the default interpolating backprojector, or use its matrix equivalent. See the User's Guide.
If you can afford to wait: 1. Otherwise as small as possible. However, because the STIR projectors use symmetries (unless you switch them off) the number of subsets needs to divide number_of_views/x where x is 4,2 or 1 depending on which x gives you an integer number in the division (when using all symmetries). For example, if you have 210 views, x=2, so you could use 1,3,5,7 and their multiples that still divide 105.
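The divisibility rule above can be sketched as follows (plain Python, not a STIR API):

```python
def valid_subset_numbers(num_views):
    # with all symmetries enabled, the number of subsets must divide
    # num_views / x, where x is the largest of 4, 2, 1 that divides
    # num_views exactly
    for x in (4, 2, 1):
        if num_views % x == 0:
            base = num_views // x
            break
    return [s for s in range(1, base + 1) if base % s == 0]

# the 210-view example above: x = 2, so the divisors of 105
print(valid_subset_numbers(210))  # [1, 3, 5, 7, 15, 21, 35, 105]
```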
Kris Thielemans doesn't like "early stopping". Post-filtering is straightforward (but you need to iterate longer than you think). You can use inter-iteration filtering or penalised reconstruction, but be aware that these create non-uniform resolution/regularisation. This is well-known in the literature, and can be fixed in STIR, but the relevant code is not yet available. (Remember also that OSL can diverge in noisy cases when the regularisation is too high).
In any case, you should normally not mix different regularisation methods (unless you know what you're doing of course).
This is entirely prior and data dependent. Think about it this way. The objective function is something like
Log-Poisson-likelihood + penalisation_factor * prior
where the prior is image-dependent. The log-Poisson is proportional to the projection data, and so is the (STIR-) reconstructed image.
The QP prior is essentially (image_differences)^2, therefore the prior is proportional to (projection_data)^2. This means that if you want to have the QP to achieve count-independent smoothing, you have to make the penalisation factor inversely-proportional to the counts.
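The scaling argument can be made concrete with a small sketch (plain Python; the reference values are hypothetical, not STIR defaults):

```python
def rescaled_penalisation_factor(beta_ref, counts_ref, counts_new):
    # the log-likelihood term scales roughly with the counts, while the
    # QP prior term scales with counts^2; keeping their balance fixed
    # therefore requires beta proportional to 1/counts
    return beta_ref * counts_ref / counts_new

# a scan with 10x the counts needs a 10x smaller penalisation factor
print(rescaled_penalisation_factor(0.5, 1e6, 1e7))
```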
One way to fix this is to use the "uniform resolution weights for the QP" from Fessler et al. That's currently not distributed with STIR however. So, I'm afraid you're down to tuning things for your data.
Although there is not really a prior function for MRP, its gradient is independent of the image-scale, and the gradient of the log-likelihood is independent of counts as well (after enough iterations). Therefore, the effect of MRP is not count dependent like QP. However, it does depend on sensitivity/attenuation etc. You can use the "multiplicative form" of the update (see the User's Guide and http://dx.doi.org/10.1109/NSSMIC.2001.1008688) to avoid this (in which case 1 is a high penalty), although the "additive form" of MRP is in principle better as it has higher influence where the sensitivity is lowest (note that MRP with large penalty is hampered by convergence problems when using OSL).
(see the glossary for what this is. GE tends to use "geometric correction" for the same concept)
STIR gets info about the arc-correction from the data. For instance, for Interfile you could see
applied corrections:={arc correction}
STIR reconstruction algorithms will automatically handle this for you, or tell you that they cannot. For instance, FBP (2D and 3D) will arc-correct first if still necessary. The iterative algorithms will not precorrect the data, but it is up to the projectors to adjust. However, the (default) incremental interpolating backprojector cannot handle non-arccorrected data. It will say something like
ERROR: BackProjectorByBinUsingInterpolation:
can only handle arc-corrected data (cast to ProjDataInfoCylindricalArcCorr)!
Change the projector.
You might not need to. If you specify the scanner geometry in your Interfile header, STIR will handle it ok.
For instance, you could use create_projdata_template, pick a scanner that might be somewhat similar to yours, and then edit the generated Interfile header. The scanner part of the header takes the same information as Scanner::set_params() (take care of changes between mm and cm). Obviously, it contains more information such as the actual number of views, ring differences etc that is supposed to be in your data. (Check the STIR Glossary as well for some info.) Once you have this template, you should be good to go.
Alternatively, you will have to modify the Scanner class. Marc Chamberland gave a good explanation of this on the stir-users list.
Note that STIR (at least 2.2 and earlier) ignores view_offset and block information.
This answer refers to current and future STIR developers. Discussion on this issue was conducted on the listmode reconstruction branch on GitHub.
For compilers that support C++11, STIR uses std::unique_ptr<T>, while otherwise std::auto_ptr<T> is used. In the latter case, the constructor std::shared_ptr(std::auto_ptr<T>& ap) works fine, allowing us to write something like:

this->set_proj_data_info_sptr(ProjDataInfo::construct_proj_data_info(...));

For std::unique_ptr<T>, however, the corresponding constructor std::shared_ptr(std::unique_ptr<T>&& ap) takes an rvalue reference, so passing a named unique_ptr directly is not allowed; a call to std::move() or release() would be mandatory. Therefore, the solution that we decided to implement was:

shared_ptr<T> s(construct_..());
set_proj_data_info_sptr(s);

This is compatible with both pre-C++11 and C++11 compilers.