Front Neuroinform. 2014 Apr 28;8:44. doi: 10.3389/fninf.2014.00044. eCollection 2014.

The Insight ToolKit image registration framework


Brian B Avants et al. Front Neuroinform. 2014.

Abstract

Publicly available scientific resources help establish evaluation standards, provide a platform for teaching, and improve reproducibility. Version 4 of the Insight ToolKit (ITKv4) seeks to establish new standards in publicly available image registration methodology. ITKv4 makes several advances over previous versions of ITK. It supports both multivariate images and objective functions; it also unifies high-dimensional (deformation field) and low-dimensional (affine) transformations with metrics that are reusable across transform types, and with composite transforms that allow arbitrary series of geometric mappings to be chained together seamlessly. Metrics and optimizers take advantage of multi-core resources when available. Furthermore, ITKv4 reduces the parameter optimization burden via principled heuristics that automatically set scaling across disparate parameter types (rotations vs. translations); a related approach constrains step sizes for gradient-based optimizers. As a result, tuning for different metrics and/or image pairs is rarely necessary, allowing the researcher to focus more easily on the design and comparison of registration strategies. In total, the ITKv4 contribution is intended as a structure to support reproducible research practices; it provides a more extensive foundation against which to evaluate new work in image registration and offers application-level programmers a broad suite of tools on which to build. Finally, we contextualize this work with a reference registration evaluation study with application to pediatric brain labeling.
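The composite-transform idea above, chaining an arbitrary series of geometric mappings into a single transform, can be sketched in a few lines of plain Python. The class names below are illustrative only; this is not the ITK API, and the "deformation" is reduced to a constant shift for brevity:

```python
class AffineTransform2D:
    """y = M @ x + t for a 2x2 matrix M and offset t."""
    def __init__(self, m, t):
        self.m, self.t = m, t

    def apply(self, p):
        (a, b), (c, d) = self.m
        tx, ty = self.t
        return (a * p[0] + b * p[1] + tx, c * p[0] + d * p[1] + ty)

class Translation2D:
    """Stand-in for a high-dimensional deformation: here a constant shift."""
    def __init__(self, shift):
        self.shift = shift

    def apply(self, p):
        return (p[0] + self.shift[0], p[1] + self.shift[1])

class CompositeTransform:
    """Applies member transforms in sequence; any chain composes freely,
    mixing low- and high-dimensional members."""
    def __init__(self, transforms):
        self.transforms = transforms

    def apply(self, p):
        for t in self.transforms:
            p = t.apply(p)
        return p

composite = CompositeTransform([
    Translation2D((1.0, -1.0)),                                # inner mapping first
    AffineTransform2D(((2.0, 0.0), (0.0, 2.0)), (0.5, 0.5)),   # then the affine
])
print(composite.apply((1.0, 1.0)))  # (4.5, 0.5)
```

The first transform in the chain is applied first, so a chain [ϕ, A] evaluates A(ϕ(x)), matching the composite-mapping convention used in Figure 2 below.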

Keywords: MRI; brain; death; open-source; registration.


Figures

Figure 1
A schematic overview of the prototypical ITK4 registration method. This design is overall similar to that of ITK3. A few key components differ: (1) optimizers require that transforms update themselves; (2) metrics and optimizers are multi-threaded; (3) memory is shared across both optimizers and metrics, greatly increasing efficiency; (4) automated (usually hidden) parameter estimators are available; (5) transforms may include high-dimensional deformation fields. One additional difference (not shown) is that “fixed” images may also have a transformation, although this is not modified by the optimizer.
Figure 2
Clockwise: define x in Ω_I and z in Ω_J as the same material point existing in different domains. The point y lies in a domain intermediate between Ω_I and Ω_J. The standard approach in the ITKv4 registration framework is to map image J (B) to image I (A) by first identifying the linear transformation, A, between the images, shown in (C). Second, we remove the shape (diffeomorphic) differences (D). Consequently, we have a composite mapping, computed via the mutual information similarity metric, that identifies I(x) ≈_MI J(A(ϕ(x))) = J_Affine(y) = J(z). The image J_Affine(y) represents J after application of the affine transformation A, i.e., J(A(x)). Code and data for this example are available online.
Figure 3
An ITK diffeomorphic mapping between images I and J. The "C" to 1/2 "C" example illustrates the large deformations that may be achieved with time-varying velocity fields. In this case, the moving (deforming) image is the 1/2 "C." The right panels illustrate the deformed grid for the transformation of the "C" to the 1/2 "C" (middle right) and its inverse mapping (far right), which takes the 1/2 "C" to the reference space. The unit time interval is discretized into 15 segments to compute this mapping; 15 × 5 integration steps were used in the Runge-Kutta ODE integration over the velocity field. A two-core MacBook Air computed this registration in 110 s. Each image was of size 150 × 150. See C for a reproducible example of this registration and the data. In addition, we provide an example of how the Jacobian determinant is computed from the deformation field resulting from this registration via the ANTs program CreateJacobianDeterminantImage.
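The time integration described in the caption can be sketched as classic fourth-order Runge-Kutta over a velocity field with unit time split into discrete segments. The rotational field below is chosen purely for illustration (its flow has a known closed form); it is not the registration's actual velocity field:

```python
import math

def rk4_flow(v, p, n_steps=15):
    """Integrate dp/dt = v(p, t) over t in [0, 1] with classic RK4,
    mirroring the caption's discretization of unit time into segments."""
    h = 1.0 / n_steps
    x, y = p
    t = 0.0
    for _ in range(n_steps):
        k1 = v((x, y), t)
        k2 = v((x + 0.5 * h * k1[0], y + 0.5 * h * k1[1]), t + 0.5 * h)
        k3 = v((x + 0.5 * h * k2[0], y + 0.5 * h * k2[1]), t + 0.5 * h)
        k4 = v((x + h * k3[0], y + h * k3[1]), t + h)
        x += h * (k1[0] + 2 * k2[0] + 2 * k3[0] + k4[0]) / 6.0
        y += h * (k1[1] + 2 * k2[1] + 2 * k3[1] + k4[1]) / 6.0
        t += h
    return (x, y)

def rot(p, t):
    """A stationary rotational velocity field: after unit time, points
    rotate by one radian about the origin; the inverse flow uses -v."""
    return (-p[1], p[0])

q = rk4_flow(rot, (1.0, 0.0))
print(q)  # close to (cos 1, sin 1)
```

Integrating the negated field recovers the inverse mapping, which is how the forward and inverse grids in the figure are related.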
Figure 4
This RGB image registration example employs ITK4 code that repurposes a scalar metric (itkMeanSquaresImageToImageMetricv4) for multichannel registration.
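The repurposing idea, applying a scalar metric channel by channel and summing the results, can be sketched in plain Python. The helper names are illustrative; this is not the ITK implementation of itkMeanSquaresImageToImageMetricv4:

```python
def mean_squares(a, b):
    """Scalar mean-squared-difference metric over one channel
    (flattened intensity lists of equal length)."""
    return sum((x - y) ** 2 for x, y in zip(a, b)) / len(a)

def multichannel_mean_squares(fixed, moving):
    """Apply the scalar metric per channel and sum: one way to turn a
    scalar metric into a multichannel (e.g., RGB) objective."""
    return sum(mean_squares(f, m) for f, m in zip(fixed, moving))

fixed = [[1.0, 2.0], [0.0, 0.0], [5.0, 5.0]]    # R, G, B channels, flattened
moving = [[1.0, 0.0], [0.0, 1.0], [5.0, 5.0]]
print(multichannel_mean_squares(fixed, moving))  # 2.0 + 0.5 + 0.0 = 2.5
```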
Figure 5
Three reference images, I (left), J (middle top), and K (right top), are used to illustrate the robustness of our parameter scale estimation for setting consistent parameters across both metrics and transform types. K is the negation of J and is used to test the correlation and mutual information registrations. We optimized, by hand, the step-length parameters for one metric (the sum of squared differences) for both the affine and deformable case. Thus, two parameters had to be optimized. We then applied these same parameters to register I and K via both correlation and mutual information. The resulting registrations (bottom row) were all of similar quality. Further, the same metric is used for both affine and diffeomorphic mapping by exploiting the general optimization process given in Equation (1).
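One common way to realize the automatic parameter scaling discussed here is to estimate, for each parameter, how far a unit change moves points in the image domain, then divide optimizer steps by those scales so rotations and translations advance comparably. The sketch below uses finite differences on a hypothetical 2D rigid transform; it is an illustration of the principle, not ITK's estimator:

```python
import math

def shift_scales(transform, params, sample_points, eps=1e-5):
    """Estimate a scale per parameter as the largest point displacement
    induced by a unit perturbation of that parameter (finite differences)."""
    scales = []
    for i in range(len(params)):
        bumped = list(params)
        bumped[i] += eps
        worst = 0.0
        for p in sample_points:
            ax, ay = transform(params, p)
            bx, by = transform(bumped, p)
            worst = max(worst, math.hypot(bx - ax, by - ay) / eps)
        scales.append(worst)
    return scales

def rigid2d(params, p):
    """Rotation by theta about the origin followed by translation."""
    th, tx, ty = params
    c, s = math.cos(th), math.sin(th)
    return (c * p[0] - s * p[1] + tx, s * p[0] + c * p[1] + ty)

# Sample the corners of a 200 x 200 domain centered at the origin.
corners = [(-100.0, -100.0), (100.0, -100.0), (100.0, 100.0), (-100.0, 100.0)]
print(shift_scales(rigid2d, [0.0, 0.0, 0.0], corners))
# rotation scale ~ 141 (the corner radius); translation scales ~ 1
```

Dividing a gradient step by these scales makes one optimizer step move corner points by a similar physical distance whichever parameter it perturbs, which is why a single hand-tuned step length can transfer across metrics and transform types.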
Figure 6
We compare an ITKv4 composite schema, I ↭_CC ≈_MI J_i (a cross-correlation-driven diffeomorphism composed with a mutual-information affine map), for mapping a set of images {J_i} to a template I, with an ITKv3 schema, I ≈_MI b ≈_MI J_i, in which both the global and B-spline stages are driven by mutual information. We use this schematic in a registration-based segmentation of multiple brain structures in a pediatric population as a benchmark for algorithm performance, similar to Klein et al. (2010). An example ANTs-based large-deformation result from the dataset is shown for illustration, rendering the extracted brains along with select axial slices. All registrations were run on the original MRI data with no preprocessing except what is done internally by ANTs or BRAINSFit. Overlap improvement from v3 to v4, quantified via a paired t-test, is highly significant.
Figure 7
Above, a barplot shows the mean Dice score for each region and each algorithm, sorted by ANTs performance. Below, we use star plots of per-brain-region Dice overlap to compare, for each subject, the ITK4 implementation of SyN with the ITK3-based BRAINSFit algorithm. The ITK4 SyN algorithm, with its classic neighborhood correlation metric, outperforms BRAINSFit in several regions and more strongly in some subject pairs than others. The legend for the plots is at lower right and shows the maximum possible value for each region.
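The Dice overlap used in these comparisons is twice the intersection of two label masks divided by the sum of their sizes; a minimal sketch over voxel-index sets:

```python
def dice(a, b):
    """Dice overlap between two label masks given as sets of voxel
    indices: 2 * |A ∩ B| / (|A| + |B|), in [0, 1]."""
    if not a and not b:
        return 1.0
    return 2.0 * len(a & b) / (len(a) + len(b))

auto = {(1, 1), (1, 2), (2, 1), (2, 2)}    # e.g., a registration-propagated label
manual = {(1, 2), (2, 1), (2, 2), (3, 2)}  # e.g., a manual reference label
print(dice(auto, manual))  # 0.75
```

A paired t-test over per-region scores like these is the comparison quantified in Figure 6.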
