Data Preprocessing and Motion Correction
Provided by: jcu96
Slide 1: Data Preprocessing and Motion Correction
  • The bulk of this demonstration will focus on
    quality control measures.

Standard processing procedure - every imager
should attempt to implement some set of criteria
with which they can effectively evaluate the
quality of their data.
  • fMRI data are inherently very noisy.

Any time you can identify a source of noise in
your data set, you then have a chance to remove
or account for the variance in your signal caused
by that source. The more unwanted variance you
can account for, the more likely it is that your
effects of interest will reach statistical
significance.
Slide 2: Sources of Noise in fMRI
HSM (ch. 9, pp. 224-233)
Thermal Noise - fluctuations in MR signal
intensity over space or time that are caused by
the thermal motion of electrons within the sample
or the scanner hardware.

System Noise - fluctuations in MR signal
intensity over space or time that are caused by
imperfect functioning of the scanner hardware
(e.g., scanner drift - slow changes in voxel
intensity over time).

Motion and Physiological Noise - fluctuations in
MR signal intensity over space or time due to
physiological activity of the human body. Sources
of physiological noise include motion,
respiration, cardiac activity, and metabolic
reactions.
Slide 3: How can we tell good data from bad data?
Tool 1: Voxel Surfing - take a moment to snoop
through your data.
  • Bad data - look for huge abrupt changes in
    signal intensity.

[Figure: voxel time course showing a sustained shift in signal intensity]
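Voxel surfing can also be done numerically: flag time points whose volume-to-volume intensity jump is much larger than the typical frame-to-frame noise. A minimal sketch with synthetic data (the 3-SD threshold and the jump at volume 62 are illustrative choices, not from the slides):

```python
import numpy as np

# Synthetic voxel time course: baseline noise plus an abrupt, sustained shift
rng = np.random.default_rng(0)
ts = rng.normal(loc=100.0, scale=1.0, size=120)
ts[62:] += 15.0  # sustained jump starting at volume 62

# Flag volume-to-volume jumps larger than 3 SD of the typical frame difference
diffs = np.diff(ts)
threshold = 3 * np.std(diffs[:60])  # estimate noise from the clean early volumes
suspect = np.where(np.abs(diffs) > threshold)[0] + 1

print(suspect)  # volumes whose intensity jumps abruptly (should include 62)
```

A sustained shift like this shows up as a single large jump; transient spikes would show up as a pair of adjacent flagged volumes.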
Slide 4: How can we tell good data from bad data?
Tool 2: Time Course Movies - look at the image
intensity of your data as it changes over time.
  • Large movements of the head from one time point
    to the next should be evident as sudden shifts in
    the image intensity.
  • Toggling between the first and last image can
    give you a rough estimate of head movement over
    the entire run (note the potential to detect
    gradual drifts).
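The first-versus-last toggle can be approximated as a difference image: where nothing moved the difference is near zero, and displaced tissue edges light up. A sketch with a synthetic "brain" block (all values illustrative):

```python
import numpy as np

# Two synthetic volumes: a bright "brain" block on a dark background
vol_first = np.zeros((16, 16, 16))
vol_first[4:12, 4:12, 4:12] = 100.0
vol_last = np.roll(vol_first, shift=2, axis=0)  # head shifted 2 voxels by run's end

# The difference image is zero where nothing moved, large at displaced edges
diff = np.abs(vol_last - vol_first)
moved_voxels = int((diff > 0).sum())
print(moved_voxels)  # 256 voxels differ: the displaced leading and trailing faces
```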

Slide 5: How can we tell good data from bad data?
Tool 3: BV's Motion Correction - look at the
resultant plot of position changes over time as
detected by BV.
Slide 6: How can we tell good data from bad data?
Tool 4: Converging Evidence - the most powerful
method is to search for overlapping conclusions.
  • For example, volume 62 is consistently
    identified as a problem volume

[Figure: a third piece of evidence flagging volume 62]
Slide 7: But how bad are these data?
Tool 5: Statistical Contrast Maps - the
statistics themselves can often tell you how bad
the data are.
  • Recall Fig. 10.6 (HSM), which illustrates the
    characteristic "ringing" pattern of false
    activation - a tell-tale sign of head motion.
  • Also, plenty of activation outside the brain
    is not a good sign.

Slide 8: From Detection to Correction
Motion Correction - the procedure most commonly
used to combat head motion is simply to move the
brain back to its original position in space
(i.e., align it to a specified reference image).
Coregistration - the spatial alignment of two
images or image volumes (HSM ch. 10, p. 263).
For example, with BVMC an intensity-based
algorithm searches for three parameters that
describe the subpixel translation and rotation of
each measurement relative to the reference image.
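The intensity-based search can be illustrated in miniature: try candidate displacements and keep the one that minimizes the squared intensity difference from the reference. This sketch searches integer in-plane translations only; real motion correction such as BVMC also estimates rotation and works at subpixel resolution:

```python
import numpy as np

def align_translation(moving, reference, max_shift=3):
    """Brute-force intensity-based search for the 2D integer translation
    that best realigns `moving` to `reference` (least squared difference)."""
    best, best_cost = (0, 0), np.inf
    for dx in range(-max_shift, max_shift + 1):
        for dy in range(-max_shift, max_shift + 1):
            candidate = np.roll(np.roll(moving, dx, axis=0), dy, axis=1)
            cost = np.sum((candidate - reference) ** 2)
            if cost < best_cost:
                best, best_cost = (dx, dy), cost
    return best

# Reference "brain" slice, and a copy displaced by the subject's head motion
ref = np.zeros((20, 20))
ref[6:14, 6:14] = 100.0
moving = np.roll(np.roll(ref, -2, axis=0), 1, axis=1)

print(align_translation(moving, ref))  # (2, -1): the shift that undoes the motion
```

Production algorithms replace this exhaustive search with gradient-based optimization over continuous parameters, but the cost function is the same idea.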
Slide 9: From Detection to Correction
Things to consider:
These procedures assume that the physical
properties of the brain image (i.e., its size and
shape) do not change as a function of its
position in the magnetic field (see p. 263,
rigid-body transformations).

Spatial distortions vary depending upon where in
the magnetic field you're sampling from.

These procedures do not account for temporal
changes in the signal caused by the motion. The
impact of a head movement on the time course of
activity within a given voxel/region is not
corrected for by simply moving the brain back to
where it ought to be.
Slide 10: Can it be Done?
The Goal - correct for the unwanted variance in
our data set associated with periods of head
motion.
Two ideas about how we might accomplish this
task:
1. Remove the noise due to head motion - simply
cut the problem time points out of the data set.
2. Model the noise due to head motion - account
for the variance associated with periods of head
motion by including regressor functions that
accurately predict the changes in the signal
caused by the motion.
Slide 11: Idea 1 - Cut the Noise
Things to consider:
1. We can slice up the data simply enough, but
are there any issues with piecing it back
together?
  • For example, let's say that we decided that
    volumes 60-85 are significantly contaminated
    with head motion.

[Figure: time course with volumes 60-85 marked as the period of motion]
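Cutting the contaminated span out of a 4D run is a one-liner; the harder issue, as the slide asks, is the temporal seam left where the remaining pieces rejoin. A sketch with synthetic data, using the volumes 60-85 from the example:

```python
import numpy as np

run = np.zeros((120, 16, 16, 16))               # (time, x, y, z), synthetic run
run[:] = np.arange(120)[:, None, None, None]    # label each volume by its index

# Drop the contaminated span: volumes 60-85 inclusive (26 volumes)
keep = np.r_[0:60, 86:120]
clean = run[keep]

print(clean.shape[0])                            # 94 volumes remain
print(clean[59, 0, 0, 0], clean[60, 0, 0, 0])    # 59.0 then 86.0: a temporal seam
```

The jump from volume 59 to volume 86 is exactly the discontinuity that can confound any analysis assuming evenly sampled time (e.g., temporal filtering or autocorrelation modeling).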
Slide 12: Idea 1 - Cut the Noise
Things to consider:
2. Loss of Statistical Power - the most obvious
problem with this method is that, since we are
removing data points, we are directly reducing
our statistical power. This can be a serious
concern when the effects of the experimental
manipulation are expected to be very small.
  • For example, studies employing fMR-adaptation
    paradigms usually report significant
    differences of less than 0.1.

Every data point is valuable!
Slide 13: Idea 2 - Model the Noise
The success of the GLM depends entirely on how
accurately the model fits the actual data (see
HSM ch. 12, pp. 337-343).
Variance in the data that is not explained by the
model is referred to as the residuals.
The ultimate goal: account for the variance due
to motion and thereby reduce our residuals.
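Adding motion-related nuisance regressors to the GLM design matrix lets least squares absorb motion variance that would otherwise inflate the residuals. A minimal sketch with synthetic data (one task regressor plus one crude motion regressor; real designs typically include all six estimated motion parameters):

```python
import numpy as np

rng = np.random.default_rng(2)
n = 120
task = ((np.arange(n) // 10) % 2).astype(float)   # boxcar task regressor
motion = np.zeros(n)
motion[60:86] = 1.0                               # crude regressor for the motion period
y = 2.0 * task + 5.0 * motion + rng.normal(0, 1.0, n)  # synthetic voxel signal

def residual_var(X, y):
    """Fit the GLM y = X @ beta by least squares; return residual variance."""
    beta, *_ = np.linalg.lstsq(X, y, rcond=None)
    return float(np.var(y - X @ beta))

X_task = np.column_stack([np.ones(n), task])          # model without motion
X_full = np.column_stack([np.ones(n), task, motion])  # model with motion regressor

print(residual_var(X_task, y) > residual_var(X_full, y))  # True: residuals shrink
```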
Slide 14: Idea 2 - Model the Noise
Independent Component Analysis (ICA) - a
data-driven approach that explores the underlying
statistical structure of a given data set.
ICA can be used to reveal non-task-related
components within your data set. These
confounding "components of no interest" can then
be incorporated into your hypothesis-driven model
so as to reduce the amount of noise in your data.
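In practice this is usually done with a library implementation. A sketch using scikit-learn's FastICA (an assumption - the slides do not name a package) to unmix two synthetic sources, a task-like boxcar and a drift, from observed mixtures:

```python
import numpy as np
from sklearn.decomposition import FastICA

rng = np.random.default_rng(3)
n = 200
boxcar = ((np.arange(n) // 20) % 2).astype(float)   # task-like source
drift = np.linspace(0.0, 1.0, n)                    # scanner-drift-like source
S = np.column_stack([boxcar, drift])

# Observed voxel time courses are mixtures of the two sources plus noise
A = np.array([[1.0, 0.5], [0.4, 1.0], [0.8, 0.9]])  # mixing matrix
X = S @ A.T + rng.normal(0, 0.02, (n, 3))

ica = FastICA(n_components=2, random_state=0)
components = ica.fit_transform(X)   # estimated sources, shape (200, 2)
print(components.shape)
```

A recovered drift-like component could then be entered into the GLM as a regressor of no interest.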
Slide 15: Potential Problems with Both of These Ideas
1. They can be very time-consuming.
2. Separating meaningful signal from nonsense
noise is often a difficult process. There are
many unexplained components of variance within
any fMRI data set.
Modeling these components merely to increase the
statistical significance of a given comparison of
interest could easily be considered cheating!
Thus, to use these methods appropriately, one
must be certain that the noise components to be
removed are not in any way associated with the
cognitive processes under investigation.
3. There are definitely situations where
confounding noise components actually overlap
with task events.
When in doubt, throw it out!
Slide 16: Spatial and Temporal Preprocessing
HSM (ch. 10, pp. 274-280)
In neuroimaging, filters are used to remove
uninteresting variation in the data that can be
safely attributed to noise sources, while
preserving signals of interest.
Slide 17: Temporal Filtering
High-Pass Filter - allows high-frequency trends
in the data to pass (i.e., remain) while
attenuating low-frequency trends.
  • You can end up completely squashing your
    effects of interest!

Low-Pass Filter - allows low-frequency trends in
the data to pass (i.e., remain) while attenuating
high-frequency trends.
  • High frequency variations in fMRI data are
    likely to be meaningful.

Thus, although temporal data smoothing can be
useful, it should be approached with caution.
Ask yourself: do I really know what I'm doing?
Because if you don't, you could be removing
meaningful patterns of activation.
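A common implementation is a Butterworth high-pass filter. The sketch below removes a slow drift from a synthetic time course while keeping a faster task signal; the 0.01 Hz cutoff and 2 s TR are illustrative assumptions, not values from the slides:

```python
import numpy as np
from scipy.signal import butter, filtfilt

TR = 2.0                   # seconds per volume (illustrative)
fs = 1.0 / TR              # sampling rate in Hz
t = np.arange(0, 600, TR)  # a 5-minute run

drift = 0.05 * t                        # slow scanner drift
task = np.sin(2 * np.pi * 0.05 * t)     # task signal at 0.05 Hz
signal = task + drift

# Second-order Butterworth high-pass at 0.01 Hz, applied forward and
# backward (filtfilt) for zero phase shift
b, a = butter(2, 0.01, btype="highpass", fs=fs)
filtered = filtfilt(b, a, signal)

print(np.std(filtered) < np.std(signal))  # True: the drift variance is gone
```

Note the cautionary point from the slide: had the task frequency been below the cutoff, the filter would have squashed it along with the drift.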
Slide 18: Spatial Filtering
Spatial smoothing essentially blurs your
functional data.
Why would you ever want to reduce the spatial
resolution of your data?
Spatial smoothing may often be required when
averaging data across several subjects, due to
individual variations in brain anatomy and
functional organization.
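Spatial smoothing is typically a 3D Gaussian blur, with the kernel size quoted as a full width at half maximum (FWHM). A sketch using scipy; the 6 mm FWHM and 3 mm voxel size are illustrative assumptions:

```python
import numpy as np
from scipy.ndimage import gaussian_filter

# Convert the kernel's FWHM (in voxels) to the Gaussian sigma scipy expects
fwhm_mm, voxel_mm = 6.0, 3.0  # illustrative values
sigma = (fwhm_mm / voxel_mm) / (2 * np.sqrt(2 * np.log(2)))

vol = np.zeros((32, 32, 32))
vol[16, 16, 16] = 1.0          # a single "active" voxel

smoothed = gaussian_filter(vol, sigma=sigma)

# Blurring spreads the peak across neighbors but preserves the total signal,
# which is why smoothing helps overlapping activations survive group averaging
print(smoothed[16, 16, 16] < 1.0, round(float(smoothed.sum()), 3))
```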