Detecting sparse signals and sparse connectivity in scale-space, with applications to the 'bubbles' task in an fMRI experiment - PowerPoint PPT Presentation

Transcript and Presenter's Notes



1
Detecting sparse signals and sparse connectivity
in scale-space, with applications to the
'bubbles' task in an fMRI experiment
  • Keith Worsley, Nicholas Chamandy, McGill
  • Jonathan Taylor, Stanford and Université de
    Montréal
  • Robert Adler, Technion
  • Philippe Schyns, Fraser Smith, Glasgow
  • Frédéric Gosselin, Université de Montréal
  • Arnaud Charil, Montreal Neurological Institute

2
Astrophysics
3
Sloan Digital Sky Survey, data release 6, Aug. 07
4
What is 'bubbles'?
5
Nature (2005)
6
Subject is shown one of 40 faces chosen at
random
Happy
Sad
Fearful
Neutral
7
but the face is only revealed through random
bubbles
  • First trial: Sad expression
  • Subject is asked the expression: answers Neutral
  • Response: Incorrect

(Figure: 75 random bubble centres; smoothed by a Gaussian bubble; what the subject sees; underlying expression: Sad)
8
Your turn
  • Trial 2

Subject response: Fearful. CORRECT
9
Your turn
  • Trial 3

Subject response: Happy. INCORRECT (Fearful)
10
Your turn
  • Trial 4

Subject response: Happy. CORRECT
11
Your turn
  • Trial 5

Subject response: Fearful. CORRECT
12
Your turn
  • Trial 6

Subject response: Sad. CORRECT
13
Your turn
  • Trial 7

Subject response: Happy. CORRECT
14
Your turn
  • Trial 8

Subject response: Neutral. CORRECT
15
Your turn
  • Trial 9

Subject response: Happy. CORRECT
16
Your turn
  • Trial 3000

Subject response: Happy. INCORRECT (Fearful)
17
Bubbles analysis
  • E.g. Fearful (3000/4750 trials)

(Figure: bubble masks for trials 1, 2, 3, 4, 5, 6, 7, ..., 750; their sum; and the sum over correct trials)
Thresholded at proportion of correct trials > 0.68, scaled to [0, 1]
Use this as a bubble mask
Proportion of correct bubbles = (sum of correct bubbles) / (sum of all bubbles)
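The bubble-mask computation above can be sketched in a few lines of NumPy. The data here are simulated stand-ins (the array shapes and bubble probability are hypothetical; only the 0.68 threshold and the proportion formula come from the slide):

```python
import numpy as np

rng = np.random.default_rng(0)

# Simulated stand-ins for the real data: one binary bubble mask per trial
# (trials x height x width), plus a correct/incorrect flag per trial.
bubbles = rng.random((750, 38, 24)) < 0.05      # hypothetical shapes
correct = rng.random(750) < 0.68                # True = correct response

# Proportion of correct bubbles = (sum of correct bubbles) / (sum of all bubbles)
total = bubbles.sum(axis=0)
prop_correct = bubbles[correct].sum(axis=0) / np.maximum(total, 1)

# Threshold at the proportion of correct trials (0.68), then rescale to [0, 1]
mask = np.where(prop_correct > 0.68, prop_correct, 0.0)
if mask.max() > 0:
    mask = mask / mask.max()
```

The resulting `mask` plays the role of the slide's bubble mask: zero where bubbles did not help performance, up to 1 where they helped most.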
18
Results
  • Mask average face
  • But are these features real or just noise?
  • Need statistics

(Figure: masked average faces: Happy, Sad, Fearful, Neutral)
19
Statistical analysis
  • Correlate bubbles with response (correct = 1,
    incorrect = 0), separately for each expression
  • Equivalent to a 2-sample Z-statistic for correct
    vs. incorrect bubbles, e.g. Fearful
  • Very similar to the proportion of correct bubbles

Z ~ N(0,1) statistic
(Figure: bubble masks for trials 1, 2, 3, 4, 5, 6, 7, ..., 750 with responses 0, 1, 1, 0, 1, 1, 1, ..., 1)
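A per-pixel version of this statistic can be sketched as a pooled-variance two-sample Z comparing bubble values on correct vs. incorrect trials (one plausible reading of the slide; the data here are simulated, so the Z map is pure noise):

```python
import numpy as np

rng = np.random.default_rng(1)
bubbles = rng.random((750, 38, 24))      # hypothetical smoothed bubble images
response = rng.integers(0, 2, 750)       # 1 = correct, 0 = incorrect

def two_sample_z(x, y01):
    """Pooled two-sample Z comparing x over y01 == 1 vs y01 == 0, per pixel."""
    a, b = x[y01 == 1], x[y01 == 0]
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * a.var(axis=0, ddof=1) +
                  (nb - 1) * b.var(axis=0, ddof=1)) / (na + nb - 2)
    return (a.mean(axis=0) - b.mean(axis=0)) / np.sqrt(pooled_var * (1/na + 1/nb))

z = two_sample_z(bubbles, response)      # one Z value per pixel
```

With real data, thresholding this `z` image is exactly the multiple-comparisons problem the following slides address.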
20
Results
  • Thresholded at Z > 1.64 (P < 0.05)
  • Multiple comparisons correction?
  • Need random field theory

Z ~ N(0,1) statistic
(Figure: thresholded Z statistic on the average face: Happy, Sad, Fearful, Neutral)
21
Euler Characteristic Heuristic
Euler characteristic (EC) = blobs - holes (in 2D). Excursion set Xt = {s : Z(s) >= t}, e.g. for the neutral face:
EC = 0, 0, -7, -11, 13, 14, 9, 1, 0, 30

Heuristic: at high thresholds t, the holes disappear, EC = 1 or 0, so E(EC) ≈ P(max Z >= t).

(Figure: observed and expected EC(Xt) plotted against threshold t from -4 to 4)
  • Exact expression for E(EC) for all thresholds;
  • E(EC) ≈ P(max Z >= t) is extremely accurate.
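The EC of an excursion set can be computed directly from a thresholded binary image. A minimal sketch (pure NumPy): count vertices minus edges plus faces of the union of closed unit pixels, which in 2D equals blobs minus holes:

```python
import numpy as np

def euler_characteristic(mask):
    """EC (= blobs - holes in 2D) of the union of closed unit pixels in mask."""
    verts, edges, faces = set(), set(), 0
    for i, j in zip(*np.nonzero(mask)):
        faces += 1
        # the pixel's four corner vertices
        for di in (0, 1):
            for dj in (0, 1):
                verts.add((i + di, j + dj))
        # the pixel's four boundary edges; shared edges get identical keys
        edges.update({('h', i, j), ('h', i + 1, j),
                      ('v', i, j), ('v', i, j + 1)})
    return len(verts) - len(edges) + faces

disk = np.ones((3, 3), dtype=bool)        # one blob, no holes
ring = disk.copy(); ring[1, 1] = False    # one blob with one hole
two_blobs = np.array([[True, False, True]])
```

Here `euler_characteristic(disk)` is 1, `euler_characteristic(ring)` is 0, and `euler_characteristic(two_blobs)` is 2; applying it to Xt = (Z >= t) at increasing t traces out the observed EC curve on the slide.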
22
The result
E(EC(S ∩ Xt)) = Σd Ld(S) ρd(t), where:
Ld(S) = Lipschitz-Killing curvatures of S (= Reselsd(S) × cd)
ρd(t) = EC densities of Z above t
(Figure: white noise convolved with a filter of width FWHM to give Z(s))
23
Results, corrected for search
  • Random field theory threshold: Z > 3.92 (P < 0.05)
  • 3.82, 3.80, 3.81, 3.80
  • Saddle-point approx (2007): Z > ? (P < 0.05)
  • Bonferroni: Z > 4.87 (P < 0.05), detects nothing

Z ~ N(0,1) statistic
(Figure: corrected thresholded Z maps on the average face: Happy, Sad, Fearful, Neutral)
24
Theorem (1981,1995)
25
Steiner-Weyl Tube Formula (1930)
Morse Theory method (1981, 1995)
  • Put a tube of radius r about the search region λS
  • EC has a point-set representation

(Figure: Tube(λS, r), a tube of radius r around λS)
  • Find the volume, expand as a power series in r,
    pull off the coefficients
  • For a Gaussian random field
26
(Figure: Tube(λS, r), a tube of radius r around λS)
Steiner-Weyl Volume of Tubes Formula (1930)
Lipschitz-Killing curvatures are just intrinsic
volumes or Minkowski functionals in the
(Riemannian) metric of the variance of the
derivative of the process
27
(Figure: triangulated search region S)
Edge length λ
Lipschitz-Killing curvature of triangles
Lipschitz-Killing curvature of a union of triangles
28
Non-isotropic data?
Z ~ N(0,1)
(Figure: non-isotropic field over locations s1, s2)
  • Can we warp the data to isotropy? i.e. multiply
    edge lengths by λ?
  • Globally no, but locally yes, though we may need
    extra dimensions.
  • Nash Embedding Theorem: dimensions <= D +
    D(D+1)/2; for D = 2, dimensions <= 5.
  • Better idea: replace Euclidean distance by the
    variogram:
  • d(s1, s2)^2 = Var(Z(s1) - Z(s2)).
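The variogram distance can be estimated from a stack of i.i.d. residual fields. A small sketch (the array names, shapes, and the triangulation's edge list are all hypothetical):

```python
import numpy as np

def variogram_edge_lengths(resid, edge_index_pairs):
    """d(s1, s2) = sqrt(Var(Z(s1) - Z(s2))), estimated over i.i.d. fields.

    resid: (n_fields, n_points) array, one residual field per row.
    edge_index_pairs: (n_edges, 2) integer array of point indices per edge.
    """
    i, j = edge_index_pairs.T
    diffs = resid[:, i] - resid[:, j]          # (n_fields, n_edges)
    return np.sqrt(diffs.var(axis=0, ddof=1))  # one length per edge

rng = np.random.default_rng(2)
resid = rng.standard_normal((100, 5))          # 100 simulated i.i.d. fields
edges = np.array([[0, 1], [1, 2], [3, 4]])     # hypothetical triangulation edges
lengths = variogram_edge_lengths(resid, edges)
```

These edge lengths replace the Euclidean ones when summing the Lipschitz-Killing curvatures over the triangulation, as on the next slides.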

30
Non-isotropic data
Z ~ N(0,1)
(Figure: non-isotropic field and its locally warped version over s1, s2)
Edge length λ(s)
Lipschitz-Killing curvature of triangles
Lipschitz-Killing curvature of a union of triangles
31
We need independent, identically distributed
random fields, e.g. residuals from a linear model

Lipschitz-Killing curvature of triangles
Lipschitz-Killing curvature of union of triangles
32
Scale space: smooth Z(s) with a range of filter
widths w (a continuous wavelet transform); this adds an
extra dimension to the random field: Z(s, w)
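This construction can be sketched directly: smooth a 1D record with Gaussian filters over a range of FWHMs, normalizing each so that unit white noise would give unit variance. The FWHM values and the -60 to 60 mm axis come from the slide; the noise-free 15 mm Gaussian bump is an illustrative assumption:

```python
import numpy as np

FWHM2SIGMA = 1 / (2 * np.sqrt(2 * np.log(2)))   # convert FWHM to Gaussian sigma

s = np.arange(-60, 61)                          # 1 mm grid, as on the slide's s axis
signal = np.exp(-s**2 / (2 * (15 * FWHM2SIGMA)**2))   # one 15 mm (FWHM) signal

def gauss_kernel(sigma):
    r = int(4 * sigma) + 1
    x = np.arange(-r, r + 1)
    k = np.exp(-x**2 / (2 * sigma**2))
    return k / k.sum()

fwhms = [6.8, 10.2, 15.2, 22.7]                 # filter widths from the slide's w axis
rows = []
for w in fwhms:
    k = gauss_kernel(w * FWHM2SIGMA)
    smoothed = np.convolve(signal, k, mode='same')
    # dividing by ||k||_2 makes smoothed unit white noise have unit variance,
    # putting every row on a common Z scale
    rows.append(smoothed / np.sqrt((k**2).sum()))

Z = np.array(rows)                              # Z(s, w): one row per filter width
best_fwhm = fwhms[int(np.argmax(Z.max(axis=1)))]
```

Consistent with the matched filter theorem below, `best_fwhm` comes out as 15.2, the available width closest to the 15 mm signal.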
Scale space, no signal
(Figure: Z(s, w) for s from -60 to 60 mm and w = FWHM from 6.8 to 22.7 mm, on a log scale)
One 15mm signal
(Figure: Z(s, w) with a single peak at w around 15 mm)
15mm signal is best detected with a 15mm
smoothing filter
33
Matched Filter Theorem (= Gauss-Markov Theorem):
to best detect a signal in white noise, the filter
should match the signal
10mm and 23mm signals
(Figure: Z(s, w) with separate peaks at w around 10 mm and 23 mm)
Two 10mm signals 20mm apart
(Figure: Z(s, w) showing a single peak between the two signals)
But if the signals are too close together they
are detected as a single signal half way between
them
34
Scale space can even separate two signals at the
same location!
8mm and 150mm signals at the same location
(Figure: Z(s, w) for s from -60 to 60 mm and w = FWHM from 6.8 to 170 mm on a log scale, with peaks at w around 8 mm and 150 mm at the same location)
35
Scale space Lipschitz-Killing curvatures
36
Rotation space: try all rotated elliptical
filters
Unsmoothed data
Threshold Z > 5.25 (P < 0.05)
Maximum filter
Maximum filter
37
Beautiful symmetry
Steiner-Weyl Tube Formula (1930)
Taylor's Gaussian Tube Formula (2003)
  • Put a tube of radius r about the search region
    λS and the rejection region Rt

(Figure: axes Z1 ~ N(0,1) and Z2 ~ N(0,1); rejection region Rt beyond threshold t, with Tube(Rt, r) reaching down to t - r; Tube(λS, r) of radius r around λS)
  • Find the volume or probability, expand as a power
    series in r, pull off the coefficients

38
(Figure: rejection region Rt in the (Z1, Z2) plane, Z1, Z2 ~ N(0,1), with Tube(Rt, r) of radius r lowering the threshold from t to t - r)
Taylor's Gaussian Tube Formula (2003)
39
EC densities for some standard test statistics
  • Using the Morse theory method (1981, 1995):
  • T, χ², F (1994)
  • Scale space (1995, 2001)
  • Hotelling's T² (1999)
  • Correlation (1999)
  • Roy's maximum root, maximum canonical correlation
    (2007)
  • Wilks' Lambda (2007) (approximation only)
  • Using the Gaussian Kinematic Formula:
  • T, χ², F are now one line
  • Likelihood ratio tests for cone alternatives (e.g.
    chi-bar, beta-bar) and nonnegative least-squares
    (2007)

40
Accuracy of the P-value approximation
The expected EC gives all the polynomial terms in
the expansion for the P-value.
41
Bubbles task in fMRI scanner
  • Correlate bubbles with BOLD at every voxel
  • Calculate Z for each pair (bubble pixel, fMRI
    voxel)
  • a 5D image of Z statistics

(Figure: bubble masks for trials 1, 2, 3, 4, 5, 6, 7, ..., 3000, alongside the corresponding fMRI responses)
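The pairwise correlation image can be sketched as a single matrix product over trials: correlate every bubble pixel with every voxel (simulated data; the flattened shapes here are illustrative, not the study's):

```python
import numpy as np

rng = np.random.default_rng(3)
n = 3000
bubbles = rng.standard_normal((n, 50))    # trials x bubble pixels (flattened)
bold = rng.standard_normal((n, 40))       # trials x fMRI voxels (flattened)

def cross_correlation(x, y):
    """Pearson correlation between every column of x and every column of y."""
    xs = (x - x.mean(0)) / x.std(0)
    ys = (y - y.mean(0)) / y.std(0)
    return xs.T @ ys / len(x)             # (pixels, voxels) correlation image

c = cross_correlation(bubbles, bold)
# Fisher's z gives an approximately N(0,1) statistic per pair under the null:
z = np.sqrt(n - 3) * np.arctanh(c)
```

Reshaping `c` back to (pixel rows, pixel cols, voxel x, voxel y, voxel z) gives the 5D image of statistics that the cross-correlation random field theory then thresholds.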
42
Thresholding? Cross correlation random field
  • Correlation between 2 fields at 2 different
    locations,
  • searched over all pairs of locations, one in S,
    one in T
  • Bubbles data: P = 0.05, n = 3000, c = 0.113, T = 6.22

Cao and Worsley, Annals of Applied Probability
(1999)
43
Discussion: modeling
  • The random response is Y = 1 (correct) or 0
    (incorrect), or Y = fMRI
  • The regressors are Xj = bubble mask at pixel j, j = 1,
    ..., 240 x 380 = 91200 (!)
  • Logistic regression or ordinary regression:
  • logit(E(Y)) or E(Y) = b0 + X1 b1 + ... + X91200 b91200
  • But there are only n = 3000 observations (trials)
  • Instead, since the regressors are independent, fit
    them one at a time:
  • logit(E(Y)) or E(Y) = b0 + Xj bj
  • However, the regressors (bubbles) are random with
    a simple known distribution, so turn the problem
    around and condition on Y:
  • E(Xj) = c0 + Y cj
  • Equivalent to conditional logistic regression
    (Cox, 1962), which gives exact inference for b1
    conditional on sufficient statistics for b0
  • Cox also suggested using saddle-point
    approximations to improve the accuracy of the inference
  • Interactions? logit(E(Y)) or E(Y) = b0 + X1 b1 + ... +
    X91200 b91200 + X1 X2 b1,2 + ...
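The one-at-a-time logistic fit can be sketched with a small Newton-Raphson routine in pure NumPy. The trial data below are simulated with hypothetical coefficients; only the single-regressor model form follows the slide:

```python
import numpy as np

def logistic_fit(x, y, iters=25):
    """Fit logit(E(Y)) = b0 + x*b1 by Newton-Raphson; returns (b0, b1)."""
    X = np.column_stack([np.ones_like(x), x])
    beta = np.zeros(2)
    for _ in range(iters):
        p = 1 / (1 + np.exp(-X @ beta))       # fitted probabilities
        W = p * (1 - p)                       # IRLS weights
        beta += np.linalg.solve(X.T @ (W[:, None] * X), X.T @ (y - p))
    return beta

rng = np.random.default_rng(4)
xj = rng.random(3000)                          # one bubble regressor over 3000 trials
true_b0, true_b1 = -1.0, 2.0                   # hypothetical coefficients
p_true = 1 / (1 + np.exp(-(true_b0 + true_b1 * xj)))
y = (rng.random(3000) < p_true).astype(float)  # simulated correct/incorrect

b0, b1 = logistic_fit(xj, y)
```

Looping this over all 91200 pixel regressors gives the massively univariate fit the slide describes; the conditioning trick (regressing Xj on Y instead) avoids even that loop.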