• Probability density function (pdf) estimation using isocontours/isosurfaces
• Application to Image Registration
• Application to Image Filtering
• Circular/spherical density estimation in Euclidean space
Existing estimators: histograms, kernel density estimates, mixture models.

• Parameter selection: bin width / bandwidth / number of components.
• Bias/variance tradeoff: a large bandwidth gives high bias; a low bandwidth gives high variance.
• All are sample-based methods: they do not treat a signal as a signal.
Continuous image representation: a function I(x, y) defined using some interpolant. Trace out isocontours of the intensity at several intensity values.
Level curves at intensity α and α + ∆α.
P(α < I < α + ∆α) ∝ area of the brown region.

Assume a uniform density on (x, y). Apply a random-variable transformation from (x, y) to (I, u), where u runs along the level set of intensity I (a dummy variable):

    p_I(α) = (1/|Ω|) (d/dα) ∫_{I(x,y) ≤ α} dx dy

           = (1/|Ω|) ∫_{I(x,y) = α} du / √(Ix² + Iy²)

Integrate out u along the level set to get the density of intensity I. Every point in the image domain contributes to the density.

Published in CVPR 2006, PAMI 2009.
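The cumulative-then-differentiate view of this formula can be illustrated numerically: approximate the area of {I ≤ α} for a bilinear interpolant on a fine sub-pixel grid, then finite-difference in α. This is a sketch for intuition only; the function name and the supersampling shortcut are mine, not the paper's exact isocontour computation.

```python
import numpy as np

def isocontour_pdf(img, alphas, upsample=8):
    """Approximate p_I(alpha) = (1/|Omega|) d/d(alpha) area{I <= alpha},
    evaluating a bilinear interpolant on a dense sub-pixel grid."""
    h, w = img.shape
    ys = np.linspace(0, h - 1, (h - 1) * upsample + 1)
    xs = np.linspace(0, w - 1, (w - 1) * upsample + 1)
    yi = np.minimum(ys.astype(int), h - 2)
    xi = np.minimum(xs.astype(int), w - 2)
    fy = (ys - yi)[:, None]
    fx = (xs - xi)[None, :]
    # bilinear interpolation at every sub-pixel location
    I = ((1 - fy) * (1 - fx) * img[np.ix_(yi, xi)]
         + (1 - fy) * fx * img[np.ix_(yi, xi + 1)]
         + fy * (1 - fx) * img[np.ix_(yi + 1, xi)]
         + fy * fx * img[np.ix_(yi + 1, xi + 1)])
    # cumulative area fraction, then finite-difference in alpha
    cum = np.array([(I <= a).mean() for a in alphas])
    return np.gradient(cum, alphas)
```

For a linear intensity ramp the interpolated intensities are uniformly distributed, so the estimate should come out flat and integrate to one.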
Level curves at intensity α1 in I1 and α2 in I2.
P(α1 < I1 < α1 + ∆α1, α2 < I2 < α2 + ∆α2) ∝ area of the black region.

    p_{I1,I2}(α1, α2) = (1/|Ω|) Σ_C 1 / ( |∇I1(x,y)| |∇I2(x,y)| |sin θ(x,y)| )

    C = {(x, y) | I1(x, y) = α1, I2(x, y) = α2}
    θ(x, y) = angle between the gradients at (x, y)

Relationships between geometric and probabilistic entities.
• A similar density estimator was developed by Kadir and Brady (BMVC 2005), independently of us.
• Same core idea, but several differences in implementation, motivation, derivation of results, and applications.
    p_I(α) = (1/|Ω|) ∫_{I(x,y) = α} du / |∇I(x, y)|

    p_{I1,I2}(α1, α2) = (1/|Ω|) Σ_C 1 / ( |∇I1(x,y)| |∇I2(x,y)| |sin θ(x,y)| )

    C = {(x, y) | I1(x, y) = α1, I2(x, y) = α2}

Densities (derivatives of the cumulative) do not exist where image gradients are zero, or where the image gradients run parallel. Remedy: compute cumulative interval measures P(α < I < α + ∆) instead.
[Figure: estimated densities with 32, 64, 128, 256, 512, and 1024 bins — standard histograms vs. the isocontour method.]
• Randomized/digital approximation to the area calculation.
• A strict lower bound on the accuracy of the isocontour method, for a fixed interpolant.
• Computationally more expensive than the isocontour method.
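A sketch of such a randomized approximation: draw uniform random locations in the image domain, evaluate the bilinear interpolant there, and histogram the interpolated intensities. This assumes bilinear interpolation and intensities scaled to [0, 1]; the function name is illustrative.

```python
import numpy as np

rng = np.random.default_rng(0)

def sampled_histogram_pdf(img, bins, n_samples=200_000):
    """Randomized (sample-based) approximation to the intensity pdf:
    histogram of the bilinear interpolant at random sub-pixel points."""
    h, w = img.shape
    y = rng.uniform(0, h - 1, n_samples)
    x = rng.uniform(0, w - 1, n_samples)
    yi = np.minimum(y.astype(int), h - 2)
    xi = np.minimum(x.astype(int), w - 2)
    fy, fx = y - yi, x - xi
    vals = ((1 - fy) * (1 - fx) * img[yi, xi]
            + (1 - fy) * fx * img[yi, xi + 1]
            + fy * (1 - fx) * img[yi + 1, xi]
            + fy * fx * img[yi + 1, xi + 1])
    pdf, edges = np.histogram(vals, bins=bins, range=(0.0, 1.0), density=True)
    return pdf, edges
```

Accuracy grows with the number of samples, which is exactly why this route is more expensive than computing isocontour areas exactly.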
[Figure: joint density estimates with 128 x 128 bins.]
• Simplest one: a linear interpolant on each half-pixel (level curves are line segments).
• Low-order polynomial interpolants: high bias, low variance.
• High-order polynomial interpolants: low bias, high variance.
Polynomial interpolant: the accuracy of the estimated density improves as the signal is sampled at finer resolution.

Assumptions on the signal admit a better interpolant. For a bandlimited analog signal, Nyquist-sampled, accurate reconstruction is possible with the sinc interpolant (Whittaker-Shannon sampling theorem).
• Probability density function (pdf) estimation using isocontours
• Application to Image Registration
• Application to Image Filtering
• Circular/spherical density estimation in Euclidean space
Given two images of an object, find the geometric transformation that “best” aligns one with the other, w.r.t. some image similarity measure.

• Mutual Information: a well-known image similarity measure; see Viola and Wells (IJCV 1995) and Maes et al. (TMI 1997).
• Insensitive to illumination changes: useful in multimodality image registration.
    MI(I1, I2) = H(I1) − H(I1 | I2)
               = H(I1) + H(I2) − H(I1, I2)

    H(I1) = − Σ_{α1} p1(α1) log p1(α1)
    H(I2) = − Σ_{α2} p2(α2) log p2(α2)
    H(I1, I2) = − Σ_{α1} Σ_{α2} p12(α1, α2) log p12(α1, α2)

Marginal probabilities: p1(α1), p2(α2). Joint probability: p12(α1, α2).
Marginal entropy H(I1), joint entropy H(I1, I2), conditional entropy H(I1 | I2).
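These entropies are straightforward to compute from a joint intensity histogram. A minimal sample-based sketch (the standard histogram route, not the isocontour estimator; intensities assumed in [0, 1]):

```python
import numpy as np

def mutual_information(im1, im2, bins=32):
    """MI(I1, I2) = H(I1) + H(I2) - H(I1, I2), estimated from a joint
    intensity histogram of two equally-sized grayscale images."""
    joint, _, _ = np.histogram2d(im1.ravel(), im2.ravel(),
                                 bins=bins, range=[[0, 1], [0, 1]])
    p12 = joint / joint.sum()          # joint probability p12(a1, a2)
    p1 = p12.sum(axis=1)               # marginal p1(a1)
    p2 = p12.sum(axis=0)               # marginal p2(a2)

    def entropy(p):
        p = p[p > 0]                   # 0 log 0 = 0 by convention
        return -np.sum(p * np.log(p))

    return entropy(p1) + entropy(p2) - entropy(p12.ravel())
```

An image is maximally informative about itself, so MI of an image with itself should exceed MI with an unrelated image.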
Hypothesis: if the alignment between the images I1 and I2 is optimal, then mutual information is maximal.

p12(i, j), H(I1, I2) and MI(I1, I2) are all functions of the geometric transformation.
[Figure: results under noise levels σ = 0.05, 0.2, 0.7.]

[Figure: σ = 0.05 with 32 and 128 bins. PVI = partial volume interpolation (Maes et al., TMI 1997).]
[Figure: PD slice; warped T2 slice; noisy warped T2 slice.]

Brute-force search for the maximum of MI.
[Figure: MI with standard histograms vs. MI with our method, σ = 0.7, as a function of the affine transformation parameters.]

Parameters of the affine transformation: θ = 30, s = t = −0.3, ϕ = 0.
32 bins:

Method                  Error in θ (avg., var.)   Error in s (avg., var.)   Error in t (avg., var.)
Histograms (bilinear)   3.7, 18.1                 0.7, 0                    0.43, 0.08
Isocontours             0, 0.06                   0, 0                      0, 0
PVI                     1.9, 8.5                  0.56, 0.08                0.49, 0.1
Histograms (cubic)      0.3, 49.4                 0.7, 0                    0.2, 0
2DPointProb             0.3, 0.22                 0, 0                      0, 0
• Probability density function (pdf) estimation using isocontours
• Application to Image Registration
• Application to Image Filtering
• Circular/spherical density estimation in Euclidean space
Anisotropic neighborhood filters (kernel density based filters), grayscale images:

    I(a, b) = [ Σ_{(x,y) ∈ N(a,b)} K(I(x,y) − I(a,b); σ) I(x,y) ]
            / [ Σ_{(x,y) ∈ N(a,b)} K(I(x,y) − I(a,b); σ) ]

• K: a decreasing function (typically a Gaussian).
• Parameter σ controls the degree of anisotropy of the smoothing.
• N(a,b): a neighborhood around the central pixel (a, b).
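A direct sketch of the discrete filter above, with a Gaussian K and intensities in [0, 1] (the function name and parameter defaults are illustrative):

```python
import numpy as np

def neighborhood_filter(img, sigma=0.1, radius=3):
    """One pass of the anisotropic neighborhood filter on a grayscale
    image: each pixel becomes a K-weighted average of its neighbors,
    with weights K(I(x,y) - I(a,b); sigma), Gaussian K here."""
    h, w = img.shape
    out = np.empty_like(img)
    for a in range(h):
        for b in range(w):
            y0, y1 = max(0, a - radius), min(h, a + radius + 1)
            x0, x1 = max(0, b - radius), min(w, b + radius + 1)
            patch = img[y0:y1, x0:x1]
            # intensity-difference weights: near-valued neighbors dominate
            wts = np.exp(-((patch - img[a, b]) ** 2) / (2 * sigma ** 2))
            out[a, b] = (wts * patch).sum() / wts.sum()
    return out
```

With a small σ, pixels across a strong edge get negligible weight, so the edge is preserved while flat regions are smoothed.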
Anisotropic neighborhood filters: problems.
• Sensitivity to the parameter σ.
• Sensitivity to the size of the neighborhood.
• Do not account for gradient information.
Anisotropic neighborhood filters: problems.
• Treat pixels as independent samples.
Continuous Image Representation
Interpolate in between the pixel values. The discrete filter

    I(a, b) = [ Σ_{(x,y) ∈ N(a,b)} K(I(x,y) − I(a,b); σ) I(x,y) ]
            / [ Σ_{(x,y) ∈ N(a,b)} K(I(x,y) − I(a,b); σ) ]

becomes the continuous version

    I(a, b) = ∫∫_{N(a,b)} I(x, y) K(I(x,y) − I(a,b); σ) dx dy
            / ∫∫_{N(a,b)} K(I(x,y) − I(a,b); σ) dx dy
Continuous Image Representation
Areas between isocontours at intensities α and α + ∆ (divided by the area of the neighborhood) = Pr(α < intensity < α + ∆ | N(a,b)).
    I(a, b) = ∫∫_{N(a,b)} I(x, y) K(I(x,y) − I(a,b); σ) dx dy
            / ∫∫_{N(a,b)} K(I(x,y) − I(a,b); σ) dx dy

    I(a, b) = [ lim_{∆→0} Σ_α Pr(α < I < α + ∆ | N(a,b)) α K(α − I(a,b); σ) ]
            / [ lim_{∆→0} Σ_α Pr(α < I < α + ∆ | N(a,b)) K(α − I(a,b); σ) ]

Areas between isocontours contribute to the weights for averaging.

Published in EMMCVPR 2009.
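The limit-of-sums form can be approximated numerically for one neighborhood: bin the interpolated intensities into intervals, use the (approximate) areas between isocontours as weights, and take the K-weighted average of the bin centers. A sketch only, assuming bilinear interpolation, a pillbox K, intensities in [0, 1], and an illustrative function name:

```python
import numpy as np

def area_weighted_update(patch, center_val, sigma=0.1, nbins=64, upsample=6):
    """Isocontour-flavoured update for one neighborhood: approximate
    Pr(alpha < I < alpha+Delta | N) by area fractions of an upsampled
    bilinear interpolant, then K-weight the bin centers (pillbox K)."""
    h, w = patch.shape
    ys = np.linspace(0, h - 1, (h - 1) * upsample + 1)
    xs = np.linspace(0, w - 1, (w - 1) * upsample + 1)
    yi = np.minimum(ys.astype(int), h - 2); fy = (ys - yi)[:, None]
    xi = np.minimum(xs.astype(int), w - 2); fx = (xs - xi)[None, :]
    I = ((1 - fy) * (1 - fx) * patch[np.ix_(yi, xi)]
         + (1 - fy) * fx * patch[np.ix_(yi, xi + 1)]
         + fy * (1 - fx) * patch[np.ix_(yi + 1, xi)]
         + fy * fx * patch[np.ix_(yi + 1, xi + 1)])
    # interval (area) weights and pillbox kernel weights
    areas, edges = np.histogram(I, bins=nbins, range=(0.0, 1.0))
    centers = 0.5 * (edges[:-1] + edges[1:])
    K = (np.abs(centers - center_val) <= sigma).astype(float)
    wts = areas * K
    return (wts * centers).sum() / wts.sum()
```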
Extension to RGB images. The updated value (R(a,b), G(a,b), B(a,b)) is

    (R, G, B)(a, b) = [ Σ_ᾱ ᾱ Pr(ᾱ < (R,G,B) < ᾱ + ∆ᾱ) K(ᾱ − (R,G,B)(a,b); σ) ]
                    / [ Σ_ᾱ Pr(ᾱ < (R,G,B) < ᾱ + ∆ᾱ) K(ᾱ − (R,G,B)(a,b); σ) ]

where ᾱ is a vector of intensity bins. Joint probability of R, G, B = area of overlap of isocontour pairs from the R, G, B images.
Mean-shift framework
• A clustering method developed by Fukunaga and Hostetler (IEEE Trans. Inf. Theory, 1975).
• Applied to image filtering by Comaniciu and Meer (PAMI 2002).
• Each pixel is updated independently by maximizing a local estimate of the probability density of the joint spatial and intensity parameters.
Mean-shift framework
One step of the mean-shift update around (a, b, c), where c = I(a, b):

 1. (â, b̂, ĉ) := [ Σ_{(x,y) ∈ N(a,b)} (x, y, I(x,y)) w(x,y) ] / [ Σ_{(x,y) ∈ N(a,b)} w(x,y) ]

    (â, b̂) ≡ updated center of the neighborhood, ĉ ≡ updated intensity value

    w(x, y) := exp( −(x − a)²/σs² − (y − b)²/σs² − (I(x,y) − c)²/σr² )

 2. (a, b, c) ⇐ (â, b̂, ĉ)
 3. Repeat steps (1) and (2) till (a, b, c) stops changing.
 4. Set I(a, b) ⇐ ĉ.
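Steps (1)–(4) for a single pixel can be sketched as follows (a minimal implementation assuming a grayscale image in [0, 1]; the function name and parameter defaults are illustrative):

```python
import numpy as np

def mean_shift_pixel(img, a, b, sigma_s=3.0, sigma_r=0.1,
                     radius=3, max_iter=20, tol=1e-4):
    """Mean-shift update for one pixel: iterate the weighted mean of
    (x, y, I(x,y)) over the neighborhood until (a, b, c) stops
    changing, then return the converged intensity c."""
    h, w = img.shape
    ac, bc, c = float(a), float(b), float(img[a, b])
    for _ in range(max_iter):
        y0, y1 = max(0, int(ac) - radius), min(h, int(ac) + radius + 1)
        x0, x1 = max(0, int(bc) - radius), min(w, int(bc) + radius + 1)
        ys, xs = np.mgrid[y0:y1, x0:x1]
        vals = img[y0:y1, x0:x1]
        # joint spatial + range weights w(x, y)
        wts = np.exp(-((ys - ac) ** 2 + (xs - bc) ** 2) / sigma_s ** 2
                     - (vals - c) ** 2 / sigma_r ** 2)
        na = (wts * ys).sum() / wts.sum()
        nb = (wts * xs).sum() / wts.sum()
        nc = (wts * vals).sum() / wts.sum()
        done = max(abs(na - ac), abs(nb - bc), abs(nc - c)) < tol
        ac, bc, c = na, nb, nc
        if done:
            break
    return c
```

Filtering the whole image means running this update once per pixel, each pixel independently.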
Our Method in Mean-shift Setting
[Figure: the three feature images I(x,y), X(x,y) = x, Y(x,y) = y.]
Our Method in Mean-shift Setting
Facets of the tessellation induced by the isocontours and the pixel grid:

    (X_k, Y_k) = centroid of facet k
    I_k = intensity (from the interpolated image) at (X_k, Y_k)
    A_k = area of facet k

    (X(a,b), Y(a,b), I(a,b)) =
        [ Σ_{k ∈ N(a,b)} (X_k, Y_k, I_k) A_k K( ‖(X_k, Y_k, I_k) − (X(a,b), Y(a,b), I(a,b))‖ ; σ ) ]
      / [ Σ_{k ∈ N(a,b)} A_k K( ‖(X_k, Y_k, I_k) − (X(a,b), Y(a,b), I(a,b))‖ ; σ ) ]
Experimental Setup: Grayscale Images
• Piecewise-linear interpolation used for our method in all experiments.
• For our method, kernel K = pillbox kernel, i.e.
      K(z; σ) = 1 if |z| ≤ σ,   K(z; σ) = 0 if |z| > σ.
• For discrete mean shift, kernel K = Gaussian.
• Parameters used: neighborhood radius ρ = 3, σ = 3.
• Noise model: Gaussian noise of variance 0.003 (on a scale of 0 to 1).
[Figure: original image; noisy image; denoised (isocontour mean shift); denoised (Gaussian kernel mean shift).]
[Figure: original image; noisy image; denoised (isocontour mean shift); denoised (standard mean shift).]
Experiments on color images
• Pillbox kernels for our method.
• Gaussian kernels for discrete mean shift.
• Parameters used: neighborhood radius ρ = 6, σ = 6.
• Noise model: independent Gaussian noise on each channel with variance 0.003 (on a scale of 0 to 1).
Experiments on color images
• Independent piecewise-linear interpolation on the R, G, B channels in our method.
• Smoothing of the R, G, B values done by coupled updates using joint probabilities.
[Figures: original image; noisy image; denoised (isocontour mean shift); denoised (Gaussian kernel mean shift).]
Observations
• Discrete kernel mean shift performs poorly with small neighborhoods and small values of σ.
• Why? Kernel density estimation suffers from the small sample size.
• The isocontour-based method performs well even in this scenario (the number of isocontours/facets >> the number of pixels).
• A large σ or a large neighborhood is not always necessary for smoothing.
Observations
• Superior behavior observed when comparing isocontour-based neighborhood filters with standard neighborhood filters, for the same parameter set and the same number of iterations.
• Probability density function (pdf) estimation using isocontours
• Application to Image Registration
• Application to Image Filtering
• Circular/spherical density estimation in Euclidean space
Examples of unit vector data:

1. Chromaticity vectors of color values:

       (r, g, b) = (R, G, B) / √(R² + G² + B²)

2. Hue (from the HSI color scheme) obtained from the RGB values:

       θ = arctan( √3 (G − B) / (2R − G − B) )
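Both conversions in code. Note that `atan2` is used instead of a plain arctangent to resolve the full-circle hue angle; that choice is mine, on top of the formula above.

```python
import math

def chromaticity(R, G, B):
    """Unit chromaticity vector (r, g, b) = (R, G, B) / ||(R, G, B)||."""
    m = math.sqrt(R * R + G * G + B * B)
    return (R / m, G / m, B / m)

def hue(R, G, B):
    """Hue angle: theta = atan2(sqrt(3) * (G - B), 2R - G - B)."""
    return math.atan2(math.sqrt(3.0) * (G - B), 2.0 * R - G - B)
```

Pure red gives hue 0; pure green lands a third of the way around the circle.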
Convert RGB values to unit vectors:

    v_i = (r_i, g_i, b_i) = (R_i, G_i, B_i) / √(R_i² + G_i² + B_i²)

Estimate the density of the unit vectors:

    p(v) = (1/N) Σ_{i=1}^{N} K(v; κ, v_i)

    K(v; κ, u) = C_p(κ) e^{κ vᵀu}
    K = von Mises-Fisher (voMF) kernel
    κ = concentration parameter
    C_p(κ) = normalization constant

The voMF kernel is popular in mixture models: Banerjee et al. (JMLR 2005). Other kernels: Watson, cosine.
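A minimal voMF kernel density estimate for 3-d unit vectors. The closed-form constant C₃(κ) = κ / (4π sinh κ) for the sphere S² is standard for the 3-d vMF density but is my addition, not stated on the slide; moderate κ is assumed to avoid overflow in sinh.

```python
import numpy as np

def vmf_kde(v, data, kappa=20.0):
    """von Mises-Fisher KDE on the unit sphere S^2:
    p(v) = (1/N) sum_i C3(kappa) exp(kappa * v . u_i),
    with C3(kappa) = kappa / (4 pi sinh(kappa))."""
    c3 = kappa / (4.0 * np.pi * np.sinh(kappa))
    # data: (N, 3) array of unit vectors; v: a single unit vector
    return c3 * np.mean(np.exp(kappa * data @ v))
```

The density integrates to one over the sphere and peaks at the data directions.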
Alternative: estimate the density of RGB values using KDE/mixture models in Euclidean space, then use a random-variable transformation to get the density of (magnitude, chromaticity):

    p(m, v) = m² p(R, G, B),   m = √(R² + G² + B²)

Gaussian KDE on the RGB values (up to normalization):

    p(R, G, B) = (1/N) Σ_{i=1}^{N} exp( −((R − R_i)² + (G − G_i)² + (B − B_i)²) / σ² )

Density of chromaticity: either integrate out the magnitude,

    p(v) = ∫_0^∞ m² p(R, G, B) dm

or condition on m = 1, which yields a voMF-type mixture:

    p(v) ∝ (1/N) Σ_{i=1}^{N} (1 / (2π I_0(κ_i))) exp(κ_i vᵀv_i),   κ_i = 2 m_i / σ²,   m_i = √(R_i² + G_i² + B_i²)

Projected normal estimator: Watson, “Statistics on spheres”, 1983; Small, “The statistical theory of shape”, 1995. Variable-bandwidth voMF KDE: Bishop, “Neural networks for pattern recognition”, 2006.

What’s new? The notion that all estimation can proceed in Euclidean space.
Estimate the density of RGB using KDE/mixture models.

Use a random-variable transformation to get the density of HSI (hue, saturation, intensity).

Integrate out S and I to get the density of hue.
• Consistency between the densities of Euclidean and unit-vector data (in terms of random-variable transformation/conditioning).
• Potential to use the large body of literature available for statistics of Euclidean data, e.g. the Fast Gauss Transform: Greengard et al. (SIAM Sci. Computing 1991), Duraiswami et al. (IJCV 2003).
• Model selection can be done in Euclidean space.

CVPR2010: Advanced ITinCVPR in a Nutshell: part 4: additional slides

  • 1. 1
  • 2. Probability density function (pdf) estimation using isocontours/isosurfaces  Application to Image Registration  Application to Image Filtering  Circular/spherical density estimation in Euclidean space 2
  • 3. Histograms Kernel Mixture model density estimate Parameter selection: Sample-based bin-width/bandwidth/number methods of components Bias/variance tradeoff: Do not treat a large bandwidth: high bias, signal as a low bandwidth: high variance) signal 3
  • 4. Continuous image representation: function I(x,y) at Trace out isocontours of the intensity using some interpolant. values. several intensity 4
  • 5. P (αLevel< α + Curves at Intensity α α + region < I Level ∆ α ) ∝ area of brown ∆ α Curves at Intensity α and
  • 6.  Assume a uniform 1 d   density on (x,y) p I (α ) =  | Ω | dα  ∫ dxdy    I ( x, y )≤ α  Random variable 1 du transformation from = |Ω | ∫ I2 + I2 I ( x, y )= α x y (x,y) to (I,u) Every point in the image Integrate out u along the u = direction to get domain contributes to densitylevel set the of intensity I the density. (dummy variable) Published in CVPR 2006, PAMI 2009. 6
  • 7. 7
  • 8. P (α 1 < I1 < α 1 at Intensity (α 1I 2 <+α∆ 2α + )∆ αI1 ) ∝ Level Curves + ∆ α 1 ,α 2 < , α 1 1 in 2 Level Curves at Intensity α 1 in I1 and α 2 in I 2 area ofα black region I 2 and ( 2 , α 2 + ∆ α 2 ) in 1 1 p I1 ,I 2 (α 1 , α 2 ) = |Ω | ∑ C | ∇ I1 ( x, y)∇ I 2 ( x, y) sin(θ ( x, y)) | C = {( x , y) | I1 ( x , y) = α 1 , I 2 ( x, y) = α 2 } θ ( x , y) = angle between gradients at ( x, y) Relationships between geometric and probabilistic entities. 8
  • 9.  A similar density estimator was developed by Kadir and Brady (BMVC 2005) independently of us.  Similar idea, but several differences in implementation, motivation, derivation of results and applications.
  • 10. p_I(α) = (1/|Ω|) ∫_{I(x,y)=α} du / |∇I(x,y)|
p_{I1,I2}(α1, α2) = (1/|Ω|) Σ_C 1 / (|∇I1(x,y)| |∇I2(x,y)| |sin θ(x,y)|)
C = {(x, y) | I1(x, y) = α1, I2(x, y) = α2}
Densities (derivatives of the cumulative) do not exist where image gradients are zero, or where the image gradients run parallel. Instead, compute cumulative interval measures P(α < I < α + ∆).
  • 11.
  • 12. Standard histograms vs. the isocontour method (32, 64, 128, 256, 512, 1024 bins).
  • 13.
  • 14. Randomized/digital approximation to area calculation.  Strict lower bound on the accuracy of the isocontour method, for a fixed interpolant.  Computationally more expensive than the isocontour method. 14
  • 15. 128 × 128 bins.
  • 16. Simplest one: linear interpolant to each half- pixel (level curves are segments).  Low-order polynomial interpolants: high bias, low variance.  High-order polynomial interpolants: low bias, high variance. 16
  • 17. Polynomial interpolant: the accuracy of the estimated density improves as the signal is sampled at finer resolution. Assumptions on the signal: bandlimited analog signal, Nyquist-sampled digital signal. Better interpolant: accurate reconstruction by the sinc interpolant (Whittaker-Shannon sampling theorem).
  • 18. Probability density function (pdf) estimation using isocontours  Application to Image Registration  Application to Image Filtering  Circular/spherical density estimation in Euclidean space 18
  • 19. Given two images of an object, find the geometric transformation that "best" aligns one with the other, w.r.t. some image similarity measure. Mutual information: well-known image similarity measure, Viola and Wells (IJCV 1995) and Maes et al. (TMI 1997). Insensitive to illumination changes: useful in multimodality image registration.
  • 20. MI(I1, I2) = H(I1) − H(I1 | I2) = H(I1) + H(I2) − H(I1, I2)
Marginal probabilities: p1(α1), p2(α2); joint probability: p12(α1, α2).
Marginal entropies: H(I1) = −Σ_{α1} p1(α1) log p1(α1), H(I2) = −Σ_{α2} p2(α2) log p2(α2)
Joint entropy: H(I1, I2) = −Σ_{α1} Σ_{α2} p12(α1, α2) log p12(α1, α2)
Conditional entropy: H(I1 | I2)
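As a concrete reference, mutual information built from the entropy identities above can be computed from a joint histogram in a few lines. This is a minimal sketch: the bin count and the [0, 1] intensity range are my assumptions, and it uses ordinary histograms rather than the isocontour estimator.

```python
import numpy as np

def mutual_information(im1, im2, n_bins=32):
    """MI(I1, I2) = H(I1) + H(I2) - H(I1, I2), estimated from the joint
    histogram of corresponding pixel intensities (assumed in [0, 1])."""
    joint, _, _ = np.histogram2d(im1.ravel(), im2.ravel(),
                                 bins=n_bins, range=[[0, 1], [0, 1]])
    p12 = joint / joint.sum()        # joint probability p12(a1, a2)
    p1 = p12.sum(axis=1)             # marginal p1(a1)
    p2 = p12.sum(axis=0)             # marginal p2(a2)
    nz = p12 > 0
    h12 = -np.sum(p12[nz] * np.log(p12[nz]))         # joint entropy
    h1 = -np.sum(p1[p1 > 0] * np.log(p1[p1 > 0]))    # marginal entropies
    h2 = -np.sum(p2[p2 > 0] * np.log(p2[p2 > 0]))
    return h1 + h2 - h12
```

An image compared with itself gives MI = H(I1), the maximum; an image compared with independent noise gives MI near zero.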
  • 21. Hypothesis: if the alignment between the images I1 and I2 is optimal, then mutual information is maximal. p12(i, j), H(I1, I2) and MI(I1, I2) are functions of the geometric transformation.
  • 22. σ = 0.05, σ = 0.2, σ = 0.7.
  • 23. σ = 0.05; 32 bins and 128 bins. PVI = partial volume interpolation (Maes et al., TMI 1997).
  • 24. PD slice; T2 slice; warped T2 slice; warped and noisy T2 slice. Brute-force search for the maximum of MI.
  • 25. MI with standard histograms vs. MI with our method, σ = 0.7. Parameters of the affine transformation: θ = 30, s = t = −0.3, ϕ = 0.
  • 26. Errors in the recovered registration parameters (avg., var.):
Method                          | Error in θ  | Error in s  | Error in t
Histograms (bilinear), 32 bins  | 3.7, 18.1   | 0.7, 0      | 0.43, 0.08
Isocontours                     | 0, 0.06     | 0, 0        | 0, 0
PVI                             | 1.9, 8.5    | 0.56, 0.08  | 0.49, 0.1
Histograms (cubic)              | 0.3, 49.4   | 0.7, 0      | 0.2, 0
2DPointProb                     | 0.3, 0.22   | 0, 0        | 0, 0
  • 27. Probability density function (pdf) estimation using isocontours  Application to Image Registration  Application to Image Filtering  Circular/spherical density estimation in Euclidean space 27
  • 28. Anisotropic neighborhood filters (kernel density based filters), grayscale images:
Î(a,b) = Σ_{(x,y)∈N(a,b)} K(I(x,y) − I(a,b); σ) I(x,y) / Σ_{(x,y)∈N(a,b)} K(I(x,y) − I(a,b); σ)
K: a decreasing function (typically Gaussian). The parameter σ controls the degree of anisotropy of the smoothing. N(a,b): neighborhood around the central pixel (a,b).
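A direct sketch of this discrete neighborhood filter (range weights only, Gaussian K; the function name and parameter defaults are mine):

```python
import numpy as np

def neighborhood_filter(img, radius=3, sigma=0.1):
    """One pass of the anisotropic neighborhood filter: each pixel becomes
    a weighted average of its neighbors, weighted by a Gaussian K of the
    intensity difference to the central pixel."""
    out = np.empty_like(img)
    h, w = img.shape
    for i in range(h):
        for j in range(w):
            i0, i1 = max(0, i - radius), min(h, i + radius + 1)
            j0, j1 = max(0, j - radius), min(w, j + radius + 1)
            patch = img[i0:i1, j0:j1]
            # K(I(x,y) - I(a,b); sigma), Gaussian in the intensity difference
            wgt = np.exp(-((patch - img[i, j]) ** 2) / (2 * sigma ** 2))
            out[i, j] = np.sum(wgt * patch) / np.sum(wgt)
    return out
```

With a small σ the weight across a large intensity jump is essentially zero, so strong edges are preserved while flat regions are smoothed, which is the anisotropic behavior the slide describes.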
  • 29. Anisotropic neighborhood filters, problems: sensitivity to the parameter σ; sensitivity to the SIZE of the neighborhood; does not account for gradient information.
  • 30. Anisotropic Neighborhood filters: Problems Treat pixels as independent samples 30
  • 31. Continuous image representation: interpolate in between the pixel values. Replace the discrete sum
Î(a,b) = Σ_{(x,y)∈N(a,b)} K(I(x,y) − I(a,b); σ) I(x,y) / Σ_{(x,y)∈N(a,b)} K(I(x,y) − I(a,b); σ)
by the integral
Î(a,b) = ∬_{N(a,b)} I(x,y) K(I(x,y) − I(a,b); σ) dx dy / ∬_{N(a,b)} K(I(x,y) − I(a,b); σ) dx dy
  • 32. Continuous Image Representation Areas between isocontours at intensity α and α+Δ (divided by area of neighborhood)= Pr(α<Intensity <α+Δ|N(a,b)) 32
  • 33. Î(a,b) = lim_{∆→0} Σ_α Pr(α < I < α + ∆ | N(a,b)) · α · K(α − I(a,b); σ) / lim_{∆→0} Σ_α Pr(α < I < α + ∆ | N(a,b)) · K(α − I(a,b); σ)
The areas between isocontours contribute to the weights for averaging. Published in EMMCVPR 2009.
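This limit can be approximated without an explicit isocontour tessellation by densely resampling the interpolated neighborhood, so that each sub-pixel sample represents an equal area element. The sketch below is a stand-in for the exact area computation, not the published algorithm; the function name is mine.

```python
import numpy as np

def area_weighted_average(patch, center_val, sigma=0.1, upsample=8):
    """Approximate the continuous filter response for one neighborhood:
    densely resample the bilinearly interpolated patch, then form
    sum(area * alpha * K(alpha - c)) / sum(area * K(alpha - c))."""
    h, w = patch.shape
    ys = np.linspace(0, h - 1, h * upsample)
    xs = np.linspace(0, w - 1, w * upsample)
    y0 = np.floor(ys).astype(int).clip(0, h - 2)
    x0 = np.floor(xs).astype(int).clip(0, w - 2)
    fy, fx = (ys - y0)[:, None], (xs - x0)[None, :]
    # Each entry of `vals` stands for an equal sub-pixel area element.
    vals = (patch[np.ix_(y0, x0)] * (1 - fy) * (1 - fx)
            + patch[np.ix_(y0, x0 + 1)] * (1 - fy) * fx
            + patch[np.ix_(y0 + 1, x0)] * fy * (1 - fx)
            + patch[np.ix_(y0 + 1, x0 + 1)] * fy * fx)
    wgt = np.exp(-((vals - center_val) ** 2) / (2 * sigma ** 2))
    return np.sum(wgt * vals) / np.sum(wgt)
```

Because the weights come from interpolated sub-pixel values rather than from the pixel samples alone, many "virtual" samples contribute even inside a small neighborhood.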
  • 34. Extension to RGB images: smooth R(a,b), G(a,b), B(a,b) jointly,
the filtered (R, G, B)(a,b) = Σ_α α Pr(α < (R,G,B) < α + ∆) K(α − (R,G,B); σ) / Σ_α Pr(α < (R,G,B) < α + ∆) K(α − (R,G,B); σ)
where α is a vector of (R, G, B) intensity levels. Joint probability of R, G, B = area of overlap of isocontour pairs from the R, G, B images.
  • 35. Mean-shift framework • A clustering method developed by Fukunaga & Hostetler (IEEE Trans. Inf. Theory, 1975). • Applied to image filtering by Comaniciu and Meer (PAMI 2003). • Involves an independent update of each pixel by maximization of a local estimate of the probability density of the joint spatial and intensity parameters.
  • 36. Mean-shift framework • One step of the mean-shift update around (a, b, c), where c = I(a, b):
1. (a', b', c') := Σ_{(x,y)∈N(a,b)} (x, y, I(x,y)) w(x,y) / Σ_{(x,y)∈N(a,b)} w(x,y),
where w(x,y) := exp(−(x − a)²/σs² − (y − b)²/σs² − (I(x,y) − c)²/σr²);
(a', b') ≡ updated center of the neighborhood, c' ≡ updated intensity value.
2. (a, b, c) ⇐ (a', b', c').
3. Repeat steps (1) and (2) till (a, b, c) stops changing.
4. Set I(a, b) ⇐ c'.
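The four steps above, sketched for a single pixel (a minimal implementation with a square window and a fixed iteration cap instead of a convergence test; names and defaults are mine):

```python
import numpy as np

def mean_shift_pixel(img, a, b, sigma_s=3.0, sigma_r=0.1, radius=3, n_iter=10):
    """Iterate the joint spatial-range mean-shift update for pixel (a, b):
    (a, b, c) <- weighted centroid of (x, y, I(x, y)) over the neighborhood,
    with Gaussian weights in both space and intensity. Returns the final c."""
    h, w = img.shape
    ys, xs = np.mgrid[0:h, 0:w]
    fa, fb = float(a), float(b)
    c = img[a, b]
    for _ in range(n_iter):
        # w(x, y) = exp(-(x-a)^2/ss^2 - (y-b)^2/ss^2 - (I(x,y)-c)^2/sr^2)
        wgt = np.exp(-((xs - fb) ** 2 + (ys - fa) ** 2) / sigma_s ** 2
                     - (img - c) ** 2 / sigma_r ** 2)
        # Restrict to the square window around the current center.
        mask = (np.abs(xs - fb) <= radius) & (np.abs(ys - fa) <= radius)
        wgt = wgt * mask
        s = wgt.sum()
        fa = (wgt * ys).sum() / s   # updated center row
        fb = (wgt * xs).sum() / s   # updated center column
        c = (wgt * img).sum() / s   # updated intensity
    return c
```

On a step image, a pixel on either side of the edge converges to its own plateau's intensity, since the range weight suppresses the other side.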
  • 37. Our method in the mean-shift setting: I(x,y), X(x,y) = x, Y(x,y) = y.
  • 38. Our method in the mean-shift setting. Facets of the tessellation induced by the isocontours and the pixel grid:
(Xk, Yk) = centroid of facet k; Ik = intensity (from the interpolated image) at (Xk, Yk); Ak = area of facet k.
The updated (X(a,b), Y(a,b), I(a,b)) = Σ_{k∈N(a,b)} (Xk, Yk, Ik) Ak K(‖(Xk, Yk, Ik) − (X(a,b), Y(a,b), I(a,b))‖; σ) / Σ_{k∈N(a,b)} Ak K(‖(Xk, Yk, Ik) − (X(a,b), Y(a,b), I(a,b))‖; σ)
  • 39. Experimental setup, grayscale images: • Piecewise-linear interpolation used for our method in all experiments. • For our method, kernel K = pillbox kernel, i.e. K(z; σ) = 1 if |z| ≤ σ, and K(z; σ) = 0 if |z| > σ. • For discrete mean-shift, kernel K = Gaussian. • Parameters used: neighborhood radius ρ = 3, σ = 3. • Noise model: Gaussian noise of variance 0.003 (on a scale of 0 to 1).
  • 40. Original image; noisy image; denoised (isocontour mean shift); denoised (Gaussian kernel mean shift).
  • 41. Original image; noisy image; denoised (isocontour mean shift); denoised (standard mean shift).
  • 42. Experiments on color images • Use of pillbox kernels for our method. • Use of Gaussian kernels for discrete mean shift. • Parameters used: neighborhood radius ρ= 6, σ = 6. • Noise model: Independent Gaussian noise on each channel with variance 0.003 (on a scale of 0 to 1). 42
  • 43. Experiments on color images • Independent piecewise-linear interpolation on R,G,B channels in our method. • Smoothing of R, G, B values done by coupled updates using joint probabilities. 43
  • 44. Original image; noisy image; denoised (isocontour mean shift); denoised (Gaussian kernel mean shift).
  • 45. Original image; noisy image; denoised (isocontour mean shift); denoised (Gaussian kernel mean shift).
  • 46. Observations • Discrete kernel mean shift performs poorly with small neighborhoods and small values of σ. • Why? Small sample-size problem for kernel density estimation. • Isocontour based method performs well even in this scenario (number of isocontours/facets >> number of pixels). • Large σ or large neighborhood values not always necessary for smoothing. 46
  • 47. Observations • Superior behavior observed when comparing isocontour-based neighborhood filters with standard neighborhood filters for the same parameter set and the same number of iterations. 47
  • 48. Probability density function (pdf) estimation using isocontours  Application to Image Registration  Application to Image Filtering  Circular/spherical density estimation in Euclidean space 48
  • 49.  Examples of unit vector data:
1. Chromaticity vectors of color values: (r, g, b) = (R, G, B) / √(R² + G² + B²).
2. Hue (from the HSI color scheme) obtained from the RGB values: θ = arctan(√3 (G − B) / (2R − G − B)).
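Both mappings are easy to state in code. A small sketch (function names are mine; I use the quadrant-correct atan2 form of the hue formula rather than plain arctan):

```python
import numpy as np

def chromaticity(rgb):
    """Project an RGB triple onto the unit sphere: v = (R,G,B)/||(R,G,B)||."""
    rgb = np.asarray(rgb, dtype=float)
    return rgb / np.linalg.norm(rgb)

def hue_angle(rgb):
    """Hue from RGB: theta = atan2(sqrt(3)*(G - B), 2R - G - B)."""
    r, g, b = rgb
    return np.arctan2(np.sqrt(3.0) * (g - b), 2.0 * r - g - b)
```

Pure red maps to hue 0 and pure green to 2π/3, so the angle walks around the color circle as expected.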
  • 50. Convert RGB values to unit vectors: vi = (ri, gi, bi) = (Ri, Gi, Bi) / √(Ri² + Gi² + Bi²).
Estimate the density of the unit vectors: p(v) = (1/N) Σ_{i=1}^{N} K(v; κ, vi), with the von Mises-Fisher kernel K(v; κ, u) = Cp(κ) e^{κ vᵀu}, where κ = concentration parameter and Cp(κ) = normalization constant. Other popular kernels: Watson, cosine. voMF mixture models: Banerjee et al (JMLR 2005).
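A minimal voMF KDE on the unit sphere (p = 3), using the closed-form normalizer C3(κ) = κ / (4π sinh κ); the function name and the fixed κ are mine:

```python
import numpy as np

def vmf_kde(v, samples, kappa=20.0):
    """Kernel density estimate on the unit sphere with a von Mises-Fisher
    kernel K(v; kappa, u) = C3(kappa) * exp(kappa * v.u), where
    C3(kappa) = kappa / (4*pi*sinh(kappa)) for p = 3."""
    c3 = kappa / (4.0 * np.pi * np.sinh(kappa))
    dots = samples @ v            # v.u for each sample direction u
    return c3 * np.mean(np.exp(kappa * dots))
```

The estimate peaks near the sample directions and decays toward the antipode, with κ playing the role of an inverse bandwidth.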
  • 51. Estimate the density of (R, G, B) with a Gaussian KDE:
p(R, G, B) = (1/N) Σ_{i=1}^{N} exp(−[(R − Ri)² + (G − Gi)² + (B − Bi)²] / σ²) (up to normalization).
Random-variable transformation to (magnitude, chromaticity): p(m, v) = m² p(R, G, B), with m = √(R² + G² + B²).
Density of chromaticity: either integrate out the magnitude, p(v) = ∫_{m=0}^{∞} m² p(R, G, B) dm, or condition on m = 1; this yields a voMF-like KDE with sample-dependent concentration parameters κi (a variable-bandwidth voMF KDE).
Projected normal estimator: Watson, "Statistics on spheres", 1983; Bishop, "Neural networks for pattern recognition", 2006; Small, "The statistical theory of shape", 1995.
What's new? The notion that all estimation can proceed in Euclidean space.
  • 52. Estimate the density of RGB using KDE / mixture models. Use a random-variable transformation to get the density of HSI (hue, saturation, intensity). Integrate out S and I to get the density of hue.
  • 53. Consistency between the densities of Euclidean and unit vector data (in terms of random-variable transformation / conditioning).  Potential to use the large body of literature available for statistics of Euclidean data (examples: the Fast Gauss Transform, Greengard et al (SIAM Sci. Computing 1991); Duraiswami et al (IJCV 2003)).  Model selection can be done in Euclidean space.