Digital Image Processing Part 1 - PowerPoint PPT Presentation

1
Digital Image Processing, Part 1
2
Topics
  • Analog vs. Digital Images
  • Image Resolution
  • Image Pre-Processing
  • Image Enhancement
  • Image Classification
  • Data Merging and GIS Integration
  • Hyperspectral Image Analysis
  • Biophysical Modeling
  • Image Transmission and Compression

3
Introduction
  • Digital image processing involves computer
    manipulation and interpretation.
  • Origins in the 1960s
  • A small number of researchers analyzed limited
    airborne multispectral scanner data and digitized
    aerial photographs.
  • The launch of Landsat-1 in 1972 was responsible
    for making digital image data widely available
    at that time
  • The theory and practice of digital image
    processing were in their infancy
  • The cost of digital computers was very high and
    their efficiency very low
  • Today, low-cost, efficient hardware and software
    are readily available.

4
Introduction
  • Digital image data sources are many and varied,
    such as
  • Earth resource satellite systems
  • Meteorological satellites
  • Airborne scanner data
  • Airborne digital camera data
  • Photogrammetric scanners

5
Analog versus Digital
  • Photographs are analogue (or analog) images
  • These represent a scale model of the feature one
    wants to record
  • Once these images are developed, no further
    processing can easily be done.

6
Analog versus Digital
  • Digital images, on the other hand, are a
    collection of discrete values for each pixel, or
    position on the image
  • A black and white image would have a brightness
    level for each pixel
  • Digital images are easier than analog images to
    manipulate by computational means and to
    distribute among different computational
    facilities as electronic files

7
Image Resolution
  • Photographic film resolution is based on being
    able to distinguish two objects from each other
  • Film resolution is the threshold line spacing
    between dark and light lines that can be
    distinguished from each other, e.g., 50
    light-dark line pairs per centimeter.
  • These lines projected onto ground are the ground
    resolution distance (GRD)

8
Image Resolution
  • Digital image resolution is the number of pixels
    per linear scale, e.g., 50 pixels per inch, or
    dots per inch (dpi)
  • Pixels adjacent to each other with the same
    shading or color can't be individually
    distinguished.
  • Digital image resolution must be divided by two
    to be comparable with photographic film
    resolution.

9
Image Resolution
  • Another parameter of digital image resolution is
    the range of grayscale values or color range for
    the pixels
  • A 2-bit image has 2² = 4 values
  • An 8-bit image has 2⁸ = 256 values

http://hosting.soonet.ca/eliris/remotesensing/bl130intro.htm
10
Image Processing
  • Basic image processing is a four-step process
  • Image rectification and restoration
  • Image enhancement
  • Image classification
  • Data merging and GIS integration

(Virtual Science Centre)
11
Image Processing
  • Digital image processing is a broad field and
    mathematically complex
  • Central idea is simple and straightforward
  • Each digital image pixel is input into the
    computer
  • Pixel information is mathematically processed for
    different results
  • Results form a new digital image that may be
    displayed or recorded in pictorial format or
    processed further.

12
Image Processing
  • Additional processing categories, after the basic
    four, are
  • Hyperspectral image analysis
  • Biophysical modeling
  • Image transmission and compression

13
Image Rectification and Restoration
  • The intent is to correct image data for
    distortions or degradations that stem from the
    image acquisition process and conditions.
  • Procedures vary with digital image acquisition
    type, for example
  • Digital camera
  • Along-track scanner
  • Across-track scanner
  • Procedures also vary with airborne versus
    satellite imagery and total field of view.
  • The following corrections are discussed
  • Geometric correction
  • Radiometric correction
  • Noise removal

14
Image Rectification and Restoration: Geometric Correction
  • Image geometric distortions can be significant
    and must be corrected before usage as a map base.
  • Some distortions are
  • Sensor platform variations in altitude, attitude,
    and velocity
  • Oblique viewing angles
  • Earth curvature
  • Atmospheric refraction
  • Relief displacement
  • Sensor sweep nonlinearities
  • Skewing due to Earth's west-to-east rotation
  • The remedy is to de-skew by offsetting each
    successive scan line slightly to the west.
  • 2-D coordinate transformation equations relate
    geometrically correct (map) coordinates to
    distorted image coordinates.
  • Data are also transformed to conform to a
    specific map projection system.

15
Image Rectification and Restoration: Radiometric Correction
  • Radiance measured by a system over a given object
    is influenced by changes in scene illumination,
    atmospheric conditions, viewing geometry, and
    instrument response characteristics.
  • Combined influence of solar zenith angle and
    Earth-Sun distance is given by E = E₀ cos θ₀ / d²
  • Where
  • E = normalized solar irradiance
  • E₀ = solar irradiance at mean Earth-Sun distance
  • θ₀ = Sun's angle from the zenith
  • d = Earth-Sun distance, in astronomical units
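The normalization above can be sketched numerically; the input values below are illustrative, not from the slides.

```python
import math

# Sketch of the solar-illumination normalization:
# E = E0 * cos(theta0) / d^2, with d in astronomical units.
def normalized_solar_irradiance(e0, zenith_deg, d_au):
    return e0 * math.cos(math.radians(zenith_deg)) / d_au ** 2

# Example: E0 = 1000 W/m^2, 30-degree solar zenith, mean Earth-Sun distance.
print(round(normalized_solar_irradiance(1000.0, 30.0, 1.0), 1))   # -> 866.0
```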

16
Image Rectification and Restoration: Radiometric Correction (concluded)
  • Influence of solar illumination variation is
    compounded by atmospheric effects, such as
  • Attenuation of solar energy traveling through it.
  • Acts as a reflector/scatterer, thus generating
    noise.
  • The total radiance measured is given by
    L_tot = ρET/π + L_p
  • where
  • L_tot = total spectral radiance measured by sensor
  • ρ = reflectance of object
  • E = irradiance on object
  • T = transmission of atmosphere
  • L_p = path radiance
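Solving the radiance relationship above for surface reflectance gives ρ = π(L_tot − L_p) / (E·T); a minimal sketch with illustrative values:

```python
import math

# Inverting L_tot = rho*E*T/pi + Lp for the object's reflectance:
# rho = pi * (L_tot - Lp) / (E * T). All numbers below are illustrative.
def reflectance(l_tot, lp, e, t):
    return math.pi * (l_tot - lp) / (e * t)

rho = reflectance(l_tot=80.0, lp=10.0, e=1000.0, t=0.8)
print(round(rho, 3))   # -> 0.275
```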

17
Image Rectification and Restoration: Dropped Lines
  • Dropped lines are rows of pixels lost through
    failures in sensor response or in data recording
    and transmission.
  • A remedy is to sample the neighboring lines and
    fill in the missing row with their average values
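The dropped-line remedy can be sketched as follows: the missing row is replaced by the average of the rows immediately above and below it.

```python
import numpy as np

# Minimal sketch of dropped-line replacement.
def fill_dropped_line(image, row):
    fixed = image.astype(float).copy()
    fixed[row] = (fixed[row - 1] + fixed[row + 1]) / 2.0
    return fixed

img = np.array([[10, 10, 10],
                [0, 0, 0],      # dropped line (all zeros)
                [20, 20, 20]], dtype=float)
print(fill_dropped_line(img, 1)[1])   # -> [15. 15. 15.]
```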

http://hosting.soonet.ca/eliris/remotesensing/bl130intro.htm
18
Image Rectification and Restoration: Noise Removal
  • Noise can be long term drifting (low frequency)
    where pixels have a bias.
  • Several sensors are compared and the anomalous
    one properly offset to compensate.
  • Random noise is usually of small spatial or
    temporal extent (high frequency).
  • The remedy is to compare an off-valued pixel with
    its neighbors.
  • A small window of a few pixels (3×3 or 5×5) is
    sampled.
  • A mathematical average is calculated from the
    pixel values in the sample window.
  • The central pixel is replaced with the average
    value.
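The moving-window remedy above can be sketched as a simple 3×3 mean filter applied to every interior pixel:

```python
import numpy as np

# Sketch of the random-noise remedy: each interior pixel is replaced by
# the mean of its 3x3 neighborhood (a moving-window average).
def mean_filter_3x3(image):
    img = image.astype(float)
    out = img.copy()
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            out[i, j] = img[i - 1:i + 2, j - 1:j + 2].mean()
    return out

img = np.zeros((3, 3))
img[1, 1] = 90.0          # a single noisy "spike" pixel
print(mean_filter_3x3(img)[1, 1])   # -> 10.0
```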

(ccrs.nrcan)
19
Image Rectification and Restoration: Striping
  • Another error common in multispectral imagery is
    striping, or banding.
  • Striping or banding occurs as the sensor scans
    successive lines.
  • The result can be dissimilar data stripes when
    the lines are joined into an image.
  • Smoothing or Fourier transform operations can
    remedy these.

http://hosting.soonet.ca/eliris/remotesensing/bl130intro.htm
20
Image Enhancement
  • Image enhancement eases visual interpretation by
    increasing the apparent distinction between the
    features in a scene.
  • Pixel values are manipulated to achieve this.
  • The most commonly employed digital enhancement
    techniques are
  • Contrast manipulation
  • Gray-level thresholding, level slicing, and
    contrast stretching
  • Spatial feature manipulation
  • Spatial filtering, convolution, edge enhancement,
    and Fourier analysis.
  • Multi-image manipulation
  • Multispectral band ratioing and differencing,
    principal components, canonical components,
    vegetation components, intensity-hue-saturation
    (IHS) color space transformations, and
    decorrelation stretching.

21
Image Enhancement: Contrast Manipulation
  • Gray-level thresholding
  • Level slicing
  • Contrast stretching

22
Image Enhancement: Gray-Level Thresholding
(Image Thresholding)
Threshold at 65
Threshold at 80
23
Image Enhancement
(SPOT before)
(SPOT after)
  • SPOT = Système Pour l'Observation de la Terre
  • Uses linear gray-level stretching
  • Pixels below the threshold value are mapped to
    zero.
  • Other pixel values are mapped between 0 and 255
    (2⁸ levels).
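The linear stretch described above can be sketched in a few lines: values at or below the threshold map to 0, and the remaining range maps linearly onto 0..255.

```python
import numpy as np

# Sketch of a linear gray-level stretch with a lower threshold.
def linear_stretch(band, threshold):
    band = band.astype(float)
    top = band.max()
    out = (band - threshold) / (top - threshold) * 255.0
    return np.clip(out, 0, 255).astype(np.uint8)

band = np.array([10, 60, 110, 160], dtype=float)
print(linear_stretch(band, 60))   # -> [  0   0 127 255]
```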

(Virtual Science Centre)
24
Image Enhancement: Level Slicing
  • Enhancement technique whereby digital numbers
    (DNs) are distributed along the x-axis of image
    histogram into a series of analyst-specified
    intervals or slices.
  • All DNs falling within a given interval in input
    image are displayed as a single DN in the output
    image.
  • Establishment of six different levels would give
    six different gray scales in output image.
  • Each level could be shown as a different color.
  • Level slicing used extensively in display of
    thermal infrared images to show discrete
    temperature ranges in gray scale or color.
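The slicing procedure above can be sketched with `np.digitize`: DNs are binned into analyst-specified intervals and every DN in an interval is displayed as a single output value. The slice boundaries and output values below are illustrative.

```python
import numpy as np

# Sketch of level slicing: DNs are binned into analyst-specified slices
# and each slice is displayed as one output DN.
def level_slice(band, boundaries, output_values):
    idx = np.digitize(band, boundaries)   # slice index for each DN
    return np.asarray(output_values)[idx]

band = np.array([5, 40, 90, 200])
sliced = level_slice(band, boundaries=[50, 150], output_values=[0, 128, 255])
print(sliced)   # -> [  0   0 128 255]
```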

25
Image Enhancement: Practical Aspects of Level Slicing
Tri-level slicing can be used to quantify the
height of objects.
Procedure: 1. Subtract out the range ramp. 2. Use
the range image to quantify the height of the
object.
(Figure: a building puts modulation on the range
ramp R.)
26
Image Enhancement: Level Slicing (Concluded)
http://www.personal.psu.edu/users/k/j/kjv115/Geog%20352%20quiz%204.htm
27
Image Enhancement: Contrast Stretching
  • In raw imagery, useful data often populates only
    a small portion of the available range of digital
    values (usually up to 8 bits, or 2⁸ = 256 levels).
  • Contrast enhancement involves changing the
    original values so that more of the available
    range is used.
  • The key to understanding contrast enhancement is
    to understand the image histogram
  • Graphical representation of the brightness values
    that comprise an image.
  • Brightness values vs. frequency of occurrence
    shown.

(ccrs.nrcan)
28
Image Enhancement: Contrast Stretching (Concluded)
(Before stretch and after stretch)
(ccrs.nrcan)
29
Image Enhancement: Spatial Feature Manipulation
  • Spatial filtering
  • Convolution
  • Edge enhancement
  • Fourier analysis

30
Image Enhancement: Spatial Filtering Methodology
  • While spectral filters block or pass energy over
    various spectral ranges, spatial filters
    emphasize or de-emphasize image data of various
    spatial repetitiveness, or frequencies.
  • Spatial frequency refers to roughness of tonal
    variations in an image.
  • Areas of high spatial frequencies are tonally
    rough.
  • Gray levels in these areas change abruptly over a
    small number of pixels.
  • An analogy is a pebble beach.
  • Areas of low spatial frequencies are tonally
    smooth.
  • Gray levels vary only gradually over a relatively
    large number of pixels.
  • An analogy is a smooth sand beach.
  • Low pass filters are designed to emphasize low
    frequency features, and high pass filters do just
    the opposite.
  • Spatial filtering is accomplished by a local
    operation.
  • Pixel values in an original image are modified on
    the basis of gray level scales of neighboring
    pixels.

31
Image Enhancement: Spatial Filtering
Low pass filtering generalizes image.
High pass filtering highlights abrupt
discontinuities.
(Remote Sensing Tutorial)
32
Image Enhancement: Convolution
  • Convolving an image involves the following
  • A moving established window that contains an
    array of coefficients or weighting factors.
  • These arrays are called operators, or kernels
  • The arrays normally are an odd number of pixels
    (3 x 3, 5 x 5, 7 x 7)
  • The kernel is moved throughout the original
    image, and DN at the center of the kernel in a
    second (convoluted) image is obtained by
    multiplying each coefficient in kernel by
    corresponding DN in original image and adding
    resulting products.
  • Operation is performed for each pixel in original
    image
  • Convolving image results in averaging values in
    moving window.
  • Influence of convolution dependent on size of
    kernel and values of coefficients used in the
    kernel.
  • Center-weight
  • Uniform weight
  • Gaussian weight
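The kernel procedure above can be sketched directly: the 3×3 kernel slides over the image, and each output DN is the sum of coefficient-by-DN products for the corresponding neighborhood. A uniform-weight kernel is used here as the example.

```python
import numpy as np

# Sketch of convolution with a 3x3 kernel: each output DN is the weighted
# sum of the neighborhood under the moving window.
def convolve_3x3(image, kernel):
    img = image.astype(float)
    out = np.zeros_like(img)
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            out[i, j] = (img[i - 1:i + 2, j - 1:j + 2] * kernel).sum()
    return out

uniform = np.full((3, 3), 1.0 / 9.0)   # uniform-weight (averaging) kernel
img = np.arange(9, dtype=float).reshape(3, 3)
print(round(convolve_3x3(img, uniform)[1, 1], 2))   # -> 4.0
```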

33
Image Enhancement: Edge Enhancement
  • Directional, or edge detection filters are used
    to highlight linear features such as roads or
    field boundaries.
  • The Sobel Edge Extractor

(ccrs.nrcan)
Compute sx = (A2 − A0) + 2(A3 − A7) + (A4 − A6)
and sy = (A0 − A6) + 2(A1 − A5) + (A2 − A4). If
sx > 0 and sy > 0, a dot is printed in the center
pixel location.
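The Sobel extractor can be sketched as below, assuming the neighbor labels A0..A7 run clockwise from the top-left of the 3×3 window (A0 A1 A2 across the top, A7 and A3 on the sides, A6 A5 A4 across the bottom); this layout is an assumption consistent with the formulas above.

```python
import numpy as np

# Sketch of the Sobel edge extractor, with A0..A7 clockwise from top-left:
#   A0 A1 A2
#   A7  c A3
#   A6 A5 A4
def sobel_edges(image, threshold=0):
    img = image.astype(float)
    edges = np.zeros(img.shape, dtype=bool)
    for i in range(1, img.shape[0] - 1):
        for j in range(1, img.shape[1] - 1):
            a0, a1, a2 = img[i - 1, j - 1], img[i - 1, j], img[i - 1, j + 1]
            a7, a3 = img[i, j - 1], img[i, j + 1]
            a6, a5, a4 = img[i + 1, j - 1], img[i + 1, j], img[i + 1, j + 1]
            sx = (a2 - a0) + 2 * (a3 - a7) + (a4 - a6)
            sy = (a0 - a6) + 2 * (a1 - a5) + (a2 - a4)
            edges[i, j] = sx > threshold and sy > threshold
    return edges

img = np.array([[0, 5, 9],
                [0, 0, 5],
                [0, 0, 0]], dtype=float)
print(sobel_edges(img)[1, 1])   # -> True
```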
34
Image Enhancement: Fourier Analysis
  • Process involves transforming the spatial domain
    image into the frequency domain with a two
    dimensional Fourier transform.
  • The result is a mapping of image feature
    locations into feature repetitive
    characteristics.
  • In this domain, frequency characteristics can be
    modified.
  • An inverse Fourier transform maps the image back
    into the spatial domain.
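The transform, modify, invert sequence above can be sketched with NumPy's FFT; the circular low-pass mask is one simple choice of frequency-domain modification.

```python
import numpy as np

# Sketch of frequency-domain filtering: 2-D FFT, zero out frequencies
# outside a radius (low-pass), then inverse FFT back to the spatial domain.
def fourier_lowpass(image, radius):
    spectrum = np.fft.fftshift(np.fft.fft2(image))
    rows, cols = image.shape
    y, x = np.ogrid[:rows, :cols]
    mask = (y - rows // 2) ** 2 + (x - cols // 2) ** 2 <= radius ** 2
    return np.real(np.fft.ifft2(np.fft.ifftshift(spectrum * mask)))

img = np.random.default_rng(0).random((32, 32))
smooth = fourier_lowpass(img, radius=4)
print(smooth.shape)   # -> (32, 32)
```

Removing the high-frequency components smooths (generalizes) the image, mirroring the low-pass spatial filter described earlier.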

35
Image Enhancement: Fourier Analysis
(Figure sequence from Introductory Digital Image Processing)
39
Image Enhancement: Fourier Analysis
Spatial domain image on left, and transformed
frequency domain image on right. (Mini Project)
40
Image Enhancement: Fourier Analysis
  • Noise added on upper left, with its frequency
    domain plot on upper right.
  • Result in frequency space to right.
  • (Mini Project)

41
Image Enhancement: Fourier Analysis
  • Low pass filtering in frequency domain to picture
    on left.
  • High pass filtering to picture on right.
  • (Mini Project)

42
Image Enhancement: Multi-Image Manipulation
Single images can convey much information, but
when two complementary images are combined, the
resulting information is better than the simple
individual contributions. Some combinational
techniques are
  • Multispectral band ratioing and differencing
  • Principal components / Canonical components
  • Vegetation components
  • Intensity-hue-saturation (IHS) color space
    transformations
  • Decorrelation stretching

43
Image Enhancement: Multispectral Band Ratioing
  • Common transform applied to image data.
  • Ratio data from two different spectral bands.
  • Resultant image enhances variations in the slopes
    of the spectral reflectance.
  • Example
  • Healthy vegetation reflects in near-IR but
    absorbs in visible red.
  • Other surface types show near equal reflectance
    in the two.
  • (Band 7) / (Band 5) would give ratio much greater
    than 1.0 for vegetation and about 1.0 for other
    covers.
  • ⇒ Discrimination of vegetation is greatly enhanced.

44
Image Enhancement: Multispectral Band Ratioing
Normalized Difference Vegetation Index
(ccrs.nrcan)
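The NDVI belongs to the ratioing family above; it normalizes the near-IR/red contrast as NDVI = (NIR − Red) / (NIR + Red). The band values below are illustrative.

```python
import numpy as np

# Sketch of the Normalized Difference Vegetation Index per pixel:
# NDVI = (NIR - Red) / (NIR + Red)
def ndvi(nir, red):
    nir = nir.astype(float)
    red = red.astype(float)
    return (nir - red) / (nir + red)

nir = np.array([200.0, 60.0])   # a vegetation pixel, a bare-soil pixel
red = np.array([40.0, 50.0])
print(np.round(ndvi(nir, red), 2))   # -> [0.67 0.09]
```

Healthy vegetation (strong near-IR, weak red) scores near 1, while other covers sit near 0, matching the Band 7 / Band 5 discussion above.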
45
Image Enhancement: Multispectral Band Ratioing
  • Image created by ratio of Band 4/Band 2.
  • Band 4 is NIR, Band 2 is visible green.
  • Image seen clearly because very little
    correlation between bands 4 and 2.
  • Image created by ratio of Band 1/Band 2.
  • Band 2 is visible green, Band 1 is visible blue.
  • Image not clear because high correlation between
    Bands 1 and 2 (appears noisy).

(Image Thresholding)
46
Image Enhancement: Multispectral Band Ratioing (Concluded)
  • Image-to-image ratioing applied to identifying
    landslides induced by the Chi-Chi earthquake.
  • Above is image-to-image GCP selection.
    (Left: pre-earthquake, Right: post-earthquake)
  • Pre- (left) and post-earthquake (right)
    band-ratioed (IR/R) images.

(Image Thresholding)
47
Image Enhancement: Principal/Canonical Components
(ccrs.nrcan)
  • Different spectral bands are often highly
    correlated
  • They contain similar information
  • Landsat MSS Bands 4 and 5 (green and red)
    typically have similar appearance.
  • Image transformation based on complex processing
    can reduce data redundancy and correlation.

48
Image Enhancement: Principal/Canonical Components
  • Principal components analysis can reduce the
    number of bands in the data and compress much of
    the information in the original bands into fewer
    bands.
  • A seven-band Thematic Mapper (TM) data set can be
    transformed such that the first three principal
    components contain over 90% of the data in the
    original seven bands.
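A minimal principal-components sketch, assuming pixels are stacked as rows and bands as columns: the eigenvectors of the band covariance matrix define the new, uncorrelated axes, ordered by variance.

```python
import numpy as np

# Sketch of principal components on multiband pixels (rows = pixels,
# columns = bands): project onto the top eigenvectors of the covariance.
def principal_components(pixels, n_components):
    centered = pixels - pixels.mean(axis=0)
    cov = np.cov(centered, rowvar=False)
    eigvals, eigvecs = np.linalg.eigh(cov)
    order = np.argsort(eigvals)[::-1]          # largest variance first
    return centered @ eigvecs[:, order[:n_components]]

rng = np.random.default_rng(1)
pixels = rng.random((100, 7))                  # 100 pixels x 7 TM bands
pcs = principal_components(pixels, 3)
print(pcs.shape)   # -> (100, 3)
```

The resulting component images are mutually uncorrelated, which is the decorrelation property discussed in the following slides.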

49
Image Enhancement: Principal/Canonical Components (Cont.)
  • First PC (Morro Bay)
  • Maximum amount of variation in 7-dimensional
    space defined by seven Thematic Mapper bands.
  • Image produced from PC 1 data commonly resembles
    actual aerial photograph.
  • Histogram shows two peaks, with one on left being
    the ocean pixels and the other being land pixels.

(Remote Sensing Tutorial)
50
Image Enhancement: Principal/Canonical Components (Cont.)
  • Second PC with stretching (histogram
    equalization)
  • The latter produces a histogram where the space
    between the most frequent values is increased and
    less frequent values are combined and compressed.
  • Without transformation, image would be tonally
    flat.
  • Two gray levels defining most of land surfaces.
  • One gray level defining ocean
  • Ocean breakers nicely displayed

(Remote Sensing Tutorial)
51
Image Enhancement: Principal/Canonical Components (Cont.)
  • PC of Morro Bay
  • PC with stretching (histogram equalization)

(Remote Sensing Tutorial)
52
Image Enhancement: Principal/Canonical Components (Cont.)
  • For the Morro Bay TM scene there are 7 spectral
    bands
  • Each pixel has 7 values
  • Pixel in row i, column j of the image is a vector
  • x(i,j,1) x(i,j,2) x(i,j,3) x(i,j,4) x(i,j,5)
    x(i,j,6) x(i,j,7)
  • x(i,j,1) is the value of band 1 in row i, column
    j
  • x(i,j,2) is the value of band 2 in row i, column
    j, etc
  • A linear combination of these values would look
    like y(i,j) = a₁x(i,j,1) + a₂x(i,j,2) + … +
    a₇x(i,j,7), where the aₖ are weighting
    coefficients.
  • The multiplication and addition are carried out
    for each of the picture elements, pixels, in the
    image.

53
Image Enhancement: Principal/Canonical Components (Concluded)
Principal components analysis transforms a set of
correlated variables into a set of uncorrelated
variables.
(Remote Sensing Tutorial)
First Principal Component

Second Principal Component
54
Image Enhancement: Principal Component Analysis Summary
  • With Landsat and others, several of the TM bands,
    especially 1, 2, and 3, are strongly correlated.
  • Variations in one band are closely matched in the
    others.
  • Tonal patterns or gray levels may not show enough
    differences to separate features that have
    similar responses in each band.
  • Principal components analysis shifts the axes to
    align with the directions of strong correlation.
  • The new spatial positions produce significant
    differences (decorrelation) in gray levels from
    band to band, thus improving discrimination.
  • New images contain the influence of all bands
    being considered for cross-correlations.
  • Special processing method known as decorrelation
    stretching takes three PCA images (usually 1, 2,
    3), manipulates, and then transforms into an RGB
    (red-green-blue) image that is stretched.
  • Decorrelation effect transferred back into more
    conventional image.

55
Image Enhancement: Principal Component Analysis and Decorrelation Stretching
  • The top TM image uses bands 7, 4, and 2 (R, G, B).
  • The bottom image has undergone a decorrelation
    stretch.
  • Some of the vegetated areas now appear in
    different intensities.
  • The mountains depict changes much better because
    the decorrelation stretch changes how they are
    rendered.

(Lab 6 Image in Colorado)
56
Image Enhancement: Multi-Image Manipulation, Vegetation Components
  • In addition to AVHRR, numerous other forms of
    linear data transformations have been developed
    for vegetation monitoring.
  • Differing sensors dictate different vegetation
    transformations.
  • Kauth and Thomas (1976) derived linear
    transformation of four Landsat MSS bands.
  • Established four new axes in spectral data
    (vegetation components)
  • Useful for agricultural crop monitoring
  • The tasseled cap transformation rotates the MSS
    data such that the majority of information is
    contained in two components or features that are
    directly related to physical scene
    characteristics.
  • Brightness is weighted sum of all bands and
    defined in direction of principal variation in
    soil reflectance.
  • Greenness is approximately orthogonal to
    brightness and is a contrast between the near-IR
    and the visible bands.
  • Brightness and greenness express about 95% of
    the total variability in MSS data.
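Brightness and greenness are simply weighted sums of the four MSS bands. The Kauth-Thomas coefficients below are as commonly quoted for Landsat MSS; treat them as illustrative rather than authoritative, and the pixel values are made up.

```python
import numpy as np

# Sketch of the tasseled cap transformation as two weighted band sums.
# Coefficients: commonly quoted Kauth-Thomas MSS weights (illustrative).
BRIGHTNESS = np.array([0.433, 0.632, 0.586, 0.264])
GREENNESS = np.array([-0.290, -0.562, 0.600, 0.491])

def tasseled_cap(mss_pixel):
    """mss_pixel: the four MSS band values for one pixel."""
    return BRIGHTNESS @ mss_pixel, GREENNESS @ mss_pixel

brightness, greenness = tasseled_cap(np.array([30.0, 25.0, 60.0, 55.0]))
print(round(brightness, 2), round(greenness, 2))
```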

57
Image Enhancement: Vegetation Components (Concluded)
58
Image Enhancement: Intensity-Hue-Saturation Color Space Transformation
  • Intensity refers to total brightness of color.
  • Hue relates to dominant or average wavelength of
    light contributing to color.
  • Saturation refers to purity of color relative to
    gray.
  • Pink has low saturation
  • Crimson has high saturation
  • Transforming RGB components into IHS components
    may provide more control over color enhancements.
  • IHS components can be varied independently.

59
Image Enhancement: Intensity-Hue-Saturation Color Space Transformation (Continued)
  • Intensity (brightness of the color): I = R + G + B
  • Hue (dominant wavelength): H = (G − B) / (I − 3B)
  • Saturation (pureness of the color): S = (I − 3B) / I
  • Pastels have intermediate saturation values
  • This is one of several sets of transformation
    equations that are used.
  • A common procedure to increase the richness
    (saturation) of the color in an image is to apply
    the IHS transform to the data, stretch the
    saturation values, and return to RGB space and
    view the image.
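The saturation-stretch procedure above can be sketched for a single pixel using Python's standard `colorsys` module (HSV, a close cousin of IHS): convert from RGB, scale the saturation, and convert back. The pixel values and stretch factor are illustrative.

```python
import colorsys

# Sketch of the saturation stretch for one RGB pixel: RGB -> HSV,
# scale saturation (clamped to 1.0), HSV -> RGB.
def boost_saturation(r, g, b, factor):
    h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
    s = min(1.0, s * factor)
    r2, g2, b2 = colorsys.hsv_to_rgb(h, s, v)
    return round(r2 * 255), round(g2 * 255), round(b2 * 255)

# A washed-out red becomes richer; hue and brightness are unchanged.
print(boost_saturation(180, 150, 150, 2.0))   # -> (180, 120, 120)
```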

60
Image Enhancement: Intensity-Hue-Saturation Color Space Transformation (Continued)
  • RGB
  • Black has no color, so it is at (0,0,0).
  • White has maximum color, so it is at (255, 255,
    255)
  • Pure red, green, and blue are at (255, 0, 0),
    (0, 255, 0), and (0, 0, 255).
  • Yellow, magenta, and cyan have maximum amounts of
    two primaries.


61
Image Enhancement: Intensity-Hue-Saturation Color Space Transformation (Continued)
  • HIS or IHS
  • Cone shape has one central axis representing
    intensity.
  • Black at one end, white at other, gray in
    between.
  • Intensity increases as one progresses toward white.
  • Hues determined by angular location.
  • Saturation or richness of color defined as
    distance perpendicular to intensity axis.


62
Image Enhancement: Intensity-Hue-Saturation Color Space Transformation (Example)
  • Example Need to change color of bright yellow
    car but leave highlights and shadows unaffected
  • Difficult task in RGB
  • Relatively simple task in IHS
  • Yellow pixels of car have specific range of hue
    regardless of intensity or saturation.
  • Pixels can be easily isolated and their hue
    component modified, giving a different colored
    car.
  • Almost all digital image processing systems
    operate on RGB images, so process would be
    accomplished in three steps
  • Original RGB image converted to HIS
  • Hue (or saturation or intensity) modified
  • Image converted back to RGB
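The three-step recoloring can be sketched with the standard `colorsys` module (HSV standing in for IHS): pixels are isolated by hue range alone, so highlights and shadows of the same hue shift together. The hue bounds and shift are illustrative.

```python
import colorsys

# Sketch of the hue edit above: RGB -> HSV, shift the hue of pixels whose
# hue falls in the target range, HSV -> RGB. Saturation and value (the
# highlight/shadow information) are left untouched.
def recolor(pixels, hue_lo, hue_hi, hue_shift):
    out = []
    for r, g, b in pixels:
        h, s, v = colorsys.rgb_to_hsv(r / 255.0, g / 255.0, b / 255.0)
        if hue_lo <= h <= hue_hi:           # isolate pixels by hue alone
            h = (h + hue_shift) % 1.0
        r2, g2, b2 = colorsys.hsv_to_rgb(h, s, v)
        out.append((round(r2 * 255), round(g2 * 255), round(b2 * 255)))
    return out

# Bright and shadowed yellow both shift to red; the blue pixel is untouched.
print(recolor([(255, 255, 0), (120, 120, 0), (0, 0, 255)], 0.12, 0.22, -1 / 6))
```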

63
References
  • Remote Sensing Tutorial: http://rst.gsfc.nasa.gov
  • Image Interpretation and Analysis:
    http://www.ccrs.nrcan.gc.ca/ccrs/eduref/tutorial/chap4/c4p6e.html
  • Geographic Information Systems:
    http://www.usgs.gov/research/gis/application
  • Image Thresholding:
    http://www.cs.hut.fi/papyrus/Filters.html
  • Lab 6 Image in Colorado:
    http://everest.hunter.cuny.edu/jcox/rslab6.html
  • Color Spaces:
    http://www-viz.tamu.edu/faculty/parke/ends489f00/notes/sec1_4.html

64
References
  • Mini Project 2: Space domain (operator) and
    frequency domain (FFT) filtering.
    http://www.science.gmu.edu/yxing/759_5/project2.html
  • Yu-Chuan Kuo, Hui-Chung Yeh, Ke-Sheng Cheng,
    Chia-Ming Liou, and Ming-Tung Wu, Identification
    of Landslides Induced by Chi-Chi Earthquake Using
    SPOT Multispectral Images.
    http://www.gisdevelopment.net/aars/acrs/2000/ts12/laus0005pf.htm
  • Jensen, John R., Introductory Digital Image
    Processing, Prentice Hall
  • Tortosa, Delio, GeoForum, Remote Sensing Course.
    http://hosting.soonet.ca/eliris/remotesensing/bl130intro.htm