1
Rendering Pipeline and Graphics Hardware
  • Aaron Bloomfield
  • CS 445 Introduction to Graphics
  • Fall 2006

2
Overview
  • Framebuffers
  • Rendering Pipeline
  • Transformations
  • Lighting
  • Clipping
  • Modeling
  • Camera
  • Visible Surface Determination
  • History

How is the rasterized scene kept in memory?
3
Framebuffers
  • So far we've talked about the physical display
    device
  • How does the interface between the device and the
    computer's notion of an image look?
  • Framebuffer: a memory array in which the computer
    stores an image
  • On most computers, a separate memory bank from
    main memory (why?)
  • Many different variations, motivated by cost of
    memory

4
Framebuffers: True-Color
  • A true-color (aka 24-bit or 32-bit) framebuffer
    stores one byte each for red, green, and blue
  • Each pixel can thus be one of 2^24 colors
  • Pay attention to endian-ness
  • How can 24-bit and 32-bit mean the same thing
    here?
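As a rough sketch (not from the slides; the struct and member names are made up), a true-color framebuffer can be modeled as an array of packed 32-bit pixels:

#include <cstdint>
#include <vector>

// Hypothetical true-color (32-bit) framebuffer: one 32-bit word per pixel,
// with one byte each for red, green, and blue. The fourth byte is unused or
// holds alpha, which is why 24-bit and 32-bit can describe the same format.
struct Framebuffer32 {
    int width, height;
    std::vector<std::uint32_t> pixels;  // width * height packed pixels

    Framebuffer32(int w, int h) : width(w), height(h), pixels(w * h, 0) {}

    void setPixel(int x, int y, std::uint8_t r, std::uint8_t g, std::uint8_t b) {
        // Packed as 0x00RRGGBB; the byte order seen in memory depends on endian-ness.
        pixels[y * width + x] =
            (std::uint32_t(r) << 16) | (std::uint32_t(g) << 8) | std::uint32_t(b);
    }
};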

5
Framebuffers: Indexed-Color
  • An indexed-color (8-bit or PseudoColor)
    framebuffer stores one byte per pixel (also GIF
    image format)
  • This byte indexes into a color map
  • How many colors can a pixel be?
  • Still common on low-end displays (cell phones,
    PDAs, Game Boys)
  • Cute trick: color-map animation
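A similar sketch (again with hypothetical names) for an indexed-color framebuffer; color-map animation amounts to rewriting colorMap entries without touching the per-pixel indices:

#include <array>
#include <cstdint>
#include <vector>

// Hypothetical 8-bit indexed-color framebuffer: one byte per pixel,
// interpreted as an index into a 256-entry color map.
struct IndexedFramebuffer {
    int width, height;
    std::vector<std::uint8_t> pixels;           // one index per pixel
    std::array<std::uint32_t, 256> colorMap{};  // packed 24-bit RGB entries

    IndexedFramebuffer(int w, int h) : width(w), height(h), pixels(w * h, 0) {}

    std::uint32_t colorAt(int x, int y) const {
        return colorMap[pixels[y * width + x]];  // pixel byte -> color-map lookup
    }
};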

6
Framebuffers: Hi-Color
  • Hi-Color was a popular PC SVGA standard
  • Packs pixels into 16 bits
  • 5 Red, 6 Green, 5 Blue
  • (why would green get more?)
  • Sometimes just 5,5,5
  • Each pixel can be one of 2^16 colors
  • Hi-color images can exhibit worse quantization
    artifacts than a well-mapped 8-bit image
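A sketch of the 5-6-5 packing described above; the function name is hypothetical:

#include <cstdint>

// Pack full 8-bit R, G, B values into a 16-bit 5-6-5 hi-color pixel.
// Green keeps the extra bit because the eye is most sensitive to green,
// and the dropped low bits are the source of the quantization artifacts.
std::uint16_t pack565(std::uint8_t r, std::uint8_t g, std::uint8_t b) {
    return std::uint16_t(((r >> 3) << 11) |  // 5 bits of red in bits 11-15
                         ((g >> 2) << 5)  |  // 6 bits of green in bits 5-10
                          (b >> 3));         // 5 bits of blue in bits 0-4
}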

7
(No Transcript)
8
Overview
  • Framebuffers
  • Rendering Pipeline
  • Transformations
  • Lighting
  • Clipping
  • Modeling
  • Camera
  • Visible Surface Determination
  • History

How does the graphics hardware process the scene
for display?
9
The Rendering Pipeline: A Tour
[Diagram: Model, Camera Parameters → Rendering Pipeline → Framebuffer → Display]
10
The Parts You Know
[Diagram: Model, Camera Parameters → Rendering Pipeline → Framebuffer → Display]
11
The Rendering Pipeline
[Diagram: Model, Camera Parameters → Rendering Pipeline → Framebuffer → Display]
12
2-D Rendering: Rasterization
[Diagram: Model, Camera Parameters → Rendering Pipeline → Framebuffer → Display]
  • We'll talk about this soon

13
The Rendering Pipeline: 3-D
[Diagram: Model, Camera Parameters → Rendering Pipeline → Framebuffer → Display]
14
The Rendering Pipeline: 3-D
[Diagram: Scene graph / Object geometry → Modeling
Transforms → Lighting Calculations → Viewing Transform
→ Clipping → Projection Transform]
  • Result of each stage:
  • Modeling transforms: all vertices of the scene in
    a shared 3-D world coordinate system
  • Lighting calculations: vertices shaded according
    to the lighting model
  • Viewing transform: scene vertices in the 3-D view
    (camera) coordinate system
  • Clipping: exactly those vertices and portions of
    polygons within the view frustum
  • Projection transform: 2-D screen coordinates of
    the clipped vertices
15
The Rendering Pipeline: 3-D
[Pipeline diagram repeated: Scene graph / Object
geometry → Modeling Transforms → Lighting Calculations
→ Viewing Transform → Clipping → Projection Transform,
with the per-stage results listed on slide 14]
16
Overview
  • Framebuffers
  • Rendering Pipeline
  • Transformations
  • Lighting
  • Clipping
  • Modeling
  • Camera
  • Visible Surface Determination
  • History

How do you transform the objects so they can be
displayed?
17
Rendering: Transformations
  • So far, discussion has been in screen space
  • But model is stored in model space
  • (a.k.a. object space or world space)
  • Three sets of geometric transformations
  • Modeling transforms
  • Viewing transforms
  • Projection transforms

18
Rendering: Transformations
  • Modeling transforms
  • Size, place, scale, and rotate objects and parts
    of the model w.r.t. each other
  • Object coordinates → world coordinates
  • The scene now has its origin at (0,0,0)

[Diagram: world coordinate axes X, Y, Z]
19
Rendering: Transformations
  • Viewing transform
  • Rotate and translate the world to lie directly in
    front of the camera
  • Typically place camera at origin
  • Typically looking down -Z axis
  • World coordinates → view coordinates
  • The scene now has its origin at the camera

20
Rendering: Transformations
  • Projection transform
  • Apply perspective foreshortening
  • Distant objects appear smaller: the pinhole
    camera model
  • View coordinates → screen coordinates
  • The scene is now in 2 dimensions

21
Rendering: Transformations
  • All these transformations involve shifting
    coordinate systems (i.e., basis sets)
  • Matrices do that
  • Represent coordinates as vectors, transforms as
    matrices
  • Multiplying matrices concatenates transforms!
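A minimal sketch of what that means in code, assuming a row-major 4x4 matrix type and the column-vector convention:

// Assumed row-major 4x4 matrix type.
struct Mat4 {
    float m[4][4];
};

// Concatenate two transforms: applying the result to a point (column vector)
// is the same as applying b first and then a.
Mat4 multiply(const Mat4& a, const Mat4& b) {
    Mat4 r = {};
    for (int i = 0; i < 4; ++i)
        for (int j = 0; j < 4; ++j)
            for (int k = 0; k < 4; ++k)
                r.m[i][j] += a.m[i][k] * b.m[k][j];
    return r;
}

Under that convention, multiplying a viewing matrix by a modeling matrix yields a single matrix that takes object coordinates through world coordinates to view coordinates.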

22
Rendering: Transformations
  • Homogeneous coordinates represent coordinates in
    3 dimensions with a 4-vector
  • Denoted [x, y, z, w]^T
  • Note that w = 1 in model coordinates
  • To get 3-D coordinates, divide by w:
    [x, y, z]^T = [x/w, y/w, z/w]^T
  • Transformations are 4x4 matrices
  • Why? To handle translation and projection
  • We'll see this a bit more later in the semester
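A small sketch of the divide-by-w step, with assumed struct names:

struct Vec4 { float x, y, z, w; };  // homogeneous point [x, y, z, w]^T
struct Vec3 { float x, y, z; };

// Recover ordinary 3-D coordinates from a homogeneous 4-vector by dividing
// by w. In model coordinates w is 1, so the divide changes nothing; after a
// projection transform it performs the perspective foreshortening.
Vec3 fromHomogeneous(const Vec4& p) {
    return { p.x / p.w, p.y / p.w, p.z / p.w };
}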

23
Overview
  • Framebuffers
  • Rendering Pipeline
  • Transformations
  • Lighting
  • Clipping
  • Modeling
  • Camera
  • Visible Surface Determination
  • History

How do we compute the radiance for each sample
ray?
24
The Rendering Pipeline: 3-D
[Pipeline diagram repeated: Scene graph / Object
geometry → Modeling Transforms → Lighting Calculations
→ Viewing Transform → Clipping → Projection Transform,
with the per-stage results listed on slide 14]
25
Rendering: Lighting
  • Illuminating a scene: coloring pixels according
    to some approximation of lighting
  • Global illumination: solves for lighting of the
    whole scene at once
  • Local illumination: a local approximation,
    typically lighting each polygon separately
  • Interactive graphics (e.g., hardware) does only
    local illumination at run time

26
Lighting Simulation
  • Lighting parameters
  • Light source emission
  • Surface reflectance
  • Atmospheric attenuation
  • Camera response

[Diagram: light transport from Light Source to Surface to Camera]
27
Lighting Simulation
  • Local illumination
  • Ray casting
  • Polygon shading
  • Global illumination
  • Ray tracing
  • Monte Carlo methods
  • Radiosity methods

[Diagram: light source, surface with normal N, and camera]
More on these methods later!
28
Overview
  • Framebuffers
  • Rendering Pipeline
  • Transformations
  • Lighting
  • Clipping
  • Modeling
  • Camera
  • Visible Surface Determination
  • History

How do you display only those parts of the scene
that are visible?
29
The Rendering Pipeline: 3-D
[Pipeline diagram repeated: Scene graph / Object
geometry → Modeling Transforms → Lighting Calculations
→ Viewing Transform → Clipping → Projection Transform,
with the per-stage results listed on slide 14]
30
Rendering: Clipping
  • Clipping a 3-D primitive returns its intersection
    with the view frustum

31
Rendering: Clipping
  • Clipping is tricky!
  • We will see a lot more on clipping

[Diagram: clipping examples. In: 3 vertices, out: 6
vertices. In: 1 polygon, out: 2 polygons]
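As a preview of that later material, here is a sketch (types and names are assumptions, not the course's code) of clipping a convex polygon against a single plane, Sutherland-Hodgman style; clipping to the view frustum repeats this step for each of the six frustum planes:

#include <vector>

struct Vec3  { float x, y, z; };
struct Plane { Vec3 n; float d; };  // points with n·p + d >= 0 count as "inside"

static float signedDist(const Plane& pl, const Vec3& p) {
    return pl.n.x * p.x + pl.n.y * p.y + pl.n.z * p.z + pl.d;
}

// Clip a convex polygon against one plane; the output may gain vertices,
// which is how 3 vertices in can become 6 vertices out after several planes.
std::vector<Vec3> clipAgainstPlane(const std::vector<Vec3>& poly, const Plane& pl) {
    std::vector<Vec3> out;
    int n = static_cast<int>(poly.size());
    for (int i = 0; i < n; ++i) {
        Vec3 a = poly[i];
        Vec3 b = poly[(i + 1) % n];
        float da = signedDist(pl, a), db = signedDist(pl, b);
        if (da >= 0) out.push_back(a);  // keep vertices on the inside
        if ((da >= 0) != (db >= 0)) {   // the edge crosses the plane
            float t = da / (da - db);   // where the edge meets the plane
            out.push_back({ a.x + t * (b.x - a.x),
                            a.y + t * (b.y - a.y),
                            a.z + t * (b.z - a.z) });
        }
    }
    return out;
}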
32
Overview
  • Framebuffers
  • Rendering Pipeline
  • Transformations
  • Lighting
  • Clipping
  • Modeling
  • Camera
  • Visible Surface Determination
  • History

How is the 3D scene described in a computer?
33
The Rendering Pipeline: 3-D
[Diagram: Model, Camera Parameters → Rendering Pipeline → Framebuffer → Display]
34
Modeling: The Basics
  • Common interactive 3-D primitives: points, lines,
    polygons (i.e., triangles)
  • Organized into objects
  • Not necessarily in the OOP sense
  • Collection of primitives, other objects
  • Associated matrix for transformations
  • Instancing: using the same geometry for multiple
    objects
  • 4 wheels on a car, 2 arms on a robot

35
Modeling: The Scene Graph
  • The scene graph captures transformations and
    object-object relationships in a DAG
  • Nodes are objects
  • Arcs indicate instancing
  • Each arc has a matrix

[Diagram: example scene graph for a robot, with nodes
Robot, Body, Head, Arm, Trunk, Leg, Eye, Mouth]
36
Modeling: The Scene Graph
  • Traverse the scene graph in depth-first order,
    concatenating transformations
  • Maintain a matrix stack of transformations

[Diagram: depth-first traversal of the robot scene
graph, with nodes marked visited, active, or unvisited
and the current matrix stack shown alongside]
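A sketch of that traversal in code; the Node type is hypothetical, the multiply routine is the one sketched earlier, and the stack is assumed to start holding the identity matrix:

#include <vector>

struct Mat4 { float m[4][4]; };
Mat4 multiply(const Mat4& a, const Mat4& b);  // as in the earlier 4x4 sketch

// Hypothetical scene-graph node: a transform for the arc into this node,
// plus child nodes (geometry would hang off the node as well).
struct Node {
    Mat4 transform;
    std::vector<Node*> children;
};

// Depth-first traversal that concatenates transforms on a matrix stack.
void traverse(const Node* node, std::vector<Mat4>& stack) {
    stack.push_back(multiply(stack.back(), node->transform));  // push combined matrix
    // drawGeometry(node, stack.back());  // hypothetical: render with current transform
    for (const Node* child : node->children)
        traverse(child, stack);
    stack.pop_back();  // restore the parent's transform before the next sibling
}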
37
Overview
  • Framebuffers
  • Rendering Pipeline
  • Transformations
  • Lighting
  • Clipping
  • Modeling
  • Camera
  • Visible Surface Determination
  • History

How is the viewing device described in a computer?
38
Modeling: The Camera
  • Finally, we need a model of the virtual camera
  • Can be very sophisticated
  • Field of view, depth of field, distortion,
    chromatic aberration
  • Interactive graphics (OpenGL)
  • Camera pose: position and orientation
  • Captured in viewing transform (i.e., modelview
    matrix)
  • Pinhole camera model
  • Field of view
  • Aspect ratio
  • Near and far clipping planes

39
Modeling: The Camera
  • Camera parameters (FOV, etc.) are encapsulated in
    a projection matrix
  • Homogeneous coordinates → 4x4 matrix!
  • See OpenGL Appendix F for the matrix
  • The projection matrix pre-multiplies the viewing
    matrix, which pre-multiplies the modeling
    matrices
  • Actually, OpenGL lumps viewing and modeling
    transforms into modelview matrix
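A sketch of the corresponding fixed-function OpenGL calls (they must run with a valid GL context current; the numeric values are placeholders):

#include <GL/gl.h>
#include <GL/glu.h>

void setupCamera() {
    // Projection matrix: field of view, aspect ratio, near and far clipping planes.
    glMatrixMode(GL_PROJECTION);
    glLoadIdentity();
    gluPerspective(60.0, 4.0 / 3.0, 0.1, 100.0);

    // Modelview matrix: the viewing transform (camera pose), onto which the
    // modeling transforms are subsequently multiplied.
    glMatrixMode(GL_MODELVIEW);
    glLoadIdentity();
    gluLookAt(0.0, 0.0, 5.0,   // eye position
              0.0, 0.0, 0.0,   // look-at point
              0.0, 1.0, 0.0);  // up direction
    glTranslatef(1.0f, 0.0f, 0.0f);  // example modeling transform
}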

40
Camera Models
  • The most common model is the pinhole camera
  • All captured light rays arrive along paths toward
    focal point without lens distortion (everything
    is in focus)
  • Sensor response proportional to radiance

Other models consider: depth of field, motion blur,
lens distortion
41
Camera Parameters
  • Position
  • Eye position (px, py, pz)
  • Orientation
  • View direction (dx, dy, dz)
  • Up direction (ux, uy, uz)
  • Aperture
  • Field of view (xfov, yfov)
  • Film plane
  • Look at point
  • View plane normal

[Diagram: camera geometry, showing the eye position,
view direction, up direction, right and back axes,
look-at point, and view plane]
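One possible (hypothetical) grouping of these parameters in code, with the camera's right axis derived from the view and up directions:

struct Vec3 { float x, y, z; };

Vec3 cross(const Vec3& a, const Vec3& b) {
    return { a.y * b.z - a.z * b.y,
             a.z * b.x - a.x * b.z,
             a.x * b.y - a.y * b.x };
}

// Hypothetical grouping of the camera parameters listed above.
struct Camera {
    Vec3  eye;         // eye position (px, py, pz)
    Vec3  view;        // view direction (dx, dy, dz), assumed unit length
    Vec3  up;          // up direction (ux, uy, uz)
    float xfov, yfov;  // aperture: field of view in x and y

    Vec3 right() const { return cross(view, up); }  // "right" axis of the camera frame
};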
42
Overview
  • Framebuffers
  • Rendering Pipeline
  • Transformations
  • Lighting
  • Clipping
  • Modeling
  • Camera
  • Visible Surface Determination
  • History

How can the front-most surface be found with an
algorithm?
43
Visible Surface Determination
  • The color of each pixel on the view plane depends
    on the radiance emanating from visible surfaces

[Diagram: eye position, view plane, and visible surfaces]
44
Ray Casting
  • For each sample
  • Construct ray from eye position through view
    plane
  • Find first surface intersected by ray through
    pixel
  • Compute color of sample based on surface radiance

45
Ray Casting
  • For each sample
  • Construct ray from eye position through view
    plane
  • Find first surface intersected by ray through
    pixel
  • Compute color of sample based on surface radiance

46
Visible Surface Determination
  • For each sample
  • Construct ray from eye position through view
    plane
  • Find first surface intersected by ray through
    pixel
  • Compute color of sample based on surface radiance

More efficient algorithms utilize spatial
coherence!
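A self-contained toy version of that loop, with one ray per pixel tested against a single hard-coded sphere; the scene, resolution, and shading are placeholders rather than course code:

#include <cstdint>
#include <vector>

struct Vec3 { float x, y, z; };
static Vec3  sub(Vec3 a, Vec3 b) { return { a.x - b.x, a.y - b.y, a.z - b.z }; }
static float dot(Vec3 a, Vec3 b) { return a.x * b.x + a.y * b.y + a.z * b.z; }

int main() {
    const int W = 256, H = 256;
    const Vec3 eye{0, 0, 0}, center{0, 0, -5};  // sphere 5 units down the -Z axis
    const float radius = 1.0f;
    std::vector<std::uint8_t> image(W * H, 0);  // grayscale "framebuffer"

    for (int y = 0; y < H; ++y) {
        for (int x = 0; x < W; ++x) {
            // Construct a ray from the eye through this view-plane sample.
            Vec3 dir{ (x + 0.5f) / W - 0.5f, (y + 0.5f) / H - 0.5f, -1.0f };
            // Find the first surface intersected: solve |eye + t*dir - center|^2 = r^2.
            Vec3  oc   = sub(eye, center);
            float a    = dot(dir, dir);
            float b    = 2.0f * dot(oc, dir);
            float c    = dot(oc, oc) - radius * radius;
            float disc = b * b - 4.0f * a * c;
            // "Shade" the sample: white if the ray hits the sphere, black otherwise.
            image[y * W + x] = (disc >= 0.0f) ? 255 : 0;
        }
    }
    return 0;
}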
47
Rendering Algorithms
  • Rendering is a problem in sampling and
    reconstruction!

48
Overview
  • Framebuffers
  • Rendering Pipeline
  • Transformations
  • Lighting
  • Clipping
  • Modeling
  • Camera
  • Visible Surface Determination
  • History

What's the history of computer graphics hardware?
49
Graphical Hardware Companies
  • In the beginning there was SGI
  • and they remained the king for 15 years
  • Are now in bankruptcy protection
  • Why buy a really expensive server when you can
    get a PC that is almost as fast, but 1/10th the
    cost?
  • NVidia and ATI provide high-end graphics cards
    for PCs
  • ATI tends to focus more on increasing the number
    of triangles rendered per frame
  • NVidia tends to focus more on adding new
    graphical capabilities
  • So researchers use it more

50
A much older graphics pipeline
  • SGI Onyx2
  • From 1997 or so
  • A fully configured system could easily run $100k
  • A $200 graphics card today can do 2-3 times as
    much
  • In all fairness, the Onyx2 had a lot of
    advantages

51
(No Transcript)
52
Summary
  • Major issues in 3D rendering
  • 3D scene representation
  • 3D viewer representation
  • Visible surface determination
  • Lighting simulation
  • Concluding note
  • Accurate physical simulation is complex and
    intractable
  • Rendering algorithms apply many approximations
    to simplify representations and computations