Rendering Pipeline and Graphics Hardware
Aaron Bloomfield
CS 445: Introduction to Graphics, Fall 2006
Overview
→ Framebuffers: How is the rasterized scene kept in memory?
- Rendering Pipeline
- Transformations
- Lighting
- Clipping
- Modeling
- Camera
- Visible Surface Determination
- History
Framebuffers
- So far we've talked about the physical display device
- How does the interface between the device and the computer's notion of an image look?
- Framebuffer: a memory array in which the computer stores an image
  - On most computers, a separate memory bank from main memory (why?)
  - Many different variations, motivated by the cost of memory
Framebuffers: True-Color
- A true-color (a.k.a. 24-bit or 32-bit) framebuffer stores one byte each for red, green, and blue
- Each pixel can thus be one of 2^24 colors
- Pay attention to endian-ness
- How can 24-bit and 32-bit mean the same thing here?
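The byte layout of a true-color pixel, and why endianness matters, can be sketched in a few lines; the `pack_rgb888` helper and the 0xAARRGGBB channel order are illustrative assumptions, not a fixed standard:

```python
import struct

def pack_rgb888(r, g, b, a=255):
    # Pack 8-bit channels into one 32-bit pixel value (0xAARRGGBB).
    # This particular channel order is an assumption for illustration.
    return (a << 24) | (r << 16) | (g << 8) | b

pixel = pack_rgb888(0x12, 0x34, 0x56)

# The same 32-bit value occupies memory differently depending on
# endianness, which is why framebuffer code must mind byte order:
le = struct.pack("<I", pixel)  # little-endian: B, G, R, A in memory
be = struct.pack(">I", pixel)  # big-endian: A, R, G, B in memory
```

A 24-bit and a 32-bit framebuffer can store the same 2^24 colors; the fourth byte is padding or an alpha channel, which is one answer to the question on the slide.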
Framebuffers: Indexed-Color
- An indexed-color (8-bit or PseudoColor) framebuffer stores one byte per pixel (also: GIF image format)
- This byte indexes into a color map: how many colors can a pixel be?
- Still common on low-end displays (cell phones, PDAs, Game Boys)
- Cute trick: color-map animation
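The color-map indirection, and the color-map animation trick, can be sketched as follows; the 4-entry palette and variable names are made up for illustration:

```python
# A tiny 4-entry color map; each pixel stores only an index into it,
# so a pixel can be one of len(color_map) colors at any instant.
color_map = [
    (0, 0, 0),        # 0: black
    (255, 0, 0),      # 1: red
    (0, 255, 0),      # 2: green
    (255, 255, 255),  # 3: white
]

indexed_image = [1, 1, 0, 3]  # one small index per pixel
rgb_image = [color_map[i] for i in indexed_image]  # expanded on display

# Color-map animation: change one palette entry and every pixel that
# uses it changes on the next refresh, without touching pixel memory.
color_map[1] = (0, 0, 255)
animated = [color_map[i] for i in indexed_image]
```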
Framebuffers: Hi-Color
- Hi-Color was a popular PC SVGA standard
- Packs pixels into 16 bits: 5 red, 6 green, 5 blue
  - (why would green get more?)
  - Sometimes just 5, 5, 5
- Each pixel can be one of 2^16 colors
- Hi-color images can exhibit worse quantization artifacts than a well-mapped 8-bit image
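The 5-6-5 packing above can be sketched directly; the helper names are hypothetical, and the round trip shows where the quantization loss comes from:

```python
def pack_rgb565(r, g, b):
    # Pack 8-bit channels into 16 bits: 5 red, 6 green, 5 blue.
    # Green keeps the extra bit because the eye is most sensitive to it.
    return ((r >> 3) << 11) | ((g >> 2) << 5) | (b >> 3)

def unpack_rgb565(p):
    # Shift back up to 8 bits; the discarded low bits are gone for
    # good, which is the source of hi-color quantization artifacts.
    r = ((p >> 11) & 0x1F) << 3
    g = ((p >> 5) & 0x3F) << 2
    b = (p & 0x1F) << 3
    return r, g, b
```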
Overview
- Framebuffers
→ Rendering Pipeline: How does the graphics hardware process the graphical display?
- Transformations
- Lighting
- Clipping
- Modeling
- Camera
- Visible Surface Determination
- History
The Rendering Pipeline: A Tour
Model & Camera Parameters → Rendering Pipeline (Transform → Illuminate → Transform → Clip → Project → Rasterize) → Framebuffer → Display

The Parts You Know
Model & Camera Parameters → Rendering Pipeline (Transform → Illuminate → Transform → Clip → Project → Rasterize) → Framebuffer → Display

The Rendering Pipeline
Model & Camera Parameters → Rendering Pipeline (Transform → Illuminate → Transform → Clip → Project → Rasterize) → Framebuffer → Display

2-D Rendering: Rasterization
Model & Camera Parameters → Rendering Pipeline (Transform → Illuminate → Transform → Clip → Project → Rasterize) → Framebuffer → Display
- We'll talk about this soon…

The Rendering Pipeline: 3-D
Model & Camera Parameters → Rendering Pipeline (Transform → Illuminate → Transform → Clip → Project → Rasterize) → Framebuffer → Display
The Rendering Pipeline: 3-D
Input: scene graph, object geometry
- Modeling Transforms → result: all vertices of scene in shared 3-D "world" coordinate system
- Lighting Calculations → result: vertices shaded according to lighting model
- Viewing Transform → result: scene vertices in 3-D "view" or "camera" coordinate system
- Clipping → result: exactly those vertices & portions of polygons in the view frustum
- Projection Transform → result: 2-D screen coordinates of clipped vertices
Overview
- Framebuffers
- Rendering Pipeline
→ Transformations: How do you transform the objects so they can be displayed?
- Lighting
- Clipping
- Modeling
- Camera
- Visible Surface Determination
- History
Rendering: Transformations
- So far, discussion has been in screen space
- But the model is stored in model space (a.k.a. object space or world space)
- Three sets of geometric transformations:
  - Modeling transforms
  - Viewing transforms
  - Projection transforms
Rendering: Transformations
- Modeling transforms
  - Size, place, scale, and rotate objects and parts of the model w.r.t. each other
  - Object coordinates → world coordinates
  - The scene now has its origin at (0, 0, 0)
Rendering: Transformations
- Viewing transform
  - Rotate & translate the world to lie directly in front of the camera
    - Typically place the camera at the origin
    - Typically looking down the -Z axis
  - World coordinates → view coordinates
  - The scene now has its origin at the camera
Rendering: Transformations
- Projection transform
  - Apply perspective foreshortening
    - Distant = small: the pinhole camera model
  - View coordinates → screen coordinates
  - The scene is now in 2 dimensions
Rendering: Transformations
- All these transformations involve shifting coordinate systems (i.e., basis sets)
- Matrices do that
- Represent coordinates as vectors, transforms as matrices:

  \begin{bmatrix} X' \\ Y' \end{bmatrix} =
  \begin{bmatrix} \cos\theta & -\sin\theta \\ \sin\theta & \cos\theta \end{bmatrix}
  \begin{bmatrix} X \\ Y \end{bmatrix}

- Multiply matrices = concatenate transforms!
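The claim that multiplying matrices concatenates transforms can be checked numerically; `rot` and `matmul` are small hypothetical helpers for the 2-D rotation on the slide:

```python
import math

def rot(theta):
    # 2-D rotation as a 2x2 matrix (row-major nested lists).
    c, s = math.cos(theta), math.sin(theta)
    return [[c, -s], [s, c]]

def matmul(A, B):
    # Plain matrix product; concatenating transforms is exactly this.
    return [[sum(A[i][k] * B[k][j] for k in range(len(B)))
             for j in range(len(B[0]))] for i in range(len(A))]

# Rotating by 30 degrees and then by 60 degrees is one 90-degree
# rotation, whose matrix is [[0, -1], [1, 0]].
R = matmul(rot(math.radians(60)), rot(math.radians(30)))
```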
Rendering: Transformations
- Homogeneous coordinates: represent coordinates in 3 dimensions with a 4-vector
  - Denoted [x, y, z, w]^T
    - Note that w = 1 in model coordinates
  - To get 3-D coordinates, divide by w: [x', y', z']^T = [x/w, y/w, z/w]^T
- Transformations are 4x4 matrices
- Why? To handle translation and projection
- We'll see this a bit more later in the semester
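The divide-by-w step can be made concrete with a toy projection matrix; the matrix `P` (which copies z into w so that distant points shrink) and the focal distance `d` are illustrative assumptions, not the exact OpenGL matrix:

```python
def transform(M, v):
    # Apply a 4x4 row-major matrix to a homogeneous 4-vector.
    return [sum(M[i][j] * v[j] for j in range(4)) for i in range(4)]

def project(v):
    # Perspective divide: [x, y, z, w] -> [x/w, y/w, z/w].
    x, y, z, w = v
    return [x / w, y / w, z / w]

# Toy projection with focal distance d: the last row copies z/d into
# w, so the divide shrinks distant points (pinhole foreshortening).
d = 1.0
P = [[1, 0, 0, 0],
     [0, 1, 0, 0],
     [0, 0, 1, 0],
     [0, 0, 1 / d, 0]]

near = project(transform(P, [2, 2, 2, 1]))  # point at z = 2
far = project(transform(P, [2, 2, 4, 1]))   # same x, y, twice as far
```

Note how the farther point lands closer to the screen center: distant = small, as the previous slide puts it.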
Overview
- Framebuffers
- Rendering Pipeline
- Transformations
→ Lighting: How do we compute the radiance for each sample ray?
- Clipping
- Modeling
- Camera
- Visible Surface Determination
- History
The Rendering Pipeline: 3-D
Input: scene graph, object geometry
- Modeling Transforms → result: all vertices of scene in shared 3-D "world" coordinate system
- Lighting Calculations → result: vertices shaded according to lighting model
- Viewing Transform → result: scene vertices in 3-D "view" or "camera" coordinate system
- Clipping → result: exactly those vertices & portions of polygons in the view frustum
- Projection Transform → result: 2-D screen coordinates of clipped vertices
Rendering: Lighting
- Illuminating a scene: coloring pixels according to some approximation of lighting
  - Global illumination: solves for lighting of the whole scene at once
  - Local illumination: local approximation, typically lighting each polygon separately
- Interactive graphics (e.g., hardware) does only local illumination at run time
Lighting Simulation
- Lighting parameters
  - Light source emission
  - Surface reflectance
  - Atmospheric attenuation
  - Camera response
- (Diagram: light source, surface with normals N, camera)
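As a concrete (if heavily simplified) instance of a local lighting calculation, here is a Lambertian diffuse term; the slides do not commit to a specific reflectance model, so the `lambert` helper and its parameters are an illustrative sketch:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def dot(a, b):
    return sum(x * y for x, y in zip(a, b))

def lambert(normal, to_light, light_intensity, kd):
    # Diffuse reflection: radiance proportional to N . L, clamped so
    # surfaces facing away from the light receive nothing.
    ndotl = max(0.0, dot(normalize(normal), normalize(to_light)))
    return kd * light_intensity * ndotl

# A surface facing the light head-on vs. one lit at 60 degrees:
head_on = lambert([0, 0, 1], [0, 0, 1], 1.0, 0.8)
oblique = lambert([0, 0, 1], [0, math.sqrt(3), 1], 1.0, 0.8)
```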
Lighting Simulation
- Local illumination
  - Ray casting
  - Polygon shading
- Global illumination
  - Ray tracing
  - Monte Carlo methods
  - Radiosity methods
- More on these methods later!
Overview
- Framebuffers
- Rendering Pipeline
- Transformations
- Lighting
→ Clipping: How do you only display those parts of the scene that are visible?
- Modeling
- Camera
- Visible Surface Determination
- History
The Rendering Pipeline: 3-D
Input: scene graph, object geometry
- Modeling Transforms → result: all vertices of scene in shared 3-D "world" coordinate system
- Lighting Calculations → result: vertices shaded according to lighting model
- Viewing Transform → result: scene vertices in 3-D "view" or "camera" coordinate system
- Clipping → result: exactly those vertices & portions of polygons in the view frustum
- Projection Transform → result: 2-D screen coordinates of clipped vertices
Rendering: Clipping
- Clipping a 3-D primitive returns its intersection with the view frustum
Rendering: Clipping
- Clipping is tricky!
- We will see a lot more on clipping later
- Clip: in 3 vertices, out 6 vertices
- Clip: in 1 polygon, out 2 polygons
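The slides defer the details, but one standard approach (a single-plane pass of the Sutherland-Hodgman algorithm, named here as an aside rather than taken from the slides) can be sketched; even one clip boundary already grows the vertex count, as the 3-in/6-out example suggests:

```python
def clip_against_plane(poly, inside, intersect):
    # One Sutherland-Hodgman pass: keep the part of poly satisfying
    # the predicate inside; intersect(a, b) returns the point where
    # edge a->b crosses the clip boundary.
    out = []
    for i, cur in enumerate(poly):
        prev = poly[i - 1]  # wraps around to the last vertex at i = 0
        if inside(cur):
            if not inside(prev):
                out.append(intersect(prev, cur))  # entering edge
            out.append(cur)
        elif inside(prev):
            out.append(intersect(prev, cur))      # leaving edge
    return out

# Clip a triangle against the half-plane x <= 1 (one frustum face).
inside = lambda p: p[0] <= 1.0
def isect(a, b):
    t = (1.0 - a[0]) / (b[0] - a[0])
    return (1.0, a[1] + t * (b[1] - a[1]))

tri = [(0.0, 0.0), (2.0, 0.0), (0.0, 2.0)]
clipped = clip_against_plane(tri, inside, isect)  # triangle -> quad
```

Clipping the full frustum repeats this pass once per frustum plane.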
Overview
- Framebuffers
- Rendering Pipeline
- Transformations
- Lighting
- Clipping
→ Modeling: How is the 3-D scene described in a computer?
- Camera
- Visible Surface Determination
- History
The Rendering Pipeline: 3-D
Model & Camera Parameters → Rendering Pipeline (Transform → Illuminate → Transform → Clip → Project → Rasterize) → Framebuffer → Display
Modeling: The Basics
- Common interactive 3-D primitives: points, lines, polygons (i.e., triangles)
- Organized into "objects"
  - Not necessarily in the OOP sense
  - Collection of primitives, other objects
  - Associated matrix for transformations
- Instancing: using the same geometry for multiple objects
  - 4 wheels on a car, 2 arms on a robot
Modeling: The Scene Graph
- The scene graph captures transformations and object-object relationships in a DAG
- Nodes are objects; arcs indicate instancing
  - Each has a matrix
- (Example DAG: Robot, Head, Mouth, Body, Eye, Leg, Trunk, Arm)
Modeling: The Scene Graph
- Traverse the scene graph in depth-first order, concatenating transformations
- Maintain a matrix stack of transformations
- (Diagram: robot scene graph with visited/unvisited nodes, the active node, and the matrix stack)
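The depth-first traversal with a matrix stack can be sketched as follows; to keep the sketch readable, the "matrices" are stand-in strings (so concatenation is visible as text), and the `Node` class and robot hierarchy are illustrative assumptions:

```python
class Node:
    def __init__(self, name, matrix, children=()):
        self.name, self.matrix, self.children = name, matrix, list(children)

def traverse(node, stack, out):
    stack.append(node.matrix)        # push this node's transform
    # Every node sees the concatenation of its ancestors' matrices:
    out.append((node.name, "*".join(stack)))
    for child in node.children:
        traverse(child, stack, out)  # depth-first recursion
    stack.pop()                      # pop on the way back up

robot = Node("Robot", "R", [
    Node("Body", "B", [Node("Arm", "A"), Node("Leg", "L")]),
    Node("Head", "H"),
])
result = []
traverse(robot, [], result)
```

With real 4x4 matrices, `"*".join` would be replaced by matrix multiplication, which is exactly what an OpenGL-style matrix stack does.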
Overview
- Framebuffers
- Rendering Pipeline
- Transformations
- Lighting
- Clipping
- Modeling
→ Camera: How is the viewing device described in a computer?
- Visible Surface Determination
- History
Modeling: The Camera
- Finally: need a model of the virtual camera
- Can be very sophisticated
  - Field of view, depth of field, distortion, chromatic aberration…
- Interactive graphics (OpenGL):
  - Camera pose: position & orientation
    - Captured in viewing transform (i.e., modelview matrix)
  - Pinhole camera model
    - Field of view
    - Aspect ratio
    - Near & far clipping planes
Modeling: The Camera
- Camera parameters (FOV, etc.) are encapsulated in a projection matrix
  - Homogeneous coordinates → 4x4 matrix!
  - See OpenGL Appendix F for the matrix
- The projection matrix pre-multiplies the viewing matrix, which pre-multiplies the modeling matrices
  - Actually, OpenGL lumps the viewing and modeling transforms into the modelview matrix
Camera Models
- The most common model is the pin-hole camera
  - All captured light rays arrive along paths toward the focal point without lens distortion (everything is in focus)
  - Sensor response proportional to radiance
- Other models consider...
  - Depth of field
  - Motion blur
  - Lens distortion
- (Diagram: view plane, eye position (focal point))
Camera Parameters
- Position
  - Eye position (px, py, pz)
- Orientation
  - View direction (dx, dy, dz)
  - Up direction (ux, uy, uz)
- Aperture
  - Field of view (xfov, yfov)
- Film plane
  - "Look at" point
  - View plane normal
- (Diagram: view plane, "look at" point, up/right/back directions, eye position)
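Deriving the camera's orthonormal axes from these parameters (in the style of gluLookAt) can be sketched as follows; the `camera_basis` helper and its names are assumptions for illustration:

```python
import math

def normalize(v):
    n = math.sqrt(sum(c * c for c in v))
    return [c / n for c in v]

def cross(a, b):
    return [a[1] * b[2] - a[2] * b[1],
            a[2] * b[0] - a[0] * b[2],
            a[0] * b[1] - a[1] * b[0]]

def camera_basis(eye, look_at, up):
    # View direction points from the eye toward the "look at" point;
    # the supplied up vector is only approximate, so we rebuild a
    # true up that is orthogonal to both view and right.
    view = normalize([l - e for l, e in zip(look_at, eye)])
    right = normalize(cross(view, up))
    true_up = cross(right, view)
    return view, right, true_up

v, r, u = camera_basis(eye=[0, 0, 5], look_at=[0, 0, 0], up=[0, 1, 0])
# A camera at +Z looking at the origin views down the -Z axis,
# matching the viewing-transform convention earlier in the slides.
```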
Overview
- Framebuffers
- Rendering Pipeline
- Transformations
- Lighting
- Clipping
- Modeling
- Camera
→ Visible Surface Determination: How can the front-most surface be found with an algorithm?
- History
Visible Surface Determination
- The color of each pixel on the view plane depends on the radiance emanating from visible surfaces
- Simplest method is ray casting
- (Diagram: rays from the eye position through the view plane)
Ray Casting
- For each sample…
  - Construct ray from eye position through view plane
  - Find first surface intersected by ray through pixel
  - Compute color of sample based on surface radiance
Visible Surface Determination
- For each sample…
  - Construct ray from eye position through view plane
  - Find first surface intersected by ray through pixel
  - Compute color of sample based on surface radiance
- More efficient algorithms utilize spatial coherence!
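The three ray-casting steps above can be sketched as a toy caster; the sphere scene, color strings, and helper names are invented for illustration (real systems intersect triangles and shade properly):

```python
import math

def hit_sphere(origin, direction, center, radius):
    # Nearest positive ray parameter t for a unit-length direction,
    # or None if the ray misses (quadratic with a = 1).
    oc = [o - c for o, c in zip(origin, center)]
    b = 2 * sum(d * o for d, o in zip(direction, oc))
    c = sum(o * o for o in oc) - radius * radius
    disc = b * b - 4 * c
    if disc < 0:
        return None
    t = (-b - math.sqrt(disc)) / 2
    return t if t > 0 else None

def cast(eye, pixel, spheres):
    # Step 1: construct the ray from the eye through the view plane.
    d = [p - e for p, e in zip(pixel, eye)]
    n = math.sqrt(sum(c * c for c in d))
    d = [c / n for c in d]
    # Step 2: find the first (nearest) surface the ray intersects.
    hits = [(hit_sphere(eye, d, c, r), color) for c, r, color in spheres]
    hits = [(t, color) for t, color in hits if t is not None]
    # Step 3: compute the sample color from that surface.
    return min(hits)[1] if hits else "background"

spheres = [((0, 0, -5), 1.0, "red"), ((0, 0, -10), 3.0, "blue")]
front = cast((0, 0, 0), (0, 0, -1), spheres)  # nearer sphere wins
```

Testing every sphere per ray is the brute-force version; the spatial-coherence structures the slide mentions (grids, BVHs) exist to prune this loop.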
Rendering Algorithms
- Rendering is a problem in sampling and reconstruction!
Overview
- Framebuffers
- Rendering Pipeline
- Transformations
- Lighting
- Clipping
- Modeling
- Camera
- Visible Surface Determination
→ History: What's the history of computer graphics hardware?
Graphical Hardware Companies
- In the beginning there was SGI
  - … and they remained the king for 15+ years
  - Now in bankruptcy protection
  - Why buy a really expensive server when you can get a PC that is almost as fast, but 1/10th the cost?
- NVidia and ATI provide high-end graphics cards for PCs
  - ATI tends to focus more on increasing the number of triangles rendered per frame
  - NVidia tends to focus more on adding new graphical capabilities
    - So researchers use it more
A much older graphics pipeline
- SGI Onyx2
  - From 1997 or so
  - A fully configured system could easily run $100k+
- A $200 graphics card today can perform 2-3 times as much
- In all fairness, the Onyx2 had a lot of advantages…
Summary
- Major issues in 3-D rendering
  - 3-D scene representation
  - 3-D viewer representation
  - Visible surface determination
  - Lighting simulation
- Concluding note
  - Accurate physical simulation is complex and intractable
  - Rendering algorithms apply many approximations to simplify representations and computations
- Slides: 52