CG Class 17, Mon 2018-10-22

1   Chapter 6 programs

  1. Chapter 6

    1. wireSphere: wireframe of a recursively generated sphere
    2. shadedCube: rotating cube with modified Phong shading
    3. shadedSphere1: shaded sphere using true normals and per vertex shading
    4. shadedSphere2: shaded sphere using true normals and per fragment shading
    5. shadedSphere3: shaded sphere using vertex normals and per vertex shading
    6. shadedSphere4: shaded sphere using vertex normals and per fragment shading
    7. shadedSphereEyeSpace and shadedSphereObjectSpace: show that the lighting computation can be carried out in either eye space or object space
  2. Summary of the new part of shadedCube (a consolidated sketch of these steps follows this list):

    1. var nBuffer = gl.createBuffer();

      Reserve a buffer id.

    2. gl.bindBuffer( gl.ARRAY_BUFFER, nBuffer );

      1. Create that buffer as a buffer of data items, one per vertex.
      2. Make it the current buffer for future buffer operations.
    3. gl.bufferData( gl.ARRAY_BUFFER, flatten(normalsArray), gl.STATIC_DRAW );

      Write an array of normals, flattened to remove metadata, into the current buffer.

    4. var vNormal = gl.getAttribLocation( program, "vNormal" );

      Get the location of the shader (GPU) attribute variable named "vNormal".

    5. gl.vertexAttribPointer( vNormal, 3, gl.FLOAT, false, 0, 0 );

      Declare that the current buffer contains 3 floats per vertex.

    6. gl.enableVertexAttribArray( vNormal );

      Enable the array for use.

    7. (in the shader) attribute vec3 vNormal;

      Declare the variable in the vertex shader that will receive one row of the JavaScript array as each vertex is processed.

    8. The whole process is repeated with the vertex positions.

      Note that the variable with vertex positions is not hardwired here. You pass in whatever data you want, and your shader program uses it as you want.
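
Putting steps 1-7 together, a minimal sketch of the normal-buffer setup. It assumes the textbook's usual context: normalsArray holds one vec3 per vertex, program is the compiled shader program, and flatten() is the helper from the book's utility files.

  // JavaScript side: copy one normal per vertex to the GPU.
  var nBuffer = gl.createBuffer();                  // reserve a buffer id
  gl.bindBuffer(gl.ARRAY_BUFFER, nBuffer);          // make it the current buffer
  gl.bufferData(gl.ARRAY_BUFFER, flatten(normalsArray),
                gl.STATIC_DRAW);                    // write the flattened array
  var vNormal = gl.getAttribLocation(program, "vNormal");
  gl.vertexAttribPointer(vNormal, 3, gl.FLOAT, false, 0, 0); // 3 floats per vertex
  gl.enableVertexAttribArray(vNormal);              // enable the array

  // GLSL side, in the vertex shader:
  // attribute vec3 vNormal;   // receives one row of normalsArray per vertex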

2   Available graphics HW

Courtesy of ECSE, I have the following hardware for this class to play with:

  1. Vive VR fully immersive first-person experience system, in my lab JEC6115.
  2. Ricoh Theta V 360 degree camera, available to borrow.
  3. Older Oculus Rift DK2, available to borrow.
  4. Nvidia GPUs on geoxeon.ecse: GM200 GeForce GTX Titan X, GK110GL Tesla K20Xm.
  5. Nvidia GPU on parallel.ecse: GeForce GTX 1080.

See me if you're interested.

3   Computing surface normals

  1. For a curved surface, the normal vector at a point on the surface is the cross product of two nonparallel tangent vectors at that point.
  2. If the surface is parametric, its partial derivatives with respect to the parameters are tangent vectors.
  3. A mesh is a common way to approximate a complicated surface.
  4. For a mesh of flat (planar) pieces (facets):
    1. Find the normal to each facet.
    2. Average the normals of the facets around each vertex to get a normal vector at each vertex (see the sketch after this list).
    3. Apply Phong (or Gouraud) shading from those vertex normals.
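
A sketch of that vertex-normal averaging for a triangle mesh, in JavaScript. The names are illustrative: vertices is an array of [x,y,z] positions and triangles is an array of vertex-index triples.

  // Average the facet normals around each vertex of a triangle mesh.
  function vertexNormals(vertices, triangles) {
      var sub = (a, b) => [a[0]-b[0], a[1]-b[1], a[2]-b[2]];
      var cross = (a, b) => [a[1]*b[2] - a[2]*b[1],
                             a[2]*b[0] - a[0]*b[2],
                             a[0]*b[1] - a[1]*b[0]];
      var normals = vertices.map(() => [0, 0, 0]);
      for (var [i, j, k] of triangles) {
          // Facet normal: cross product of two nonparallel edge vectors.
          var n = cross(sub(vertices[j], vertices[i]),
                        sub(vertices[k], vertices[i]));
          for (var v of [i, j, k])               // accumulate at each corner
              for (var c = 0; c < 3; c++) normals[v][c] += n[c];
      }
      return normals.map(n => {                  // normalize the sums
          var len = Math.hypot(n[0], n[1], n[2]);
          return n.map(c => c / len);
      });
  }

Leaving each facet normal unnormalized before summing weights it by the facet's area, a common heuristic; normalize n first if you want an unweighted average.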

4   Textures

Today's big new idea.

  1. Textures started as a way to paint images onto polygons to simulate surface details. They add per-pixel surface details without raising the geometric complexity of a scene.
  2. That morphed into a general array data format with fast I/O.
  3. If you read a texture with indices that are fractions, the hardware interpolates a value, using one of several algorithms. This is called sampling. E.g., reading T[1.1,2] returns something like .9*T[1,2]+.1*T[2,2] (see the sketch after this list).
  4. Textures involve many coordinate systems:
    1. (x,y,z,w) - world.
    2. (u,v) - parameters on one polygon
    3. (s,t) - location in a texture.
  5. Aliasing is also important; sampling a texture too coarsely produces visible artifacts.
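
A sketch of that interpolation in JavaScript, where T is a plain 2D array standing in for a texture (the real work is done by the texture hardware, and edge wrap/clamp modes are ignored here):

  // Bilinear sampling of T at fractional indices (s, t).
  function sample2d(T, s, t) {
      var i = Math.floor(s), j = Math.floor(t);
      var fs = s - i, ft = t - j;               // fractional parts
      // Interpolate along s at columns j and j+1, then along t.
      var a = (1 - fs) * T[i][j]     + fs * T[i + 1][j];
      var b = (1 - fs) * T[i][j + 1] + fs * T[i + 1][j + 1];
      return (1 - ft) * a + ft * b;
  }

For example, sample2d(T, 1.1, 2) is 0.9*T[1][2] + 0.1*T[2][2] (assuming the four neighboring entries exist), matching the example above.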

5   Chapter 9 slides

  1. 9_1 Buffers.

    Ignore anything marked old or deprecated.

    Not a lot of content in this file.

  2. 9_2 Bitblt.