CG Lecture 25, Wed 2016-11-09

  1. Tomorrow, Prof. Cutler will tell us about her spring course, CSCI-4530-01 Advanced Computer Graphics.

  2. Aliasing and anti-aliasing

    1. The underlying image intensity, as a function of x, is a signal, f(x).
    2. When the objects are small, e.g., when they are far away, f(x) changes rapidly.
    3. To display the image, the system evaluates f(x) at each pixel. That is, f(x) is sampled at x=0,1,2,3,...
    4. If f(x), when Fourier transformed, has frequencies higher than 1/2 (cycle per pixel), then that sampling is too coarse to capture the signal. See the Nyquist sampling theorem.
    5. When this high-frequency signal is sampled at too low a rate, the result computed for the frame buffer will have visual problems.
    6. It's not just that you won't see the high frequencies. That's obvious.
    7. Worse, you will see fake low-frequency signals that were never in the original scene. They are called '''aliases''' of the high-frequency signals.
    8. These artifacts may jump out at you because of the Mach band effect.
    9. In NTSC video, aliasing can even make rapid intensity changes appear as fake colors, and vice versa.
    10. Aliasing can occur with time signals, like a movie of a spoked wagon wheel.
    11. This is like a strobe effect.
    12. The solution is to filter out the high frequencies before sampling, or to sample with a convolution filter instead of sampling at a point. That's called '''anti-aliasing'''. (A small numeric demonstration follows the references at the end of this item.)
    13. OpenGL solutions (see the WebGL sketch after the references):
      1. Mipmaps.
      2. Compute scene on a higher-resolution frame buffer and average down.
      3. Consider pixels to be squares, not points. Compute the fraction of each pixel covered by each object, such as a line. For this to work, lines must have finite width.
    14. Refs:
      1. http://en.wikipedia.org/wiki/Aliasing
      2. http://en.wikipedia.org/wiki/ClearType
      3. http://en.wikipedia.org/wiki/Wagon-wheel_effect
      4. http://en.wikipedia.org/wiki/Spatial_anti-aliasing (The H. Freeman referenced there worked at RPI for 10 years.)
      5. http://en.wikipedia.org/wiki/Mipmap
      6. http://en.wikipedia.org/wiki/Jaggies
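
    A tiny numeric demonstration of aliasing (my example, not the textbook's): a cosine at 0.9 cycles per pixel, sampled at integer pixels, produces exactly the same samples as a cosine at 0.1 cycles per pixel, so the high frequency masquerades as a low one.

      // Aliasing demo: 0.9 cycles/pixel is above the Nyquist limit of 0.5.
      for (let n = 0; n < 8; n++) {
        const hi = Math.cos(2 * Math.PI * 0.9 * n); // the high-frequency signal
        const lo = Math.cos(2 * Math.PI * 0.1 * n); // its low-frequency alias
        console.log(n, hi.toFixed(6), lo.toFixed(6)); // the two columns are identical
      }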
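
    In WebGL, the easiest anti-aliasing knobs are the context's built-in multisampling and mipmapped texture filtering. A minimal sketch using standard WebGL 1 calls (variable names are mine):

      const canvas = document.querySelector('canvas');
      // Ask the browser for a multisampled default framebuffer.
      const gl = canvas.getContext('webgl', { antialias: true });

      const texture = gl.createTexture();
      gl.bindTexture(gl.TEXTURE_2D, texture);
      // ... upload image data with gl.texImage2D here ...
      gl.generateMipmap(gl.TEXTURE_2D); // WebGL 1 needs power-of-two sizes
      // Trilinear minification: the mipmap pyramid low-pass filters the
      // texture before it is sampled.
      gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR_MIPMAP_LINEAR);
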
  3. Videos - military applications of graphics

    US Military's Futuristic Augmented Reality Battlefield - Augmented Immersive Team Trainer (AITT)

    Daqri's Smart Helmet Hands On

    HoloLens Review: Microsoft's Version of Augmented Reality

    '''Modeling and simulation''' is a standard term.

  4. 10_5 Rendering the Mandelbrot Set.

    Shows the power of GPU programming; a fragment-shader sketch follows.

    The program is in Chapter 10.
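
    A sketch of the idea (not the book's exact code): every pixel runs the escape-time iteration in parallel in a fragment shader. The varying vPos is assumed to interpolate over the region of the complex plane being drawn.

      precision highp float;
      varying vec2 vPos; // assumed: interpolated complex-plane coordinates
      void main() {
        vec2 c = vPos;
        vec2 z = vec2(0.0);
        float t = 0.0;
        const int MAXIT = 200; // GLSL ES 1.0 requires a constant loop bound
        for (int i = 0; i < MAXIT; i++) {
          z = vec2(z.x * z.x - z.y * z.y, 2.0 * z.x * z.y) + c; // z = z*z + c
          if (dot(z, z) > 4.0) { t = float(i) / float(MAXIT); break; }
        }
        gl_FragColor = vec4(t, t * t, sqrt(t), 1.0); // simple color ramp
      }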

  5. 11_1 Framebuffer objects.
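
    A framebuffer object is an off-screen render target. A minimal sketch of creating one with a texture color attachment (standard WebGL calls; gl is assumed to be an existing WebGL context):

      const fbo = gl.createFramebuffer();
      const tex = gl.createTexture();
      gl.bindTexture(gl.TEXTURE_2D, tex);
      // Allocate a 512x512 RGBA texture with no initial data.
      gl.texImage2D(gl.TEXTURE_2D, 0, gl.RGBA, 512, 512, 0,
                    gl.RGBA, gl.UNSIGNED_BYTE, null);
      gl.texParameteri(gl.TEXTURE_2D, gl.TEXTURE_MIN_FILTER, gl.LINEAR); // no mipmaps needed
      gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
      // Route color output into the texture.
      gl.framebufferTexture2D(gl.FRAMEBUFFER, gl.COLOR_ATTACHMENT0,
                              gl.TEXTURE_2D, tex, 0);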

  6. 11_2 Render to texture.
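
    Render-to-texture is then a two-pass pattern; a sketch reusing the fbo and tex from the previous snippet:

      // Pass 1: draw the scene into the texture via the FBO.
      gl.bindFramebuffer(gl.FRAMEBUFFER, fbo);
      gl.viewport(0, 0, 512, 512); // match the texture size
      gl.clear(gl.COLOR_BUFFER_BIT);
      // ... draw first-pass geometry here ...

      // Pass 2: draw to the screen, sampling the texture just rendered.
      gl.bindFramebuffer(gl.FRAMEBUFFER, null); // null = the default framebuffer
      gl.viewport(0, 0, gl.drawingBufferWidth, gl.drawingBufferHeight);
      gl.bindTexture(gl.TEXTURE_2D, tex);
      // ... draw second-pass geometry that samples the texture ...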

  7. 11_3 Agent-based models.

  8. If my order here looks a little chaotic, it's because the slides don't exactly align with the programs.

  9. More programs from Chapter 7.

    1. Several of the programs don't display on my laptop, so I won't show them. They include bumpmap and render.

    2. Cubet: enabling and disabling the depth buffer while drawing translucent objects (see the blending sketch at the end of this list).

    3. particleDiffusion: buffer ping-ponging of 50 particles, initially placed randomly, then moving randomly, with their previous positions diffused as a texture (see the ping-pong sketch at the end of this list).

      It renders to a texture instead of to the color buffer, then uses that texture in the next pass.

      There are better descriptions on the web; search for 'webgl render texture'. I'll work with the textbook code, since jumping back and forth is confusing, partly because different sources use different utility files.

      There are often better descriptions on the web than in the textbook. I've considered running a course using only web material. What do you think?

      https://developer.mozilla.org/en-US/docs/Web/API/WebGL_API has good documentation.
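
      The blending sketch promised for Cubet (standard WebGL calls, not the book's exact code). One common approach: draw opaque geometry with normal depth writes, then draw translucent geometry with blending on and depth writes off, so translucent surfaces don't occlude each other:

        // Opaque pass: normal depth testing and writing.
        gl.enable(gl.DEPTH_TEST);
        gl.depthMask(true);
        // ... draw opaque objects ...

        // Translucent pass: blend, keep the depth test, stop writing depth.
        gl.enable(gl.BLEND);
        gl.blendFunc(gl.SRC_ALPHA, gl.ONE_MINUS_SRC_ALPHA);
        gl.depthMask(false);
        // ... draw translucent objects, ideally back to front ...
        gl.depthMask(true);
        gl.disable(gl.BLEND);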
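
      And the ping-pong sketch promised for particleDiffusion: alternate two texture/FBO pairs each frame, reading the previous state from one while writing the new state to the other. A skeleton (my variable names; fbos and texs are assumed to be two FBO/texture pairs built as in the framebuffer-object snippet above):

        let src = 0, dst = 1; // indices into fbos[] and texs[]
        function frame() {
          gl.bindFramebuffer(gl.FRAMEBUFFER, fbos[dst]); // write new state
          gl.bindTexture(gl.TEXTURE_2D, texs[src]);      // read old state
          // ... draw the update pass ...

          gl.bindFramebuffer(gl.FRAMEBUFFER, null);      // then show the result
          gl.bindTexture(gl.TEXTURE_2D, texs[dst]);
          // ... draw the display pass ...

          [src, dst] = [dst, src]; // ping-pong
          requestAnimationFrame(frame);
        }
        requestAnimationFrame(frame);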