OpenGL renders to framebuffers. By default, OpenGL renders to the screen, the default framebuffer, which commonly contains a color and a depth buffer. This is fine for many purposes where the pipeline consists of a single pass, a pass being a sequence of shaders. For instance, a simple pass may contain only a vertex and a fragment shader.
For more complex graphical effects or techniques, such as shadows or deferred rendering, multiple passes are often required, where the outputs of one pass become the inputs of the following pass, for instance as textures. In this context, instead of rendering to the screen and then copying the result to a texture, it is much nicer to render to a texture directly. The figure shows a two-pass pipeline, where the first pass produces three textures that are used in the second pass to compose the final image. This is one of the advantages of framebuffer objects: we can render to multiple outputs in a single pass.
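As a rough idea of what this looks like in code, here is a minimal sketch of creating a framebuffer object with three color attachments, so a single pass can write to three textures at once. This is not the demo's code: it assumes an OpenGL 3.0+ context with a loader such as GLEW providing the function pointers, the helper createColorTexture and the function createMultiOutputFBO are hypothetical names, and depth attachment and error handling are reduced to the bare minimum.

```cpp
#include <GL/glew.h>  // assumption: GLEW (or any loader) exposes the FBO entry points

// Hypothetical helper: allocate an RGBA8 texture to be used as a render target.
GLuint createColorTexture(int width, int height)
{
    GLuint tex = 0;
    glGenTextures(1, &tex);
    glBindTexture(GL_TEXTURE_2D, tex);
    glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA8, width, height, 0,
                 GL_RGBA, GL_UNSIGNED_BYTE, nullptr);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MIN_FILTER, GL_NEAREST);
    glTexParameteri(GL_TEXTURE_2D, GL_TEXTURE_MAG_FILTER, GL_NEAREST);
    return tex;
}

// Create an FBO whose three color attachments receive the outputs of one pass.
GLuint createMultiOutputFBO(int width, int height, GLuint outTextures[3])
{
    GLuint fbo = 0;
    glGenFramebuffers(1, &fbo);
    glBindFramebuffer(GL_FRAMEBUFFER, fbo);

    // One texture per output of the pass.
    for (int i = 0; i < 3; ++i)
    {
        outTextures[i] = createColorTexture(width, height);
        glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0 + i,
                               GL_TEXTURE_2D, outTextures[i], 0);
    }

    // Map fragment shader outputs 0..2 to the three attachments.
    const GLenum drawBuffers[3] = { GL_COLOR_ATTACHMENT0,
                                    GL_COLOR_ATTACHMENT1,
                                    GL_COLOR_ATTACHMENT2 };
    glDrawBuffers(3, drawBuffers);

    if (glCheckFramebufferStatus(GL_FRAMEBUFFER) != GL_FRAMEBUFFER_COMPLETE)
    {
        /* handle incomplete framebuffer */
    }

    glBindFramebuffer(GL_FRAMEBUFFER, 0); // back to the default framebuffer
    return fbo;
}
```

Rendering the first pass is then just a matter of binding this FBO before drawing; the second pass binds the default framebuffer again and samples the three textures.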
Besides, rendering to the screen requires the outputs to be in a displayable format, which is not always the case in a multipass pipeline. Sometimes the textures produced by a pass need a floating point format that does not translate directly to colors, for instance the velocity of a particle in meters per second.
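As an illustration, and assuming an FBO is already bound (the variables width and height are placeholders), a floating point attachment could be allocated with an internal format such as GL_RGBA32F, which is core since OpenGL 3.0:

```cpp
// Sketch: a floating point texture, e.g. to store per-pixel velocities in m/s,
// attached as the first color output of the currently bound FBO.
GLuint velocityTex = 0;
glGenTextures(1, &velocityTex);
glBindTexture(GL_TEXTURE_2D, velocityTex);
glTexImage2D(GL_TEXTURE_2D, 0, GL_RGBA32F, width, height, 0,
             GL_RGBA, GL_FLOAT, nullptr);
glFramebufferTexture2D(GL_FRAMEBUFFER, GL_COLOR_ATTACHMENT0,
                       GL_TEXTURE_2D, velocityTex, 0);
```

Such a texture would clamp or lose precision if it had to pass through a displayable 8-bit format, which is precisely why rendering to it directly is useful.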
In this short tutorial we will see how a framebuffer object can be created and used with shaders. A demo is also provided, with full source code and a VS 2010 solution.