Murl Engine Lua Addon API
Version 1.0 beta
The IFrameBuffer graph node interface.
Normally, all geometry contained in the scene graph gets rendered to the back buffer provided by the platform, which is presented to the user once per frame. For certain purposes, however, it can be necessary not to render directly to the back buffer but to an off-screen area.
A frame buffer represents a render target for such an off-screen area. To be able to access the generated contents, a frame buffer must refer to at least one Graph::ITexture, which holds the generated image after rendering to the frame buffer is complete and serves as the pixel input for a later render stage.
Depending on which information is actually needed for such a later stage, one or more different texture attachment points can be used for a given frame buffer, e.g. the color target texture receives actual RGBA pixel color values, and a depth target texture receives pixel depth values.
Often, only color values are needed later, but the rendering process requires an active depth buffer for correct display. In such a case, it is not necessary to create and attach a depth texture; instead, it is sufficient to explicitly set a depth buffer format to create a depth buffer that is only used internally.
To use a frame buffer for rendering, one or more Graph::IView nodes must refer to this frame buffer; after activating a Graph::ICamera referring to such a view, all subsequent geometry is then rendered to that frame buffer, with the given view and camera settings.
Note that when multiple textures are attached, all of them must have the same dimensions, or initialization will fail.
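The following minimal Lua sketch illustrates the typical run-time flow, assuming that frameBuffer already references a Graph::IFrameBuffer node (how that reference is obtained, and the colon call syntax, are assumptions of this sketch and not part of this interface); it only calls methods documented on this page:

-- Assumption: 'frameBuffer' is an existing Graph::IFrameBuffer node reference
-- resolved elsewhere by the surrounding logic code.

-- Query the texture node target that receives the rendered color values.
local colorTarget = frameBuffer:GetColorTextureNodeTarget()

-- Query the off-screen render target's dimensions.
local width  = frameBuffer:GetSizeX()
local height = frameBuffer:GetSizeY()
print("Frame buffer size: " .. width .. "x" .. height)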
Get the constant Graph::INode interface. This method returns a constant pointer to the node's Graph::INode interface, to be able to query common node properties such as active state, visibility or ID.
Murl.Graph.INode GetNodeInterface()
Get the constant Graph::ITextureNodeTarget color buffer container. This method returns a constant pointer to the node's Graph::ITextureNodeTarget container to query the referenced texture node used for storing the frame buffer's output color values.
Murl.Graph.IGenericNodeTarget.GraphITexture GetColorTextureNodeTarget()
Get the constant Graph::ITextureNodeTarget depth buffer container. This method returns a constant pointer to the node's Graph::ITextureNodeTarget container to query the referenced texture node used for storing the frame buffer's output depth values.
Murl.Graph.IGenericNodeTarget.GraphITexture GetDepthTextureNodeTarget()
Get the constant Graph::ITextureNodeTarget stencil buffer container. This method returns a constant pointer to the node's Graph::ITextureNodeTarget container to query the referenced texture node used for storing the frame buffer's output stencil values.
Murl.Graph.IGenericNodeTarget.GraphITexture GetStencilTextureNodeTarget()
Explicitly set the format of the depth buffer. If no texture is specified as a target for storing depth buffer values, the explicit depth buffer format is used to create an offscreen buffer. In that case, the depth buffer is only used for rendering internally and cannot be accessed from the outside.
Boolean SetDepthBufferFormat(Murl.IEnums.DepthBufferFormat format)
format | The explicit depth buffer format. |
Get the explicit depth buffer format.
Murl.IEnums.DepthBufferFormat GetDepthBufferFormat()
Explicitly set the format of the stencil buffer. If no texture is specified as a target for storing stencil buffer values, the explicit stencil buffer format is used to create an offscreen buffer. In that case, the stencil buffer is only used for rendering internally and cannot be accessed from the outside.
Boolean SetStencilBufferFormat(Murl.IEnums.StencilBufferFormat format)
format | The explicit stencil buffer format. |
Get the explicit stencil buffer format.
Murl.IEnums.StencilBufferFormat GetStencilBufferFormat()
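As a hedged sketch of the common case described above, where color output is needed later but only an internal depth buffer is required, the depth buffer format can be set explicitly instead of attaching a depth texture (the enum member name below is an assumption; consult Murl.IEnums.DepthBufferFormat for the values available in your engine version):

-- Create an internal-only depth buffer instead of attaching a depth texture.
-- DEPTH_BUFFER_FORMAT_GENERIC is an assumed member name; substitute the actual
-- Murl.IEnums.DepthBufferFormat value defined by the engine.
frameBuffer:SetDepthBufferFormat(Murl.IEnums.DEPTH_BUFFER_FORMAT_GENERIC)
local depthFormat = frameBuffer:GetDepthBufferFormat()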
Set the frame buffer's orientation. By default, the frame buffer has this value set to IEnums::ORIENTATION_ROTATE_0, so all views referring to this frame buffer as a render target will render their contents 'upright'. For certain cases, such as post-processing in image space, it may be necessary for the frame buffer to receive its contents in the same orientation as the back buffer. This can be achieved by calling this method with IEnums::ORIENTATION_DEFAULT.
Boolean SetRendererOrientation(Murl.IEnums.Orientation orientation)
orientation | One of the four values NORMAL, ROTATE_CW, FLIP or ROTATE_CCW to define a fixed frame buffer orientation, or DEFAULT to select the main back buffer's orientation. |
Get the frame buffer's orientation.
Murl.IEnums.Orientation GetRendererOrientation()
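A brief sketch for the post-processing case mentioned above; the Lua spelling of the orientation constants is assumed to mirror the C++ names quoted in the description:

-- Let the frame buffer receive its contents in the same orientation as the
-- back buffer, e.g. for post-processing in image space.
-- ORIENTATION_DEFAULT is assumed to be exposed as shown; verify against
-- Murl.IEnums.Orientation in your engine version.
frameBuffer:SetRendererOrientation(Murl.IEnums.ORIENTATION_DEFAULT)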
Enable/disable color de-linearization when writing to the color buffer, if available. By default, a fragment shader's RGB color output values are written to the output color buffer without any conversion. If the shader performs operations on linear color values and the target color buffer is a regular integer RGB(A) texture or render buffer, the output will be too dark, as the target color buffer expects gamma-corrected values. In this case, de-linearization should be enabled on the frame buffer, so that the usual gamma value of 2.2 is applied to the output pixels. Note that floating-point color buffers are always linear, so this setting has no effect on them.
Boolean SetDelinearizationEnabled(Boolean enabled)
enabled | If true, color de-linearization is enabled. |
Check if color de-linearization during rendering is enabled.
Boolean IsDelinearizationEnabled()
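For example, when a shader operates on linear color values but writes into a regular integer RGBA color texture, de-linearization can be enabled as follows (assuming the frameBuffer reference from the earlier sketch):

-- Apply the usual gamma value of 2.2 to the output pixels, because the
-- attached color target is an integer RGBA texture expecting gamma-corrected values.
frameBuffer:SetDelinearizationEnabled(true)
if frameBuffer:IsDelinearizationEnabled() then
    print("Color de-linearization active")
end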
Enable/disable automatic MIP map generation.
Boolean SetMipMapGenerationEnabled(Boolean enabled)
enabled | If true, automatic MIP map generation is enabled. |
Check if automatic MIP map generation is enabled.
Boolean IsMipMapGenerationEnabled()
Set the target texture layer when rendering to an array texture. By default, the output is rendered to layer 0. This value is ignored for non-array textures.
Boolean SetTargetLayer(Integer layer)
layer | The target layer. |
Get the target texture layer for rendering.
Integer GetTargetLayer()
Set the target MIP level when rendering to one or more mip-mapped textures. By default, the output is rendered to a texture's base level, i.e. level 0. This value is ignored if automatic MIP map generation is enabled via SetMipMapGenerationEnabled().
Boolean SetTargetMipLevel(Integer level)
level | The target MIP level. |
Get the target MIP level for rendering.
Integer GetTargetMipLevel()
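A short sketch combining the target layer and target MIP level settings described above (again assuming the frameBuffer reference from the earlier sketch):

-- Render into layer 2 of an attached array texture; ignored for non-array textures.
frameBuffer:SetTargetLayer(2)
-- Render into MIP level 1 instead of the base level. The level is only
-- honored while automatic MIP map generation is disabled.
frameBuffer:SetMipMapGenerationEnabled(false)
frameBuffer:SetTargetMipLevel(1)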
Set the number of samples for multisample anti-aliasing. By default, the number of samples is set to 1, i.e. multisampling is disabled. A value of 0 indicates that the global number of samples is to be used, which can be configured via IEngineConfiguration::SetNumberOfAntiAliasSamples(). A value greater than 1 enables multisampling for this frame buffer, with the given number of samples clamped to the maximum allowed value indicated by the GPU.
Boolean SetNumberOfSamples(Integer numSamples)
numSamples | The number of samples. |
Get the number of samples for multisample anti-aliasing.
Integer GetNumberOfSamples()
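For example, per-frame-buffer multisampling might be configured like this:

-- Request 4x multisample anti-aliasing for this frame buffer; the value is
-- clamped to the maximum number of samples supported by the GPU.
frameBuffer:SetNumberOfSamples(4)
-- Passing 0 instead would fall back to the globally configured sample count.
print("Samples: " .. frameBuffer:GetNumberOfSamples())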
Set the frame buffer's absolute sort order. Frame buffers are generally processed in the order in which they receive drawables during rendering. If a frame buffer depends on another frame buffer that should be updated before it is used, it is often desirable to specify an explicit order in which the frame buffers are processed globally. Setting a higher sort order results in the frame buffer always being processed after all frame buffers with a lower order have been updated. Frame buffers with the same sort order are processed in the order in which they get filled with drawables. Note: The back buffer is always processed last.
Boolean SetSortOrder(Integer sortOrder)
sortOrder | The global order of this frame buffer. |
Get the frame buffer's global sort order.
Integer GetSortOrder()
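As a sketch of the dependency case described above, assume shadowBuffer and sceneBuffer are two IFrameBuffer node references (hypothetical names), where the scene pass samples the shadow pass's output:

-- Process the shadow frame buffer before the scene frame buffer, regardless of
-- the order in which they receive drawables; the back buffer is always last.
shadowBuffer:SetSortOrder(0)
sceneBuffer:SetSortOrder(1)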
Get the frame buffer width.
Integer GetSizeX()
Get the frame buffer height.
Integer GetSizeY()
Set the input coordinate reference size. Generally, input coordinates are represented by values ranging from -1.0 to 1.0. As a convenience, this method can be used to set an arbitrary reference size for both dimensions of the frame buffer; this way it is possible to specify integer (pixel) values instead of (quite unreadable) floats. By default, both sizeX and sizeY are set to 1.0. See also Graph::IButton::SetOutCoordSize() as the counterpart providing the actual input coordinates.
Boolean SetInCoordSize(Number sizeX, Number sizeY)
sizeX | The horizontal coordinate reference size. |
sizeY | The vertical coordinate reference size. |
Set the horizontal input coordinate reference size. See SetInCoordSize().
Boolean SetInCoordSizeX(Number sizeX)
sizeX | The horizontal coordinate reference size. |
Set the vertical input coordinate reference size. See SetInCoordSize().
Boolean SetInCoordSizeY(Number sizeY)
sizeY | The vertical coordinate reference size. |
Get the horizontal input coordinate reference size. See SetInCoordSize().
Number GetInCoordSizeX()
Get the vertical input coordinate reference size. See SetInCoordSize().
Number GetInCoordSizeY()
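For example, to work with pixel units instead of the default -1.0 to 1.0 range (assuming the frameBuffer reference from the earlier sketch):

-- Use the frame buffer's own pixel dimensions as the input coordinate
-- reference size, so input coordinates can be given as pixel values.
frameBuffer:SetInCoordSize(frameBuffer:GetSizeX(), frameBuffer:GetSizeY())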
Set the frame buffer's input coordinate range. This method sets the frame buffer's start and end input coordinates.
Boolean SetInCoord(Number x1, Number y1, Number x2, Number y2)
x1 | The start coordinate in X direction. |
y1 | The start coordinate in Y direction. |
x2 | The end coordinate in X direction. |
y2 | The end coordinate in Y direction. |
Set the frame buffer's input coordinate start. See SetInCoord().
Boolean SetInCoord1(Number x1, Number y1)
x1 | The start coordinate in X direction. |
y1 | The start coordinate in Y direction. |
Set the frame buffer's input coordinate end. See SetInCoord().
Boolean SetInCoord2(Number x2, Number y2)
x2 | The end coordinate in X direction. |
y2 | The end coordinate in Y direction. |
Set the frame buffer's horizontal start input coordinate. See SetInCoord().
Boolean SetInCoordX1(Number x1)
x1 | The start coordinate in X direction. |
Set the frame buffer's vertical start input coordinate. See SetInCoord().
Boolean SetInCoordY1(Number y1)
y1 | The start coordinate in Y direction. |
Set the frame buffer's horizontal end input coordinate. See SetInCoord().
Boolean SetInCoordX2(Number x2)
x2 | The end coordinate in X direction. |
Set the frame buffer's vertical end input coordinate. See SetInCoord().
Boolean SetInCoordY2(Number y2)
y2 | The end coordinate in Y direction. |
Get the frame buffer's horizontal start input coordinate. See SetInCoord().
Number GetInCoordX1()
Get the frame buffer's vertical start input coordinate. See SetInCoord().
Number GetInCoordY1()
Get the frame buffer's horizontal end input coordinate. See SetInCoord().
Number GetInCoordX2()
Get the frame buffer's vertical end input coordinate. See SetInCoord().
Number GetInCoordY2()
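A minimal sketch of setting and reading back the input coordinate range; the values shown correspond to the full default range of -1.0 to 1.0 described under SetInCoordSize():

-- Explicitly set the input coordinate range to the full default extent.
frameBuffer:SetInCoord(-1.0, -1.0, 1.0, 1.0)
print(frameBuffer:GetInCoordX1(), frameBuffer:GetInCoordY1(),
      frameBuffer:GetInCoordX2(), frameBuffer:GetInCoordY2())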
Get the frame buffer's number of stages.
Integer GetNumberOfStages()
Get the frame buffer's internal input screen area object.
Murl.Input.IScreenArea GetInputScreenAreaObject(Integer stage)
stage | The stage to query. |