Murl Engine Lua Addon API
Version 1.0 beta
The ITexture graph node interface.
This interface represents a generic texture node used for rendering.
Inherited interfaces: Murl.Graph.IStateSlot, Murl.Graph.IStateUnit
Get the constant Graph::INode interface. This method returns a constant pointer to the node's Graph::INode interface, which can be used to query common node properties such as active state, visibility or ID.
Murl.Graph.INode GetNodeInterface()
Get the constant container holding the optional child textures. This method returns a constant pointer to the node's Graph::ITextureNodeTarget sub container, which is used to store multiple sub-textures.
Murl.Graph.IGenericNodeTarget.GraphITexture GetSubTextureNodeTarget()
Get a constant Graph::IImageResourceTarget container. This method returns a constant pointer to a Graph::IImageResourceTarget container, which can be used to query the image resources referenced by a node implementing this interface.
Murl.Graph.IGenericResourceTarget.ResourceIImage GetImageResourceTarget()
Manually set a video stream for a given layer and target. This method can be used to supply a manually created video stream as the texture's pixel source. Note that you cannot supply both an image resource and a video stream for the same target. Note also that the user is responsible for the correct destruction of the given stream. The layer parameter must specify a layer in the range from 0 to GetNumberOfLayers()-1.
Boolean SetVideoStream(Murl.IEnums.TextureTarget target, Integer layer, Murl.IVideoStream stream)
target | The texture target (flat, or one of the 6 cube map sides) |
layer | The texture layer |
stream | The video stream to apply. |
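A minimal Lua sketch of supplying a stream for layer 0 of a flat texture; the colon-call syntax and the enum value spelling TEXTURE_TARGET_FLAT are assumptions, and the texture and stream variables are expected to be created elsewhere:
-- Assumptions: "texture" references an ITexture node, "stream" is a
-- Murl.IVideoStream created elsewhere; TEXTURE_TARGET_FLAT is an
-- assumed enum value spelling based on the engine's naming pattern.
local ok = texture:SetVideoStream(Murl.IEnums.TEXTURE_TARGET_FLAT, 0, stream)
if not ok then
    -- fails e.g. when an image resource is already assigned to this
    -- target, or when the layer index is out of range
end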
Manually set a video stream for layer 0 and a given target. This method can be used to supply a manually created video stream as the texture's pixel source. Note that you cannot supply both an image resource and a video stream for the same target. Note also that the user is responsible for the correct destruction of the given stream.
Boolean SetVideoStream(Murl.IEnums.TextureTarget target, Murl.IVideoStream stream)
target | The texture target (flat, or one of the 6 cube map sides) |
stream | The video stream to apply. |
Get the video stream for layer 0 and a given target.
Murl.IVideoStream GetVideoStream(Murl.IEnums.TextureTarget target)
target | The texture target to query |
Get the video stream for a given layer and target.
Murl.IVideoStream GetVideoStream(Murl.IEnums.TextureTarget target, Integer layer)
target | The texture target to query |
layer | The texture layer |
Set the texture type. For a generic node implementing this interface, the actual texture type may be set using this method. For specialized implementations that implicitly set the type (like flat textures or cube maps), this method always returns false.
Boolean SetType(Murl.IEnums.TextureType type)
type | One of the available texture types. |
Get the texture type.
Murl.IEnums.TextureType GetType()
Set the number of texture layers. If the node's type is not an array texture (i.e. it is a plain flat or cube map texture), this method returns false. See SetType().
Boolean SetNumberOfLayers(Integer numLayers)
numLayers | The number of array layers. |
Get the number of texture layers. If the node's type is not an array texture (i.e. it is a plain flat or cube map texture), this method always returns 1. See SetType().
Integer GetNumberOfLayers()
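A short Lua sketch of configuring an array texture; the enum value spelling TEXTURE_TYPE_FLAT_ARRAY is an assumption based on the engine's naming pattern, and "texture" again references an ITexture node:
-- Switch the node to a flat array texture with 4 layers. SetType()
-- returns false on specialized implementations with an implicit type.
if texture:SetType(Murl.IEnums.TEXTURE_TYPE_FLAT_ARRAY) then
    texture:SetNumberOfLayers(4)
end
print(texture:GetNumberOfLayers())  -- always 1 for non-array types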
Set the MIP map generation mode. By default, the MIP map generation mode is set to IEnums::MIP_MAP_GENERATION_MODE_FAST.
Boolean SetMipMapGenerationMode(Murl.IEnums.MipMapGenerationMode mode)
mode | The MIP map generation mode. |
Get the MIP map generation mode.
Murl.IEnums.MipMapGenerationMode GetMipMapGenerationMode()
Set the texture's dimensions. A given value has no effect if a positive non-zero scale factor is defined for the respective axis via SetAutoScaleFactor().
Boolean SetSize(Integer sizeX, Integer sizeY)
sizeX | The texture width in pixels. |
sizeY | The texture height in pixels. |
Get the texture's base width. This returns the base width of the texture (at MIP level 0), which is either defined via SetSize() or SetAutoScaleFactor(), or taken implicitly from a given image resource when neither of these is defined. A possible prescale factor is not considered.
Integer GetSizeX()
Get the texture's base height. See GetSizeX().
Integer GetSizeY()
Set the texture's auto scale factors. By default, the auto scale factors for both axes are set to 0.0, and the texture's dimensions match the values given via SetSize(). If any of the given scale factors is a positive non-zero value, the actual texture dimension for the respective axis is calculated from the current output surface dimension multiplied by that factor. In this case, a size value set via SetSize() has no effect. The current output surface size is retrieved via IAppConfiguration::GetDisplaySurfaceSizeX() and IAppConfiguration::GetDisplaySurfaceSizeY(). This is useful e.g. for post-processing frame buffer textures, which are supposed to match the current output display dimensions. For example, to create a texture that is half as wide as and equal in height to the current output surface, specify a value of 0.5 for scaleX and a value of 1.0 for scaleY.
Boolean SetAutoScaleFactor(Number scaleX, Number scaleY)
scaleX | The width scale factor. |
scaleY | The height scale factor. |
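The worked example above as a Lua sketch (only the colon-call syntax and the "texture" variable are assumptions):
-- Half the display surface width, full display surface height.
texture:SetAutoScaleFactor(0.5, 1.0)
-- Any SetSize() values are now ignored, since both factors are non-zero.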
Get the texture's auto scale factor for the width.
Number GetAutoScaleFactorX()
Get the texture's auto scale factor for the height.
Number GetAutoScaleFactorY()
Enable/disable the alpha channel for this texture.
Boolean SetAlphaEnabled(Boolean enabled)
enabled | If true, the alpha channel should be used. |
Check if the alpha channel is enabled for this texture.
Boolean IsAlphaEnabled()
Enable/disable mip-mapping for this texture.
Boolean SetMipMappingEnabled(Boolean enabled)
enabled | If true, mip-maps are enabled. |
Check if mip-mapping is enabled for this texture.
Boolean IsMipMappingEnabled()
Enable/disable prescaling for this texture. In the IEngineConfiguration, an application may define a power-of-2 texture prescale factor that can be used to e.g. scale down texture resources depending on a device's actual screen resolution. However, in certain cases it may not be desired to prescale all textures (e.g. when using shadow maps); for such textures, prescaling can be disabled using this method.
Boolean SetPrescalingEnabled(Boolean enabled)
enabled | If true, prescaling is enabled. |
Check if prescaling is enabled for this texture.
Boolean IsPrescalingEnabled()
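A one-line Lua sketch of the shadow map case mentioned above; "shadowMapTexture" is a hypothetical ITexture node reference:
-- Exempt this texture from the global power-of-2 prescale factor
-- defined in the IEngineConfiguration.
shadowMapTexture:SetPrescalingEnabled(false)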
Set the texture's actual pixel format.
Boolean SetPixelFormat(Murl.IEnums.PixelFormat pixelFormat)
pixelFormat | The pixel format to use. |
Get the texture's actual pixel format.
Murl.IEnums.PixelFormat GetPixelFormat()
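A Lua sketch of requesting an explicit pixel format; the value spelling PIXEL_FORMAT_R8_G8_B8_A8 is an assumption based on the engine's naming pattern:
-- Request a 32-bit RGBA format for this texture.
texture:SetPixelFormat(Murl.IEnums.PIXEL_FORMAT_R8_G8_B8_A8)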
Set the texture's wrap mode in X direction.
Boolean SetWrapModeX(Murl.IEnums.TextureWrapMode mode)
mode | The wrap mode. |
Get the texture's wrap mode in X direction.
Murl.IEnums.TextureWrapMode GetWrapModeX()
Set the texture's wrap mode in Y direction.
Boolean SetWrapModeY(Murl.IEnums.TextureWrapMode mode)
mode | The wrap mode. |
Get the texture's wrap mode in Y direction.
Murl.IEnums.TextureWrapMode GetWrapModeY()
Set the texture's wrap mode in Z direction.
Boolean SetWrapModeZ(Murl.IEnums.TextureWrapMode mode)
mode | The wrap mode. |
Get the texture's wrap mode in Z direction.
Murl.IEnums.TextureWrapMode GetWrapModeZ()
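A Lua sketch combining the per-axis wrap modes; both enum value spellings are assumptions based on the engine's naming pattern:
-- Tile the texture horizontally, clamp it vertically.
texture:SetWrapModeX(Murl.IEnums.TEXTURE_WRAP_MODE_REPEAT)
texture:SetWrapModeY(Murl.IEnums.TEXTURE_WRAP_MODE_CLAMP_TO_EDGE)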
Set the texture filter used for magnification. Valid magFilter values are restricted to TEXTURE_FILTER_NEAREST and TEXTURE_FILTER_LINEAR.
Boolean SetMagFilter(Murl.IEnums.TextureFilter magFilter)
magFilter | The filter to use. |
Get the texture filter used for magnification.
Murl.IEnums.TextureFilter GetMagFilter()
Set the texture filter used for minification. Valid minFilter values are restricted to TEXTURE_FILTER_NEAREST and TEXTURE_FILTER_LINEAR.
Boolean SetMinFilter(Murl.IEnums.TextureFilter minFilter)
minFilter | The filter to use. |
Get the texture filter used for minification.
Murl.IEnums.TextureFilter GetMinFilter()
Set the texture filter used for mip-level selection.
Boolean SetMipFilter(Murl.IEnums.TextureFilter mipFilter)
mipFilter | The filter to use. |
Get the texture filter used for mip-level selection.
Murl.IEnums.TextureFilter GetMipFilter()
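A Lua sketch combining the three filter setters for trilinear filtering, using the filter constants named above (only their Lua spelling under Murl.IEnums is assumed):
-- Trilinear filtering: linear mag/min filters plus linear mip-level
-- interpolation; mip-mapping must be enabled for the mip filter to apply.
texture:SetMipMappingEnabled(true)
texture:SetMagFilter(Murl.IEnums.TEXTURE_FILTER_LINEAR)
texture:SetMinFilter(Murl.IEnums.TEXTURE_FILTER_LINEAR)
texture:SetMipFilter(Murl.IEnums.TEXTURE_FILTER_LINEAR)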
Set the depth compare mode, if the pixel format defines a depth texture.
Boolean SetDepthTestMode(Murl.IEnums.DepthTestMode mode)
mode | The depth compare mode to use. |
Get the depth compare mode.
Murl.IEnums.DepthTestMode GetDepthTestMode()
Set the depth test function, if the pixel format defines a depth texture and the depth compare mode is not NONE.
Boolean SetDepthTestFunction(Murl.IEnums.DepthTestFunction function)
function | The depth test function to use. |
Get the depth test function.
Murl.IEnums.DepthTestFunction GetDepthTestFunction()
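A heavily hedged Lua sketch for a depth texture used in shadow mapping; both enum value spellings below are hypothetical and must be checked against the actual IEnums definitions:
-- Hypothetical value names; only the two setter methods are documented.
texture:SetDepthTestMode(Murl.IEnums.DEPTH_TEST_MODE_REFERENCE)
texture:SetDepthTestFunction(Murl.IEnums.DEPTH_TEST_FUNCTION_LESS_OR_EQUAL)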
Set the maximum anisotropy for filtering. By default, a maximum anisotropy value of 0.0 is defined. In this case, the global value defined via Murl::IEngineConfiguration::SetDefaultMaxTextureAnisotropy() is used. If set to a value other than 0.0, the given value is used. The actual value is clamped to the range from 1.0 to the highest possible value defined in the graphics driver/hardware (typically around 16.0, but may be lower), with 1.0 representing isotropic filtering (fastest), and higher values producing better visual results at the cost of rendering performance. Note that if the maximum anisotropy is higher than 1.0, the actual filter(s) chosen by the graphics API may differ from the ones specified via SetMagFilter(), SetMinFilter() and/or SetMipFilter().
Boolean SetMaxAnisotropy(Number maxAnisotropy)
maxAnisotropy | The maximum anisotropy value. |
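A minimal Lua sketch; the value 8.0 is an arbitrary illustration:
-- Request 8x anisotropic filtering; the driver clamps the value to
-- its supported maximum. A value of 0.0 falls back to the engine-wide
-- default set via Murl::IEngineConfiguration::SetDefaultMaxTextureAnisotropy().
texture:SetMaxAnisotropy(8.0)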
Get the maximum anisotropy for filtering.
Number GetMaxAnisotropy()
Get the texture's number of detail levels.
Integer GetNumberOfDetailLevels()
Get the texture's number of stages for a given detail level.
Integer GetNumberOfStages(Integer detailLevel)
detailLevel | The detail level to query. |