Murl Engine Lua Addon API  Version 1.0 beta
Murl.Graph.ITexture

The ITexture graph node interface.

This interface represents a generic node describing a texture for rendering.


Table members

Inherited


Murl.Graph.IStateSlot
Murl.Graph.IStateUnit

Methods


GetNodeInterface()

Get the constant Graph::INode interface. This method returns a constant pointer to the node's Graph::INode interface, to be able to query common node properties such as active state, visibility or ID.

Murl.Graph.INode GetNodeInterface()

Returns
Murl.Graph.INode The constant Graph::INode interface, or null if not available.

GetSubTextureNodeTarget()

Get the constant container holding the optional child textures. This method returns a constant pointer to the node's Graph::ITextureNodeTarget sub container, which is used to store multiple sub-textures.

Murl.Graph.IGenericNodeTarget.GraphITexture GetSubTextureNodeTarget()

Returns
Murl.Graph.IGenericNodeTarget.GraphITexture The constant Graph::ITextureNodeTarget container, or null if not available.

GetImageResourceTarget()

Get a constant Graph::IImageResourceTarget container. This method returns a constant pointer to a Graph::IImageResourceTarget container, which allows querying the image resources referenced by a node implementing this interface.

Murl.Graph.IGenericResourceTarget.ResourceIImage GetImageResourceTarget()

Returns
Murl.Graph.IGenericResourceTarget.ResourceIImage The constant Graph::IImageResourceTarget container, or null if not available.

SetVideoStream(target, layer, stream)

Manually set a video stream for a given layer and target. This method can be used to supply a manually created video stream as the texture's pixel source. Note that you cannot supply both an image resource and a video stream for the same target. Note also that the user is responsible for correct destruction of the given stream. The layer parameter must specify a layer in the range from 0 to GetNumberOfLayers()-1.

Boolean SetVideoStream(Murl.IEnums.TextureTarget target, Integer layer, Murl.IVideoStream stream)

Parameters
target: The texture target (flat, or one of the 6 cube map sides).
layer: The texture layer.
stream: The video stream to apply.
Returns
Boolean true if successful.

SetVideoStream(target, stream)

Manually set a video stream for layer 0 and a given target. This method can be used to supply a manually created video stream as the texture's pixel source. Note that you cannot supply both an image resource and a video stream for the same target. Note also that the user is responsible for correct destruction of the given stream.

Boolean SetVideoStream(Murl.IEnums.TextureTarget target, Murl.IVideoStream stream)

Parameters
target: The texture target (flat, or one of the 6 cube map sides).
stream: The video stream to apply.
Returns
Boolean true if successful.
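The call pattern for the SetVideoStream() overloads can be sketched as follows. This is a hypothetical snippet: it assumes `texture` already references a node implementing this interface, `stream` stands in for your own manually created Murl.IVideoStream, and the enum constant name TEXTURE_TARGET_FLAT is an assumption inferred from the C++ IEnums::TextureTarget naming.

```lua
-- Hypothetical sketch: feed a manually created video stream into the
-- texture. The caller stays responsible for destroying the stream, and
-- no image resource may be assigned to the same target.
local target = Murl.IEnums.TEXTURE_TARGET_FLAT  -- assumed constant name
if not texture:SetVideoStream(target, stream) then
    -- fails e.g. if an image resource is already set for this target
    print("could not set video stream")
end
```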

GetVideoStream(target)

Get the video stream for layer 0 and a given target.

Murl.IVideoStream GetVideoStream(Murl.IEnums.TextureTarget target)

Parameters
target: The texture target to query.
Returns
Murl.IVideoStream The video stream at the given target, or null if none is active.

GetVideoStream(target, layer)

Get the video stream for a given layer and target.

Murl.IVideoStream GetVideoStream(Murl.IEnums.TextureTarget target, Integer layer)

Parameters
target: The texture target to query.
layer: The texture layer.
Returns
Murl.IVideoStream The video stream at the given target, or null if none is active.

SetType(type)

Set the texture type. For a generic node implementing this interface, the actual texture type may be set using this method. For specialized implementations that implicitly set the type (like flat textures or cube maps), this method always returns false.

Boolean SetType(Murl.IEnums.TextureType type)

Parameters
type: One of the available texture types.
Returns
Boolean true if successful.

GetType()

Get the texture type.

Murl.IEnums.TextureType GetType()

Returns
Murl.IEnums.TextureType The actual texture type.

SetNumberOfLayers(numLayers)

Set the number of texture layers. If the node's type is not an array texture (i.e. it is a plain flat or cube map texture), this method returns false. See SetType().

Boolean SetNumberOfLayers(Integer numLayers)

Parameters
numLayers: The number of array layers.
Returns
Boolean true if successful.

GetNumberOfLayers()

Get the number of texture layers. If the node's type is not an array texture (i.e. it is a plain flat or cube map texture), this method always returns 1. See SetType().

Integer GetNumberOfLayers()

Returns
Integer The number of array layers.
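Combining SetNumberOfLayers() with the per-layer SetVideoStream() overload might look like the following sketch. Here `texture`, `target` and `makeStream` are placeholders for your own node reference, texture target and stream factory, and the node's type must have been set to an array texture beforehand.

```lua
-- Hypothetical sketch: a 4-layer array texture with one video stream
-- per layer. SetNumberOfLayers() fails for non-array texture types.
if texture:SetNumberOfLayers(4) then
    for layer = 0, texture:GetNumberOfLayers() - 1 do
        texture:SetVideoStream(target, layer, makeStream(layer))
    end
end
```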

SetMipMapGenerationMode(mode)

Set the MIP map generation mode. By default, the MIP map generation mode is set to IEnums::MIP_MAP_GENERATION_MODE_FAST.

Boolean SetMipMapGenerationMode(Murl.IEnums.MipMapGenerationMode mode)

Parameters
mode: The MIP map generation mode.
Returns
Boolean true if successful.

GetMipMapGenerationMode()

Get the MIP map generation mode.

Murl.IEnums.MipMapGenerationMode GetMipMapGenerationMode()

Returns
Murl.IEnums.MipMapGenerationMode The actual MIP map generation mode.

SetSize(sizeX, sizeY)

Set the texture's dimensions. A given value has no effect if a positive non-zero scale factor is defined for the respective axis via SetAutoScaleFactor().

Boolean SetSize(Integer sizeX, Integer sizeY)

Parameters
sizeX: The texture width in pixels.
sizeY: The texture height in pixels.
Returns
Boolean true if successful.

GetSizeX()

Get the texture's base width. This returns the base width of the texture (at MIP level 0), which is either defined via SetSize() or SetAutoScaleFactor(), or taken implicitly from a given image resource when neither of the former is defined. A possible prescale factor is not considered.

Integer GetSizeX()

Returns
Integer The texture width in pixels.

GetSizeY()

Get the texture's base height. See GetSizeX().

Integer GetSizeY()

Returns
Integer The texture height in pixels.

SetAutoScaleFactor(scaleX, scaleY)

Set the texture's auto scale factors. By default, the auto scale factors for both axes are set to 0.0, and the texture's dimensions match the values given via SetSize(). If any of the given scale factors is a positive non-zero value, the actual texture dimension for the respective axis is calculated from the current output surface dimension multiplied by that factor. In this case, a size value set via SetSize() has no effect. The current output surface size is retrieved via IAppConfiguration::GetDisplaySurfaceSizeX() and IAppConfiguration::GetDisplaySurfaceSizeY(). This is useful, e.g., for post-processing frame buffer textures that are supposed to match the current output display dimensions. For example, to create a texture that is half as wide as and equal in height to the current output surface, specify a value of 0.5 for scaleX and a value of 1.0 for scaleY.

Boolean SetAutoScaleFactor(Number scaleX, Number scaleY)

Parameters
scaleX: The width scale factor.
scaleY: The height scale factor.
Returns
Boolean true if successful.
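The half-width example from the description can be sketched as follows, assuming `texture` already references a node implementing this interface:

```lua
-- Sketch: track the output surface size instead of a fixed SetSize().
-- Width follows the surface at a factor of 0.5, height at 1.0; with
-- these non-zero factors, any values given via SetSize() are ignored.
texture:SetAutoScaleFactor(0.5, 1.0)
```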

GetAutoScaleFactorX()

Get the texture's auto scale factor in X direction.

Number GetAutoScaleFactorX()

Returns
Number The width scale factor.

GetAutoScaleFactorY()

Get the texture's auto scale factor in Y direction.

Number GetAutoScaleFactorY()

Returns
Number The height scale factor.

SetAlphaEnabled(enabled)

Enable/disable the alpha channel for this texture.

Boolean SetAlphaEnabled(Boolean enabled)

Parameters
enabled: If true, the alpha channel should be used.
Returns
Boolean true if successful.

IsAlphaEnabled()

Check if the alpha channel is enabled for this texture.

Boolean IsAlphaEnabled()

Returns
Boolean true if enabled.

SetMipMappingEnabled(enabled)

Enable/disable mip-mapping for this texture.

Boolean SetMipMappingEnabled(Boolean enabled)

Parameters
enabled: If true, mip-maps are enabled.
Returns
Boolean true if successful.

IsMipMappingEnabled()

Check if mip-mapping is enabled for this texture.

Boolean IsMipMappingEnabled()

Returns
Boolean true if enabled.

SetPrescalingEnabled(enabled)

Enable/disable prescaling for this texture. In the IEngineConfiguration, an application may define a power-of-2 texture prescale factor that can be used to e.g. scale down texture resources depending on a device's actual screen resolution. However, in certain cases it may not be desired to prescale all textures (e.g. when using shadow maps); for such textures, prescaling can be disabled using this method.

Boolean SetPrescalingEnabled(Boolean enabled)

Parameters
enabled: If true, prescaling is enabled.
Returns
Boolean true if successful.
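As a sketch of the shadow map case mentioned above (with `shadowMapTexture` as a placeholder node reference):

```lua
-- Sketch: keep a shadow map at its exact authored resolution by opting
-- out of the global power-of-2 texture prescale factor defined in the
-- IEngineConfiguration.
shadowMapTexture:SetPrescalingEnabled(false)
```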

IsPrescalingEnabled()

Check if prescaling is enabled for this texture.

Boolean IsPrescalingEnabled()

Returns
Boolean true if enabled.

SetPixelFormat(pixelFormat)

Set the texture's actual pixel format.

Boolean SetPixelFormat(Murl.IEnums.PixelFormat pixelFormat)

Parameters
pixelFormat: The pixel format to use.
Returns
Boolean true if successful.

GetPixelFormat()

Get the texture's actual pixel format.

Murl.IEnums.PixelFormat GetPixelFormat()

Returns
Murl.IEnums.PixelFormat The texture's pixel format.

SetWrapModeX(mode)

Set the texture's wrap mode in X direction.

Boolean SetWrapModeX(Murl.IEnums.TextureWrapMode mode)

Parameters
mode: The wrap mode.
Returns
Boolean true if successful.

GetWrapModeX()

Get the texture's wrap mode in X direction.

Murl.IEnums.TextureWrapMode GetWrapModeX()

Returns
Murl.IEnums.TextureWrapMode The wrap mode.

SetWrapModeY(mode)

Set the texture's wrap mode in Y direction.

Boolean SetWrapModeY(Murl.IEnums.TextureWrapMode mode)

Parameters
mode: The wrap mode.
Returns
Boolean true if successful.

GetWrapModeY()

Get the texture's wrap mode in Y direction.

Murl.IEnums.TextureWrapMode GetWrapModeY()

Returns
Murl.IEnums.TextureWrapMode The wrap mode.

SetWrapModeZ(mode)

Set the texture's wrap mode in Z direction.

Boolean SetWrapModeZ(Murl.IEnums.TextureWrapMode mode)

Parameters
mode: The wrap mode.
Returns
Boolean true if successful.

GetWrapModeZ()

Get the texture's wrap mode in Z direction.

Murl.IEnums.TextureWrapMode GetWrapModeZ()

Returns
Murl.IEnums.TextureWrapMode The wrap mode.

SetMagFilter(magFilter)

Set the texture filter used for magnification. Valid magFilter values are restricted to TEXTURE_FILTER_NEAREST and TEXTURE_FILTER_LINEAR.

Boolean SetMagFilter(Murl.IEnums.TextureFilter magFilter)

Parameters
magFilter: The filter to use.
Returns
Boolean true if successful.

GetMagFilter()

Get the texture filter used for magnification.

Murl.IEnums.TextureFilter GetMagFilter()

Returns
Murl.IEnums.TextureFilter The filter used.

SetMinFilter(minFilter)

Set the texture filter used for minification. Valid minFilter values are restricted to TEXTURE_FILTER_NEAREST and TEXTURE_FILTER_LINEAR.

Boolean SetMinFilter(Murl.IEnums.TextureFilter minFilter)

Parameters
minFilter: The filter to use.
Returns
Boolean true if successful.

GetMinFilter()

Get the texture filter used for minification.

Murl.IEnums.TextureFilter GetMinFilter()

Returns
Murl.IEnums.TextureFilter The filter used.

SetMipFilter(mipFilter)

Set the texture filter used for mip-level selection.

Boolean SetMipFilter(Murl.IEnums.TextureFilter mipFilter)

Parameters
mipFilter: The filter to use.
Returns
Boolean true if successful.

GetMipFilter()

Get the texture filter used for mip-level selection.

Murl.IEnums.TextureFilter GetMipFilter()

Returns
Murl.IEnums.TextureFilter The filter used.
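A typical sampler setup for a tiling, trilinearly filtered texture might be sketched as below. `texture` is a placeholder node reference, and the enum constant names are assumptions inferred from the C++ IEnums naming; check the IEnums reference for the exact Lua names.

```lua
-- Hypothetical sketch: repeat wrapping plus trilinear filtering.
texture:SetWrapModeX(Murl.IEnums.TEXTURE_WRAP_MODE_REPEAT)  -- assumed name
texture:SetWrapModeY(Murl.IEnums.TEXTURE_WRAP_MODE_REPEAT)  -- assumed name
texture:SetMipMappingEnabled(true)
texture:SetMagFilter(Murl.IEnums.TEXTURE_FILTER_LINEAR)  -- NEAREST or LINEAR only
texture:SetMinFilter(Murl.IEnums.TEXTURE_FILTER_LINEAR)  -- NEAREST or LINEAR only
texture:SetMipFilter(Murl.IEnums.TEXTURE_FILTER_LINEAR)  -- smooth mip transitions
```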

SetDepthTestMode(mode)

Set the depth compare mode, if the pixel format defines a depth texture.

Boolean SetDepthTestMode(Murl.IEnums.DepthTestMode mode)

Parameters
mode: The depth compare mode to use.
Returns
Boolean true if successful.

GetDepthTestMode()

Get the depth compare mode.

Murl.IEnums.DepthTestMode GetDepthTestMode()

Returns
Murl.IEnums.DepthTestMode The depth compare mode used.

SetDepthTestFunction(function)

Set the depth test function, if the pixel format defines a depth texture and the depth compare mode is not NONE.

Boolean SetDepthTestFunction(Murl.IEnums.DepthTestFunction function)

Parameters
function: The depth test function to use.
Returns
Boolean true if successful.

GetDepthTestFunction()

Get the depth test function.

Murl.IEnums.DepthTestFunction GetDepthTestFunction()

Returns
Murl.IEnums.DepthTestFunction The depth test function used.

SetMaxAnisotropy(maxAnisotropy)

Set the maximum anisotropy for filtering. By default, a maximum anisotropy value of 0.0 is defined. In this case, the global value defined via Murl::IEngineConfiguration::SetDefaultMaxTextureAnisotropy() is used. If set to a value other than 0.0, the given value is used. The actual value is clamped to the range from 1.0 to the highest possible value defined in the graphics driver/hardware (typically around 16.0, but may be lower), with 1.0 representing isotropic filtering (fastest), and higher values producing better visual results at the cost of rendering performance. Note that if the maximum anisotropy is higher than 1.0, the actual filter(s) chosen by the graphics API may differ from the ones specified via SetMagFilter(), SetMinFilter() and/or SetMipFilter().

Boolean SetMaxAnisotropy(Number maxAnisotropy)

Parameters
maxAnisotropy: The maximum anisotropy value.
Returns
Boolean true if successful.
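As a sketch (with `texture` as a placeholder node reference):

```lua
-- Sketch: request 8x anisotropic filtering instead of the engine-wide
-- default (0.0 means "use IEngineConfiguration's default value"). The
-- value is clamped to the driver/hardware maximum, typically ~16.0.
texture:SetMaxAnisotropy(8.0)
```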

GetMaxAnisotropy()

Get the maximum anisotropy used for filtering.

Number GetMaxAnisotropy()

Returns
Number The maximum anisotropy value.

GetNumberOfDetailLevels()

Get the texture's number of detail levels.

Integer GetNumberOfDetailLevels()

Returns
Integer The number of detail levels.

GetNumberOfStages(detailLevel)

Get the texture's number of stages for a given detail level.

Integer GetNumberOfStages(Integer detailLevel)

Parameters
detailLevel: The detail level to query.
Returns
Integer The number of stages.