MPSImageGuidedFilter(3) MetalPerformanceShaders.framework MPSImageGuidedFilter(3)

MPSImageGuidedFilter

#import <MPSImageGuidedFilter.h>

Inherits MPSKernel.


(nonnull instancetype) - initWithDevice:kernelDiameter:
(nonnull instancetype) - initWithDevice:
(nullable instancetype) - initWithCoder:device:
(void) - encodeRegressionToCommandBuffer:sourceTexture:guidanceTexture:weightsTexture:destinationCoefficientsTexture:
(void) - encodeReconstructionToCommandBuffer:guidanceTexture:coefficientsTexture:destinationTexture:


NSUInteger kernelDiameter
float epsilon
float reconstructScale
float reconstructOffset

MPSImageGuidedFilter.h MetalPerformanceShaders

Copyright:

Copyright (c) 2018 Apple Inc. All rights reserved.

Performs a guided filter to produce a coefficients image. The filter is broken into two stages:

The regression stage learns a 4-channel 'coefficient' texture (typically at a very low resolution) that represents the per-pixel linear regression of the source texture to the guidance texture.

The reconstruction stage upsamples the coefficients to the same size as the final output and then, at each pixel, computes the inner product to produce the output.

The filter is broken into two stages to allow the coefficients to be filtered between them (for example, temporally filtered for video to prevent flicker).

There is also support for an optional weight texture that can be used to discard values in the source data.

Guided Filter is described at https://arxiv.org/pdf/1505.00996.pdf.
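As a sketch, the two stages above can be encoded back to back on one command buffer. This is a hypothetical setup, not a complete program: `device`, `commandBuffer`, and the four textures are assumed to have been created elsewhere, and `coefficientsTexture` is assumed to use a 4-channel format so it can hold 'a' (RGB) and 'b' (alpha).

```objc
#import <MetalPerformanceShaders/MetalPerformanceShaders.h>

// Sketch of the two-stage guided filter pipeline; resource creation omitted.
MPSImageGuidedFilter *filter =
    [[MPSImageGuidedFilter alloc] initWithDevice:device kernelDiameter:5];
filter.epsilon = 1e-4f;  // regularization for the linear regression

// Stage 1: learn the per-pixel linear coefficients ('a' in RGB, 'b' in alpha).
[filter encodeRegressionToCommandBuffer:commandBuffer
                          sourceTexture:sourceTexture      // single-channel mask
                        guidanceTexture:guidanceTexture    // RGB image
                         weightsTexture:nil                // optional confidence weights
         destinationCoefficientsTexture:coefficientsTexture];

// (The coefficients texture could be filtered here, e.g. temporally for video.)

// Stage 2: upsample the coefficients and reconstruct the filtered output.
[filter encodeReconstructionToCommandBuffer:commandBuffer
                            guidanceTexture:guidanceTexture
                        coefficientsTexture:coefficientsTexture
                         destinationTexture:destinationTexture];

[commandBuffer commit];
```

Encoding both stages on the same command buffer is only one option; the split into two encode calls exists precisely so other work can be inserted between them.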

- (void) encodeReconstructionToCommandBuffer:(nonnull id<MTLCommandBuffer>)commandBuffer guidanceTexture:(nonnull id<MTLTexture>)guidanceTexture coefficientsTexture:(nonnull id<MTLTexture>)coefficientsTexture destinationTexture:(nonnull id<MTLTexture>)destinationTexture

Performs guided filter reconstruction (inference) to produce the filtered output. The filter will not begin to execute until after the command buffer has been enqueued and committed.

Parameters:

commandBuffer A valid MTLCommandBuffer.
guidanceTexture Input guidance texture. This should be a color (RGB) image.
coefficientsTexture Input coefficients texture generated by a previous encodeRegressionToCommandBuffer call.
destinationTexture Output texture.

Note: The coefficients are upsampled during the reconstruction of the filtered data. Reconstruct(guidance RGB) = a.r * R + a.g * G + a.b * B + b, where a and b are the coefficients learned using encodeRegressionToCommandBuffer.

Final reconstructed value = value * reconstructScale + reconstructOffset
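As an illustrative sketch (the values below are arbitrary, not defaults; `filter` is an assumed, previously created MPSImageGuidedFilter), the scale and offset properties can remap the reconstructed output range before it is written to the destination texture:

```objc
// final value = reconstructed * reconstructScale + reconstructOffset
filter.reconstructScale  = 2.0f;   // illustrative: map a [0, 1] result ...
filter.reconstructOffset = -1.0f;  // ... into [-1, 1]
```

The defaults (scale 1.0f, offset 0.0f) leave the reconstructed value unchanged.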

- (void) encodeRegressionToCommandBuffer:(nonnull id<MTLCommandBuffer>)commandBuffer sourceTexture:(nonnull id<MTLTexture>)sourceTexture guidanceTexture:(nonnull id<MTLTexture>)guidanceTexture weightsTexture:(nullable id<MTLTexture>)weightsTexture destinationCoefficientsTexture:(nonnull id<MTLTexture>)destinationCoefficientsTexture

Performs guided filter regression (correlation) to produce a coefficients texture. The filter will not begin to execute until after the command buffer has been enqueued and committed.

Parameters:

commandBuffer A valid MTLCommandBuffer.
sourceTexture Input source texture to be filtered (typically a mask). This should be a single channel image.
guidanceTexture Input guidance texture. This should be a color (RGB) image.
weightsTexture Optional input confidence texture. This should also be a single channel image.
destinationCoefficientsTexture Output texture with four coefficients that minimize the mean squared error between the source and an affine function of guidance R, G, B.

Note: The destinationCoefficientsTexture holds the linear coefficients 'a' and 'b'. The 'a' coefficient is stored in the RGB channels of destinationCoefficientsTexture and the 'b' coefficient in the alpha channel.

Set MPSKernelOptionsAllowReducedPrecision in the 'options' property for this kernel to perform the computations using half-precision arithmetic. This can potentially improve performance and/or power usage.
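For example, opting in to reduced precision is a one-line change via the 'options' property inherited from MPSKernel (`filter` is an assumed, previously created instance):

```objc
// Allow half-precision arithmetic for this kernel's computations.
filter.options |= MPSKernelOptionsAllowReducedPrecision;
```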

- (nullable instancetype) initWithCoder:(NSCoder * __nonnull)aDecoder device:(nonnull id<MTLDevice>)device

NSSecureCoding compatibility. While the standard NSSecureCoding/NSCoding method -initWithCoder: should work, the file can't know which device your data is allocated on, so we have to guess and may guess incorrectly. To avoid that problem, use initWithCoder:device: instead.

Parameters:

aDecoder The NSCoder subclass with your serialized MPSKernel
device The MTLDevice on which to make the MPSKernel

Returns:

A new MPSKernel object, or nil on failure.

Reimplemented from MPSKernel.
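A hypothetical unarchiving sketch: `data` is assumed to hold a previously archived kernel and `device` an existing id<MTLDevice>; error handling is abbreviated.

```objc
// Sketch: unarchive a serialized MPSImageGuidedFilter onto a specific device.
NSError *error = nil;
NSKeyedUnarchiver *decoder =
    [[NSKeyedUnarchiver alloc] initForReadingFromData:data error:&error];
MPSImageGuidedFilter *filter =
    [[MPSImageGuidedFilter alloc] initWithCoder:decoder device:device];
if (filter == nil) { /* decoding failed or device unsupported */ }
```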

- (nonnull instancetype) initWithDevice: (nonnull id< MTLDevice >) device

Standard init with default properties per filter type.

Parameters:

device The device that the filter will be used on. May not be NULL.

Returns:

A pointer to the newly initialized object. This will fail, returning nil, if the device is not supported. Devices must be MTLFeatureSet_iOS_GPUFamily2_v1 or later.

Reimplemented from MPSKernel.

- (nonnull instancetype) initWithDevice:(nonnull id<MTLDevice>)device kernelDiameter:(NSUInteger)kernelDiameter

Initializes the guided filter with the local window size used for the regression.

Parameters:

device The device the filter will run on
kernelDiameter The local window size.

Returns:

A valid MPSImageGuidedFilter object, or nil on failure.

- epsilon [read], [write], [nonatomic], [assign]

The regularization parameter used when computing the linear coefficients a and b.

- kernelDiameter [read], [nonatomic], [assign]

The local window size.

- reconstructOffset [read], [write], [nonatomic], [assign]

The offset added to the scaled reconstructed value. The default value is 0.0f.

- reconstructScale [read], [write], [nonatomic], [assign]

The scale applied to the result of the reconstruction operation. The default value is 1.0f.

Generated automatically by Doxygen for MetalPerformanceShaders.framework from the source code.

Mon Jul 9 2018 Version MetalPerformanceShaders-119.3