MPSCNNBatchNormalization(3) | MetalPerformanceShaders.framework | MPSCNNBatchNormalization(3) |
MPSCNNBatchNormalization
#import <MPSCNNBatchNormalization.h>
Inherits MPSCNNKernel.
(nonnull instancetype) - initWithDevice:dataSource:
(nonnull instancetype) - initWithDevice:dataSource:fusedNeuronDescriptor:
(nonnull instancetype) - initWithDevice:
(nullable instancetype) - initWithCoder:device:
(void) - encodeToCommandBuffer:sourceImage:batchNormalizationState:destinationImage:
(void) - encodeBatchToCommandBuffer:sourceImages:batchNormalizationState:destinationImages:
(void) - encodeToCommandBuffer:sourceImage:destinationState:destinationImage:
(MPSImage *__nonnull) - encodeToCommandBuffer:sourceImage:destinationState:destinationStateIsTemporary:
(void) - encodeBatchToCommandBuffer:sourceImages:destinationStates:destinationImages:
(MPSImageBatch *__nonnull) - encodeBatchToCommandBuffer:sourceImages:destinationStates:destinationStateIsTemporary:
(MPSCNNBatchNormalizationState *__nullable) - resultStateForSourceImage:sourceStates:destinationImage:
(MPSCNNBatchNormalizationState *__nullable) - temporaryResultStateForCommandBuffer:sourceImage:sourceStates:destinationImage:
(void) - reloadDataSource:
(void) - reloadGammaAndBetaFromDataSource
(void) - reloadMeanAndVarianceFromDataSource
(void) - reloadGammaAndBetaWithCommandBuffer:gammaAndBetaState:
(void) - reloadMeanAndVarianceWithCommandBuffer:meanAndVarianceState:
NSUInteger numberOfFeatureChannels
float epsilon
id< MPSCNNBatchNormalizationDataSource > dataSource
This depends on Metal.framework. MPSCNNBatchNormalization normalizes input images using per-channel means and variances:
for (c = 0; c < numberOfFeatureChannels; ++c) {
    input_image  = in(:,:,c,:);
    output_image = (input_image - mean[c]) * gamma[c] / sqrt(variance[c] + epsilon) + beta[c];
    out(:,:,c,:) = output_image;
}
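For reference, the same per-channel arithmetic can be written as a small CPU helper. This is a sketch of the formula only, not of how the kernel runs on the GPU; all names are illustrative.

    #include <math.h>

    // Normalize one value x belonging to feature channel c, using per-channel
    // mean, variance, gamma, and beta arrays of length numberOfFeatureChannels.
    static inline float BatchNormalizeValue(float x, unsigned c,
                                            const float *mean, const float *variance,
                                            const float *gamma, const float *beta,
                                            float epsilon)
    {
        return (x - mean[c]) * gamma[c] / sqrtf(variance[c] + epsilon) + beta[c];
    }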
Encode this kernel to a command buffer for a batch of images using a batch normalization state.
Parameters:
Encode an MPSCNNKernel with a destination state into a command buffer. This is typically used during training. The state is commonly an MPSNNGradientState. Please see -resultStateForSourceImage:sourceStates:destinationImage: and the batch and temporary variants.
Parameters:
Reimplemented from MPSCNNKernel.
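A minimal sketch of calling this method, assuming cmdBuf is an id<MTLCommandBuffer>, bn is an MPSCNNBatchNormalization kernel, sourceImages and destinationImages are MPSImageBatch collections of equal length, and bnState is an MPSCNNBatchNormalizationState obtained earlier (for example from -resultStateForSourceImage:sourceStates:destinationImage:):

    [bn encodeBatchToCommandBuffer: cmdBuf
                      sourceImages: sourceImages
          batchNormalizationState: bnState
                 destinationImages: destinationImages];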
Encode an MPSCNNKernel into a command buffer, creating an MPSImageBatch and MPSStateBatch to hold the results and returning them. In the first iteration of this method, encodeToCommandBuffer:sourceImage:destinationImage:, some work was left for the developer: correctly setting the offset property and sizing the result buffer. With the introduction of the padding policy (see the padding property) the filter can do this work itself. If you would like some input into what sort of MPSImage is returned (e.g. temporary vs. regular), what size it is, or where it is allocated, you may set the destinationImageAllocator to allocate the image yourself.
This method uses the MPSNNPadding padding property to figure out how to size the result image and to set the offset property. See discussion in MPSNeuralNetworkTypes.h. All images in a batch must have MPSImage.numberOfImages = 1.
Usage:
MPSStateBatch * outStates = nil;    // autoreleased
MPSImageBatch * result = [k encodeBatchToCommandBuffer: cmdBuf
                                          sourceImages: sourceImages
                                     destinationStates: &outStates
                          destinationStateIsTemporary: YES ];
Parameters:
Returns:
Reimplemented from MPSCNNKernel.
Encode this kernel to a command buffer for a single image using a batch normalization state.
Parameters:
Encode an MPSCNNKernel with a destination state into a command buffer. This is typically used during training. The state is commonly an MPSNNGradientState. Please see -resultStateForSourceImage:sourceStates:destinationImage: and the batch and temporary variants.
Parameters:
Reimplemented from MPSCNNKernel.
Encode an MPSCNNKernel into a command buffer, creating a texture and state to hold the results and returning them. In the first iteration of this method, encodeToCommandBuffer:sourceImage:destinationState:destinationImage:, some work was left for the developer: correctly setting the offset property and sizing the result buffer. With the introduction of the padding policy (see the padding property) the filter can do this work itself. If you would like some input into what sort of MPSImage is returned (e.g. temporary vs. regular), what size it is, or where it is allocated, you may set the destinationImageAllocator to allocate the image yourself.
This method uses the MPSNNPadding padding property to figure out how to size the result image and to set the offset property. See discussion in MPSNeuralNetworkTypes.h. All images in a batch must have MPSImage.numberOfImages = 1.
Parameters:
Returns:
Reimplemented from MPSCNNKernel.
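A minimal sketch of this variant, assuming bn is an MPSCNNBatchNormalization kernel, cmdBuf an id<MTLCommandBuffer>, and srcImage an MPSImage; the kernel sizes and allocates the returned image according to the padding policy, and the outState name is illustrative:

    MPSState *outState = nil;
    MPSImage *result = [bn encodeToCommandBuffer: cmdBuf
                                     sourceImage: srcImage
                                destinationState: &outState
                     destinationStateIsTemporary: YES];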
NSSecureCoding compatibility: While the standard NSSecureCoding/NSCoding method -initWithCoder: should work, the file can't know which device your data is allocated on, so we have to guess and may guess incorrectly. To avoid that problem, use a subclass of NSCoder that implements the <MPSDeviceProvider> protocol to tell MPS the MTLDevice to use; a sketch follows this method's documentation.
Parameters:
Returns:
Reimplemented from MPSCNNKernel.
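A minimal sketch of the suggested approach, assuming the MPSDeviceProvider protocol exposes the device through -mpsMTLDevice (check the framework headers for the exact declaration); the class name is illustrative:

    @interface MyDeviceUnarchiver : NSKeyedUnarchiver <MPSDeviceProvider>
    @property (nonatomic, strong) id<MTLDevice> device;   // device to decode onto
    @end

    @implementation MyDeviceUnarchiver
    // MPS consults this method while decoding so the kernel is created on the intended device.
    - (id<MTLDevice>)mpsMTLDevice { return self.device; }
    @end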
Use initWithDevice:dataSource: instead.
Reimplemented from MPSCNNKernel.
Initializes a batch normalization kernel using a data source.
Parameters:
Returns:
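A minimal sketch of a data source and kernel creation. The exact set of required MPSCNNBatchNormalizationDataSource methods should be checked against MPSCNNBatchNormalization.h; everything prefixed My below is illustrative.

    #import <MetalPerformanceShaders/MetalPerformanceShaders.h>

    @interface MyBNDataSource : NSObject <MPSCNNBatchNormalizationDataSource>
    - (nonnull instancetype)initWithFeatureChannels:(NSUInteger)channels;
    @end

    @implementation MyBNDataSource {
        NSUInteger _channels;
        float *_gamma, *_beta, *_mean, *_variance;   // one value per feature channel
    }
    - (instancetype)initWithFeatureChannels:(NSUInteger)channels
    {
        if ((self = [super init])) {
            _channels = channels;
            _gamma    = calloc(channels, sizeof(float));
            _beta     = calloc(channels, sizeof(float));
            _mean     = calloc(channels, sizeof(float));
            _variance = calloc(channels, sizeof(float));
            for (NSUInteger c = 0; c < channels; ++c) { _gamma[c] = 1.f; _variance[c] = 1.f; }
        }
        return self;
    }
    - (NSUInteger)numberOfFeatureChannels { return _channels; }
    - (float *)gamma     { return _gamma; }
    - (float *)beta      { return _beta; }
    - (float *)mean      { return _mean; }
    - (float *)variance  { return _variance; }
    - (NSString *)label  { return @"MyBNDataSource"; }
    - (id)copyWithZone:(NSZone *)zone { return self; }   // in case the protocol adopts NSCopying
    @end

    // Creating the kernel from the data source:
    id<MTLDevice> device = MTLCreateSystemDefaultDevice();
    MyBNDataSource *ds = [[MyBNDataSource alloc] initWithFeatureChannels: 32];
    MPSCNNBatchNormalization *bn =
        [[MPSCNNBatchNormalization alloc] initWithDevice: device dataSource: ds];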
Initializes a batch normalization kernel using a data source and a neuron descriptor.
Parameters:
Returns:
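A minimal sketch, assuming device and ds from the previous example; the fused neuron (here a ReLU described by an MPSNNNeuronDescriptor) is applied to the normalized result within the same kernel:

    MPSNNNeuronDescriptor *relu =
        [MPSNNNeuronDescriptor cnnNeuronDescriptorWithType: MPSCNNNeuronTypeReLU a: 0.f];
    MPSCNNBatchNormalization *bnRelu =
        [[MPSCNNBatchNormalization alloc] initWithDevice: device
                                              dataSource: ds
                                   fusedNeuronDescriptor: relu];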
Reinitialize the filter using a data source.
Parameters:
Reinitialize the filter's gamma and beta values using the data source provided at kernel initialization.
Reload data using new gamma and beta terms contained within an MPSCNNNormalizationGammaAndBetaState object.
Parameters:
Reinitialize the filter's mean and variance values using the data source provided at kernel initialization.
Reload data using new mean and variance terms contained within an MPSCNNNormalizationMeanAndVarianceState object.
Parameters:
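A minimal sketch, assuming meanVarState is an MPSCNNNormalizationMeanAndVarianceState whose buffers were updated on the GPU (for example during training) and cmdBuf is an id<MTLCommandBuffer>; the gamma and beta variant above is used the same way:

    [bn reloadMeanAndVarianceWithCommandBuffer: cmdBuf
                          meanAndVarianceState: meanVarState];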
Return an MPSCNNBatchNormalizationState object which may be used with an MPSCNNBatchNormalization filter.
Reimplemented from MPSCNNKernel.
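A minimal sketch, assuming srcImage and destImage are MPSImage objects sized for this kernel; the returned state can then be passed to the encode methods above that take a batch normalization state:

    MPSCNNBatchNormalizationState *bnState =
        [bn resultStateForSourceImage: srcImage
                         sourceStates: nil
                     destinationImage: destImage];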
Return a temporary MPSCNNBatchNormalizationState object which may be used with an MPSCNNBatchNormalization filter.
Reimplemented from MPSCNNKernel.
The data source with which the batch normalization kernel was initialized.
The epsilon value used in the batch normalization formula to bias the variance when normalizing.
The number of feature channels in an image to be normalized.
Generated automatically by Doxygen for MetalPerformanceShaders.framework from the source code.
Mon Jul 9 2018 | Version MetalPerformanceShaders-119.3 |