DML_ACTIVATION_RELU_GRAD_OPERATOR_DESC structure (directml.h)

Computes backpropagation gradients for a rectified linear unit (ReLU). This operator performs the following element-wise computation.

X = InputTensor
dY = InputGradientTensor

OutputGradientTensor = (X > 0 ? dY : 0)

The corresponding forward-pass operator is DML_ACTIVATION_RELU_OPERATOR_DESC.
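To make the element-wise computation concrete, here is a minimal CPU reference sketch in C++ (not DirectML API code); the function name and the flat float arrays x, dy, and dx are hypothetical and for illustration only.

#include <cstddef>

// dx[i] = (x[i] > 0) ? dy[i] : 0, applied element-wise. All three
// arrays hold the same number of elements, mirroring the requirement
// that the three tensors' Sizes and DataType match.
void ReluGradReference(const float* x, const float* dy, float* dx, size_t count)
{
    for (size_t i = 0; i < count; ++i)
    {
        dx[i] = (x[i] > 0.0f) ? dy[i] : 0.0f;
    }
}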

Syntax

struct DML_ACTIVATION_RELU_GRAD_OPERATOR_DESC {
  const DML_TENSOR_DESC *InputTensor;
  const DML_TENSOR_DESC *InputGradientTensor;
  const DML_TENSOR_DESC *OutputGradientTensor;
};

Members

InputTensor

Type: const DML_TENSOR_DESC*

The input (feature) tensor. This is typically the same tensor that was provided as the InputTensor during the forward pass (see DML_ACTIVATION_RELU_OPERATOR_DESC).

InputGradientTensor

Type: const DML_TENSOR_DESC*

The incoming gradient tensor. This is typically obtained from backpropagation through a preceding layer. The Sizes and DataType of this tensor must exactly match those of the InputTensor.

OutputGradientTensor

Type: const DML_TENSOR_DESC*

The output tensor to write the resulting gradients to. The Sizes and DataType of this tensor must exactly match those of the InputTensor.
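The following sketch shows one way to describe and create this operator; it is an illustrative fragment, not a canonical implementation. It assumes an already-created IDMLDevice, reuses a single 1 x 1 x 2 x 3 FLOAT32 buffer description for all three tensors (their Sizes and DataType must match), and uses the operator type enum value DML_OPERATOR_ACTIVATION_RELU_GRAD that pairs with this structure. Binding and execution are omitted.

#include <d3d12.h>
#include <directml.h>

// Illustrative helper (hypothetical name): describes the three tensors
// with one shared buffer description and creates the operator.
HRESULT CreateReluGradOperator(IDMLDevice* dmlDevice, IDMLOperator** reluGradOp)
{
    UINT sizes[4] = { 1, 1, 2, 3 };

    DML_BUFFER_TENSOR_DESC bufferDesc = {};
    bufferDesc.DataType = DML_TENSOR_DATA_TYPE_FLOAT32;
    bufferDesc.Flags = DML_TENSOR_FLAG_NONE;
    bufferDesc.DimensionCount = 4;
    bufferDesc.Sizes = sizes;
    bufferDesc.Strides = nullptr; // packed layout
    bufferDesc.TotalTensorSizeInBytes = 1 * 1 * 2 * 3 * sizeof(float);

    DML_TENSOR_DESC tensorDesc = { DML_TENSOR_TYPE_BUFFER, &bufferDesc };

    DML_ACTIVATION_RELU_GRAD_OPERATOR_DESC reluGradDesc = {};
    reluGradDesc.InputTensor = &tensorDesc;
    reluGradDesc.InputGradientTensor = &tensorDesc;
    reluGradDesc.OutputGradientTensor = &tensorDesc;

    DML_OPERATOR_DESC opDesc = { DML_OPERATOR_ACTIVATION_RELU_GRAD, &reluGradDesc };
    return dmlDevice->CreateOperator(&opDesc, IID_PPV_ARGS(reluGradOp));
}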

Requirements

Header: directml.h

See also

DML_ACTIVATION_RELU_OPERATOR_DESC