DML_ACTIVATION_LEAKY_RELU_OPERATOR_DESC structure (directml.h)

Performs a leaky rectified linear unit (ReLU) activation function on every element in InputTensor, placing the result into the corresponding element of OutputTensor.

f(x) = x,         if x >= 0
       Alpha * x, otherwise
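
As a point of reference only, here is a minimal CPU sketch of the same formula. The LeakyRelu function name and the sample values are illustrative assumptions, not part of the DirectML API.

#include <cstdio>

// Hypothetical CPU reference for the formula above: returns x when
// x >= 0, and alpha * x otherwise.
float LeakyRelu(float x, float alpha)
{
    return (x >= 0.0f) ? x : alpha * x;
}

int main()
{
    // With alpha = 0.01: 3.0 passes through; -3.0 is scaled to -0.03.
    std::printf("%f %f\n", LeakyRelu(3.0f, 0.01f), LeakyRelu(-3.0f, 0.01f));
}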

This operator supports in-place execution, meaning that OutputTensor is permitted to alias InputTensor during binding.
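
Concretely, once the operator has been compiled and a binding table created for it, the same Direct3D 12 buffer may be bound as both the input and the output. A minimal sketch, in which the function and its parameter names are hypothetical:

#include <DirectML.h>

// In-place binding sketch: because OutputTensor may alias InputTensor,
// one buffer binding can serve as both the input and the output.
void BindInPlace(IDMLBindingTable* bindingTable,
                 ID3D12Resource* buffer,
                 UINT64 bufferSizeInBytes)
{
    DML_BUFFER_BINDING bufferBinding = { buffer, 0, bufferSizeInBytes };
    DML_BINDING_DESC bindingDesc = { DML_BINDING_TYPE_BUFFER, &bufferBinding };

    bindingTable->BindInputs(1, &bindingDesc);
    bindingTable->BindOutputs(1, &bindingDesc);
}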

Syntax

struct DML_ACTIVATION_LEAKY_RELU_OPERATOR_DESC {
  const DML_TENSOR_DESC *InputTensor;
  const DML_TENSOR_DESC *OutputTensor;
  FLOAT                 Alpha;
};
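
As an illustration, the sketch below fills in this structure and creates the operator with IDMLDevice::CreateOperator. The tensor shape { 1, 1, 2, 3 }, the Alpha value, and the CreateLeakyRelu helper name are assumptions made for the example, not requirements.

#include <DirectML.h>
#include <wrl/client.h>

// A minimal sketch: creates a leaky ReLU operator over a packed 4D
// FLOAT32 tensor of shape { 1, 1, 2, 3 }. Assumes dmlDevice is an
// already created IDMLDevice.
Microsoft::WRL::ComPtr<IDMLOperator> CreateLeakyRelu(IDMLDevice* dmlDevice)
{
    static const UINT sizes[4] = { 1, 1, 2, 3 };

    DML_BUFFER_TENSOR_DESC bufferDesc = {};
    bufferDesc.DataType = DML_TENSOR_DATA_TYPE_FLOAT32;
    bufferDesc.Flags = DML_TENSOR_FLAG_NONE;
    bufferDesc.DimensionCount = 4;
    bufferDesc.Sizes = sizes;
    bufferDesc.Strides = nullptr; // Packed layout.
    bufferDesc.TotalTensorSizeInBytes = 1 * 1 * 2 * 3 * sizeof(float);

    DML_TENSOR_DESC tensorDesc = { DML_TENSOR_TYPE_BUFFER, &bufferDesc };

    // InputTensor and OutputTensor must have the same DataType,
    // DimensionCount, and Sizes, so one DML_TENSOR_DESC serves for both.
    DML_ACTIVATION_LEAKY_RELU_OPERATOR_DESC leakyReluDesc = {};
    leakyReluDesc.InputTensor = &tensorDesc;
    leakyReluDesc.OutputTensor = &tensorDesc;
    leakyReluDesc.Alpha = 0.01f;

    DML_OPERATOR_DESC opDesc = { DML_OPERATOR_ACTIVATION_LEAKY_RELU, &leakyReluDesc };

    Microsoft::WRL::ComPtr<IDMLOperator> op;
    dmlDevice->CreateOperator(&opDesc, IID_PPV_ARGS(&op)); // Check the HRESULT in real code.
    return op;
}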

Members

InputTensor

Type: const DML_TENSOR_DESC*

The input tensor to read from.

OutputTensor

Type: const DML_TENSOR_DESC*

The output tensor to write the results to.

Alpha

Type: FLOAT

The alpha coefficient, which scales negative input values. A typical default for this value is 0.01. For example, with Alpha = 0.01, an input of -5 produces -0.05, while non-negative inputs pass through unchanged.

Requirements

Header    directml.h

See also

DML_ACTIVATION_RELU_OPERATOR_DESC

Availability

This operator was introduced in DML_FEATURE_LEVEL_1_0.

Tensor constraints

InputTensor and OutputTensor must have the same DataType, DimensionCount, and Sizes.

Tensor support

DML_FEATURE_LEVEL_3_0 and above

Tensor        Kind    Supported dimension counts  Supported data types
InputTensor   Input   1 to 8                      FLOAT32, FLOAT16
OutputTensor  Output  1 to 8                      FLOAT32, FLOAT16

DML_FEATURE_LEVEL_2_0 and above

Tensor        Kind    Supported dimension counts  Supported data types
InputTensor   Input   4 to 5                      FLOAT32, FLOAT16
OutputTensor  Output  4 to 5                      FLOAT32, FLOAT16

DML_FEATURE_LEVEL_1_0 and above

Tensor        Kind    Supported dimension counts  Supported data types
InputTensor   Input   4                           FLOAT32, FLOAT16
OutputTensor  Output  4                           FLOAT32, FLOAT16