DML_ACTIVATION_ELU_OPERATOR_DESC structure (directml.h)

Performs an exponential linear unit (ELU) activation function on every element in InputTensor, placing the result into the corresponding element of OutputTensor.

f(x) = x,                    if x >= 0
       Alpha * (exp(x) - 1), otherwise

Where exp(x) is the natural exponentiation function.

This operator supports in-place execution, meaning that the output tensor is permitted to alias InputTensor during binding.

Syntax

struct DML_ACTIVATION_ELU_OPERATOR_DESC {
  const DML_TENSOR_DESC *InputTensor;
  const DML_TENSOR_DESC *OutputTensor;
  FLOAT                 Alpha;
};
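In use, this struct is wrapped in a DML_OPERATOR_DESC with type DML_OPERATOR_ACTIVATION_ELU and passed to IDMLDevice::CreateOperator. A minimal sketch follows; it assumes an existing IDMLDevice* named `dmlDevice` and a 4D FLOAT32 tensor of shape {1, 1, 2, 3}, and omits error handling:

```cpp
// Sketch only: dmlDevice and the tensor shape are assumptions for
// illustration. Error handling omitted.
UINT sizes[4] = { 1, 1, 2, 3 };

DML_BUFFER_TENSOR_DESC bufferDesc = {};
bufferDesc.DataType = DML_TENSOR_DATA_TYPE_FLOAT32;
bufferDesc.DimensionCount = 4;
bufferDesc.Sizes = sizes;
bufferDesc.TotalTensorSizeInBytes = 1 * 1 * 2 * 3 * sizeof(float);

DML_TENSOR_DESC tensorDesc = { DML_TENSOR_TYPE_BUFFER, &bufferDesc };

DML_ACTIVATION_ELU_OPERATOR_DESC eluDesc = {};
eluDesc.InputTensor = &tensorDesc;   // input and output descs match, as
eluDesc.OutputTensor = &tensorDesc;  // required by the tensor constraints
eluDesc.Alpha = 1.0f;

DML_OPERATOR_DESC opDesc = { DML_OPERATOR_ACTIVATION_ELU, &eluDesc };

Microsoft::WRL::ComPtr<IDMLOperator> op;
dmlDevice->CreateOperator(&opDesc, IID_PPV_ARGS(&op));
```

Reusing one DML_TENSOR_DESC for both members is a convenient way to satisfy the constraint that input and output share the same DataType, DimensionCount, and Sizes.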

Members

InputTensor

Type: const DML_TENSOR_DESC*

The input tensor to read from.

OutputTensor

Type: const DML_TENSOR_DESC*

The output tensor to write the results to.

Alpha

Type: FLOAT

The alpha coefficient. A typical default for this value is 1.0.

Availability

This operator was introduced in DML_FEATURE_LEVEL_1_0.

Tensor constraints

InputTensor and OutputTensor must have the same DataType, DimensionCount, and Sizes.

Tensor support

DML_FEATURE_LEVEL_3_0 and above

Tensor       Kind    Supported dimension counts  Supported data types
InputTensor  Input   1 to 8                      FLOAT32, FLOAT16
OutputTensor Output  1 to 8                      FLOAT32, FLOAT16

DML_FEATURE_LEVEL_2_0 and above

Tensor       Kind    Supported dimension counts  Supported data types
InputTensor  Input   4 to 5                      FLOAT32, FLOAT16
OutputTensor Output  4 to 5                      FLOAT32, FLOAT16

DML_FEATURE_LEVEL_1_0 and above

Tensor       Kind    Supported dimension counts  Supported data types
InputTensor  Input   4                           FLOAT32, FLOAT16
OutputTensor Output  4                           FLOAT32, FLOAT16

Requirements

Requirement  Value
Header       directml.h