The backward pReLU layer accepts the input described below. Pass the Input ID as a parameter to the methods that provide input for your algorithm. For more details, see Algorithms.
| Input ID | Input |
|---|---|
| inputGradient | Pointer to the tensor of size n1 x n2 x ... x np that stores the input gradient computed on the preceding layer. This input can be an object of any class derived from Tensor. |
| inputFromForward | Collection of data needed for the backward pReLU layer, with the elements listed below. This collection can contain objects of any class derived from Tensor. |

| Element ID | Element |
|---|---|
| auxData | Pointer to the tensor of size n1 x n2 x ... x np that stores the input data for the forward pReLU layer. This input can be an object of any class derived from Tensor. |
| auxWeights | Pointer to the tensor of size nk x nk+1 x ... x nk+q-1 that stores the weights for the forward pReLU layer. This input can be an object of any class derived from Tensor. |
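The following C++ sketch shows one way to supply these inputs, assuming the usual layout of the library's layer classes under daal::algorithms::neural_networks::layers::prelu and a forward pReLU layer whose result has already been computed; the helper-function name and the SharedPtr/TensorPtr typedefs are illustrative assumptions, not taken from this page.

```cpp
#include "daal.h"

using namespace daal::algorithms::neural_networks::layers;
using namespace daal::data_management;
using namespace daal::services;

/* Illustrative helper: wire both inputs of the backward pReLU layer.
   forwardResult is assumed to be the result of an already computed forward
   pReLU layer; nextLayerGradient is the gradient tensor produced by the layer
   that follows pReLU in the network topology. */
static void setBackwardPreluInputs(prelu::backward::Batch<> &preluBack,
                                   const SharedPtr<prelu::forward::Result> &forwardResult,
                                   const TensorPtr &nextLayerGradient)
{
    /* inputGradient: tensor of size n1 x n2 x ... x np */
    preluBack.input.set(backward::inputGradient, nextLayerGradient);

    /* inputFromForward: collection holding auxData and auxWeights
       saved by the forward pReLU layer */
    preluBack.input.set(backward::inputFromForward,
                        forwardResult->get(forward::resultForBackward));
}
```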
For common parameters of neural network layers, see Common Parameters.
In addition to the common parameters, the backward pReLU layer has the following parameters:
| Parameter | Default Value | Description |
|---|---|---|
| algorithmFPType | float | The floating-point type that the algorithm uses for intermediate computations. Can be float or double. |
| method | defaultDense | Performance-oriented computation method, the only method supported by the layer. |
| dataDimension | 0 | Index of type size_t of the first data dimension to which the weights are applied. |
| weightsDimension | 1 | Number of weight dimensions, of type size_t. |
| propagateGradient | false | Flag that specifies whether the backward layer propagates the gradient. |
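As a sketch of how these parameters might be set in C++, the member names below match the table above, while the Batch<algorithmFPType, method> template layout and the public parameter member follow the library's usual pattern and are assumptions of this example.

```cpp
#include "daal.h"

using namespace daal::algorithms::neural_networks::layers;

/* Illustrative configuration of a backward pReLU layer; the template
   arguments shown are the documented defaults (float, defaultDense). */
static void configureBackwardPrelu(prelu::backward::Batch<float, prelu::defaultDense> &preluBack)
{
    preluBack.parameter.dataDimension     = 0;    /* first data dimension the weights apply to          */
    preluBack.parameter.weightsDimension  = 1;    /* number of weight dimensions                        */
    preluBack.parameter.propagateGradient = true; /* also compute the gradient for the preceding layer  */
}
```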
The backward pReLU layer calculates the result described below. Pass the Result ID as a parameter to the methods that access the results of your algorithm. For more details, see Algorithms.
| Result ID | Result |
|---|---|
| gradient | Pointer to the tensor of size n1 x n2 x ... x np that stores the result of the backward pReLU layer. This result can be an object of any class derived from Tensor. |
| weightDerivatives | Pointer to the tensor of size nk x nk+1 x ... x nk+q-1 that stores the derivatives ∂E/∂w(ik...ik+q-1) of the error function with respect to the weights, computed by the backward pReLU layer. This result can be an object of any class derived from Tensor. |
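Putting the pieces together, the sketch below computes the layer and reads both results through the library's usual compute/getResult pattern; the function name is illustrative, and inputs and parameters are assumed to have been set as shown earlier. For complete, validated code, see the examples listed at the end of this section.

```cpp
#include "daal.h"

using namespace daal::algorithms::neural_networks::layers;
using namespace daal::data_management;
using namespace daal::services;

/* Illustrative: run the backward pReLU layer and read both documented results. */
static void computeBackwardPrelu(prelu::backward::Batch<> &preluBack)
{
    preluBack.compute();

    SharedPtr<prelu::backward::Result> result = preluBack.getResult();

    /* gradient: tensor of size n1 x n2 x ... x np passed to the preceding layer */
    TensorPtr gradientTensor = result->get(backward::gradient);

    /* weightDerivatives: tensor of size nk x nk+1 x ... x nk+q-1 with the dE/dw values */
    TensorPtr weightDerivativesTensor = result->get(backward::weightDerivatives);
}
```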
C++: prelu_layer_dense_batch.cpp
Java*: PReLULayerDenseBatch.java
Python*: prelu_layer_dense_batch.py