Developer Guide for Intel® Data Analytics Acceleration Library 2018 Update 2

Parametric Rectifier Linear Unit (pReLU) Backward Layer

The parametric rectifier linear unit (pReLU) activation layer applies the transform f(x) = max(0, x) + w * min(0, x) to the input data. The backward pReLU layer computes the values z = y * f'(x), where y is the input gradient computed on the preceding layer and w is the weight of the input argument.
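
Since f(x) = max(0, x) + w * min(0, x), its derivative is f'(x) = 1 for x > 0 and f'(x) = w otherwise, so the backward layer simply scales the incoming gradient element by element. The sketch below (plain C++, an illustration only and not the Intel DAAL API) shows this rule for a flattened tensor, assuming for simplicity a single scalar weight w shared by all elements; in the library the weights form a tensor W, as described in the problem statement below.

    #include <cstddef>
    #include <vector>

    // Illustrative element-wise backward pReLU for a flattened tensor.
    // f(x) = max(0, x) + w * min(0, x), so f'(x) = 1 if x > 0, else w.
    // The backward layer multiplies the incoming gradient y by f'(x).
    std::vector<float> preluBackward(const std::vector<float>& x,
                                     const std::vector<float>& y,
                                     float w)
    {
        std::vector<float> z(x.size());
        for (std::size_t i = 0; i < x.size(); ++i)
            z[i] = (x[i] > 0.0f) ? y[i] : w * y[i];
        return z;
    }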

Problem Statement

Given p-dimensional tensors X, Y, and W of size n_1 x n_2 x ... x n_p, the problem is to compute the p-dimensional tensor Z = (z_{i_1 ... i_p}) of size n_1 x n_2 x ... x n_p such that: