The rectifier linear unit (ReLU) activation layer applies the transform f(x) = max(0, x) to the input data. The backward ReLU layer computes the value z = y * f'(x), where y is the input gradient computed on the preceding layer and f'(x) = 1 if x > 0, f'(x) = 0 if x ≤ 0.
Given p-dimensional tensors X and Y of size n_1 × n_2 × ... × n_p, the problem is to compute a p-dimensional tensor Z = (z_{i_1...i_p}) of size n_1 × n_2 × ... × n_p, where:
z_{i_1...i_p} = y_{i_1...i_p} if x_{i_1...i_p} > 0, and z_{i_1...i_p} = 0 if x_{i_1...i_p} ≤ 0.
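As an illustration of this element-wise rule (independent of the library API), the following minimal C++ sketch applies the backward ReLU computation to flattened tensors X and Y. The function name backwardRelu and the variable names x, y, and z are assumptions made for this example; they are not Intel DAAL identifiers.

```cpp
#include <cstddef>
#include <iostream>
#include <vector>

// Illustrative sketch only: applies the backward ReLU rule element by element
// to flattened p-dimensional tensors stored as std::vector. The helper
// backwardRelu is hypothetical and not part of the Intel DAAL API.
std::vector<float> backwardRelu(const std::vector<float>& x,   // forward input X
                                const std::vector<float>& y) { // input gradient Y
    std::vector<float> z(x.size());
    for (std::size_t i = 0; i < x.size(); ++i) {
        // z = y * f'(x), where f'(x) = 1 if x > 0 and 0 if x <= 0
        z[i] = (x[i] > 0.0f) ? y[i] : 0.0f;
    }
    return z;
}

int main() {
    // A 2 x 3 tensor flattened to 6 elements
    std::vector<float> x = {-1.0f, 2.0f, 0.0f, 3.5f, -0.5f, 4.0f};
    std::vector<float> y = { 0.1f, 0.2f, 0.3f, 0.4f,  0.5f, 0.6f};

    std::vector<float> z = backwardRelu(x, y);
    for (float v : z) std::cout << v << ' ';   // prints: 0 0.2 0 0.4 0 0.6
    std::cout << '\n';
    return 0;
}
```

Note that the gradient is zeroed wherever the forward input was zero or negative, including exactly zero, matching the convention f'(x) = 0 for x ≤ 0 used above.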