The smooth rectifier linear unit (SmoothReLU) activation layer applies the transform f(x) = log(1 + exp(x)) to the input data. The backward SmoothReLU layer computes the values z = y * f'(x), where y is the input gradient computed on the preceding layer and f'(x) = 1/(1 + exp(-x)) is the derivative of the forward transform, evaluated at the input x of the corresponding forward layer.
Given p-dimensional tensors $X$ and $Y$ of size $n_1 \times n_2 \times \ldots \times n_p$, the problem is to compute the p-dimensional tensor $Z = (z_{i_1 \ldots i_p})$ of size $n_1 \times n_2 \times \ldots \times n_p$ such that:

$$z_{i_1 \ldots i_p} = y_{i_1 \ldots i_p} \cdot \frac{1}{1 + \exp\left(-x_{i_1 \ldots i_p}\right)}$$
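As a minimal C++ sketch of the formula above (a plain illustration of the math, not the Intel DAAL API), the following code applies the backward SmoothReLU transform elementwise; the names backwardSmoothReLU, x, y, and z are hypothetical and stand for the forward-layer input, the input gradient, and the result, with the tensors flattened to arrays of their $n_1 \cdot n_2 \cdot \ldots \cdot n_p$ elements.

#include <cmath>
#include <cstddef>
#include <iostream>
#include <vector>

// Backward SmoothReLU: z[i] = y[i] * f'(x[i]), where f(x) = log(1 + exp(x)),
// so f'(x) = 1 / (1 + exp(-x)) (the logistic sigmoid).
std::vector<double> backwardSmoothReLU(const std::vector<double>& x,
                                       const std::vector<double>& y) {
    std::vector<double> z(x.size());
    for (std::size_t i = 0; i < x.size(); ++i) {
        z[i] = y[i] / (1.0 + std::exp(-x[i]));
    }
    return z;
}

int main() {
    // x: input of the corresponding forward layer; y: incoming gradient.
    std::vector<double> x = {-2.0, 0.0, 3.0};
    std::vector<double> y = {1.0, 1.0, 1.0};
    std::vector<double> z = backwardSmoothReLU(x, y);
    for (double v : z) std::cout << v << ' ';  // prints roughly 0.119 0.5 0.953
    std::cout << '\n';
}

Because f'(x) is the logistic sigmoid, the gradient passed backward is simply the incoming gradient scaled by a factor between 0 and 1 that approaches 1 for large positive inputs and 0 for large negative inputs.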