Developer Guide for Intel® Data Analytics Acceleration Library 2019 Update 4
The forward two-dimensional (2D) transposed convolution layer computes the tensor Y by applying a set of nKernels 2D kernels K of size m₃ × m₄ to the input tensor X. For more details, refer to 2D Transposed Convolution Forward Layer.
The backward 2D transposed convolution layer computes the derivatives of the objective function E.
The problem is to compute the gradient tensor Z = ∂E/∂X with respect to the input of the forward layer, given the input gradient tensor G computed on the preceding layer.
For the notation used in this formula, refer to 2D Convolution Forward Layer.
The computation flow in the backward 2D transposed convolution layer is identical to the computation of the gradient in the 2D convolution forward layer, except for the following notation changes:
| 2D Convolution Forward Layer | 2D Transposed Convolution Backward Layer |
|---|---|
| Input tensor X | Input gradient tensor G |
| Values tensor Y | Gradient tensor Z |
| nKernels | l₂ |
| n₂ | nKernels |
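The equivalence described by the table above can be sketched numerically. The following is a minimal NumPy illustration, not DAAL API code: it assumes a single channel, one kernel, stride 1, and no padding, and the helper names `conv2d_valid` and `transposed_conv2d` are hypothetical. It checks that the backward pass of a transposed convolution (the gradient Z with respect to its input X, given an input gradient G) reproduces the forward 2D convolution flow applied to G.

```python
import numpy as np

def conv2d_valid(x, k):
    # "Valid" 2D cross-correlation: the forward 2D convolution computation flow.
    p, q = x.shape[0] - k.shape[0] + 1, x.shape[1] - k.shape[1] + 1
    out = np.zeros((p, q))
    for i in range(p):
        for j in range(q):
            out[i, j] = np.sum(x[i:i + k.shape[0], j:j + k.shape[1]] * k)
    return out

def transposed_conv2d(x, k):
    # Forward 2D transposed convolution: each input element is scattered
    # through the kernel (stride 1, no padding).
    m0, m1 = k.shape
    out = np.zeros((x.shape[0] + m0 - 1, x.shape[1] + m1 - 1))
    for i in range(x.shape[0]):
        for j in range(x.shape[1]):
            out[i:i + m0, j:j + m1] += x[i, j] * k
    return out

rng = np.random.default_rng(0)
x = rng.standard_normal((4, 4))   # forward-layer input X
k = rng.standard_normal((3, 3))   # kernel K of size m3 x m4
g = rng.standard_normal((6, 6))   # input gradient G, same shape as Y

# Reference backward pass: the layer is linear in X, so
# Z[i, j] = sum over G * d(Y)/d(X[i, j]), accumulated one unit input at a time.
z_ref = np.zeros_like(x)
for i in range(x.shape[0]):
    for j in range(x.shape[1]):
        e = np.zeros_like(x)
        e[i, j] = 1.0
        z_ref[i, j] = np.sum(g * transposed_conv2d(e, k))

# The notation change from the table: run the forward convolution flow on G.
z = conv2d_valid(g, k)
assert np.allclose(z, z_ref)
```

The explicit per-element gradient loop is deliberately naive; it exists only to verify, independently of the claimed identity, that applying the forward convolution flow to G yields the gradient tensor Z.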