C++ API Reference for Intel® Data Analytics Acceleration Library 2018 Update 3


Contains classes for the forward stage of the neural network layer. More...

Namespaces

 daal::algorithms::neural_networks::layers::forward
 Contains classes for the forward stage of the neural network layer.
 
 daal::algorithms::neural_networks::layers::forward::interface1
 Contains version 1.0 of the Intel® Data Analytics Acceleration Library (Intel® DAAL) interface.
 

Classes

class  LayerIface
 Abstract class that defines the interface for the layer. More...
 
class  LayerIfaceImpl
 Implements the abstract interface LayerIface. LayerIfaceImpl is, in turn, the base class for the classes that interface the forward layers. More...
 
class  LayerContainerIfaceImpl
 Provides methods of the base container for forward layers. More...
 
class  LayerDescriptor
 Class that defines the descriptor for a layer on the forward stage. More...
 
class  InputIface
 Abstract class that specifies the interface of the input objects for the neural network layer algorithm. More...
 
class  Input
 Input objects for the layer algorithm. More...
 
class  Result
 Provides methods to access the result obtained with the compute() method of the layer algorithm. More...
 
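The classes above are normally used through a concrete layer type rather than directly. The following is a minimal sketch, not part of this reference page, that runs the forward stage of a fully-connected layer; the fullyconnected layer headers, the "daal.h" umbrella header, and the HomogenTensor constructor that takes an initial value are assumed to be available in this DAAL release.

#include "daal.h"

using namespace daal;
using namespace daal::algorithms::neural_networks;
using namespace daal::data_management;

int main()
{
    /* Input tensor of shape 4 x 3, filled with 1.0f (assumes the HomogenTensor
       constructor that accepts an initial value) */
    services::Collection<size_t> dims;
    dims.push_back(4);
    dims.push_back(3);
    services::SharedPtr<Tensor> dataTensor(
        new HomogenTensor<float>(dims, Tensor::doAllocate, 1.0f));

    /* Forward stage of a fully-connected layer with 2 outputs; the layer
       implements layers::forward::LayerIface through LayerIfaceImpl */
    const size_t nOutputs = 2;
    layers::fullyconnected::forward::Batch<> forwardLayer(nOutputs);

    /* Set the input tensor using the layers::forward::data identifier (class Input) */
    forwardLayer.input.set(layers::forward::data, dataTensor);

    /* Run the forward computation */
    forwardLayer.compute();

    /* Access the results (class Result) through the ResultId and
       ResultLayerDataId identifiers documented below */
    services::SharedPtr<layers::forward::Result> forwardResult = forwardLayer.getResult();
    services::SharedPtr<Tensor> value = forwardResult->get(layers::forward::value);
    layers::LayerDataPtr forBackward = forwardResult->get(layers::forward::resultForBackward);

    return 0;
}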

Enumerations

enum  InputId { data }
 
enum  InputLayerDataId
 
enum  ResultId { value }
 
enum  ResultLayerDataId { resultForBackward = lastResultId + 1 }
 

Enumeration Type Documentation

enum InputId

Available identifiers of input objects for the layer algorithm

Enumerator
data 

Input data

enum InputLayerDataId

Available identifiers of input objects for the layer algorithm

enum ResultId

Available identifiers of results for the layer algorithm

Enumerator
value 

Tensor that stores the result of the forward stage of the layer

enum ResultLayerDataId

Available identifiers of results for the layer algorithm

Enumerator
resultForBackward 

Data for backward step
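
The collection stored under resultForBackward is what the matching backward layer consumes. The fragment below is a hypothetical sketch, assuming the backward-side identifiers layers::backward::inputGradient and layers::backward::inputFromForward from the same DAAL release; the helper name runBackward is illustrative only.

#include "daal.h"

using namespace daal;
using namespace daal::algorithms::neural_networks;
using namespace daal::data_management;

/* Hypothetical helper: pass the data saved under resultForBackward to the
   matching backward fully-connected layer, together with the gradient
   propagated from the next layer */
static void runBackward(const services::SharedPtr<layers::forward::Result> &forwardResult,
                        const services::SharedPtr<Tensor> &inputGradient,
                        size_t nOutputs)
{
    layers::fullyconnected::backward::Batch<> backwardLayer(nOutputs);

    /* Gradient of the objective with respect to this layer's output */
    backwardLayer.input.set(layers::backward::inputGradient, inputGradient);

    /* Data stored by the forward stage under the resultForBackward identifier */
    backwardLayer.input.set(layers::backward::inputFromForward,
                            forwardResult->get(layers::forward::resultForBackward));

    backwardLayer.compute();
}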
