C++ API Reference for Intel® Data Analytics Acceleration Library 2018 Update 2

DistributedPartialResult Class Reference

Provides methods to access the partial result obtained with the compute() method of the neural network training algorithm in the distributed processing mode.
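A minimal usage sketch of how such a partial result is typically obtained on the master node. The algorithm class training::Distributed<step2Master>, its getPartialResult() accessor, the default constructor, and the omitted configuration steps are assumptions based on the general DAAL distributed-processing pattern and may differ between library versions.

#include "daal.h"

using namespace daal;
using namespace daal::algorithms::neural_networks;

int main()
{
    // Assumed pattern: the step-2 master algorithm aggregates the partial
    // results produced by the step-1 local algorithms on the worker nodes.
    training::Distributed<step2Master> netMaster;

    /* ... configure the topology and parameters, and add the partial
       results received from the local nodes (omitted here) ... */

    netMaster.compute();

    // The object documented on this page: the partial result of the
    // distributed computation on the master node.
    services::SharedPtr<training::DistributedPartialResult> partialResult =
        netMaster.getPartialResult();

    return 0;
}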

Additional Inherited Members

- Static Protected Member Functions inherited from Argument
  - static data_management::DataCollectionPtr & getStorage (Argument &a)
  - static const data_management::DataCollectionPtr & getStorage (const Argument &a)


Member Function Documentation

◆ allocate()

DAAL_EXPORT services::Status allocate ( const daal::algorithms::Input * input,
                                        const daal::algorithms::Parameter * parameter,
                                        const int method
                                      )

Registers user-allocated memory to store partial results of the neural network model-based training.

Parameters
[in] input      Pointer to an object containing the input data
[in] parameter  Parameter of the neural network training
[in] method     Computation method for the algorithm

Returns
Status of computations
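A hedged sketch of calling allocate() directly; in typical use the library allocates the partial result itself when the algorithm runs. The input, parameter, and method names below are hypothetical stand-ins for the step-2 master algorithm's actual input object, parameter object, and computation method.

// Continuing the sketch above (hypothetical input/parameter/method stand-ins).
training::DistributedPartialResult partialResult;
services::Status st = partialResult.allocate(&input, &parameter, method);
// Note: some DAAL versions expose allocate() as a template over the
// floating-point type, e.g. allocate<float>(...); this call follows the
// signature documented on this page.
if (!st.ok()) { /* handle the allocation error */ }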

◆ check()

virtual services::Status check ( const daal::algorithms::Input * input,
                                 const daal::algorithms::Parameter * par,
                                 int method
                               ) const

Checks the partial result of the neural network training algorithm.

Parameters
[in] input   Input object of the algorithm
[in] par     Parameter of the algorithm
[in] method  Computation method

Returns
Status of computations

Reimplemented from PartialResult.
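A brief sketch of validating a partial result by hand, continuing the assumptions above (the input, parameter, and method stand-ins are hypothetical); normally the library performs this check internally.

// Continuing the sketch above: partialResult was obtained from the master
// algorithm; input, parameter, and method are hypothetical stand-ins.
services::Status st = partialResult->check(&input, &parameter, method);
if (!st.ok())
{
    /* handle the validation error, e.g. report st.getDescription() */
}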

◆ get()

training::ResultPtr get ( Step2MasterPartialResultId  id) const

Returns the partial result of the neural network model based training

Parameters
[in]idIdentifier of the partial result
Returns
Partial result that corresponds to the given identifier
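Continuing the sketch above, a hedged example of retrieving the training result on the master node; the identifier name resultFromMaster is an assumption for the Step2MasterPartialResultId enumeration, and the follow-up query for the trained model is likewise assumed.

// Retrieve the training result stored in the partial result
// (resultFromMaster is an assumed identifier name).
training::ResultPtr result = partialResult->get(training::resultFromMaster);

// The result can then be queried further, e.g. for the trained model
// (training::model is the usual result identifier in DAAL, assumed here).
training::ModelPtr model = result->get(training::model);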

◆ set()

void set ( Step2MasterPartialResultId id,
           const training::ResultPtr & value
         )

Sets the partial result of the neural network model-based training.

Parameters
[in] id     Identifier of the partial result
[in] value  Partial result
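A corresponding sketch for set(), again assuming the identifier name resultFromMaster; constructing a training::Result directly like this is illustrative only.

// Store a training result object in the partial result under the assumed id.
training::ResultPtr trainingResult(new training::Result());
partialResult->set(training::resultFromMaster, trainingResult);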

