Java* API Reference for Intel® Data Analytics Acceleration Library 2018 Update 3

TrainingOnline Class Reference

Provides methods for linear regression model-based training in the online processing mode.

Class Constructor

TrainingOnline(DaalContext context, TrainingOnline other)

Constructs a linear regression training algorithm by copying input objects and parameters of another linear regression training algorithm in the online processing mode

Parameters
  context  Context to manage linear regression model-based training
  other    Algorithm to use as the source to initialize the input objects and parameters of the algorithm
TrainingOnline(DaalContext context, Class<? extends Number> cls, TrainingMethod method)

Constructs the linear regression training algorithm in the online processing mode

Parameters
  context  Context to manage linear regression model-based training
  cls      Data type to use in intermediate computations of linear regression, Double.class or Float.class
  method   Algorithm computation method, TrainingMethod
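A minimal sketch of constructing the algorithm with each constructor. The package layout (com.intel.daal.*) and the TrainingMethod.defaultDense value are assumptions based on the typical DAAL Java API and may differ between library versions.

```java
// Sketch: constructing the online training algorithm. Package names and
// TrainingMethod.defaultDense are assumed from the DAAL Java API.
import com.intel.daal.algorithms.linear_regression.training.TrainingMethod;
import com.intel.daal.algorithms.linear_regression.training.TrainingOnline;
import com.intel.daal.services.DaalContext;

public class CreateTrainingOnline {
    public static void main(String[] args) {
        DaalContext context = new DaalContext();

        // Use double precision for intermediate computations and the
        // default computation method
        TrainingOnline algorithm =
            new TrainingOnline(context, Double.class, TrainingMethod.defaultDense);

        // The copy constructor creates an independent algorithm initialized
        // with the same input objects and parameters
        TrainingOnline copy = new TrainingOnline(context, algorithm);

        context.dispose();  // release native resources held by the context
    }
}
```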


Member Function Documentation

TrainingOnline clone(DaalContext context)

Returns a newly allocated linear regression training algorithm with a copy of the input objects and parameters of this linear regression training algorithm in the online processing mode

Parameters
  context  Context to manage linear regression model-based training

Returns
  Newly allocated algorithm

PartialResult compute()

Computes a partial result of linear regression model-based training

Returns
  Partial result of linear regression model-based training

TrainingResult finalizeCompute()

Computes the result of linear regression model-based training

Returns
  Result of linear regression model-based training
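The two methods above are used together in the online pattern: compute() is called once per incoming data block to update the partial result, and finalizeCompute() merges the accumulated partial results into the trained model. A hedged sketch of that loop, with identifiers such as TrainingInputId and TrainingResultId assumed from the DAAL Java API:

```java
// Sketch of the online processing loop: feed data blocks via compute(),
// then obtain the final model with finalizeCompute(). TrainingInputId,
// TrainingResultId, and the package layout are assumed from the DAAL Java API.
import com.intel.daal.algorithms.linear_regression.Model;
import com.intel.daal.algorithms.linear_regression.training.*;
import com.intel.daal.data_management.data.NumericTable;
import com.intel.daal.services.DaalContext;

public class OnlineTrainingLoop {
    // featureBlocks[i] and labelBlocks[i] hold one block of the training set
    static Model train(DaalContext context,
                       NumericTable[] featureBlocks,
                       NumericTable[] labelBlocks) {
        TrainingOnline algorithm =
            new TrainingOnline(context, Float.class, TrainingMethod.defaultDense);

        for (int i = 0; i < featureBlocks.length; i++) {
            algorithm.input.set(TrainingInputId.data, featureBlocks[i]);
            algorithm.input.set(TrainingInputId.dependentVariable, labelBlocks[i]);
            algorithm.compute();  // updates the partial result with this block
        }

        // Merge the accumulated partial results into the trained model
        TrainingResult result = algorithm.finalizeCompute();
        return result.get(TrainingResultId.model);
    }
}
```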

Member Data Documentation

Input input

Input data

TrainingMethod method

Training method for the algorithm

Parameter parameter

Parameters of the algorithm
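The public input and parameter members are the points through which data and settings are attached before compute() is called. A brief sketch, assuming the TrainingInputId identifiers and the Parameter method setInterceptFlag from the DAAL Java API:

```java
// Sketch: configuring the algorithm through its public members.
// TrainingInputId and Parameter.setInterceptFlag are assumed from the
// DAAL Java API.
import com.intel.daal.algorithms.linear_regression.training.*;
import com.intel.daal.data_management.data.NumericTable;
import com.intel.daal.services.DaalContext;

public class ConfigureTraining {
    static void configure(DaalContext context, TrainingOnline algorithm,
                          NumericTable dataBlock, NumericTable labelBlock) {
        // Attach one block of training data through the public input member
        algorithm.input.set(TrainingInputId.data, dataBlock);
        algorithm.input.set(TrainingInputId.dependentVariable, labelBlock);

        // Tune the model through the public parameter member,
        // e.g. train without an intercept term
        algorithm.parameter.setInterceptFlag(false);
    }
}
```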


The documentation for this class was generated from the following file:
