Java* API Reference for Intel® Data Analytics Acceleration Library 2018 Update 2

Parameter Class Reference

Base class for parameters of the LogitBoost training algorithm. More...

Detailed Description

Member Function Documentation

◆ getAccuracyThreshold()

double getAccuracyThreshold ( )

Retrieves the accuracy threshold of the LogitBoost training algorithm

Returns
Accuracy threshold of the LogitBoost training algorithm

◆ getMaxIterations()

long getMaxIterations ( )

Retrieves the maximum number of iterations of the LogitBoost training algorithm

Returns
Maximum number of iterations

◆ getResponsesDegenerateCasesThreshold()

double getResponsesDegenerateCasesThreshold ( )

Retrieves the threshold needed to avoid degenerate cases when calculating responses Z

Returns
The threshold

◆ getWeightsDegenerateCasesThreshold()

double getWeightsDegenerateCasesThreshold ( )

Retrieves the threshold needed to avoid degenerate cases when calculating weights W

Returns
The threshold
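The two degenerate-case thresholds guard LogitBoost's working-response computation (Friedman, Hastie & Tibshirani), where the weights w = p(1 - p) and responses z = (y - p) / (p(1 - p)) degenerate as the predicted probability p approaches 0 or 1. The sketch below is illustrative only, not DAAL source code: the clamping strategy and the threshold values used are assumptions about how such thresholds are commonly applied.

```java
// Illustrative sketch (not DAAL code): one common way degenerate-case
// thresholds are applied when computing LogitBoost weights W and
// responses Z. Threshold values here are arbitrary examples.
public class LogitBoostStep {
    // w = p(1 - p) collapses to 0 as p nears 0 or 1;
    // clamp it from below so later weighted fits stay stable.
    static double weight(double p, double weightsThreshold) {
        return Math.max(p * (1.0 - p), weightsThreshold);
    }

    // z = (y - p) / (p(1 - p)) blows up near p = 0 or 1;
    // bound the denominator away from zero with the threshold.
    static double response(double y, double p, double responsesThreshold) {
        double denom = Math.max(p * (1.0 - p), responsesThreshold);
        return (y - p) / denom;
    }

    public static void main(String[] args) {
        double p = 1e-9;  // near-degenerate probability
        System.out.println(weight(p, 1e-6));        // clamped up to 1e-6
        System.out.println(response(1.0, p, 1e-6)); // bounded, no overflow
    }
}
```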

◆ setAccuracyThreshold()

void setAccuracyThreshold ( double  accuracyThreshold)

Sets the accuracy threshold of the LogitBoost training algorithm

Parameters
accuracyThreshold	Accuracy threshold of the LogitBoost training algorithm

◆ setMaxIterations()

void setMaxIterations ( long  maxIterations)

Sets the maximum number of iterations of the LogitBoost training algorithm

Parameters
maxIterations	Maximum number of iterations

◆ setResponsesDegenerateCasesThreshold()

void setResponsesDegenerateCasesThreshold ( double  responsesDegenerateCasesThreshold)

Sets the threshold to avoid degenerate cases when calculating responses Z

Parameters
responsesDegenerateCasesThreshold	The threshold

◆ setWeightsDegenerateCasesThreshold()

void setWeightsDegenerateCasesThreshold ( double  weightsDegenerateCasesThreshold)

Sets the threshold to avoid degenerate cases when calculating weights W

Parameters
weightsDegenerateCasesThreshold	The threshold
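accuracyThreshold and maxIterations act as complementary stopping criteria: training stops either when progress per iteration drops below the accuracy threshold or when the iteration cap is reached, whichever comes first. The loop below is a minimal illustration of that interaction with a synthetic loss sequence; it is not the DAAL implementation, and the convergence test used is an assumption.

```java
// Illustrative sketch (not DAAL code): interaction of accuracyThreshold
// and maxIterations as stopping criteria in boosting-style training.
public class StoppingDemo {
    static long runTraining(double accuracyThreshold, long maxIterations) {
        double loss = 1.0;  // synthetic training loss
        long iter = 0;
        while (iter < maxIterations) {          // hard cap on iterations
            double newLoss = loss * 0.5;        // synthetic improvement
            iter++;
            boolean converged = (loss - newLoss) < accuracyThreshold;
            loss = newLoss;
            if (converged) {
                break;                          // accuracy criterion met
            }
        }
        return iter;
    }

    public static void main(String[] args) {
        // Very tight threshold: the iteration cap stops training.
        System.out.println(runTraining(1e-30, 10)); // 10
        // Loose threshold: training stops early once gains are small.
        System.out.println(runTraining(0.3, 10));   // 2
    }
}
```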

The documentation for this class was generated from the following file:
