C++ API Reference for Intel® Data Analytics Acceleration Library 2020 Update 1

Batch< algorithmFPType, method > Class Template Reference

Provides methods to run implementations of KD-tree based kNN model-based prediction in the batch processing mode.

Class Declaration

template<typename algorithmFPType = DAAL_ALGORITHM_FP_TYPE, Method method = defaultDense>
class daal::algorithms::kdtree_knn_classification::prediction::interface2::Batch< algorithmFPType, method >

Template Parameters
algorithmFPType  Data type to use in intermediate computations for KD-tree based kNN model-based prediction in the batch processing mode, double or float
method  Computation method in the batch processing mode, Method
Enumerations
  • Method Computation methods for KD-tree based kNN model-based prediction
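A typical batch prediction run with this class might look like the following sketch. Here testData (a NumericTablePtr holding the observations to classify) and trainedModel (the model produced by the KD-tree based kNN training stage) are assumed to come from earlier steps, and the header name and ModelPtr typedef are assumed from standard DAAL conventions; this is an illustrative sketch, not a verbatim DAAL example.

```cpp
#include "daal.h"

using namespace daal::algorithms;
using namespace daal::data_management;

// Sketch of a batch prediction run. testData and trainedModel are assumed to
// come from earlier data-loading and training steps (not shown here).
void predict(const NumericTablePtr & testData,
             const kdtree_knn_classification::ModelPtr & trainedModel)
{
    // Instantiate with the default template arguments:
    // algorithmFPType = DAAL_ALGORITHM_FP_TYPE, method = defaultDense
    kdtree_knn_classification::prediction::Batch<> algorithm;

    // Fill the public input member (see Member Data Documentation below)
    algorithm.input.set(classifier::prediction::data, testData);
    algorithm.input.set(classifier::prediction::model, trainedModel);

    // Run prediction and retrieve the predicted labels
    algorithm.compute();
    classifier::prediction::ResultPtr result = algorithm.getResult();
    NumericTablePtr predictedLabels = result->get(classifier::prediction::prediction);
}
```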

Constructor & Destructor Documentation

Batch ( )
inline

Default constructor

Batch ( const Batch< algorithmFPType, method > &  other)
inline

Constructs a KD-tree based kNN prediction algorithm by copying input objects and parameters of another KD-tree based kNN prediction algorithm

Parameters
[in]  other  Algorithm to use as the source to initialize the input objects and parameters of the algorithm

Member Function Documentation

services::SharedPtr<Batch<algorithmFPType, method> > clone ( ) const
inline

Returns a pointer to the newly allocated KD-tree based kNN prediction algorithm with a copy of input objects of this KD-tree based kNN prediction algorithm

Returns
Pointer to the newly allocated algorithm

InputType* getInput ( )
inline

Get input objects for the KD-tree based kNN prediction algorithm

Returns
Input objects for the KD-tree based kNN prediction algorithm
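getInput() returns a pointer to the same input structure that is exposed through the public input member, so either access path may be used to fill the algorithm's inputs; testData is assumed to be an existing NumericTablePtr, and this fragment is a sketch rather than a complete program:

```cpp
// Equivalent ways to supply the test data (testData assumed to exist):
kdtree_knn_classification::prediction::Batch<float> algorithm;
algorithm.getInput()->set(classifier::prediction::data, testData);
// ...or directly through the public member:
algorithm.input.set(classifier::prediction::data, testData);
```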

virtual int getMethod ( ) const
inlinevirtual

Returns the method of the algorithm

Returns
Method of the algorithm

Member Data Documentation

InputType input

Input data structure

ParameterType parameter

Parameters of prediction


