Train Clustering Model VI

LabVIEW 2018 Analytics and Machine Learning Toolkit Help

Edition Date: July 2018

Part Number: 377059B-01


Owning Palette: Clustering VIs

Requires: Analytics and Machine Learning Toolkit

Trains a clustering model.


model in specifies the information about the entire workflow of the model.
clustering model info in specifies the initialized clustering model for training.

You can acquire an initialized clustering model from the following VIs:

Initialize Clustering Model (DBSCAN) VI
Initialize Clustering Model (GMM) VI
Initialize Clustering Model (K-Means) VI
error in describes error conditions that occur before this node runs. This input provides standard error in functionality.
model out returns the information about the entire workflow of the model. Wire model out to the reference input of a standard Property Node to get an AML Analytics Property Node.
clustering model info out returns the trained clustering model.

Wire clustering model info out to the reference input of a standard Property Node to get the properties of the trained clustering model. The following table displays the VI you wire to clustering model info in and the corresponding Property Node you get from clustering model info out.

VI Name                                   | Property Node
Initialize Clustering Model (DBSCAN) VI   | AML DBSCAN
Initialize Clustering Model (GMM) VI      | AML GMM
Initialize Clustering Model (K-Means) VI  | AML K-Means
predicted labels returns the predicted labels of training data.
metric returns the evaluation metric for the clustering model. This output is valid only if you used hyperparameter optimization to initialize the clustering model.

For the Davies Bouldin Index metric, the lower the value of metric, the better the compactness and separation of the clustering model.

For the Dunn Index metric, the higher the value of metric, the better the compactness and separation of the clustering model.

For the Jaccard Index metric, the higher the value of metric, the better the separation of the clustering model.

For the Rand Index metric, the higher the value of metric, the better the separation of the clustering model.

For the Akaike Information Criterion (AIC) metric, the lower the value of metric, the better the Gaussian mixture model (GMM) fits the data.

For the Bayesian Information Criterion (BIC) metric, the lower the value of metric, the better the GMM fits the data.
error out contains error information. This output provides standard error out functionality.
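The metric directions above (lower is better for Davies-Bouldin, AIC, and BIC; higher is better for the Rand Index) can be illustrated outside LabVIEW with scikit-learn, which implements several of these measures. This is a hedged sketch on synthetic data, not the toolkit's implementation; `davies_bouldin_score`, `rand_score`, and the GMM `aic`/`bic` methods are scikit-learn names, not AML Toolkit terminals.

```python
# Illustration of the metric directions using scikit-learn equivalents
# (an assumption-based analogue; the AML Toolkit computes these in LabVIEW).
import numpy as np
from sklearn.cluster import KMeans
from sklearn.metrics import davies_bouldin_score, rand_score
from sklearn.mixture import GaussianMixture

rng = np.random.default_rng(0)
# Two well-separated synthetic blobs of 50 points each.
X = np.vstack([rng.normal(0.0, 0.3, (50, 2)),
               rng.normal(3.0, 0.3, (50, 2))])
y_true = np.array([0] * 50 + [1] * 50)

labels = KMeans(n_clusters=2, n_init=10, random_state=0).fit_predict(X)

db = davies_bouldin_score(X, labels)  # lower = better compactness/separation
ri = rand_score(y_true, labels)       # higher = better agreement with truth

gmm = GaussianMixture(n_components=2, random_state=0).fit(X)
print(f"Davies-Bouldin: {db:.3f}")               # small for well-separated blobs
print(f"Rand Index:     {ri:.3f}")               # near 1.0 when clustering matches y_true
print(f"AIC: {gmm.aic(X):.1f}  BIC: {gmm.bic(X):.1f}")  # lower = better GMM fit
```

Note that the Rand and Jaccard indices compare predicted labels against known reference labels, so they apply only when ground-truth labels are available.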

Example

Refer to the Clustering (Search Parameters, Training) VI in the labview\examples\AML\Clustering directory for an example of using the Train Clustering Model VI.
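For readers more familiar with text-based tools, the initialize-then-train pattern this VI participates in can be sketched in Python with scikit-learn. This is a rough analogue under stated assumptions: `KMeans` and its parameters belong to scikit-learn, not to the toolkit, and the LabVIEW workflow wires `clustering model info` between VIs rather than calling methods.

```python
# Analogue of the Initialize Clustering Model (K-Means) -> Train Clustering
# Model workflow, sketched with scikit-learn (an assumption, not toolkit code).
import numpy as np
from sklearn.cluster import KMeans

rng = np.random.default_rng(1)
# Training data: two synthetic clusters of 40 points each.
X = np.vstack([rng.normal(-2.0, 0.5, (40, 2)),
               rng.normal(2.0, 0.5, (40, 2))])

# "Initialize" step: construct the model with its hyperparameters.
model = KMeans(n_clusters=2, max_iter=300, n_init=10, random_state=1)

# "Train" step: fit on the training data.
model.fit(X)

print(model.labels_[:5])       # analogue of the predicted labels output
print(model.cluster_centers_)  # trained model parameters
```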
