Model Compare

Working with Model Compare

To start working with Model Compare, follow the steps given below.

  1. Go to the Home page and create a new workbook or open an existing workbook.
  2. Drag and drop the required dataset on the workbook canvas.
  3. In the Properties pane, select the Data Fields from the dataset.



  4. Select the classification or regression algorithm that you want to compare and connect it to the dataset node.
  5. Select the algorithm node and set its properties in the Properties pane.
    Here, we have selected the Dependent and Independent variables and the Advanced Properties of the MLP Neural Network algorithm.



  6. Click the algorithm and then click Run.
  7. Repeat steps 4 to 6 for all the algorithms that you want to compare.
    Here, we have selected the Decision Tree algorithm.



    Note:

    Make sure you select either Classification models or Regression models to compare. You cannot compare a Classification model with a Regression model.




  8. Drag and drop Model Compare onto the workbook canvas.
  9. Connect the selected algorithms to Model Compare.



  10. Select the Comparison Metrics for Model Compare.



    Note:

    For Regression models, the Comparison Metrics are:

    • RMSE
    • Adjusted R Square
    • R Square
    • MSE
    • MAE
    • MAPE
    • AIC
    • BIC

    For Classification models, the Comparison Metrics are:

    • Accuracy
    • Specificity
    • Sensitivity / Recall
    • F-Score
    • Precision
    • AUC

    If Cross Validation is added as a predecessor to any of the algorithms, the comparison metrics are Mean Accuracy and Standard Deviation of Accuracy.

    If Train Test Split is used as a predecessor to the algorithms, the algorithms are compared based on the test data. (The sketch below illustrates how these comparison metrics are commonly computed.)
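    Model Compare calculates all of these metrics for you; no code is needed inside Model Studio. Purely as an illustration, the following Python sketch shows how the same comparison metrics could be reproduced outside the tool with NumPy and scikit-learn (external libraries assumed for this example, not part of the product). The AIC/BIC lines use a common Gaussian-likelihood form, which may differ in detail from the tool's internal definition.

    import numpy as np
    from sklearn.metrics import (
        mean_squared_error, mean_absolute_error,
        mean_absolute_percentage_error, r2_score,
        accuracy_score, precision_score, recall_score,
        f1_score, roc_auc_score, confusion_matrix,
    )

    def regression_metrics(y_true, y_pred, n_features):
        """The eight regression comparison metrics listed above."""
        n = len(y_true)
        mse = mean_squared_error(y_true, y_pred)
        rss = mse * n                       # residual sum of squares
        r2 = r2_score(y_true, y_pred)
        k = n_features + 1                  # fitted parameters incl. intercept
        return {
            "RMSE": np.sqrt(mse),
            "MSE": mse,
            "MAE": mean_absolute_error(y_true, y_pred),
            "MAPE": mean_absolute_percentage_error(y_true, y_pred),
            "R Square": r2,
            "Adjusted R Square": 1 - (1 - r2) * (n - 1) / (n - k),
            # Common Gaussian-likelihood forms; exact definitions may vary.
            "AIC": n * np.log(rss / n) + 2 * k,
            "BIC": n * np.log(rss / n) + k * np.log(n),
        }

    def classification_metrics(y_true, y_pred, y_score):
        """The six classification comparison metrics (binary case)."""
        tn, fp, fn, tp = confusion_matrix(y_true, y_pred).ravel()
        return {
            "Accuracy": accuracy_score(y_true, y_pred),
            "Specificity": tn / (tn + fp),          # true-negative rate
            "Sensitivity / Recall": recall_score(y_true, y_pred),
            "F-Score": f1_score(y_true, y_pred),
            "Precision": precision_score(y_true, y_pred),
            "AUC": roc_auc_score(y_true, y_score),  # needs scores/probabilities
        }

    # With a Cross Validation predecessor, the comparison uses
    # Mean Accuracy and Standard Deviation of Accuracy, e.g.:
    #   scores = cross_val_score(model, X, y, cv=5)
    #   scores.mean(), scores.std()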

  11. Select the Model Compare node, then click Run.
    The node execution starts, and a confirmation message is displayed when it completes.
  12. After the Model Compare node execution is complete, click Explore.
    The result page is displayed.
    The result page shows the metrics of both models, sorted by the selected performance metrics.




    As seen in the figure above, the recommended model in this example is Decision Tree.
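    To see the end-to-end flow in one place, the following hedged sketch mimics the walkthrough outside the tool: it trains the same two model types used in this example (MLP Neural Network and Decision Tree), scores them on the held-out test data, and sorts them by Accuracy, the way the result page does. The dataset, the 80/20 split, and the hyperparameters are illustrative assumptions only.

    from sklearn.datasets import load_breast_cancer
    from sklearn.model_selection import train_test_split
    from sklearn.pipeline import make_pipeline
    from sklearn.preprocessing import StandardScaler
    from sklearn.neural_network import MLPClassifier
    from sklearn.tree import DecisionTreeClassifier
    from sklearn.metrics import accuracy_score

    # Stand-in for the workbook dataset and the Train Test Split node.
    X, y = load_breast_cancer(return_X_y=True)
    X_train, X_test, y_train, y_test = train_test_split(
        X, y, test_size=0.2, random_state=42)

    models = {
        "MLP Neural Network": make_pipeline(
            StandardScaler(), MLPClassifier(max_iter=1000, random_state=42)),
        "Decision Tree": DecisionTreeClassifier(random_state=42),
    }

    # With a Train Test Split predecessor, models are compared on test data.
    results = {}
    for name, model in models.items():
        model.fit(X_train, y_train)
        results[name] = accuracy_score(y_test, model.predict(X_test))

    # The top entry corresponds to the recommended model on the result page.
    for name, acc in sorted(results.items(), key=lambda kv: kv[1], reverse=True):
        print(f"{name}: Accuracy = {acc:.3f}")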

