2.6 Result of Prediction

The dataset is split into 80% training data and 20% testing data. The required libraries were imported, and GridSearchCV was used to find the best model: it evaluates each regressor over a grid of parameter settings and reports the best score among them, as summarized in Table 2.3. With the help of GridSearchCV, we compared the following algorithms: linear regression, LASSO regression, decision tree, support vector machine, random forest regressor, and XGBoost.
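A minimal sketch of how such a comparison could be wired up with scikit-learn's GridSearchCV is given below; the synthetic data, the parameter grids, and the model settings are illustrative assumptions rather than the chapter's exact configuration.

```python
# A rough sketch of the model comparison described above (illustrative settings).
from sklearn.datasets import make_regression
from sklearn.ensemble import RandomForestRegressor
from sklearn.linear_model import Lasso, LinearRegression
from sklearn.model_selection import GridSearchCV, train_test_split
from sklearn.svm import SVR
from sklearn.tree import DecisionTreeRegressor
from xgboost import XGBRegressor

# Stand-in data; in the chapter this would be the prepared housing feature matrix.
X, y = make_regression(n_samples=500, n_features=8, noise=15.0, random_state=42)

# 80% / 20% train-test split, as described in the text.
X_train, X_test, y_train, y_test = train_test_split(X, y, test_size=0.2, random_state=42)

# Candidate regressors with small, illustrative hyperparameter grids.
candidates = {
    "Linear regression": (LinearRegression(), {}),
    "LASSO regression": (Lasso(), {"alpha": [0.1, 1.0, 10.0]}),
    "Decision tree": (DecisionTreeRegressor(random_state=42), {"max_depth": [5, 10, None]}),
    "Support vector machine": (SVR(), {"C": [1, 10], "kernel": ["rbf", "linear"]}),
    "Random forest regressor": (RandomForestRegressor(random_state=42), {"n_estimators": [100, 300]}),
    "XGBoost": (XGBRegressor(random_state=42), {"n_estimators": [100, 300], "learning_rate": [0.05, 0.1]}),
}

# GridSearchCV picks the best parameters for each model by cross-validated R^2.
results = []
for name, (model, grid) in candidates.items():
    search = GridSearchCV(model, grid, cv=5, scoring="r2")
    search.fit(X_train, y_train)
    results.append((name, search.best_score_, search.best_params_))

# Report the models from best to worst cross-validation score.
for name, score, params in sorted(results, key=lambda r: r[1], reverse=True):
    print(f"{name}: best CV R^2 = {score:.4f}, params = {params}")
```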

Several machine learning algorithms were evaluated for this task. The results show that XGBoost performs best, with about 89% accuracy (Table 2.3) and the lowest error values, and its predictions agree well with the actual values on the test set. The primary aim of this work, predicting prices, was achieved using various machine learning algorithms, including linear regression, LASSO regression, decision tree, random forest, multiple regression, support vector machine, gradient boosted trees, neural networks, and bagging.
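Continuing from the split above, the following sketch shows how the test-set metrics in Table 2.3 could be computed for the best model; treating the "best score" as R², the "error score" as 1 − R², and the accuracy percentage as the rounded R² are assumptions based on the table's column pattern, not the chapter's stated definitions.

```python
# Sketch: evaluating the tuned XGBoost regressor on the held-out 20% test split.
import numpy as np
from sklearn.metrics import mean_squared_error, r2_score
from xgboost import XGBRegressor

# Illustrative parameters; in practice, reuse the best parameters found by GridSearchCV.
best_model = XGBRegressor(n_estimators=300, learning_rate=0.1, random_state=42)
best_model.fit(X_train, y_train)

y_pred = best_model.predict(X_test)
r2 = r2_score(y_test, y_pred)                              # "best score" column
rmse = float(np.sqrt(mean_squared_error(y_test, y_pred)))  # "RMSE score" column

print(f"Best score (R^2): {r2:.6f}")
print(f"RMSE score:       {rmse:.6f}")
print(f"Error score:      {1 - r2:.6f}")        # assumed to be 1 - R^2
print(f"Accuracy percent: {round(r2 * 100)}%")  # assumed rounding of R^2
```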

Consequently, it is clear that XGBoost gives higher prediction accuracy than the other models, and our study also identifies how much each attribute contributes to the prediction. In addition, Python Flask can be used as an HTTP server, with HTML/CSS for building the website's user interface. This research may therefore be useful to individuals and governments; suggestions for future work are given for each method, and new software technologies can further help in predicting prices. Price prediction can be improved by adding more attributes, such as the surroundings, nearby markets, and many other variables related to the houses.
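A minimal sketch of exposing a trained model over HTTP with Flask, as suggested above; the /predict route, the JSON payload shape, and the model.pkl file name are hypothetical choices, not the chapter's implementation.

```python
# Sketch: a small Flask HTTP server that returns price predictions.
import pickle

from flask import Flask, jsonify, request

app = Flask(__name__)

# Load the tuned regressor saved earlier (the file name is a placeholder).
with open("model.pkl", "rb") as f:
    model = pickle.load(f)

@app.route("/predict", methods=["POST"])
def predict():
    # Expect a JSON body such as {"features": [area, bedrooms, bathrooms, ...]}.
    payload = request.get_json(force=True)
    features = [payload["features"]]            # the model expects a 2-D array
    price = float(model.predict(features)[0])
    return jsonify({"predicted_price": price})

if __name__ == "__main__":
    app.run(host="0.0.0.0", port=5000)
```

An HTML/CSS front end can then POST the entered house attributes to this endpoint and display the returned price.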

Table 2.3 Comparison of algorithms.

Model                      Best score   RMSE score    Error score   Accuracy percent
Linear regression          0.790932      64.813703     0.209068      79%
LASSO regression           0.803637      62.813241     0.196363      80%
Decision tree              0.716064      70.813421     0.283936      72%
Support vector machine     0.204336     126.440620     0.795664      20%
Random forest regressor    0.884247      48.226644     0.115753      88%
XGBoost                    0.891979      46.588246     0.108021      89%