Hyperparameter Tuning in Machine Learning: A Comprehensive Review

Ilemobayo, Justus A. and Durodola, Olamide and Alade, Oreoluwa and Awotunde, Opeyemi J. and Olanrewaju, Adewumi T. and Falana, Olumide and Ogungbire, Adedolapo and Osinuga, Abraham and Ogunbiyi, Dabira and Ifeanyi, Ark and Odezuligbo, Ikenna E. and Edu, Oluwagbotemi E. (2024) Hyperparameter Tuning in Machine Learning: A Comprehensive Review. Journal of Engineering Research and Reports, 26 (6). pp. 388-395. ISSN 2582-2926


Abstract

Hyperparameter tuning is essential for optimizing the performance and generalization of machine learning (ML) models. This review explores the critical role of hyperparameter tuning in ML, detailing its importance, applications, and various optimization techniques. Key factors influencing ML performance, such as data quality, algorithm selection, and model complexity, are discussed, along with the impact of hyperparameters like learning rate and batch size on model training. Various tuning methods are examined, including grid search, random search, Bayesian optimization, and meta-learning. Special focus is given to the learning rate in deep learning, highlighting strategies for its optimization. Trade-offs in hyperparameter tuning, such as balancing computational cost and performance gain, are also addressed. Concluding with challenges and future directions, this review provides a comprehensive resource for improving the effectiveness and efficiency of ML models.
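As an illustration of one tuning method named in the abstract, the following is a minimal sketch of random search over a learning rate and batch size. The objective function here is a hypothetical stand-in, not the paper's experiment: a real search would train a model and score it on validation data.

```python
import random

def train_score(learning_rate, batch_size):
    """Toy stand-in for model training (hypothetical objective).

    A real objective would fit a model and return its validation score;
    this surrogate simply peaks near learning_rate=0.01, batch_size=32.
    """
    return -((learning_rate - 0.01) ** 2) * 1e4 - ((batch_size - 32) / 64) ** 2

def random_search(n_trials, seed=0):
    """Sample hyperparameter configurations at random; keep the best one."""
    rng = random.Random(seed)
    best_score, best_params = float("-inf"), None
    for _ in range(n_trials):
        params = {
            # Learning rates are commonly sampled log-uniformly,
            # since useful values span several orders of magnitude.
            "learning_rate": 10 ** rng.uniform(-4, -1),
            "batch_size": rng.choice([16, 32, 64, 128]),
        }
        score = train_score(**params)
        if score > best_score:
            best_score, best_params = score, params
    return best_params, best_score

best_params, best_score = random_search(n_trials=50)
print(best_params, best_score)
```

Grid search would instead enumerate a fixed Cartesian product of values; random search often finds good configurations with fewer trials when only a few hyperparameters matter, which is one of the trade-offs the review discusses.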

Item Type: Article
Subjects: Archive Science > Engineering
Depositing User: Managing Editor
Date Deposited: 08 Jun 2024 07:03
Last Modified: 08 Jun 2024 07:03
URI: http://editor.pacificarchive.com/id/eprint/1476
