Through my work building XGBoost models across different projects, I came across Effective XGBoost by Matt Harrison, a great textbook covering the library, including how to tune hyperparamete...