Unsloth December Update
Thanks for all the support this year! 🥰 Over the past few weeks we’ve shipped many Unsloth improvements and new models (from Google, NVIDIA, Mistral, and more). See below for the latest updates:
✨ New Google model
Google releases FunctionGemma, a new 270M-parameter model built for tool/function-calling. You can fine-tune it with Unsloth and later run/deploy it on your phone. Learn how with our guide.
Our notebook turns FunctionGemma into a reasoning model by making it ‘think’ before tool-calling.
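To illustrate the idea, here is a minimal sketch of what "thinking before tool-calling" output can look like and how to parse it. The `<think>` tags and JSON tool-call shape are assumptions for illustration; the notebook's actual chat template and format may differ.

```python
import json
import re

def parse_reasoned_tool_call(output: str):
    """Split a model response into a reasoning trace and a tool call.

    Assumes a hypothetical format where the model first 'thinks' inside
    <think>...</think> tags and then emits a JSON tool call. The exact
    tags/schema used by the FunctionGemma notebook may differ.
    """
    match = re.search(r"<think>(.*?)</think>\s*(\{.*\})", output, re.DOTALL)
    if match is None:
        raise ValueError("no reasoning + tool call found in output")
    reasoning = match.group(1).strip()
    tool_call = json.loads(match.group(2))  # parse the JSON call payload
    return reasoning, tool_call

# Illustrative model output (not a real FunctionGemma transcript):
sample = (
    '<think>The user wants the weather, so call get_weather.</think> '
    '{"name": "get_weather", "arguments": {"city": "Paris"}}'
)
reasoning, call = parse_reasoned_tool_call(sample)
```

Training the model to emit the reasoning span first, then the structured call, is what turns a plain function-calling model into a "reasoning" one.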
🦥 3× Faster Training
You can now train 3× faster and use ~30% less VRAM (with no accuracy loss) via new RoPE + MLP Triton kernels, padding-free training, and uncontaminated packing. Train Qwen3-4B 3× faster on just 3.9GB VRAM.
We also added preliminary multi-GPU (DDP) support: DDP Guide
This is an early preview (the official release is expected early next year).
🤗 Transformers v5 is now supported, too!
📱 Deploy LLMs on your phone
We collaborated with PyTorch to enable you to fine-tune LLMs and then deploy/run them directly on your phone. Read the guide
🧠 500K context training
You can now do 500K context-length fine-tuning with Unsloth! Train gpt-oss-20B to extend its context window to 530K on 80GB VRAM and 750K+ on 192GB, with no accuracy loss.
Unsloth’s new algorithms + our collaboration with Snowflake enable ~72% less VRAM and up to 6× longer context lengths. Read the blog
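One standard ingredient in context extension is RoPE position interpolation (linearly rescaling positions so longer sequences reuse the angle range the model saw in pretraining). The sketch below shows that generic trick only; it is not Unsloth's specific 500K recipe, which combines such scaling with the memory-saving kernels described in the blog.

```python
def rope_angles(position, dim, base=10000.0, scale=1.0):
    """Rotary embedding angles for one token position.

    With scale > 1, positions are linearly interpolated (position / scale),
    a common technique for stretching a model's trained context window.
    Generic sketch -- not Unsloth's exact long-context algorithm.
    """
    return [
        (position / scale) / (base ** (2 * i / dim))
        for i in range(dim // 2)
    ]

# Interpolating by 4x maps position 8192 onto the angle schedule of
# position 2048, keeping angles inside the range seen during pretraining.
angles_long = rope_angles(8192, 64, scale=4.0)
angles_short = rope_angles(2048, 64, scale=1.0)
```

Interpolation alone doesn't solve the memory problem, though; the activations for a 500K-token sequence are what the ~72% VRAM reduction targets.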
💚 NVIDIA Nemotron 3
NVIDIA releases Nemotron 3, a new 30B hybrid reasoning model. You can now fine-tune and run the model locally with Unsloth. Nemotron 3 Guide
Also, thanks to NVIDIA for collaborating on a blog post and video on how to do reinforcement learning (RL) with Unsloth: NVIDIA blog • Video Tutorial
🐱 New Mistral models
Mistral releases Devstral 2 and Ministral 3, new multimodal models that excel at coding + chat workflows: Ministral 3 • Devstral 2
You can fine-tune them all with Unsloth, and we made a new reinforcement learning notebook for autonomously solving Sudoku.
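RL for puzzles like Sudoku needs a verifiable reward. Below is a toy reward function for a 4×4 Sudoku-style grid that scores the fraction of valid rows, columns, and 2×2 boxes. It is a hypothetical shaped reward in the spirit of the notebook, not its actual code.

```python
def sudoku_reward(grid):
    """Fraction of rows, columns and 2x2 boxes of a 4x4 grid that contain
    the digits 1-4 exactly once. Returns 1.0 for a fully solved grid.

    Toy example -- the real notebook's reward may be defined differently.
    """
    units = []
    units.extend(grid)                               # the 4 rows
    units.extend([list(col) for col in zip(*grid)])  # the 4 columns
    for r in (0, 2):                                 # the four 2x2 boxes
        for c in (0, 2):
            units.append([grid[r + dr][c + dc]
                          for dr in (0, 1) for dc in (0, 1)])
    valid = sum(sorted(unit) == [1, 2, 3, 4] for unit in units)
    return valid / len(units)

solved = [[1, 2, 3, 4],
          [3, 4, 1, 2],
          [2, 1, 4, 3],
          [4, 3, 2, 1]]
```

A dense, partial-credit reward like this gives the policy gradient a smoother signal than a binary solved/unsolved check.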
And that’s it, hope you have a wonderful Christmas and happy holidays! 🎁


