Troy Schultz
TroyDoesAI
AI & ML interests
Contact Me ~ Open For Work ~ Contract or W2
Recent Activity
reacted to SeaWolf-AI's post with 🔥 1 day ago
🧬 Darwin-35B-A3B-Opus: The Child That Surpassed Both Parents
What if a merged model could beat both its parents? We proved it can.
Darwin-35B-A3B-Opus is a 35B MoE model (3B active) built with our Darwin V5 engine, the first evolution system that CT-scans parent models before merging them.
🤗 Model: https://huggingface.co/FINAL-Bench/Darwin-35B-A3B-Opus
The result speaks for itself: GPQA Diamond 90.0%, versus Father (Qwen3.5-35B-A3B) at 84.2% and Mother (Claude 4.6 Opus Distilled) at 85.0%. That is a relative gain of +6.9% over Father and +5.9% over Mother: not a tradeoff, a genuine leap. Meanwhile, MMMLU sits at 85.0% (Father: 85.2%), multimodal is fully intact, and all 201 languages are preserved.
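For anyone checking the arithmetic: those gains are relative improvements over each parent's score, not absolute point differences (which would be +5.8 and +5.0). A quick sketch; the helper name is ours, not the post's:

```python
# Relative improvement of the child over each parent on GPQA Diamond.
# Scores are taken from the post; relative_gain is an illustrative helper.
def relative_gain(child: float, parent: float) -> float:
    """Percent improvement of child over parent."""
    return (child / parent - 1.0) * 100.0

child = 90.0
father = 84.2   # Qwen3.5-35B-A3B
mother = 85.0   # Claude 4.6 Opus Distilled

print(f"vs Father: +{relative_gain(child, father):.1f}%")  # +6.9%
print(f"vs Mother: +{relative_gain(child, mother):.1f}%")  # +5.9%
```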
How? Model MRI changed everything. Traditional merging is guesswork. Darwin V4 added evolution. Darwin V5 added X-ray vision. Model MRI scans each parent layer by layer and discovers: Mother's L34–L38 is the reasoning engine (peak cosine distance), 50–65% of Mother's experts are dead (killed by text-only distillation), and Father is a healthy generalist with every expert alive. The prescription: transplant Mother's reasoning brain at L38 (90% weight), replace her dead experts with Father's living ones, and let Father's router handle the output layer. Reasoning went up. Versatility stayed intact. No tradeoff, just evolution.
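Darwin V5's actual code isn't out yet (the paper is forthcoming, per the post), but the recipe it describes (per-layer cosine distance to locate the reasoning block, a liveness check on experts, a weighted transplant) can be sketched roughly as follows. Everything here is illustrative: the tensor keys, the dead-expert test and threshold, and the blend helper are our placeholders, not Darwin V5's implementation.

```python
import torch
import torch.nn.functional as F

def layer_cosine_distance(parent_a: dict, parent_b: dict,
                          keys: list[str]) -> float:
    """Mean cosine distance between two parents' weights at one layer.
    Layers where this peaks are candidates for the "reasoning engine"
    (the post reports Mother's L34-L38). Tensor keys are hypothetical."""
    dists = [1.0 - F.cosine_similarity(parent_a[k].flatten(),
                                       parent_b[k].flatten(), dim=0).item()
             for k in keys]
    return sum(dists) / len(dists)

def expert_is_dead(expert_weight: torch.Tensor,
                   threshold: float = 1e-3) -> bool:
    """Flag an expert whose weights have collapsed toward zero; the post
    says text-only distillation killed 50-65% of Mother's experts.
    The norm-based test and threshold are assumptions."""
    return expert_weight.abs().mean().item() < threshold

def transplant(mother_w: torch.Tensor, father_w: torch.Tensor,
               mother_share: float = 0.9) -> torch.Tensor:
    """Weighted blend, e.g. 90% Mother inside the reasoning block."""
    return mother_share * mother_w + (1.0 - mother_share) * father_w
```

With helpers like these, the post's prescription reads as: blend Mother in at 90% for the high-distance reasoning layers, copy Father's expert wherever `expert_is_dead()` fires on Mother's, and take Father's router and output layer unchanged.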
35B total, 3B active (MoE) · GPQA Diamond 90.0% · MMMLU 85.0% (201 languages) · Multimodal Image & Video · 262K native context · 147.8 tok/s on H100 · Runs on a single RTX 4090 (Q4) · Apache 2.0
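The single-RTX-4090 claim is consistent with back-of-the-envelope math: at 4 bits per parameter, 35B weights occupy roughly 17.5 GB, inside the card's 24 GB (KV cache and activation overhead, ignored here, eat into the remainder):

```python
# Rough VRAM estimate behind the "runs on a single RTX 4090 (Q4)" claim.
params = 35e9            # total parameters (from the post)
bits_per_param = 4       # Q4 quantization
weights_gb = params * bits_per_param / 8 / 1e9
print(f"Q4 weights: ~{weights_gb:.1f} GB of 24 GB")   # ~17.5 GB
```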
Darwin V5's full algorithm and technical details will be released alongside an upcoming paper.
Live Demo: https://huggingface.co/spaces/FINAL-Bench/Darwin-35B-A3B-Opus
FINAL Bench Leaderboard: https://huggingface.co/spaces/FINAL-Bench/Leaderboard
ALL Bench Leaderboard: https://huggingface.co/spaces/FINAL-Bench/all-bench-leaderboard
Built by VIDRAFT · Supported by the Korean Government GPU Support Program
new activity 2 days ago
blascotobasco/Mistral-NeMoE-12B-16E: Wow what a cool experiment
liked a model 2 days ago
blascotobasco/Mistral-NeMoE-12B-16E