
GPT-4.5 Orion: Training the Giant — A Deep Dive into Scale, Data, and Safety

Author: Mike Breault
Published: Sat 12 Apr 2025
Episode Link: None

A deep dive into OpenAI's GPT-4.5 Orion: the two-year build, the Azure-backed infrastructure, and the shift from compute-bound to data-bound bottlenecks. We dissect the full training pipeline (unsupervised pretraining, supervised fine-tuning, and RLHF), plus planning, co-design, real-world hiccups such as the PyTorch summation bug, system-card insights, and the multilingual and safety implications that shape its enterprise value and pricing.


Note: This podcast was AI-generated, and sometimes AI can make mistakes. Please double-check any critical information.

Sponsored by Embersilk LLC
