Sinusoidal and learned positional encodings were the early solutions for transformers, but modern LLMs now favor RoPE and ALiBi, which encode relative position directly and hold up better at long context lengths. Learn why and how these techniques evolved.
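
To make the contrast concrete, here is a minimal NumPy sketch (toy sizes; function names are illustrative, not from any particular library): sinusoidal encodings are added to token embeddings, RoPE rotates query/key feature pairs by a position-dependent angle, and ALiBi adds a distance-proportional bias to attention logits. The ALiBi helper below uses a simplified symmetric (non-causal) form for brevity.

```python
import numpy as np

def sinusoidal_encoding(num_positions: int, dim: int) -> np.ndarray:
    """Classic absolute encoding: one row per position, added to embeddings."""
    positions = np.arange(num_positions)[:, None]           # (pos, 1)
    freqs = 1.0 / (10000 ** (np.arange(0, dim, 2) / dim))   # (dim/2,)
    angles = positions * freqs                               # (pos, dim/2)
    enc = np.zeros((num_positions, dim))
    enc[:, 0::2] = np.sin(angles)   # even dims: sine
    enc[:, 1::2] = np.cos(angles)   # odd dims: cosine
    return enc

def apply_rope(x: np.ndarray, positions: np.ndarray) -> np.ndarray:
    """RoPE: rotate each (even, odd) feature pair of x by a position-dependent angle."""
    dim = x.shape[-1]
    freqs = 1.0 / (10000 ** (np.arange(0, dim, 2) / dim))   # same frequency schedule
    angles = positions[:, None] * freqs                      # (pos, dim/2)
    cos, sin = np.cos(angles), np.sin(angles)
    x_even, x_odd = x[..., 0::2], x[..., 1::2]
    rotated = np.empty_like(x)
    rotated[..., 0::2] = x_even * cos - x_odd * sin
    rotated[..., 1::2] = x_even * sin + x_odd * cos
    return rotated

def alibi_bias(seq_len: int, slope: float) -> np.ndarray:
    """ALiBi (simplified, symmetric): penalize attention logits linearly with distance."""
    i = np.arange(seq_len)[:, None]
    j = np.arange(seq_len)[None, :]
    return -slope * np.abs(i - j)   # added to q @ k.T before softmax

if __name__ == "__main__":
    seq_len, dim = 8, 16                                   # toy sizes for illustration
    tokens = np.random.randn(seq_len, dim)
    embedded = tokens + sinusoidal_encoding(seq_len, dim)  # absolute: add to embeddings
    q = apply_rope(tokens, np.arange(seq_len))             # relative: rotate q and k
    k = apply_rope(tokens, np.arange(seq_len))
    scores = q @ k.T + alibi_bias(seq_len, slope=0.25)     # relative: bias the logits
    print(embedded.shape, scores.shape)
```

The key difference the sketch highlights: the sinusoidal table ties each score to absolute positions, whereas RoPE rotations and ALiBi biases make the attention score depend only on the distance between query and key, which is one reason the latter two generalize better to longer sequences.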