N-Gram House

Positional Encoding in Transformers: Sinusoidal vs Learned for LLMs

Sinusoidal and learned positional encodings were early solutions for transformers, but modern LLMs now use RoPE and ALiBi for better long-context performance. Learn why and how these techniques evolved.
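To make the "sinusoidal" half of the comparison concrete, here is a minimal sketch of the fixed sinusoidal encoding from the original Transformer paper, written with numpy. The function name and parameters are illustrative, not from this post:

```python
import numpy as np

def sinusoidal_positional_encoding(max_len: int, d_model: int) -> np.ndarray:
    """Fixed sinusoidal encodings ("Attention Is All You Need"):
    PE[pos, 2i]   = sin(pos / 10000^(2i / d_model))
    PE[pos, 2i+1] = cos(pos / 10000^(2i / d_model))
    """
    positions = np.arange(max_len)[:, np.newaxis]           # shape (max_len, 1)
    dims = np.arange(0, d_model, 2)[np.newaxis, :]          # even dims, (1, d_model/2)
    angles = positions / np.power(10000.0, dims / d_model)  # (max_len, d_model/2)
    pe = np.zeros((max_len, d_model))
    pe[:, 0::2] = np.sin(angles)  # even indices get sine
    pe[:, 1::2] = np.cos(angles)  # odd indices get cosine
    return pe

pe = sinusoidal_positional_encoding(max_len=128, d_model=64)
print(pe.shape)  # (128, 64)
```

Because the table is computed from a formula rather than trained, it needs no parameters and can in principle be evaluated at positions never seen during training, which is the property RoPE and ALiBi later improved on for long contexts.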




© 2026. All rights reserved.