# Global and [Sliding Window Attention](Sliding%20Window%20Attention.md)
- [Sliding Window Attention](Sliding%20Window%20Attention.md) and [Dilated Sliding Window Attention](Dilated%20Sliding%20Window%20Attention.md) are not always enough: some tasks need a few tokens that can see the entire sequence
- Therefore, "global [Attention](Attention.md)" is added on a few pre-selected input locations.
- This [Attention](Attention.md) operation is symmetric: that is, a token with global [Attention](Attention.md) attends to all tokens across the sequence, and all tokens in the sequence attend to it (see the sketch below)
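- A minimal sketch of how the combined mask could look, assuming a window size `w` and a set of pre-selected `global_idx` positions (both names are hypothetical, not from any library); this is not an optimized implementation, just the allowed-attention pattern:

```python
import numpy as np

def attention_mask(seq_len: int, w: int, global_idx: list[int]) -> np.ndarray:
    """Return a boolean (seq_len, seq_len) mask: True where attention is allowed."""
    mask = np.zeros((seq_len, seq_len), dtype=bool)

    # Local pattern: each token attends to neighbors within +/- w//2.
    half = w // 2
    for i in range(seq_len):
        lo, hi = max(0, i - half), min(seq_len, i + half + 1)
        mask[i, lo:hi] = True

    # Global pattern is symmetric: a global token attends to all positions,
    # and all positions attend to the global token.
    for g in global_idx:
        mask[g, :] = True
        mask[:, g] = True

    return mask

if __name__ == "__main__":
    # e.g. one global token at position 0 (hypothetical choice for illustration)
    m = attention_mask(seq_len=8, w=2, global_idx=[0])
    print(m.astype(int))
```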