Blog Track Poster
Why Does RoPE Struggle to Maintain Long-Term Decay in Long Sequences?
Wei Shen · Chao Yin · Yuliang Liu · Zikai Xiao · Xiaonan He · WangYan
Abstract:
Rotary Position Embedding (RoPE) improves upon traditional positional encodings but struggles to maintain long-term decay in contexts exceeding its training length, limiting the model's generalization to longer sequences. Our experiments suggest that this issue may stem from a high proportion of obtuse angles on the complex plane between the linearly transformed query and key embeddings.
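As background for the claim above, the sketch below illustrates the RoPE mechanism the abstract refers to: each consecutive pair of query/key dimensions is rotated on the complex plane by a position-dependent angle, so the attention score depends only on the relative distance between tokens, and its sign is set by the angle between the rotated query and key (an obtuse angle yields a negative score). This is a minimal NumPy sketch, not the authors' code; the function name `rope_rotate` and the dimension/base values are illustrative assumptions.

```python
import numpy as np

def rope_rotate(x, pos, base=10000.0):
    """Apply RoPE to a 1-D vector x (even dim) at integer position pos.
    Each pair (x[2i], x[2i+1]) is rotated by angle pos * base**(-2i/d)."""
    d = x.shape[-1]
    theta = base ** (-np.arange(0, d, 2) / d)  # per-pair rotation frequencies
    ang = pos * theta
    cos, sin = np.cos(ang), np.sin(ang)
    x1, x2 = x[0::2], x[1::2]
    out = np.empty_like(x)
    out[0::2] = x1 * cos - x2 * sin
    out[1::2] = x1 * sin + x2 * cos
    return out

rng = np.random.default_rng(0)
d = 64
q = rng.standard_normal(d)  # stands in for W_q @ token embedding
k = rng.standard_normal(d)  # stands in for W_k @ token embedding

# The score depends only on relative distance: shifting both positions
# by the same offset leaves the query-key dot product unchanged.
s_near = rope_rotate(q, 3) @ rope_rotate(k, 10)
s_far = rope_rotate(q, 1003) @ rope_rotate(k, 1010)
print(np.isclose(s_near, s_far))

# Sign of the score = sign of cos(angle between rotated q and k);
# an obtuse angle between them produces a negative attention logit.
cos_angle = s_near / (np.linalg.norm(q) * np.linalg.norm(k))
print(cos_angle < 0)  # True iff the angle is obtuse at this distance
```

The relative-position invariance shown here is the defining property of RoPE; the abstract's observation concerns how often the angle between rotated queries and keys becomes obtuse when the relative distance exceeds the training length.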