FlexPrefill: A Context-Aware Sparse Attention Mechanism for Efficient Long-Sequence Inference
Xunhao Lai, Jianqiao Lu, Yao Luo, Yiyuan Ma, Xun Zhou · arxiv.org
Query conditions: topic=computer-vision, publish_at in 202502, and type=paper
Shengqiong Wu, Hao Fei, Xiangtai Li, Jiayi Ji, Hanwang Zhang, Tat-Seng Chua, Shuicheng Yan · arxiv.org
Yihong Luo, Xiaolong Chen, Xinghua Qu, Tianyang Hu, Jing Tang · arxiv.org
Chaoyue Song, Jianfeng Zhang, Xiu Li, Fan Yang, Yiwen Chen, Zhongcong Xu, Jun Hao Liew, Xiaoyang Guo, Fayao Liu, Jiashi Feng, Guosheng Lin · arxiv.org
Zihao Huang, Qiyang Min, Hongzhi Huang, Defa Zhu, Yutao Zeng, Ran Guo, Xun Zhou · arxiv.org
Mingfei Han, Linjie Yang, Xiaojun Chang, Lina Yao, Heng Wang · arxiv.org
This page ranks Computer Vision papers by topic match, content-type filter, engagement momentum, and freshness. The ranking is recalculated as new items and engagement signals arrive, so the top results stay useful for current workflows rather than remaining static or purely chronological.
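The combination of signals described above can be sketched as a simple scoring function. This is an illustrative sketch only: the function name, the weights, and the exponential-decay freshness model are assumptions, not the site's actual formula.

```python
import math


def rank_score(topic_match: float, momentum: float, age_days: float,
               half_life_days: float = 14.0) -> float:
    """Hypothetical ranking sketch: topic match scales the score,
    engagement momentum adds a diminishing-returns boost, and
    freshness decays with a configurable half-life.

    All parameter names and weights are illustrative assumptions.
    """
    # Freshness halves every `half_life_days` days since publication.
    freshness = 0.5 ** (age_days / half_life_days)
    # log1p dampens very large engagement counts.
    engagement = 1.0 + math.log1p(momentum)
    return topic_match * engagement * freshness


# A fresher item with the same match and momentum ranks higher.
new_item = rank_score(topic_match=1.0, momentum=10.0, age_days=0.0)
old_item = rank_score(topic_match=1.0, momentum=10.0, age_days=14.0)
```

Under this sketch, `old_item` is exactly half of `new_item`, matching the stated goal of keeping top results current rather than purely chronological.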
The time suffix in this URL defines the publish-date window used for ranking. A four-digit year path includes items published in that year, while a six-digit YYYYMM path narrows to a single calendar month. This makes comparisons cleaner when you want a focused snapshot rather than an all-time aggregate.
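The suffix convention above can be expressed as a small parser. The helper name `publish_window` is hypothetical; the sketch only assumes the two formats the text describes (a 4-digit year and a 6-digit YYYYMM month).

```python
import calendar
import re
from datetime import date


def publish_window(suffix: str) -> tuple[date, date]:
    """Map a URL time suffix to its (start, end) publish-date window.

    Hypothetical helper illustrating the stated convention:
    "2025" covers the whole year, "202502" covers one calendar month.
    """
    if re.fullmatch(r"\d{4}", suffix):
        year = int(suffix)
        return date(year, 1, 1), date(year, 12, 31)
    if re.fullmatch(r"\d{6}", suffix):
        year, month = int(suffix[:4]), int(suffix[4:])
        # monthrange returns (weekday of first day, number of days).
        last_day = calendar.monthrange(year, month)[1]
        return date(year, month, 1), date(year, month, last_day)
    raise ValueError(f"unrecognized time suffix: {suffix!r}")
```

For example, the `202502` suffix on this page would resolve to the window 2025-02-01 through 2025-02-28.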
No. This ranking is editorial and signal-driven, not sponsored placement. Attendemia evaluates public metadata, source context, and usage momentum to rank candidates. Payment does not buy position, so readers can interpret the list as a curation surface rather than advertising inventory.