Official release of InternLM2.5 base and chat models. 1M context support
Code and documents of LongLoRA and LongAlpaca (ICLR 2024 Oral)
LongWriter: Unleashing 10,000+ Word Generation from Long Context LLMs
[ACL 2024] LongBench: A Bilingual, Multitask Benchmark for Long Context Understanding
Large Context Attention
Implementation of MEGABYTE, Predicting Million-byte Sequences with Multiscale Transformers, in Pytorch
Implementation of 💍 Ring Attention, from Liu et al. at Berkeley AI, in Pytorch
LongCite: Enabling LLMs to Generate Fine-grained Citations in Long-context QA
Implementation of Recurrent Memory Transformer, Neurips 2022 paper, in Pytorch
Code for the paper "InfLLM: Unveiling the Intrinsic Capacity of LLMs for Understanding Extremely Long Sequences with Training-Free Memory"
Code for the paper "∞Bench: Extending Long Context Evaluation Beyond 100K Tokens": https://arxiv.org/abs/2402.13718
PyTorch implementation of Infini-Transformer from "Leave No Context Behind: Efficient Infinite Context Transformers with Infini-attention" (https://arxiv.org/abs/2404.07143)
[COLM 2024] TriForce: Lossless Acceleration of Long Sequence Generation with Hierarchical Speculative Decoding
[EMNLP 2024] LongAlign: A Recipe for Long Context Alignment of LLMs
LLM KV cache compression made easy
[ACL 2024] LooGLE: Long-Context Evaluation for Long-Context Language Models
Open-source code for the paper "Retrieval Head Mechanistically Explains Long-Context Factuality"
LongQLoRA: Extend Context Length of LLMs Efficiently
Awesome LLM Plaza: daily tracking of awesome LLM topics, e.g., LLMs for coding, robotics, reasoning, multimodality, etc.
Implementation of NAACL 2024 Outstanding Paper "LM-Infinite: Simple On-the-Fly Length Generalization for Large Language Models"