Tuning-Free Longer Context Lengths For LLMs - A Review of Self-Extend (LLM Maybe LongLM) | Towards Data Science
A simple strategy that lets LLMs handle inputs longer than their trained context length at inference time, without any fine-tuning.
