Context Engineering
Context Density
Context density measures how much useful, task-relevant information each token carries, and optimizing for it is the primary lever for improving agent performance within fixed context window limits. Low-density context burns tokens on boilerplate, irrelevant examples, and redundant information; high-density context uses tight type signatures instead of full source files, relevant test failures instead of entire test suites, and summarized history instead of raw transcripts. Research on "lost in the middle" effects shows that models attend less to information placed in the center of long contexts, so density optimization is not just about fitting more in but about placing critical information where the model will actually use it.
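The placement strategy described above can be sketched in code. The following is a minimal, illustrative example (all names such as `ContextItem` and `assemble_context` are hypothetical, not a real library API): it greedily keeps the highest-priority items that fit a token budget, then alternates them toward the front and back of the prompt so the weakest material lands in the middle, where attention is lowest.

```python
# Sketch: budget-aware context assembly that favors task-relevant items
# and places the most critical ones at the edges of the prompt.
# ContextItem / assemble_context are illustrative names, not a real API.
from dataclasses import dataclass

@dataclass
class ContextItem:
    text: str
    priority: int  # higher = more task-relevant


def estimate_tokens(text: str) -> int:
    # Rough heuristic: ~4 characters per token for English prose.
    return max(1, len(text) // 4)


def assemble_context(items: list[ContextItem], budget: int) -> str:
    """Select the densest items that fit the budget, then order them so
    the most important content sits at the start and end of the context."""
    # Greedily select by priority until the token budget is spent.
    selected, used = [], 0
    for item in sorted(items, key=lambda i: i.priority, reverse=True):
        cost = estimate_tokens(item.text)
        if used + cost <= budget:
            selected.append(item)
            used += cost
    # Edge placement: alternate top items to front and back,
    # leaving the lowest-priority survivors in the middle.
    front, back = [], []
    for idx, item in enumerate(selected):
        (front if idx % 2 == 0 else back).append(item)
    ordered = front + back[::-1]
    return "\n\n".join(i.text for i in ordered)
```

A real system would replace the character-count heuristic with the model's actual tokenizer and might summarize dropped items rather than discard them outright.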
Resources
- Lost in the Middle (arxiv.org) — Research showing models struggle to use information in the middle of long contexts
- Anthropic: Prompt Engineering (docs.anthropic.com) — Best practices for writing dense, effective prompts
- OpenAI: Prompt Engineering (platform.openai.com) — Strategies for maximizing information density in prompts
- Many-Shot ICL (arxiv.org) — Research on how example density affects in-context learning performance