Share code or text projects with LLMs via the Model Context Protocol (MCP) or clipboard. Provides rule-based file selection, smart code outlining, and a rich CLI for context generation.
https://github.com/cyberchitta/llm-context.py

Every developer knows the drill: you want to discuss your project with Claude or GPT, so you start copy-pasting files. Three files become ten. Ten becomes twenty. Your chat context explodes, you lose track of what you've shared, and you're manually managing file selection like it's 2010.
llm-context.py solves this with intelligent project context sharing through the Model Context Protocol and a powerful CLI workflow. It respects your .gitignore, switches between task-focused rule sets, and gives LLMs exactly what they need to understand your codebase.
You're working on a bug fix and want LLM help. Traditional approaches force you to choose between too little context (the LLM lacks understanding) and too much context (token waste, degraded performance). You end up spending more time managing what to share than actually solving problems.
llm-context.py treats your project as a cohesive unit, not a collection of random files. It uses .gitignore patterns to automatically exclude build artifacts, dependencies, and irrelevant files while including everything that matters for your specific task.
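Your `.gitignore` already names the noise you'd never want in an LLM context, and llm-context.py honors those same patterns. Illustrative example (typical entries, not tool output):

```
node_modules/
dist/
*.log
.env
```

Anything matched by patterns like these is excluded automatically, with no separate configuration to maintain.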
Core workflow in 3 commands:

```
lc-init       # One-time setup
lc-sel-files  # Smart file selection
lc-context    # Generate and copy context
```
For Claude Desktop users: Native MCP integration means no CLI needed. Just tell Claude "I want to work with my project /path/to/repo" and it automatically loads the relevant context.
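For reference, a minimal Claude Desktop server entry might look like the following; the exact command name and schema depend on your installed llm-context version, so check the project README:

```json
{
  "mcpServers": {
    "llm-context": {
      "command": "lc-mcp"
    }
  }
}
```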
Different tasks need different context. Code review requires test files and documentation. Bug fixing needs core logic and related modules. Feature development wants the full architectural picture.
Instead of manually reconfiguring file selection every time, define rule sets:
```
lc-set-rule lc-code-review   # Include tests, docs, core files
lc-set-rule lc-debug         # Focus on implementation files
lc-set-rule my-frontend-work # Your custom rules
```
Each rule set defines its own inclusion patterns, exclusion rules, and even custom prompts. Switch contexts as fast as you switch git branches.
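The on-disk rule format has changed across versions, so treat this as a sketch rather than the exact schema: a custom rule is roughly a named set of include/exclude patterns plus an optional prompt, along these lines:

```yaml
# Hypothetical rule definition; consult the llm-context docs
# for the exact schema of your installed version.
name: my-frontend-work
gitignores:
  full-files:
    - "*.test.ts"   # skip tests for this task
    - "backend/"    # frontend-only focus
```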
Here's where it gets interesting. When you're working with large codebases, LLMs don't need to see every implementation detail upfront. They need to understand the structure first.
llm-context.py generates intelligent code outlines using tree-sitter parsing.
The LLM gets architectural understanding without drowning in implementation details. When it needs specific implementations, use lc-clip-implementations to provide exactly what it requests.
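As an illustration (a sketch, not actual tool output), an outline of a Python module keeps the definitions the LLM needs for architectural understanding and elides the bodies:

```python
# Outline view: signatures survive, implementation bodies are elided.
class PaymentProcessor:
    def __init__(self, gateway, retries=3): ...
    def charge(self, amount, card): ...
    def refund(self, transaction_id): ...

def validate_card(number): ...
```

When the LLM later needs a specific body, `lc-clip-implementations` can supply just that function rather than the whole file.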
**Claude Desktop (MCP):** Add one JSON config block and get native project access. No CLI needed.

**Any Chat Interface:** Generate context with `lc-context -p` to include task-specific prompts. Perfect for Claude Projects, Custom GPTs, or standard chat interfaces.

**File-Based Workflows:** Output to files with `lc-context -f output.md` for documentation or sharing.

**Dynamic Updates:** When the LLM asks for additional files, copy its request and run `lc-clip-files`. It parses the request and provides exactly what was asked for.
**Bug Investigation:**

```
lc-set-rule lc-debug
lc-sel-files
lc-context -p
# Paste into LLM with debug-focused prompts included
```
**Code Review Prep:**

```
lc-set-rule lc-code-review
lc-context
# Full context including tests and documentation
```
**Architecture Discussion:**

```
lc-sel-outlines
lc-outlines
# High-level structure without implementation noise
```
This isn't another tool that requires workflow changes. It enhances what you're already doing, reusing conventions like your existing .gitignore patterns.

The tool was built by developers actively using it for their own development work. Every feature exists because it solved a real problem in day-to-day coding.
```
uv tool install "llm-context>=0.3.0"
cd your-project
lc-init
lc-sel-files
lc-context
```
Your project context is now in your clipboard, ready for any LLM chat. For Claude Desktop users, add the MCP configuration and get native integration.
No more manual file selection. No more copy-paste workflows. Just intelligent project context that scales with your needs.