Labels: Issue/PR - Triage, enhancement, proposal
Description
What specific problem does this solve?
Current Problem:
Currently, RooCode does not maintain a complete, up-to-date context of the project or repository when interacting with the language model (LLM). As a result, each request may leave the LLM working with partial or outdated information, leading to:
- Errors in code suggestions or generation.
- Higher token consumption, because the LLM must process additional information to "catch up" on each call.
- Lower efficiency and more errors in complex changes or refactorings, because the model lacks a global view of the code.
What's the proposed solution?
Proposed Solution:
Add a feature that enables RooCode to maintain a complete, up-to-date project or repository context for every LLM call. This could be achieved through the following steps:
- Generate a consolidated context: Create a mechanism to automatically collect relevant content from project files (e.g., source code files) in a format usable by the LLM.
- Automatic updates: Ensure this context is updated whenever changes occur in the project and before each LLM interaction.
- LLM integration: Include this context in the system prompt (or in the data stream sent to the model) so the LLM always has a complete, current view of the repository and knows its state. Subsequent messages that reflect modifications then show the LLM the path of user decisions and changes that led to the repository's current state. (A sketch of this flow follows this list.)
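To make steps 1 and 3 concrete, here is a minimal sketch assuming a plain Node/TypeScript environment; the helper names (collectSourceFiles, buildRepositoryContext, buildSystemPrompt), the excluded folders, and the included extensions are illustrative placeholders, not existing RooCode APIs or settings:

```typescript
import * as fs from "fs/promises";
import * as path from "path";

// Folders and extensions used when consolidating the context
// (illustrative defaults, not RooCode's actual settings).
const EXCLUDED_DIRS = new Set(["node_modules", ".git", "dist", "out"]);
const INCLUDED_EXTENSIONS = new Set([".ts", ".tsx", ".js", ".py", ".md"]);

// Recursively collect source files under the workspace root.
async function collectSourceFiles(dir: string): Promise<string[]> {
  const entries = await fs.readdir(dir, { withFileTypes: true });
  const files: string[] = [];
  for (const entry of entries) {
    const fullPath = path.join(dir, entry.name);
    if (entry.isDirectory()) {
      if (!EXCLUDED_DIRS.has(entry.name)) {
        files.push(...(await collectSourceFiles(fullPath)));
      }
    } else if (INCLUDED_EXTENSIONS.has(path.extname(entry.name))) {
      files.push(fullPath);
    }
  }
  return files;
}

// Build one plain-text block: each file introduced by its relative path.
async function buildRepositoryContext(workspaceRoot: string): Promise<string> {
  const files = await collectSourceFiles(workspaceRoot);
  const sections = await Promise.all(
    files.map(async (file) => {
      const code = await fs.readFile(file, "utf8");
      return `### ${path.relative(workspaceRoot, file)}\n${code}`;
    })
  );
  return sections.join("\n\n");
}

// Prepend the consolidated context to the system prompt before each LLM call.
async function buildSystemPrompt(basePrompt: string, workspaceRoot: string): Promise<string> {
  const repoContext = await buildRepositoryContext(workspaceRoot);
  return `${basePrompt}\n\n# Repository context (current state)\n${repoContext}`;
}
```

In practice the consolidated block would need truncation or summarization to fit the model's context window, but the structure above illustrates the intent.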
Expected Benefits:
- Greater accuracy: With a complete context, the LLM will generate more precise code and suggestions, reducing errors.
- Lower token consumption: By avoiding repetitive or unnecessary information processing in each call, resource usage is optimized in the long term.
- Better change handling: The LLM can perform modifications or refactorings more effectively by understanding how all project files are interconnected.
- Improved user experience: Developers will have greater confidence in RooCode, boosting productivity.
Suggested Technical Details:
- Context collection: RooCode could scan project files open in VS Code and consolidate their content into a readable format (e.g., plain text with file names and their code).
- Exclusions: Ignore irrelevant files such as binaries, temporary configurations, or folders like node_modules.
- Updates: Implement a system to detect file changes (e.g., on save) and automatically update the context (see the watcher sketch after this list).
- Configuration: Allow users to decide which files or folders to include or exclude from the context.
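As one hedged illustration of the Updates and Configuration points, the following sketch uses the VS Code extension API to rebuild a cached context whenever files are saved, created, or deleted. The rooCode.context configuration section, its include/exclude keys, and the function names are hypothetical; a real implementation would likely update the context incrementally rather than rescanning everything:

```typescript
import * as vscode from "vscode";

let cachedContext = "";

// Read hypothetical include/exclude settings; the "rooCode.context" section
// and its keys are placeholders, not existing RooCode configuration.
function getContextGlobs(): { include: string; exclude: string } {
  const config = vscode.workspace.getConfiguration("rooCode.context");
  return {
    include: config.get<string>("include", "**/*.{ts,tsx,js,py}"),
    exclude: config.get<string>("exclude", "**/{node_modules,dist,.git}/**"),
  };
}

// Rebuild the consolidated context from all files matched by the globs.
async function rebuildContext(): Promise<void> {
  const { include, exclude } = getContextGlobs();
  const uris = await vscode.workspace.findFiles(include, exclude);
  const sections: string[] = [];
  for (const uri of uris) {
    const doc = await vscode.workspace.openTextDocument(uri);
    sections.push(`### ${vscode.workspace.asRelativePath(uri)}\n${doc.getText()}`);
  }
  cachedContext = sections.join("\n\n");
}

// The prompt-building code would read this before each LLM call.
export function getRepositoryContext(): string {
  return cachedContext;
}

// Keep the cached context fresh: refresh on save and on file create/delete.
export function activateContextTracking(context: vscode.ExtensionContext): void {
  const watcher = vscode.workspace.createFileSystemWatcher("**/*");
  context.subscriptions.push(
    watcher,
    vscode.workspace.onDidSaveTextDocument(() => void rebuildContext()),
    watcher.onDidCreate(() => void rebuildContext()),
    watcher.onDidDelete(() => void rebuildContext())
  );
  void rebuildContext();
}
```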
Are you interested in implementing this?
None
Implementation approach (if contributing)
No response
Proposal checklist
- I've checked for existing issues or related proposals
- I understand this needs review before implementation can start
Contribution checklist (if contributing)
- I've read the Contributing Guide
- I'm willing to make changes based on feedback
- I understand the code review process and requirements