Commit fcb4fd1 (parent 72fe0cf)

docs(core): update enhance LLM feature page

File tree: 1 file changed (+142, -55 lines)

docs/shared/features/enhance-AI.md

# Enhance Your LLM

{% youtube src="https://youtu.be/dRQq_B1HSLA" title="We Just Shipped the Monorepo MCP for Copilot" /%}

Monorepos **provide an ideal foundation for AI-powered development**, enabling cross-project reasoning and code generation. However, without proper context, **LLMs struggle to understand your workspace architecture**, seeing only individual files rather than the complete picture.

Nx transforms your AI assistant by providing rich workspace metadata that enables it to:

- Understand your **workspace architecture** and project relationships
- Identify **project owners** and team responsibilities
- Access **Nx documentation** for accurate guidance
- Leverage **code generators** for consistent scaffolding
- Connect to your **CI pipeline** to help fix failures

The goal is to transform your AI assistant from a generic code helper into an architecturally-aware collaborator that understands your specific workspace structure and can make intelligent, context-aware decisions.

## How Nx MCP Enhances Your LLM

The [Model Context Protocol (MCP)](https://modelcontextprotocol.io/) is an open standard that enables AI models to interact with your development environment through a standardized interface. Nx implements an MCP server via the [Nx Console](/getting-started/editor-setup) that exposes workspace metadata to compatible AI assistants like GitHub Copilot, Claude, and others.

With the Nx MCP server, your AI assistant gains a "map" of your entire system, letting it go beyond reasoning about individual files to seeing the high-level picture. The LLM can then move between different abstraction levels, from high-level architecture down to specific implementation details:

![Different abstraction levels](/blog/images/articles/nx-ai-abstraction-levels.avif)

Here are some of the tools the Nx MCP server exposes:

- **`nx_workspace`**: Provides a comprehensive view of your Nx configuration and project graph
- **`nx_project_details`**: Returns detailed configuration for specific projects
- **`nx_docs`**: Retrieves relevant documentation based on queries
- **`nx_generators`**: Lists available code generators in your workspace
- **`nx_generator_schema`**: Provides detailed schema information for generators
- **`nx_visualize_graph`**: Opens interactive project or task graph visualizations
- **`nx_cloud_cipe_details`**: Returns information about CI pipelines from Nx Cloud
- **`nx_cloud_fix_cipe_failure`**: Provides detailed information about CI failures to help fix issues
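
When an assistant uses one of these tools, its MCP client sends a standard `tools/call` request over JSON-RPC, as defined by the Model Context Protocol. As a rough sketch (the tool name comes from the list above, but the `projectName` argument shown here is illustrative rather than the server's exact schema):

```json
{
  "jsonrpc": "2.0",
  "id": 1,
  "method": "tools/call",
  "params": {
    "name": "nx_project_details",
    "arguments": { "projectName": "shop" }
  }
}
```

You never write these requests yourself; the editor's MCP client issues them on the model's behalf and feeds the result back into the conversation.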

## Setting Up Nx MCP

The Nx MCP server is part of Nx Console and can be easily configured in supported editors:

### VS Code / Cursor Setup

1. Install [Nx Console](/getting-started/editor-setup) from the marketplace
2. You'll receive a notification to "Improve Copilot/AI agent with Nx-specific context"
3. Click "Yes" to automatically configure the MCP server

If you miss the notification, run the `nx.configureMcpServer` command from the command palette (`Ctrl/Cmd + Shift + P`).

![VS Code showing the Nx MCP installation prompt](/blog/images/articles/copilot-mcp-install.avif)

### Other MCP-Compatible Clients

For MCP-compatible clients that don't have Nx Console available, such as Claude Desktop, you can configure the Nx MCP server manually:

```json {% fileName="mcp.json" %}
{
  "servers": {
    "nx-mcp": {
      "command": "npx",
      "args": ["nx-mcp@latest", "/path/to/your/workspace"]
    }
  }
}
```

Replace `/path/to/your/workspace` with the absolute path to your Nx workspace.
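
Note that the exact configuration shape varies by client. Claude Desktop, for example, expects servers under an `mcpServers` key in its `claude_desktop_config.json` file; the snippet below is a sketch, so check your client's documentation for the exact file name and location:

```json
{
  "mcpServers": {
    "nx-mcp": {
      "command": "npx",
      "args": ["nx-mcp@latest", "/path/to/your/workspace"]
    }
  }
}
```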

## Powerful Use Cases

### Understanding Your Workspace Architecture

{% youtube src="https://youtu.be/RNilYmJJzdk" title="Nx Just Made Your LLM Way Smarter" /%}

Ask your AI assistant about your workspace structure and get detailed, accurate responses about projects, their types, and relationships:

```
What is the structure of this workspace?
How are the projects organized?
```

With Nx MCP, your AI assistant can:

- Identify applications and libraries in your workspace
- Understand project categorization through tags
- Recognize technology types (feature, UI, data-access)
- Determine project ownership and team responsibilities

![Example of LLM understanding project structure](/blog/images/articles/nx-ai-example-project-data.avif)

You can also get informed suggestions about where to implement new functionality:

```
Where should I implement a feature for adding products to cart?
```

![Example of LLM providing implementation guidance](/blog/images/articles/nx-ai-example-data-access-feature.avif)

Learn more about workspace architecture understanding in our blog post [Nx Just Made Your LLM Way Smarter](/blog/nx-just-made-your-llm-smarter).

### Instant CI Failure Resolution

{% youtube src="https://youtu.be/fPqPh4h8RJg" title="Connect Your Editor, CI and LLMs" /%}

When a CI build fails, Nx Console can notify you directly in your editor:

![Nx Console shows the notification of the CI failure](/blog/images/articles/ci-notification.avif)

Your AI assistant can then:

1. Access detailed information from Nx Cloud about the failed build
2. Analyze your git history to understand what changed in your PR
3. Understand the error context and affected files
4. Help implement the fix right in your editor
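
For example, once the failure notification appears, a prompt along these lines (illustrative, not a required phrasing) kicks off that workflow:

```
My CI pipeline failed. Look up the failed tasks in Nx Cloud, figure out
which change in my branch caused the failure, and help me fix it.
```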

This integration dramatically improves development velocity: you're notified immediately when an error occurs, you never have to leave your editor to understand what broke, and the LLM can help you implement or suggest a fix.

Learn more about CI integration in our blog post [Save Time: Connecting Your Editor, CI and LLMs](/blog/nx-editor-ci-llm-integration).

### Smart Code Generation with AI-Enhanced Generators

{% youtube src="https://youtu.be/PXNjedYhZDs" title="Enhancing Nx Generators with AI" /%}

Nx generators provide predictable code scaffolding, while AI adds intelligence and contextual understanding. Instead of having the AI generate everything from scratch, you get the best of both worlds:

```
Create a new React library in the packages/orders/feat-cancel-orders folder
and give the library the same name as the folder. Afterwards, also connect
it to the main shop application.
```

Your AI assistant will:

1. Identify the appropriate generator and its parameters
2. Open the Nx Console Generate UI with preset values
3. Let you review and customize the options
4. Execute the generator and help integrate the new code with your existing projects

![LLM invoking the Nx generate UI](/blog/images/articles/llm-nx-generate-ui.avif)

This approach ensures consistent code that follows your organization's best practices while still being tailored to your specific needs. Learn more about AI-enhanced generators in our blog post [Enhancing Nx Generators with AI](/blog/nx-generators-ai-integration).

### Documentation-Aware Configuration

{% youtube src="https://youtu.be/V2W94Sq_v6A?si=aBA-eppEw0fHrh5O&t=388" title="Making Cursor Smarter with an MCP Server" /%}

Get accurate guidance on Nx configuration without worrying about hallucinations or outdated information:

```
Can you configure Nx release for the packages of this workspace?
Update nx.json with the necessary configuration using conventional commits
as the versioning strategy.
```

The AI assistant will:

1. Query the Nx docs for the latest information on release configuration
2. Understand your workspace structure to identify packages
3. Generate the correct configuration based on your specific needs
4. Apply the changes to your `nx.json` file
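
The resulting configuration might look something like this in `nx.json` (a minimal sketch assuming your packages live under `packages/*`; the exact shape depends on your Nx version and workspace layout):

```json {% fileName="nx.json" %}
{
  "release": {
    "projects": ["packages/*"],
    "version": {
      "conventionalCommits": true
    }
  }
}
```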

Learn more about documentation-aware configuration in our blog post [Making Cursor Smarter with an MCP Server For Nx Monorepos](/blog/nx-made-cursor-smarter).

### Cross-Project Dependency Analysis

{% youtube src="https://youtu.be/dRQq_B1HSLA?si=lhHsjRvwgijC1IL8&t=186" title="Nx MCP Now Available for VS Code Copilot" /%}

Understand the impact of changes across your monorepo with questions like:

```
If I change the public API of feat-product-detail, which other projects
might be affected by that change?
```

Your AI assistant can:

- Analyze the project graph to identify direct and indirect dependencies
- Visualize affected projects using the `nx_visualize_graph` tool
- Suggest strategies for refactoring that minimize impact
- Identify which teams would need to be consulted for major changes

This architectural awareness is particularly powerful in larger monorepos where understanding project relationships is crucial for making informed development decisions.

Learn more about dependency analysis in our blog post [Nx MCP Now Available for VS Code Copilot](/blog/nx-mcp-vscode-copilot).
