Feature Request: Integrate AI conversation capability with context awareness #9

Feature Description

Add AI conversation capability to Markon with context awareness, allowing users to interact with AI assistants while viewing/annotating markdown documents.

Use Cases

  1. Document Q&A: Ask questions about the current document content
  2. Annotation Assistance: Get AI help for summarizing, explaining, or expanding selected text
  3. Writing Support: Request AI assistance for note-taking, rephrasing, or translation
  4. Context-Aware Chat: Maintain conversation history with document context

Proposed Integration Approaches

Option 1: Sidebar Chat Panel

  • Add a collapsible chat panel (similar to annotation sidebar)
  • Toggle with keyboard shortcut (e.g., `Ctrl+Shift+A`)
  • Include current document context in AI requests
  • Support markdown rendering in chat responses

Option 2: Selection-Based Interaction

  • Right-click or keyboard shortcut on selected text
  • Quick AI actions: "Summarize", "Explain", "Translate", "Expand"
  • Results displayed as annotations or in a modal

Option 3: Command Palette

  • Press `/` or `Ctrl+K` to open command palette
  • Type AI commands like `/ask`, `/summarize`, `/translate`
  • Send selected text or full document as context

Technical Considerations

Backend Options

  1. OpenAI API Integration

    • Support for GPT-4, GPT-3.5-turbo
    • Configurable API key (environment variable or settings)
    • Streaming responses for better UX
  2. Anthropic Claude API

    • Claude 3 Opus/Sonnet/Haiku support
    • Better for long documents (200K context window)
  3. Local LLM Support

    • Ollama integration
    • LM Studio compatibility
    • Privacy-friendly for sensitive documents
  4. Pluggable Architecture

    • Provider abstraction layer (see the trait sketch after this list)
    • Support multiple AI backends
    • Allow custom endpoints
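
A provider abstraction could look roughly like the following. This is a minimal sketch; the trait and type names (`AIProvider`, `ChatMessage`, `CustomEndpoint`) are illustrative, not an existing Markon API:

```rust
use std::error::Error;

/// One message in a conversation (role is "system", "user", or "assistant").
pub struct ChatMessage {
    pub role: String,
    pub content: String,
}

/// Abstraction over AI backends (OpenAI, Anthropic, Ollama, custom endpoints).
pub trait AIProvider {
    /// Provider identifier, e.g. "openai" or "ollama".
    fn name(&self) -> &str;

    /// Send a conversation and return the assistant's reply.
    /// A streaming variant could be added later for incremental UI updates.
    fn complete(&self, messages: &[ChatMessage]) -> Result<String, Box<dyn Error>>;
}

/// Example backend stub for a custom endpoint.
pub struct CustomEndpoint {
    pub base_url: String,
    pub api_key: Option<String>,
}

impl AIProvider for CustomEndpoint {
    fn name(&self) -> &str {
        "custom"
    }

    fn complete(&self, _messages: &[ChatMessage]) -> Result<String, Box<dyn Error>> {
        // A real implementation would POST the messages to `self.base_url`.
        Ok(String::from("(stubbed response)"))
    }
}
```

Concrete OpenAI/Anthropic/Ollama implementations would then live behind this trait, so the chat UI never needs to know which backend is configured.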

Context Management

```rust
struct AIContext {
    document_path: String,
    document_content: String,               // Full or relevant sections
    selected_text: Option<String>,          // Text highlighted by the user, if any
    conversation_history: Vec<ChatMessage>,
    annotations: Vec<Annotation>,           // Existing annotations as context
}
```
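
For illustration, a small helper that flattens such a context into a single prompt string might look like this (an assumed sketch, not existing code; a real version would also respect `max_context_tokens`):

```rust
impl AIContext {
    /// Build a plain-text prompt from the context.
    /// conversation_history and annotations could also be appended here.
    fn to_prompt(&self, question: &str) -> String {
        let mut prompt = format!(
            "Document: {}\n\n{}\n\n",
            self.document_path, self.document_content
        );
        if let Some(sel) = &self.selected_text {
            prompt.push_str(&format!("Selected text:\n{}\n\n", sel));
        }
        prompt.push_str(&format!("Question: {}", question));
        prompt
    }
}
```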

Configuration

```toml
# ~/.config/markon/config.toml or .markon.toml

[ai]
enabled = true
provider = "openai" # openai, anthropic, ollama, custom
api_key = "sk-..." # Or use MARKON_AI_API_KEY env var
model = "gpt-4"
max_context_tokens = 8000
temperature = 0.7

[ai.shortcuts]
toggle_chat = "Ctrl+Shift+A"
quick_ask = "Ctrl+Shift+Q"
summarize_selection = "Ctrl+Shift+S"
```
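
Loading this file could be a thin serde layer, sketched below under the assumption that the `serde` and `toml` crates are used; the struct and field names mirror the `[ai]` table above and are illustrative:

```rust
use serde::Deserialize;
use std::collections::HashMap;

/// Mirrors the `[ai]` table above; names are illustrative.
#[derive(Deserialize)]
struct AiConfig {
    enabled: bool,
    provider: String,            // "openai", "anthropic", "ollama", "custom"
    api_key: Option<String>,     // May be omitted in favour of the env var
    model: String,
    max_context_tokens: usize,
    temperature: f32,
    #[serde(default)]
    shortcuts: HashMap<String, String>, // [ai.shortcuts] table
}

#[derive(Deserialize)]
struct Config {
    ai: AiConfig,
}

fn load_config(path: &std::path::Path) -> Result<Config, Box<dyn std::error::Error>> {
    let text = std::fs::read_to_string(path)?;
    Ok(toml::from_str(&text)?)
}
```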

Privacy & Security

  • API keys stored securely (system keychain or env vars; see the resolution sketch after this list)
  • Option to disable AI for sensitive documents
  • Local processing option (Ollama/LM Studio)
  • Explicit user consent before sending data
  • Configurable data retention policy
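
One possible key-resolution order is sketched below: the `MARKON_AI_API_KEY` environment variable (mentioned in the configuration example) takes precedence over the config file, with a keychain lookup left as a future insertion point. This is an assumption about the design, not settled behaviour:

```rust
/// Resolve the API key: env var first, then the config file.
/// A system-keychain lookup could be inserted between the two steps.
fn resolve_api_key(config_key: Option<&str>) -> Option<String> {
    std::env::var("MARKON_AI_API_KEY")
        .ok()
        .filter(|k| !k.is_empty())
        .or_else(|| config_key.map(str::to_owned))
}
```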

UI/UX Features

Chat Interface

  • Markdown rendering for responses
  • Code syntax highlighting in AI responses
  • Copy response to clipboard
  • Insert response as annotation
  • Export conversation history

Visual Indicators

  • Show when AI is processing (loading spinner)
  • Token usage display (for API-based providers)
  • Context length indicator
  • Error handling with user-friendly messages

Implementation Phases

Phase 1: Basic Integration (MVP)

  • OpenAI API integration
  • Simple chat sidebar
  • Send selected text or full document as context
  • Basic configuration (API key)

Phase 2: Enhanced Features

  • Multiple AI provider support
  • Context-aware suggestions
  • Annotation integration (AI-generated notes)
  • Conversation history persistence

Phase 3: Advanced Features

  • Local LLM support (Ollama)
  • Custom prompts/templates
  • Multi-document context
  • Collaborative AI sessions (shared mode)

Example User Workflows

Workflow 1: Document Understanding

  1. Open a complex markdown file
  2. Press `Ctrl+Shift+A` to open AI chat
  3. Ask: "What is the main topic of this document?"
  4. AI provides summary with document context

Workflow 2: Annotation Enhancement

  1. Select technical jargon in document
  2. Press `Ctrl+Shift+S` for quick summarize
  3. AI explains the term in simple language
  4. Save explanation as annotation

Workflow 3: Writing Assistant

  1. Writing notes in annotation
  2. Ask AI: "Rephrase this more professionally"
  3. AI provides alternatives
  4. Choose and apply to annotation

Related Tools for Inspiration

Open Questions

  1. Default provider: Should we recommend one provider, or leave it completely open?
  2. Pricing concerns: Should we include usage cost estimates?
  3. Offline mode: Priority for local LLM support?
  4. Multi-language: Should AI respect document language for responses?
  5. Streaming UI: Real-time streaming vs. complete response?

Community Input Welcome

This is a significant feature addition. Community feedback is welcome on:

  • Preferred AI providers
  • Most valuable use cases
  • Privacy concerns
  • UI/UX preferences
  • Configuration complexity tolerance

Priority: Medium (nice-to-have, not critical)
Complexity: High (requires significant backend and frontend work)
