connect all updated code #1

Open

SIBAM890 wants to merge 48 commits into shayannab:main from SIBAM890:main

Conversation


@SIBAM890 commented Jan 28, 2026

Summary by Sourcery

Integrate file-based inventory context and improved AI workflow generation/visualization with a refined dashboard, builder UX, and WhatsApp/simulation robustness.

New Features:

  • Allow users to upload CSV/Excel files to provide inventory context for AI-generated workflows and expose this via new backend upload APIs and frontend workflowApi methods.
  • Enhance the builder chat interface to support file attachments, pass file context into workflow generation, and display attachment status.
  • Add a deployment flow in the builder to name and mark automations as live with corresponding UI state changes.
  • Introduce business-type presets and richer active automation listing on the dashboard to guide users into the builder.
  • Support AI-generated workflow explanations with structured responses returned as plain text summaries.
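
The frontend side of the upload feature above can be sketched roughly as follows. This is a hedged illustration: the `/upload` path, the `uploadFile` name, and the response keys are assumptions based on this summary, not the project's confirmed contract.

```javascript
// Hypothetical sketch of workflowApi.uploadFile: wrap the selected file in
// multipart/form-data and POST it to an assumed /upload endpoint.
// Endpoint path and response keys are illustrative, not confirmed.
async function uploadFile(file, fetchImpl = fetch) {
  const form = new FormData();
  form.append('file', file);
  const res = await fetchImpl('/upload', { method: 'POST', body: form });
  if (!res.ok) throw new Error(`Upload failed with status ${res.status}`);
  return res.json(); // e.g. { columns, preview, rowCount }
}
```

The returned columns/preview can then be attached to the chat state and forwarded as file context when generating a workflow.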

Bug Fixes:

  • Stabilize WhatsApp connection handling by ensuring proper reconnection, session cleanup on logout, and safe message extraction before engine processing.
  • Prevent frontend crashes by enforcing plain-text output for workflow explanations and handling various AI response shapes in the explanation component.
  • Improve engine safety by normalizing non-string incoming messages and expanding greeting handling.
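
The string-normalization guard described above can be sketched as a small helper (the helper name is made up; the real engine may inline this logic):

```javascript
// Hypothetical helper illustrating the guard: coerce any incoming payload
// to a trimmed string before running intent matching, so non-string
// messages can no longer crash .toLowerCase() downstream.
function normalizeMessageText(messageText) {
  if (typeof messageText !== 'string') {
    messageText = messageText == null ? '' : String(messageText);
  }
  return messageText.trim();
}

// Safe even for numbers, null, or objects:
const lowerMsg = normalizeMessageText(42).toLowerCase();
```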

Enhancements:

  • Refine AI workflow generation to use multiple Gemini models with fallback, accept optional file/inventory context, enforce a node-based schema, and provide a structured fallback workflow on failure.
  • Upgrade workflow visualization to support both legacy step-based and new node-based workflows, infer edges from explicit connections or fall back to linear linking, and better categorize nodes.
  • Modernize the dashboard and ROI cards with updated layout, metrics, and visual design focused on query resolution analytics.
  • Expand the node palette by dynamically loading a large library of tools from JSON and updating search/filtering across combined core and dynamic categories.
  • Polish the builder header and side panel UX, including conditional controls based on deployment state and clearer assistant labeling.
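
The multi-model fallback in the AI service can be sketched like this. The model list, the `generateFn` callback, and the fallback shape are stand-ins for illustration; the real service calls the Gemini SDK directly.

```javascript
// Hedged sketch of trying several models in order, sanitizing the response,
// and returning a structured fallback workflow if every attempt fails.
async function generateWithFallback(models, prompt, generateFn) {
  for (const model of models) {
    try {
      const raw = await generateFn(model, prompt);
      // Strip any markdown fencing by slicing out the outermost JSON object.
      const parsed = JSON.parse(raw.slice(raw.indexOf('{'), raw.lastIndexOf('}') + 1));
      if (Array.isArray(parsed.nodes) && parsed.nodes.length > 0) return parsed;
    } catch (err) {
      console.warn(`Model ${model} failed: ${err.message}`);
    }
  }
  // Structured fallback so the frontend always receives a renderable workflow.
  return { nodes: [{ id: 'start', type: 'trigger', label: 'Default workflow' }] };
}
```

The key design point is that every exit path yields a `{ nodes }` shape, so the visualization layer never has to special-case AI failures.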

Build:

  • Add CSV and Excel parsing dependencies to the backend for file-based context ingestion.

Sibam Prasad Sahoo added 2 commits January 27, 2026 23:21

sourcery-ai bot commented Jan 28, 2026

Reviewer's Guide

Connects updated AI workflow generation, file upload, WhatsApp integration, and UI enhancements into a cohesive automation builder experience, adding inventory-aware workflows, deployment flows, richer toolbox, and safer engine/visualization behaviors.

Sequence diagram for inventory-aware workflow generation with file upload

sequenceDiagram
    actor User
    participant Frontend_ChatInterface as ChatInterface
    participant Frontend_workflowApi as workflowApi
    participant Backend_API as Express_API
    participant Backend_UploadController as uploadController
    participant Backend_WorkflowController as workflowController
    participant Backend_AIService as aiService
    participant Google_Gemini as Google_GenerativeAI

    User->>Frontend_ChatInterface: Select file to upload
    Frontend_ChatInterface->>Frontend_workflowApi: uploadFile(file)
    Frontend_workflowApi->>Backend_API: POST /upload (multipart file)
    Backend_API->>Backend_UploadController: uploadFile middleware chain
    Backend_UploadController->>Backend_UploadController: Store file to /uploads
    Backend_UploadController->>Backend_UploadController: Parse via xlsx.sheet_to_json
    Backend_UploadController-->>Backend_API: { columns, preview, rowCount, filename }
    Backend_API-->>Frontend_workflowApi: JSON response
    Frontend_workflowApi-->>Frontend_ChatInterface: { columns, preview, rowCount }
    Frontend_ChatInterface->>Frontend_ChatInterface: Save uploadedFile.context

    User->>Frontend_ChatInterface: Enter workflow description
    Frontend_ChatInterface->>Frontend_workflowApi: generate(description, fileContext)
    Frontend_workflowApi->>Backend_API: POST /generate-workflow { userPrompt, fileContext }
    Backend_API->>Backend_WorkflowController: createWorkflow
    Backend_WorkflowController->>Backend_AIService: generateWorkflow(userPrompt, fileContext)

    Backend_AIService->>Backend_AIService: Extract inventoryColumns, sampleData
    Backend_AIService->>Google_Gemini: generateContent(prompt with context)
    Google_Gemini-->>Backend_AIService: JSON-like text
    Backend_AIService->>Backend_AIService: Clean markdown fences
    Backend_AIService->>Backend_AIService: JSON.parse(cleanJson)
    Backend_AIService->>Backend_AIService: validateWorkflow(parsed)
    alt Valid_nodes
        Backend_AIService-->>Backend_WorkflowController: { nodes }
    else All_models_fail_or_invalid
        Backend_AIService-->>Backend_WorkflowController: generateFallbackWorkflow()
    end

    Backend_WorkflowController-->>Backend_API: { success, workflow }
    Backend_API-->>Frontend_workflowApi: JSON workflow
    Frontend_workflowApi-->>Frontend_ChatInterface: { workflow }
    Frontend_ChatInterface-->>User: Confirmation message
    Frontend_ChatInterface-->>Frontend_ChatInterface: onWorkflowGenerated(workflow)

Sequence diagram for WhatsApp message processing through engine and simulator

sequenceDiagram
    actor WhatsApp_User
    participant WhatsApp_Server as WhatsApp
    participant Baileys_Socket as WA_Socket
    participant Backend_Engine as engineService

    rect rgb(240,240,255)
        WhatsApp_User->>WhatsApp_Server: Send message
        WhatsApp_Server-->>Baileys_Socket: messages.upsert event
        Baileys_Socket->>Baileys_Socket: Extract userMessage and sender
        alt Has_text_and_not_fromMe
            Baileys_Socket->>Backend_Engine: processMessage(sock, sender, userMessage)
            Backend_Engine->>Backend_Engine: Normalize messageText to string
            Backend_Engine->>Backend_Engine: lowerMsg = messageText.toLowerCase().trim()
            alt Greeting_match
                Backend_Engine->>Baileys_Socket: sendMessage(sender, greeting_reply)
            else Other_logic
                Backend_Engine->>Baileys_Socket: sendMessage(sender, business_reply)
            end
        else Non_text_or_fromMe
            Baileys_Socket->>Baileys_Socket: Ignore message
        end
    end

    rect rgb(240,255,240)
        participant Frontend_TestMode as TestMode_UI
        participant Frontend_workflowApi as workflowApi
        participant Backend_API as Express_API

        Frontend_TestMode->>Frontend_workflowApi: simulateMessage(message)
        Frontend_workflowApi->>Backend_API: POST /simulate-message { message }
        Backend_API->>Backend_Engine: processMessage(mockSock, TestUser, message)
        Backend_Engine->>Backend_Engine: Same processing as live flow
        Backend_Engine->>Backend_API: mockSock.sendMessage captures botReply
        Backend_API-->>Frontend_workflowApi: { success, reply: botReply }
        Frontend_workflowApi-->>Frontend_TestMode: botReply
    end

Class diagram for core backend and frontend modules

classDiagram
    class ai_service {
        +generateWorkflow(userPrompt, fileContext)
        +explainWorkflow(workflowJson)
        -validateWorkflow(parsed)
        -generateFallbackWorkflow(reason)
        -genAI
    }

    class workflow_controller {
        +createWorkflow(req, res)
        +explainWorkflow(req, res)
    }

    class upload_controller {
        +uploadFile(req, res, next)
        -storage
        -upload
    }

    class file_service {
        +parseFile(filePath, mimetype)
    }

    class engine_service {
        +processMessage(sock, sender, messageText)
        -activeWorkflow
    }

    class whatsapp_service {
        +connectToWhatsApp()
    }

    class workflowApi {
        +generate(description, fileContext)
        +uploadFile(file)
    }

    class ChatInterface_Component {
        -messages
        -isLoading
        -uploadedFile
        -messagesEndRef
        -fileInputRef
        +handleFileUpload(event)
        +clearFile()
        +handleSend(text)
    }

    class Builder_Page {
        -workflow
        -isTestOpen
        -isCustomMode
        -isDeployOpen
        -isDeployed
        -businessName
        -tempName
        +handleWorkflowGenerated(data)
        +handleDeploy()
    }

    class WorkflowGraph_Component {
        -workflowData
        -nodes
        -edges
        +useEffect_onWorkflowData()
    }

    class NodePalette_Component {
        -searchQuery
        -expandedCategories
        -allCategories
        +onDragStart(event, tool)
        +toggleCategory(catId)
    }

    %% Relationships
    workflow_controller --> ai_service : uses
    workflow_controller --> upload_controller : separate_route
    upload_controller --> file_service : may_use
    whatsapp_service --> engine_service : calls_processMessage
    engine_service --> whatsapp_service : indirect_via_sock

    workflowApi --> ChatInterface_Component : used_by
    Builder_Page --> ChatInterface_Component : renders
    Builder_Page --> WorkflowGraph_Component : renders
    Builder_Page --> NodePalette_Component : renders
    ChatInterface_Component --> workflowApi : calls_generate_and_upload
    WorkflowGraph_Component --> engine_service : relies_on_workflow_format

File-Level Changes

Refactored AI workflow generation and explanation to support inventory/file context, multiple Gemini model fallbacks, structured nodes, and safer fallbacks.
  • Introduced workflow validation and fallback generator utilities to normalize AI responses into a nodes array and provide a default workflow on failure.
  • Extended generateWorkflow to accept optional fileContext, derive inventory columns/sample data for prompt context, iterate over multiple Gemini models with logging and JSON sanitization, and return a normalized nodes structure.
  • Rewrote explainWorkflow to enforce plain-text bullet explanations with emojis, truncate workflow JSON in prompt, and return a stable { explanation } payload with error fallback.
backend/src/services/ai.service.js
Enhanced frontend dashboard, builder, and visualization to support business onboarding, deployment state, richer metrics, and AI-driven node/edge mapping.
  • Redesigned Dashboard header and main content with business-type cards, improved ROI section, and updated active automations list styling.
  • Updated Builder to manage deployment state (business name, modal, live indicator), conditionally show customization/test controls, and adjust assistant title.
  • Augmented WorkflowGraph to accept both nodes/steps formats, map node metadata more flexibly, and build edges using next/true_id/false_id/outputs with a linear fallback layout.
  • Modernized ROIDashboard visuals and metrics to focus on queries vs resolutions with new icons and area charts.
autoflow-frontend/src/pages/Dashboard.jsx
autoflow-frontend/src/pages/Builder.jsx
autoflow-frontend/src/components/visualization/WorkflowGraph.jsx
autoflow-frontend/src/components/dashboard/ROIDashboard.jsx
Added inventory/file upload flow and wiring between frontend chat, backend upload parsing, and AI workflow generation.
  • Extended ChatInterface with file upload UI (paperclip button, status chip), backend upload invocation, and passing of fileContext into workflow generation calls.
  • Added workflowApi helpers for generate(description, fileContext) and uploadFile(file) using multipart/form-data.
  • Implemented backend upload.controller using Multer and xlsx to store, parse Excel/CSV files, compute columns/preview, and return context for AI.
  • Introduced file.service (CSV/XLSX parsing into text context) and updated workflow controller to forward fileContext into aiService.generateWorkflow.
autoflow-frontend/src/components/builder/ChatInterface.jsx
autoflow-frontend/src/services/workflowApi.js
backend/src/controllers/upload.controller.js
backend/src/services/file.service.js
backend/src/controllers/workflow.controller.js
Stabilized WhatsApp and engine simulation pipeline, including Baileys reconnect behavior, message parsing, and simulation API.
  • Reworked WhatsApp connection to use printQRInTerminal, robust connection.update handling with session clearing on logout, and safer messages.upsert extraction and engine invocation.
  • Expanded engineService.processMessage greeting handling, ensuring message text is string-normalized and trimmed before intent logic.
  • Hardened /simulate-message route to use a mock socket, capture replies via sendMessage, and wrap engineService invocation in try/catch with clearer error responses.
backend/src/services/whatsapp.service.js
backend/src/services/engine.service.js
backend/src/routes/api.routes.js
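
The mock-socket capture pattern used by the hardened /simulate-message route can be sketched as follows; the engine stand-in and the jid handling are illustrative, not the project's exact code.

```javascript
// Hedged sketch: the simulator passes a fake socket whose sendMessage simply
// records the reply, so the engine runs unchanged without touching WhatsApp.
async function simulateMessage(processMessage, message) {
  let botReply = null;
  const mockSock = {
    sendMessage: async (_jid, content) => { botReply = content.text; },
  };
  try {
    await processMessage(mockSock, 'TestUser', message);
  } catch (err) {
    // Mirror the route's try/catch wrapping with a clearer error response.
    return { success: false, error: err.message };
  }
  return { success: true, reply: botReply };
}
```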
Expanded node palette with a large external tools library and improved drag-and-drop metadata handling.
  • Imported a 500+ tools catalog from tools.json and merged it with existing TOOL_CATEGORIES via useMemo grouping by category.
  • Extended icon mapping to cover many more tool types and wired search/filtering over the combined categories, updating the header badge count.
  • Updated drag start handler to pass both the tool.id via application/reactflow and serialized tool data via application/toolData for richer drop handling.
autoflow-frontend/src/components/builder/NodePalette.jsx
autoflow-frontend/src/data/tools.json
Made AI explanation consumer more robust to different response payload shapes.
  • Adjusted AIExplanation to accept either string or object responses from the backend, building human-readable text from summary/steps/explanation fields as available and logging errors.
  • Ensured rendered explanation handles both primitive and object forms via conditional JSON stringification.
autoflow-frontend/src/components/builder/AIExplanation.jsx
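
The shape-tolerant handling can be sketched as a small pure function. The field names mirror those listed above (summary/steps/explanation); the helper name is made up.

```javascript
// Illustrative helper: accept either a string or an object payload and
// always return plain text, falling back to safe stringification.
function toExplanationText(payload) {
  if (typeof payload === 'string') return payload;
  if (payload && typeof payload === 'object') {
    if (typeof payload.explanation === 'string') return payload.explanation;
    if (typeof payload.summary === 'string') return payload.summary;
    if (Array.isArray(payload.steps)) return payload.steps.join('\n');
    return JSON.stringify(payload, null, 2); // last resort: render as JSON
  }
  return 'No explanation available.';
}
```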
Added backend dependencies required for file upload and parsing.
  • Included csv-parse, multer, and xlsx packages in backend package.json for CSV/Excel ingestion and multipart handling.
backend/package.json

Tips and commands

Interacting with Sourcery

  • Trigger a new review: Comment @sourcery-ai review on the pull request.
  • Continue discussions: Reply directly to Sourcery's review comments.
  • Generate a GitHub issue from a review comment: Ask Sourcery to create an
    issue from a review comment by replying to it. You can also reply to a
    review comment with @sourcery-ai issue to create an issue from it.
  • Generate a pull request title: Write @sourcery-ai anywhere in the pull
    request title to generate a title at any time. You can also comment
    @sourcery-ai title on the pull request to (re-)generate the title at any time.
  • Generate a pull request summary: Write @sourcery-ai summary anywhere in
    the pull request body to generate a PR summary at any time exactly where you
    want it. You can also comment @sourcery-ai summary on the pull request to
    (re-)generate the summary at any time.
  • Generate reviewer's guide: Comment @sourcery-ai guide on the pull
    request to (re-)generate the reviewer's guide at any time.
  • Resolve all Sourcery comments: Comment @sourcery-ai resolve on the
    pull request to resolve all Sourcery comments. Useful if you've already
    addressed all the comments and don't want to see them anymore.
  • Dismiss all Sourcery reviews: Comment @sourcery-ai dismiss on the pull
    request to dismiss all existing Sourcery reviews. Especially useful if you
    want to start fresh with a new review - don't forget to comment
    @sourcery-ai review to trigger a new review!

Customizing Your Experience

Access your dashboard to:

  • Enable or disable review features such as the Sourcery-generated pull request
    summary, the reviewer's guide, and others.
  • Change the review language.
  • Add, remove or edit custom review instructions.
  • Adjust other review settings.


@sourcery-ai bot left a comment


Hey - I've found 6 issues and left some high-level feedback:

  • The file upload response shape doesn’t match frontend expectations: upload.controller returns filename/originalName and no context, but ChatInterface reads data.fileName and data.context, so the keys and structure should be aligned on both sides.
  • In workflowApi.uploadFile and ChatInterface.handleFileUpload, you treat the upload response as if it were a ready-to-send fileContext, but the backend returns columns, preview, and counts; either wrap that into a context object on the backend or adjust the frontend to pass the actual shape expected by generateWorkflow.
  • backend/src/services/file.service.js is added but not used anywhere in the routes/controllers; consider either wiring it into the upload flow (to centralize parsing/cleanup) or removing it to avoid dead code.
Prompt for AI Agents
Please address the comments from this code review:

## Overall Comments
- The file upload response shape doesn’t match frontend expectations: `upload.controller` returns `filename`/`originalName` and no `context`, but `ChatInterface` reads `data.fileName` and `data.context`, so the keys and structure should be aligned on both sides.
- In `workflowApi.uploadFile` and `ChatInterface.handleFileUpload`, you treat the upload response as if it were a ready-to-send `fileContext`, but the backend returns `columns`, `preview`, and counts; either wrap that into a `context` object on the backend or adjust the frontend to pass the actual shape expected by `generateWorkflow`.
- `backend/src/services/file.service.js` is added but not used anywhere in the routes/controllers; consider either wiring it into the upload flow (to centralize parsing/cleanup) or removing it to avoid dead code.

## Individual Comments

### Comment 1
<location> `autoflow-frontend/src/components/builder/ChatInterface.jsx:31-32` </location>
<code_context>
+
+        setIsLoading(true);
+        try {
+            const data = await workflowApi.uploadFile(file);
+            setUploadedFile({ name: data.fileName, context: data.context });
+            setMessages(prev => [...prev, {
+                role: 'system',
</code_context>

<issue_to_address>
**issue (bug_risk):** Upload response shape doesn’t match how the frontend is consuming it, so file name and context will be undefined.

`uploadFile` returns keys like `filename`, `originalName`, `columns`, `preview`, but this code reads `data.fileName` and `data.context`. That will render the “Using: …” label as `undefined` and pass `undefined` into `workflowApi.generate`, so the uploaded data is never used. Please align the API and client contract (e.g. have the controller return `fileName`/`context`, or update the frontend to use the existing `filename`/`preview`/`columns` shape consistently).
</issue_to_address>

### Comment 2
<location> `autoflow-frontend/src/components/builder/ChatInterface.jsx:37-38` </location>
<code_context>
+            }
+
+            setExplanation(explanationText);
         } catch (error) {
+            console.error(error);
             setExplanation("Failed to generate explanation.");
</code_context>

<issue_to_address>
**issue (bug_risk):** Operator precedence in the error message will almost always ignore the `error.message` fallback.

Because `+` has higher precedence than `||`, the expression is effectively:

```js
("❌ Upload Failed: " + error.response?.data?.error) || error.message;
```
Any non-empty string is truthy, so the fallback to `error.message` is never used (you’ll even see `"❌ Upload Failed: undefined"`). Use parentheses or a template literal so the fallback actually applies:

```js
content: `❌ Upload Failed: ${error.response?.data?.error || error.message}`
```
</issue_to_address>

### Comment 3
<location> `autoflow-frontend/src/components/visualization/WorkflowGraph.jsx:86-95` </location>
<code_context>
-                    style: { stroke: '#555', strokeWidth: 2 }
-                };
+                // B. Handle Condition Branches (true_id / false_id)
+                if (node.data?.true_id) {
+                    aiEdges.push({
+                        id: `e${sourceId}-${node.data.true_id}-yes`,
+                        source: sourceId,
+                        target: node.data.true_id,
+                        sourceHandle: 'bottom', // Conditions usually split down
+                        targetHandle: 'left',
+                        label: 'Yes/True',
+                        animated: true,
+                        type: 'default',
+                        style: { stroke: 'green' }
+                    });
+                }
+                if (node.data?.false_id) {
+                    aiEdges.push({
+                        id: `e${sourceId}-${node.data.false_id}-no`,
</code_context>

<issue_to_address>
**issue (bug_risk):** Condition edges look for `true_id`/`false_id` under `data`, but the AI contract puts them at the top level.

Given the workflow prompt, `true_id` and `false_id` are top-level fields on the node, not under `data`, so branches produced by the AI won’t render with the current checks. This should instead read from the node root, e.g.:

```js
if (node.true_id) { /* ... */ }
if (node.false_id) { /* ... */ }
```

If you need to support older data, you could keep the `data.*` checks as a fallback.
</issue_to_address>

### Comment 4
<location> `backend/src/controllers/upload.controller.js:30-39` </location>
<code_context>
+                return res.status(400).json({ error: "No file uploaded" });
+            }
+
+            const filePath = req.file.path;
+            const fileExt = path.extname(req.file.originalname).toLowerCase();
+            let data = [];
+
+            // Parse Excel or CSV
+            if (fileExt === '.xlsx' || fileExt === '.xls' || fileExt === '.csv') {
+                const workbook = xlsx.readFile(filePath);
+                const sheetName = workbook.SheetNames[0]; // Read first sheet
+                const sheet = workbook.Sheets[sheetName];
+                data = xlsx.utils.sheet_to_json(sheet);
+            }
+
+            // Get columns for AI context
+            const columns = data.length > 0 ? Object.keys(data[0]) : [];
+            const preview = data.slice(0, 5); // Send first 5 rows as sample
+
+            res.json({
+                filename: req.file.filename,
+                originalName: req.file.originalname,
</code_context>

<issue_to_address>
**suggestion (performance):** Uploaded files are never deleted after parsing, which can accumulate unnecessary disk usage.

The file is read via `xlsx.readFile(filePath)` but never removed, so repeated uploads will grow the `uploads` directory. Since you only need the parsed JSON, you can safely delete the file after a successful parse:

```js
const workbook = xlsx.readFile(filePath);
// ...parse...
fs.unlink(filePath, err => {
  if (err) console.error('Failed to delete uploaded file:', err);
});
```

This prevents unnecessary disk usage, especially with large or frequent uploads.

Suggested implementation:

```javascript
            // Parse Excel or CSV
            if (fileExt === '.xlsx' || fileExt === '.xls' || fileExt === '.csv') {
                const workbook = xlsx.readFile(filePath);
                const sheetName = workbook.SheetNames[0]; // Read first sheet
                const sheet = workbook.Sheets[sheetName];
                data = xlsx.utils.sheet_to_json(sheet);
            }

            // Clean up uploaded file after parsing to avoid disk accumulation
            fs.unlink(filePath, (err) => {
                if (err) {
                    console.error("Failed to delete uploaded file:", err);
                }
            });

            // Get columns for AI context
            const columns = data.length > 0 ? Object.keys(data[0]) : [];
            const preview = data.slice(0, 5); // Send first 5 rows as sample

            res.json({

```

1. Ensure `fs` is imported at the top of `backend/src/controllers/upload.controller.js`, e.g.:
   `const fs = require('fs');`
2. If you have other upload handlers or temporary-file flows in this file, consider applying the same `fs.unlink` cleanup pattern there as well.
</issue_to_address>

### Comment 5
<location> `autoflow-frontend/src/components/visualization/WorkflowGraph.jsx:37` </location>
<code_context>

     useEffect(() => {
         scrollToBottom();
-    }, [messages, isLoading]);
</code_context>

<issue_to_address>
**issue (complexity):** Consider extracting node normalization, node mapping, and edge-building into pure helper functions plus a shared edge factory so the useEffect becomes a small orchestration layer.

You can keep the new behavior and significantly reduce complexity by extracting a few pure helpers and a shared edge factory. That will make the `useEffect` much shorter and easier to test.

### 1. Normalize and map nodes outside the effect

Move the “nodes vs steps” handling and layout logic into helpers:

```ts
function normalizeIncomingNodes(workflowData?: any) {
  return workflowData?.nodes || workflowData?.steps || [];
}

function mapNodesToReactFlow(incomingNodes: any[]) {
  return incomingNodes.map((node, index) => {
    const isFirst = index === 0;

    return {
      id: node.id || generateId(),
      type: 'custom',
      position: node.position || {
        x: isFirst ? 50 : 450,
        y: isFirst ? 250 : 100 + ((index - 1) * 150),
      },
      data: {
        label: node.data?.label || node.label || 'Step',
        type:
          node.type ||
          node.data?.type ||
          (node.label?.includes('?') ? 'condition' : 'action'),
        category: isFirst ? 'AutoFlow' : 'AI',
      },
    };
  });
}
```

### 2. DRY edge creation with a small factory

Most edges share the same shape. Use a helper to centralize defaults:

```ts
function createEdge(
  source: string,
  target: string,
  overrides: Partial<Edge> = {}
) {
  return {
    id: overrides.id || `e${source}-${target}`,
    source,
    target,
    sourceHandle: 'right',
    targetHandle: 'left',
    animated: true,
    type: 'default',
    ...overrides,
  };
}
```

### 3. Split edge strategies into pure functions

Each edge-building strategy can be a separate pure function using `createEdge`:

```ts
function buildEdgesFromNext(incomingNodes: any[]) {
  const edges: Edge[] = [];
  incomingNodes.forEach((node: any) => {
    const sourceId = node.id;
    if (!sourceId || !Array.isArray(node.next)) return;

    node.next.forEach((targetId: string) => {
      edges.push(createEdge(sourceId, targetId));
    });
  });
  return edges;
}

function buildEdgesFromConditions(incomingNodes: any[]) {
  const edges: Edge[] = [];
  incomingNodes.forEach((node: any) => {
    const sourceId = node.id;
    if (!sourceId || !node.data) return;

    if (node.data.true_id) {
      edges.push(
        createEdge(sourceId, node.data.true_id, {
          id: `e${sourceId}-${node.data.true_id}-yes`,
          sourceHandle: 'bottom',
          label: 'Yes/True',
          style: { stroke: 'green' },
        })
      );
    }

    if (node.data.false_id) {
      edges.push(
        createEdge(sourceId, node.data.false_id, {
          id: `e${sourceId}-${node.data.false_id}-no`,
          label: 'No/False',
          style: { stroke: 'red' },
        })
      );
    }
  });
  return edges;
}

function buildEdgesFromOutputs(incomingNodes: any[]) {
  const edges: Edge[] = [];
  incomingNodes.forEach((node: any) => {
    const sourceId = node.id;
    const outputs = node.data?.outputs;
    if (!sourceId || typeof outputs !== 'object') return;

    Object.entries(outputs).forEach(([intent, targetId]) => {
      edges.push(
        createEdge(sourceId, String(targetId), {
          id: `e${sourceId}-${targetId}-${intent}`,
          label: intent,
        })
      );
    });
  });
  return edges;
}

function buildFallbackEdges(aiNodes: Node[]) {
  const edges: Edge[] = [];
  for (let i = 0; i < aiNodes.length - 1; i++) {
    const current = aiNodes[i];
    const next = aiNodes[i + 1];
    edges.push(
      createEdge(current.id, next.id, {
        id: `e-fallback-${current.id}-${next.id}`,
        style: { stroke: '#999', strokeDasharray: '5,5' },
      })
    );
  }
  return edges;
}
```

### 4. Use the helpers in the `useEffect`

The `useEffect` then becomes orchestration-only:

```ts
useEffect(() => {
  const incomingNodes = normalizeIncomingNodes(workflowData);

  if (!Array.isArray(incomingNodes) || incomingNodes.length === 0) return;

  const aiNodes = mapNodesToReactFlow(incomingNodes);

  let aiEdges = [
    ...buildEdgesFromNext(incomingNodes),
    ...buildEdgesFromConditions(incomingNodes),
    ...buildEdgesFromOutputs(incomingNodes),
  ];

  if (aiEdges.length === 0 && aiNodes.length > 1) {
    aiEdges = buildFallbackEdges(aiNodes);
  }

  setNodes(aiNodes);
  setEdges(aiEdges);
}, [workflowData, setNodes, setEdges]);
```

This keeps all existing behavior (support for `nodes`/`steps`, layout, `next`, `true_id`/`false_id`, `outputs`, and fallback edges) but significantly lowers complexity and makes each behavior testable and maintainable in isolation.
</issue_to_address>

### Comment 6
<location> `autoflow-frontend/src/components/builder/NodePalette.jsx:21` </location>
<code_context>
+    const [expandedCategories, setExpandedCategories] = useState(['core']);
+
+    // Merge Core Logic with 500+ Tools dynamically
+    const allCategories = useMemo(() => {
+        // 1. Get Core Logic (Triggers, Conditions, etc.)
+        const coreLogic = TOOL_CATEGORIES[0];
</code_context>

<issue_to_address>
**issue (complexity):** Consider extracting the category-building logic, icon map, and drag-data handling into small utilities so `NodePalette` focuses only on rendering and simple wiring.

You can keep all the new behavior while reducing the complexity inside `NodePalette` by pushing data shaping and mappings out of the component.

### 1. Extract `allCategories` construction to a small utility

Right now `useMemo` both normalizes `toolsData` and builds categories, which couples data concerns to the UI. You can move that out and keep the component focused on rendering:

```ts
// utils/buildToolCategories.ts
import { TOOL_CATEGORIES } from '../../constants/tools';
import toolsData from '../../data/tools.json';

export function buildToolCategories() {
  const coreLogic = TOOL_CATEGORIES[0];

  const groupedTools: Record<string, { id: string; label: string; type: string; icon: string }[]> = {};

  toolsData.forEach(tool => {
    if (!groupedTools[tool.category]) {
      groupedTools[tool.category] = [];
    }
    groupedTools[tool.category].push({
      id: tool.id,
      label: tool.name,
      type: 'tool',
      icon: tool.icon,
    });
  });

  const dynamicCategories = Object.keys(groupedTools)
    .sort()
    .map(catName => ({
      id: catName.toLowerCase().replace(/\s+/g, '_'),
      name: catName,
      tools: groupedTools[catName],
    }));

  return [coreLogic, ...dynamicCategories];
}
```

Then in the component:

```tsx
// NodePalette.tsx
import { useMemo } from 'react';
import { buildToolCategories } from '../../utils/buildToolCategories';

export const NodePalette = () => {
  const allCategories = useMemo(() => buildToolCategories(), []);
  // ...
};
```

This keeps the normalization logic testable and reusable and makes the component easier to scan.

### 2. Isolate the icon map

The extended `IconMap` makes this file long and mixes UI with registry concerns. You can move the map into its own module without changing behavior:

```ts
// icons/toolIcons.ts
import {
  MessageSquare, Zap, Divide, Link, Mail, Smartphone, Database, ShoppingCart,
  Globe, CreditCard, Users, Code, Cloud, Clock, Bot, Facebook, Twitter,
  Instagram, Linkedin, Slack, Github, Trello, Calendar, Activity, AlertTriangle,
  Shield, Key, HardDrive, FileText, BarChart, Layout, GitBranch, Video,
  DollarSign, Send, MessageCircle, Headphones, Flame, Box,
} from 'lucide-react';

export const ToolIconMap = {
  MessageSquare,
  Zap,
  Divide,
  Link,
  Mail,
  Smartphone,
  Database,
  ShoppingCart,
  Globe,
  CreditCard,
  Users,
  Code,
  Cloud,
  Clock,
  Bot,
  Facebook,
  Twitter,
  Instagram,
  Linkedin,
  Slack,
  Github,
  Trello,
  Calendar,
  Activity,
  AlertTriangle,
  Shield,
  Key,
  HardDrive,
  FileText,
  BarChart,
  Layout,
  GitBranch,
  Video,
  DollarSign,
  Send,
  MessageCircle,
  Headphones,
  Flame,
  Box,
} as const;
```

And in the palette:

```tsx
import { ToolIconMap as IconMap } from '../../icons/toolIcons';
```

### 3. Tighten `onDragStart` to what’s actually consumed

If `application/toolData` is not yet used by your drop handler, you can keep the behavior ready but avoid the extra payload until it’s needed, or wrap it so the intent is clearer.

**If not currently used:**

```ts
const onDragStart = (event, tool) => {
  event.dataTransfer.setData('application/reactflow', tool.id);
  event.dataTransfer.effectAllowed = 'move';
};
```

**If you want to keep support but decouple it:**

```ts
// utils/setToolDragData.ts
export function setToolDragData(event: DragEvent, tool: any, options?: { includeFullData?: boolean }) {
  event.dataTransfer?.setData('application/reactflow', tool.id);
  if (options?.includeFullData) {
    event.dataTransfer?.setData('application/toolData', JSON.stringify(tool));
  }
  event.dataTransfer!.effectAllowed = 'move';
}
```

```tsx
// NodePalette.tsx
import { setToolDragData } from '../../utils/setToolDragData';

const onDragStart = (event, tool) => {
  setToolDragData(event, tool, { includeFullData: true }); // keep current behavior
};
```

This keeps all functionality but makes the palette itself much simpler: it renders categories, handles search, and delegates data shaping/drag payload details elsewhere.
</issue_to_address>


Comment on lines +31 to +32
const data = await workflowApi.uploadFile(file);
setUploadedFile({ name: data.fileName, context: data.context });

**issue (bug_risk):** Upload response shape doesn’t match how the frontend is consuming it, so the file name and context will be `undefined`.

`uploadFile` returns keys like `filename`, `originalName`, `columns`, and `preview`, but this code reads `data.fileName` and `data.context`. That will render the “Using: …” label as `undefined` and pass `undefined` into `workflowApi.generate`, so the uploaded data is never used. Please align the API and client contract (e.g. have the controller return `fileName`/`context`, or update the frontend to use the existing `filename`/`preview`/`columns` shape consistently).
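One way to align the two sides without touching the backend is a thin adapter on the frontend that maps the controller's actual response onto what the chat UI reads. This is a hypothetical sketch: `adaptUploadResponse` and the exact mapping are assumptions based on the shapes described above, not code from the PR.

```javascript
// Hypothetical adapter: maps the controller's response (filename,
// originalName, columns, preview) onto the { fileName, context } shape
// the chat component currently reads.
function adaptUploadResponse(data) {
  return {
    fileName: data.originalName || data.filename,
    // Serialize columns + sample rows into a single string the
    // workflow-generation prompt can consume as inventory context.
    context: JSON.stringify({ columns: data.columns, preview: data.preview }),
  };
}
```

With something like this in place, `setUploadedFile(adaptUploadResponse(data))` would work against the existing backend contract.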

Comment on lines +37 to +38
} catch (error) {
setMessages(prev => [...prev, { role: 'system', content: "❌ Upload Failed: " + error.response?.data?.error || error.message }]);

**issue (bug_risk):** Operator precedence in the error message will almost always ignore the `error.message` fallback.

Because `+` has higher precedence than `||`, the expression is effectively:

```js
("❌ Upload Failed: " + error.response?.data?.error) || error.message;
```

Any non-empty string is truthy, so the fallback to `error.message` is never used (you’ll even see `"❌ Upload Failed: undefined"`). Use parentheses or a template literal so the fallback actually applies:

```js
content: `❌ Upload Failed: ${error.response?.data?.error || error.message}`
```
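The pitfall is easy to reproduce in isolation. In this sketch, `error` is a stand-in object with no `response` property, mimicking a plain network failure rather than an HTTP error response:

```javascript
// Precedence demo: `+` binds tighter than `||`, so the concatenation
// happens first and the resulting non-empty string short-circuits `||`.
const error = { message: 'network down' }; // no error.response here

// Effectively ("❌ Upload Failed: " + undefined) || error.message
const broken = "❌ Upload Failed: " + error.response?.data?.error || error.message;

// The template literal scopes the fallback correctly.
const fixed = `❌ Upload Failed: ${error.response?.data?.error || error.message}`;

console.log(broken); // "❌ Upload Failed: undefined"
console.log(fixed);  // "❌ Upload Failed: network down"
```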

Comment on lines +86 to +95
if (node.data?.true_id) {
aiEdges.push({
id: `e${sourceId}-${node.data.true_id}-yes`,
source: sourceId,
target: node.data.true_id,
sourceHandle: 'bottom', // Conditions usually split down
targetHandle: 'left',
label: 'Yes/True',
animated: true,
type: 'default',

**issue (bug_risk):** Condition edges look for `true_id`/`false_id` under `data`, but the AI contract puts them at the top level.

Given the workflow prompt, `true_id` and `false_id` are top-level fields on the node, not under `data`, so branches produced by the AI won’t render with the current checks. This should instead read from the node root, e.g.:

```js
if (node.true_id) { /* ... */ }
if (node.false_id) { /* ... */ }
```

If you need to support older data, you could keep the `data.*` checks as a fallback.
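A small accessor can support both layouts at once. `getBranchTargets` is an illustrative helper name, sketched under the assumption that new AI nodes carry top-level ids while older workflows may still nest them under `data`:

```javascript
// Hypothetical helper: prefer the AI contract's top-level branch ids,
// falling back to the legacy data.* location for older workflows.
function getBranchTargets(node) {
  return {
    trueId: node.true_id ?? node.data?.true_id ?? null,
    falseId: node.false_id ?? node.data?.false_id ?? null,
  };
}
```

The edge-building code can then branch on `trueId`/`falseId` without caring which shape the node came in.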

Comment on lines +30 to +39
const filePath = req.file.path;
const fileExt = path.extname(req.file.originalname).toLowerCase();
let data = [];

// Parse Excel or CSV
if (fileExt === '.xlsx' || fileExt === '.xls' || fileExt === '.csv') {
const workbook = xlsx.readFile(filePath);
const sheetName = workbook.SheetNames[0]; // Read first sheet
const sheet = workbook.Sheets[sheetName];
data = xlsx.utils.sheet_to_json(sheet);

**suggestion (performance):** Uploaded files are never deleted after parsing, which can accumulate unnecessary disk usage.

The file is read via `xlsx.readFile(filePath)` but never removed, so repeated uploads will grow the uploads directory. Since you only need the parsed JSON, you can safely delete the file after a successful parse:

```js
const workbook = xlsx.readFile(filePath);
// ...parse...
fs.unlink(filePath, err => {
  if (err) console.error('Failed to delete uploaded file:', err);
});
```

This prevents unnecessary disk usage, especially with large or frequent uploads.

Suggested implementation:

```js
// Parse Excel or CSV
if (fileExt === '.xlsx' || fileExt === '.xls' || fileExt === '.csv') {
    const workbook = xlsx.readFile(filePath);
    const sheetName = workbook.SheetNames[0]; // Read first sheet
    const sheet = workbook.Sheets[sheetName];
    data = xlsx.utils.sheet_to_json(sheet);
}

// Clean up uploaded file after parsing to avoid disk accumulation
fs.unlink(filePath, (err) => {
    if (err) {
        console.error("Failed to delete uploaded file:", err);
    }
});

// Get columns for AI context
const columns = data.length > 0 ? Object.keys(data[0]) : [];
const preview = data.slice(0, 5); // Send first 5 rows as sample

res.json({
```

1. Ensure `fs` is imported at the top of `backend/src/controllers/upload.controller.js`, e.g. `const fs = require('fs');`
2. If you have other upload handlers or temporary-file flows in this file, consider applying the same `fs.unlink` cleanup pattern there as well.

@@ -35,54 +35,118 @@ export const WorkflowGraph = ({ workflowData }) => {

// Handle initial workflow data from AI
useEffect(() => {

**issue (complexity):** Consider extracting node normalization, node mapping, and edge-building into pure helper functions plus a shared edge factory so the `useEffect` becomes a small orchestration layer.

You can keep the new behavior and significantly reduce complexity by extracting a few pure helpers and a shared edge factory. That will make the `useEffect` much shorter and easier to test.

### 1. Normalize and map nodes outside the effect

Move the “nodes vs steps” handling and layout logic into helpers:

```ts
function normalizeIncomingNodes(workflowData?: any) {
  return workflowData?.nodes || workflowData?.steps || [];
}

function mapNodesToReactFlow(incomingNodes: any[]) {
  return incomingNodes.map((node, index) => {
    const isFirst = index === 0;

    return {
      id: node.id || generateId(),
      type: 'custom',
      position: node.position || {
        x: isFirst ? 50 : 450,
        y: isFirst ? 250 : 100 + ((index - 1) * 150),
      },
      data: {
        label: node.data?.label || node.label || 'Step',
        type:
          node.type ||
          node.data?.type ||
          (node.label?.includes('?') ? 'condition' : 'action'),
        category: isFirst ? 'AutoFlow' : 'AI',
      },
    };
  });
}
```

### 2. DRY edge creation with a small factory

Most edges share the same shape. Use a helper to centralize defaults:

```ts
function createEdge(
  source: string,
  target: string,
  overrides: Partial<Edge> = {}
) {
  return {
    id: overrides.id || `e${source}-${target}`,
    source,
    target,
    sourceHandle: 'right',
    targetHandle: 'left',
    animated: true,
    type: 'default',
    ...overrides,
  };
}
```

### 3. Split edge strategies into pure functions

Each edge-building strategy can be a separate pure function using `createEdge`:

```ts
function buildEdgesFromNext(incomingNodes: any[]) {
  const edges: Edge[] = [];
  incomingNodes.forEach((node: any) => {
    const sourceId = node.id;
    if (!sourceId || !Array.isArray(node.next)) return;

    node.next.forEach((targetId: string) => {
      edges.push(createEdge(sourceId, targetId));
    });
  });
  return edges;
}

function buildEdgesFromConditions(incomingNodes: any[]) {
  const edges: Edge[] = [];
  incomingNodes.forEach((node: any) => {
    const sourceId = node.id;
    if (!sourceId || !node.data) return;

    if (node.data.true_id) {
      edges.push(
        createEdge(sourceId, node.data.true_id, {
          id: `e${sourceId}-${node.data.true_id}-yes`,
          sourceHandle: 'bottom',
          label: 'Yes/True',
          style: { stroke: 'green' },
        })
      );
    }

    if (node.data.false_id) {
      edges.push(
        createEdge(sourceId, node.data.false_id, {
          id: `e${sourceId}-${node.data.false_id}-no`,
          label: 'No/False',
          style: { stroke: 'red' },
        })
      );
    }
  });
  return edges;
}

function buildEdgesFromOutputs(incomingNodes: any[]) {
  const edges: Edge[] = [];
  incomingNodes.forEach((node: any) => {
    const sourceId = node.id;
    const outputs = node.data?.outputs;
    if (!sourceId || typeof outputs !== 'object') return;

    Object.entries(outputs).forEach(([intent, targetId]) => {
      edges.push(
        createEdge(sourceId, String(targetId), {
          id: `e${sourceId}-${targetId}-${intent}`,
          label: intent,
        })
      );
    });
  });
  return edges;
}

function buildFallbackEdges(aiNodes: Node[]) {
  const edges: Edge[] = [];
  for (let i = 0; i < aiNodes.length - 1; i++) {
    const current = aiNodes[i];
    const next = aiNodes[i + 1];
    edges.push(
      createEdge(current.id, next.id, {
        id: `e-fallback-${current.id}-${next.id}`,
        style: { stroke: '#999', strokeDasharray: '5,5' },
      })
    );
  }
  return edges;
}
```

### 4. Use the helpers in the useEffect

The `useEffect` then becomes orchestration-only:

```ts
useEffect(() => {
  const incomingNodes = normalizeIncomingNodes(workflowData);

  if (!Array.isArray(incomingNodes) || incomingNodes.length === 0) return;

  const aiNodes = mapNodesToReactFlow(incomingNodes);

  let aiEdges = [
    ...buildEdgesFromNext(incomingNodes),
    ...buildEdgesFromConditions(incomingNodes),
    ...buildEdgesFromOutputs(incomingNodes),
  ];

  if (aiEdges.length === 0 && aiNodes.length > 1) {
    aiEdges = buildFallbackEdges(aiNodes);
  }

  setNodes(aiNodes);
  setEdges(aiEdges);
}, [workflowData, setNodes, setEdges]);
```

This keeps all existing behavior (support for `nodes`/`steps`, layout, `next`, `true_id`/`false_id`, `outputs`, and fallback edges) but significantly lowers complexity and makes each behavior testable and maintainable in isolation.
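As a quick sanity check of the strategy, here is a plain-JS reduction of `createEdge` plus `buildEdgesFromNext` applied to a tiny two-node workflow (type annotations stripped so it runs standalone; the node ids are made up for the example):

```javascript
// Plain-JS versions of the factory and the `next`-based strategy above.
function createEdge(source, target, overrides = {}) {
  return {
    id: overrides.id || `e${source}-${target}`,
    source,
    target,
    sourceHandle: 'right',
    targetHandle: 'left',
    animated: true,
    type: 'default',
    ...overrides,
  };
}

function buildEdgesFromNext(incomingNodes) {
  const edges = [];
  incomingNodes.forEach(node => {
    if (!node.id || !Array.isArray(node.next)) return;
    node.next.forEach(targetId => edges.push(createEdge(node.id, targetId)));
  });
  return edges;
}

const edges = buildEdgesFromNext([
  { id: 'trigger', next: ['reply'] },
  { id: 'reply' }, // terminal node, no explicit links
]);
```

Explicit `next` links produce one edge here, so the dashed linear fallback would not fire for this workflow.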


Sibam Prasad Sahoo and others added 13 commits January 29, 2026 14:15
Removed commented sections from the system architecture diagram for clarity.
Removed high-level overview and system architecture diagram sections from the document.
Added system architecture diagram and updated section headers.
Removed the AutoFlow System Architecture section and diagram.
Expanded the AutoFlow system architecture documentation with detailed sections on high-level overview, system architecture diagram, component breakdown, data persistence, and key workflows.
SIBAM890 and others added 21 commits March 17, 2026 06:11