Reviewer's Guide

Connects updated AI workflow generation, file upload, WhatsApp integration, and UI enhancements into a cohesive automation builder experience, adding inventory-aware workflows, deployment flows, a richer toolbox, and safer engine/visualization behaviors.

Sequence diagram for inventory-aware workflow generation with file upload

sequenceDiagram
actor User
participant Frontend_ChatInterface as ChatInterface
participant Frontend_workflowApi as workflowApi
participant Backend_API as Express_API
participant Backend_UploadController as uploadController
participant Backend_WorkflowController as workflowController
participant Backend_AIService as aiService
participant Google_Gemini as Google_GenerativeAI
User->>Frontend_ChatInterface: Select file to upload
Frontend_ChatInterface->>Frontend_workflowApi: uploadFile(file)
Frontend_workflowApi->>Backend_API: POST /upload (multipart file)
Backend_API->>Backend_UploadController: uploadFile middleware chain
Backend_UploadController->>Backend_UploadController: Store file to /uploads
Backend_UploadController->>Backend_UploadController: Parse via xlsx.sheet_to_json
Backend_UploadController-->>Backend_API: { columns, preview, rowCount, filename }
Backend_API-->>Frontend_workflowApi: JSON response
Frontend_workflowApi-->>Frontend_ChatInterface: { columns, preview, rowCount }
Frontend_ChatInterface->>Frontend_ChatInterface: Save uploadedFile.context
User->>Frontend_ChatInterface: Enter workflow description
Frontend_ChatInterface->>Frontend_workflowApi: generate(description, fileContext)
Frontend_workflowApi->>Backend_API: POST /generate-workflow { userPrompt, fileContext }
Backend_API->>Backend_WorkflowController: createWorkflow
Backend_WorkflowController->>Backend_AIService: generateWorkflow(userPrompt, fileContext)
Backend_AIService->>Backend_AIService: Extract inventoryColumns, sampleData
Backend_AIService->>Google_Gemini: generateContent(prompt with context)
Google_Gemini-->>Backend_AIService: JSON-like text
Backend_AIService->>Backend_AIService: Clean markdown fences
Backend_AIService->>Backend_AIService: JSON.parse(cleanJson)
Backend_AIService->>Backend_AIService: validateWorkflow(parsed)
alt Valid_nodes
Backend_AIService-->>Backend_WorkflowController: { nodes }
else All_models_fail_or_invalid
Backend_AIService-->>Backend_WorkflowController: generateFallbackWorkflow()
end
Backend_WorkflowController-->>Backend_API: { success, workflow }
Backend_API-->>Frontend_workflowApi: JSON workflow
Frontend_workflowApi-->>Frontend_ChatInterface: { workflow }
Frontend_ChatInterface-->>User: Confirmation message
Frontend_ChatInterface-->>Frontend_ChatInterface: onWorkflowGenerated(workflow)
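The "Clean markdown fences" step in the diagram above deserves a note: Gemini frequently wraps its JSON output in fenced code blocks, so a strip-then-parse helper along these lines is typically needed before `JSON.parse`. This is a sketch; the actual `aiService` implementation may differ.

```javascript
// Sketch of the fence-stripping step the diagram labels "Clean markdown fences".
// Gemini often returns ```json ... ``` around the payload; strip the fences,
// trim whitespace, then parse.
function cleanJsonText(raw) {
  return raw.replace(/```json\s*/gi, '').replace(/```/g, '').trim();
}

const parsed = JSON.parse(cleanJsonText('```json\n{"nodes": []}\n```'));
console.log(parsed.nodes.length); // 0
```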
Sequence diagram for WhatsApp message processing through engine and simulator

sequenceDiagram
actor WhatsApp_User
participant WhatsApp_Server as WhatsApp
participant Baileys_Socket as WA_Socket
participant Backend_Engine as engineService
rect rgb(240,240,255)
WhatsApp_User->>WhatsApp_Server: Send message
WhatsApp_Server-->>Baileys_Socket: messages.upsert event
Baileys_Socket->>Baileys_Socket: Extract userMessage and sender
alt Has_text_and_not_fromMe
Baileys_Socket->>Backend_Engine: processMessage(sock, sender, userMessage)
Backend_Engine->>Backend_Engine: Normalize messageText to string
Backend_Engine->>Backend_Engine: lowerMsg = messageText.toLowerCase().trim()
alt Greeting_match
Backend_Engine->>Baileys_Socket: sendMessage(sender, greeting_reply)
else Other_logic
Backend_Engine->>Baileys_Socket: sendMessage(sender, business_reply)
end
else Non_text_or_fromMe
Baileys_Socket->>Baileys_Socket: Ignore message
end
end
rect rgb(240,255,240)
participant Frontend_TestMode as TestMode_UI
participant Frontend_workflowApi as workflowApi
participant Backend_API as Express_API
Frontend_TestMode->>Frontend_workflowApi: simulateMessage(message)
Frontend_workflowApi->>Backend_API: POST /simulate-message { message }
Backend_API->>Backend_Engine: processMessage(mockSock, TestUser, message)
Backend_Engine->>Backend_Engine: Same processing as live flow
Backend_Engine->>Backend_API: mockSock.sendMessage captures botReply
Backend_API-->>Frontend_workflowApi: { success, reply: botReply }
Frontend_workflowApi-->>Frontend_TestMode: botReply
end
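The simulator path above reuses the live engine by handing it a mock socket whose `sendMessage` records the reply instead of hitting WhatsApp. A minimal standalone sketch of that trick, with an assumed stand-in for `engineService.processMessage` and an assumed greeting list (the real engine logic is not shown here):

```javascript
// Stand-in for engineService.processMessage: normalize, lowercase,
// then branch on greeting vs. everything else.
async function processMessage(sock, sender, messageText) {
  const lowerMsg = String(messageText).toLowerCase().trim();
  const reply = ['hi', 'hello', 'hey'].includes(lowerMsg)
    ? 'Hello! How can I help you today?'
    : 'Thanks for your message, we will get back to you.';
  await sock.sendMessage(sender, { text: reply });
}

// The simulator: mockSock.sendMessage captures botReply, so the exact
// same engine code runs for live WhatsApp traffic and for Test Mode.
async function simulate(message) {
  let botReply = null;
  const mockSock = {
    sendMessage: async (_jid, content) => { botReply = content.text; },
  };
  await processMessage(mockSock, 'TestUser', message);
  return botReply;
}

simulate('Hi').then(console.log); // greeting branch
```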
Class diagram for core backend and frontend modules

classDiagram
class ai_service {
+generateWorkflow(userPrompt, fileContext)
+explainWorkflow(workflowJson)
-validateWorkflow(parsed)
-generateFallbackWorkflow(reason)
-genAI
}
class workflow_controller {
+createWorkflow(req, res)
+explainWorkflow(req, res)
}
class upload_controller {
+uploadFile(req, res, next)
-storage
-upload
}
class file_service {
+parseFile(filePath, mimetype)
}
class engine_service {
+processMessage(sock, sender, messageText)
-activeWorkflow
}
class whatsapp_service {
+connectToWhatsApp()
}
class workflowApi {
+generate(description, fileContext)
+uploadFile(file)
}
class ChatInterface_Component {
-messages
-isLoading
-uploadedFile
-messagesEndRef
-fileInputRef
+handleFileUpload(event)
+clearFile()
+handleSend(text)
}
class Builder_Page {
-workflow
-isTestOpen
-isCustomMode
-isDeployOpen
-isDeployed
-businessName
-tempName
+handleWorkflowGenerated(data)
+handleDeploy()
}
class WorkflowGraph_Component {
-workflowData
-nodes
-edges
+useEffect_onWorkflowData()
}
class NodePalette_Component {
-searchQuery
-expandedCategories
-allCategories
+onDragStart(event, tool)
+toggleCategory(catId)
}
%% Relationships
workflow_controller --> ai_service : uses
workflow_controller --> upload_controller : separate_route
upload_controller --> file_service : may_use
whatsapp_service --> engine_service : calls_processMessage
engine_service --> whatsapp_service : indirect_via_sock
workflowApi --> ChatInterface_Component : used_by
Builder_Page --> ChatInterface_Component : renders
Builder_Page --> WorkflowGraph_Component : renders
Builder_Page --> NodePalette_Component : renders
ChatInterface_Component --> workflowApi : calls_generate_and_upload
WorkflowGraph_Component --> engine_service : relies_on_workflow_format
Hey - I've found 6 issues, and left some high level feedback:
- The file upload response shape doesn't match frontend expectations: `upload.controller` returns `filename`/`originalName` and no `context`, but `ChatInterface` reads `data.fileName` and `data.context`, so the keys and structure should be aligned on both sides.
- In `workflowApi.uploadFile` and `ChatInterface.handleFileUpload`, you treat the upload response as if it were a ready-to-send `fileContext`, but the backend returns `columns`, `preview`, and counts; either wrap that into a `context` object on the backend or adjust the frontend to pass the actual shape expected by `generateWorkflow`.
- `backend/src/services/file.service.js` is added but not used anywhere in the routes/controllers; consider either wiring it into the upload flow (to centralize parsing/cleanup) or removing it to avoid dead code.
Prompt for AI Agents
Please address the comments from this code review:
## Overall Comments
- The file upload response shape doesn’t match frontend expectations: `upload.controller` returns `filename`/`originalName` and no `context`, but `ChatInterface` reads `data.fileName` and `data.context`, so the keys and structure should be aligned on both sides.
- In `workflowApi.uploadFile` and `ChatInterface.handleFileUpload`, you treat the upload response as if it were a ready-to-send `fileContext`, but the backend returns `columns`, `preview`, and counts; either wrap that into a `context` object on the backend or adjust the frontend to pass the actual shape expected by `generateWorkflow`.
- `backend/src/services/file.service.js` is added but not used anywhere in the routes/controllers; consider either wiring it into the upload flow (to centralize parsing/cleanup) or removing it to avoid dead code.
## Individual Comments
### Comment 1
<location> `autoflow-frontend/src/components/builder/ChatInterface.jsx:31-32` </location>
<code_context>
+
+ setIsLoading(true);
+ try {
+ const data = await workflowApi.uploadFile(file);
+ setUploadedFile({ name: data.fileName, context: data.context });
+ setMessages(prev => [...prev, {
+ role: 'system',
</code_context>
<issue_to_address>
**issue (bug_risk):** Upload response shape doesn’t match how the frontend is consuming it, so file name and context will be undefined.
`uploadFile` returns keys like `filename`, `originalName`, `columns`, `preview`, but this code reads `data.fileName` and `data.context`. That will render the “Using: …” label as `undefined` and pass `undefined` into `workflowApi.generate`, so the uploaded data is never used. Please align the API and client contract (e.g. have the controller return `fileName`/`context`, or update the frontend to use the existing `filename`/`preview`/`columns` shape consistently).
</issue_to_address>
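For illustration, one possible backend-side alignment (a hypothetical helper and field names; the fix could equally go the other way, adjusting the frontend to the existing shape):

```javascript
// Hypothetical response builder: return exactly the fileName/context
// shape the ChatInterface already reads, wrapping the parsed columns,
// preview rows, and row count into a single context object.
function buildUploadResponse(file, columns, preview, rowCount) {
  return {
    fileName: file.originalname,
    context: { columns, preview, rowCount },
  };
}

const res = buildUploadResponse(
  { originalname: 'inventory.xlsx' }, // shape mimics req.file
  ['sku', 'qty'],
  [{ sku: 'A1', qty: 3 }],
  1
);
console.log(res.fileName, res.context.columns); // inventory.xlsx [ 'sku', 'qty' ]
```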
### Comment 2
<location> `autoflow-frontend/src/components/builder/ChatInterface.jsx:37-38` </location>
<code_context>
+ }
+
+ setExplanation(explanationText);
} catch (error) {
+ console.error(error);
setExplanation("Failed to generate explanation.");
</code_context>
<issue_to_address>
**issue (bug_risk):** Operator precedence in the error message will almost always ignore the `error.message` fallback.
Because `+` has higher precedence than `||`, the expression is effectively:
```js
("❌ Upload Failed: " + error.response?.data?.error) || error.message;
```
Any non-empty string is truthy, so the fallback to `error.message` is never used (you’ll even see `"❌ Upload Failed: undefined"`). Use parentheses or a template literal so the fallback actually applies:
```js
content: `❌ Upload Failed: ${error.response?.data?.error || error.message}`
```
</issue_to_address>
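A quick check of the precedence trap with a hypothetical error object (no `response` payload, only a `message`):

```javascript
// Simulates an axios-style error that has no server response body.
const error = { message: 'Network Error' };

// `+` binds tighter than `||`, so the left side is already a non-empty
// (truthy) string and the fallback never applies.
const bad = "❌ Upload Failed: " + error.response?.data?.error || error.message;

// Parenthesizing via a template literal makes the fallback work.
const good = `❌ Upload Failed: ${error.response?.data?.error || error.message}`;

console.log(bad);  // ❌ Upload Failed: undefined
console.log(good); // ❌ Upload Failed: Network Error
```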
### Comment 3
<location> `autoflow-frontend/src/components/visualization/WorkflowGraph.jsx:86-95` </location>
<code_context>
- style: { stroke: '#555', strokeWidth: 2 }
- };
+ // B. Handle Condition Branches (true_id / false_id)
+ if (node.data?.true_id) {
+ aiEdges.push({
+ id: `e${sourceId}-${node.data.true_id}-yes`,
+ source: sourceId,
+ target: node.data.true_id,
+ sourceHandle: 'bottom', // Conditions usually split down
+ targetHandle: 'left',
+ label: 'Yes/True',
+ animated: true,
+ type: 'default',
+ style: { stroke: 'green' }
+ });
+ }
+ if (node.data?.false_id) {
+ aiEdges.push({
+ id: `e${sourceId}-${node.data.false_id}-no`,
</code_context>
<issue_to_address>
**issue (bug_risk):** Condition edges look for `true_id`/`false_id` under `data`, but the AI contract puts them at the top level.
Given the workflow prompt, `true_id` and `false_id` are top-level fields on the node, not under `data`, so branches produced by the AI won’t render with the current checks. This should instead read from the node root, e.g.:
```js
if (node.true_id) { /* ... */ }
if (node.false_id) { /* ... */ }
```
If you need to support older data, you could keep the `data.*` checks as a fallback.
</issue_to_address>
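A small sketch of the root-first read with the suggested `data.*` fallback (node shapes assumed from the AI contract described above):

```javascript
// Read branch targets from the node root per the AI contract,
// falling back to data.* for older payloads.
function getBranchTargets(node) {
  return {
    trueId: node.true_id ?? node.data?.true_id ?? null,
    falseId: node.false_id ?? node.data?.false_id ?? null,
  };
}

const aiNode = { id: 'cond1', true_id: 'n2', false_id: 'n3' }; // AI contract
const legacyNode = { id: 'cond2', data: { true_id: 'n4' } };   // older shape

console.log(getBranchTargets(aiNode).trueId);     // n2
console.log(getBranchTargets(legacyNode).trueId); // n4
```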
### Comment 4
<location> `backend/src/controllers/upload.controller.js:30-39` </location>
<code_context>
+ return res.status(400).json({ error: "No file uploaded" });
+ }
+
+ const filePath = req.file.path;
+ const fileExt = path.extname(req.file.originalname).toLowerCase();
+ let data = [];
+
+ // Parse Excel or CSV
+ if (fileExt === '.xlsx' || fileExt === '.xls' || fileExt === '.csv') {
+ const workbook = xlsx.readFile(filePath);
+ const sheetName = workbook.SheetNames[0]; // Read first sheet
+ const sheet = workbook.Sheets[sheetName];
+ data = xlsx.utils.sheet_to_json(sheet);
+ }
+
+ // Get columns for AI context
+ const columns = data.length > 0 ? Object.keys(data[0]) : [];
+ const preview = data.slice(0, 5); // Send first 5 rows as sample
+
+ res.json({
+ filename: req.file.filename,
+ originalName: req.file.originalname,
</code_context>
<issue_to_address>
**suggestion (performance):** Uploaded files are never deleted after parsing, which can accumulate unnecessary disk usage.
The file is read via `xlsx.readFile(filePath)` but never removed, so repeated uploads will grow the `uploads` directory. Since you only need the parsed JSON, you can safely delete the file after a successful parse:
```js
const workbook = xlsx.readFile(filePath);
// ...parse...
fs.unlink(filePath, err => {
if (err) console.error('Failed to delete uploaded file:', err);
});
```
This prevents unnecessary disk usage, especially with large or frequent uploads.
Suggested implementation:
```javascript
// Parse Excel or CSV
if (fileExt === '.xlsx' || fileExt === '.xls' || fileExt === '.csv') {
const workbook = xlsx.readFile(filePath);
const sheetName = workbook.SheetNames[0]; // Read first sheet
const sheet = workbook.Sheets[sheetName];
data = xlsx.utils.sheet_to_json(sheet);
}
// Clean up uploaded file after parsing to avoid disk accumulation
fs.unlink(filePath, (err) => {
if (err) {
console.error("Failed to delete uploaded file:", err);
}
});
// Get columns for AI context
const columns = data.length > 0 ? Object.keys(data[0]) : [];
const preview = data.slice(0, 5); // Send first 5 rows as sample
res.json({
```
1. Ensure `fs` is imported at the top of `backend/src/controllers/upload.controller.js`, e.g.:
`const fs = require('fs');`
2. If you have other upload handlers or temporary-file flows in this file, consider applying the same `fs.unlink` cleanup pattern there as well.
</issue_to_address>
### Comment 5
<location> `autoflow-frontend/src/components/visualization/WorkflowGraph.jsx:37` </location>
<code_context>
useEffect(() => {
scrollToBottom();
- }, [messages, isLoading]);
</code_context>
<issue_to_address>
**issue (complexity):** Consider extracting node normalization, node mapping, and edge-building into pure helper functions plus a shared edge factory so the useEffect becomes a small orchestration layer.
You can keep the new behavior and significantly reduce complexity by extracting a few pure helpers and a shared edge factory. That will make the `useEffect` much shorter and easier to test.
### 1. Normalize and map nodes outside the effect
Move the “nodes vs steps” handling and layout logic into helpers:
```ts
function normalizeIncomingNodes(workflowData?: any) {
return workflowData?.nodes || workflowData?.steps || [];
}
function mapNodesToReactFlow(incomingNodes: any[]) {
return incomingNodes.map((node, index) => {
const isFirst = index === 0;
return {
id: node.id || generateId(),
type: 'custom',
position: node.position || {
x: isFirst ? 50 : 450,
y: isFirst ? 250 : 100 + ((index - 1) * 150),
},
data: {
label: node.data?.label || node.label || 'Step',
type:
node.type ||
node.data?.type ||
(node.label?.includes('?') ? 'condition' : 'action'),
category: isFirst ? 'AutoFlow' : 'AI',
},
};
});
}
```
### 2. DRY edge creation with a small factory
Most edges share the same shape. Use a helper to centralize defaults:
```ts
function createEdge(
source: string,
target: string,
overrides: Partial<Edge> = {}
) {
return {
id: overrides.id || `e${source}-${target}`,
source,
target,
sourceHandle: 'right',
targetHandle: 'left',
animated: true,
type: 'default',
...overrides,
};
}
```
### 3. Split edge strategies into pure functions
Each edge-building strategy can be a separate pure function using `createEdge`:
```ts
function buildEdgesFromNext(incomingNodes: any[]) {
const edges: Edge[] = [];
incomingNodes.forEach((node: any) => {
const sourceId = node.id;
if (!sourceId || !Array.isArray(node.next)) return;
node.next.forEach((targetId: string) => {
edges.push(createEdge(sourceId, targetId));
});
});
return edges;
}
function buildEdgesFromConditions(incomingNodes: any[]) {
const edges: Edge[] = [];
incomingNodes.forEach((node: any) => {
const sourceId = node.id;
if (!sourceId || !node.data) return;
if (node.data.true_id) {
edges.push(
createEdge(sourceId, node.data.true_id, {
id: `e${sourceId}-${node.data.true_id}-yes`,
sourceHandle: 'bottom',
label: 'Yes/True',
style: { stroke: 'green' },
})
);
}
if (node.data.false_id) {
edges.push(
createEdge(sourceId, node.data.false_id, {
id: `e${sourceId}-${node.data.false_id}-no`,
label: 'No/False',
style: { stroke: 'red' },
})
);
}
});
return edges;
}
function buildEdgesFromOutputs(incomingNodes: any[]) {
const edges: Edge[] = [];
incomingNodes.forEach((node: any) => {
const sourceId = node.id;
const outputs = node.data?.outputs;
if (!sourceId || typeof outputs !== 'object') return;
Object.entries(outputs).forEach(([intent, targetId]) => {
edges.push(
createEdge(sourceId, String(targetId), {
id: `e${sourceId}-${targetId}-${intent}`,
label: intent,
})
);
});
});
return edges;
}
function buildFallbackEdges(aiNodes: Node[]) {
const edges: Edge[] = [];
for (let i = 0; i < aiNodes.length - 1; i++) {
const current = aiNodes[i];
const next = aiNodes[i + 1];
edges.push(
createEdge(current.id, next.id, {
id: `e-fallback-${current.id}-${next.id}`,
style: { stroke: '#999', strokeDasharray: '5,5' },
})
);
}
return edges;
}
```
### 4. Use the helpers in the `useEffect`
The `useEffect` then becomes orchestration-only:
```ts
useEffect(() => {
const incomingNodes = normalizeIncomingNodes(workflowData);
if (!Array.isArray(incomingNodes) || incomingNodes.length === 0) return;
const aiNodes = mapNodesToReactFlow(incomingNodes);
let aiEdges = [
...buildEdgesFromNext(incomingNodes),
...buildEdgesFromConditions(incomingNodes),
...buildEdgesFromOutputs(incomingNodes),
];
if (aiEdges.length === 0 && aiNodes.length > 1) {
aiEdges = buildFallbackEdges(aiNodes);
}
setNodes(aiNodes);
setEdges(aiEdges);
}, [workflowData, setNodes, setEdges]);
```
This keeps all existing behavior (support for `nodes`/`steps`, layout, `next`, `true_id`/`false_id`, `outputs`, and fallback edges) but significantly lowers complexity and makes each behavior testable and maintainable in isolation.
</issue_to_address>
### Comment 6
<location> `autoflow-frontend/src/components/builder/NodePalette.jsx:21` </location>
<code_context>
+ const [expandedCategories, setExpandedCategories] = useState(['core']);
+
+ // Merge Core Logic with 500+ Tools dynamically
+ const allCategories = useMemo(() => {
+ // 1. Get Core Logic (Triggers, Conditions, etc.)
+ const coreLogic = TOOL_CATEGORIES[0];
</code_context>
<issue_to_address>
**issue (complexity):** Consider extracting the category-building logic, icon map, and drag-data handling into small utilities so `NodePalette` focuses only on rendering and simple wiring.
You can keep all the new behavior while reducing the complexity inside `NodePalette` by pushing data shaping and mappings out of the component.
### 1. Extract `allCategories` construction to a small utility
Right now `useMemo` both normalizes `toolsData` and builds categories, which couples data concerns to the UI. You can move that out and keep the component focused on rendering:
```ts
// utils/buildToolCategories.ts
import { TOOL_CATEGORIES } from '../../constants/tools';
import toolsData from '../../data/tools.json';
export function buildToolCategories() {
const coreLogic = TOOL_CATEGORIES[0];
const groupedTools: Record<string, { id: string; label: string; type: string; icon: string }[]> = {};
toolsData.forEach(tool => {
if (!groupedTools[tool.category]) {
groupedTools[tool.category] = [];
}
groupedTools[tool.category].push({
id: tool.id,
label: tool.name,
type: 'tool',
icon: tool.icon,
});
});
const dynamicCategories = Object.keys(groupedTools)
.sort()
.map(catName => ({
id: catName.toLowerCase().replace(/\s+/g, '_'),
name: catName,
tools: groupedTools[catName],
}));
return [coreLogic, ...dynamicCategories];
}
```
Then in the component:
```tsx
// NodePalette.tsx
import { useMemo } from 'react';
import { buildToolCategories } from '../../utils/buildToolCategories';
export const NodePalette = () => {
const allCategories = useMemo(() => buildToolCategories(), []);
// ...
};
```
This keeps the normalization logic testable and reusable and makes the component easier to scan.
### 2. Isolate the icon map
The extended `IconMap` makes this file long and mixes UI with registry concerns. You can move the map into its own module without changing behavior:
```ts
// icons/toolIcons.ts
import {
MessageSquare, Zap, Divide, Link, Mail, Smartphone, Database, ShoppingCart,
Globe, CreditCard, Users, Code, Cloud, Clock, Bot, Facebook, Twitter,
Instagram, Linkedin, Slack, Github, Trello, Calendar, Activity, AlertTriangle,
Shield, Key, HardDrive, FileText, BarChart, Layout, GitBranch, Video,
DollarSign, Send, MessageCircle, Headphones, Flame, Box,
} from 'lucide-react';
export const ToolIconMap = {
MessageSquare,
Zap,
Divide,
Link,
Mail,
Smartphone,
Database,
ShoppingCart,
Globe,
CreditCard,
Users,
Code,
Cloud,
Clock,
Bot,
Facebook,
Twitter,
Instagram,
Linkedin,
Slack,
Github,
Trello,
Calendar,
Activity,
AlertTriangle,
Shield,
Key,
HardDrive,
FileText,
BarChart,
Layout,
GitBranch,
Video,
DollarSign,
Send,
MessageCircle,
Headphones,
Flame,
Box,
} as const;
```
And in the palette:
```tsx
import { ToolIconMap as IconMap } from '../../icons/toolIcons';
```
### 3. Tighten `onDragStart` to what’s actually consumed
If `application/toolData` is not yet used by your drop handler, you can keep the behavior ready but avoid the extra payload until it’s needed, or wrap it so the intent is clearer.
**If not currently used:**
```ts
const onDragStart = (event, tool) => {
event.dataTransfer.setData('application/reactflow', tool.id);
event.dataTransfer.effectAllowed = 'move';
};
```
**If you want to keep support but decouple it:**
```ts
// utils/setToolDragData.ts
export function setToolDragData(event: DragEvent, tool: any, options?: { includeFullData?: boolean }) {
event.dataTransfer?.setData('application/reactflow', tool.id);
if (options?.includeFullData) {
event.dataTransfer?.setData('application/toolData', JSON.stringify(tool));
}
event.dataTransfer!.effectAllowed = 'move';
}
```
```tsx
// NodePalette.tsx
import { setToolDragData } from '../../utils/setToolDragData';
const onDragStart = (event, tool) => {
setToolDragData(event, tool, { includeFullData: true }); // keep current behavior
};
```
This keeps all functionality but makes the palette itself much simpler: it renders categories, handles search, and delegates data shaping/drag payload details elsewhere.
</issue_to_address>Help me be more useful! Please click 👍 or 👎 on each comment and I'll use the feedback to improve your reviews.
| const data = await workflowApi.uploadFile(file); | ||
| setUploadedFile({ name: data.fileName, context: data.context }); |
There was a problem hiding this comment.
issue (bug_risk): Upload response shape doesn’t match how the frontend is consuming it, so file name and context will be undefined.
uploadFile returns keys like filename, originalName, columns, preview, but this code reads data.fileName and data.context. That will render the “Using: …” label as undefined and pass undefined into workflowApi.generate, so the uploaded data is never used. Please align the API and client contract (e.g. have the controller return fileName/context, or update the frontend to use the existing filename/preview/columns shape consistently).
| } catch (error) { | ||
| setMessages(prev => [...prev, { role: 'system', content: "❌ Upload Failed: " + error.response?.data?.error || error.message }]); |
There was a problem hiding this comment.
issue (bug_risk): Operator precedence in the error message will almost always ignore the error.message fallback.
Because + has higher precedence than ||, the expression is effectively:
("❌ Upload Failed: " + error.response?.data?.error) || error.message;Any non-empty string is truthy, so the fallback to error.message is never used (you’ll even see "❌ Upload Failed: undefined"). Use parentheses or a template literal so the fallback actually applies:
content: `❌ Upload Failed: ${error.response?.data?.error || error.message}`| if (node.data?.true_id) { | ||
| aiEdges.push({ | ||
| id: `e${sourceId}-${node.data.true_id}-yes`, | ||
| source: sourceId, | ||
| target: node.data.true_id, | ||
| sourceHandle: 'bottom', // Conditions usually split down | ||
| targetHandle: 'left', | ||
| label: 'Yes/True', | ||
| animated: true, | ||
| type: 'default', |
There was a problem hiding this comment.
issue (bug_risk): Condition edges look for true_id/false_id under data, but the AI contract puts them at the top level.
Given the workflow prompt, true_id and false_id are top-level fields on the node, not under data, so branches produced by the AI won’t render with the current checks. This should instead read from the node root, e.g.:
if (node.true_id) { /* ... */ }
if (node.false_id) { /* ... */ }If you need to support older data, you could keep the data.* checks as a fallback.
| const filePath = req.file.path; | ||
| const fileExt = path.extname(req.file.originalname).toLowerCase(); | ||
| let data = []; | ||
|
|
||
| // Parse Excel or CSV | ||
| if (fileExt === '.xlsx' || fileExt === '.xls' || fileExt === '.csv') { | ||
| const workbook = xlsx.readFile(filePath); | ||
| const sheetName = workbook.SheetNames[0]; // Read first sheet | ||
| const sheet = workbook.Sheets[sheetName]; | ||
| data = xlsx.utils.sheet_to_json(sheet); |
There was a problem hiding this comment.
suggestion (performance): Uploaded files are never deleted after parsing, which can accumulate unnecessary disk usage.
The file is read via xlsx.readFile(filePath) but never removed, so repeated uploads will grow the uploads directory. Since you only need the parsed JSON, you can safely delete the file after a successful parse:
const workbook = xlsx.readFile(filePath);
// ...parse...
fs.unlink(filePath, err => {
if (err) console.error('Failed to delete uploaded file:', err);
});This prevents unnecessary disk usage, especially with large or frequent uploads.
Suggested implementation:
// Parse Excel or CSV
if (fileExt === '.xlsx' || fileExt === '.xls' || fileExt === '.csv') {
const workbook = xlsx.readFile(filePath);
const sheetName = workbook.SheetNames[0]; // Read first sheet
const sheet = workbook.Sheets[sheetName];
data = xlsx.utils.sheet_to_json(sheet);
}
// Clean up uploaded file after parsing to avoid disk accumulation
fs.unlink(filePath, (err) => {
if (err) {
console.error("Failed to delete uploaded file:", err);
}
});
// Get columns for AI context
const columns = data.length > 0 ? Object.keys(data[0]) : [];
const preview = data.slice(0, 5); // Send first 5 rows as sample
res.json({- Ensure
fsis imported at the top ofbackend/src/controllers/upload.controller.js, e.g.:
const fs = require('fs'); - If you have other upload handlers or temporary-file flows in this file, consider applying the same
fs.unlinkcleanup pattern there as well.
| @@ -35,54 +35,118 @@ export const WorkflowGraph = ({ workflowData }) => { | |||
|
|
|||
| // Handle initial workflow data from AI | |||
| useEffect(() => { | |||
There was a problem hiding this comment.
issue (complexity): Consider extracting node normalization, node mapping, and edge-building into pure helper functions plus a shared edge factory so the useEffect becomes a small orchestration layer.
You can keep the new behavior and significantly reduce complexity by extracting a few pure helpers and a shared edge factory. That will make the useEffect much shorter and easier to test.
1. Normalize and map nodes outside the effect
Move the “nodes vs steps” handling and layout logic into helpers:
function normalizeIncomingNodes(workflowData?: any) {
return workflowData?.nodes || workflowData?.steps || [];
}
function mapNodesToReactFlow(incomingNodes: any[]) {
return incomingNodes.map((node, index) => {
const isFirst = index === 0;
return {
id: node.id || generateId(),
type: 'custom',
position: node.position || {
x: isFirst ? 50 : 450,
y: isFirst ? 250 : 100 + ((index - 1) * 150),
},
data: {
label: node.data?.label || node.label || 'Step',
type:
node.type ||
node.data?.type ||
(node.label?.includes('?') ? 'condition' : 'action'),
category: isFirst ? 'AutoFlow' : 'AI',
},
};
});
}2. DRY edge creation with a small factory
Most edges share the same shape. Use a helper to centralize defaults:
function createEdge(
source: string,
target: string,
overrides: Partial<Edge> = {}
) {
return {
id: overrides.id || `e${source}-${target}`,
source,
target,
sourceHandle: 'right',
targetHandle: 'left',
animated: true,
type: 'default',
...overrides,
};
}3. Split edge strategies into pure functions
Each edge-building strategy can be a separate pure function using createEdge:
function buildEdgesFromNext(incomingNodes: any[]) {
const edges: Edge[] = [];
incomingNodes.forEach((node: any) => {
const sourceId = node.id;
if (!sourceId || !Array.isArray(node.next)) return;
node.next.forEach((targetId: string) => {
edges.push(createEdge(sourceId, targetId));
});
});
return edges;
}
function buildEdgesFromConditions(incomingNodes: any[]) {
const edges: Edge[] = [];
incomingNodes.forEach((node: any) => {
const sourceId = node.id;
if (!sourceId || !node.data) return;
if (node.data.true_id) {
edges.push(
createEdge(sourceId, node.data.true_id, {
id: `e${sourceId}-${node.data.true_id}-yes`,
sourceHandle: 'bottom',
label: 'Yes/True',
style: { stroke: 'green' },
})
);
}
if (node.data.false_id) {
edges.push(
createEdge(sourceId, node.data.false_id, {
id: `e${sourceId}-${node.data.false_id}-no`,
label: 'No/False',
style: { stroke: 'red' },
})
);
}
});
return edges;
}
function buildEdgesFromOutputs(incomingNodes: any[]) {
const edges: Edge[] = [];
incomingNodes.forEach((node: any) => {
const sourceId = node.id;
const outputs = node.data?.outputs;
if (!sourceId || typeof outputs !== 'object') return;
Object.entries(outputs).forEach(([intent, targetId]) => {
edges.push(
createEdge(sourceId, String(targetId), {
id: `e${sourceId}-${targetId}-${intent}`,
label: intent,
})
);
});
});
return edges;
}
function buildFallbackEdges(aiNodes: Node[]) {
const edges: Edge[] = [];
for (let i = 0; i < aiNodes.length - 1; i++) {
const current = aiNodes[i];
const next = aiNodes[i + 1];
edges.push(
createEdge(current.id, next.id, {
id: `e-fallback-${current.id}-${next.id}`,
style: { stroke: '#999', strokeDasharray: '5,5' },
})
);
}
return edges;
}4. Use the helpers in the useEffect
The useEffect then becomes orchestration-only:
useEffect(() => {
const incomingNodes = normalizeIncomingNodes(workflowData);
if (!Array.isArray(incomingNodes) || incomingNodes.length === 0) return;
const aiNodes = mapNodesToReactFlow(incomingNodes);
let aiEdges = [
...buildEdgesFromNext(incomingNodes),
...buildEdgesFromConditions(incomingNodes),
...buildEdgesFromOutputs(incomingNodes),
];
if (aiEdges.length === 0 && aiNodes.length > 1) {
aiEdges = buildFallbackEdges(aiNodes);
}
setNodes(aiNodes);
setEdges(aiEdges);
}, [workflowData, setNodes, setEdges]);This keeps all existing behavior (support for nodes/steps, layout, next, true_id/false_id, outputs, and fallback edges) but significantly lowers complexity and makes each behavior testable and maintainable in isolation.
| const [expandedCategories, setExpandedCategories] = useState(['core']); | ||
|
|
||
| // Merge Core Logic with 500+ Tools dynamically | ||
| const allCategories = useMemo(() => { |
There was a problem hiding this comment.
issue (complexity): Consider extracting the category-building logic, icon map, and drag-data handling into small utilities so NodePalette focuses only on rendering and simple wiring.
You can keep all the new behavior while reducing the complexity inside NodePalette by pushing data shaping and mappings out of the component.
1. Extract allCategories construction to a small utility
Right now useMemo both normalizes toolsData and builds categories, which couples data concerns to the UI. You can move that out and keep the component focused on rendering:
// utils/buildToolCategories.ts
import { TOOL_CATEGORIES } from '../../constants/tools';
import toolsData from '../../data/tools.json';
export function buildToolCategories() {
const coreLogic = TOOL_CATEGORIES[0];
const groupedTools: Record<string, { id: string; label: string; type: string; icon: string }[]> = {};
toolsData.forEach(tool => {
if (!groupedTools[tool.category]) {
groupedTools[tool.category] = [];
}
groupedTools[tool.category].push({
id: tool.id,
label: tool.name,
type: 'tool',
icon: tool.icon,
});
});
const dynamicCategories = Object.keys(groupedTools)
.sort()
.map(catName => ({
id: catName.toLowerCase().replace(/\s+/g, '_'),
name: catName,
tools: groupedTools[catName],
}));
return [coreLogic, ...dynamicCategories];
}
Then in the component:
// NodePalette.tsx
import { useMemo } from 'react';
import { buildToolCategories } from '../../utils/buildToolCategories';
export const NodePalette = () => {
const allCategories = useMemo(() => buildToolCategories(), []);
// ...
};
This keeps the normalization logic testable and reusable, and makes the component easier to scan.
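For illustration, the grouping step in buildToolCategories can be exercised in isolation. The snippet below is a self-contained sketch with inlined sample data standing in for tools.json (the tool entries are made up; field names mirror the suggestion above):

```typescript
// Minimal tool shape matching the fields buildToolCategories reads.
type Tool = { id: string; name: string; category: string; icon: string };

// Made-up sample data in place of importing tools.json.
const sampleTools: Tool[] = [
  { id: 'slack', name: 'Slack', category: 'Messaging', icon: 'Slack' },
  { id: 'gmail', name: 'Gmail', category: 'Email', icon: 'Mail' },
  { id: 'discord', name: 'Discord', category: 'Messaging', icon: 'MessageCircle' },
];

// Group tools by category, as in the suggested utility.
const grouped: Record<string, { id: string; label: string; type: string; icon: string }[]> = {};
for (const tool of sampleTools) {
  (grouped[tool.category] ??= []).push({
    id: tool.id,
    label: tool.name,
    type: 'tool',
    icon: tool.icon,
  });
}

// Sorted, slugified category ids give the palette a stable order.
const categoryIds = Object.keys(grouped)
  .sort()
  .map((name) => name.toLowerCase().replace(/\s+/g, '_'));
// categoryIds → ['email', 'messaging']
```

A unit test over this pure transformation catches regressions (e.g. a renamed JSON field) without rendering NodePalette at all.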
2. Isolate the icon map
The extended IconMap makes this file long and mixes UI with registry concerns. You can move the map into its own module without changing behavior:
// icons/toolIcons.ts
import {
MessageSquare, Zap, Divide, Link, Mail, Smartphone, Database, ShoppingCart,
Globe, CreditCard, Users, Code, Cloud, Clock, Bot, Facebook, Twitter,
Instagram, Linkedin, Slack, Github, Trello, Calendar, Activity, AlertTriangle,
Shield, Key, HardDrive, FileText, BarChart, Layout, GitBranch, Video,
DollarSign, Send, MessageCircle, Headphones, Flame, Box,
} from 'lucide-react';
export const ToolIconMap = {
MessageSquare,
Zap,
Divide,
Link,
Mail,
Smartphone,
Database,
ShoppingCart,
Globe,
CreditCard,
Users,
Code,
Cloud,
Clock,
Bot,
Facebook,
Twitter,
Instagram,
Linkedin,
Slack,
Github,
Trello,
Calendar,
Activity,
AlertTriangle,
Shield,
Key,
HardDrive,
FileText,
BarChart,
Layout,
GitBranch,
Video,
DollarSign,
Send,
MessageCircle,
Headphones,
Flame,
Box,
} as const;
And in the palette:
import { ToolIconMap as IconMap } from '../../icons/toolIcons';
3. Tighten onDragStart to what’s actually consumed
If application/toolData is not yet used by your drop handler, you can keep the behavior ready but avoid the extra payload until it’s needed, or wrap it so the intent is clearer.
If not currently used:
const onDragStart = (event, tool) => {
event.dataTransfer.setData('application/reactflow', tool.id);
event.dataTransfer.effectAllowed = 'move';
};
If you want to keep support but decouple it:
// utils/setToolDragData.ts
export function setToolDragData(
  event: DragEvent,
  tool: { id: string },
  options?: { includeFullData?: boolean }
) {
  if (!event.dataTransfer) return;
  event.dataTransfer.setData('application/reactflow', tool.id);
  if (options?.includeFullData) {
    event.dataTransfer.setData('application/toolData', JSON.stringify(tool));
  }
  event.dataTransfer.effectAllowed = 'move';
}
// NodePalette.tsx
import { setToolDragData } from '../../utils/setToolDragData';
const onDragStart = (event, tool) => {
setToolDragData(event, tool, { includeFullData: true }); // keep current behavior
};
This keeps all functionality but makes the palette itself much simpler: it renders categories, handles search, and delegates data shaping/drag payload details elsewhere.
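The drop side of this contract can be sketched as a pure function, which makes the "what's actually consumed" question testable. The function and payload below are assumptions for illustration (the PR's real drop handler reads from a React Flow onDrop event):

```typescript
// Pure sketch of drop-side decoding; `getData` stands in for
// event.dataTransfer.getData so the logic runs without a DOM.
function decodeToolDrop(getData: (type: string) => string): { id: string } | null {
  const toolId = getData('application/reactflow');
  if (!toolId) return null; // not a palette drag
  // Prefer the optional full payload when the palette set it.
  const raw = getData('application/toolData');
  return raw ? JSON.parse(raw) : { id: toolId };
}

// Simulate a drag that set both payloads, as setToolDragData
// with includeFullData: true would.
const payloads: Record<string, string> = {
  'application/reactflow': 'slack',
  'application/toolData': JSON.stringify({ id: 'slack', label: 'Slack' }),
};
const dropped = decodeToolDrop((type) => payloads[type] ?? '');
// dropped → { id: 'slack', label: 'Slack' }
```

If the drop handler only ever reads `application/reactflow`, the full-payload branch (and the extra serialization on drag start) can be dropped with no behavior change.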
Removed commented sections from the system architecture diagram for clarity.
Removed high-level overview and system architecture diagram sections from the document.
Added system architecture diagram and updated section headers.
Removed the AutoFlow System Architecture section and diagram.
Expanded the AutoFlow system architecture documentation with detailed sections on high-level overview, system architecture diagram, component breakdown, data persistence, and key workflows.
…odular backend, frontend, whatsapp-bridge and docker setup
- Complete README with features, architecture, setup, API reference, database schema, usage guide, tech stack, and contribution guidelines - Detailed SYSTEM_ARCHITECTURE.md with data flow diagrams, component hierarchies, executor engine internals, LLM integration, and scalability roadmap
…ar, and proper API integration
…500+ drag-and-drop tools
… browser needed, instant QR)
… render QR in terminal
…olished chat UI, dynamic toolbox
Summary by Sourcery
Integrate file-based inventory context and improved AI workflow generation/visualization with a refined dashboard, builder UX, and WhatsApp/simulation robustness.