# AAD - AI Assistant for WhatsApp

<!-- AUTO-MANAGED: project-description -->
WhatsApp bot with AI-powered conversational intelligence. Features dual-mode architecture (v2.0 + legacy), semantic memory with embeddings, intent detection, emotional intelligence, multi-persona system, and adaptive human-like responses.
<!-- END AUTO-MANAGED -->

<!-- AUTO-MANAGED: build-commands -->
## Build Commands

- Start bot: `node index.js`
- Test files: `node test_*.js`
<!-- END AUTO-MANAGED -->

<!-- AUTO-MANAGED: architecture -->
## Architecture

```
src/
├── ai/                                 # AI services (dual-mode: v2.0 + legacy)
│   ├── conversationOrchestrator.js    # v2.0: Master orchestrator (default)
│   ├── conversationalEngine.js        # v2.0: StateGraph + intent + emotion
│   ├── semanticMemory.js              # v2.0: Long-term memory with embeddings
│   ├── aiService.js                   # Legacy: RAG + LLM + FSM (fallback)
│   ├── vectorStore.js                 # Legacy: Knowledge base with embeddings
│   └── history.js                     # Legacy: Conversation history manager
├── services/                          # Domain services
│   ├── clientService.js               # Client data persistence
│   ├── mediaService.js                # Image/audio generation + vision
│   └── voiceIntelligence.js           # Voice transcription + emotion analysis + TTS
├── data/
│   ├── knowledgeBase.js               # RAG training data
│   ├── clients.db.json                # Client database
│   └── semantic_memory.json           # v2.0: Semantic memory store
├── doug/                              # Persona definitions
│   ├── constitution.js                # Core personality rules
│   └── knowledge.js                   # Domain knowledge
├── personas.js                        # Persona prompts (SOCIAL_MEDIA, CONSULTANT)
├── conversationState.js               # Legacy: FSM (GREETING → DISCOVERY → ENGAGEMENT → PITCH → CLOSE)
└── commandDispatcher.js               # Dual-mode router + message buffer + humanization
```

**Flow (v2.0 - default)**: WhatsApp message → commandDispatcher (buffer + routing) → conversationOrchestrator (semantic memory retrieval) → conversationalEngine (StateGraph: intent detection + emotion analysis + LLM generation) → conversationOrchestrator (response formatting + execution)

**Flow (legacy - fallback)**: WhatsApp message → commandDispatcher (routing + FSM update) → aiService (RAG + generation) → commandDispatcher (humanized delivery)

**Toggle**: Set `USE_NEW_ENGINE=false` to use the legacy system.
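
A minimal sketch of how this toggle could gate routing in commandDispatcher.js. Only `USE_NEW_ENGINE` and the module paths come from this document; the function and method names are assumptions.

```javascript
// Hypothetical routing guard; method names are illustrative, not the actual exports.
const useNewEngine = process.env.USE_NEW_ENGINE !== 'false'; // defaults to v2.0

async function dispatch(message) {
  if (useNewEngine) {
    const orchestrator = require('./src/ai/conversationOrchestrator');
    return orchestrator.handleIncomingMessage(message); // assumed method name
  }
  const aiService = require('./src/ai/aiService'); // legacy RAG + FSM path
  return aiService.generateResponse(message);      // assumed method name
}
```
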
<!-- END AUTO-MANAGED -->

<!-- AUTO-MANAGED: conventions -->
## Conventions

- **Imports**: CommonJS (`require`), no ES6 modules
- **Lazy loading**: API-dependent modules loaded on demand to prevent initialization errors (LangChain, embeddings, voice services); see the sketch after this list
- **Services**: Singleton pattern (`module.exports = new ClassName()`)
- **State management**: In-memory Maps for chat state and history
- **File organization**: Feature-based directories (ai/, services/, data/)
- **Architecture mode**: Toggle via `USE_NEW_ENGINE` env var (default: true for v2.0)
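
A minimal sketch of the singleton and lazy-loading conventions, using a hypothetical service class (the `@langchain/openai` import is only an example of an API-dependent dependency):

```javascript
// Illustrative service following the conventions above; not an actual project file.
class ExampleService {
  constructor() {
    this.embeddings = null; // API-dependent, so not created at require() time
  }

  // Lazy loading: pull in the API-bound module only on first use,
  // so a missing API key cannot break startup.
  getEmbeddings() {
    if (!this.embeddings) {
      const { OpenAIEmbeddings } = require('@langchain/openai');
      this.embeddings = new OpenAIEmbeddings();
    }
    return this.embeddings;
  }
}

// Singleton pattern: every require() of this file shares one instance.
module.exports = new ExampleService();
```
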
<!-- END AUTO-MANAGED -->

<!-- AUTO-MANAGED: patterns -->
## Patterns

### v2.0 Conversational Engine (Default)

**StateGraph Architecture**:
- **State schema**: `createInitialState()` maintains rich context (messages, intent, emotionalState, userProfile, conversationMetrics, personaBlendRatio, contextWindow, longTermMemories, pendingActions, conversationPhase, proactiveHooks); see the sketch after this list
- **Intent classifier**: Pattern-based detection with confidence scores for 13+ intents (greeting, farewell, question, request_help, request_content, request_image, request_audio, share_achievement, express_frustration, seek_validation, request_consultation, small_talk, objection, buying_signal)
- **Composite intents**: Multi-signal detection (ready_to_buy, needs_nurturing, highly_engaged)
- **Emotional intelligence**: Valence/arousal model with 9 emotions (excited, happy, grateful, frustrated, sad, confused, anxious, curious, neutral), with intensity tracking and trend analysis (rising/falling/stable)
- **Semantic memory**: 4 memory types (EPISODIC, SEMANTIC, PROCEDURAL, EMOTIONAL) with embeddings, importance scoring, access count tracking, and temporal decay
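
A sketch of what the `createInitialState()` schema might look like. Field names follow the schema listed above; the default values are assumptions.

```javascript
// Illustrative state shape; defaults are guesses, field names come from the schema above.
function createInitialState(chatId) {
  return {
    chatId,
    messages: [],                                             // running transcript
    intent: null,                                             // latest classified intent
    emotionalState: { emotion: 'neutral', intensity: 0, trend: 'stable' },
    userProfile: {},                                          // extracted client data
    conversationMetrics: { engagement: 0, messageCount: 0 },
    personaBlendRatio: { social_media: 0.7, consultant: 0.3 },
    contextWindow: [],                                        // recent turns sent to the LLM
    longTermMemories: [],                                     // retrieved from semanticMemory
    pendingActions: [],                                       // e.g. image/audio requests
    conversationPhase: 'greeting',
    proactiveHooks: []                                        // follow-ups to raise later
  };
}
```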

**Three-layer orchestration**:
1. **conversationOrchestrator.js**: MessageBuffer (3.5s timeout), ResponseFormatter (fragmentation + timing), media generation, response execution
2. **conversationalEngine.js**: IntentClassifier, EmotionalIntelligence, PersonaBlender, state management, LLM generation with memory context
3. **semanticMemory.js**: Vector-based memory store/retrieve with similarity consolidation (0.85 threshold; sketched below), auto-pruning (max 100 per chat), disk persistence
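
A sketch of the similarity-based consolidation rule in semanticMemory.js, assuming plain cosine similarity over embeddings and an illustrative memory shape:

```javascript
// Cosine similarity between two embedding vectors of equal length.
function cosineSimilarity(a, b) {
  let dot = 0, normA = 0, normB = 0;
  for (let i = 0; i < a.length; i++) {
    dot += a[i] * b[i];
    normA += a[i] * a[i];
    normB += b[i] * b[i];
  }
  return dot / (Math.sqrt(normA) * Math.sqrt(normB));
}

// Merge a new memory into an existing one when similarity exceeds 0.85;
// the memory fields (embedding, importance, accessCount) are assumptions.
function consolidate(existingMemories, candidate, threshold = 0.85) {
  const match = existingMemories.find(
    (m) => cosineSimilarity(m.embedding, candidate.embedding) >= threshold
  );
  if (match) {
    match.importance = Math.max(match.importance, candidate.importance);
    match.accessCount += 1;                  // reinforce instead of duplicating
    return existingMemories;
  }
  return [...existingMemories, candidate];   // pruned to 100 per chat elsewhere
}
```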

**ResponseFormatter features**:
- Auto-fragmentation by sentence boundaries (max 200 chars per fragment, max 4 fragments)
- Humanized typing time: fast (20ms/char), normal (35ms/char), slow (50ms/char) with ±20% variation
- Inter-message delay: 300-800ms random
- TTS preparation: strips `<SPLIT>`, `<REACT>`, `||tags||`, emojis, markdown
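
A sketch of the fragmentation and typing-time rules above. The function names are illustrative; the constants come from the list.

```javascript
// Split a reply on sentence boundaries into at most 4 fragments of ~200 chars.
function fragmentText(text, maxChars = 200, maxFragments = 4) {
  const sentences = text.match(/[^.!?]+[.!?]*/g) || [text];
  const fragments = [''];
  for (const sentence of sentences) {
    const current = fragments[fragments.length - 1];
    const candidate = (current + ' ' + sentence).trim();
    if (candidate.length > maxChars && current !== '' && fragments.length < maxFragments) {
      fragments.push(sentence.trim());
    } else {
      fragments[fragments.length - 1] = candidate;
    }
  }
  return fragments;
}

// Humanized typing time with ±20% jitter, plus a 300-800ms inter-message delay.
function typingDelay(text, pace = 'normal') {
  const msPerChar = { fast: 20, normal: 35, slow: 50 }[pace];
  const jitter = 0.8 + Math.random() * 0.4;
  return Math.round(text.length * msPerChar * jitter);
}

const interMessageDelay = () => 300 + Math.random() * 500;
```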

**PersonaBlender dynamics**:
- Calculates social_media vs consultant weight ratios (0.0-1.0 each)
- Adjusts based on: intent (request_consultation → 0.9 consultant), emotion (frustrated/sad → +0.2 consultant), conversation phase (pitch/close → +0.3 consultant), engagement level (>70 → more direct)
- Default: 0.7 social_media, 0.3 consultant
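
A sketch of the blend calculation, assuming the adjustments above are additive on the consultant weight (the +0.1 bump for engagement >70 is a guess; the document only says "more direct"):

```javascript
// Illustrative blend helper; offsets come from the list above, clamping is assumed.
function blendPersonas({ intent, emotion, phase, engagement }) {
  let consultant = 0.3; // default blend: 0.7 social_media / 0.3 consultant

  if (intent === 'request_consultation') consultant = 0.9;
  if (emotion === 'frustrated' || emotion === 'sad') consultant += 0.2;
  if (phase === 'pitch' || phase === 'close') consultant += 0.3;
  if (engagement > 70) consultant += 0.1; // assumed size of the "more direct" shift

  consultant = Math.min(1, Math.max(0, consultant));
  return { social_media: 1 - consultant, consultant };
}
```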

### Legacy System (Fallback)

**FSM Integration** (used when `USE_NEW_ENGINE=false`):
1. **conversationState.js**: Pure FSM logic, state transitions, reaction detection
2. **commandDispatcher.js**: Orchestration - updates FSM, retrieves state instructions
3. **aiService.js**: Generation - receives `stateInstructions` parameter, injects into prompt
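
A minimal sketch of the legacy phase progression. Only the phase names come from the architecture tree; the signal names and the advance-one-phase rule are assumptions.

```javascript
// Illustrative FSM helper; not the actual conversationState.js logic.
const FSM_PHASES = ['GREETING', 'DISCOVERY', 'ENGAGEMENT', 'PITCH', 'CLOSE'];

function nextPhase(current, signals) {
  const idx = FSM_PHASES.indexOf(current);
  // Assumed rule: advance one phase when the conversation shows a buying
  // signal or sustained engagement; otherwise stay put.
  if (signals.buyingSignal || signals.engaged) {
    return FSM_PHASES[Math.min(idx + 1, FSM_PHASES.length - 1)];
  }
  return current;
}
```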

### Shared Patterns

**Special Tag Protocol**:
AI responses contain control tags parsed by commandDispatcher:
- `||SAVE|| {json}` - Extract and save client data
- `||GENERATE_IMAGE: prompt||` - Trigger image generation
- `||SEND_AUDIO: text||` - Trigger TTS
- `<REACT:emoji>` - Send WhatsApp reaction
- `<SPLIT>` - Fragment messages for natural delivery
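
A sketch of how commandDispatcher might extract these tags from a raw model reply. The regexes and return shape are assumptions; the tag syntax is from the list above.

```javascript
// Illustrative tag parser; not the actual dispatcher implementation.
function parseControlTags(reply) {
  const actions = [];

  const save = reply.match(/\|\|SAVE\|\|\s*({[\s\S]*?})/);
  if (save) actions.push({ type: 'save', data: JSON.parse(save[1]) });

  const image = reply.match(/\|\|GENERATE_IMAGE:\s*([\s\S]+?)\|\|/);
  if (image) actions.push({ type: 'image', prompt: image[1].trim() });

  const audio = reply.match(/\|\|SEND_AUDIO:\s*([\s\S]+?)\|\|/);
  if (audio) actions.push({ type: 'audio', text: audio[1].trim() });

  const reaction = reply.match(/<REACT:(.+?)>/);
  if (reaction) actions.push({ type: 'react', emoji: reaction[1] });

  // Strip all tags, then split the remaining text on <SPLIT> for delivery.
  const clean = reply
    .replace(/\|\|[\s\S]*?\|\|/g, '')
    .replace(/<REACT:.+?>/g, '')
    .trim();
  const fragments = clean.split('<SPLIT>').map((f) => f.trim()).filter(Boolean);

  return { actions, fragments };
}
```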

**Persona System**:
Two personas with dynamic blending:
- **SOCIAL_MEDIA**: Energetic, friendly tone (vocabulary: top, show, bora, massa; emoji freq: 0.7; formality: 0.3; areas: content, engagement, growth, posts)
- **CONSULTANT**: Direct, brutal tone (vocabulary: mano, cara, saca, pqp, resultado; emoji freq: 0.3; formality: 0.2; areas: consultoria, strategy, business, sales)
- PersonaBlender calculates dynamic weight ratios based on emotion, phase, and engagement
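
An illustrative shape for the persona definitions in personas.js, based on the attributes above (exact field names are assumptions):

```javascript
// Persona attribute objects; values come from the bullets above, field names are guesses.
const SOCIAL_MEDIA = {
  tone: 'energetic, friendly',
  vocabulary: ['top', 'show', 'bora', 'massa'],
  emojiFrequency: 0.7,
  formality: 0.3,
  areas: ['content', 'engagement', 'growth', 'posts'],
};

const CONSULTANT = {
  tone: 'direct, brutal',
  vocabulary: ['mano', 'cara', 'saca', 'pqp', 'resultado'],
  emojiFrequency: 0.3,
  formality: 0.2,
  areas: ['consultoria', 'strategy', 'business', 'sales'],
};
```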

**Message Buffering** (MessageBuffer class in orchestrator):
- 3.5s timeout to combine rapid messages into single text
- Tracks firstMessageTime, messages array, voiceContext
- Single processing callback after timeout expires
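
A sketch of the MessageBuffer behaviour: messages arriving within the 3.5s window are concatenated and processed once. Method names and the fixed-window-from-first-message choice are assumptions.

```javascript
// Illustrative buffer; tracks firstMessageTime, messages, voiceContext as above.
class MessageBuffer {
  constructor(timeoutMs = 3500) {
    this.timeoutMs = timeoutMs;
    this.buffers = new Map(); // chatId → { messages, firstMessageTime, voiceContext, timer }
  }

  add(chatId, text, onFlush, voiceContext = null) {
    let buf = this.buffers.get(chatId);
    if (!buf) {
      buf = { messages: [], firstMessageTime: Date.now(), voiceContext: null, timer: null };
      this.buffers.set(chatId, buf);
      // Single processing callback once the window expires.
      buf.timer = setTimeout(() => {
        this.buffers.delete(chatId);
        onFlush(buf.messages.join('\n'), buf.voiceContext);
      }, this.timeoutMs);
    }
    buf.messages.push(text);
    if (voiceContext) buf.voiceContext = voiceContext;
  }
}
```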

**Voice Intelligence System** (`voiceIntelligence.js`):
Hybrid text/audio response with emotional awareness:

**Transcription**:
- Whisper API with verbose_json format for duration metadata
- Returns: `{ text, duration, emotion, energy, shouldRespondWithAudio, responseType, isLongMessage }`

**Emotion detection** (keyword patterns):
- frustrated: pqp, caralho, merda, não consigo, difícil
- excited: incrível, demais, top, consegui, vendi
- confused: como assim, não entendi, pode explicar
- sad: desistir, não sei mais, perdido, fracasso
- urgent: urgente, agora, rápido, socorro
- grateful: obrigado, valeu, gratidão
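
A sketch of keyword-based detection over the transcript. The pattern map mirrors the lists above; the first-match-wins rule is an assumption.

```javascript
// Keyword patterns from the list above; matching order and scoring are assumed.
const EMOTION_KEYWORDS = {
  frustrated: ['pqp', 'caralho', 'merda', 'não consigo', 'difícil'],
  excited: ['incrível', 'demais', 'top', 'consegui', 'vendi'],
  confused: ['como assim', 'não entendi', 'pode explicar'],
  sad: ['desistir', 'não sei mais', 'perdido', 'fracasso'],
  urgent: ['urgente', 'agora', 'rápido', 'socorro'],
  grateful: ['obrigado', 'valeu', 'gratidão'],
};

function detectEmotion(text) {
  const lower = text.toLowerCase();
  for (const [emotion, keywords] of Object.entries(EMOTION_KEYWORDS)) {
    if (keywords.some((k) => lower.includes(k))) return emotion;
  }
  return 'neutral';
}
```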

**Response strategy**:
- `shouldRespondWithAudio = true` if: duration >10s OR emotion in [frustrated, sad]
- responseType: 'audio' (long/emotional), 'text' (short), 'hybrid' (excited/grateful)
- Audio preference tracking via `audioHistory` Map (chatId → count)
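
A sketch combining the response-strategy rules above into one decision helper (the precedence of the hybrid check over the audio check is an assumption):

```javascript
// Illustrative decision helper; thresholds and emotion sets come from the list above.
function chooseResponse({ duration, emotion }) {
  const shouldRespondWithAudio =
    duration > 10 || ['frustrated', 'sad'].includes(emotion);
  const responseType = ['excited', 'grateful'].includes(emotion)
    ? 'hybrid'
    : shouldRespondWithAudio ? 'audio' : 'text';
  return { shouldRespondWithAudio, responseType };
}
```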

**Voice selection** (6 voices: nova, onyx, shimmer, echo, fable, alloy):
- CONSULTANT persona: onyx (confident) or echo (serious) for frustrated
- SOCIAL_MEDIA persona: emotion-adaptive (warm/fable for sad, energetic/nova for excited, friendly/shimmer for confused, confident/onyx for urgent)
- Speed modulation: urgent (1.1x), sad (0.9x), normal (1.0x)
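
A sketch of voice and speed selection from persona and detected emotion, following the mappings above (the fallback voice for unmapped emotions is an assumption):

```javascript
// Illustrative selection helper; voice names and speeds come from the lists above.
function selectVoice(persona, emotion) {
  const speed = emotion === 'urgent' ? 1.1 : emotion === 'sad' ? 0.9 : 1.0;

  if (persona === 'CONSULTANT') {
    return { voice: emotion === 'frustrated' ? 'echo' : 'onyx', speed };
  }

  // SOCIAL_MEDIA: emotion-adaptive voice choice
  const byEmotion = { sad: 'fable', excited: 'nova', confused: 'shimmer', urgent: 'onyx' };
  return { voice: byEmotion[emotion] || 'nova', speed }; // 'nova' fallback assumed
}
```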

**Integration** (legacy dispatcher):
```javascript
// 1. index.js detects: message.hasMedia && message.type === 'ptt'
// 2. voiceIntelligence.transcribeWithContext(media)
// 3. commandDispatcher injects voiceContext into extraInstructions
// 4. Auto-generates audio if shouldSendVoiceResponse || userPrefersAudio(chatId)
```
<!-- END AUTO-MANAGED -->

<!-- MANUAL -->
## Manual Notes

Add project-specific notes here.
<!-- END MANUAL -->