HEART: Hierarchical Emotion Architecture for Reasoning and Trait-Evolution — A Continuous Multi-Dimensional Affect Engine with Conflict Resolution, Personality-Coupled Feedback, and Domain-Adaptive Emotion Processing
| Application No. | 202521098101 |
| Complete Spec. | 6 April 2026 |
| Provisional Filed | 11 October 2025 |
| Applicant | Manvendra Modgil |
| Assignee | Modint Intelligence |
| Inventors | Manvendra Modgil |
| IPC Classes | G06N 3/00, G06N 5/04, G06F 40/56 |
Abstract
The invention discloses HEART (Hierarchical Emotion Architecture for Reasoning and Trait-Evolution), a multi-dimensional affect engine for artificial intelligence systems. The engine maintains an eighteen-dimensional emotional state vector with per-emotion exponential decay, momentum bias, and energy normalization. A conflict resolution mechanism dampens co-occurring opposing emotions with energy redistribution. An emotion blending module detects composite emotional states with behavioral overlays. A personality evolution module, bidirectionally coupled to the engine, adjusts Big Five traits through time-decayed aggregation and feeds personality bias back into emotional processing. Affect-biased retrieval modulates memory scoring across recency, salience, and trust dimensions. A self-preservation layer gates autonomous actions through emotion-weighted risk scoring. Domain interfaces adapt the engine for clinical assessment, decision governance, and adaptive education. The system achieves emotion-driven adaptive computation, maintaining continuity of mood, personality, and context across sessions without retraining.
Claims
20 claims · 6 independent

1. An emotion processing system for artificial intelligence applications, comprising: (a) an emotional state engine configured to maintain a multi-dimensional affect vector comprising at least eighteen discrete emotion dimensions, each represented as a continuous value in the range [0.0, 1.0]; (b) a per-emotion exponential decay mechanism configured to decay each emotion dimension independently toward a configurable baseline intensity according to a half-life constant specific to that emotion dimension; (c) a momentum bias mechanism configured to modulate the decay rate and event application strength based on the first derivative of each emotion dimension’s trajectory over a rolling window of recent state snapshots; (d) an energy normalization constraint configured to cap the total sum of all emotion dimension intensities to a configurable maximum and proportionally scale all intensities when the constraint is violated; and (e) a dominant emotion tracker configured to identify the highest-intensity emotion dimension and compute a classification confidence score inversely related to the volatility of the dominant emotion.
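The state dynamics claimed above can be sketched as follows. This is a minimal illustration, not the specification's implementation: the emotion names (six of the eighteen dimensions shown), half-life values, baseline intensities, and energy cap are all assumed for the example, and the momentum bias mechanism of element (c) is omitted for brevity.

```python
import math

# Illustrative subset of the eighteen dimensions; names and constants are assumptions.
EMOTIONS = ["joy", "sadness", "anger", "fear", "trust", "disgust"]
HALF_LIFE = {e: 60.0 for e in EMOTIONS}   # per-emotion half-life, seconds (assumed)
BASELINE = {e: 0.1 for e in EMOTIONS}     # configurable resting intensity (assumed)
ENERGY_CAP = 3.0                          # configurable total-intensity maximum (assumed)

def decay(state, dt):
    """Decay each dimension independently toward its baseline (element (b))."""
    out = {}
    for e, v in state.items():
        k = math.log(2) / HALF_LIFE[e]                     # rate from half-life
        out[e] = BASELINE[e] + (v - BASELINE[e]) * math.exp(-k * dt)
    return out

def normalize(state):
    """Proportionally rescale all intensities when total energy exceeds the cap (element (d))."""
    total = sum(state.values())
    if total > ENERGY_CAP:
        scale = ENERGY_CAP / total
        state = {e: v * scale for e, v in state.items()}
    return state

def dominant(state, history):
    """Highest-intensity dimension, with confidence inverse to its volatility (element (e))."""
    e = max(state, key=state.get)
    recent = [h[e] for h in history[-5:]] + [state[e]]
    mean = sum(recent) / len(recent)
    volatility = (sum((x - mean) ** 2 for x in recent) / len(recent)) ** 0.5
    return e, 1.0 / (1.0 + volatility)
```

After one assumed half-life (60 s), an intensity of 0.8 relaxes to 0.1 + 0.7 × 0.5 = 0.45, and a state whose total exceeds the cap is scaled down proportionally rather than clipped per-dimension.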
2. The system of claim 1, further comprising a conflict resolution mechanism configured to: detect co-occurring opposing emotions from a predefined conflict matrix when both emotions in a pair exceed an activation threshold; apply proportional dampening to both emotions based on the opposition weight and minimum intensity of the pair; redistribute a portion of the dampened energy to designated buffer emotions; and raise a system-wide reflective state flag when cumulative conflict pressure across all pairs exceeds a pressure threshold.
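A sketch of this conflict resolution step, under stated assumptions: the pair list, opposition weights, buffer emotions, and the three thresholds below are illustrative placeholders, not values from the specification.

```python
# Assumed conflict matrix: (emotion_a, emotion_b, opposition_weight, buffer_emotion).
CONFLICTS = [
    ("joy", "sadness", 0.8, "calm"),
    ("trust", "fear", 0.6, "curiosity"),
]
ACTIVATION = 0.4      # both members of a pair must exceed this (assumed)
REDISTRIBUTE = 0.5    # share of dampened energy sent to the buffer (assumed)
PRESSURE_LIMIT = 0.5  # cumulative pressure that raises the reflective flag (assumed)

def resolve_conflicts(state):
    """Dampen opposing pairs, redistribute energy, and report the reflective flag."""
    pressure = 0.0
    for a, b, w, buf in CONFLICTS:
        if state.get(a, 0.0) > ACTIVATION and state.get(b, 0.0) > ACTIVATION:
            # dampening proportional to opposition weight and the pair's minimum intensity
            damp = w * min(state[a], state[b])
            state[a] = max(0.0, state[a] - damp)
            state[b] = max(0.0, state[b] - damp)
            state[buf] = min(1.0, state.get(buf, 0.0) + REDISTRIBUTE * damp)
            pressure += damp
    return state, pressure > PRESSURE_LIMIT
```

Because the dampening scales with the weaker emotion of the pair, a strong emotion paired with a barely-active opposite loses little, while two strongly co-active opposites are both pulled down and part of their energy flows to the buffer emotion.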
3. The system of claim 1, further comprising an emotion blending mechanism configured to: detect composite emotional states emergent from co-occurring primary emotions according to a blend map defining composite emotion labels and blend weight functions; activate a composite emotion when its computed blend weight exceeds a blend activation threshold; and apply behavioral overlays specific to each composite emotion that modify downstream retrieval strategy, reasoning chain configuration, and agent parameters.
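The blend map and activation test might look like the following sketch. The composite labels, the geometric-mean weight function, the threshold, and the overlay contents are all assumptions made for illustration.

```python
# Assumed blend map: label -> (source pair, blend weight function).
BLEND_MAP = {
    "bittersweet": (("joy", "sadness"), lambda a, b: (a * b) ** 0.5),
    "awe":         (("fear", "surprise"), lambda a, b: (a * b) ** 0.5),
}
BLEND_THRESHOLD = 0.3  # blend activation threshold (assumed)
OVERLAYS = {           # hypothetical behavioral overlays
    "bittersweet": {"retrieval": "favor_autobiographical", "temperature": 0.6},
    "awe":         {"retrieval": "broaden_topics", "temperature": 0.9},
}

def active_blends(state):
    """Return composite emotions whose blend weight exceeds the threshold,
    each paired with its behavioral overlay."""
    out = {}
    for label, ((a, b), weight_fn) in BLEND_MAP.items():
        w = weight_fn(state.get(a, 0.0), state.get(b, 0.0))
        if w > BLEND_THRESHOLD:
            out[label] = {"weight": w, "overlay": OVERLAYS[label]}
    return out
```

A geometric mean is one plausible weight function because it activates a composite only when both primaries are present; if either is zero, the blend weight is zero.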
4. The system of claim 1, further comprising a per-emotion plugin architecture wherein each emotion dimension is associated with a modular processor defining: cross-emotion coupling weights representing the influence of that emotion’s activation on other emotion dimensions; an activation function responsive to textual patterns in input data; an influence function computing per-tick micro-dynamic delta vectors for background emotional evolution; and a reflection function generating narrative descriptions of emotional state changes.
5. The system of claim 4, wherein the plugin architecture operates on two gain schedules: a standard per-tick gain applied during idle processing cycles, and an elevated post-event gain applied immediately following detection of an emotional event, the elevated gain being at least 1.5 times the standard gain.
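The plugin interface of claims 4 and 5 can be sketched as a small class. The coupling weights, gain values, and the keyword-based activation are assumptions; the specification constrains only the relationship between the two gain schedules (elevated gain at least 1.5× standard).

```python
STANDARD_GAIN = 0.02                     # idle per-tick gain (assumed value)
POST_EVENT_GAIN = STANDARD_GAIN * 1.5    # elevated schedule, >= 1.5x standard

class EmotionPlugin:
    def __init__(self, name, coupling):
        self.name = name
        self.coupling = coupling  # cross-emotion coupling weights (assumed values)

    def activation(self, text):
        """Toy textual activation: a keyword match stands in for the claimed
        textual-pattern responsiveness."""
        return 1.0 if self.name in text.lower() else 0.0

    def influence(self, state, post_event=False):
        """Per-tick micro-dynamic delta vector, scaled by the active gain schedule."""
        gain = POST_EVENT_GAIN if post_event else STANDARD_GAIN
        src = state.get(self.name, 0.0)
        return {other: gain * w * src for other, w in self.coupling.items()}

    def reflect(self, old, new):
        """Narrative description of a state change for this dimension."""
        trend = "rising" if new > old else "falling" if new < old else "steady"
        return f"{self.name} is {trend} ({old:.2f} -> {new:.2f})"
```

The same `influence` function serves both schedules; only the gain differs, so the post-event window simply amplifies the background micro-dynamics rather than running a separate code path.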
Figures
Figure 1: System Block Diagram of HEART Engine Architecture
Overall architecture of the HEART engine (200) with interconnected modules including the PDAR Autonomy Loop (100), Personality-Evolution Module (300), Hybrid Memory-Knowledge Graph (400), Specialised Software Agents (500a–c), and Output Module (600). Arrows depict bidirectional data exchange between modules.
Figure 2: Emotional-State Engine and Personality Coupling
The eighteen-dimensional affect vector decomposed into individual emotion channels, the Affect Vector output, the Personality-Evolution Layer with OCEAN sliders, and the bias feedback loop from personality to emotion thresholds.
Figure 3: PDAR Autonomy Loop with Reflective Feedback
The four sequential phases (Perceive, Decide, Act, Reflect) and the feedback path from Reflect to HEART state update and Memory Graph storage, with loop-back to Perceive.
Figure 4: Memory-Knowledge Graph with Affect-Biased Retrieval
The PDAR Decide step querying with affect and salience bias, the hybrid memory store with heterogeneous node types (Journal, Dream Log, Reflection, Doc-Chunk) carrying salience, recency, trust, and affect metadata, Top-K context retrieval, and bias-affected response generation.
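The affect- and salience-biased scoring of Figure 4 could take the following shape. This is a hypothetical sketch: the four weights, the exponential recency term, and the affect-similarity measure are illustrative assumptions, not the specification's formula.

```python
import time

# Assumed scoring weights over the node metadata dimensions of Figure 4.
WEIGHTS = {"salience": 0.4, "recency": 0.3, "trust": 0.2, "affect": 0.1}
RECENCY_HALF_LIFE = 86_400.0  # one day, in seconds (assumed)

def score(node, query_affect, now):
    """Combine salience, recency, trust, and affect-match into one retrieval score."""
    recency = 0.5 ** ((now - node["timestamp"]) / RECENCY_HALF_LIFE)
    # simple affect match: 1 minus mean absolute difference over shared dimensions
    dims = set(node["affect"]) & set(query_affect)
    affect = 1.0 - sum(abs(node["affect"][d] - query_affect[d]) for d in dims) / max(len(dims), 1)
    return (WEIGHTS["salience"] * node["salience"]
            + WEIGHTS["recency"] * recency
            + WEIGHTS["trust"] * node["trust"]
            + WEIGHTS["affect"] * affect)

def top_k(nodes, query_affect, k=3, now=None):
    """Return the K highest-scoring memory nodes for context assembly."""
    now = now if now is not None else time.time()
    return sorted(nodes, key=lambda n: score(n, query_affect, now), reverse=True)[:k]
```

Under this scheme a recent, salient, trusted node whose stored affect resembles the current affect vector dominates retrieval, which matches the figure's flow from the Decide step through biased Top-K selection to response generation.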
Figure 5: Internal Monologue to Goal Formation Process
The Idle Internal-Monologue Engine writing to the Memory-Knowledge Graph, threshold-based promotion of entries to goals when salience exceeds threshold and affect trigger conditions are met, and handoff to PDAR Decide for goal execution.
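The promotion rule described for Figure 5 can be sketched as a single predicate. The salience threshold and the affect-trigger table are assumptions chosen for the example.

```python
SALIENCE_THRESHOLD = 0.7                                # assumed promotion threshold
AFFECT_TRIGGERS = {"curiosity": 0.5, "anxiety": 0.6}    # assumed; any one suffices

def promote_to_goal(entry, affect_state):
    """Promote a monologue entry to a goal when its salience exceeds the
    threshold AND at least one affect trigger condition is met."""
    if entry["salience"] < SALIENCE_THRESHOLD:
        return False
    return any(affect_state.get(e, 0.0) >= t for e, t in AFFECT_TRIGGERS.items())
```

Entries that pass this gate would then be handed to the PDAR Decide phase for execution, as the figure indicates.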