The Lifecycle of a Turn
A visual journey of data through the engine, step-by-step.
This guide traces the journey of a single user input (e.g., "The cat eats") through the BSP engine, from raw text to the final response and state update.
The Flow at a Glance
Input → Tokenization → Activation → Deduction → Sequence Generation → Learning
Step-by-Step Walkthrough
Step 1: Input & Tokenization (The Eyes)
Input: "The cat eats"
The Tokenizer breaks this down. It doesn't use embeddings; it maps each string to a deterministic integer ID.
"the"→12"cat"→45"eats"→99
Result: A Bitset representing {12, 45, 99}.
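Here is a minimal sketch of such a tokenizer in Python. The class name and the first-seen ID scheme are illustrative assumptions; the engine's real vocabulary assigns the IDs shown above (12, 45, 99).

```python
class Tokenizer:
    """Maps each distinct lowercase word to a stable integer ID (no embeddings)."""

    def __init__(self):
        self.ids: dict[str, int] = {}  # word -> integer ID

    def encode(self, text: str) -> set[int]:
        tokens = set()
        for word in text.lower().split():
            if word not in self.ids:
                self.ids[word] = len(self.ids)  # deterministic: first-seen order
            tokens.add(self.ids[word])
        return tokens  # a plain set standing in for the engine's Bitset

tok = Tokenizer()
print(tok.encode("The cat eats"))  # {0, 1, 2} with this toy ID scheme
```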
Step 2: Activation (The Recognition)
The Learner asks the GroupStore: "Who cares about these tokens?"
- It checks the Inverted Index.
- It finds Group #5 (which contains {cat, dog, pet}) and Group #20 (which contains {eats, drinks}).
- It scores the overlap (a Jaccard-style similarity): Group #5 has 1 match ("cat") out of 3 members, so its score is 1/3 ≈ 0.33.
Result: Active Groups = [Group #5, Group #20].
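A sketch of the activation step, assuming hypothetical token IDs for the other group members (only "cat" = 45 and "eats" = 99 come from the example above):

```python
from collections import defaultdict

# Hypothetical groups: group ID -> member token IDs
groups = {5: {45, 46, 47},   # {cat, dog, pet}
          20: {99, 100}}     # {eats, drinks}

# Inverted index: token ID -> IDs of groups containing it
index = defaultdict(set)
for gid, members in groups.items():
    for token in members:
        index[token].add(gid)

def activate(tokens: set[int], threshold: float = 0.2) -> list[int]:
    hits = defaultdict(int)
    for token in tokens:          # only groups sharing a token are ever touched
        for gid in index[token]:
            hits[gid] += 1
    # Score = matches / group size (1/3 ~= 0.33 for Group #5)
    return sorted(gid for gid, n in hits.items() if n / len(groups[gid]) >= threshold)

print(activate({12, 45, 99}))  # [5, 20]
```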
Step 3: Deduction (The Thought)
The DeductionGraph looks at the active groups. It contains learned temporal links.
- It sees Group #5 (Pet) and Group #20 (Action).
- It finds a strong link: Group #5 + Group #20 → Group #99 (Food).
- It "activates" Group #99 as a prediction.
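One way such a link table could be represented; the frozenset premise and the 0.8 strength are invented for illustration:

```python
# Learned temporal links: co-active premise groups -> (predicted group, strength)
links = {frozenset({5, 20}): (99, 0.8)}  # Pet + Action -> Food

def deduce(active: set[int], min_strength: float = 0.5) -> set[int]:
    predicted = set()
    for premise, (conclusion, strength) in links.items():
        if premise <= active and strength >= min_strength:  # all premises active?
            predicted.add(conclusion)
    return predicted

print(deduce({5, 20}))  # {99}
```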
Step 4: Sequence Generation (The Speech)
The engine now has a bag of predicted concepts (from Group #99: {food, fish, dry, bowl}).
The SequenceModel (a smart n-gram graph) constructs a sentence.
- It sees the user ended with "eats".
- It knows "eats" is often followed by "food" or "fish".
- It selects "fish" because Group #99 boosted its score.
Output: "fish"
Step 5: Learning (The Update)
Once the turn is done (or if we have ground truth), the system learns.
- Surprise: Did the user say what we expected? If the user actually said "The cat eats lasagna", and we predicted "fish", the surprise is high.
- Update:
  - The link Cat → Fish is weakened slightly.
  - The link Cat → Lasagna is strengthened.
  - If "Lasagna" appears often enough, a new Group might form.
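A sketch of the surprise-driven update. The link weights and learning rate are illustrative values, and group formation is only hinted at in a comment:

```python
# Hypothetical link weights from a context concept to next-word candidates
weights = {("cat", "fish"): 0.8, ("cat", "lasagna"): 0.1}

def learn(context: str, predicted: str, actual: str, lr: float = 0.1) -> None:
    if predicted != actual:  # high surprise: prediction missed the ground truth
        weights[(context, predicted)] = weights.get((context, predicted), 0.0) - lr
        weights[(context, actual)] = weights.get((context, actual), 0.0) + lr
    # A separate process could promote "lasagna" into a new Group once it
    # co-occurs often enough.

learn("cat", predicted="fish", actual="lasagna")
print(weights)  # {('cat', 'fish'): ~0.7, ('cat', 'lasagna'): ~0.2}
```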