The simulator lets you test decisions before deploying them. Run sample inputs, see which rules match, and trace data flow through every node.

Opening the simulator

Click Open Simulator in the top-right toolbar. The simulator panel opens at the bottom of the canvas with three sections:
  • Events panel (left) — Manage test events
  • Node trace (center) — Search and inspect node execution
  • Results panel (right) — View Output, Input, and Trace tabs

Managing test events

In the BRMS, test events are organized into:
  • Unsaved — Temporary events that aren’t persisted
  • Private — Your personal saved events
  • Shared — Events shared with your team
Click + to create a new event, or select an existing one to run it.

Running a test

  1. Select or create a test event
  2. Enter your test input as JSON
  3. Click the Run button (play icon) or press Enter
  4. View results in the Output tab
A sample test input might look like this:
{
  "customer": {
    "tier": "gold",
    "yearsActive": 3
  },
  "order": {
    "subtotal": 250,
    "items": 5
  }
}

Reading results

After running a test, you see:
  • Output — The final result returned by your decision
  • Trace — Step-by-step execution showing:
    • Which nodes executed
    • What data each node received
    • What each node produced
    • Which decision table rows matched
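For the sample input shown earlier, a hypothetical discount decision might return an Output along these lines (the field names and values are illustrative, not a fixed schema):
{
  "discount": 0.15,
  "discountAmount": 37.5,
  "finalTotal": 232.5  // 250 subtotal + 20 tax - 37.50 discount
}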

Trace view

The trace shows execution order and data at each step:
Step | Node             | Input                            | Output
1    | Input            | —                                | {customer: {...}, order: {...}}
2    | Calculate Totals | {order: {...}}                   | {subtotal: 250, tax: 20}
3    | Discount Rules   | {customer: {...}, subtotal: 250} | {discount: 0.15}
4    | Output           | {discount: 0.15, ...}            | Final result
Click any node in the trace to see its full input and output data.
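The level of detail depends on the node type, but an expanded entry for the Discount Rules step above might look roughly like this (the structure is illustrative):
{
  "node": "Discount Rules",
  "input": {
    "customer": { "tier": "gold", "yearsActive": 3 },
    "subtotal": 250
  },
  "output": { "discount": 0.15 }
}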

Decision table tracing

For decision tables, the trace shows:
  • Evaluated rows — All rows that were checked
  • Matched row — The row (or rows) that matched
  • Match details — Which conditions passed or failed
This helps you understand why a particular row matched or didn’t match.
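As a rough sketch, the decision-table portion of a trace for a hypothetical tier-discount table could report something like this (the field names are illustrative, not the exact trace format):
{
  "node": "Discount Rules",
  "rowsEvaluated": 3,
  "matched": {
    "row": 1,
    "conditions": [
      { "expression": "customer.tier == 'gold'", "passed": true },
      { "expression": "subtotal >= 100", "passed": true }
    ]
  },
  "output": { "discount": 0.15 }
}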

Performance metrics

The trace includes timing for each node:
  • Execution time — Microseconds spent in each node
  • Total time — End-to-end evaluation time
Use these metrics to identify slow nodes in complex decisions.
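Purely as an illustration (the numbers and layout are made up), per-node timing might read like this:
{
  "performance": {
    "Input": "12µs",
    "Calculate Totals": "85µs",
    "Discount Rules": "210µs",
    "Output": "9µs",
    "total": "316µs"
  }
}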

Testing strategies

Test edge cases

Create inputs that test boundary conditions:
// Test the boundary at exactly 100
{ "order": { "total": 100 } }

// Test just below the boundary
{ "order": { "total": 99.99 } }

// Test just above the boundary
{ "order": { "total": 100.01 } }

Test each decision table row

Create inputs designed to match each row in your tables. This ensures all paths work correctly.
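For example, if a hypothetical discount table has one row for gold customers, one for silver customers, and a catch-all row, you might prepare one event per row:
// Should match the gold row
{ "customer": { "tier": "gold" }, "order": { "subtotal": 250 } }

// Should match the silver row
{ "customer": { "tier": "silver" }, "order": { "subtotal": 250 } }

// Should fall through to the catch-all row
{ "customer": { "tier": "bronze" }, "order": { "subtotal": 250 } }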

Test error conditions

Try inputs with the following (examples after the list):
  • Missing fields
  • Null values
  • Invalid data types
  • Empty arrays
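For example, reusing the earlier sample fields (purely illustrative):
// Missing field: customer.tier is absent
{ "customer": { "yearsActive": 3 }, "order": { "subtotal": 250, "items": 5 } }

// Null value
{ "customer": { "tier": null, "yearsActive": 3 }, "order": { "subtotal": 250, "items": 5 } }

// Invalid data type: subtotal as a string
{ "customer": { "tier": "gold", "yearsActive": 3 }, "order": { "subtotal": "250", "items": 5 } }

// Empty array (if your decision reads list fields, e.g. a hypothetical lineItems)
{ "customer": { "tier": "gold", "yearsActive": 3 }, "order": { "subtotal": 0, "items": 0, "lineItems": [] } }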

Save test events

In the BRMS, save your test events by clicking the menu icon next to the event and selecting Save to Private. Build a library of events that cover your critical scenarios.

Debugging tips

  • No output? Check that all nodes are connected. Data can’t flow through disconnected nodes.
  • Wrong row matched? Review your decision table row order. With first-hit policy, earlier rows take precedence.
  • Unexpected null? Trace the data to find where the value became null. Check for typos in field names.
  • Expression error? The trace shows the exact expression that failed and the error message.