
What is a Use Case?

In MagOneAI, a Use Case is a workflow — a sequence of activities that orchestrate AI agents, tools, and logic to accomplish a task. Each Use Case is defined visually on the canvas, stored as portable JSON, and executed durably by Temporal. Think of a Use Case as a blueprint for how your AI agents, tools, and decision logic work together. You define the steps, connect them together, and MagOneAI handles the orchestration, execution, and state management.
MagOneAI uses the term “Use Case” for what other platforms call workflows or pipelines. Each Use Case contains Activities — the individual steps that make up the workflow.

Visual canvas builder

Build workflows visually using the drag-and-drop canvas in MagOneAI Studio. The canvas provides an intuitive way to design complex orchestration logic without writing code.

Building on the canvas

1. Add activities

Drag activity nodes from the sidebar onto the canvas. Each node represents a step in your workflow.

2. Connect the flow

Draw connections between nodes to define the execution order. Data flows through these connections.

3. Configure each node

Click any node to configure its settings, input mapping, and output handling.

4. Test and deploy

Test your workflow with sample data, then deploy it for production use.

Available node types

  • Agent — Execute an AI agent with reasoning, tools, knowledge bases, and structured output via DSPy
  • Tool — Call an MCP tool directly with specific parameters
  • Parallel — Run multiple branches simultaneously with configurable merge strategies
  • Condition — Route execution based on conditional logic (LLM-based or expression-based)
  • Human Task — Pause for human approval or input before continuing
  • Sub Use Case — Call another workflow as a reusable component with input/output mapping
  • ForEach — Iterate over collections, executing a child use case for each item in concurrent batches
Every workflow also includes automatic Start and End boundary nodes that mark the entry and exit points of execution.
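The ForEach node's "concurrent batches" behavior can be pictured as batched parallel execution. The sketch below is illustrative only — `process_item` and the batch size are assumptions standing in for the child use case, not MagOneAI's API:

```python
import asyncio

async def process_item(item: int) -> int:
    # Stand-in for running the child use case on one item.
    await asyncio.sleep(0)
    return item * 2

async def for_each(items: list[int], batch_size: int = 3) -> list[int]:
    # Run the child use case for each item in concurrent batches:
    # every batch of `batch_size` items executes in parallel, and
    # batches run one after another.
    results: list[int] = []
    for start in range(0, len(items), batch_size):
        batch = items[start:start + batch_size]
        results.extend(await asyncio.gather(*(process_item(i) for i in batch)))
    return results

print(asyncio.run(for_each([1, 2, 3, 4, 5])))  # [2, 4, 6, 8, 10]
```

Batching bounds concurrency: with five items and a batch size of three, at most three child executions run at once.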

Workflow as JSON

Every canvas workflow has an equivalent JSON definition. This portable format enables:
  • Version control — Track changes to workflows in Git
  • Import/export — Share workflows across projects and teams
  • Programmatic generation — Build workflows dynamically
  • Backup and migration — Move workflows between environments
The JSON definition captures everything: activity types, configurations, connections, input/output mappings, and conditional logic. You can switch between visual and JSON editing at any time.
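As an illustration, a minimal definition might look like the dictionary below. The field names (`activities`, `connections`, and so on) are assumptions made for this sketch, not MagOneAI's published schema:

```python
import json

# Hypothetical Use Case definition -- field names are illustrative only.
use_case = {
    "name": "summarize_ticket",
    "activities": [
        {"id": "fetch", "type": "tool", "config": {"tool": "get_ticket"}},
        {"id": "summarize", "type": "agent",
         "input": {"ticket": "{{fetch.output.body}}"}},
    ],
    "connections": [{"from": "fetch", "to": "summarize"}],
}

# Because the definition is plain JSON, it round-trips losslessly --
# which is what makes Git diffing, export, and programmatic
# generation practical.
text = json.dumps(use_case, indent=2)
assert json.loads(text) == use_case
```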

Execution lifecycle

When you trigger a workflow, MagOneAI orchestrates a series of steps to execute your Use Case reliably and durably.
1. Trigger fires

The workflow starts from a trigger — an API call, schedule, manual execution, or chat message. The trigger provides initial input data.

2. Temporal workflow starts

MagOneAI creates a Temporal workflow execution. This ensures durable execution with automatic recovery and retry capabilities.

3. Activities execute in sequence

Each activity in your workflow runs in order, respecting parallel branches and conditional logic. Activities execute one at a time unless you use Parallel nodes.

4. Activity processing

Each activity receives input from the variable store, performs its work (agent reasoning, tool execution, etc.), and produces output.

5. Variable store updates

After each activity completes, its output is stored in the variable store. Subsequent activities can access this data through variable references.

6. Workflow completes

When all activities finish, the workflow completes. The final output is returned to the caller and stored in execution history.
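The lifecycle above can be sketched as a toy sequential executor. The activity and store shapes here are assumptions for illustration, not MagOneAI internals:

```python
def run_workflow(trigger_input: dict, activities: list) -> dict:
    # Steps 1-2: trigger input seeds the execution-scoped variable store.
    store = {"trigger": trigger_input}
    # Steps 3-5: run each activity in order, persisting its output
    # to the variable store so downstream activities can read it.
    for name, fn in activities:
        store[name] = fn(store)
    # Step 6: the final output is assembled from the variable store.
    return store[activities[-1][0]]

result = run_workflow(
    {"text": "hello"},
    [
        ("uppercase", lambda s: {"output": s["trigger"]["text"].upper()}),
        ("wrap", lambda s: {"output": f"<{s['uppercase']['output']}>"}),
    ],
)
print(result)  # {'output': '<HELLO>'}
```

The real engine adds durability, parallel branches, and conditions on top of this basic read-store loop.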

Data flow through the workflow

Data flows through your workflow via the variable store — a key-value store scoped to each execution:
  1. Trigger input enters the variable store
  2. Activity outputs are written to the variable store
  3. Downstream activities read from the variable store using variable references like {{previous_activity.output.field}}
  4. Final output is assembled from variable store contents
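A reference like `{{previous_activity.output.field}}` can be resolved by walking the dotted path through the store. This regex-based sketch is illustrative, not MagOneAI's actual templating engine:

```python
import re

def resolve(template: str, store: dict) -> str:
    # Replace each {{a.b.c}} with the value found by walking the
    # variable store along the dotted path.
    def lookup(match: re.Match) -> str:
        value = store
        for key in match.group(1).split("."):
            value = value[key]
        return str(value)
    return re.sub(r"\{\{\s*([\w.]+)\s*\}\}", lookup, template)

store = {"previous_activity": {"output": {"field": "42"}}}
print(resolve("answer: {{previous_activity.output.field}}", store))
# answer: 42
```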
Learn more about the variable store in the Memory and variable store guide.

Temporal durable execution

MagOneAI leverages Temporal to provide robust, reliable workflow execution with enterprise-grade durability guarantees.

What Temporal provides

  • Crash recovery — If a server crashes mid-execution, the workflow automatically resumes from the last checkpoint
  • Automatic retries — Failed activities are automatically retried according to your retry policy
  • Long-running execution — Workflows can run for minutes, hours, or even days without losing state
  • Full observability — Complete execution history with activity-level logs, timing, and state transitions
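Automatic retries can be pictured as a policy loop around each activity. This sketch uses exponential backoff with hypothetical parameters; it is not Temporal's actual retry implementation:

```python
import time

def run_with_retries(activity, max_attempts=3, initial_backoff=0.01, factor=2.0):
    # Retry a failed activity, increasing the wait between attempts,
    # until it succeeds or the policy's attempt budget is exhausted.
    backoff = initial_backoff
    for attempt in range(1, max_attempts + 1):
        try:
            return activity()
        except Exception:
            if attempt == max_attempts:
                raise
            time.sleep(backoff)
            backoff *= factor

calls = {"n": 0}

def flaky():
    # Fails twice, then succeeds -- a stand-in for a transient error.
    calls["n"] += 1
    if calls["n"] < 3:
        raise RuntimeError("transient failure")
    return "ok"

print(run_with_retries(flaky))  # ok
```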

How checkpointing works

Every workflow step is checkpointed to durable storage:
  • Before each activity — Temporal records the execution state
  • After each activity — Results are persisted before moving to the next step
  • On failure — The workflow can resume from the last successful checkpoint
  • Across restarts — Server restarts don’t interrupt execution
This means your workflows are resilient to infrastructure failures, deployment updates, and transient errors. Execution state is never lost.
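Checkpoint-and-resume can be illustrated with a toy journal of completed steps. This is a deliberate simplification of Temporal's event-history replay, not its actual mechanism:

```python
def run_from_checkpoint(steps, journal):
    # `journal` maps step names to persisted results. Completed steps
    # are skipped on resume; only unfinished work re-executes.
    for name, fn in steps:
        if name in journal:
            continue  # already checkpointed -- skip on resume
        journal[name] = fn()  # persist the result before moving on
    return journal

executed = []
steps = [
    ("fetch", lambda: executed.append("fetch") or "data"),
    ("summarize", lambda: executed.append("summarize") or "summary"),
]

# Simulate a crash after "fetch": its result survives in the journal,
# so resuming only runs the remaining step.
journal = {"fetch": "data"}
run_from_checkpoint(steps, journal)
print(executed)  # ['summarize'] -- "fetch" was not re-run
```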

Benefits for AI workflows

Durable execution is especially valuable for AI workflows:
  • Long LLM calls — Agents can take minutes to reason and execute tools
  • Human-in-the-loop — Workflows can wait hours or days for human approval
  • Batch processing — Process thousands of items without worrying about failures
  • Cost optimization — No compute resources consumed while waiting for external events
Temporal’s durable execution means you can design workflows with confidence. Don’t worry about crashes, timeouts, or lost state — focus on the logic and let MagOneAI handle the reliability.

Next steps

  • Agent node — Learn how to use AI agents as workflow activities
  • Parallel execution — Run multiple branches simultaneously for complex orchestration
  • Memory system — Understand how data flows through your workflows
  • Triggers and execution — Start and monitor your workflow executions