Test Case Generation Flow

Availability:

TL;DR

Use testgen-flow when a Jira ticket, epic, or story needs grounded requirements and structured QA scenarios before implementation or automation starts. The workflow runs sequentially: load project config, collect Jira and Confluence evidence, analyze contradictions and gaps, stop for human answers, generate a requirements document, generate TestRail-ready test cases, and optionally export them to TestRail.

Expect one phase at a time, a state file updated after each phase, and explicit confirmation before moving forward. For first-time use in a repo, Phase 0 creates testgen-project-config.md, asks the user to confirm or correct the default retrieval scheme, and saves that answer for future runs. The hard gate is Phase 3: the workflow does not continue to requirements generation until the user answers the generated clarification questions and then explicitly approves continuation.
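The sequencing described above can be sketched as a simple driver loop. This is an illustration only: the function names, phase labels, and state representation are hypothetical, and the real workflow is driven by Rosetta instructions rather than code.

```python
# Hypothetical sketch of the phase sequencing: one phase at a time,
# state recorded after each phase, explicit confirmation to proceed,
# and a hard human-in-the-loop gate before Phase 4.

PHASES = [
    "0 project config loading",
    "1 data collection",
    "2 gap and contradiction analysis",
    "3 question generation and user input",
    "4 requirements document generation",
    "5 test case generation",
    "6 test case export",
]

state_log = []  # stands in for testgen-state.md updates


def run_workflow(confirm, answers_approved):
    """Run phases in order with a hard gate before Phase 4.

    `confirm(phase)` models the per-phase user confirmation;
    `answers_approved()` models validated answers plus the user's
    explicit approval to continue past Phase 3.
    """
    completed = []
    for phase in PHASES:
        if phase.startswith("4") and not answers_approved():
            break  # hard gate: no requirements doc without answered questions
        completed.append(phase)             # the phase's real work is elided
        state_log.append(f"done: {phase}")  # state file updated after each phase
        if not confirm(phase):
            break  # explicit confirmation required before moving forward
    return completed
```

With confirmation granted but answers still missing, the loop stops after Phase 3; once answers are approved, it runs through export.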

When To Use This Workflow

When Not To Use This Workflow

Before You Start

Prepare the minimum inputs that materially affect output quality:

Workflow-specific preparation that improves results:

For shared Rosetta setup and project-context customization, see Usage Guide.

How To Start

Example prompts that match the workflow:

Generate test cases for PROJ-123
Analyze requirements for PROJ-123 with Confluence pages:
- https://confluence.company.com/display/PROJ/Job+Post
- https://confluence.company.com/pages/viewpage.action?pageId=123456
Analyze requirements for https://jira.company.com/browse/PROJ-123 and export the final test cases to TestRail
Generate test scenarios for PROJ-123. Use these Confluence pages and our existing test generation config.

How Rosetta Shapes This Workflow

Rosetta does not generate all deliverables in one pass. It loads the workflow, executes one phase at a time, updates a state file after each phase, and expects explicit confirmation before moving forward. That changes the UX in four important ways:

Rosetta itself provides instructions and routing. The coding agent performs the retrieval, analysis, document generation, and optional export work.

Workflow At A Glance

| Phase | What you provide | What agents do | What you get | Review gate |
| --- | --- | --- | --- | --- |
| 0. Project Config Loading | Jira ticket key or URL; project retrieval expectations if no config exists | Parse ticket, create ticket workspace, load or create project config, ask the first-time retrieval setup question when the config is missing, record initial state | testgen-state.md, initial-data.md, project-level testgen-project-config.md if missing | Confirm setup before Phase 1 |
| 1. Data Collection | Jira input, optional Confluence URLs, extra page IDs if search fails | Retrieve Jira fields, comments, Confluence pages, and child pages; capture raw evidence | raw-data.md | Confirm collected sources before Phase 2 |
| 2. Gap and Contradiction Analysis | No new input unless source retrieval was incomplete | Identify contradictions, gaps, ambiguities, cross-source conflicts, and risk | analysis.md | Confirm findings before Phase 3 |
| 3. Question Generation and User Input | Answers to clarification questions and explicit approval to continue | Generate prioritized questions, wait, validate answers, structure them | questions.md, answers.md | Hard HITL gate: must answer, then explicitly approve Phase 4 |
| 4. Requirements Document Generation | Validated answers and any final clarifications | Synthesize evidence into user stories, FRs, NFRs, constraints, dependencies, assumptions, and traceability | requirements.md | Review requirements before Phase 5 |
| 5. Test Case Generation | Approved or reviewed requirements direction | Generate prioritized TestRail-ready test cases, merge redundant scenarios, update traceability | test-scenarios.md, updated traceability in requirements.md | Review scenarios before Phase 6 or before closing |
| 6. Test Case Export | TestRail access plus target section_id; project and suite details when your setup cannot detect them from the current ticket and user profile | Verify connection, map cases, export to TestRail, record export IDs | Updated test-scenarios.md, updated testgen-state.md | Confirm export destination and review export results |
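Taken together, the artifact columns above imply a per-ticket workspace that accumulates one file per phase. The tree below is an assumption drawn from the artifact names in this document (with PROJ-123 as a placeholder ticket key), not a guaranteed layout:

```
agents/testgen/PROJ-123/
├── testgen-state.md     # updated after every phase
├── initial-data.md      # Phase 0
├── raw-data.md          # Phase 1
├── analysis.md          # Phase 2
├── questions.md         # Phase 3
├── answers.md           # Phase 3
├── requirements.md      # Phase 4, traceability updated in Phase 5
└── test-scenarios.md    # Phase 5, TestRail IDs added in Phase 6
```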

Workflow Overview

flowchart TD
    A[Start with Jira ticket or URL] --> B[Phase 0 load or create project config]
    B --> C[Phase 1 collect Jira and Confluence evidence]
    C --> D[Phase 2 analyze contradictions gaps and ambiguities]
    D --> E[Phase 3 generate questions]
    E --> F{Answers validated and user explicitly approved?}
    F -- No --> E
    F -- Yes --> G[Phase 4 generate requirements.md]
    G --> H[Phase 5 generate test-scenarios.md]
    H --> I{Export to TestRail?}
    I -- No --> J[Finish with local artifacts]
    I -- Yes --> K[Phase 6 export to TestRail]
    K --> L[Finish with TestRail IDs and export summary]

Interaction Flow

sequenceDiagram
    participant U as User
    participant R as Rosetta instructions
    participant A as Coding agent
    participant X as Jira/Confluence/TestRail systems
    participant F as Workflow artifacts

    R->>A: Load testgen-flow and phase rules
    U->>A: Provide Jira ticket and optional Confluence/TestRail details
    A->>F: Create ticket workspace and state file
    A->>X: Retrieve Jira issue and Confluence pages
    A->>F: Write raw-data.md and analysis.md
    A->>F: Generate questions.md
    A-->>U: Stop for answers and explicit approval to continue
    U->>A: Fill questions.md or provide clarifications
    A->>F: Validate answers and write answers.md
    A-->>U: Wait for user to reply yes or approved
    A->>F: Generate requirements.md
    A-->>U: Request review before test generation
    U->>A: Approve or correct requirements direction
    A->>F: Generate test-scenarios.md and update traceability
    opt Phase 6 requested
        U->>A: Provide TestRail destination details
        A->>X: Export cases to TestRail
        A->>F: Update scenarios with TestRail IDs
    end
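The Phase 6 export step maps to TestRail's public API, which creates cases via `POST index.php?/api/v2/add_case/{section_id}`. The sketch below shows only that endpoint shape; the instance URL, credentials, and case fields are hypothetical, and the workflow's real export logic may batch cases, retry, or map additional fields.

```python
import base64
import json
from urllib.request import Request

TESTRAIL_URL = "https://testrail.example.com"  # hypothetical instance


def build_add_case_request(section_id, case, user, api_key):
    """Build one TestRail add_case request (TestRail API v2).

    `case` is a dict holding one generated scenario; only a minimal
    field mapping is shown here.
    """
    url = f"{TESTRAIL_URL}/index.php?/api/v2/add_case/{section_id}"
    payload = {
        "title": case["title"],
        "priority_id": case.get("priority_id", 2),  # hypothetical default
    }
    # TestRail authenticates with HTTP basic auth (email:API key).
    token = base64.b64encode(f"{user}:{api_key}".encode()).decode()
    return Request(
        url,
        data=json.dumps(payload).encode(),
        headers={
            "Content-Type": "application/json",
            "Authorization": f"Basic {token}",
        },
        method="POST",
    )
```

Sending the request (for example with `urllib.request.urlopen`) returns the created case; its `id` is the kind of value the workflow records back into test-scenarios.md as an export ID.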

Phases

Phase 0: Project Config Loading

Goal:

What you provide:

What the agent does:

What you get:

What to watch for:

Phase 1: Data Collection

Goal:

What you provide:

What the agent does:

What you get:

What to watch for:

Phase 2: Gap and Contradiction Analysis

Goal:

What you provide:

What the agent does:

What you get:

What to watch for:

Phase 3: Question Generation and User Input

Goal:

What you provide:

What the agent does:

What you get:

What to watch for:

Phase 4: Requirements Document Generation

Goal:

What you provide:

What the agent does:

What you get:

What to watch for:

Phase 5: Test Case Generation

Goal:

What you provide:

What the agent does:

What you get:

What to watch for:

Phase 6: Test Case Export

Goal:

What you provide:

What the agent does:

What you get:

What to watch for:

How To Review Results

Review duties by stage:

If review is skipped, the most common failure mode is that later artifacts look polished but encode the wrong behavior. This workflow depends on human correction at the clarification and review gates.

Workflow-Specific Customization

The highest-value customizations for testgen-flow are specific to evidence retrieval and test design quality:

Artifacts You Will Get

Common ticket-level artifacts under agents/testgen/{TICKET-KEY}/:

- testgen-state.md (phase progress, updated after each phase)
- initial-data.md (Phase 0)
- raw-data.md (Phase 1)
- analysis.md (Phase 2)
- questions.md and answers.md (Phase 3)
- requirements.md (Phase 4, traceability updated in Phase 5)
- test-scenarios.md (Phase 5, updated with TestRail IDs after Phase 6)

Common project-level artifact:

- testgen-project-config.md (created in Phase 0 if missing; stores the confirmed retrieval scheme for future runs)

Common Mistakes

Source Files

Authoritative workflow sources: