This document explains how to use `muster test` to generate API schemas and validate test scenarios against the current `muster serve` API.
The schema generation and validation functionality helps ensure test scenarios stay in sync with the actual muster serve API. When the API changes, you can regenerate the schema and validate scenarios to catch compatibility issues.
🎯 **Unified Functionality**: Both the CLI (`muster test --validate-scenarios`) and the MCP server (`mcp_muster-test_test_validate_scenario` with `schema_path`) provide identical validation functionality with the same detailed results and error reporting.
Generate a JSON schema from a running muster serve instance:
```bash
# Generate schema with default settings
muster test --generate-schema

# Generate with verbose output and custom file name
muster test --generate-schema --verbose --schema-output=api-schema.json

# Use different port range to avoid conflicts
muster test --generate-schema --base-port=19000
```

The generated schema contains:

- All 43+ `core_*` API tools from `muster serve`
- Arg schemas inferred from tool names and error analysis
- Proper JSON Schema format for validation tools
```json
{
  "$schema": "http://json-schema.org/draft-07/schema#",
  "title": "muster Core API Schema",
  "description": "Generated schema for muster core API tools",
  "properties": {
    "tools": {
      "properties": {
        "core_serviceclass_create": {
          "type": "object",
          "properties": {
            "name": { "type": "string", "description": "Name of the resource to create" },
            "type": { "type": "string", "description": "ServiceClass type" },
            "version": { "type": "string", "description": "ServiceClass version" },
            "serviceConfig": { "type": "object", "description": "ServiceClass configuration" }
          }
        }
      }
    }
  },
  "generated_at": "2025-06-22T15:55:42+02:00",
  "version": "1.0.0"
}
```

Validate existing test scenarios against the generated schema using the CLI or the MCP server:
```bash
# Validate scenarios with default schema
muster test --validate-scenarios

# Use custom schema file and show verbose output
muster test --validate-scenarios --schema-input=api-schema.json --verbose

# Validate scenarios from custom directory
muster test --validate-scenarios --config=path/to/scenarios
```

```bash
# Start MCP server
muster test --mcp-server

# Then call the validation tool:
# mcp_muster-test_test_validate_scenario with args:
# - scenario_path: "/path/to/scenarios" (required)
# - schema_path: "schema.json" (optional, enables API validation)
# - category: "behavioral" (optional)
# - concept: "serviceclass" (optional)
```

Note: Both methods provide identical validation results and error reporting.
The validator provides detailed reports:
```text
📊 Validation Results
───────────────────
Total scenarios: 207
Valid scenarios: 15
Invalid scenarios: 192
Total errors: 354

Error Summary:
  unknown_tool: 29
  unexpected_argument: 325

Detailed Results:
❌ serviceclass-create
   unexpected_argument: Step create-test-serviceclass: Argument 'description' not expected for tool 'core_serviceclass_create'
   💡 Check available tools in the schema
```
| Error Type | Description | Action Required |
|---|---|---|
| `unknown_tool` | Tool name not found in API schema or invalid prefix | Check if the tool name changed, was removed, or has an invalid prefix |
| `unexpected_argument` | Argument not defined in the tool schema | Remove the argument or check if the arg name changed |
| `missing_required_argument` | Required arg not provided | Add the missing required arg |
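The three error types can be illustrated with a minimal sketch that checks one scenario step against the generated schema. This is not muster's actual implementation; the function name is illustrative, and it assumes the schema may carry a JSON Schema `required` list alongside `properties`.

```python
# Illustrative sketch of the three validation checks, assuming the
# generated schema layout shown above. Not muster's actual code.

def validate_step(tool: str, args: dict, schema: dict) -> list[str]:
    """Return validation errors for a single scenario step."""
    tools = schema["properties"]["tools"]["properties"]
    errors = []
    if tool not in tools:
        errors.append(f"unknown_tool: {tool}")
        return errors
    tool_schema = tools[tool]
    allowed = tool_schema.get("properties", {})
    for arg in args:
        if arg not in allowed:
            errors.append(f"unexpected_argument: {arg}")
    for required in tool_schema.get("required", []):
        if required not in args:
            errors.append(f"missing_required_argument: {required}")
    return errors

schema = {
    "properties": {"tools": {"properties": {
        "core_serviceclass_create": {
            "type": "object",
            "required": ["name"],
            "properties": {"name": {"type": "string"}, "type": {"type": "string"}},
        }
    }}}
}

print(validate_step("core_serviceclass_create", {"name": "x", "description": "y"}, schema))
# -> ['unexpected_argument: description']
```

This mirrors the report above: an argument like `description` that the schema does not declare for `core_serviceclass_create` is flagged as `unexpected_argument`.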
The validation system handles different tool prefixes according to their purpose:

- **`core_*` tools** - Core muster API tools
  - ✅ Validated against API schema: args and tool existence are checked
  - ❌ Fails if: the tool doesn't exist in the current API or has invalid args
  - 📝 Example: `core_serviceclass_create`, `core_service_start`
- **`x_*` tools** - Mock MCP server tools
  - ✅ Always valid: part of test scenario setup (mock servers)
  - ⚠️ Not validated: args can't be verified (scenario-specific)
  - 📝 Example: `x_kubernetes-mock_k8s_pod_list`, `x_storage-mock_create_volume`
- **`workflow_*` tools** - Workflow execution tools
  - ✅ Always valid: workflow execution calls
  - ⚠️ Not validated: args depend on the workflow definition
  - 📝 Example: `workflow_deploy-app`, `workflow_setup-environment`
- **All other prefixes** - Invalid tools
  - ❌ Always fails: unknown tool type
  - 🔧 Fix: use a proper prefix (`core_`, `x_`, or `workflow_`)
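The prefix rules above boil down to a simple classification. The sketch below is illustrative only; the function name and return values are assumptions, not muster's actual code.

```python
# Illustrative sketch of the tool-prefix rules. Names and return values
# are assumptions for explanation, not muster's implementation.

def classify_tool(tool: str) -> str:
    """Map a scenario tool name to how the validator treats it."""
    if tool.startswith("core_"):
        return "validated"   # checked against the API schema
    if tool.startswith("x_"):
        return "accepted"    # mock tool; args are not verified
    if tool.startswith("workflow_"):
        return "accepted"    # workflow call; args depend on the definition
    return "invalid"         # any other prefix always fails

for name in ["core_serviceclass_create", "x_kubernetes-mock_k8s_pod_list",
             "workflow_deploy-app", "custom_my_tool"]:
    print(name, "->", classify_tool(name))
```

Only the `core_*` branch triggers schema lookups; the other accepted prefixes short-circuit to valid, which is why mock and workflow args are never arg-checked.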
```yaml
steps:
  # ✅ VALID: Core tool - will be validated against API schema
  - id: "create-serviceclass"
    tool: "core_serviceclass_create"
    args:
      name: "my-service"
      type: "web"

  # ✅ VALID: Mock tool - accepted but not arg-validated
  - id: "setup-mock"
    tool: "x_kubernetes-mock_create_pod"
    args:
      namespace: "test"

  # ✅ VALID: Workflow execution - accepted but not arg-validated
  - id: "run-workflow"
    tool: "workflow_deploy-application"
    args:
      environment: "staging"

  # ❌ INVALID: Unknown prefix
  - id: "bad-tool"
    tool: "custom_my_tool"  # Will fail validation
    args: {}
```

```bash
# 1. Develop new API features

# 2. Generate updated schema
muster test --generate-schema --schema-output=schema-v2.json

# 3. Validate existing scenarios
muster test --validate-scenarios --schema-input=schema-v2.json

# 4. Fix any validation errors in scenarios

# 5. Commit both schema and updated scenarios
```

```bash
# In CI pipeline after API changes
muster test --generate-schema --schema-output=current-schema.json
muster test --validate-scenarios --schema-input=current-schema.json

# Fail build if validation errors exist
```

When the API changes:
- Generate a new schema: `muster test --generate-schema`
- Validate scenarios: `muster test --validate-scenarios`
- Update scenario files to fix validation errors
- Update documentation to reflect API changes
- Commit the updated schema for future validation
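For the CI pipeline above, the "fail build" step could be a small gate script that parses the validator's summary. This is a sketch assuming the report format shown earlier (`Total errors: N`); pipe the validator output into it and exit non-zero when errors remain.

```python
import re

# Hypothetical CI gate helper: extract the error count from the
# validator's summary report (format as shown earlier in this document).

def error_count(report: str) -> int:
    """Extract 'Total errors: N' from a validation report."""
    match = re.search(r"Total errors:\s*(\d+)", report)
    if match is None:
        raise ValueError("no 'Total errors' line found in report")
    return int(match.group(1))

sample = "Valid scenarios: 15\nInvalid scenarios: 192\nTotal errors: 354"
print(error_count(sample))  # -> 354
```

In CI, something like `muster test --validate-scenarios | python ci_gate.py` (script name illustrative) can then call `sys.exit(1)` whenever the count is positive.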
```bash
# After adding new core tools or changing args
muster test --generate-schema --verbose

# Compare with previous schema
diff schema.json schema-previous.json

# Validate all scenarios against new schema
muster test --validate-scenarios --verbose
```

- Unknown tools: Check if the tool was renamed or moved
- Missing arguments: Add required args from the schema
- Extra arguments: Remove deprecated or renamed args
- Mock tools: Ensure mock configurations match the expected tools
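Beyond a line-based `diff`, a short script can compare two generated schemas tool by tool. This sketch assumes the schema layout shown earlier (`properties` → `tools` → `properties`); the helper names are illustrative.

```python
# Illustrative helper: report which core tools were added or removed
# between two generated schemas. Assumes the layout shown earlier.

def tool_names(schema: dict) -> set[str]:
    return set(schema["properties"]["tools"]["properties"])

def diff_schemas(old: dict, new: dict) -> tuple[set[str], set[str]]:
    """Return (added, removed) tool names between two schemas."""
    old_tools, new_tools = tool_names(old), tool_names(new)
    return new_tools - old_tools, old_tools - new_tools

old = {"properties": {"tools": {"properties": {"core_service_start": {}}}}}
new = {"properties": {"tools": {"properties": {"core_service_start": {},
                                               "core_serviceclass_create": {}}}}}
added, removed = diff_schemas(old, new)
print("added:", sorted(added))
print("removed:", sorted(removed))
```

A removed or renamed tool shows up in `removed` and maps directly to the `unknown_tool` errors you will see when validating old scenarios.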
The generated schema can be used with external JSON schema validators:
```bash
# Use with the jsonschema CLI tool
pip install jsonschema
jsonschema -i scenario.json schema.json
```

Use the schema to generate new test scenarios:
```python
import json

# Load schema
with open('schema.json') as f:
    schema = json.load(f)

# Generate test cases for each tool
for tool_name, tool_schema in schema['properties']['tools']['properties'].items():
    print(f"Tool: {tool_name}")
    print(f"Args: {list(tool_schema.get('properties', {}).keys())}")
```

- Port conflicts: Use `--base-port` with a different range
- Instance startup timeout: Increase the `--timeout` duration
- Connection refused: Check if other muster instances are running
- Schema file not found: Check the `--schema-input` path
- Invalid JSON: Regenerate the schema file
- Too many errors: Use `--verbose` to see detailed error information
```bash
# 1. Generate current API schema
muster test --generate-schema --verbose

# 2. Validate all scenarios (CLI method)
muster test --validate-scenarios --verbose

# 3. Fix identified issues in scenario files

# 4. Re-validate to confirm fixes
muster test --validate-scenarios

# 5. Generate tests for specific concept
muster test --concept=serviceclass --verbose

# 6. Update schema after API changes
muster test --generate-schema --schema-output=schema-v$(date +%Y%m%d).json
```

```bash
# Start MCP server for AI-powered validation
muster test --mcp-server

# Use MCP tools for validation:
# - mcp_muster-test_test_validate_scenario: Validate YAML structure or against API schema
# - mcp_muster-test_test_run_scenarios: Execute test scenarios
# - mcp_muster-test_test_list_scenarios: Discover available scenarios
```

This workflow ensures your test scenarios stay synchronized with the actual `muster serve` API, catching breaking changes early and maintaining test reliability. Both the CLI and the MCP server provide identical functionality for maximum flexibility.