Automated Test Generation

Automatically generate tests for your API and SDKs. Speakeasy can generate tests for all of your operations, including any new operations added in the future.

These tests use examples from your OpenAPI document when available, or autogenerate examples based on the name, type, and format of your schema fields.

Multiple tests per operation can be configured using named examples defined for your parameters, request bodies, and responses.

Tests are generated into a .speakeasy/tests.arazzo.yaml file in your SDK repository, which can then be modified to customize the autogenerated tests if desired.
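
Generated tests follow the Arazzo Specification. As a rough sketch for a hypothetical getPet operation (the exact structure Speakeasy emits may differ), an entry in this file might look like:

tests.arazzo.yaml
arazzo: 1.0.0
info:
  title: Generated tests
  version: 0.0.1
sourceDescriptions:
  - name: api
    url: ./openapi.yaml
    type: openapi
workflows:
  - workflowId: getPet            # one workflow per generated test
    steps:
      - stepId: test
        operationId: getPet       # the operation under test
        successCriteria:
          - condition: $statusCode == 200   # assert the expected status code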

Prerequisites

The following are requirements for generating tests:

Enabling Test Generation

Enable test generation in your SDK by modifying the SDK generation configuration (typically .speakeasy/gen.yaml).

Enable test generation by adding the following options to the generation section of the configuration:


generateNewTests: Enables automated generation of tests for newly found operations. When enabled for the first time, this generates tests for all operations in your OpenAPI document. Going forward, tests are only generated for operations not already present in your .speakeasy/tests.arazzo.yaml file.


mockServer.disabled: Explicitly enables mock server generation when set to false. During the early access period, mock server generation won't be enabled by default.


gen.yaml
configVersion: 2.0.0
generation:
  # ... other existing configuration ...
  tests:
    generateNewTests: true
  mockServer:
    disabled: false

After enabling test generation, you can disable tests for a specific operation by explicitly setting x-speakeasy-test: false:

paths:
  /example1:
    get:
      # Tests are generated for this operation because it is not explicitly disabled.
      # ... operation configuration ...
  /example2:
    get:
      # Tests are not generated for this operation.
      # ... other operation configuration ...
      x-speakeasy-test: false

Generated Test Location

Generated test files are written in language-specific locations, relative to the root of the SDK:

Language   | Location
---------- | ---------------
Go         | tests/
Python     | tests/
TypeScript | src/__tests__

If the mock server is also generated, its output will be in a mockserver directory under these locations.

Running Tests

Tests can be run via any of the following options, depending on your use case:

For speakeasy run support, modify the Speakeasy workflow configuration (.speakeasy/workflow.yaml).

Enable running tests during Speakeasy workflows by adding a testing section to one or more targets in the targets section of the configuration.


testing.enabled: Enables testing for the target.


workflow.yaml
targets:
  example-target:
    # ... other existing configuration ...
    testing:
      enabled: true
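
Because generated tests are ordinary test files for each target language, they can also typically be run directly with the language's native tooling (for example, go test ./tests/... for a Go SDK); the equivalent command for Python and TypeScript depends on the test framework the generated SDK uses.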

Data Handling

The definition of each operation determines what data is used in generated tests. In addition to the data type system shaping the data, the OpenAPI Specification supports examples. Test generation automatically uses defined examples when available. In the absence of defined examples, test generation will attempt to use a realistic example based on the type, format (if set), and property name (if applicable).
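
For instance, given a schema like the following (the field names are illustrative), test generation can infer an email-shaped value for email, a timestamp for created_at, and a person's name for first_name; the exact values produced are determined by the generator:

properties:
  email:
    type: string
    format: email        # format drives an email-shaped example
  created_at:
    type: string
    format: date-time    # format drives a timestamp example
  first_name:
    type: string         # the property name suggests a person's name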

Example Property

By default, a single test is created based on any example properties found across the operation's defined parameters, requestBody, and responses.

In this example, a single test is created for the updatePet operation with id, name, and photoUrls data:

This test is created for the updatePet operation.


The operation uses the Pet shared component for both the request body and response.


The Pet shared component is an object type with required name and photoUrls properties.


While not required, the Pet object id property has an example property, which will be automatically included in the test.


The required Pet object name property has an example property, which will be included in the test.


The required Pet object photoUrls property does not include an example property; however, an example value is automatically created for it because it is required.


openapi.yaml
paths:
  "/pet":
    put:
      tags:
        - pet
      summary: Update an existing pet
      description: Update an existing pet by Id
      operationId: updatePet
      requestBody:
        description: Update an existent pet in the store
        content:
          application/json:
            schema:
              "$ref": "#/components/schemas/Pet"
        required: true
      responses:
        '200':
          description: Successful operation
          content:
            application/json:
              schema:
                "$ref": "#/components/schemas/Pet"
components:
  schemas:
    Pet:
      required:
        - name
        - photoUrls
      type: object
      properties:
        id:
          type: integer
          format: int64
          example: 10
        name:
          type: string
          example: doggie
        category:
          "$ref": "#/components/schemas/Category"
        photoUrls:
          type: array
          items:
            type: string
        tags:
          type: array
          items:
            "$ref": "#/components/schemas/Tag"
        status:
          type: string
          description: pet status in the store
          enum:
            - available
            - pending
            - sold

This definition creates a test with the following Pet object request body and response data:

id: 10
name: doggie
photoUrls:
  - <value>
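
The <value> placeholder stands in for an autogenerated value. To control it, add an example to the array's item schema, for instance:

photoUrls:
  type: array
  items:
    type: string
    example: https://www.example.com/photo.jpg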

Examples Property

Define multiple tests for an operation using the examples property, which in this context is a mapping of example name string keys to example values. To prevent missing or mismatched tests, ensure the same example name key is used across all relevant parameters, requestBody, and responses parts of the operation. If desired, define reusable examples under components, similar to schemas, as sketched below.
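
For instance, a reusable example can be defined once under components and referenced by the same name key from both the request body and the response (the names here are illustrative):

paths:
  "/pet":
    post:
      requestBody:
        content:
          application/json:
            examples:
              fido:
                "$ref": "#/components/examples/fido"
components:
  examples:
    fido:
      summary: fido request
      value:
        name: Fido
        photoUrls:
          - https://www.example.com/fido.jpg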

In this example, multiple tests (fido and rover) are created for the addPet operation:

These tests are created for the addPet operation.


This operation includes both request body and response examples.


An addPet operation fido test is created with example request body and response data.


An addPet operation rover test is created with example request body and response data.


openapi.yaml
paths:
  "/pet":
    post:
      tags:
        - pet
      summary: Add a new pet to the store
      description: Add a new pet to the store
      operationId: addPet
      requestBody:
        description: Create a new pet in the store
        content:
          application/json:
            schema:
              "$ref": "#/components/schemas/Pet"
            examples:
              fido:
                summary: fido request
                description: fido example requestBody for test generation
                value:
                  name: Fido
                  photoUrls:
                    - https://www.example.com/fido.jpg
                  status: available
              rover:
                summary: rover request
                description: rover example requestBody for test generation
                value:
                  name: Rover
                  photoUrls:
                    - https://www.example.com/rover1.jpg
                    - https://www.example.com/rover2.jpg
                  status: pending
        required: true
      responses:
        '200':
          description: Successful operation
          content:
            application/json:
              schema:
                "$ref": "#/components/schemas/Pet"
              examples:
                fido:
                  summary: fido response
                  description: fido example response for test generation
                  value:
                    id: 1
                    name: Fido
                    photoUrls:
                      - https://www.example.com/fido.jpg
                    status: available
                rover:
                  summary: rover response
                  description: rover example response for test generation
                  value:
                    id: 2
                    name: Rover
                    photoUrls:
                      - https://www.example.com/rover1.jpg
                      - https://www.example.com/rover2.jpg
                    status: pending

Ignoring Data

Data properties can be explicitly ignored in testing via the x-speakeasy-test-ignore annotation.

In this example, the other property will be omitted from test generation:

paths:
  /example:
    get:
      # ... other operation configuration ...
      responses:
        "200":
          description: OK
          content:
            application/json:
              schema:
                type: object
                properties:
                  data:
                    type: string
                  other:
                    type: string
                    x-speakeasy-test-ignore: true

Next Steps