Automated Test Generation
Automatically generate tests for your API and SDKs. Speakeasy can generate tests for all your operations, including any new operations added in the future.
These tests use any examples available in your OpenAPI document, or autogenerate examples based on the field name, type, and format of your schemas.
Multiple tests per operation can be configured using the named examples defined for your parameters, request bodies, and responses.
Tests are generated to a `.speakeasy/tests.arazzo.yaml` file in your SDK repo, which can then be modified to customize the autogenerated tests if desired.
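As a rough illustration, tests in this file follow the Arazzo workflow format, with each test expressed as a workflow step that calls an operation and asserts on the response. The sketch below is hypothetical, not actual generated output; the IDs, source URL, and payload values are illustrative only:

```yaml
# Hypothetical sketch of a test entry in the Arazzo workflow format;
# the IDs, source URL, and payload values are illustrative only.
arazzo: 1.0.0
info:
  title: Test Suite
  version: 0.0.1
sourceDescriptions:
  - name: api
    type: openapi
    url: ../openapi.yaml
workflows:
  - workflowId: updatePet
    steps:
      - stepId: test
        operationId: updatePet
        requestBody:
          contentType: application/json
          payload:
            id: 10
            name: doggie
        successCriteria:
          - condition: $statusCode == 200
```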
Prerequisites
The following are requirements for generating tests:
- Testing feature prerequisites are met.
Enabling Test Generation
Enable test generation in your SDK by modifying the SDK generation configuration (typically `.speakeasy/gen.yaml`), adding the following to the `generation` section of the configuration:
- `generateNewTests`: Enable automated generation of tests for newly found operations. When enabled for the first time, this generates tests for all operations in your OpenAPI document. Going forward, it only generates tests for operations not already found in your `.speakeasy/tests.arazzo.yaml` file.
- `mockServer.disabled`: Explicitly enable mock server generation. During the early access period, mock server generation won't be enabled by default.
```yaml
configVersion: 2.0.0
generation:
  # ... other existing configuration ...
  tests:
    generateNewTests: true
  mockServer:
    disabled: false
```
After enabling test generation, if you wish to disable generation of tests for a specific operation, explicitly set `x-speakeasy-test: false`:
```yaml
paths:
  /example1:
    get:
      # This operation, without being explicitly disabled, will generate testing.
      # ... operation configuration ...
  /example2:
    get:
      # This operation will not generate testing.
      # ... other operation configuration ...
      x-speakeasy-test: false
```
Generated Test Location
Generated test files are written in language-specific locations, relative to the root of the SDK:
| Language | Location |
| --- | --- |
| Go | `tests/` |
| Python | `tests/` |
| TypeScript | `src/__tests__` |
If the mock server is also generated, its output is placed in a `mockserver` directory under these locations.
Running Tests
Run tests via any of these options, depending on your use case:
- Directly, via the `speakeasy test` CLI command.
- In GitHub Actions workflows.
- As part of the `speakeasy run` CLI command and existing GitHub Actions generation workflow, with additional Speakeasy workflow configuration.
For `speakeasy run` support, modify the Speakeasy workflow configuration (`.speakeasy/workflow.yaml`). Enable running tests during Speakeasy workflows by adding a `testing` section, with testing enabled, to one or more of the targets in the `targets` section of the configuration:
```yaml
targets:
  example-target:
    # ... other existing configuration ...
    testing:
      enabled: true
```
Data Handling
The definition of each operation determines what data is used in generated tests. In addition to the data type system shaping data, the OpenAPI Specification supports examples. Test generation automatically uses defined examples when available. In the absence of defined examples, test generation attempts to create a realistic example based on the `type`, `format` (if set), and property name (if applicable).
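For instance, a schema like the following sketch (the schema and property names are illustrative, not from this page's examples) defines no `example` values, so its test data would be autogenerated from the `type`, `format`, and property names:

```yaml
# Illustrative schema with no example properties defined: test
# generation falls back to the type, format, and property name
# to produce realistic values.
ExampleUser:
  type: object
  required:
    - email
    - createdAt
  properties:
    email:
      type: string
      format: email      # format hints at a realistic email value
    createdAt:
      type: string
      format: date-time  # format hints at a realistic timestamp
```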
Example Property
By default, a single test is created based on any `example` properties found throughout an operation's defined `parameters`, `requestBody`, and `responses`.
In this example, a single test is created for the `updatePet` operation with `id`, `name`, and `photoUrls` data:
- This test is created for the `updatePet` operation.
- The operation uses the `Pet` shared component for both the request body and response.
- The `Pet` shared component is an object type with required `name` and `photoUrls` properties.
- While not required, the `Pet` object `id` property has an `example` property, which will be automatically included in the test.
- The required `Pet` object `name` property has an `example` property, which will be included in the test.
- The required `Pet` object `photoUrls` property does not include an `example` property; however, an example value will be automatically created for it since it is required.
```yaml
paths:
  "/pet":
    put:
      tags:
        - pet
      summary: Update an existing pet
      description: Update an existing pet by Id
      operationId: updatePet
      requestBody:
        description: Update an existent pet in the store
        content:
          application/json:
            schema:
              "$ref": "#/components/schemas/Pet"
        required: true
      responses:
        '200':
          description: Successful operation
          content:
            application/json:
              schema:
                "$ref": "#/components/schemas/Pet"
components:
  schemas:
    Pet:
      required:
        - name
        - photoUrls
      type: object
      properties:
        id:
          type: integer
          format: int64
          example: 10
        name:
          type: string
          example: doggie
        category:
          "$ref": "#/components/schemas/Category"
        photoUrls:
          type: array
          items:
            type: string
        tags:
          type: array
          items:
            "$ref": "#/components/schemas/Tag"
        status:
          type: string
          description: pet status in the store
          enum:
            - available
            - pending
            - sold
```
This definition creates a test with `Pet` object request body and response data:

```yaml
id: 10
name: doggie
photoUrls:
  - <value>
```
Examples Property
Define multiple tests for an operation using the `examples` property, which in this context is a mapping of example name string keys to example values. Prevent missing or mismatched test generation by ensuring the same example name key is used across all necessary `parameters`, `requestBody`, and `responses` parts of the operation. If desired, define reusable examples under `components`, similar to schemas.
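Sketching that reusable variant (the exact layout here is illustrative; the `fido` key and URL are carried over from this page's `addPet` example), a named example can live under `components.examples` and be referenced from the operation with `$ref`:

```yaml
# Illustrative sketch: a reusable named example defined under
# components and referenced from a request body by the same
# "fido" example name key.
components:
  examples:
    fido:
      summary: fido request
      value:
        name: Fido
        photoUrls:
          - https://www.example.com/fido.jpg
paths:
  "/pet":
    post:
      operationId: addPet
      requestBody:
        content:
          application/json:
            schema:
              "$ref": "#/components/schemas/Pet"
            examples:
              fido:
                "$ref": "#/components/examples/fido"
      responses:
        '200':
          description: Successful operation
```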
In this example, multiple tests (`fido` and `rover`) are created for the `addPet` operation:
- These tests are created for the `addPet` operation.
- The operation includes both request body and response examples.
- An `addPet` operation `fido` test is created with example request body and response data.
- An `addPet` operation `rover` test is created with example request body and response data.
```yaml
paths:
  "/pet":
    post:
      tags:
        - pet
      summary: Add a new pet to the store
      description: Add a new pet to the store
      operationId: addPet
      requestBody:
        description: Create a new pet in the store
        content:
          application/json:
            schema:
              "$ref": "#/components/schemas/Pet"
            examples:
              fido:
                summary: fido request
                description: fido example requestBody for test generation
                value:
                  name: Fido
                  photoUrls:
                    - https://www.example.com/fido.jpg
                  status: available
              rover:
                summary: rover request
                description: rover example requestBody for test generation
                value:
                  name: Rover
                  photoUrls:
                    - https://www.example.com/rover1.jpg
                    - https://www.example.com/rover2.jpg
                  status: pending
        required: true
      responses:
        '200':
          description: Successful operation
          content:
            application/json:
              schema:
                "$ref": "#/components/schemas/Pet"
              examples:
                fido:
                  summary: fido response
                  description: fido example response for test generation
                  value:
                    id: 1
                    name: Fido
                    photoUrls:
                      - https://www.example.com/fido.jpg
                    status: available
                rover:
                  summary: rover response
                  description: rover example response for test generation
                  value:
                    id: 2
                    name: Rover
                    photoUrls:
                      - https://www.example.com/rover1.jpg
                      - https://www.example.com/rover2.jpg
                    status: pending
```
Ignoring Data
Data properties can be explicitly ignored in testing via the `x-speakeasy-test-ignore` annotation.
In this example, the `other` property will be omitted from test generation:
```yaml
paths:
  /example:
    get:
      # ... other operation configuration ...
      responses:
        "200":
          description: OK
          content:
            application/json:
              schema:
                type: object
                properties:
                  data:
                    type: string
                  other:
                    type: string
                    x-speakeasy-test-ignore: true
```
Next Steps
- Configure custom contract and end-to-end tests for your SDK.
- Set up testing in GitHub Actions for your SDK.
- Explore advanced test configuration for your tests.