Automated API Testing with Postman

Reynaldo Rodríguez
March 7, 2024
Postman
API
Automation
Testing
QA

In the realm of software development, APIs stand as critical components, acting as the backbone of communication and data exchange between different software systems. As applications grow more complex and interconnected, ensuring that these APIs function correctly becomes not just beneficial but essential. This is where automated API testing comes into play, offering a scalable and efficient means to validate the functionality, performance, and security of APIs.

Postman, a powerful and versatile tool, has emerged as a frontrunner in this domain. Initially popular as a simple HTTP client for testing web services, Postman has evolved into a feature-rich platform supporting automated testing, enabling developers and testers to create, share, and execute test suites with ease and efficiency.

This article aims to provide a comprehensive guide on leveraging Postman for automated API testing. We will explore the basics of API testing, the unique features of Postman that make it suitable for automation, and a step-by-step guide on setting up and executing automated tests. Whether you are new to API testing or an experienced professional looking to enhance your testing strategies, this article promises to equip you with the knowledge and skills required to harness the full potential of Postman in your API testing endeavors.

Let’s start by downloading Postman; it is available in both web and desktop versions. Whichever option you choose, you should sign up, since an account is required to access some of the free features like data syncing, Collections, and Collection Runners. After signing in you will see something like this.

First we need a collection. A Postman Collection is a group of API endpoints or requests, along with each endpoint's authorization type, parameters, headers, request bodies, tests, and settings, grouped under the same collection ID.

A collection enables you to group requests with different method types (for example GET, POST, DELETE, and PUT) and organize them into folders or subfolders. You can share collections with team members, as well as import and export them into other Postman instances.

Postman Collections are based on the open source collection format that makes it possible to share and run collections. The collection format is:

  • Portable and provides a unique interface for organizing API requests and modeling API workflows.
  • Machine and human readable and can be used to generate client and server-side SDKs, documentation, and mock servers.

Since we don’t want to build an API from scratch for this exercise, we will import a collection from a public API; in this case we will use the Interpol Notices API. The OpenAPI metadata that needs to be imported into Postman can be found at https://interpol.api.bund.dev/openapi.yaml. If you want to use a different API, check whether it exposes its own metadata; otherwise you’ll end up creating each endpoint manually.

After a few seconds we will see the imported Collection in our sidebar, along with each endpoint it exposes.

We can also see that the collection already defines a Postman variable called baseUrl. It serves as the base for all of the endpoints, since it is already used in every endpoint’s URL, and it gives us the flexibility to change environments in the future.

Now let’s run the Get Red Notices endpoint so we can see its response: select the Get Red Notices request, disable all of the query parameters that are not set, and click Send.

We can see that it returns a successful response: a JSON object containing a few data points. The total property indicates the total number of records, the query property includes the pagination metadata, the _embedded property contains an array of notices, and the _links property links to other pages. If we test Get Yellow Notices, we can see that a successful request has the same response structure.
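Stripped down to just those top-level properties, the body has roughly this shape (the values here are placeholders, not real results from the API):

    {
        "total": 0,
        "query": { },
        "_embedded": { "notices": [ ] },
        "_links": { }
    }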

To keep this exercise short and avoid repetition, let’s get rid of all the other endpoints besides the Get Red Notices request. We do that by right-clicking each folder and deleting it.

We will end up with just the red folder and the Get Red Notices request. Now let’s dive into creating a test for this request. Here we will handle only the successful response flow; if we needed to cover other flows, we could copy the request and set different query parameters to get different responses.

Let’s open the request and move to the Tests tab. Before writing any test we need to define our testing strategy. For this example we will focus on performance and functional testing, specifically load, contract, end-to-end, and unit testing.

The test code is written in JavaScript, and the Postman sandbox has the Chai.js library built in, so we can use Chai’s behavior-driven development (BDD) syntax to create readable test assertions.

Let’s start by validating the successful response.
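A minimal test for this, using the pm API available in the sandbox, could look like the following (the test name is just illustrative):

    pm.test("Status code is 200", function () {
        // Assert that the request returned a successful (200 OK) response
        pm.response.to.have.status(200);
    });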

Let’s also validate the response time; we want to keep it low, below 1 second.
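A sketch of that assertion, added below the previous test block:

    pm.test("Response time is below 1 second", function () {
        // pm.response.responseTime is reported in milliseconds
        pm.expect(pm.response.responseTime).to.be.below(1000);
    });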

We can also validate that the response headers declare the expected content type, which should be JSON.
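For example, checking the Content-Type header (we check for inclusion rather than equality, since the value may carry a charset suffix):

    pm.test("Content-Type is application/json", function () {
        // The header value may be e.g. "application/json; charset=utf-8"
        pm.expect(pm.response.headers.get("Content-Type")).to.include("application/json");
    });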

To validate the structure of the resulting JSON object we can add a JSON Schema assertion. We can look at the OK example within the Get Red Notices endpoint to see the actual response schema, which can be used to build the validation.
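A simplified sketch of such a validation, covering only the top-level properties described earlier (the real schema taken from the OK example would be more detailed):

    const expectedSchema = {
        type: "object",
        required: ["total", "query", "_embedded", "_links"],
        properties: {
            total: { type: "integer" },
            query: { type: "object" },
            _embedded: {
                type: "object",
                properties: {
                    notices: { type: "array" }
                }
            },
            _links: { type: "object" }
        }
    };

    pm.test("Response matches the expected schema", function () {
        // Validates the parsed response body against the JSON Schema defined above
        pm.response.to.have.jsonSchema(expectedSchema);
    });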

After we are done setting up all the test cases in the Tests tab, the next time we send the request the tests will run, and we can see the results in the Test Results tab of the response.

If we want to run the tests for all the requests in the same collection, we can right-click the Collection and choose Run Collection. This opens a dedicated screen where we can see all the requests that will run and configure the run settings.

If we run the Collection manually we will see the following result.

There is another, more advanced way of running a collection: using data from a file. This gives us the flexibility to test multiple flows with the same request; we can take the query/body parameters from the file, along with the expected response, and use them in the script. Here’s how we can do it.

Let’s say we want to evaluate different flows, like searching by name or forename. We would need to come up with a CSV or JSON file that defines the search criteria and the expected result, for example:
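A hypothetical JSON data file for those two flows might look like this (the field names and values are illustrative, not taken from the API):

    [
        { "name": "GARCIA", "expected": "GARCIA" },
        { "forename": "DAVID", "expected": "DAVID" }
    ]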

Then we need to edit the Pre-request Script to append the new query parameters when we detect that the collection is running with a data file.
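A sketch of that Pre-request Script, assuming the data file fields shown above:

    // pm.iterationData is only populated when the run uses a data file,
    // so these params are appended only in that case.
    ["name", "forename"].forEach(function (field) {
        const value = pm.iterationData.get(field);
        if (value) {
            // Add or update the query parameter on the outgoing request
            pm.request.url.query.upsert({ key: field, value: value });
        }
    });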

And edit the Tests to add a new test whenever we detect that the Collection is running with a data file:
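A matching sketch for the Tests tab, assuming each returned notice exposes the name and forename fields we saw in the OK example:

    // Only run this extra assertion when the Collection is executed with a data file
    if (pm.iterationData.get("expected")) {
        pm.test("Search results match the expected value", function () {
            const notices = pm.response.json()._embedded.notices;
            pm.expect(notices, "at least one notice returned").to.not.be.empty;

            // The searched value should appear in the first notice's name or forename
            const first = notices[0];
            const haystack = (first.name || "") + " " + (first.forename || "");
            pm.expect(haystack).to.include(pm.iterationData.get("expected"));
        });
    }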

This way, the next time we manually run the Collection and specify a data file, the number of iterations will be set based on the number of items in the array we defined.

The results show that a new test has been added, which takes care of validating the expected results when each of the properties is set as a query parameter.

If we want to set up automation for this suite we have two options. First, we can incorporate it into our existing CI/CD pipeline by using the Postman CLI to run our synced collection.
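Assuming the Postman CLI is installed and a Postman API key is available in the environment, the pipeline step might look roughly like this (the collection ID below is a placeholder):

    # Authenticate against Postman's servers with the generated API key
    postman login --with-api-key "$POSTMAN_API_KEY"

    # Run the synced collection by its ID
    postman collection run "12345678-your-collection-id"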

You’ll have to generate an API key, which is used with the Postman CLI to authenticate against Postman’s servers and run the collection. You’ll end up seeing something like this:

Keep in mind that if you want to use a data file like we did in the desktop version, you’ll have to host it somewhere accessible to the machine running the Postman CLI. Once you do that, you should be able to append the file to the CLI command like this:
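For example (the hosting URL is a placeholder; -d / --iteration-data mirrors the data file option we used in the app):

    # Fetch the hosted data file, then pass it to the run as iteration data
    curl -o red-notices-data.json https://example.com/red-notices-data.json
    postman collection run "12345678-your-collection-id" -d red-notices-data.json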

Second, you can set up a Scheduled Run, which will be executed on Postman’s cloud servers at the given schedule.

Results can be seen later in the Postman GUI.

Additionally we could run a Performance test locally to simulate real-world traffic on our local machine and observe the results.

Here we can see the results of the performance test we ran. It seems the API implements some sort of rate limit, because after a certain point we started to get 403 Forbidden errors. If the API were under our control, we could send a specific request header with a secret value that the backend uses to bypass the rate limit when present, allowing us to accurately test performance without blocking ourselves.

You can check out this same collection in the following public workspace: https://www.postman.com/jodlanyer/workspace/automated-api-testing-tutorial

In conclusion, this article has provided a thorough exploration of automated API testing using Postman, demonstrating its capacity as a powerful tool for ensuring the reliability and performance of APIs in modern software development. By walking through the process of setting up and executing automated tests with Postman, from importing collections to writing and running test cases, we've seen how Postman's robust features and user-friendly interface make it an ideal choice for both newcomers and seasoned professionals in API testing. The practical examples, including testing the Interpol Notices API, illustrated the versatility of Postman in handling various testing scenarios, from performance and functional testing to schema validation.

As APIs continue to be integral to software systems, leveraging tools like Postman for automated testing will be crucial for maintaining high-quality, efficient, and secure applications. Whether integrated into CI/CD pipelines or utilized for scheduled runs, Postman offers a scalable solution to meet the growing demands of software development and testing, ensuring that APIs function as intended under diverse conditions and continue to support the seamless interaction between different systems and services.
