pytest plugin for automatic recording of http stubbed tests

pip install pytest-vts==0.4.8




Automatic recorder for http stubbed pytest(s) using responses library. VTS stands for Video Tests System and has been inspired by VHS.

  1. How to use it
    1. Simple example
    2. Customise vts fixture
      1. Record or playback?
      2. Cassette location and name
      3. Body strict comparison
      4. Custom HTTP trx wrappers
  2. How does it actually work?
  3. Why this?
  4. Why pytest plugin?
  5. Why responses?
  6. Future work

How to use it

  1. Add as dependency/Install via pip:
  • from PyPI (recommended): pytest-vts

    • pip install pytest-vts
    • echo 'pytest-vts' >> requirements.txt
  • from github: git+

    • pip install git+
    • echo 'git+' >> requirements.txt

Note: During installation pytest is automatically installed as well if missing.

  2. Once installed, the package provides a pytest fixture named vts which you can use in your tests.

Simple example, showing available assertions

Source Code

# content of
import requests

def list_repositories(user="bhodorog"):
    # url reconstructed here; the original example presumably hit the GitHub v3 API
    url = "https://api.github.com/users/{}/repos".format(user)
    headers = {"Accept": "application/vnd.github.v3+json"}
    resp = requests.get(url, headers=headers)
    return resp

# content of

def test_list_repositories(vts):
    assert vts.responses  # exposes underlying responses requests mock

    # assert against any information normally exposed by responses
    assert vts.responses.calls
    assert vts.responses.calls[0].request
    assert vts.responses.calls[0].request.url
    assert vts.responses.calls[0].response
    assert vts.responses.calls[0].response.headers
    # look at responses' documentation/code for more available info to assert against

    # you can assert against vts' recorded cassette as well;
    # since it's just the same information duplicated as json, asserting
    # against the exposed responses object might be better code style
    assert vts.cassette[0]["request"]
    assert vts.cassette[0]["request"]["url"]
    assert vts.cassette[0]["response"]
    assert vts.cassette[0]["response"]["headers"]
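The cassette itself is plain json; a single recorded track presumably looks roughly like the dict below (field names inferred from the assertions above, url and values purely illustrative):

```python
# a hypothetical cassette track, matching the keys asserted above
track = {
    "request": {
        "url": "https://example.org/repos",
        "headers": {"Accept": "application/json"},
    },
    "response": {
        "status": 200,
        "headers": {"Content-Type": "application/json"},
        "body": "[]",
    },
}

# the same shape the test asserts against via vts.cassette[0]
assert track["request"]["url"]
assert track["response"]["headers"]
```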

Command line usage

$ ls ./cassettes
ls: ./cassettes: No such file or directory
$ ls ./
# recording
$ py.test
# vts will use requests library to forward the request to
# and save the request-response pair into a cassette
$ ls ./cassettes

# playback-ing
$ py.test
# all http requests are handled by responses based on the existing
# cassette

Customise the vts fixture

Record or playback?

Out of the box pytest-vts switches itself into recording mode whenever the cassette file is not found. This can be overridden with the environment variable PYTEST_VTS_FORCE_RECORDING (e.g. PYTEST_VTS_FORCE_RECORDING=1 py.test), which allows you to re-record an existing cassette.
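The mode decision can be pictured with a small sketch (the helper name below is hypothetical; the real logic lives inside the vts fixture):

```python
import os

def should_record(cassette_exists):
    # hypothetical sketch of the decision described above: record when
    # forced via the environment, or when no cassette exists yet
    forced = "PYTEST_VTS_FORCE_RECORDING" in os.environ
    return forced or not cassette_exists
```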

Cassette location and name

If the automatically determined location and name for a cassette are not convenient, you can customize them using pytest.mark.parametrize's indirect mechanism.

import os

import pytest

# using strings
@pytest.mark.parametrize(
    "vts",
    [{"basedir": os.path.expanduser("~"),
      "cassette_name": "custom_name"}],
    indirect=["vts"])
def test_list_repositories(vts):
    assert vts.calls
    assert vts.cassette_name.endswith("custom_name")

# using a callable
def custom_name(pytest_req):
    """pytest_req is an instance of pytest's fixture request object."""
    # hypothetical reconstruction: derive the name from the test function
    return pytest_req.function.__name__ + "custom"

@pytest.mark.parametrize("vts", [{"cassette_name": custom_name}], indirect=["vts"])
def test_list_repositories(vts):
    assert vts.calls
    assert vts.cassette_name.endswith("custom")
  • non-injection mode. If the vts fixture handle is not needed inside the test, there is no need to declare it as an argument to the test function. Use pytest.mark.usefixtures instead. As a bonus, this allows turning on the vts fixture only once for a collection of test methods grouped inside a class.
import pytest

@pytest.mark.usefixtures("vts")
@pytest.mark.parametrize("vts", ["/store/cassette/here"], indirect=["vts"])
class TestMoreTests(object):
    def test_list_repositories_once(self):
        pass

    def test_list_repositories_twice(self):
        pass

Strict comparison for playback mode

Strict mode controls how precisely the current http request is compared against the recorded one. By default responses matches the current http request against the recorded requests using the request's url only. On top of that, pytest-vts can optionally tighten the comparison with:

  • request's body (defaults to False)
  • request's headers (defaults to False)
  • request's query string (defaults to False)
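The effect of these toggles can be sketched as a matching function (hypothetical names and logic; responses' real matching lives in its own code):

```python
def tracks_match(recorded, current,
                 strict_body=False, strict_headers=False, strict_qs=False):
    # hypothetical sketch: the url (without query string) always matters,
    # the other fields only when the corresponding strict flag is set
    checks = [recorded["url"].split("?")[0] == current["url"].split("?")[0]]
    if strict_qs:
        checks.append(recorded["url"] == current["url"])
    if strict_body:
        checks.append(recorded.get("body") == current.get("body"))
    if strict_headers:
        checks.append(recorded.get("headers") == current.get("headers"))
    return all(checks)
```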

Using the pytest.mark.parametrize mechanism shown above allows you to configure multiple tests at various levels (module, class, function), such as:

import pytest

@pytest.mark.parametrize("vts", [{"play_kwargs": {"strict_body": True}}], indirect=["vts"])
class TestMoreTests(object):
    def test_list_repositories_once(self, vts):
        pass

    def test_list_repositories_twice(self, vts):
        pass

However, if you want to toggle strict comparison within a single test, you can do so through the vts instance.

import pytest

class TestMoreTests(object):
    def test_list_repositories_once(self, vts):
        vts.strict_body = True
        # requests made here are matched including their bodies
        vts.strict_body = False
        # requests made here are matched on url only
        vts.strict_body = True

    def test_list_repositories_twice(self, vts):
        pass

Custom wrappers around HTTP transaction mocked by vts (via responses)

pytest-vts uses a fixture named vts_request_wrapper, which by default is a no-op (i.e. it behaves exactly as before this feature was added). You can override it with your own wrapper and modify the request/response as you see fit.

import json

import pytest

def change_response_wrapper(func):
    def _inner(prep_req, *args, **kwargs):
        prep_req.url = prep_req.url + '?added-by=vts-response-wrapper'
        status, r_headers, body = func(prep_req, *args, **kwargs)
        r_headers['X-Added-By'] = 'vts-response-wrapper'
        try:
            loaded_body = json.loads(body)
            loaded_body['added_by'] = 'vts-response-wrapper'
        except Exception:
            # non-json bodies are passed through unchanged
            return status, r_headers, body
        return status, r_headers, json.dumps(loaded_body)
    return _inner

@pytest.fixture
def vts_request_wrapper():
    return change_response_wrapper

def test_simple(vts):
    your_url = 'http://your.url'
    resp = requests.get(your_url)
    assert 'X-Added-By' in resp.headers
    assert 'added_by' in resp.json()
    vts_recorded_trx = [
        track for track in vts.cassette
        if your_url in track['request']['url']]
    assert '?added-by=vts-response-wrapper' in vts_recorded_trx[0]['request']['url']
    assert '?added-by=vts-response-wrapper' not in resp.request.url
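For reference, a pass-through wrapper (presumably equivalent to the default no-op vts_request_wrapper) just forwards the call untouched:

```python
def noop_wrapper(func):
    # a pass-through wrapper: the mocked HTTP transaction is unchanged
    def _inner(prep_req, *args, **kwargs):
        return func(prep_req, *args, **kwargs)
    return _inner
```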

How does it actually work?

The vts fixture exposes an instance of a vts.Recorder class which initializes its own copy of a responses.RequestsMock object. This allows vts to manage its own responses .start()/.stop()/.reset() cycles without interfering with the default responses.RequestsMock object exposed by responses through the responses.* interface.

This way you can continue using import responses; responses.start|add|add_callback|reset|stop in parallel with vts. However, if you plan to do so, remember there will be 2 instances trying to mock.patch() requests, so be careful to .stop() one before .start()-ing the other. Obviously the last .start()-ed one will be active. For more details on this issue read responses' source and the unittest.mock docs.
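The last-patcher-wins behaviour can be illustrated with plain unittest.mock (no responses involved; the class and names below are purely illustrative):

```python
from unittest import mock

class Client:
    def get(self, url):
        return "real"

client = Client()

# two independent patchers targeting the same attribute, mirroring
# two RequestsMock instances both trying to patch requests
p1 = mock.patch.object(Client, "get", return_value="mock-1")
p2 = mock.patch.object(Client, "get", return_value="mock-2")

p1.start()
p2.start()           # the patcher started last is the active one
assert client.get("x") == "mock-2"
p2.stop()            # stopping it restores the previously started patch
assert client.get("x") == "mock-1"
p1.stop()
assert client.get("x") == "real"
```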

Besides its own copy of responses.RequestsMock, vts is responsible for:

  • building an internal copy, as json, of most information exposed by responses. Similar to other recording libraries, pytest-vts refers to this as a cassette.

  • deciding the location of the cassette, based on the test module's location and the current test function/method name.

  • recording a new cassette, or playing an existing one.
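The naming scheme described above can be sketched as follows (hypothetical helper; the actual layout is decided by the fixture):

```python
import os

def cassette_path(test_module_path, test_name):
    # hypothetical sketch: cassettes live in a "cassettes" directory next
    # to the test module and are named after the test, as json
    basedir = os.path.join(os.path.dirname(test_module_path), "cassettes")
    return os.path.join(basedir, test_name + ".json")

cassette_path("tests/test_api.py", "test_list_repositories")
# → "tests/cassettes/test_list_repositories.json" on POSIX
```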

Why this and not other http mocking and recording library?

Because the currently available options have some shortcomings which vts tries to address (probably not without introducing some of its own :)):

  • betamax, httpretty: save the gzipped/deflated responses verbatim, which is accurate but not very useful when visually inspecting the cassettes.
  • betamax: while mocking only requests is not an issue, providing a handle to the session object might be inconvenient in some use cases
  • mock or equivalents: are great for solitary tests, but requires extra plumbing code to setup the mocking for each tests

So far, pytest-vts has been successfully used to automate the testing of an application which relies heavily on making HTTP requests against upstream web based APIs.

Why a pytest plugin and not standalone?

Because, among many other features, pytest offers fixtures and test introspection out of the box, complemented by awesome development support (to name just a few: the builtin pytester fixture, pytest-localserver).

Test introspection has been very useful for implementing convenience features such as:

  • automatic naming of the cassette files based on the test name
  • automatically deciding the location of the cassettes based on the test modules
  • saving the cassette only if the test has passed

The examples above of how to customize the vts fixture are in fact plain pytest fixture features.

Why supporting responses and not others?

Because I think its API is familiar and it has proved itself a very reliable option.

Future features?

  1. implement various strategies for handling new/missing requests not present in the recorded cassette. Currently a new, un-recorded request will exhibit the behaviour defined by the mocking library for such requests (e.g. responses will raise a requests.exceptions.ConnectionError)
  2. serialize requests' response.history to cassette json.
  3. support other http-mocking libraries (probably those with callbacks as mock responses? - most of them have that).
  4. add support for filtering sensitive information (e.g. passwords, auth headers) from cassettes in case they're publicly available (e.g. cassettes stored on a public VCS service).
  5. add an informational note about a test being recorded/played back to the -vv output of pytest.
  6. consider having tracks saved in their own files to avoid having large cassettes
  7. resolve potential conflicts in the automatic naming of cassette files (same test method name in 2 different modules of the same package), maybe by using the module name as a prefix?
  8. consider not saving duplicated tracks
  9. handle tracks with duplicated ['request'] but different ['response']. Keep the first one? The last one? Raise? Make use of responses' assert_all_requests_are_fired? Keep all duplicates and support responses' functionality?
  10. have separate objects (cassette's sides?) for tracks recorded during tests vs recorded during fixture setup/teardown?
  11. [DONE] have playback callbacks raising when the body of the request doesn't match the body of the recorded request
  12. extend the above behaviour for headers/query_strings/selective headers?
  13. the body of the request is a string. It would be more practical to have it as a dict.
  14. Improve the api interface to configure the vts fixture for a test (e.g. set always recording/playing, don't save cassette, etc)
  15. [DONE, although not default yet] currently having 2 tests with the same name in different classes will reuse the same cassette (use the full identifier for a test?)
  16. a command line tool to replay a cassette using curl/requests?
  17. it seems the cookies library used by responses has problems parsing set-cookie headers with date formats in them, such as "Expires=Fri, 24 Feb 2017 00:58:28 GMT", and a fix to that library seems quite unlikely since it looks pretty much like a dead project. Furthermore, responses itself seems to be heading towards abandonment, which may mean it's stuck with cookies for a while; so maybe implementing our own requests patching might not be the worst idea ever. Or maybe get involved in responses a bit more.
  18. yet another problem with responses is the way it matches mocked requests: for requests with bodies (POST, PUT, PATCH) it doesn't consider the body when matching the current request, so the first matching url always wins. Consider vendoring a modified version of responses.
  19. add unittests for using vts as part of a higher-level fixture with lower-level fixtures using requests
  20. have vts report which tests are still making real http requests (using requests) and suggest using vts
  21. Audit the existing cassettes for changes in upstream responses compared with the recorded ones.