Preventing Setup Requests from Influencing Test Results #891

Open
chrisgodsey opened this issue Jan 4, 2019 · 3 comments
Labels
enhancement · evaluation needed (proposal needs to be validated or tested before fully implementing it in k6)

Comments

@chrisgodsey

chrisgodsey commented Jan 4, 2019

We're using k6 to do some performance testing of our API, and in some cases we need to run precondition requests to make sure our tests succeed (the simplest example is authentication). I can easily make the authentication request in setup() and then add the resulting token to the headers of all of my requests.
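For context, here's roughly what the script looks like today (a minimal sketch; the auth endpoint, credentials, and token field are placeholders, not our real API):

```javascript
import http from 'k6/http';
import { check } from 'k6';

export let options = {
  thresholds: {
    // The setup() request below currently counts toward this threshold as well.
    http_req_duration: ['p(95)<500'],
  },
};

export function setup() {
  // Precondition: authenticate once and hand the token to every iteration.
  let res = http.post('https://example.com/api/login', { user: 'test', pass: 'secret' });
  return { token: res.json('token') };
}

export default function (data) {
  let params = { headers: { Authorization: `Bearer ${data.token}` } };
  let res = http.get('https://example.com/api/resource', params);
  check(res, { 'status is 200': (r) => r.status === 200 });
}
```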

My issue is that this request (and any other setup/precondition/postcondition requests I want to make) pollutes the results of my test. If one of these requests happens to be particularly expensive or cheap, it can make a meaningful difference in my tests and their results.

I'm looking for a way to deliberately direct k6 not to track the results of a certain request when it aggregates the results of my test and judges whether or not it passes my thresholds.

One immediate idea is an untrackedGet method that mirrors your get function but skips whatever step records its results for later evaluation, so that I could call http.untrackedGet() (and similar versions for your other verbs), as sketched below.
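For illustration only, the proposal might look something like this (untrackedGet and friends do not exist in k6 today):

```javascript
import http from 'k6/http';

export function setup() {
  // Hypothetical API: behaves exactly like http.get(), but its metrics would be
  // excluded from the aggregated results and from threshold evaluation.
  let res = http.untrackedGet('https://example.com/api/login-token');
  return { token: res.json('token') };
}
```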

The only other alternative I have is to handle these preconditions in a script external to k6 and have that script invoke k6, passing the values into the Docker container, but that seems unnecessarily burdensome.

@na--
Member

na-- commented Jan 14, 2019

Requests made in the setup() and teardown() functions are tagged with the ::setup and ::teardown (IIRC) values of the group metric tag. So while you can't filter them out of the simple k6 end-of-test summary, you can do so if you output the metrics to a JSON file, InfluxDB, or Load Impact Insights: https://docs.k6.io/docs/results-output
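For example, if you run with `k6 run --out json=results.json script.js`, you can drop the setup/teardown data points before doing your own aggregation. A rough Node.js sketch (assuming the newline-delimited format of the JSON output, where each line is a Metric or Point object):

```javascript
// filter-results.js -- rough sketch, not an official k6 tool.
// Reads the newline-delimited JSON produced by `k6 run --out json=results.json ...`
// and keeps only the data points emitted outside of setup() and teardown().
const fs = require('fs');

const kept = fs
  .readFileSync('results.json', 'utf8')
  .split('\n')
  .filter(Boolean)
  .map((line) => JSON.parse(line))
  .filter((entry) => {
    if (entry.type !== 'Point') return false; // skip metric definition lines
    const group = (entry.data.tags && entry.data.tags.group) || '';
    return group !== '::setup' && group !== '::teardown';
  });

console.log(`kept ${kept.length} data points from the default VU code`);
```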

Adding a per-request option that specifies that an HTTP request shouldn't be tracked could be easy; we could just add another parameter here. But this won't scale, since we'd have to add such a parameter to every measurable thing k6 does, like WebSockets, and much more in the future (gRPC, DB connections, etc.). Instead, this might be implemented more universally at the group() level, probably as part of this issue or after it: #884

@depsir

depsir commented Feb 5, 2019

I just discovered this behavior and, finding it counter-intuitive, decided to add a brief paragraph to the documentation to make it clear to every user. You can find it here: https://docs.k6.io/docs/test-life-cycle

@na-- na-- added the evaluation needed label (proposal needs to be validated or tested before fully implementing it in k6) on Aug 27, 2019
@na--
Member

na-- commented Feb 2, 2021

I'm not going to close this issue yet, to allow people to find it more easily, despite #1321 being the issue tracking the potential fix. But for now, it's worth pointing out that there is a partial workaround for a lot of the issues people experience with setup() and teardown(), described in the forum threads linked to in #1321 (comment) and grafana/k6-docs#205
