Stop attaching test reruns to final test report entries #387


Merged 6 commits into pytest-dev:master on Dec 1, 2020

Conversation

@gnikonorov (Member)

Stop attaching the rerun entries created by pytest-rerunfailures to reports.

When we introduced our post-processing step, we incorrectly lumped in rerunfailures reruns (see #271), which caused this issue. To fix it, we now treat each rerun as its own test, which was the original behavior. I confirmed that the behavior now matches v2.1.1.

If no one objects, I'd also like to make a release after this merges, since this is a bugfix for the v3.0.0 release.

Closes #374
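
The fix described above can be sketched as follows. This is a minimal illustration, not pytest-html's actual API: the `ReportCollector` class and its method names are hypothetical, standing in for the plugin's real post-processing step. The key idea is the branch on the `"rerun"` outcome, which records the rerun as a standalone row instead of merging it into the final test entry.

```python
# Hypothetical sketch of the post-processing fix. ReportCollector and its
# methods are illustrative names, not pytest-html's real classes.

class ReportCollector:
    def __init__(self):
        self.results = {}   # nodeid -> outcomes merged into the final entry
        self.reruns = []    # each rerun kept as its own standalone row

    def process(self, nodeid, outcome):
        if outcome == "rerun":
            # Reruns are separate test runs: do NOT attach their data
            # to the final pass/fail entry (this was the bug in v3.0.0).
            self.reruns.append((nodeid, outcome))
        else:
            self.results.setdefault(nodeid, []).append(outcome)


collector = ReportCollector()
# A flaky test that fails twice, is rerun, and finally passes:
for outcome in ["rerun", "rerun", "passed"]:
    collector.process("test_flaky", outcome)

print(len(collector.reruns))            # → 2 (rerun rows stay standalone)
print(collector.results["test_flaky"])  # → ['passed'] (only the real result)
```

With this split, the final entry for `test_flaky` contains only the data from its last run, while each rerun still appears as its own row in the report.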

@gnikonorov gnikonorov added the bug This issue/PR relates to a bug. label Nov 29, 2020
@gnikonorov gnikonorov self-assigned this Nov 29, 2020
@ssbarnea (Member)

IMHO retries should be visible; they may include essential debugging information. In fact, we should also make them visible in the UI in some color other than green.

@BeyondEvil (Contributor)

> IMHO retries should be visible, they may include essential debugging information, in fact we should also make them visible in the UI with some other color than green.

I agree. But I think it might be worth doing a bugfix release first, and then adding the functionality you mention in a separate release.

@gnikonorov (Member, Author)

@ssbarnea the retries are visible. The issue is that without this PR we take all the retry data and add it to the original run. With this PR the data for each retry row is not added to the original run. Does that make sense?

For example, here is what a report from v3.0.0 looks like:
[Screenshot: Screen Shot 2020-11-29 at 10 08 13 AM]

Here is the same test run used to generate the above report, but from this PR instead:
[Screenshot: Screen Shot 2020-11-29 at 10 08 49 AM]

You can see that the retry rows still exist and have the proper data. Unlike on head, however, we are no longer adding everything into the final pass/fail run of the retry, which is the bug. This didn't happen in 2.1.1 and shouldn't happen, IMO.

@gnikonorov (Member, Author)

> > IMHO retries should be visible, they may include essential debugging information, in fact we should also make them visible in the UI with some other color than green.
>
> I agree. But I think it might be worth doing a bugfix release, and then adding the functionality you mention in a separate release.

@BeyondEvil, that functionality already exists, please see the above comment

@BeyondEvil (Contributor)

> @BeyondEvil, that functionality already exists, please see the above comment

Yepp, you're right! 👍

@BeyondEvil (Contributor) left a review:

Why is the time/duration stuff in this PR?


@gnikonorov (Member, Author) commented Nov 29, 2020

> Why is the time/duration stuff in this PR?

I moved time formatting to the TestResult class, since it makes no sense to format on the report when we merge reports into test results. Formatting should be done only once, when we compute the test result to show to the user. The way we were doing it, we formatted once per report and then later merged the reports, which is wrong: we first need to accumulate all the report times and only then format them for display.

And seeing that we needed to touch this logic anyway as part of this change (line 718 in the original version of plugin.py), I figured we may as well move it to the proper place, instead of moving it but keeping a known bug that we'd have to fix later for an unreleased feature.
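
The accumulate-then-format idea can be sketched like this. The names below (`TestResult`, `add_report`, `formatted_duration`) are illustrative, not pytest-html's actual implementation: durations from individual reports (e.g. setup, call, teardown) are accumulated as raw seconds during the merge step, and formatting happens exactly once, at display time.

```python
# Hypothetical sketch: accumulate raw durations on merge, format once on
# display. Class and method names are illustrative, not pytest-html's API.
import datetime

class TestResult:
    def __init__(self):
        self.duration = 0.0  # raw seconds, accumulated across merged reports

    def add_report(self, report_duration):
        # Merge step: only accumulate raw seconds, never format here.
        self.duration += report_duration

    @property
    def formatted_duration(self):
        # Format exactly once, when the merged result is rendered.
        return str(datetime.timedelta(seconds=round(self.duration)))


result = TestResult()
for d in [0.4, 1.1, 0.5]:  # e.g. setup, call, and teardown report durations
    result.add_report(d)

print(result.formatted_duration)  # → 0:00:02
```

Formatting per report and then merging would instead try to combine already-formatted strings, which is why the formatting call belongs on the merged result rather than on each report.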

@BeyondEvil (Contributor)

> > Why is the time/duration stuff in this PR?
>
> I moved time formatting to the TestResult class since it makes no sense to format on the report as we merge reports for test results. Formatting should be done only once when we compute the test result to show to the user. The way we were doing it was that we formatted once per report, and the later merged reports which is wrong as we first need to accumulate all the report times and only then format them for display
>
> And seeing that we needed to touch the logic anyway as part of this change ( line 718 in the original version of plugin.py ), I figured we may as well move it to the proper place instead of moving it but keeping a known bug that we'd have to fix for an unreleased feature later on

Would it make sense to have that as a separate PR? Since it's (sort of) unrelated to reruns? 🤔

@gnikonorov (Member, Author)

> > > Why is the time/duration stuff in this PR?
> >
> > I moved time formatting to the TestResult class since it makes no sense to format on the report as we merge reports for test results. Formatting should be done only once when we compute the test result to show to the user. The way we were doing it was that we formatted once per report, and the later merged reports which is wrong as we first need to accumulate all the report times and only then format them for display
> >
> > And seeing that we needed to touch the logic anyway as part of this change ( line 718 in the original version of plugin.py ), I figured we may as well move it to the proper place instead of moving it but keeping a known bug that we'd have to fix for an unreleased feature later on
>
> Would it makes sense to have that as a separate PR? Since it's (sort of) unrelated to reruns? 🤔

We could, but I think that would make this change dirty since it would look like this:

```diff
                 if test_report.outcome == "rerun":
                     # reruns are separate test runs for all intensive purposes
+                    test_report.duration = getattr(test_report, "duration", 0.0)
+                    test_report.formatted_duration = self._format_duration(test_report)
                     self.append_rerun(test_report)
```

And we would also be doing it immediately below on https://github.com/pytest-dev/pytest-html/blob/master/src/pytest_html/plugin.py#L717 which is confusing to read.

I'd argue this is related, as this PR is really fixing all post-processing related to dynamic reruns, and this move of the formatting to TestResult is a direct result of that: if we didn't have to change the post-processing logic, the duration formatting on line 718 would make sense.

It seems like a big PR, but all I did was move a function from one place to another, and its call from one place to another, so it's 2 lines. The method is the same.

@BeyondEvil

@BeyondEvil (Contributor)

> It seems like a big PR but all I did was move a function from one place to another and it's call from one place to another so it's 2 lines. The method is the same
>
> @BeyondEvil

Fair enough. 😊

@BeyondEvil (Contributor) left a review:

Send it! 🚀

@gnikonorov (Member, Author)

Thanks @BeyondEvil! I'll merge in tomorrow assuming @ssbarnea has no issues with it.

Then I'll release v3.1.0 on 12/2

@gnikonorov gnikonorov merged commit 2bb2010 into pytest-dev:master Dec 1, 2020
FirePing32 pushed a commit to FirePing32/pytest-html that referenced this pull request Dec 9, 2020
* treat rerun entries as seperate test runs

* Add changelog entry

* fix a typo

* remove debug code

* fix flaky test on mac
Labels: bug (This issue/PR relates to a bug.)
Closes issue: Extras from reruns are attached to final test (#374)
3 participants