brap 15 hours ago

Do people actually send PRs with no tests? That is so bizarre to me

pjdesno 15 hours ago | parent | next [-]

If your review was based on features shipped, and your bosses let you send PRs with no tests, would you? And before you say "no" - would you still do that if your company used stack ranking, and you were worried about being at the bottom of the stack?

Developers may understand that "XYZ is better", but if management provides enough incentives for "not XYZ", they're going to get "not XYZ".

wccrawford 13 hours ago | parent | next [-]

That actually wasn't why I didn't write tests a lot of the time.

What stopped me was that after a year of writing tests, I was moved to a higher priority project, and the person who followed me didn't write tests.

So when I came back, many of the tests were broken. I had to fix all of them before writing new ones was worthwhile again.

Repeat again, but this time I came back and the unit testing suite had fundamentally altered its nature. None of the tests worked and they all needed to be rewritten for a new paradigm.

I gave up on tests for that system at that point. It simply wasn't worthwhile. Management didn't care at all, despite how many times I told them how much more reliable it made that system, and it was the only system that survived the first giant penetration test with no problems.

That doesn't mean I quit testing. I still wrote tests whenever I thought it would help me with what I was currently working on. And that was quite often. But I absolutely didn't worry about old tests, and I didn't worry about making sure others could use my tests. They were never going to try.

The final straw, less than a year before I was laid off, was when they decided my "storybook" tests weren't worth keeping in the repo and deleted them. That made me realize exactly how much they valued unit tests.

That isn't to say they had no tests. There was a suite of tests written by the boss that we were required to run. They were all run against live or dev servers with a browser-control framework, and they were flaky for years. But they were required, so they were actually kept working. Nobody wrote new tests for it until something failed and caused a problem, though.

tl;dr - There are a lot of reasons that people choose not to write tests, and not just for job security.

brap 10 hours ago | parent | prev | next [-]

Well this wasn’t really aimed at individual devs, but the team/company standards.

I’ve worked on several teams and it was always the norm that all PRs come with tests. There was never a dedicated QA person (sometimes there would be an eng responsible for the test infra, but you would write your own tests).

I would never accept a PR without tests unless it was totally trivial (e.g. someone mentioned fixing a typo).

eikenberry 9 hours ago | parent | prev | next [-]

A broken environment engenders broken behavior. That explains why it's bizarre; it doesn't mean it isn't bizarre.

hamdingers 14 hours ago | parent | prev [-]

Breaking prod repeatedly probably impacts your stack ranking too.

Jtsummers 14 hours ago | parent | next [-]

Depends on how easily the failure is connected back to you personally. If you introduce a flaw this year and it breaks the system in two years, it won't fall back on you but the poor sap that triggered your bug.

actionfromafar 12 hours ago | parent | prev [-]

So can "heroically" saving prod ... anti-patterns.

weinzierl 14 hours ago | parent | prev | next [-]

> "Do people actually send PRs with no tests?"

Rarely

Do people send PRs with just enough mostly useless tests, just to tick the DoD boxes?

All the time.

xyzzy123 14 hours ago | parent | prev | next [-]

It depends on the application but there are lots of situations where a proper test suite is 10x or more the development work of the feature. I've seen this most commonly with "heavy" integrations.

A concrete example would be adding, say, SAML+SCIM to a product; you can add a library, do a happy-path test, and call it a day. Maybe add a test against a captive IdP in a container.

But testing all the supported flows against each supported vendor becomes a major project in and of itself if you want to do it properly. The number of possible edge cases is extreme and automating deployment, updates and configuration of the peer products under test is a huge drag, especially if they are hostile to automation.

vrighter 14 hours ago | parent [-]

Once, for a very very critical part of our product, apart from the usual tests, I ended up writing another implementation of the thing, completely separately from the original dev, before looking at his code. We then ran them side by side and ensured that all of their outputs matched perfectly.

The "test implementation" ended up being more performant, and eventually the two implementations switched roles.

brianwawok 15 hours ago | parent | prev | next [-]

When I spelled some text wrong! Or want to add a log. Lots of reasons a change is too silly to need a test.

toephu2 5 hours ago | parent | prev | next [-]

Yes, it depends on what you're building. Is it just a prototype? No tests needed. Are you trying to move fast and break things? No tests needed. Are tests just not feasible for this piece of code (e.g., all UI, not unit testable)? Then no tests needed.

gwbas1c 10 hours ago | parent | prev | next [-]

Yes

To put things in context, it both depends on organization standards, and what the change actually is.

Where I work, there are areas that, if you change, you must update the tests. There are also development helper scripts and internal web sites where "it compiles" is good enough.

Likewise, I've done quite a bit of style cleanup PRs where the existing tests are appropriate.

liampulles 14 hours ago | parent | prev [-]

I've seen it many times. I think it often arises in businesses that are not very technical at their core.