How important is QA testing in software development?
March 19, 2012 8:41 PM

I'm looking at a software company of about 30 developers. Everything looks great except there's no QA department. Is this a big warning sign?

When I asked about test plans and how QA is done, I was told that engineers are responsible for QA and that all testing should be acceptance testing. I've come from a company that writes financial software, so our QA was pretty rigorous. Is this common? Is this a deal-breaker, don't-go-there thing?

I should emphasize that I mean QA/people testing and not unit testing. They're strong proponents of unit testing, but an organization that size without a proper QA department says hotfixes and late nights to me, though they claim it doesn't.
posted by anonymous to Computers & Internet (17 answers total) 6 users marked this as a favorite
 
It's worrisome. I'd find out what exactly they mean by "engineers are responsible for QA," but I can't think of any answers that would make me feel more comfortable. I would expect a company of 30 developers to have at least 3 dedicated QA people, depending on the type of software they're developing. Unit testing has its place, but it's not a magic bullet; if you're at the scale of 30+ developers, why aren't you doing anything more than unit and acceptance testing? Dedicated QA personnel can't really just be replaced by "make the engineers spend part of their time doing QA" unless you want crappy QA.
posted by axiom at 8:56 PM on March 19, 2012


This is the situation at my current company, and it is not ideal in any way. We have test plans written by developers, but it is very easy to write a test plan that you know will pass, especially if you must have the test run by a third party, usually an unfortunate marketing person who would rather be doing something else.

It's way cheaper in the short term to farm out QA to your customers, but I don't make any claims for its long-term prospects.
posted by fifteen schnitzengruben is my limit at 8:59 PM on March 19, 2012


Depends on the kind of software. It's not uncommon for companies to have no QA department (or a very small one) if they're making, e.g., consumer-facing web-based stuff that can be rolled out gradually and patched quickly and isn't likely to have disastrous consequences for anyone if it fails. I'm not saying that it's an optimal methodology, but I know it's the methodology of a number of successful companies, and I don't hear the developers complaining about it too much.
posted by phoenixy at 9:04 PM on March 19, 2012


I could take a job at a place that had no QA department per se, but not one that had no QA people whatsoever. I have worked on a project where developers were made responsible for their own QA, and I quit, partly for that reason. If I wanted to install four different versions of Windows in half a dozen languages each and verify that the UIs looked correct in each, I would have gone into QA in the first place. But that is not what I (nor probably you) want to be working on.

One of my favorite things about my current position is how thorough our QA is and how many bugs they actually catch. I would spend half my time doing work that I don't want to be doing (and therefore hating my job) if it weren't for them.
posted by tylerkaraszewski at 9:36 PM on March 19, 2012


It's #10 on The Joel Test. I'd find it concerning. As Joel says, they're either shipping buggy code or using $100/hour programmers when they could use $30/hour testers.

Also, the programmers will get bored and go somewhere where they get to program 100% of the time instead of 66%.

If it's otherwise perfect, I might be tempted, but if it seems to have other issues, you might want to keep looking.
posted by Mad_Carew at 9:37 PM on March 19, 2012 [1 favorite]


I think it's a warning sign. Developers make lousy QA testers. We're optimists: we think the code is going to work, and testing is boring. So they are either wasting developers' time or undertesting.
posted by zompist at 9:44 PM on March 19, 2012 [2 favorites]


It's not a good thing, but it's not at all uncommon, even in much larger companies. Even good ones.
posted by greasepig at 9:51 PM on March 19, 2012


It would worry me.

I worked for a company that laid off all QA in my division. The devs were expected to test each other's code. It was a nightmare and if I recall correctly it was less than a year later that they started hiring QA again.
posted by kbuxton at 9:53 PM on March 19, 2012 [1 favorite]


It sounds like a more informally-structured company culture than you're used to. Whether or not you want to work there is probably going to depend on (A) how attracted you are to this company and job otherwise, and (B) whether you want to spend your own time doing your own QA. This is probably a wear-multiple-hats kind of job, and I would expect that they probably aren't super organized on other subjects either.

Disclaimer: I'm not an engineer, but I come from a family of them, and my bf works in the (very small) QA department of a comparatively enormous company.
posted by celtalitha at 9:55 PM on March 19, 2012


Software needs to be tested by different people than the people who wrote it, just like your financial books need to be audited by people outside the team that put them together. I'd look very closely at this, since it's a recipe for being stuck many late nights fixing problems that should have been caught in QA. If it's a hot startup I might take a chance, but otherwise I'd look long and hard at the company.
posted by bottlebrushtree at 10:50 PM on March 19, 2012 [5 favorites]


It's a red flag. It is a sign of immature software development processes, which strongly suggests that there will be other areas where they don't work very smart. I'd be concerned that they screw up CM too, or that they work from crappy specifications and waste a ton of effort with Ready, Fire, Aim kind of bullshit. They are going to struggle to grow, because they are already at a mass where a lack of mature processes will punish them.

It also means that you as a developer will have to devote more time to testing. This isn't bad per se, but it may not be something you enjoy. I actually think it benefits most developers to have spent some time doing QA, so it could be useful for your long-term skill building, but I wouldn't think it would be interesting work for very long. I also suspect that coders who do their own testing and take the heat when they release buggy code learn to build better mousetraps. I've known coders who almost never get code bounced, and when they do, it is almost always down to disagreements about the spec or interactions with other code that didn't work the way the coder expected. Of course, the latter reason is the very point of integration testing, since by definition it can't be caught in a unit test. We don't post metrics on individual return numbers from QA, but I notice coders who either write exceptionally fast or exceptionally clean.

In terms of its effect on your life vis-a-vis hotfixes and late nights, it depends a lot on the kind of stuff they are writing. In some industries you don't get 2am emergency trouble calls, and bugs don't have to be turned around that fast, nor do they have a huge impact. But if the software does anything remotely important or interesting, it's going to have repercussions. Personally, I wouldn't like to code in a shop that didn't have decent processes, because they are probably not going to invest in good tools, and it's probably going to cause a million small problems that would annoy the hell out of me.

There also exist shops that don't think they have a QA department, but really do. Typically, the testing is done by one person on staff who is in charge of putting together the releases and usually has extraordinarily strong functional expertise. In effect, they have a QA person; they just don't call them that. Other times, especially if they are a one-customer shop, there might be a person like that on the customer side who usually ends up being both the gatekeeper for functional changes and more or less the reviewer of all inbound code. Both of those arrangements could work in a smaller house like you describe, but neither is optimal, and both still suggest technical management who don't get it.
posted by Lame_username at 11:05 PM on March 19, 2012


Worrisome? Sure. How worrisome? Not enough information here to really say. Some of the questions I would be asking in this situation to try to assign a 1-through-10 kind of number to how worrisome it is:

- How many products are the 30 devs working on? (One big one? Lots of small ones?)

- Are the products shrink-wrapped code, bespoke stuff for customers, or one big software-as-a-service app?

- What is the business domain here for the customers and what is the impact of a defect? There is a big difference between, say, medical imaging software and gaming software.

- What is the lifespan of the software? Are they supporting multiple versions in the field? Or are these throwaway apps with short lifespans?

- Do they have any intention of having a QA group? Is it something in their plan, or are they comfortable the way they are?

- Are they doing formal defect tracking? Is there a bug tracking app? How do they figure out what they have to fix? Can they tell you how many defects they had to fix last quarter, for example?

- Are they on a regular release schedule, or is it more ad hoc? If they get a defect reported from the field, how does it get folded back into the product? Do they release patches or just fix it in the next release?

- How much time are they spending on fixing defects vs. coding new features? Do they know?

- Are they using a formal build system, where they can push one button and generate or reproduce a specific build?

- You mentioned that they are strong proponents of unit testing. Any other compensating controls in place like code reviews or pair programming or something like that?

Finally, I would follow up your question to them about "where is the QA dept?" with "what is the thought process behind not using dedicated testers?" Good luck with the decision, I hope it works out for you either way.
posted by kovacs at 3:29 AM on March 20, 2012


It's typical. Testing is usually funded by the project. We contract testers as and when we need them.
posted by mattoxic at 3:29 AM on March 20, 2012


The question is why. If they're growing and expect to have one eventually, it's different from deciding they don't need one.
posted by Obscure Reference at 4:25 AM on March 20, 2012


Is this an Agile shop? The agile wave of the future in some circles is to get rid of the testers (because, of course, that will get rid of "waste" and long cycles) and have business owners and developers write all of the acceptance tests.
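For what it's worth, a developer-written acceptance test in that style tends to look something like this minimal pytest sketch (the `create_app` factory and the routes are hypothetical stand-ins, not any particular shop's code; Flask's real test client is used for the plumbing):

```python
# A minimal acceptance test: exercises a user-visible workflow
# end-to-end instead of a single function in isolation.
import pytest

from shop import create_app  # hypothetical Flask-style app factory


@pytest.fixture
def client():
    app = create_app(testing=True)  # hypothetical test configuration
    with app.test_client() as client:  # Flask's built-in test client
        yield client


def test_customer_can_place_an_order(client):
    # Acceptance criterion agreed with the business owner:
    # "A logged-in customer can add an item to the cart and check out."
    client.post("/login", data={"user": "alice", "password": "secret"})
    client.post("/cart/add", data={"sku": "WIDGET-1", "qty": "1"})
    response = client.post("/checkout")
    assert response.status_code == 200
    assert b"Order confirmed" in response.data
```

A test like this only proves the happy path works; it says nothing about load, security, or failure modes.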

Do you know what non-functional testing is? Performance testing? Security testing? All of these techniques can help to guarantee a good product. Do you want to be responsible for all of that in addition to your considerable responsibilities as a programmer?
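To be concrete about the gap, here's the kind of crude latency check a lone developer might bolt on (the endpoint and budget are assumptions for illustration; real performance testing would also cover throughput, concurrency, percentiles under sustained load, and more, none of which this touches):

```python
# Crude latency smoke check using only the standard library.
import statistics
import time
import urllib.request

URL = "http://localhost:8000/search?q=widgets"  # hypothetical endpoint
BUDGET_SECONDS = 0.250  # assumed latency budget

samples = []
for _ in range(50):
    start = time.perf_counter()
    urllib.request.urlopen(URL).read()
    samples.append(time.perf_counter() - start)

p95 = statistics.quantiles(samples, n=20)[-1]  # ~95th percentile
print(f"p95 latency: {p95:.3f}s over {len(samples)} requests")
assert p95 < BUDGET_SECONDS, "p95 latency over budget"
```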

I've heard that Google doesn't have traditional testers any more - Google users do a lot of the acceptance testing - but they do have "developers in test" who I would suspect are not testing their own code. The place you describe doesn't sound like Google, though.
posted by Currer Belfry at 5:53 AM on March 20, 2012


As a QA guy, I've found very few developers who can write tests. It's just a different skill set and, more importantly, a different mindset. Most developers see test development and testing as wasted time when they could be writing new product code. Thirty developers writing code without any formal QA team and/or process seems like a very bad sign. Maybe some of them are good at testing, but I'm sure not all of them. And even if they do test their own code, who's testing the integrated builds? Every single unit test can pass, and then you integrate, build, and fire it up, and the whole thing crashes and burns because of conflicting assumptions between components.
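To make "conflicting assumptions" concrete, here's a toy sketch (hypothetical code, nobody's real product): each component's unit tests pass against its own assumption about units, and the bug only shows up when the two are wired together.

```python
# Component A: computes a retry timeout and returns SECONDS.
def compute_timeout(retries: int) -> float:
    return 1.5 * retries  # seconds


def test_compute_timeout():
    assert compute_timeout(2) == 3.0  # passes in isolation


# Component B: expects a duration in MILLISECONDS.
def wait_for_retry(timeout_ms: float) -> float:
    return timeout_ms / 1000.0  # converts ms to seconds of waiting


def test_wait_for_retry():
    assert wait_for_retry(3000) == 3.0  # passes in isolation


# Integration: both suites are green, but the wired-up call waits
# 0.003s instead of 3s because each side assumed different units.
def test_integration_catches_the_mismatch():
    assert wait_for_retry(compute_timeout(2)) == 3.0  # fails: 0.003
```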

And who's testing different platforms? Or doing upgrade testing? Or stress testing? Or scalability testing? Or fault-tolerance testing? Or power-fail testing? If you don't have a dedicated QA team, the answer to those questions is usually: the customers.
posted by octothorpe at 6:09 AM on March 20, 2012 [2 favorites]


Just a point of comparison: companies that *have* dedicated QA testers still suffer from bad code, which leads to buggy software, which leads to bad sales, which in turn leads to bad business decisions, and ultimately, Fin.
posted by Kruger5 at 7:38 AM on March 20, 2012

