[Beaker-devel] Integration tests vs unit tests

Nick Coghlan ncoghlan at redhat.com
Fri Sep 6 02:27:54 UTC 2013


Just dumping a transcript of an IRC conversation between rmancy and me
about unit tests vs integration tests (in the context of the patch that
makes the createrepo command configurable:
http://gerrit.beaker-project.org/#/c/2208/).

At the moment, Beaker's unit testing is fairly minimal, so we don't have
a good, quick, confidence-building set of tests to run before pushing to
Gerrit, just the full set of integration tests (which can take half an
hour or more to run, depending on the details of your system).

It's going to take some time for us to improve Beaker's unit testing
story, so this just struck me as an expedient way to get on record that
this is the direction we're likely to be heading :)

Cheers,
Nick.

[11:45] <rmancy> ncoghlan, I don't understand what the objection is to
creating integration tests that test different configuration values
[11:46] <ncoghlan> rmancy: integration tests are *slow*, because they
include the full chain (client, web server, database)
[11:46] <ncoghlan> so doing exhaustive testing at the integration level
is inordinately expensive
[11:47] <ncoghlan> the idea of unit/integration/acceptance layering is
to increase confidence while minimising cost
[11:47] <ncoghlan> so you do your exhaustive testing of combinations at
the unit test layer, where it's cheap
[11:48] <ncoghlan> and use your integration tests mostly to ensure
everything is hooked up correctly
[11:48] <ncoghlan> rather than to do exhaustive testing of the
alternative inputs at each layer
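To make that layering concrete, here's a rough Python sketch of the kind
of cheap, exhaustive unit-level check this implies for the configurable
createrepo command. The helper name and the config key are hypothetical,
not Beaker's actual API:

    import unittest

    def build_createrepo_command(config):
        # Hypothetical helper: pick the binary from the config, falling
        # back to the stock "createrepo" command.
        command = config.get('beaker.createrepo_command', 'createrepo')
        return [command, '--update', '-q']

    class TestCreaterepoCommand(unittest.TestCase):
        def test_configured_variants(self):
            # Exhaustive checking of the alternatives is cheap here: no
            # web server, no database, just a function call per case.
            cases = [
                ({}, 'createrepo'),
                ({'beaker.createrepo_command': 'createrepo_c'},
                 'createrepo_c'),
            ]
            for configured, expected in cases:
                self.assertEqual(
                    build_createrepo_command(configured)[0], expected)

    if __name__ == '__main__':
        unittest.main()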
[11:48] <rmancy> ncoghlan, so why don't we just abandon integration
testing of anything that is not directly testing the UI?
[11:48] <rmancy> i.e. unit tests are the new norm
[11:49] <ncoghlan> rmancy: because it hasn't been our highest priority
to date :)
[11:49] <ncoghlan> but yes, that's the direction I would eventually like
us to go
[11:49] <rmancy> ncoghlan, it doesn't need to be a priority, we just
need to say 'no more integration tests of things that aren't directly
testing the UI'
[11:50] <ncoghlan> rmancy: well, you need at least *some* integration
tests to ensure things are hooked up correctly (so one success case, one
failure case, both the web UI and the CLI)
[11:51] <ncoghlan> but yeah, that would be a good thing to include in
the developer guide :)
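A minimal sketch of that kind of "is it hooked up?" integration coverage
for the CLI side, assuming a configured client and a running server (the
choice of subcommands and the failure input are illustrative):

    import subprocess
    import unittest

    class TestClientHookup(unittest.TestCase):
        def test_success_case(self):
            # One cheap round trip through client, server and database.
            result = subprocess.run(['bkr', 'whoami'],
                                    capture_output=True, text=True)
            self.assertEqual(result.returncode, 0)

        def test_failure_case(self):
            # One representative failure is enough to show errors are
            # surfaced properly; exhaustive inputs belong in unit tests.
            result = subprocess.run(['bkr', 'job-cancel', 'J:0'],
                                    capture_output=True, text=True)
            self.assertNotEqual(result.returncode, 0)

    if __name__ == '__main__':
        unittest.main()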
[11:51] <ncoghlan> I'd also like us to start moving the unit tests into
test subpackages
[11:51] <ncoghlan> rather than having the test files directly adjacent
to the main source files
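Concretely, that would mean moving from something like the first layout
to the second (paths illustrative):

    Server/bkr/server/model.py
    Server/bkr/server/test_model.py          (current: adjacent to source)

    Server/bkr/server/model.py
    Server/bkr/server/tests/__init__.py
    Server/bkr/server/tests/test_model.py    (proposed: test subpackage)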
[11:51] <rmancy> ncoghlan, but then if you have tested your
success/failure via the integration test, in some circumstances that
would negate the need for the unit test
[11:52] <rmancy> which is also fine as far as I'm concerned
[11:52] <ncoghlan> rmancy: ah, but once the unit tests are good enough,
you change the merge criteria from "integration tests pass" to "unit
tests pass", to reduce your cycle times
[11:53] <ncoghlan> since the unit tests should run one or two orders of
magnitude faster than the integration tests
[11:55] <ncoghlan> so yeah, since we're updating the developer guide to
add a high level style guide this sprint, it's a good idea to add
something along those lines
[11:56] <ncoghlan> (although patchbot has already taken some of the pain
away, since it's usually feasible to let that handle the testing to get
a +1 verified, and work on something else while waiting for it)
[11:56] <rmancy> ncoghlan, lately I've been thinking that patchbot does
a reasonable job of reducing those times (at least in my case...)
[11:56] <rmancy> heh, yes
[11:57] <ncoghlan> that means good unit tests become the criterion for
"run these before pushing to Gerrit, so you don't waste patchbot's time" :)
[11:57] <rmancy> ncoghlan, I agree
[11:58] <rmancy> but good integration tests still give you confidence
that you won't break something in production
[11:58] <rmancy> So I think they are still important to have, even if we
get to a point where we only run them before branching a release or
something
[11:58] <ncoghlan> yeah
[11:59] <ncoghlan> actually, I think we're likely to stick with the
current flow (of patchbot running the integration tests)
[11:59] <rmancy> right
[11:59] <ncoghlan> so improving the unit tests will be about a pre-check
run by developers before pushing to Gerrit
[11:59] <rmancy> I just mean that it's important to have good
integration tests, even if you only run them once
[11:59] <ncoghlan> yup
[12:00] <ncoghlan> it's just about remembering that integration tests
are there to ensure everything is hooked up correctly, not to verify
internal details of individual components
[12:00] <ncoghlan> unless there's no way to verify those internal
details with a unit test
[12:01] <ncoghlan> it's also why the approach of using exception types
to communicate with the UI is so important
[12:01] <ncoghlan> at the moment, we *have* to do exhaustive integration
tests, since that's the only way to check the UI error handling
[12:02] <ncoghlan> whereas when the internal layers communicate with the
UI by throwing particular exceptions, then the integration tests just
need to provoke each kind of error *once*
[12:02] <ncoghlan> and ensure the appropriate message is displayed
[12:03] <ncoghlan> then the *unit* tests can take care of testing the
different ways of provoking each exception, and ensure those each have
the right message in the exception
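A rough sketch of that pattern, with hypothetical names: the internal
layer raises one well-defined exception type, the unit tests cover each
way of provoking it, and the integration suite only needs to trigger it
once to check the UI rendering:

    import unittest

    class InvalidTaskError(ValueError):
        """Hypothetical exception the internal layer raises for the UI."""

    def update_task(rpm_name):
        # Hypothetical internal routine: every invalid input maps to the
        # same exception type, each carrying a user-facing message.
        if not rpm_name.endswith('.rpm'):
            raise InvalidTaskError('%s is not an RPM package' % rpm_name)
        if rpm_name.startswith('-'):
            raise InvalidTaskError('invalid task name: %s' % rpm_name)
        return rpm_name

    class TestUpdateTask(unittest.TestCase):
        # Unit tests cover *each way* of provoking the exception, and
        # the message each one carries...
        def test_rejects_non_rpm(self):
            with self.assertRaises(InvalidTaskError) as caught:
                update_task('task.tar.gz')
            self.assertIn('not an RPM package', str(caught.exception))

        def test_rejects_bad_name(self):
            with self.assertRaises(InvalidTaskError) as caught:
                update_task('-bad.rpm')
            self.assertIn('invalid task name', str(caught.exception))

    # ...while the integration suite only needs to provoke
    # InvalidTaskError once, and check that the web UI and the CLI
    # display its message properly.

    if __name__ == '__main__':
        unittest.main()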
[12:03] <ncoghlan> hmm, I think I may grab a copy of this archive and
post it to the beaker-devel list :)
[12:04] <ncoghlan> since it's important, but it may be a while before we
get it explained nicely in the developer guide
[12:08] <rmancy> ncoghlan, but the unit tests don't actually run the same
code as an integration test. So whilst a unit test will test that the
actual code that was changed works, it doesn't check that it works
within the context it will actually be called in.
[12:09] <ncoghlan> rmancy: the main thing the unit tests ensure is that
calls that used to work keep working. If an integration test fails when
the unit tests passed, it's often a sign that there's a missing unit test
[12:10] <ncoghlan> the other thing it can indicate is that a mocked API
is too permissive, so the unit tests aren't enforcing the same
constraints as the integration tests (there are pros and cons to that)
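That failure mode is easy to demonstrate with the mock library
(unittest.mock on Python 3, the external mock package on Python 2), and
autospec is one way to keep a mocked API honest. The TaskLibrary
stand-in here is illustrative:

    import unittest
    from unittest import mock

    class TaskLibrary(object):
        """Illustrative stand-in for a real API with a fixed signature."""
        def update_repo(self, recipe_id, force=False):
            pass

    class TestMockStrictness(unittest.TestCase):
        def test_permissive_mock_hides_bugs(self):
            # A bare Mock accepts a call the real API would reject, so a
            # unit test using it can pass while integration tests fail.
            lib = mock.Mock()
            lib.update_repo('not-an-id', bogus_kwarg=True)  # no complaint
            self.assertTrue(lib.update_repo.called)

        def test_autospec_enforces_the_real_signature(self):
            # create_autospec validates calls against TaskLibrary, so the
            # unit test enforces the same constraints as the real thing.
            lib = mock.create_autospec(TaskLibrary, instance=True)
            with self.assertRaises(TypeError):
                lib.update_repo('some-id', bogus_kwarg=True)

    if __name__ == '__main__':
        unittest.main()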
[12:11] <rmancy> ncoghlan, ok so you can fix your unit test up to catch
those kinds of issues, but surely you still want the integration test
there to continue finding non-obvious errors so that you can continually
improve your unit tests?
[12:12] <ncoghlan> that's why it's important to have both - unit tests
optimise for speed and reliability and avoiding external dependencies in
order to improve cycle times during development, but you need the
integration tests to back them up and ensure the assumptions in the unit
tests are still valid
[12:12] <rmancy> right
[12:13] <ncoghlan> and then acceptance tests for new features add a
*third* layer, which ensures that what you built is actually usable by
other humans ;)
[12:14] <rmancy> ncoghlan, right, so in the case of the createrepo
tests, adding a unit test to complement the integration test is what is
needed. I don't yet buy that they are fragile, or not worth the hassle.
[12:15] <ncoghlan> rmancy: there's an existing Bugzilla bug to say we
need unit tests for model.TaskLibrary. The bit Dan is pointing out as
fragile is starting and stopping gunicorn to pick up the config change.
[12:16] <rmancy> ncoghlan, we found a problem that I've fixed. I don't
see that as any different to other test iterations.
[12:16] <ncoghlan> https://bugzilla.redhat.com/show_bug.cgi?id=965915 btw
[12:18] <ncoghlan> rmancy: agreed, there's value in that integration
test to ensure it *is* reading the config file correctly
[12:20] <ncoghlan> however, it's not actually testing that at the moment
- you could change the code to always call "createrepo" regardless of
the settings, and it would still pass
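One way to close that gap is to point the setting at a sentinel script
and assert it was actually the one invoked. A self-contained sketch for
POSIX systems, with a hypothetical run_createrepo standing in for the
real code under test:

    import os
    import stat
    import subprocess
    import tempfile
    import unittest

    def run_createrepo(config, repo_dir):
        # Hypothetical code under test: it should honour the configured
        # command rather than always running the stock "createrepo".
        command = config.get('createrepo_command', 'createrepo')
        return subprocess.call([command, repo_dir])

    class TestCreaterepoConfig(unittest.TestCase):
        def test_configured_command_is_actually_used(self):
            tmpdir = tempfile.mkdtemp()
            marker = os.path.join(tmpdir, 'ran')
            script = os.path.join(tmpdir, 'fake-createrepo')
            with open(script, 'w') as f:
                f.write('#!/bin/sh\ntouch "%s"\n' % marker)
            os.chmod(script, stat.S_IRWXU)
            run_createrepo({'createrepo_command': script}, tmpdir)
            # Fails if the code ignores the setting and always runs the
            # stock "createrepo" binary.
            self.assertTrue(os.path.exists(marker))

    if __name__ == '__main__':
        unittest.main()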
[12:24] <rmancy> ncoghlan, hmm, that's true
[12:24] <ncoghlan> rmancy: more specific feedback added to
http://gerrit.beaker-project.org/#/c/2208/ :)
[12:24] <rmancy> ok, I think I need to change that test

-- 
Nick Coghlan
Red Hat Infrastructure Engineering & Development, Brisbane

Testing Solutions Team Lead
Beaker Development Lead (http://beaker-project.org/)

