An Odyssey of Testing in Go

Why bother writing tests at all?

It’s all about moving fast without breaking anything.

This is a brief overview of the work our team did on testing over the past half year. With the help of CI, we significantly reduced the number of bugs and sped up the delivery of new features, as we had hoped. However, the most exciting part is that we gained the confidence to refactor: any constructive idea can be applied to a project at any time, without the fear of breaking anything. Before delving into the details of how the tests are written, let's get familiar with the testing pyramid first.

A testing pyramid is a testing suite that contains several layers, each representing how tests are implemented. Here is what a testing pyramid looks like:

the Testing Pyramid from @Colin_but

TBH, this testing model is dogmatic. Tests should focus on the features the product provides, not on how the tests are implemented.

The following conclusions are based on the fact that we are developing an online service, not a library.

Unit Tests

Unit tests should not exist at all.

Unit tests are incapable

Modern services are built from frameworks and libraries, and the main workflow of a typical service is transforming data from one form into another. Unit tests don't play an important role in either respect. Passing all the unit tests means the individual functions work as expected, and that's all; we still have no idea whether the functions collaborate with each other correctly.

Unit test vs. integration test

Unit tests are crap

We are not surprised to see a package named utils lying in the codebase. It's the ultimate shelter for homeless functions, which exist for several reasons:

  • the developer is seeking alternatives to macros;
  • the developer prefers functions to libraries;
  • the developer violates the Single Responsibility Principle;
  • the developer programs to an implementation, not an interface;

Sadly, none of the above shows good programming taste; the homeless functions are complete crap. And what do we call a thing that proves a crap correct? Another crap.

Though the homeless functions are usually poorly designed, lack abstraction, and are nearly impossible to reuse (unless you refactor them with an additional parameter ;-P), they still need to be tested. After all, they are part of the production code, right?

Instead of using Go’s native testing package, we chose a more expressive test framework, GoConvey, to organize unit tests:
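The original snippet isn't preserved here, so below is only a minimal sketch of the GoConvey style, exercising a hypothetical Clamp helper of the kind that usually ends up in utils:

    package utils

    import (
        "testing"

        . "github.com/smartystreets/goconvey/convey"
    )

    // Clamp is a hypothetical "homeless" helper, defined inline for illustration.
    func Clamp(v, lo, hi int) int {
        if v < lo {
            return lo
        }
        if v > hi {
            return hi
        }
        return v
    }

    func TestClamp(t *testing.T) {
        Convey("Given a value and a range", t, func() {
            Convey("When the value is inside the range", func() {
                So(Clamp(5, 0, 10), ShouldEqual, 5)
            })
            Convey("When the value falls outside the range", func() {
                So(Clamp(-1, 0, 10), ShouldEqual, 0)
                So(Clamp(42, 0, 10), ShouldEqual, 10)
            })
        })
    }

The nested Convey blocks read like a specification, which is the main reason we prefer it over plain table-driven tests for these helpers.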

Integration Tests

Remember the pyramid we mentioned above? Time to flip it over.

A better choice is integration tests. Integration tests focus on the behavior when modules or functions are grouped together and work as a whole. Given a set of all possible operations with different inputs, assertions tell us whether the resulting state is correct. Besides, well-designed integration tests should also cover scenarios like unexpected errors and edge cases.

To run integration tests smoothly in our services, some extra effort is needed:

  • Invoking methods directly is prohibited; each external method (i.e. one not provided by the Go runtime) has to be declared on an interface (or at least part of one) first;
  • Therefore, a DI container like wire is required (we prefer writing the container by hand; it's more flexible);
  • Collect (or develop) all kinds of code generators, so we don't need to write database/RPC clients over and over again.

OK, let's get our hands dirty. Assume there is a service called CRUD (a fantastic, self-explanatory name); how do we implement it?

First, we define an interface, CRUD, with four methods:
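The exact signatures aren't shown in the post; a plausible sketch, with a hypothetical Record model, could be:

    package crud

    // Record is a hypothetical data model; the real field set is not shown here.
    type Record struct {
        ID   int64
        Name string
    }

    // CRUD is the interface the rest of the service programs against, so the
    // concrete storage implementation can be swapped out in tests.
    type CRUD interface {
        Create(r *Record) error
        Read(offset, limit int) ([]Record, error)
        Update(r *Record) error
        Delete(id int64) error
    }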

And implement the interface:
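Again a hypothetical sketch, assuming a MySQL-backed implementation and a records table; error handling is trimmed to the essentials:

    package crud

    import "database/sql"

    // mysqlCRUD is a hypothetical MySQL-backed implementation of the CRUD interface.
    type mysqlCRUD struct {
        db *sql.DB
    }

    func NewMySQLCRUD(db *sql.DB) CRUD {
        return &mysqlCRUD{db: db}
    }

    func (m *mysqlCRUD) Create(r *Record) error {
        res, err := m.db.Exec("INSERT INTO records (name) VALUES (?)", r.Name)
        if err != nil {
            return err
        }
        r.ID, err = res.LastInsertId()
        return err
    }

    func (m *mysqlCRUD) Read(offset, limit int) ([]Record, error) {
        rows, err := m.db.Query(
            "SELECT id, name FROM records ORDER BY id DESC LIMIT ? OFFSET ?",
            limit, offset,
        )
        if err != nil {
            return nil, err
        }
        defer rows.Close()

        var records []Record
        for rows.Next() {
            var r Record
            if err := rows.Scan(&r.ID, &r.Name); err != nil {
                return nil, err
            }
            records = append(records, r)
        }
        return records, rows.Err()
    }

    func (m *mysqlCRUD) Update(r *Record) error {
        _, err := m.db.Exec("UPDATE records SET name = ? WHERE id = ?", r.Name, r.ID)
        return err
    }

    func (m *mysqlCRUD) Delete(id int64) error {
        _, err := m.db.Exec("DELETE FROM records WHERE id = ?", id)
        return err
    }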

After the CRUD service starts, we create a DI container and set CRUD as one of its attributes. So the next time we need to create a record, we do not invoke the method Create() directly; instead, we ask the container to do it for us:
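A hand-written container can be as small as a struct holding the interfaces plus a global accessor; the names below (Container, Init, C) are illustrative, not the original ones:

    package crud

    import "database/sql"

    // Container is a hand-written DI container; a real one would hold many more dependencies.
    type Container struct {
        CRUD CRUD
    }

    var c *Container

    // Init wires concrete implementations into the container at startup.
    func Init(db *sql.DB) {
        c = &Container{CRUD: NewMySQLCRUD(db)}
    }

    // C returns the global container; callers resolve dependencies through it
    // instead of invoking concrete methods directly.
    func C() *Container {
        return c
    }

A handler then calls C().CRUD.Create(&r) instead of reaching for the concrete type, so an integration test can wire a different implementation (or a mock) into the container before exercising the handler.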

Finally, with the help of testify/suite, we can set up our integration tests with fixtures in a well-structured way:
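A sketch of the suite scaffolding, assuming a hypothetical loadFixture helper that truncates the test database and re-wires the container:

    package crud

    import (
        "testing"

        "github.com/stretchr/testify/suite"
    )

    type CRUDSuite struct {
        suite.Suite
        crud CRUD // resolved from the container in SetupTest
    }

    // SetupTest loads the fixture before every test; loadFixture is a hypothetical
    // helper that truncates the tables and wires the container against a test database.
    func (s *CRUDSuite) SetupTest() {
        loadFixture(s.T())
        s.crud = C().CRUD
    }

    // TearDownTest re-runs the fixture so the next test starts from a clean state.
    func (s *CRUDSuite) TearDownTest() {
        loadFixture(s.T())
    }

    func TestCRUDSuite(t *testing.T) {
        suite.Run(t, new(CRUDSuite))
    }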

Notice that for the CRUD service, a single write/read test per method is not enough; a more sophisticated test suite would act like the following (a code sketch follows the list):

  1. Run fixture;
  2. Assert Read(0, 1) should return an empty list;
  3. Assert Create() should not return any error;
  4. Assert Read(0, 2) should return a list containing one Record;
  5. Assert Update(r) should not return any error;
  6. Assert Read(0, 2) should return a list containing one Record, and this Record equals the param r in (5);
  7. Assert Create() x 10 should not return any error;
  8. Assert Read(0, 6) should return the latest 6 records;
  9. Assert Read(6, 20) should return the remaining 5 records;
  10. Run fixture;
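Put together, the sequence above could be expressed as a single suite method; steps (1) and (10) are covered by SetupTest and TearDownTest, and the names reuse the hypothetical sketches above:

    func (s *CRUDSuite) TestLifecycle() {
        // 2. the fixture leaves the table empty
        records, err := s.crud.Read(0, 1)
        s.Require().NoError(err)
        s.Empty(records)

        // 3-4. create one record and read it back
        r := Record{Name: "first"}
        s.Require().NoError(s.crud.Create(&r))
        records, err = s.crud.Read(0, 2)
        s.Require().NoError(err)
        s.Require().Len(records, 1)

        // 5-6. update it and verify the change is visible
        r.Name = "renamed"
        s.Require().NoError(s.crud.Update(&r))
        records, err = s.crud.Read(0, 2)
        s.Require().NoError(err)
        s.Require().Len(records, 1)
        s.Equal(r, records[0])

        // 7. bulk create ten more records
        for i := 0; i < 10; i++ {
            s.Require().NoError(s.crud.Create(&Record{Name: "bulk"}))
        }

        // 8-9. pagination over the 11 records: the latest 6, then the remaining 5
        latest, err := s.crud.Read(0, 6)
        s.Require().NoError(err)
        s.Len(latest, 6)

        rest, err := s.crud.Read(6, 20)
        s.Require().NoError(err)
        s.Len(rest, 5)
    }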

And do not forget to mock the errors and the edge cases (or any other external dependencies):
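For instance, a hand-written mock that always fails can be swapped into the container to exercise the error paths; errCRUD and TestStorageFailure are illustrative names, not the originals:

    package crud

    import "errors"

    // errCRUD is a hand-written mock that always fails, used to exercise error
    // paths without touching a real database.
    type errCRUD struct {
        err error
    }

    func (e *errCRUD) Create(r *Record) error                   { return e.err }
    func (e *errCRUD) Read(offset, limit int) ([]Record, error) { return nil, e.err }
    func (e *errCRUD) Update(r *Record) error                   { return e.err }
    func (e *errCRUD) Delete(id int64) error                    { return e.err }

    func (s *CRUDSuite) TestStorageFailure() {
        // Swap the failing implementation into the container; any code that
        // resolves CRUD through C() now hits the error path.
        C().CRUD = &errCRUD{err: errors.New("connection refused")}

        _, err := C().CRUD.Read(0, 1)
        s.Error(err)
    }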

Conclusions

Code coverage generated from unit tests doesn't mean anything. If we really care about the quality of our software, we should design integration tests that cover all kinds of scenarios and make every one of them pass; that is the only way to make sure our code runs as expected.
