Do you even QA, bro?
A detailed look at FreeNAS QA Efforts and How They’ve Changed Over Time
In the seven years since iXsystems adopted the FreeNAS storage operating system, we have worked hard to strike a balance between Quality Assurance (QA) provided through the FreeNAS user community and our internal specialized QA focused on our TrueNAS enterprise-grade hardware/software solutions. At one extreme, we have a community of hundreds of thousands of DIYers around the world who come up with hardware and configurations we would never imagine ourselves, which gives us broader QA coverage than any storage company on the planet. At the other end of the spectrum, we have both manual and automated High Availability stress tests that guarantee that TrueNAS storage arrays are ready for any customer’s use case or workload. The balance we’ve struck has worked quite well overall, but from time to time, issues of varying severity would still slip through the cracks in new FreeNAS releases. So, we’ve spent the better part of the past year building a continuous integration and QA process for the FreeNAS 9.x and 11.x releases in order to improve the overall quality of FreeNAS releases by catching as many issues as possible before they reach the community, as well as to shorten the FreeNAS -> TrueNAS release cycle.
Before joining iXsystems, I performed a “feat of strength” by writing a test suite using the available FreeNAS/TrueNAS REST API documentation. At that time, the VP of Engineering for FreeNAS/TrueNAS 9.x (Kris Moore) was just getting automated QA off the ground for FreeNAS. He was looking for someone to build upon the tests he had started and to extend the coverage by writing the many additional tests he didn’t have time to implement himself. His framework consisted of several shell scripts that sent commands to a FreeNAS or TrueNAS system through the REST API to do things such as create a user, provision storage, set up an NFS share, and mount the share. When the tests were executed, the framework collected the results in XML format and published them to Jenkins, the open source continuous integration software we use here at iXsystems.
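The XML publishing step works because Jenkins ingests JUnit-format result files out of the box. As a rough illustration (the element and attribute names here are my assumptions, not the framework’s actual schema), collecting pass/fail results into that format can be sketched in Python:

```python
# Illustrative sketch: turn a list of pass/fail results into JUnit-style XML,
# the format Jenkins consumes. Names are assumptions, not the real schema.
import xml.etree.ElementTree as ET

def results_to_junit_xml(suite_name, results):
    """results: iterable of (test_name, passed, message) tuples."""
    results = list(results)
    failures = sum(1 for _, ok, _ in results if not ok)
    suite = ET.Element("testsuite", name=suite_name,
                       tests=str(len(results)), failures=str(failures))
    for name, ok, message in results:
        case = ET.SubElement(suite, "testcase", name=name)
        if not ok:
            # Failed cases carry a <failure> child with a short diagnostic.
            ET.SubElement(case, "failure", message=message)
    return ET.tostring(suite, encoding="unicode")

# Example: results_to_junit_xml("nfs", [("mount_share", True, "")])
```

Jenkins then renders per-suite pass/fail counts and trends from files like this without any custom plugin work.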
Here are examples of my pull request and what the test directory looked like at that point:
You can also find the publicly available documentation for the FreeNAS/TrueNAS REST API at http://api.freenas.org/
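To give a concrete flavor of such an API-driven test step, here is a minimal stdlib-only Python sketch of the “create a user” call against the v1.0 REST API. The host, credentials, and payload field names are assumptions drawn from the public API docs linked above; verify them against your FreeNAS version before relying on this.

```python
# Hedged sketch of a FreeNAS 9.x-style REST API call. Host, credentials,
# and payload fields are assumptions based on the public docs at
# http://api.freenas.org/ -- check them against your version before use.
import base64
import json
import urllib.request

BASE_URL = "http://freenas.local/api/v1.0"  # hypothetical test system
USER, PASSWD = "root", "testing"            # hypothetical credentials

def api_url(resource):
    """Build a full endpoint URL; the original shell scripts did this with curl."""
    return "{}/{}/".format(BASE_URL, resource.strip("/"))

def create_user(username, full_name, password):
    """POST a new user, mirroring the 'create a user' step mentioned above."""
    body = json.dumps({
        "bsdusr_username": username,      # field names per the v1.0 docs
        "bsdusr_full_name": full_name,
        "bsdusr_password": password,
        "bsdusr_creategroup": True,
    }).encode()
    req = urllib.request.Request(api_url("account/users"), data=body, method="POST")
    req.add_header("Content-Type", "application/json")
    token = base64.b64encode("{}:{}".format(USER, PASSWD).encode()).decode()
    req.add_header("Authorization", "Basic " + token)
    with urllib.request.urlopen(req) as resp:   # a 201 response means success
        return json.load(resp)

# Usage against a reachable test box: create_user("qauser", "QA User", "secret")
```

A test then asserts on the HTTP status and response body, which is essentially what the early shell scripts did with curl.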
Here is what the test suite looked like, by the numbers, in its original form:
- 5 AFP tests
- 3 Boot environment tests
- 1 Debugging test
- 1 DynDns test
- 1 Email test
- 3 FTP tests
- 1 iSCSI test
- 6 Jail tests
- 12 NFS tests
- 4 RSYNC tests
- 14 CIFS tests
- 2 SSH tests
- 7 Storage tests
- 3 User tests
The initial suite of 63 tests clearly left room for improvement. I realized that only one or two of the NFS/CIFS tests actually ran client tests to verify that a share could successfully mount and be written to. During my first 8 months at iXsystems, I expanded the framework and brought the test count to 358. This included adding support for testing all services with a number of clients, including integration with popular directory services like AD and LDAP.
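The client-side checks added in that expansion boil down to “mount the share, write through it, read it back.” Here is a hedged Python sketch of that idea; the server address, export path, and mount point are hypothetical, and the real suite drives this from dedicated client machines:

```python
# Sketch of a client-side NFS share check: build the mount command, then
# write a file through the mounted path and read it back. The server,
# export, and mount point shown are hypothetical.
import os

def nfs_mount_cmd(server, export, mountpoint):
    """Command a test client would run to mount the share (Linux/FreeBSD style)."""
    return ["mount", "-t", "nfs", "{}:{}".format(server, export), mountpoint]

def verify_read_write(directory, payload=b"freenas-qa-probe"):
    """Write a file inside the (mounted) directory and confirm it reads back."""
    path = os.path.join(directory, "qa_probe.txt")
    with open(path, "wb") as f:
        f.write(payload)
    with open(path, "rb") as f:
        ok = f.read() == payload
    os.remove(path)
    return ok

# On a root-privileged client with the server reachable, the flow would be:
#   subprocess.run(nfs_mount_cmd("freenas.local", "/mnt/tank/share", "/mnt/nfs"),
#                  check=True)
#   assert verify_read_write("/mnt/nfs")
```

An end-to-end pass here exercises the server’s export configuration, permissions, and the client protocol path at once, which is what the original one or two mount tests were missing for most services.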
Today, when an engineer commits to a repository, it triggers an incremental build of the software they have modified and runs this series of tests against our nightly builds, which are publicly available at http://download.freenas.org/11/MASTER/. This allows the QA team to verify whether the latest build installs cleanly and whether it upgrades an existing system successfully. If the new build installs successfully, we then initiate the test suite to determine which areas of functionality may be impacted by each commit.
As of this writing, we automate 541 tests, including the ability to test from remote clients such as Windows and macOS. These tests catch many issues long before they reach the STABLE development branches on which we base our FreeNAS releases and updates. For simple issues, we notify the developer responsible for the triggering commit. For more complex issues, the QA team becomes closely involved with a number of developers, both internal and those from upstream software projects, to determine the root cause and best fix. More often than not, we have a good match between the size of the issue detected and the effort needed to remedy it, and fixes often reach active development before the next nightly build.
We integrate or use several Open Source projects in our testing processes. These include the AngularJS user interface framework used in the new FreeNAS UI, the Protractor test framework for AngularJS, and the Selenium browser automation tool. We have reported a number of the issues we found to the maintainers of these projects, as well as issues we found in the Python scripting language at the heart of FreeNAS.
In addition to automation, QA has grown in other areas such as scenario testing. Going beyond automated coverage of basic functionality, we extended our manual testing to cover far more complex configuration scenarios. Until recently, we only performed this hands-on testing with our TrueNAS enterprise arrays, but we have since extended it to FreeNAS releases as well. As a result, we tailor a targeted checklist of what to test based on the commits that went in between major FreeNAS releases.
Since we are an Open Source company, our test framework is open source as well. We welcome suggestions for additional tests and your code contributions. Testing is a great way to get familiar with the FreeNAS development process, and who knows, you might even want to join the team as I did!
The FreeNAS REST API Tests:
The FreeNAS UI Tests:
The test suite repo where you can report issues and make suggestions:
Joe Maloney, QA Supervisor, iXsystems