Add Unit Testing to DSpace - Pere Villega
DSpace currently lacks unit testing, which harms the development of the platform and makes it easier to reintroduce past bugs while developing. This project is a proposal to add a set of unit test classes to DSpace, based on JUnit, plus some tools that detect issues in the code so we can improve its quality.
On Testing
Due to the breadth of the subject, a separate page has been created with more details. Please check [Testing] for more information on unit testing.
Existing approaches
So far there have been several approaches to the problem. As far as we know, these are:
- Elliot Metsger created a unit testing framework, as shown at http://presentations.dlpe.gatech.edu/proed/or09/or09_052009_4/index.html
- Scott Phillips wrote a set of functional tests for DSpace 1.5, as seen at http://scott.phillips.name/2010/04/dspace-functional-tests/
- Gareth Waller has built a set of Selenium tests, as recently discussed on the developers' list
- Aaron Zeckoski's effort added JUnit-based testing into DSpace Services
The following table compares these and some other approaches to the problem:
Tools | Type | Notes
---|---|---
JUnit + JMock | Unit Testing | DSpace is tightly integrated with the database (see notes above), which complicates the task of creating unit tests
JUnit + HTMLUnit | Functional | Uses an embedded webserver to run DSpace and runs the tests, using Ant, against this instance
Selenium | Functional | Can be run against any running DSpace instance and using several browsers
JUnit + ContiPerf | Unit Testing + Performance Testing | Suffers from the same issue as other unit tests: tight integration with the database
As we can see, there are two main approaches:
- Unit testing: automated, but suffers from the tight integration with the database. Refactoring the code would be advisable, although it can be avoided by using mocks. It tests at a low level, ensuring a single class is correct rather than checking whole features, and can also enable performance testing via ContiPerf.
- Functional testing: doesn't suffer from the database integration. It ensures the functionality works, but it may not cover certain situations that would trigger errors caused by code bugs.
Both would benefit DSpace: unit testing would ensure code quality, while functional testing would ensure the application behaves as expected under standard usage.
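To make the unit-testing approach concrete, here is a minimal sketch of the decoupling that mocks provide. The `ItemDao`, `InMemoryItemDao` and `ItemService` names are hypothetical, not part of the DSpace API: a hand-written in-memory stub stands in for the database-backed implementation, which is the same substitution a framework like JMock automates.

```java
import java.util.HashMap;
import java.util.Map;

// Hypothetical data-access interface; in DSpace this role is played
// by the classes that talk directly to the database.
interface ItemDao {
    void saveTitle(int itemId, String title);
    String findTitle(int itemId);
}

// Hand-written in-memory stub: replaces the real database so the
// logic under test runs in isolation. JMock can generate this kind
// of stand-in automatically from the interface.
class InMemoryItemDao implements ItemDao {
    private final Map<Integer, String> titles = new HashMap<Integer, String>();
    public void saveTitle(int itemId, String title) { titles.put(itemId, title); }
    public String findTitle(int itemId) { return titles.get(itemId); }
}

// The logic under test: trims whitespace before storing a title.
class ItemService {
    private final ItemDao dao;
    ItemService(ItemDao dao) { this.dao = dao; }
    void setTitle(int itemId, String title) { dao.saveTitle(itemId, title.trim()); }
}

public class ItemServiceTest {
    public static void main(String[] args) {
        ItemDao dao = new InMemoryItemDao();
        ItemService service = new ItemService(dao);
        service.setTitle(1, "  A Sample Title  ");
        if (!"A Sample Title".equals(dao.findTitle(1))) {
            throw new AssertionError("title was not trimmed");
        }
        System.out.println("ok");
    }
}
```

With JMock the stub would instead be generated from the interface and expectations declared inside the test, but the isolation principle is identical.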
Proposals
The proposals for the project are:
- Create a framework for Functional Tests, Integration Tests, Unit Tests and Performance Tests that run automatically on build (when possible)
- I will create all the background required (mocks, database connections, configuration, etc.) for the tests to run
- I will integrate Cobertura to create reports on the testing coverage
- Integrate with a Continuous Integration server
- (Optional but recommended) Integrate with a code quality management tool
Scope
The main scope of the project is to apply the proposals to the API core. Once that is done, work will be extended in the remaining time to Manakin and then to the other subprojects (SWORD, LNI, OAI). JSPUI will be left for last, as it is an interface that will disappear in the future.
Considerations
DSpace code suffers from testability issues, and in the long term it might be advisable to refactor it. Refactoring is dangerous right now, as we don't have any tests and we risk introducing new bugs into the code base, so it should be avoided until we have enough tests working to make us confident we can take that road.
Project Description
At first, three base classes will be created: an Abstract Functional Test, an Abstract Integration Test and, if needed, an Abstract Unit Test. These abstract tests will perform the basic setup and teardown operations to obtain the resources needed (database, etc.). The structure will include an embedded webserver, an embedded database and a temporary file system.
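A rough sketch of what one of these base classes might look like, using JUnit 4 annotations. The class name and the resource-handling methods below are illustrative placeholders, not existing DSpace code; concrete subclasses would wire them to the actual embedded database and temporary file system.

```java
import org.junit.After;
import org.junit.Before;

// Illustrative skeleton of the planned base class. The setup runs
// before every test method and the teardown after it, so each test
// starts from a clean environment.
public abstract class AbstractIntegrationTest {

    @Before
    public void setUp() throws Exception {
        // Start the shared fixtures every integration test needs.
        startEmbeddedDatabase();
        createTemporaryFileSystem();
    }

    @After
    public void tearDown() throws Exception {
        // Release resources so tests stay independent of each other.
        removeTemporaryFileSystem();
        stopEmbeddedDatabase();
    }

    // Concrete subclasses (or a shared helper) provide the actual
    // resource management; these names are assumptions for the sketch.
    protected abstract void startEmbeddedDatabase() throws Exception;
    protected abstract void stopEmbeddedDatabase() throws Exception;
    protected abstract void createTemporaryFileSystem() throws Exception;
    protected abstract void removeTemporaryFileSystem() throws Exception;
}
```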
Performance Testing will be achieved through the use of ContiPerf, a tool that reuses JUnit tests as performance tests.
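As a sketch of how ContiPerf piggybacks on JUnit: it contributes a JUnit rule that repeats the annotated test methods and checks the measured times against declared requirements. The test class, method body and timing thresholds below are placeholders chosen for illustration.

```java
import org.databene.contiperf.PerfTest;
import org.databene.contiperf.Required;
import org.databene.contiperf.junit.ContiPerfRule;
import org.junit.Rule;
import org.junit.Test;

public class SamplePerfTest {

    // ContiPerf hooks into JUnit through this rule and repeats the
    // annotated tests, gathering timing statistics for each one.
    @Rule
    public ContiPerfRule rule = new ContiPerfRule();

    @Test
    @PerfTest(invocations = 1000, threads = 10)  // run 1000 times on 10 threads
    @Required(max = 1200, average = 250)         // fail if too slow (milliseconds)
    public void retrieveItem() {
        // Body of an ordinary JUnit test; with the annotations above
        // it doubles as a performance test.
    }
}
```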
Project Plan
A work plan has to be established that will outline:
- Work to be done by mid-term evaluation
- Work to be done by end of GSOC
- Documentation to be generated
Draft
The following draft outlines a proposal for the project plan. It has to be approved in a meeting with the community:
- 24th May - 30th May: decide project plan.
- 31st May - 20th June: test scaffolding for Maven that runs JUnit, JMock, HyperSQL and Jetty for the tests. Create the 3 base classes: an Abstract Functional Test, an Abstract Integration Test and an Abstract Unit Test. Optionally there might be integration with Hudson/SONAR.
- 14th June - 11th July: Add Contiperf to the set for performance testing. Start generating unit tests for DSpace API.
- 12th July (Mid Term Evaluation): by this day a number of selected packages from the DSpace API should have tests.
- 12th July - 8th August: Integrate Selenium into the scaffolding. Prepare tests for selected XMLUI functionalities.
- 9th August: by this day we should have functional tests on selected UI functionalities.
- 9th August - 16th August (Final Evaluation): code fixes and documentation related to the project
Tools
- The Continuous Integration server will be Hudson.
- SONAR [http://www.sonarsource.org/ ] will be the code quality management tool
- All tests will be run automatically by Maven
- Embedded webserver will be Jetty
- Embedded database will be HyperSQL [http://hsqldb.org/ ]
- Functional Tests will use Selenium
- Unit Tests and Integration Tests will use JUnit
- JMock will be used as mocking framework
- HTMLUnit may be used if required
- Performance Tests will use ContiPerf
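As an illustration of how a Selenium functional test against a running instance might look, here is a minimal sketch using the WebDriver API. The URL and the page-title check are placeholder assumptions for a locally deployed test server, not part of any existing DSpace test suite.

```java
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class HomePageFunctionalTest {
    public static void main(String[] args) {
        // Drive a real browser against a running DSpace instance;
        // the URL below is a placeholder for the local test server.
        WebDriver driver = new FirefoxDriver();
        try {
            driver.get("http://localhost:8080/xmlui");
            if (!driver.getTitle().contains("DSpace")) {
                throw new AssertionError("unexpected page title: " + driver.getTitle());
            }
        } finally {
            // Always close the browser, even when the check fails.
            driver.quit();
        }
    }
}
```

Because this exercises the full stack through the UI, it complements rather than replaces the unit tests: it confirms end-to-end behavior but cannot pinpoint which class caused a failure.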
Thanks
This page has been created with help from Stuart Lewis, Scott Phillips and Gareth Waller. I want to thank them all for their comments. Some information has been taken from Wikipedia to make the text more complete. I'm to blame for errors in the text.
Feel free to contribute to this page!