h1. Add Unit Testing to Dspace - Pere Villega


{excerpt}Dspace currently lacks unit testing, which harms the development of the platform and makes it easier to reintroduce past bugs while developing. This project is a proposal to add a set of unit test classes to Dspace, based on JUnit, plus some tools that detect issues in the code so we can improve its quality. - Pere Villega{excerpt}


h2. Proposal

My proposal is to do the following:

h3. Integrate SONAR and Dspace

h4. What to do

SONAR is an open source quality management platform dedicated to continuously analysing and measuring source code quality. Since it is an external application, my work would consist of deploying it on a server provided by Duraspace and setting up the corresponding JIRA integration. This would give us a picture of the code coverage and highlight other issues.

h4. Reason

Maven reports don't work well in projects with several subprojects like Dspace. Some addons can't aggregate the reports properly, which makes it harder to obtain this valuable information.

h3. Improve code quality

h4. What to do

This has 2 components:

* Refactor the code to facilitate unit testing, starting with the Dspace API and moving to other components later on.
* Fix issues detected by CPD, FindBugs and other tools.

h4. Reason

About the refactoring: if you run the Dspace code through the Testability Explorer you'll notice it raises several warnings. There are classes like Context that make testing hard. Mocks can be used and there are ways to work around the problem, but all the effort put into that will be lost in the long term if the code is refactored to follow the Law of Demeter. Also, refactoring would improve average code quality, making it easier for developers to enhance Dspace.

Refactoring should solve most of the issues detected by SONAR, but there may be other problems, like duplicated code or possible sources of bugs. When possible these issues should be tackled, as fixing them will make the code more stable. This point has lower priority, as its benefits to unit testing will be minor.
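
As a rough sketch of the mocking route mentioned above, a JMock-based unit test could isolate a class from its collaborators as shown below. The class names and the interface are invented for illustration; they are not part of the current Dspace API.

{code:java}
import static org.junit.Assert.assertEquals;

import org.jmock.Expectations;
import org.jmock.Mockery;
import org.junit.Test;

public class ItemTitleFormatterTest {

    // JMock context that creates the mock objects and verifies the expectations
    private final Mockery mockery = new Mockery();

    // Hypothetical collaborator: in real Dspace code this role is played by
    // classes such as Context, which would first need an interface (or a
    // class imposteriser) before JMock could mock it
    public interface MetadataSource {
        String getTitle(int itemId);
    }

    // Hypothetical class under test: formats a title fetched from the source
    public static class ItemTitleFormatter {
        private final MetadataSource source;

        public ItemTitleFormatter(MetadataSource source) {
            this.source = source;
        }

        public String format(int itemId) {
            return source.getTitle(itemId).trim().toUpperCase();
        }
    }

    @Test
    public void formatsTheTitleReturnedByTheSource() {
        final MetadataSource source = mockery.mock(MetadataSource.class);

        mockery.checking(new Expectations() {{
            oneOf(source).getTitle(42);
            will(returnValue("  my thesis "));
        }});

        assertEquals("MY THESIS", new ItemTitleFormatter(source).format(42));
        mockery.assertIsSatisfied();
    }
}
{code}

Note that JMock mocks interfaces out of the box; mocking a concrete class like Context would require a class imposteriser or extracting an interface first, which is exactly where the refactoring would help.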

h3. Unit tests

h4. What to do

Generate a set of unit tests for Dspace, starting with the Dspace API and moving to other components later on.

h4. Reason

The lack of unit testing makes it easier for Dspace developers to reintroduce old bugs when making changes to the code. It also makes it harder to ensure customisations done by users keep working and increases the difficulty of upgrading to new versions of the platform.

Also, following the suggestion of Graham Triggs, it will allow the usage of ContiPerf ([http://databene.org/contiperf/]) to add performance testing to the modules.

h2. Work for GSOC

The main aim is to apply this (all 3 points) to the API core. Once done, work would continue in the remaining time with Manakin and then other subprojects (Sword, LNI, OAI). JSPUI will be left for last, as it is an interface that will disappear in the future.

The project plan will depend on the decision about refactoring the code or simply producing unit tests for it. Deployment of SONAR should be straightforward and is not mandatory (although recommended).

h2. Workplan

The following has to be decided:

* Scope of the project
* Work to be delivered by mid-term (mid July)
* Work to be delivered by the end of GSOC (9th August)
* Documentation to provide
* Weekly reports to follow progress

Important aims are:

* to get some general agreement amongst the community about which testing frameworks we concentrate on, and how we get those tests to run automatically
* to provide systems and framework implementations around the unit tests that facilitate the task, motivating the community to contribute to the effort

h2. On Testing

Due to the extension of the subject, a new page has been created with more details. Please check [Testing|Testing] for more information on unit testing.

h2. Existing approaches

So far there have been some approaches to the problem. As far as we know, the following:
* Elliot Metsger created some unit testing framework as shown at [http://presentations.dlpe.gatech.edu/proed/or09/or09_052009_4/index.html]
* Scott Phillips did a set of functional tests for DSpace 1.5 as seen at [http://scott.phillips.name/2010/04/dspace-functional-tests/]
* Gareth Waller has built a set of Selenium tests as commented recently in the developer's list


h2. Tools

Given the existing approaches, it seems the tools to consider are:
|| Tools || Type || Notes ||
| JUnit + JMock | Unit Testing | Dspace is tightly integrated with the database (see notes above), which complicates the task of creating a unit test |
| JUnit + HTMLUnit | Functional | Uses an embedded webserver to run Dspace and runs the tests, using Ant, against this instance |
| Selenium | Functional | Can be run against any running Dspace instance and using several browsers |
| JUnit + ContiPerf | Unit Testing + Performance Testing | Suffers from the same issues as other unit tests: tight integration with the database |

As we see, we have two main approaches:

* Unit testing: automated, but suffers from the tight integration with the database. Refactoring the code would be advisable, but it can be avoided by using mocks. Tests at a low level, just ensuring a class is correct, not checking the whole functionality. Can allow performance testing via ContiPerf.
* Functional testing: doesn't suffer from the database integration. Ensures the functionality works, but it may not account for certain situations that would raise errors due to code bugs.

Both would benefit DSpace, as unit testing would ensure code quality while functional testing would ensure the application behaves as expected with standard usage.
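
For illustration, a functional test along the lines of the JUnit + HTMLUnit row could look like the sketch below. The URL and the expected title are placeholders for a locally running Dspace instance, not values defined anywhere in this proposal.

{code:java}
import static org.junit.Assert.assertTrue;

import com.gargoylesoftware.htmlunit.WebClient;
import com.gargoylesoftware.htmlunit.html.HtmlPage;
import org.junit.Test;

public class HomePageFunctionalTest {

    // Placeholder: assumes a Dspace instance is already running locally
    private static final String DSPACE_HOME = "http://localhost:8080/xmlui/";

    @Test
    public void homePageLoadsAndMentionsDSpace() throws Exception {
        WebClient webClient = new WebClient();
        try {
            // Fetch the home page as a browser would and check its title
            HtmlPage page = webClient.getPage(DSPACE_HOME);
            assertTrue("Unexpected page title: " + page.getTitleText(),
                       page.getTitleText().contains("DSpace"));
        } finally {
            webClient.closeAllWindows();
        }
    }
}
{code}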

h2. Proposals

The proposals for the project are:
* Create a framework for Functional Tests, Integration Tests, Unit Tests and Performance Tests that run automatically on build (when possible)
** I will create all the background required (mockups, database connections, configs, etc.) for the tests to be run
** I will integrate Cobertura to create reports on the testing coverage
* Integrate with a Continuous Integration server
* (Optional but recommended) Integrate with a code quality management tool

h2. Scope

The main scope of the project is to apply the proposals to the API core. Once done, work would be extended in the remaining time to Manakin and then other subprojects (Sword, LNI, OAI). JSPUI will be left for last, as it is an interface that will disappear in the future.

h2. Considerations

DSpace code suffers from testability issues. In the long term it might be advisable to refactor it. Refactoring is dangerous right now, as we don't have any tests and we risk introducing new bugs into the code base, so it should be avoided until we have enough tests working that we are confident we can take that road.

h2. Project Description

At first three basic objects will be created: an Abstract Functional Test, an Abstract Integration Test and, if needed, an Abstract Unit Test. These basic abstract tests would be able to perform the basic setup and teardown operations to provide the resources needed: database, etc. The structure will include an embedded webserver, database and temporary file system.
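
A minimal sketch of what such an abstract base class could look like, assuming an in-memory HyperSQL database; the driver name and the table created here are placeholders and not the real Dspace schema:

{code:java}
import java.sql.Connection;
import java.sql.DriverManager;
import java.sql.Statement;

import org.junit.After;
import org.junit.Before;

/**
 * Possible shape of the abstract integration test: every test method gets a
 * fresh in-memory HyperSQL database, created in setUp and dropped in tearDown.
 */
public abstract class AbstractIntegrationTest {

    protected Connection connection;

    @Before
    public void setUpDatabase() throws Exception {
        // Load the HSQLDB driver (not needed on JDBC 4 capable setups)
        Class.forName("org.hsqldb.jdbcDriver");
        // jdbc:hsqldb:mem creates a private in-memory database for this run
        connection = DriverManager.getConnection("jdbc:hsqldb:mem:testdb", "sa", "");
        Statement statement = connection.createStatement();
        // Placeholder table; real tests would load the Dspace schema instead
        statement.execute("CREATE TABLE item (item_id INTEGER PRIMARY KEY, title VARCHAR(255))");
        statement.close();
    }

    @After
    public void tearDownDatabase() throws Exception {
        Statement statement = connection.createStatement();
        statement.execute("SHUTDOWN"); // drops the in-memory database
        statement.close();
        connection.close();
    }
}
{code}

A concrete integration test would then simply extend this class and use the inherited connection; the Abstract Functional Test would do the equivalent with the embedded webserver.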

Performance Testing will be achieved through the use of ContiPerf, a tool that reuses JUnit tests as performance tests.
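
As an illustration of that reuse, a plain JUnit test can be turned into a performance test by adding the ContiPerf rule and annotations; the method under test and the thresholds below are arbitrary examples:

{code:java}
import static org.junit.Assert.assertTrue;

import org.databene.contiperf.PerfTest;
import org.databene.contiperf.Required;
import org.databene.contiperf.junit.ContiPerfRule;
import org.junit.Rule;
import org.junit.Test;

public class HandleFormatPerformanceTest {

    // The rule is what turns the annotated JUnit tests into performance tests
    @Rule
    public ContiPerfRule contiPerfRule = new ContiPerfRule();

    // Placeholder for the code under test; a real test would call Dspace code
    private boolean looksLikeHandle(String value) {
        return value != null && value.matches("\\d+/\\d+");
    }

    @Test
    @PerfTest(invocations = 1000, threads = 10) // run 1000 times on 10 threads
    @Required(max = 50, average = 5)            // limits in milliseconds
    public void handleCheckIsFast() {
        assertTrue(looksLikeHandle("123456789/42"));
    }
}
{code}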

h2. Project Plan

A work plan has to be established that will outline:
* Work to be done by the mid-term evaluation
* Work to be done by the end of GSOC
* Documentation to be generated

h2. Tools

* The Continuous Integration server will be Hudson.
** SONAR ([http://www.sonarsource.org/]) will be the code quality management tool
* All tests will be run automatically by Maven
** The embedded webserver will be Jetty
** The embedded database will be HyperSQL ([http://hsqldb.org/])
* Functional Tests will use Selenium (see the sketch after this list)
* Unit Tests and Integration Tests will use JUnit
** JMock will be used as the mocking framework
** HTMLUnit may be used if required
* Performance Tests will use ContiPerf
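
As a sketch of what one of those Selenium Functional Tests could look like, assuming the WebDriver API and a locally running instance (the URL and the expected title are placeholders):

{code:java}
import static org.junit.Assert.assertTrue;

import org.junit.After;
import org.junit.Before;
import org.junit.Test;
import org.openqa.selenium.WebDriver;
import org.openqa.selenium.firefox.FirefoxDriver;

public class HomePageSeleniumTest {

    // Placeholder: adjust to the Dspace deployment under test
    private static final String DSPACE_HOME = "http://localhost:8080/xmlui/";

    private WebDriver driver;

    @Before
    public void openBrowser() {
        driver = new FirefoxDriver(); // any WebDriver-supported browser would do
    }

    @Test
    public void homePageTitleMentionsDSpace() {
        driver.get(DSPACE_HOME);
        assertTrue("Unexpected title: " + driver.getTitle(),
                   driver.getTitle().contains("DSpace"));
    }

    @After
    public void closeBrowser() {
        driver.quit();
    }
}
{code}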

h2. Thanks

This page has been created with help from Stuart Lewis, Scott Phillips and Gareth Waller. I want to thank them all for their comments. Some information has been taken from Wikipedia to make the text more complete. I'm to blame for errors in the text.

Feel free to contribute to this page\!