Introduction

This page was created after the DCAT Meeting in September 2015, where a working group was formed to prepare the test plans for the DSpace 6 Testathon.

Objectives

  • Create two plans, one for XMLUI, one for JSPUI
  • Create these plans as collaborative spreadsheets (Google Docs)
  • Group the different tests per user persona/access level
    • Anonymous - Non-authenticated, anonymous users
    • Submitter - Authenticated user with submission rights
    • Collection Admin - Authenticated user with collection administration rights
    • Admin - Authenticated user with admin rights
  • Create plans to be used on demo.dspace.org
    • As a secondary benefit, it would be great if these plans could also serve as a model for in-house testing of your own upgrades.
    • It should be very clear what the reference installation on demo.dspace.org entails

Out of scope

  • Specific work on DSpace documentation
    • This will be treated in parallel, related to each new feature for which documentation needs to be reviewed
  • Tests for the other DSpace webapps
    • OAI-PMH
    • REST
    • SWORD
    • LNI
  • Performance and Load tests

 

Official drafts

JSPUI Test plan http://bit.ly/jspui-test-plan

XMLUI Test plan http://bit.ly/xmlui-test-plan 

Different people's own test plans (examples)

Virginia Tech, VTechWorks - https://docs.google.com/spreadsheets/d/1_-20Y06C8dP2VzWxaSBbMnXgvuBOQoc3D0chJGRmyGg/edit?usp=sharing

  • LIKE
    • Bram: column with "expected results".
  • DISLIKE / NOT APPLICABLE
    • ...

Georgetown University Institutional Repository - https://docs.google.com/spreadsheets/d/1QkasjcwhhiAj4HXIDFDeX7TFEocwnqsZGnmDTHNmnCs/edit?usp=sharing

  • LIKE
    • Bram: Section column to group tests together
  • DISLIKE / NOT APPLICABLE
    • ...

 

Atmire - https://docs.google.com/spreadsheets/d/1rbgJk2_NkYWyE9tSIHquiLqN5PdcqRP0PCBHLnVrL_I/edit#gid=409937147

  • LIKE
    • Categorization of what the different types of feedback can be
  • DISLIKE/NOT APPLICABLE HERE
    • Columns for different servers
    • Mix of things that can be tested in the UI & technical things that can only be verified by an admin.

 

Edinburgh - https://wiki.duraspace.org/download/attachments/69834931/DataShare%20Testing%20Results%20Blank%20v4.xlsx?version=1&modificationDate=1441726810758&api=v2 

  • LIKE
  • DISLIKE / NOT APPLICABLE HERE

Mandatory fields (per test)

Ref - a short, unique reference that can be used in related JIRA tickets etc. Example: VIEW1

Category - an indication of the functional group a test belongs to. Example: VIEWING

Description - a step-by-step description of how the test should be executed

Expected outcome - the expected result after execution of the description

Test status - status of the last time this was tested. This column should be filled in using the following vocabulary:

  • 2014-09-12 : when a date is entered, the functionality was verified. 

  • TEST UNCLEAR: the description of the test doesn't make clear what to test or how a user can test it.

  • FEATURE MISSING: the described functionality couldn't be verified

  • UI PROBLEM: the feature is there and seems to "work", but there is a UI problem (nature of the problem clarified in comments)

  • NON COMPLIANCE: the test fails to produce the expected results

  • DOCUMENTATION: feature seems to work but docs seem to be missing or incomplete

If you run a test again at a later date, you can overwrite the test status that someone previously entered.

Last tester - name of the person who executed the last test. The main goal of including the person here is to make sure he or she can be contacted for further information about the test.

Comments - free-text field for additional comments. If the comments of the previous tester are not yet resolved, or are still relevant in another way, leave them in place and add your own comments in the same cell. When doing so, start your comment with your name, so people can see that a different person added it.

JIRA tickets - links to JIRA tickets that are related to this test. These links should never be removed, so we retain the entire backlog of what has changed in these tests.
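To illustrate, the mandatory fields above can be thought of as one record per spreadsheet row. The following is a minimal Python sketch, not part of any DSpace tooling: the field names and the status vocabulary mirror this page, while the class, helper method and example content are hypothetical.

```python
import re
from dataclasses import dataclass, field

# Controlled vocabulary for the "Test status" column (besides a verification date).
STATUS_VOCABULARY = {
    "TEST UNCLEAR",
    "FEATURE MISSING",
    "UI PROBLEM",
    "NON COMPLIANCE",
    "DOCUMENTATION",
}

# A plain date (e.g. 2014-09-12) means the functionality was verified on that date.
DATE_PATTERN = re.compile(r"^\d{4}-\d{2}-\d{2}$")

@dataclass
class TestCase:
    ref: str                 # short unique reference, e.g. VIEW1
    category: str            # functional group, e.g. VIEWING
    description: str         # step-by-step instructions
    expected_outcome: str    # expected result after executing the description
    test_status: str = ""    # a date or a term from STATUS_VOCABULARY
    last_tester: str = ""    # contact for further information about the test
    comments: str = ""       # free text; new commenters prepend their name
    jira_tickets: list = field(default_factory=list)  # never removed

    def status_is_valid(self) -> bool:
        """True if the status is a YYYY-MM-DD date or a vocabulary term."""
        return (bool(DATE_PATTERN.match(self.test_status))
                or self.test_status in STATUS_VOCABULARY)

# Example row (hypothetical content):
t = TestCase(
    ref="VIEW1",
    category="VIEWING",
    description="Browse to an item page as an anonymous user.",
    expected_outcome="Item metadata and bitstreams are displayed.",
    test_status="2014-09-12",
)
```

A sketch like this could, for instance, back a script that flags rows whose status column doesn't match the agreed vocabulary.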

Structure of the test spreadsheets

The spreadsheets are organized into different tabs. Each tab represents a "role" of a user who is able to execute all tests on that tab. Roles that need a higher level of authorization appear further to the right. This means a role on the right could theoretically also execute all of the tests on the tabs to its left (but we're not actively targeting that right now).

Requirements for demo.dspace.org installation

Integrations with external systems

The following integrations may not be enabled out of the box. They need to be configured to ensure they are "up and running" on demo.dspace.org.

  • Handle.net
  • DOI
  • ORCID
  • Google Analytics
  • Creative Commons (licensing)
  • Sherpa Romeo Lookup
  • LDAP or other custom authentication methods
    • Many people use this feature, but I can't see how we can enable it on demo.dspace.org without also putting up an entire directory infrastructure.
      Emilio: what about the LDAP directory "maintained" by Stewart Lewis? Someone would just need to define the same users on demo.dspace.org.

Specific credentials

  • Admin email inbox: specific testers should have access to the email inbox that SHOULD receive the repository feedback entries, the "Request a copy" requests, ...
  • Google Analytics account

Checklist for features that should be enabled

  • Workflow: Basic workflow or XML workflow? Only one of the two can be enabled.
  • PDF cover page?
  • Versioning
  • Collection strength should be shown
  • Legacy stats? (Bram: I'm not in favor of still showing these or recommending them to anyone, but some people still seem to use them)
  • Thumbnail generation (filter media execution and potential XPDF installation)
  • Outgoing email
  • i18n: ideally, all the language files we have should be enabled; this makes it clear which messages are wrong or missing in which files.
  • Disable weekly wipeouts for the duration of the testing so people don't have to start from scratch
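Some of the items above correspond to entries in the dspace.cfg configuration file. The excerpt below is purely illustrative; the mail server value is a placeholder, and property names should be double-checked against the configuration reference for the DSpace version actually deployed on demo.dspace.org.

```
# dspace.cfg excerpt (illustrative -- verify against the DSpace 6 configuration docs)
mail.server = smtp.example.org      # outgoing email; smtp.example.org is a placeholder
webui.strengths.show = true         # show collection/community strengths (item counts)
```

Thumbnail generation, by contrast, is typically handled by running the filter-media command-line tool on a schedule rather than by a single configuration property.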

DRAFT of our procedures that we want to add to the official testathon page

Unless specified otherwise, the credentials that should be used for the tests are the ones listed on the demo.dspace.org/xmlui and /jspui homepages. 

The reference browser for the tests is Google Chrome. However, browser compatibility issues found in IE, Firefox and Safari can also be reported.

Testathon page: DSpace Release 6.0 Testathon Page

DSpace 6 Testathon Role Managers

The following volunteers are looking after (co-ordinating) each of the roles, represented by tabs in the XMLUI test plan:

Anonymous user: Iryna Kuchma

Submitter: Iryna Kuchma

Collection Administrator: Bram Luyten (on holiday), Susan Borda

Repository Administrator: Pauline Ward (on holiday), Susan Borda

JSPUI specific Role Managers (tentative)

Anonymous user: Mariya Maistrovskaya, Sean Xiao Zhao

Submitter: Mariya Maistrovskaya, Sean Xiao Zhao

Collection Administrator: Mariya Maistrovskaya, Sean Xiao Zhao

Repository Administrator: Mariya Maistrovskaya, Sean Xiao Zhao

 


10 Comments

  1. Folks, I've had a look at your spreadsheet just now, and I can't see anywhere to record which browser the tester has used. In the past we have seen different behaviour in Internet Explorer on occasion, and things displaying a bit differently in Chrome or in Firefox. I think it would be worthwhile having provision in the process somewhere for ensuring that a variety of browsers are tested, at least for some functionality. For example, for our own testing procedure we like to test things affecting browsing and searching on Firefox, Chrome, IE and Safari. Our own university website's Google Analytics tells us that we do still have a significant proportion of our colleagues here i.e. our depositors (and therefore very likely our target academic users in other institutions) using each of these browsers. Whereas for admin functions we know in the team we mostly use Firefox so we can get away with just testing that.

  2. Great comment Pauline. I would suggest the following:

    I'll put this in the XMLUI sheet right now, and look forward to further feedback. 

     

  3. I've just gone through the test plan, added a couple of wee tests, added a comment which Bram Luyten (Atmire) has already responded to and taken up (thanks v much), and I don't think I have anything more to add. Very impressed, think this covers all the essentials, and of course we can't know every single thing we'll need to test till we actually see it (i.e. all the bells and whistles in the submission form - seems sensible to just try them out when we have the test repo). Thank you!

  4. If any of the four role manager roles is not taken up, I could find the time to take it up, but I'm afraid I am on annual leave for the whole week commencing Monday 28th of March, so if the Testathon overlaps that week, I won't be able to, sorry.

  5. Thanks for signing up Pauline Ward and Iryna Kuchma ! I'll try to take on the fourth role. Like Pauline, I have holidays planned first two weeks of May, so hopefully the testathon falls into the right weeks.

  6. Yes, I'm also out for whole of the first week of May. I've got some annual leave in the summer too, if we think it might slide that far (smile)

  7. Big thanks to Mariya Maistrovskaya and Sean Xiao Zhao for taking on the JSPUI Test plan!

    1. We've finished drafting the testing tasks Google sheet, anyone is welcome to take part in testing in any of the roles

  8. Thanks to Susan Borda for filling in as role manager for XMLUI Coll admin and Repo admin

    1. Great, thanks very much Susan Borda .