Date & Time

  • September 8 15:00 UTC/GMT - 11:00 EDT

Dial-in

We will use the international conference call dial-in. Please follow directions below.

  • U.S.A/Canada toll free: 866-740-1260, participant code: 2257295
  • International toll free: http://www.readytalk.com/intl
    • Use the link above, then input 2257295 and the country you are calling from to get your country's toll-free dial-in number
    • Once on the call, enter participant code 2257295

Agenda

DSpace 6 Testathon preparation: the ultimate repository manager test plan

Update on the UI Working Group

Is it desirable to have WYSIWYG editing of metadata fields? Is it desirable to have rich text in metadata fields at all? (XMLUI already supports MathJax notation for formulae in metadata fields.)
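For example, assuming MathJax's default inline delimiters \( ... \), a metadata value such as a dc.description.abstract containing

    We derive the mass-energy relation \( E = mc^2 \) from first principles.

would be rendered in XMLUI with the formula typeset. (The field choice and the sentence are illustrative only.)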


Preparing for the call

Review the DSpace Release 6.0 Status page and the UI Working Group page

Gather existing DSpace testing scripts

Meeting notes

We discussed the opportunity for DCAT to create a standardized testing script for the DSpace 6 Testathon. This script would be used by administrators testing the DSpace 6 release candidate, so it focuses on parts that can be examined visually and are therefore part of the user interface. As DSpace 6 will not yet have a unified UI, we will have to create two testing scripts: one for JSPUI and one for XMLUI.

In the long term, this testing script should become even more standardized so that it can be reused for testing future DSpace versions.

At the moment, several DSpace repository managers use their own testing scripts for internal testing. As these could contain many useful perspectives for a standardized testing script, they are a good starting point. If you have your own testing script that you are willing to share, please add a link to it in the comments section below.

Scripts

We will have to use two parallel documents: one for XMLUI and one for JSPUI. It would be beneficial if these documents contained scripts for different personas. For example, we could create a spreadsheet for each UI with separate tabs for the personas of DSpace admin, community admin, collection admin, and DSpace submitter. Each tab would contain tests tailored to that persona, as sketched below.
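As a sketch, the structure might look like this (the document names are illustrative only):

    XMLUI test script (spreadsheet)
        Tab 1: DSpace admin
        Tab 2: Community admin
        Tab 3: Collection admin
        Tab 4: DSpace submitter

    JSPUI test script (spreadsheet)
        (the same four persona tabs)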

Tools

There are many tools available for collaborating on the creation of our testing script. Some people mentioned that Google Docs and Google Sheets are useful. Another way of dealing with multiple contributors could be to create a wiki page.

To coordinate the creation of the testing script, we will work on a wiki page. We will collaborate on this page until October 13th, when a first draft of the script is needed for discussion during that day's DCAT meeting. After the discussion we have until December 1st to work our remarks into a final version, which will be used during the Testathon from December 1st to 11th.

When the final version is ready, a separate wiki page will be created linking to the test script and providing additional information aimed at repository managers willing to perform the tests.

The script itself could be published as a Google spreadsheet or equivalent. Although the format is currently undecided, it would be beneficial to add a column for JIRA tickets. This column should not be mandatory, however. Testers who feel confident that a certain issue deserves a JIRA ticket could create one and link to it in the column; others could note down their comments in this column. Afterwards, these remarks should be reviewed by another tester, who could still decide to create a JIRA ticket. A possible row layout is sketched below.
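A hypothetical example of what rows in such a script could look like (the column names, tests, and ticket number are illustrative, not decided):

    Test step                      | Expected result          | Result | JIRA ticket / comment
    Log in as collection admin     | Admin options are shown  | Pass   |
    Edit an item's metadata        | Changes are saved        | Fail   | DS-0000 (hypothetical ticket)
    Submit an item without a title | Validation error appears | Fail   | "error message unclear" (for reviewer)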

Testing environment

In the past, testers were encouraged to install a release candidate locally and perform the tests on their own DSpace instance. To standardize not only the tests themselves but also the environments on which they are performed, the tests should be done on the demo.dspace.org environment instead of locally.

Coordinators

The creation of the testing plan will be coordinated by Amanda French, Kate Dohe, and Bram Luyten (Atmire). Others willing to join the task force can do so at any time.

UI working group update

The UI working group is putting out a call for designers to come up with a new user interface design. The main idea behind this design is for admins to be able to configure more in the user interface instead of in the configuration files. The combination of future DSpace versions and the new UI should also make the whole more modular.

Call Attendees


Comments

  1. Previous (older) testathons had pretty long lists of features that could be tested; for example:

    DSpace Release 1.6.0 Testathon Page

    Efficiently testing and reporting issues for DSpace is far from trivial. Ideally, a testathon should reveal:

    • Problems in the codebase that will lead to issues for most (if not all) people using specific parts of the code
    • Issues with the installation and upgrade instructions. Ideally, problems with compatibility for specific dependencies should be revealed as early as possible. Historically, many of these problems (notably Oracle-related issues) only appeared after release

    This gets an additional layer of complexity if tests are carried out on different test servers, and if not all of them are being installed/configured in the same way. A few potential reports that could relate to such problems:

    • I don't receive emails: email service not properly configured OR deliberately disabled for testing
    • I don't see the Google Analytics stats: Google Analytics key not properly configured, or the new DSpace 5 integration disabled as a whole
    • I'm missing feature X or Y: user is trying to test a JSPUI feature but is using XMLUI and vice versa

    Especially when it comes to performance, you ideally want to test with a BIG DSpace:

    • large number of collections/communities
    • large number of items
    • large number of bitstreams
    • multiple millions of usage events in the statistics core

    I don't intend to make testing sound problematic; I just want to emphasize that if we consciously choose what we will test and how (environment-wise) we are going to test it, we can really leverage our time and make a much bigger difference with our efforts than in any of the previous releases.

    Looking forward to today's discussion!


  2. Here's my most recent version of that test schedule, with more tests listed: DataShare Testing Results Blank v4.xlsx

    And here's a screenshot of a set of test results - completed-test-schedule.png - as you can see, where bugs have been found, I've put a mention of Redmine in my results, since that is our bug-tracking software.

  3. Amanda French: you mentioned that you had developed a suite of Selenium tests for DSpace. Looking at your wiki at https://github.com/VTUL/vtechworks/wiki, it seems that these tests are hosted on an internal VT repo. Any chance you could share those on GitHub, or mail them to DCAT as well?
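    For reference, a minimal sketch of what a Selenium WebDriver test against the shared demo server might look like (this is not from the VT suite; it is written in Java against the standard WebDriver API, and the title check and search field name are assumptions to adjust to the real pages):

        import org.openqa.selenium.By;
        import org.openqa.selenium.WebDriver;
        import org.openqa.selenium.WebElement;
        import org.openqa.selenium.firefox.FirefoxDriver;

        public class DemoHomepageTest {
            public static void main(String[] args) {
                // Assumes a Firefox driver is available on the test machine.
                WebDriver driver = new FirefoxDriver();
                try {
                    // Use the shared demo instance, per the standardized
                    // testing environment decision in the notes above.
                    driver.get("http://demo.dspace.org/xmlui/");

                    // Assumed check: the homepage title mentions DSpace.
                    if (!driver.getTitle().contains("DSpace")) {
                        throw new AssertionError("Unexpected title: " + driver.getTitle());
                    }

                    // Assumed check: a search box with field name "query"
                    // exists (the name is a guess, not confirmed).
                    WebElement search = driver.findElement(By.name("query"));
                    search.sendKeys("test");
                    search.submit();
                    System.out.println("Search results title: " + driver.getTitle());
                } finally {
                    // Always close the browser, even if a check fails.
                    driver.quit();
                }
            }
        }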