
This page gathers resources and brainstorming ideas for how to incentivize / encourage community developers to help with Code Reviewing and Pull Request Testing.

General Goals / Ideas

  1. We need to find a way to encourage more reviewers from our large community of developers.  Lots of people each doing a small number of tests/reviews scales very well.
  2. Document the incentives for people to do reviews / functional testing.
  3. Find a way to make the codebase easier to work with.  We've done some of this with Docker, but we should investigate ways to spin up DSpace in a temporary, virtual environment with minimal configuration/steps. This would allow anyone to more easily interact with & test individual PRs (see the Docker sketch after this list).
  4. Find a way to acknowledge code reviews / functional testing in Release Notes in the same way as development/code is acknowledged.
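
As a rough sketch of goal 3, this is the general shape of checking out a PR and spinning it up via Docker. The PR number here is made up, and the exact compose file names differ between our repositories; see the "Testing DSpace 7 Pull Requests" page linked below for the authoritative steps.

    # Fetch PR #1234 (hypothetical number) into a local branch using GitHub's
    # pull refs; no need to add the contributor's fork as a remote
    git fetch origin pull/1234/head:pr-1234
    git checkout pr-1234

    # Rebuild & start the stack from that code (compose file names vary by repo)
    docker-compose -f docker-compose.yml up -d --build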

Resources to make Developers feel welcome

General Goal: Find a way to encourage other developers to get involved & help out in small ways

Resources for making Code Reviews / Testing easier

General Goal: Find a way to make the codebase easier to work with & test PRs with.

  • Testing DSpace 7 Pull Requests - How to use existing Docker scripts to more easily spin up PRs locally, in order to test or review them.
  • Spin up code in virtual environment (quickly) for easier reviews/testing
  • Automated Code Reviewing resources.  Tools/resources exist which can do some automatic checking/verification of code quality in Pull Requests.  Some examples include:
    • Code Scanning in GitHub.  We already do some of this, but currently we only scan for security-oriented code issues.
      • Free & integrated into GitHub.  Interface is a bit clunky at times though.
      • Highly Configurable (e.g. see query types)
      • Could configure this to also check PR code quality against coding best practices (currently we only scan for major bugs / security issues).  See these settings, and the rough workflow sketch after this list.
    • SonarCloud.io - This is a hosted version of SonarQube.
      • Free for open source projects. Integrates with GitHub & supports both Java and TypeScript. Can run on every new PR. 
      • Example projects: https://sonarcloud.io/explore/projects
      • Test analysis run by Tim on DSpace backend/frontend: https://sonarcloud.io/organizations/tdonohue/projects  (Keep in mind, there are definitely false positives listed here. These are just raw reports)
      • Pros: Highly configurable. Used by other major OS projects like Apache.  Good documentation/resources on how to fix any issues that are found. SonarQube is open source itself.
    • DeepSource
      • Free for open source projects. Integrates with GitHub & supports both Java and TypeScript. Can run on every new PR. 
      • Test analysis run by Tim on DSpace backend/frontend (Keep in mind, there are definitely false positives listed in both. These are just raw reports)
      • Not as configurable, but able to turn off individual rules if they are too "noisy" or not useful.
      • Pros: Some "autofix" options. Good documentation/resources on how to fix any issues that are found.
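
To illustrate the Code Scanning idea above: a minimal sketch of a CodeQL workflow using GitHub's standard codeql-action, where the only change from the security-only default is the queries input. The workflow name and trigger are illustrative, not copied from our actual workflow.

    name: "CodeQL"
    on: [pull_request]

    jobs:
      analyze:
        runs-on: ubuntu-latest
        permissions:
          security-events: write       # needed to upload scan results
        steps:
          - uses: actions/checkout@v2
          - uses: github/codeql-action/init@v2
            with:
              languages: java          # or javascript for the Angular UI
              # default is the security-only suite; this adds code quality queries
              queries: security-and-quality
          - uses: github/codeql-action/autobuild@v2
          - uses: github/codeql-action/analyze@v2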

Resources for acknowledging code reviewers / testers

General Goal: Find a way to acknowledge / track code reviewers so that we can more easily include them in Release Notes (and include this as a form of contribution for service providers).  Ideally, this would be either automated or semi-automated (e.g. a report that can be run regularly per release).
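
One possible semi-automated approach, sketched below using the standard GitHub REST API: list everyone who submitted a review on PRs merged since a given cutoff date. The repository name, cutoff date, and token handling are placeholders to adapt; treat this as a starting point, not a finished report.

    #!/usr/bin/env python3
    # Rough sketch of a per-release reviewer report via the GitHub REST API.
    # REPO and SINCE are placeholders; assumes a GITHUB_TOKEN environment variable.
    import os
    from collections import Counter
    import requests

    REPO = "DSpace/DSpace"          # or DSpace/dspace-angular, etc.
    SINCE = "2021-01-01T00:00:00Z"  # e.g. the date of the previous release
    API = "https://api.github.com"
    HEADERS = {"Authorization": "token " + os.environ["GITHUB_TOKEN"]}

    reviewers = Counter()
    page = 1
    while True:
        # Closed PRs, most recently updated first
        prs = requests.get(f"{API}/repos/{REPO}/pulls",
                           params={"state": "closed", "sort": "updated",
                                   "direction": "desc", "per_page": 100,
                                   "page": page},
                           headers=HEADERS).json()
        if not prs:
            break
        for pr in prs:
            # Only count PRs actually merged after the cutoff date
            if not pr["merged_at"] or pr["merged_at"] < SINCE:
                continue
            # Each submitted review records the reviewer's login
            for review in requests.get(pr["url"] + "/reviews",
                                       headers=HEADERS).json():
                if review.get("user"):
                    reviewers[review["user"]["login"]] += 1
        # Once every PR on a page was last updated before the cutoff, stop paging
        if all(pr["updated_at"] < SINCE for pr in prs):
            break
        page += 1

    for login, count in reviewers.most_common():
        print(f"{login}: {count} review(s)")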



