| Property | Description | Default |
|---|---|---|
| irus.statistics.tracker.enabled | Configuration used to enable the IRUS statistics tracker. Set to true to enable. | false |
| irus.statistics.tracker.type-field | Metadata field used to check whether certain items should be excluded from tracking. If empty or commented out, all items are tracked. | |
| irus.statistics.tracker.type-value | The values of the above metadata field that mark an item as eligible for tracking. | |
| irus.statistics.tracker.entity-types | The entity types to include in tracking. If left empty, only publication hits are tracked. If entities are disabled in DSpace (the default in DSpace 7.1), all Items are included in tracking. | Publication |
| irus.statistics.tracker.environment | The tracker environment, which determines to which URL the statistics are exported (test or prod). | test |
| irus.statistics.tracker.testurl | The URL to which trackings are exported when testing. (In theory, this should be https://irus.jisc.ac.uk/counter/test/) | |
| irus.statistics.tracker.produrl | The URL to which trackings are exported in production. (This depends on your area/country; refer to the Prerequisite section.) | |
| irus.statistics.tracker.urlversion | Tracker version | |
| irus.statistics.spider.agentregex.url | External URL pointing to the COUNTER user agents file. The file is downloaded from this URL as part of the Apache Ant build process. Item views that DSpace determines were generated by bots/spiders are not sent to IRUS. Including this additional (and optional) agents file can reduce unnecessary network traffic by avoiding the transfer of view data that IRUS would ignore. Example value: https://raw.githubusercontent.com/atmire/COUNTER-Robots/master/generated/COUNTER_Robots_list.txt | |
| irus.statistics.spider.agentregex.regexfile | Location to which the user agents file should be downloaded. The Apache Ant build process that retrieves the file from the URL specified above places it in this location. Example value: ${dspace.dir}/config/spiders/agents/COUNTER_Robots_list.txt | |
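As an illustration, the properties above could be combined into a configuration fragment like the following. This is a sketch, not a definitive configuration: the values shown are placeholders, and the production URL line is left commented out because it depends on your area/country (see the Prerequisite section).

```
# Illustrative IRUS tracker configuration fragment (all values are examples)
irus.statistics.tracker.enabled = true
irus.statistics.tracker.environment = test
irus.statistics.tracker.testurl = https://irus.jisc.ac.uk/counter/test/
# irus.statistics.tracker.produrl = (use the URL assigned for your area/country)
irus.statistics.tracker.entity-types = Publication
irus.statistics.spider.agentregex.url = https://raw.githubusercontent.com/atmire/COUNTER-Robots/master/generated/COUNTER_Robots_list.txt
irus.statistics.spider.agentregex.regexfile = ${dspace.dir}/config/spiders/agents/COUNTER_Robots_list.txt
```

With `environment = test`, statistics are exported to the test URL; switching to prod directs them to the production URL instead.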
Retrying failed attempts
If the IRUS tracker is down, or some other error prevents DSpace from committing a record to the tracker, the record is stored in a separate database table (OpenUrlTracker) that is created automatically during deployment. Committing these entries can be retried using the following command:
```
[deployed-dspace]/bin/dspace retry-tracker
```
This command iterates over all the logged entries and retries committing them. Entries that fail again remain in the table; entries that succeed are removed.
It is strongly advised to schedule this script to run daily or weekly (preferably at low-load times, such as during the night or at the weekend). If there are no failed entries, the script performs no actions and exits immediately.
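The scheduling advice above can be implemented with a cron entry such as the one below. The path and schedule are assumptions; adjust them to your installation and preferred low-load window.

```
# Hypothetical crontab entry: retry failed IRUS commits nightly at 02:00
0 2 * * * [deployed-dspace]/bin/dspace retry-tracker
```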