...

[dspace-source]/dspace/config/modules/rdf.cfg

Property: rdf.contentNegotiation.enable
Example Value: rdf.contentNegotiation.enable = true
Informational Note: Defines whether content negotiation should be activated. Set this to true if you use Linked Data support.
Property: rdf.contextPath
Example Value: rdf.contextPath = ${dspace.baseUrl}/rdf
Informational Note: The content negotiation needs to know where to refer if anyone asks for RDF serializations of content stored within DSpace. This property sets the URL where the dspace-rdf module can be reached on the internet (depending on how you deployed it).
Property: rdf.public.sparql.endpoint
Example Value: rdf.public.sparql.endpoint = http://${dspace.baseUrl}/sparql
Informational Note: Address of the read-only public SPARQL endpoint supporting SPARQL 1.1 Query Language.
Property: rdf.converter.DSOtypes
Example Value: rdf.converter.DSOtypes = SITE, COMMUNITY, COLLECTION, ITEM
Informational Note: Defines which kinds of DSpaceObjects should be converted. Bundles and Bitstreams will be converted as part of the Item they belong to. Don't add EPersons here unless you really know what you are doing. All converted data is stored in the triple store, which provides a publicly readable SPARQL endpoint, so all data converted into RDF is exposed publicly. Every DSO type you add here must have an HTTP URI to be referenced in the generated RDF, which is another reason not to add EPersons here currently.
Property: rdf.storage.graphstore.endpoint
Example Value: rdf.storage.graphstore.endpoint = http://localhost:3030/dspace/data
Informational Note: Address of a writable SPARQL 1.1 Graph Store HTTP Protocol endpoint. This address is used to create, update and delete converted data in the triple store. If you use Fuseki with the configuration provided as part of DSpace 5, you can leave this as it is. If you use another triple store or configure Fuseki on your own, change this property to point to a writable SPARQL endpoint supporting the SPARQL 1.1 Graph Store HTTP Protocol.
Property: rdf.storage.graphstore.authentication
Example Value: rdf.storage.graphstore.authentication = no
Informational Note: Defines whether to use HTTP Basic authentication to connect to the writable SPARQL 1.1 Graph Store HTTP Protocol endpoint.
Properties:
rdf.storage.graphstore.login
rdf.storage.graphstore.password
Example Values:
rdf.storage.graphstore.login = dspace
rdf.storage.graphstore.password = ecapsd
Informational Note: Credentials for HTTP Basic authentication, if it is necessary to connect to the writable SPARQL 1.1 Graph Store HTTP Protocol endpoint.
Property: rdf.storage.sparql.endpoint
Example Value: rdf.storage.sparql.endpoint = http://localhost:3030/dspace/sparql
Informational Note: Besides a writable SPARQL 1.1 Graph Store HTTP Protocol endpoint, DSpace needs a SPARQL 1.1 Query Language endpoint, which can be read-only. This property allows you to set an address to be used to connect to such a SPARQL endpoint. If you leave this property empty, the property ${rdf.public.sparql.endpoint} will be used instead.
Properties:
rdf.storage.sparql.authentication
rdf.storage.sparql.login
rdf.storage.sparql.password
Example Values:
rdf.storage.sparql.authentication = yes
rdf.storage.sparql.login = dspace
rdf.storage.sparql.password = ecapsd
Informational Note: As for the SPARQL 1.1 Graph Store HTTP Protocol, you can configure DSpace to use HTTP Basic authentication to authenticate against the (read-only) SPARQL 1.1 Query Language endpoint.
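
Taken together, the storage properties above describe how DSpace connects to the triple store. The following is a minimal sketch of this part of rdf.cfg for a local Fuseki instance, reusing the example values from the table; host names, ports and credentials are assumptions you should adapt to your own deployment.

Code Block
# Writable SPARQL 1.1 Graph Store HTTP Protocol endpoint
# (used to create, update and delete converted data)
rdf.storage.graphstore.endpoint = http://localhost:3030/dspace/data
rdf.storage.graphstore.authentication = no
# rdf.storage.graphstore.login = dspace
# rdf.storage.graphstore.password = ecapsd

# Read-only SPARQL 1.1 Query Language endpoint used internally by DSpace;
# if left empty, ${rdf.public.sparql.endpoint} is used instead.
rdf.storage.sparql.endpoint = http://localhost:3030/dspace/sparql
rdf.storage.sparql.authentication = no
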
The following properties configure the StaticDSOConverterPlugin.

Properties:
rdf.constant.data.GENERAL
rdf.constant.data.COLLECTION
rdf.constant.data.COMMUNITY
rdf.constant.data.ITEM
rdf.constant.data.SITE
Example Values:
rdf.constant.data.GENERAL = ${dspace.dir}/config/modules/rdf/constant-data-general.ttl
rdf.constant.data.COLLECTION = ${dspace.dir}/config/modules/rdf/constant-data-collection.ttl
rdf.constant.data.COMMUNITY = ${dspace.dir}/config/modules/rdf/constant-data-community.ttl
rdf.constant.data.ITEM = ${dspace.dir}/config/modules/rdf/constant-data-item.ttl
rdf.constant.data.SITE = ${dspace.dir}/config/modules/rdf/constant-data-site.ttl
Informational Note:
These properties define files to read static data from. These data should be in RDF, and by default Turtle is used as the serialization. The data in the file referenced by the property ${rdf.constant.data.GENERAL} will be included in every Entity that is converted to RDF. E.g. it can be used to point to the address of the publicly readable SPARQL endpoint or may contain the name of the institution running DSpace.

The other properties define files that will be included if a DSpace Object of the specified type (collection, community, item or site) is converted. This makes it possible to add static content to every Item, every Collection, ...

The following properties configure the MetadataConverterPlugin.

Property: rdf.metadata.mappings
Example Value: rdf.metadata.mappings = ${dspace.dir}/config/modules/rdf/metadata-rdf-mapping.ttl
Informational Note: Defines the file that contains the mappings for the MetadataConverterPlugin. See below the description of the configuration file [dspace-source]/dspace/config/modules/rdf/metadata-rdf-mapping.ttl.
Property: rdf.metadata.schema
Example Value: rdf.metadata.schema = file://${dspace.dir}/config/modules/rdf/metadata-rdf-schema.ttl
Informational Note: Configures the URL used to load the RDF Schema of the DSpace Metadata RDF Mapping Vocabulary. Using a file:// URI makes it possible to convert DSpace content without having an internet connection. The version of the schema has to be the right one for the code in use. In DSpace 5.0 we use version 0.2.0. This schema can be found here as well: http://digital-repositories.org/ontologies/dspace-metadata-mapping/0.2.0. The newest version of the schema can be found here: http://digital-repositories.org/ontologies/dspace-metadata-mapping/.
Property: rdf.metadata.prefixes
Example Value: rdf.metadata.prefixes = ${dspace.dir}/config/modules/rdf/metadata-prefixes.ttl
Informational Note: If you want to use prefixes in RDF serializations that support prefixes, you can define these prefixes in the file referenced by this property.
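
As an illustration, such a prefix file is simply a set of Turtle @prefix declarations; the namespaces shown below are an arbitrary example selection, not a required or shipped set.

Code Block
@prefix dc:      <http://purl.org/dc/elements/1.1/> .
@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix foaf:    <http://xmlns.com/foaf/0.1/> .
@prefix bibo:    <http://purl.org/ontology/bibo/> .
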
The following properties configure the SimpleDSORelationsConverterPlugin.

Property: rdf.simplerelations.prefixes
Example Value: rdf.simplerelations.prefixes = ${dspace.dir}/config/modules/rdf/simple-relations-prefixes.ttl
Informational Note: If you want to use prefixes in RDF serializations that support prefixes, you can define these prefixes in the file referenced by this property.
Property: rdf.simplerelations.site2community
Example Value: rdf.simplerelations.site2community = http://purl.org/dc/terms/hasPart, http://digital-repositories.org/ontologies/dspace/0.1.0#hasCommunity
Informational Note: Defines the predicates used to link from the data representing the whole repository to the top level communities. Defining multiple predicates separated by commas will result in multiple triples.
Property: rdf.simplerelations.community2site
Example Value: rdf.simplerelations.community2site = http://purl.org/dc/terms/isPartOf, http://digital-repositories.org/ontologies/dspace/0.1.0#isPartOfRepository
Informational Note: Defines the predicates used to link from the top level communities to the data representing the whole repository. Defining multiple predicates separated by commas will result in multiple triples.
Property: rdf.simplerelations.community2subcommunity
Example Value: rdf.simplerelations.community2subcommunity = http://purl.org/dc/terms/hasPart, http://digital-repositories.org/ontologies/dspace/0.1.0#hasSubcommunity
Informational Note: Defines the predicates used to link from communities to their subcommunities. Defining multiple predicates separated by commas will result in multiple triples.
Property: rdf.simplerelations.subcommunity2community
Example Value: rdf.simplerelations.subcommunity2community = http://purl.org/dc/terms/isPartOf, http://digital-repositories.org/ontologies/dspace/0.1.0#isSubcommunityOf
Informational Note: Defines the predicates used to link from subcommunities to the communities they belong to. Defining multiple predicates separated by commas will result in multiple triples.
Property: rdf.simplerelations.community2collection
Example Value: rdf.simplerelations.community2collection = http://purl.org/dc/terms/hasPart, http://digital-repositories.org/ontologies/dspace/0.1.0#hasCollection
Informational Note: Defines the predicates used to link from communities to their collections. Defining multiple predicates separated by commas will result in multiple triples.
Property: rdf.simplerelations.collection2community
Example Value: rdf.simplerelations.collection2community = http://purl.org/dc/terms/isPartOf, http://digital-repositories.org/ontologies/dspace/0.1.0#isPartOfCommunity
Informational Note: Defines the predicates used to link from collections to the communities they belong to. Defining multiple predicates separated by commas will result in multiple triples.
Property: rdf.simplerelations.collection2item
Example Value: rdf.simplerelations.collection2item = http://purl.org/dc/terms/hasPart, http://digital-repositories.org/ontologies/dspace/0.1.0#hasItem
Informational Note: Defines the predicates used to link from collections to their items. Defining multiple predicates separated by commas will result in multiple triples.
Property: rdf.simplerelations.item2collection
Example Value: rdf.simplerelations.item2collection = http://purl.org/dc/terms/isPartOf, http://digital-repositories.org/ontologies/dspace/0.1.0#isPartOfCollection
Informational Note: Defines the predicates used to link from items to the collections they belong to. Defining multiple predicates separated by commas will result in multiple triples.
Property: rdf.simplerelations.item2bitstream
Example Value: rdf.simplerelations.item2bitstream = http://purl.org/dc/terms/hasPart, http://digital-repositories.org/ontologies/dspace/0.1.0#hasBitstream
Informational Note: Defines the predicates used to link from items to their bitstreams. Defining multiple predicates separated by commas will result in multiple triples.
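
To illustrate the effect of these properties, here is a rough sketch of the kind of triples the SimpleDSORelationsConverterPlugin would generate for a collection containing one item, using the default predicates listed above; the handle-based URIs are purely hypothetical placeholders.

Code Block
@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix dspace:  <http://digital-repositories.org/ontologies/dspace/0.1.0#> .

# collection2item: links from the collection to its item (hypothetical URIs)
<http://demo.example.org/rdf/handle/123456789/2>
    dcterms:hasPart <http://demo.example.org/rdf/handle/123456789/3> ;
    dspace:hasItem  <http://demo.example.org/rdf/handle/123456789/3> .

# item2collection: links from the item back to its collection
<http://demo.example.org/rdf/handle/123456789/3>
    dcterms:isPartOf          <http://demo.example.org/rdf/handle/123456789/2> ;
    dspace:isPartOfCollection <http://demo.example.org/rdf/handle/123456789/2> .
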

[dspace-source]/dspace/config/modules/rdf/constant-data-*.ttl

As described in the documentation of the configuration file [dspace-source]/dspace/config/modules/rdf.cfg, the constant-data-*.ttl files can be used to add static RDF to the converted data. The data are written in Turtle, but if you change the file suffix (and the path to find the files in rdf.cfg) you can use any other RDF serialization you like. You can use this, for example, to add a link to the publicly readable SPARQL endpoint, add a link to the repository homepage, or add a triple to every community or collection defining it as an entity of a specific type like a bibo:collection. The content of the file [dspace-source]/dspace/config/modules/rdf/constant-data-general.ttl will be added to every DSpaceObject that is converted. The content of the file [dspace-source]/dspace/config/modules/rdf/constant-data-community.ttl is added to every community, the content of the file [dspace-source]/dspace/config/modules/rdf/constant-data-collection.ttl to every collection and the content of the file [dspace-source]/dspace/config/modules/rdf/constant-data-item.ttl to every Item. You can use the file [dspace-source]/dspace/config/modules/rdf/constant-data-site.ttl to specify data representing the whole repository.
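
For example, a constant-data-general.ttl file could look like the following sketch; the subject URI, the chosen vocabularies and the literal values are placeholders for your own repository, not defaults shipped with DSpace.

Code Block
@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix void:    <http://rdfs.org/ns/void#> .

# Hypothetical subject URI -- these static triples are simply added
# verbatim to the converted data of every DSpaceObject.
<http://demo.example.org/rdf/handle/123456789/0>
    dcterms:publisher "Example University Library" ;
    void:sparqlEndpoint <http://demo.example.org/sparql> .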

[dspace-source]/dspace/config/modules/rdf/metadata-rdf-mapping.ttl

This file should contain several metadata mappings. A metadata mapping defines how to map a specific metadata field within DSpace to a triple that will be added to the converted data. The MetadataConverterPlugin uses these metadata mappings to convert the metadata of an item into RDF. For every metadata field and value it checks whether any of the specified mappings matches. If one does, the plugin creates the specified triple and adds it to the converted data. In the file you'll find a lot of examples on how to define such a mapping.

For every mapping a metadata field name has to be specified, e.g. dc.title, dc.identifier.uri. In addition you can specify a condition that is matched against the field's value. The condition is specified as a regular expression (using the syntax of the Java class java.util.regex.Pattern). If a condition is defined, the mapping will be used only on fields whose values are matched by the regex defined as the condition.

The triple created by a mapping is specified using reified RDF statements. The DSpace Metadata RDF Mapping Vocabulary defines some placeholders that can be used. The most important placeholder is dm:DSpaceObjectIRI, which is replaced by the URI used to identify the entity being converted to RDF. That means if a specific Item is converted, the URI used to address this Item in RDF will be used instead of dm:DSpaceObjectIRI. There are three placeholders that allow reuse of the value of a metadata field. dm:DSpaceValue is replaced by the value as it is. dm:LiteralGenerator allows one to specify a regex and a replacement string (see the syntax of the Java classes java.util.regex.Pattern and java.util.regex.Matcher) and creates a Literal out of the field value using the regex and the replacement string. dm:ResourceGenerator does the same as dm:LiteralGenerator, but it generates an HTTP(S) URI that is used in its place. So you can use the resource generator to generate URIs containing modified field values (e.g. to link to classifications). If you know regular expressions and Turtle, the syntax should be quite self-explanatory.
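
As a concrete illustration, a mapping for dc.title might look roughly like the sketch below. It follows the pattern of the examples shipped in metadata-rdf-mapping.ttl, but treat the class and property names used here (dm:DSpaceMetadataRDFMapping, dm:metadataName, dm:creates) as assumptions to be checked against those shipped examples and the schema referenced by ${rdf.metadata.schema}.

Code Block
@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix rdf:     <http://www.w3.org/1999/02/22-rdf-syntax-ns#> .
@prefix dm:      <http://digital-repositories.org/ontologies/dspace-metadata-mapping/0.2.0#> .

# Sketch: take the value of dc.title and attach it to the converted object
# as dcterms:title (names to be verified against the shipped examples).
<#title>
    a dm:DSpaceMetadataRDFMapping ;
    dm:metadataName "dc.title" ;
    dm:creates [
        rdf:subject   dm:DSpaceObjectIRI ;
        rdf:predicate dcterms:title ;
        rdf:object    dm:DSpaceValue
    ] .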

[dspace-source]/dspace/config/modules/rdf/fuseki-assembler.ttl

This is a configuration for the triple store Fuseki of the Apache Jena project. You can find more information on the configuration it provides in the section Install a Triple Store above.

[dspace-source]/dspace/config/spring/api/rdf.xml

This file defines which classes are loaded by DSpace to provide the RDF functionality. There is only one line you might ever have to change:

Code Block
<property name="generator" ref="org.dspace.rdf.storage.LocalURIGenerator"/>

This line defines how the URIs used within the converted data are generated. The LocalURIGenerator generates URIs using the ${dspace.url} property. The HandleURIGenerator uses handles in the form of HTTP URLs; it uses the property ${handle.canonical.prefix} to convert handles into HTTPS URLs. The class org.dspace.rdf.storage.DOIURIGenerator uses DOIs in the form of HTTP URLs if possible, or local URIs if there are no DOIs. It uses the DOI resolver "http://dx.doi.org" to convert DOIs into HTTP URLs. The class org.dspace.rdf.storage.DOIHandleGenerator does the same but uses Handles as a fallback if no DOI exists. The fallbacks are necessary because DOIs are currently used for Items only and not for Communities or Collections.
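
For example, to generate handle-based URIs instead, you would point the property at the corresponding generator (shown here as a sketch, assuming the HandleURIGenerator bean is defined in rdf.xml under its class name, as the LocalURIGenerator is):

Code Block
<!-- Hypothetical variant: use handles instead of local URLs. -->
<property name="generator" ref="org.dspace.rdf.storage.HandleURIGenerator"/>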


Maintenance

As described above, you should add rdf to the property event.dispatcher.default.consumers in dspace.cfg. This configures DSpace to automatically update the triple store every time the publicly available content of the repository is changed. Nevertheless, there is a command line tool that gives you the possibility to update the content of the triple store. As the triple store is used as a cache only, you can delete its content and reindex it every time you think it is necessary or helpful. The command line tool can be started by the following command, which will show its online help:

...