...
There are several configuration files to configure DSpace's LOD support. The main configuration file can be found under [dspace-source]/dspace/config/modules/rdf.cfg
. Within DSpace we use Spring to define which classes to load. For DSpace's LOD support this is done within [dspace-source]/dspace/config/spring/api/rdf.xml
. All other configuration files are positioned in the directory [dspace-source]/dspace/config/modules/rdf
/. Configurations in rdf.cfg
can be modified directly, or overridden via your local.cfg
config file (see Configuration Reference). You will have to configure where to find the triple store and how to connect to it. You may also configure how URIs are generated for use within the generated Linked Data, and how the contents stored in DSpace are converted into RDF. We will guide you through the configuration, file by file.
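As a sketch, an override block in local.cfg might look like the following. The values are illustrative only (in particular the Fuseki dataset address); adjust hosts, ports and paths to your own installation:

```properties
# Illustrative local.cfg overrides for DSpace's LOD support.
# The Fuseki endpoint address below is an assumption; use your own.
rdf.contentNegotiation.enable = true
rdf.contextPath = ${dspace.baseUrl}/rdf
rdf.public.sparql.endpoint = ${dspace.baseUrl}/sparql
rdf.storage.graphstore.endpoint = http://localhost:3030/dspace/data
```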
...
Property: | rdf.contentNegotiation.enable | ||
Example Value: | rdf.contentNegotiation.enable = true | ||
Informational Note: | Defines whether content negotiation should be activated. Set this to true if you use Linked Data support. | ||
Property: | rdf.contextPath | ||
Example Value: | rdf.contextPath = ${dspace.baseUrl}/rdf | ||
Informational Note: | The content negotiation needs to know where to refer if anyone asks for RDF serializations of content stored within DSpace. This property sets the URL where the dspace-rdf module can be reached on the Internet (depending on how you deployed it). | ||
Property: | rdf.public.sparql.endpoint | ||
Example Value: | rdf.public.sparql.endpoint = ${dspace.baseUrl}/sparql | ||
Informational Note: | Address of the read-only public SPARQL endpoint supporting SPARQL 1.1 Query Language. | ||
Property: | rdf.storage.graphstore.endpoint | ||
Example Value: | rdf.storage.graphstore.endpoint = http://localhost:3030/dspace/data | ||
Informational Note: | Address of a writable SPARQL 1.1 Graph Store HTTP Protocol endpoint. This address is used to create, update and delete converted data in the triple store. If you use Fuseki with the configuration provided as part of DSpace 5, you can leave this as it is. If you use another triple store or configure Fuseki on your own, change this property to point to a writable SPARQL endpoint supporting the SPARQL 1.1 Graph Store HTTP Protocol. | ||
Property: | rdf.storage.graphstore.authentication | ||
Example Value: | rdf.storage.graphstore.authentication = no | ||
Informational Note: | Defines whether to use HTTP Basic authentication to connect to the writable SPARQL 1.1 Graph Store HTTP Protocol endpoint. | ||
Properties: | rdf.storage.graphstore.login, rdf.storage.graphstore.password | ||
Example Values: | rdf.storage.graphstore.login = dspace, rdf.storage.graphstore.password = ecapsd | ||
Informational Note: | Credentials for the HTTP Basic authentication, if it is necessary to connect to the writable SPARQL 1.1 Graph Store HTTP Protocol endpoint. | ||
Property: | rdf.storage.sparql.endpoint | ||
Example Value: | rdf.storage.sparql.endpoint = http://localhost:3030/dspace/sparql | ||
Informational Note: | Besides a writable SPARQL 1.1 Graph Store HTTP Protocol endpoint, DSpace needs a SPARQL 1.1 Query Language endpoint, which can be read-only. This property allows you to set an address to be used to connect to such a SPARQL endpoint. If you leave this property empty the property ${rdf.public.sparql.endpoint} will be used instead. | ||
Properties: | rdf.storage.sparql.authentication | ||
Example Values: | rdf.storage.sparql.authentication = yes | ||
Informational Note: | As for the SPARQL 1.1 Graph Store HTTP Protocol you can configure DSpace to use HTTP Basic authentication to authenticate against the (read-only) SPARQL 1.1 Query Language endpoint. | ||
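Taken together, a hedged sketch of the triple store connection settings might look like this. All addresses and credentials below are placeholders for a local Fuseki installation; substitute your own:

```properties
# Sketch of triple store connection settings (placeholder values).
rdf.storage.graphstore.endpoint = http://localhost:3030/dspace/data
rdf.storage.graphstore.authentication = no
# Only needed if authentication is set to "yes":
# rdf.storage.graphstore.login = dspace
# rdf.storage.graphstore.password = ecapsd
# Leave empty to fall back to ${rdf.public.sparql.endpoint}:
rdf.storage.sparql.endpoint = http://localhost:3030/dspace/sparql
rdf.storage.sparql.authentication = no
```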
Property: | rdf.converter.DSOtypes | ||
Example Value: | rdf.converter.DSOtypes = SITE, COMMUNITY, COLLECTION, ITEM | ||
Informational Note: | Define which kind of DSpaceObjects should be converted. Bundles and Bitstreams will be converted as part of the Item they belong to. Don't add EPersons here unless you really know what you are doing. All converted data is stored in the triple store that provides a publicly readable SPARQL endpoint. So all data converted into RDF is exposed publicly. Every DSO type you add here must have an HTTP URI to be referenced in the generated RDF, which is another reason not to add EPersons here currently. | ||
The following properties configure the StaticDSOConverterPlugin. | |||
Properties: | rdf.constant.data.GENERAL, rdf.constant.data.COLLECTION, rdf.constant.data.COMMUNITY, rdf.constant.data.ITEM, rdf.constant.data.SITE | ||
Example Values: | rdf.constant.data.GENERAL = ${dspace.dir}/config/modules/rdf/constant-data-general.ttl, rdf.constant.data.COMMUNITY = ${dspace.dir}/config/modules/rdf/constant-data-community.ttl | ||
Informational Note: | These properties define files to read static data from. These data should be in RDF, and by default Turtle is used as serialization. The data in the file referenced by the property ${rdf.constant.data.GENERAL} will be included in every Entity that is converted to RDF. E.g. it can be used to point to the address of the public readable SPARQL endpoint or may contain the name of the institution running DSpace. The other properties define files that will be included if a DSpace Object of the specified type (collection, community, item or site) is converted. This makes it possible to add static content to every Item, every Collection, ... | ||
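As an illustration, a constant-data-general.ttl file could carry a triple naming the institution and a pointer to the public SPARQL endpoint. The publisher name and endpoint address below are purely hypothetical:

```turtle
@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix void:    <http://rdfs.org/ns/void#> .

# Hypothetical static data added to every converted entity:
# the publishing institution and the public SPARQL endpoint.
<> dcterms:publisher "Example University Library" ;
   void:sparqlEndpoint <http://localhost:3030/dspace/sparql> .
```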
The following properties configure the MetadataConverterPlugin. | |||
Property: | rdf.metadata.mappings | ||
Example Value: | rdf.metadata.mappings = ${dspace.dir}/config/modules/rdf/metadata-rdf-mapping.ttl | ||
Informational Note: | Defines the file that contains the mappings for the MetadataConverterPlugin. See below the description of the configuration file [dspace-source]/dspace/config/modules/rdf/metadata-rdf-mapping.ttl. | ||
Property: | rdf.metadata.schema | ||
Example Value: | rdf.metadata.schema = file://${dspace.dir}/config/modules/rdf/metadata-rdf-schema.ttl | ||
Informational Note: | Configures the URL used to load the RDF Schema of the DSpace Metadata RDF mapping Vocabulary. Using a file:// URI makes it possible to convert DSpace content without having an Internet connection. The schema version must match the version used by the code: DSpace 5.0 uses version 0.2.0. This schema can be found here as well: http://digital-repositories.org/ontologies/dspace-metadata-mapping/0.2.0. The newest version of the schema can be found here: http://digital-repositories.org/ontologies/dspace-metadata-mapping/. | ||
Property: | rdf.metadata.prefixes | ||
Example Value: | rdf.metadata.prefixes = ${dspace.dir}/config/modules/rdf/metadata-prefixes.ttl | ||
Informational Note: | If you want to use prefixes in RDF serializations that support prefixes, you can define these prefixes in the file referenced by this property. | ||
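A prefix file of this kind simply declares namespace abbreviations in Turtle. The particular prefixes chosen below are examples; use whichever namespaces your mappings actually reference:

```turtle
# Example prefix declarations for serializations that support
# prefixes (e.g. Turtle). The abbreviations are a matter of taste.
@prefix dc:      <http://purl.org/dc/elements/1.1/> .
@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix foaf:    <http://xmlns.com/foaf/0.1/> .
```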
The following properties configure the SimpleDSORelationsConverterPlugin. | |||
Property: | rdf.simplerelations.prefixes | ||
Example Value: | rdf.simplerelations.prefixes = ${dspace.dir}/config/modules/rdf/simple-relations-prefixes.ttl | ||
Informational Note: | If you want to use prefixes in RDF serializations that support prefixes, you can define these prefixes in the file referenced by this property. | ||
Property: | rdf.simplerelations.site2community | ||
Example Value: | rdf.simplerelations.site2community = http://purl.org/dc/terms/hasPart, http://digital-repositories.org/ontologies/dspace/0.1.0#hasCommunity | ||
Informational Note: | Defines the predicates used to link from the data representing the whole repository to the top level communities. Defining multiple predicates separated by commas will result in multiple triples. | ||
Property: | rdf.simplerelations.community2site | ||
Example Value: | rdf.simplerelations.community2site = http://purl.org/dc/terms/isPartOf, http://digital-repositories.org/ontologies/dspace/0.1.0#isPartOfRepository | ||
Informational Note: | Defines the predicates used to link from the top level communities to the data representing the whole repository. Defining multiple predicates separated by commas will result in multiple triples. | ||
Property: | rdf.simplerelations.community2subcommunity | ||
Example Value: | rdf.simplerelations.community2subcommunity = http://purl.org/dc/terms/hasPart, http://digital-repositories.org/ontologies/dspace/0.1.0#hasSubcommunity | ||
Informational Note: | Defines the predicates used to link from communities to their subcommunities. Defining multiple predicates separated by commas will result in multiple triples. | ||
Property: | rdf.simplerelations.subcommunity2community | ||
Example Value: | rdf.simplerelations.subcommunity2community = http://purl.org/dc/terms/isPartOf, http://digital-repositories.org/ontologies/dspace/0.1.0#isSubcommunityOf | ||
Informational Note: | Defines the predicates used to link from subcommunities to the communities they belong to. Defining multiple predicates separated by commas will result in multiple triples. | ||
Property: | rdf.simplerelations.community2collection | ||
Example Value: | rdf.simplerelations.community2collection = http://purl.org/dc/terms/hasPart, http://digital-repositories.org/ontologies/dspace/0.1.0#hasCollection | ||
Informational Note: | Defines the predicates used to link from communities to their collections. Defining multiple predicates separated by commas will result in multiple triples. | ||
Property: | rdf.simplerelations.collection2community | ||
Example Value: | rdf.simplerelations.collection2community = http://purl.org/dc/terms/isPartOf, http://digital-repositories.org/ontologies/dspace/0.1.0#isPartOfCommunity | ||
Informational Note: | Defines the predicates used to link from collections to the communities they belong to. Defining multiple predicates separated by commas will result in multiple triples. | ||
Property: | rdf.simplerelations.collection2item | ||
Example Value: | rdf.simplerelations.collection2item = http://purl.org/dc/terms/hasPart, http://digital-repositories.org/ontologies/dspace/0.1.0#hasItem | ||
Informational Note: | Defines the predicates used to link from collections to their items. Defining multiple predicates separated by commas will result in multiple triples. | ||
Property: | rdf.simplerelations.item2collection | ||
Example Value: | rdf.simplerelations.item2collection = http://purl.org/dc/terms/isPartOf, http://digital-repositories.org/ontologies/dspace/0.1.0#isPartOfCollection | ||
Informational Note: | Defines the predicates used to link from items to the collections they belong to. Defining multiple predicates separated by commas will result in multiple triples. | ||
Property: | rdf.simplerelations.item2bitstream | ||
Example Value: | rdf.simplerelations.item2bitstream = http://purl.org/dc/terms/hasPart, http://digital-repositories.org/ontologies/dspace/0.1.0#hasBitstream | ||
Informational Note: | Defines the predicates used to link from items to their bitstreams. Defining multiple predicates separated by commas will result in multiple triples. | ||
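To illustrate the effect: with the default collection2item setting above, a collection containing one item would, roughly, yield triples like the following. The handle-based resource URIs are hypothetical; the actual URIs depend on the configured URI generator:

```turtle
@prefix dcterms: <http://purl.org/dc/terms/> .
@prefix dspace:  <http://digital-repositories.org/ontologies/dspace/0.1.0#> .

# Two configured predicates => two triples (hypothetical URIs).
<http://localhost:8080/rdf/resource/123456789/2>
    dcterms:hasPart <http://localhost:8080/rdf/resource/123456789/5> ;
    dspace:hasItem  <http://localhost:8080/rdf/resource/123456789/5> .
```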
...
This file defines which classes are loaded by DSpace to provide the RDF functionality. There are two things you might want to change: the class that is responsible for generating the URIs to be used within the converted data, and the list of plugins used during conversion. To change the class responsible for the URIs, change the following line:
Code Block |
---|
<property name="generator" ref="org.dspace.rdf.storage.LocalURIGenerator"/> |
This line defines how the URIs used within the converted data are generated. The LocalURIGenerator generates URIs using the ${dspace.url} property. The HandleURIGenerator uses handles in the form of HTTP URLs; it uses the property ${handle.canonical.prefix} to convert handles into HTTP URLs. The class org.dspace.rdf.storage.DOIURIGenerator uses DOIs in the form of HTTP URLs if possible, or local URIs if there are no DOIs; it uses the DOI resolver "http://dx.doi.org" to convert DOIs into HTTP URLs. The class org.dspace.rdf.storage.DOIHandleGenerator does the same but uses handles as the fallback if no DOI exists. The fallbacks are necessary as DOIs are currently used for Items only, not for Communities or Collections.
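For example, to switch from local URIs to handle-based URLs, you would (as a sketch) point the generator reference at the HandleURIGenerator bean in rdf.xml; a bean for the chosen class must of course be defined in the same file:

```xml
<!-- Sketch: reference the HandleURIGenerator instead of the
     default LocalURIGenerator. -->
<property name="generator" ref="org.dspace.rdf.storage.HandleURIGenerator"/>
```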
All plugins that are instantiated within the configuration file will automatically be used during the conversion. By default the list looks like the following:
Code Block |
---|
<!-- configure all plugins the converter should use. If you don't want to
     use a plugin, remove it here. -->
<bean id="org.dspace.rdf.conversion.SimpleDSORelationsConverterPlugin" class="org.dspace.rdf.conversion.SimpleDSORelationsConverterPlugin"/>
<bean id="org.dspace.rdf.conversion.MetadataConverterPlugin" class="org.dspace.rdf.conversion.MetadataConverterPlugin"/>
<bean id="org.dspace.rdf.conversion.StaticDSOConverterPlugin" class="org.dspace.rdf.conversion.StaticDSOConverterPlugin"/> |
This list defines all plugins used during the conversion of DSpace content into RDF. You can remove plugins you don't want to use. If you develop a new conversion plugin, you will want to add its class to this list.
Maintenance
As described above, you should add rdf
to the property event.dispatcher.default.consumers
in dspace.cfg. This configures DSpace to automatically update the triple store every time the publicly available content of the repository is changed. Nevertheless, there is a command line tool that lets you update the content of the triple store manually. As the triple store is used as a cache only, you can delete its content and reindex it whenever you think it is necessary or helpful. The command line tool can be started by the following command, which will show its online help:
...