Deprecated. This material represents early efforts and may be of interest to historians. It does not describe current VIVO efforts.
This applies to VIVO release 1.6.
Permits external applications to provide a list of URIs, and to request that the corresponding records in the search index be updated.
When the VIVO triple-store is updated in any way that bypasses VIVO's internal data channels, the search index will not reflect the changes. Prior to VIVO release 1.6, the only solution was to rebuild the entire search index from the Site Administration page.
With this service, you can provide a list of URIs whose contents have changed, and request that only those search records be updated. This is usually faster than rebuilding the entire index.
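As an illustrative sketch, a call to the service might look like the following. The endpoint path (`searchService/updateUrisInSearch`), host, credentials, and URIs are assumptions based on a typical VIVO 1.6 installation; check them against your own site.

```shell
# Hypothetical sketch: ask VIVO to re-index only the listed URIs.
# The host, account credentials, and URIs below are placeholders.
cat > uri_list.txt <<'EOF'
http://vivo.example.edu/individual/n1234
http://vivo.example.edu/individual/n5678
EOF

# POST the list as a file upload, along with administrator credentials.
curl -i -F "email=vivo_root@example.edu" \
        -F "password=Password" \
        -F "uris=@uri_list.txt" \
        "http://localhost:8080/vivo/searchService/updateUrisInSearch"
```

Only the search records for the listed URIs are rebuilt; the rest of the index is untouched.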
By default, the SPARQL Update API is disabled in VIVO for security reasons; see the VIVO documentation for instructions on enabling it.
The Harvester and similar tools write directly to the VIVO triple-store, bypassing the usual data channels in VIVO. After an ingest, it has therefore been necessary to rebuild the search index so that it reflects the changes in the data. With this service, you can rebuild only the affected part of the index.
Note: when the Harvester and other tools have been modified to use the SPARQL Update API, VIVO will ensure that the search index and inferences are kept in synchronization with the data.
Some sites use two VIVO instances. One is a staging instance, and all ingests occur there. The other is a production instance, and periodically the triple-store is copied from staging to production. When this is done, you have 3 options:
The concerns that apply to the search index also apply to the state of the inferred triples in the data model. When you bypass the data channels in VIVO, you bypass the semantic reasoner. To compensate for this, you must run Recompute Inferences from the Site Administration page.

In most cases, the time required to re-inference the model is greater than the time required to rebuild the search index. Unfortunately, the reasoning process is not easy to partition, and to date VIVO has no service that would allow you to update the inferences on a limited set of data.
Specification
[vivo]/api/sparqlUpdate
Examples:
The API supports only HTTP POST calls. GET, HEAD, and other methods are not supported and will return a response code of 400 Bad Request.
| name | value |
|---|---|
| email | the email address of a VIVO administrator account |
| password | the password of the VIVO administrator account |
| update | a SPARQL Update request |
The syntax for a SPARQL Update request is described in the World Wide Web Consortium's SPARQL 1.1 Update recommendation: http://www.w3.org/TR/2013/REC-sparql11-update-20130321/
The API requires that you specify a GRAPH in your SPARQL update request. Insertions or deletions to the default graph are not supported.
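For instance, an update that satisfies the GRAPH requirement could be written to a file like this. The graph URI shown is VIVO's usual main content graph, and the individual URI and label are invented for illustration:

```shell
# Write a SPARQL Update request to a file. The triples are wrapped in a
# GRAPH block, since updates to the default graph are not supported.
cat > insert.sparql <<'EOF'
INSERT DATA {
  GRAPH <http://vitro.mannlib.cornell.edu/default/vitro-kb-2> {
    <http://vivo.example.edu/individual/n1234>
      <http://www.w3.org/2000/01/rdf-schema#label> "Jane Doe" .
  }
}
EOF
```

The file can then be supplied as the update parameter of a POST request.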
| Code | Reason |
|---|---|
| 200 OK | SPARQL Update was successful. |
| 400 Bad Request | Incorrect HTTP method; only POST is accepted. |
| | HTTP request did not include an update parameter. |
| | The SPARQL Update request did not specify a GRAPH. |
| | The SPARQL Update request was syntactically incorrect. |
| 403 Forbidden | HTTP request did not include an email parameter. |
| | HTTP request did not include a password parameter. |
| | The combination of email and password is not valid. |
| | The selected VIVO account is not authorized to use the SPARQL Update API. |
| 500 Internal Server Error | VIVO could not execute the request; internal code threw an exception. |
These examples use the UNIX curl command to insert and delete data using the API.
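A minimal sketch of such calls, assuming a local VIVO at http://localhost:8080/vivo and placeholder credentials; curl's --data-urlencode takes care of URL-encoding the update parameter:

```shell
# Insert a triple into a named graph (host, credentials, and data
# are placeholders for illustration).
curl -i -d "email=vivo_root@example.edu" \
        -d "password=Password" \
        --data-urlencode 'update=INSERT DATA {
          GRAPH <http://vitro.mannlib.cornell.edu/default/vitro-kb-2> {
            <http://vivo.example.edu/individual/n1234>
              <http://www.w3.org/2000/01/rdf-schema#label> "Jane Doe" .
          } }' \
        "http://localhost:8080/vivo/api/sparqlUpdate"

# Delete the same triple again.
curl -i -d "email=vivo_root@example.edu" \
        -d "password=Password" \
        --data-urlencode 'update=DELETE DATA {
          GRAPH <http://vitro.mannlib.cornell.edu/default/vitro-kb-2> {
            <http://vivo.example.edu/individual/n1234>
              <http://www.w3.org/2000/01/rdf-schema#label> "Jane Doe" .
          } }' \
        "http://localhost:8080/vivo/api/sparqlUpdate"
```

A response of 200 OK indicates the update was applied; any of the 400 or 403 conditions above will be reported in the response instead.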