...

  • (thumbs up) Jira: DuraSpace JIRA, key VIVO-101
  • Implementing a web service interface (with authentication) to the VIVO RDF API, to allow the Harvester and other tools to add/remove data from VIVO and trigger appropriate search indexing and recomputing of inferences.
    • This would also enable round-trip editing of VIVO content from Drupal or another tool external to VIVO via the SPARQL update capability of the RDF API
    • Authentication will be involved
      • Could manage this in our own authentication and authorization system and tell Apache that the servlet requires an HTTPS connection
      • This approach would allow testing in a known environment without having to set up SSL certificates
    • It would improve the user experience if it were possible to bundle changes into an atomic change set (at least all the changes for one graph), so that additions and retractions do not show up piecemeal
      • Note that since inferencing is done in a separate thread, there may still be some lag
  • (warning) Put and delete of data via LOD requests – this has been suggested but we're not sure a specification even exists for an LOD "put" request – please add references here if you're aware of discussion or documentation.
    • when we were designing the RDF API, we recognized that if anybody could execute an arbitrary SPARQL update or delete, it would not be possible to simply listen for the changes; so we limited what the RDF API supports to additions and deletions, using just a subset of the overall language
    • piping an arbitrary SPARQL update or delete through our API would take some work but is theoretically possible
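As a sketch of the web service idea above: an authenticated HTTP POST carrying a SPARQL UPDATE, with the retractions and additions for one graph bundled into a single request so they are applied together rather than piecemeal. This is a minimal Python sketch; the endpoint URL and the email/password/update form parameters are illustrative assumptions, not a settled interface.

```python
import urllib.parse
import urllib.request

def build_sparql_update_request(endpoint, email, password, update):
    """Build an authenticated POST carrying a SPARQL UPDATE.

    The endpoint path and form-parameter names are hypothetical; a real
    deployment would also require HTTPS so credentials are not sent in
    the clear.
    """
    body = urllib.parse.urlencode(
        {"email": email, "password": password, "update": update}
    ).encode("utf-8")
    req = urllib.request.Request(endpoint, data=body, method="POST")
    req.add_header("Content-Type", "application/x-www-form-urlencoded")
    return req

# A single update can bundle retractions and additions against one graph,
# so the change set lands together rather than piecemeal.
ATOMIC_UPDATE = """\
DELETE DATA { GRAPH <http://example.org/g> {
    <http://example.org/s> <http://example.org/p> "old" } } ;
INSERT DATA { GRAPH <http://example.org/g> {
    <http://example.org/s> <http://example.org/p> "new" } }
"""
```

Even with an atomic request, inferencing running in a separate thread means derived statements may still lag behind the explicit changes.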

Anchor: SerializeRestoreQuads
Serialize/restore a set of graphs in quad format

(question) 

Jira: DuraSpace JIRA, key VIVO-23

This would support reading and writing data like class groups
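A minimal sketch of what serialize/restore could look like, writing each statement with its graph URI on one line in N-Quads style. This simplified Python version handles URI nodes only; real N-Quads also needs literals (with escaping and datatypes) and blank nodes.

```python
def serialize_quads(quads):
    """Write (subject, predicate, object, graph) URI tuples as N-Quads-style
    lines. Simplified: URI nodes only, no literals or blank nodes."""
    return "".join(f"<{s}> <{p}> <{o}> <{g}> .\n" for s, p, o, g in quads)

def restore_quads(text):
    """Read the simplified N-Quads text back into a list of URI tuples.
    Assumes no URI ends in '.' and none contains whitespace."""
    quads = []
    for line in text.splitlines():
        line = line.strip()
        if not line or line.startswith("#"):
            continue  # skip blank lines and comments
        terms = line.rstrip(" .").split()
        quads.append(tuple(t.strip("<>") for t in terms))
    return quads
```

Because each line carries its graph URI, a restore can recreate the original set of named graphs exactly, which is what backing up data such as class groups would need.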

Anchor: Editing
Editing

...

  • (thumbs down) Limiting SPARQL queries by named graph, either via inclusion or exclusion. 
    • This is allegedly supported by the Virtuoso triple store. It would help ensure that private or semi-private data in a VIVO is not exposed via a SPARQL endpoint
    • If this functionality is dependent on the underlying triple store chosen for VIVO, it's not something that can easily be managed in VIVO
  • There are other possible routes for extracting data from VIVO including linked data requests – if private data is included in a VIVO, all query and export paths would also have to be locked down. Linked data requests respect the visibility level settings set on properties to govern public display, but separate more restrictive controls may be required for linked data.
    • (Eliza) can't
  • (question) Being able to get the linked data for a page as N3, Turtle, or JSON LD, not just RDF/XML
    • would be trivial – N3 and RDF/XML are now supported; Turtle is just a subset of N3 and we're not using any of the non-Turtle features of N3
    • VIVO doesn't currently support N-Triples, another subset of N3 that lists each triple on a separate line
    • not sure about JSON-LD – raised by Eric Meeks
      • Ted – there's some indication that a JSON-LD serializer for Jena is being written 
  • Enhancing the internal VIVO SPARQL interface to support add and delete functions, not just select and construct queries – see "Web Service for the RDF API" above
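One store-independent way to approach the named-graph limiting discussed above is to rewrite incoming SELECT queries so they can only draw from an allow-list of graphs, by inserting FROM clauses. A rough Python sketch; it assumes a simple query with a single uppercase WHERE keyword, where a real implementation would use a SPARQL parser.

```python
def restrict_query_dataset(select_query, allowed_graphs):
    """Insert FROM clauses so the query sees only the allowed named graphs.
    Assumes one uppercase WHERE keyword; real code would parse the query."""
    froms = "".join(f"FROM <{g}>\n" for g in allowed_graphs)
    head, where, body = select_query.partition("WHERE")
    if not where:
        raise ValueError("expected a WHERE clause")
    return f"{head}\n{froms}{where}{body}"
```

For example, `restrict_query_dataset("SELECT ?s WHERE { ?s ?p ?o }", ["http://example.org/public"])` yields a query whose default graph is built only from the public graph, regardless of what else the store holds.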
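For the serialization-formats bullet above, the standard mechanism is HTTP content negotiation: a linked-data client names the format it wants in the Accept header. A small Python sketch; the individual URI is hypothetical, and the media types are the registered ones for each format.

```python
import urllib.request

# Registered media types for common RDF serializations.
RDF_MEDIA_TYPES = {
    "rdfxml": "application/rdf+xml",
    "n3": "text/n3",
    "turtle": "text/turtle",
    "jsonld": "application/ld+json",
}

def linked_data_request(individual_uri, fmt="turtle"):
    """Build a GET request for an individual's linked data, asking for a
    specific serialization via the Accept header."""
    req = urllib.request.Request(individual_uri)
    req.add_header("Accept", RDF_MEDIA_TYPES[fmt])
    return req
```

The server would then pick its serializer based on the Accept header and fall back to RDF/XML when it cannot honor the request.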

...