Ben Abrahamse, Chair of the Standing Committee on Applications, has agreed to convene and coordinate this subgroup activity.  The API review/testing group will be a collaboration between interested members of SCA and any of the ISNI pilot participants.  A technical contact from ISNI will also participate as needed. 

Members:  Alison Thornton, Amber Billey, Christine Fernsebner Eslao, Isabel Quintana, Lucas Mak, Susan Radovsky, Vitus Tang

The group began its work in October 2017 and anticipates completing its work by the first week in February.

Activities and deliverables include:

    • Review ISNI API documentation to learn functionality and gaps. For members there are two ISNI APIs:
      1. SRU Search API for members
      2. AtomPub API, which allows members and RAGs to submit ISNI requests using the ISNI Request schema
    • Test functionality according to varied use cases
    • Determine ease of use and identify support or additional documentation that would be necessary for PCC members availing themselves of this self-service model
    • Identify and prioritize a development roadmap for ISNI to maximize the benefit of API interaction with ISNI
    • Document lessons learned and activities that require attention by others

API testing use cases:

HARVARD: HUP authors

MICHIGAN STATE: ETD authors

STANFORD: video game developers

COLUMBIA: department names


If you want to help test but don't have a project, contact other members of this group to see if you can help out.


Google Folder: If you use Google drive to collaborate on documents or files, please create them in this folder: https://drive.google.com/drive/folders/1dQzBnR9nCkOqVB8Lm0Nf1jDVFDUrx1IX?usp=sharing 

If you have already created files or documents in google drive, please share them with Michael Beckett and I will add access from this folder (access from other folders/accounts will be unaffected).

Relevant ISNI API documentation


ISNI documentation: http://www.isni.org/content/documents-related-database-enquiry

Suggested starting point: ISNI SRU Search API: Guidelines and Examples

The Search API uses the SRU standard, version 1.1, and is also available to the general public.

An interface for constructing complex queries is provided at: http://isni.oclc.org/sru/DB=1.2/
(If using a web browser to view results, view the page source to see the XML output. In most browsers, right-click the page and choose "View page source" from the context menu.)
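
For testers who prefer a script to the browser interface, here is a minimal Python sketch of a single searchRetrieve call. The endpoint and parameter names are copied from the example URLs in the guidelines document and should be verified against the current ISNI documentation; this is a sketch, not an ISNI-provided tool.

# Minimal sketch of an SRU searchRetrieve query against the ISNI Search API.
# Endpoint and parameter names are taken from the example URLs in the
# guidelines; confirm them against the current ISNI documentation before use.
import urllib.parse
import urllib.request

SRU_ENDPOINT = "http://isni.oclc.nl/sru/"  # assumed from the documented examples

def search_isni(name_keywords):
    """Run a pica.nw (name keyword) search and return the raw XML response."""
    params = {
        "query": f'pica.nw = "{name_keywords}"',
        "operation": "searchRetrieve",
        "recordSchema": "isni-b",
    }
    url = SRU_ENDPOINT + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as response:
        return response.read().decode("utf-8")

if __name__ == "__main__":
    print(search_isni("maloy rebecca")[:1000])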

Submission

ISNI documentation: 

http://www.isni.org/content/documents-related-data-submission

http://www.isni.org/content/documents-related-data-submission-output

Suggested starting point: ISNI Atom Pub API guidelines.pdf

The AtomPub API accepts one request at a time, but it can be scripted at the local level to submit sequential requests, effectively creating a batch process.
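
As a rough illustration (not an ISNI-provided tool), such a local loop might look like the Python sketch below. It assumes the request XML documents have already been generated and validate against the ISNI Request schema, that the submitting machine's IP address has been whitelisted by ISNI, and that the endpoint is the one noted in the meeting notes further down this page.

# Sketch of a local "batch" loop: POST a folder of pre-built ISNI request
# documents to the AtomPub endpoint one at a time. Assumes the documents
# already validate against the ISNI Request schema and that this machine's
# IP address has been whitelisted by ISNI.
import pathlib
import urllib.request

ATOMPUB_ENDPOINT = "http://isni.oclc.org:8080/isni/atompub/"  # URL noted in the meeting notes below

def submit_request(xml_path):
    """POST one request document and return the raw XML response."""
    req = urllib.request.Request(
        ATOMPUB_ENDPOINT,
        data=xml_path.read_bytes(),
        headers={"Content-Type": "application/xml"},
        method="POST",
    )
    with urllib.request.urlopen(req) as response:
        return response.read().decode("utf-8")

if __name__ == "__main__":
    # "requests/" is a hypothetical local folder holding the request XML files
    for path in sorted(pathlib.Path("requests").glob("*.xml")):
        print(path.name)
        print(submit_request(path))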

Working documents



Meeting notes 2/23/18

-- Testing issues

 AtomPub issues

Insufficient error messages: "Data not accepted" is not a useful response

"Requester identifier of resource" required–local identifier?

-- Please add links to documents coming out of the test to Duraspace

-- Is there a set of parameters we can give the API that will return results in the same sequence/order as the web page?

-- How useful is the search API, given its obvious limitations? Should we just recommend avoiding its use?

-- "Set identifier" – do identified sets persists, can they be combined into more complex searches?

-- Questions about creating a request: what is the element "creation class" used for? Should it use a controlled vocabulary? What does the attribute "src" refer to (i.e., source of data vs. source of vocabulary)? There is a Word document on the ISNI website with a list of roles that can be used.

-- Is there a comprehensive list of all available "MARC-esque" tags in Pica? See http://www.isni.org/content/documents-related-data-submission

-- When submitting XML, the headers are standard. Examples of valid XML would be useful. See http://isni.oclc.org:8080/isni/atompub/


Meeting notes 3/23/18

Progress reports:

-- Vitus: tested individual records via API; testing who can write a batch of submissions

-- Lucas: wants to create a script to convert MARCXML (MARC bibliographic) into the submission format for batch submission

-- Christine: hasn't done much testing; sent an email to OCLC to deal with this issue; volunteering to relay problems and be a point of contact; working on Python scripting to transform spreadsheet data into a batch submission

-- Brian: has a good handle on name creation; wants to streamline the workflow on related identifiers; interested in scripts working from a spreadsheet

-- Batch submissions: does ISNI have preferred methods for batch submission? What is the relationship of the mapping group to our group? Are there scripts, tools, etc. for batch submission? Possibility of ISNI meeting with our group?

-- Email a 1-2 page summary of testing and results, questions, best practices, etc. to: babraham@mit.edu


Testing notes from VT (Stanford)

 

I have tested the ISNI SRU Search API and have these observations:

 

  • The documentation is in a Word document, titled “ISNI SRU Search API: Guidelines and Examples”. It has rather limited information.

 

  • The first example (search by name keyword) doesn’t work:

http://isni.oclc.nl/sru/?query=pica.nw+%3D+%22maloy%2Brebecca%22&operation=searchRetrieve&recordSchema=isni-b

Zero records are retrieved. The search works if "%2B" is replaced with either "%20" or "+".
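
The difference is one of URL encoding: %2B is an encoded literal plus sign, while the API expects the name keywords to be separated by a space (sent as %20 or as "+"). A small Python illustration:

# Illustration of the encoding issue: %2B is an encoded literal "+"
# character, whereas the API expects the name keywords to be separated
# by a space (sent as %20 or as "+").
from urllib.parse import quote, quote_plus

print(quote('"maloy rebecca"'))       # %22maloy%20rebecca%22 -> retrieves records
print(quote_plus('"maloy rebecca"'))  # %22maloy+rebecca%22   -> retrieves records
print(quote_plus('"maloy+rebecca"'))  # %22maloy%2Brebecca%22 -> zero records, as in the broken example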

 

  • There is no information about how to specify a sort order for the returned records. Other members of the API group have noted that the order of the returned records from the API is different from the order of the records one gets when using the web interface. The documentation should include this type of information.

 

I have also tested the ISNI AtomPub API, which is used to submit a request for ISNI assignment. My test cases were:

 

  • Add profile link and institution affiliation info for Stanford faculty
  • Request ISNI for video game designers

 

The documentation (Word file with title “ISNI Atom Pub API : Guidelines, Examples and Assignment Rules”) is much more extensive (75 pages) compared to the Search API documentation. However, acquiring an understanding of how this API works from this documentation is not an easy task. Several readings are required, supplemented by hands-on testing. Other API subgroup members and I encountered these stumbling blocks:

 

  • Access to the API is through whitelisting of IP addresses, which is rather awkward as people often use laptops that don’t have a constant IP address. An API key would be more convenient, and would probably require less time to set up.
  • The presumably simple task of finding the URL of the Request schema, so that it can be plugged into your XML editor to create request documents that meet the API’s requirements, turned out to be not so simple. It took quite a bit of digging in ISNI’s documentation area to find it.
  • Cryptic messages in the API’s response. For example, a message like this:

 

<responseRecord>
 <noISNI> 
   <reason>no match initial database</reason>
 </noISNI>
 <information>
   data not accepted 
 </information>

 

                leaves one wondering what happened and what one has to do in order to have a successful request. The result: a lot of digging in documentation and head scratching.
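
When testing many requests, it may help to pull these diagnostic fields out programmatically. A minimal sketch, assuming the un-namespaced <responseRecord> structure shown above:

# Minimal sketch for surfacing the diagnostic fields from an AtomPub
# response, assuming the un-namespaced <responseRecord> structure above.
import xml.etree.ElementTree as ET

def summarize_response(xml_text):
    root = ET.fromstring(xml_text)
    reason = root.findtext(".//noISNI/reason", default="").strip()
    info = root.findtext(".//information", default="").strip()
    if reason or info:
        return f"not assigned: {reason or 'no reason given'} ({info or 'no information'})"
    return "no <noISNI> block found; inspect the full response XML"

example = """<responseRecord>
 <noISNI>
   <reason>no match initial database</reason>
 </noISNI>
 <information>data not accepted</information>
</responseRecord>"""

print(summarize_response(example))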

 

What I haven’t been able to test yet is how to make these APIs do things in a batch way, which is what I think many, if not most, PCC libraries will be turning to these APIs for. The reason is that I don’t have enough programming skills to set up batch processes with these APIs, for example to search a list of names in a batch or to submit a list of names for ISNI assignment in a batch. I wish ISNI would provide more support for that, for example a document that outlines what one has to do to set up a batch processing infrastructure, or better yet scripts (or script templates) that one can copy and reuse.
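
In the absence of such templates, here is a rough, hedged sketch of what a batch name lookup could look like in Python. The endpoint, the parameters, and the input file names.txt (one name per line) are assumptions to adapt locally, not anything provided by ISNI; the same looping pattern, combined with the AtomPub sketch earlier on this page, would cover batch submission for assignment.

# Rough sketch of a batch name search: read names (one per line) from a
# local file and run an SRU name-keyword query for each, pausing between
# calls. The endpoint, parameters, and "names.txt" are assumptions to
# adapt, not anything provided by ISNI.
import time
import urllib.parse
import urllib.request

SRU_ENDPOINT = "http://isni.oclc.nl/sru/"  # verify against current ISNI documentation

def search_name(name):
    params = {
        "query": f'pica.nw = "{name}"',
        "operation": "searchRetrieve",
        "recordSchema": "isni-b",
    }
    url = SRU_ENDPOINT + "?" + urllib.parse.urlencode(params)
    with urllib.request.urlopen(url) as response:
        return response.read().decode("utf-8")

if __name__ == "__main__":
    with open("names.txt", encoding="utf-8") as names:
        for line in names:
            name = line.strip()
            if not name:
                continue
            xml = search_name(name)
            print(f"{name}: {len(xml)} bytes of XML returned")
            time.sleep(1)  # be gentle with the service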

https://drive.google.com/drive/folders/11RnMibXd0fCAujmqDKAUjnF1CpBUFx-a?usp=sharing
