
Process Orchestration with the Set and Job APIs

Josh Weisman on June 12th, 2017

Over the past year, Alma's job and set APIs have been developed to the point where many process orchestration scenarios are now possible. According to Wikipedia, “Orchestration is the automated arrangement, coordination, and management of computer systems, middleware and services.”

Alma performs bulk work on sets of records, such as bibliographic records, users, and items. The work to be performed is defined in jobs of various types (bibliographic record export, remote storage, metadata import, etc.). Alma facilitates orchestration workflows with APIs to manage jobs and sets. Key APIs include:

  • Create, update and delete sets, both itemized and logical
  • Manage set members
  • Retrieve job and job instance details
  • Submit a job

In this blog post, we’ll walk through a common orchestration scenario by creating a bash shell script which performs the following steps:

  • Create a new logical set
  • Submit a job to export the members of the set
  • Monitor the job until it’s complete
  • Download the exported file(s)
  • Delete the set

The natural continuation of this scenario might include editing the records and submitting them to a metadata import job.

Create a new logical set

We use the create set API to create our new set. Alma supports creating both itemized and logical sets. Itemized sets allow you to specify the member records manually. Logical sets allow you to define a query which is executed at run time; any records that match the query are included in the set.

To help specify the logical query, the Developer Network includes comprehensive documentation on all available fields and corresponding operators. 

[Screenshot: Process Automation - Logical Query Reference]

For our example, we’ll create a set of bibliographic records published in 2000 or later whose keywords include “Binghamton”. We can use the query documentation, or we can build the set in the Alma user interface and then retrieve the corresponding XML with the set details API. For our scenario, the set XML looks like this:

<set>
  <name>Recent Binghamton Material</name>
  <description></description>
  <type desc="Logical">LOGICAL</type>
  <content desc="All Titles">BIB_MMS</content>
  <private desc="Yes">true</private>
  <status desc="Active">ACTIVE</status>
  <query>BIB_MMS where BIB_MMS (all CONTAIN "Binghamton" AND main_pub_date GREATER_EQUAL "2000")
  </query>
</set>
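
If we’re scripting this step, the XML above can be POSTed to the create set API with curl. Here’s a minimal sketch, assuming the XML is saved in set.xml, the API key is in $API_KEY, and the response carries the new set’s ID in an <id> element:

echo "Creating new set"
# POST the set definition; xmllint pulls the new set's ID out of the response
SET_ID=$(curl -s -X POST \
  "https://api-na.hosted.exlibrisgroup.com/almaws/v1/conf/sets?apikey=$API_KEY" \
  -H "Content-Type: application/xml" -d @set.xml \
  | xmllint --xpath "string(/set/id)" -)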

Submit the job

After the set is created, we want to submit a job which will export all the records in the set. The submit a job API requires that we include all of the job's parameters in the payload. Alma provides a shortcut to help compose the payload. In the Alma UI, use the “Run a job” function to select a job and fill out the parameters. In the confirmation screen, expand the API information section. The URL and job payload can be used as a base for our script. We just need to update the set ID parameter with the ID of the set we created in the previous step.
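
In script form, this step might look like the sketch below. The job ID M26713650000011 matches our sample output; job-template.xml is our own name for the payload copied from the Alma UI, with a {SET_ID} placeholder (our convention) marking where the set ID belongs:

echo "Setting the job payload to use set $SET_ID"
# Swap our new set's ID into the payload copied from the Alma UI
sed "s/{SET_ID}/$SET_ID/" job-template.xml > job.xml

echo "Submitting the export job"
RESPONSE=$(curl -s -X POST \
  "https://api-na.hosted.exlibrisgroup.com/almaws/v1/conf/jobs/M26713650000011?op=run&apikey=$API_KEY" \
  -H "Content-Type: application/xml" -d @job.xml)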

[Screenshot: Process Automation - Run Job API]

Monitor the job instance & download the file

When a job is submitted, Alma queues the execution. The submit job API returns an "additional_info" node whose link attribute points to the job instance created by our API call. To know when the job is complete, we can poll the job instance details API, looking for a status that starts with COMPLETED (such as COMPLETED_SUCCESS).
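
A polling loop for this might look like the following sketch, assuming $RESPONSE holds the submit response from the previous step and the instance XML exposes status and progress elements; the five-second interval is arbitrary:

# The link attribute on additional_info points at our job instance
INSTANCE_URL=$(echo "$RESPONSE" | xmllint --xpath "string(//additional_info/@link)" -)
echo "Checking the job status at $INSTANCE_URL"
while true; do
  INSTANCE=$(curl -s "$INSTANCE_URL?apikey=$API_KEY")
  STATUS=$(echo "$INSTANCE" | xmllint --xpath "string(//status)" -)
  PROGRESS=$(echo "$INSTANCE" | xmllint --xpath "string(//progress)" -)
  echo "Job progress: $PROGRESS; Job status: $STATUS"
  case "$STATUS" in COMPLETED*) break ;; esac
  sleep 5
done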

When the job has completed successfully, we want to extract the file name of the exported file. The job instance returns this information in a counter node with a type of "c.jobs.bibExport.link".
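
With xmllint, that extraction might look like this sketch, assuming each counter carries type and value child elements and $INSTANCE still holds the instance XML from the polling loop:

# Pull the exported file name out of the bibExport counter
FILE_ID=$(echo "$INSTANCE" | xmllint --xpath \
  "string(//counter[type='c.jobs.bibExport.link']/value)" -)
echo "Downloading the files for $FILE_ID"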

We can then use an FTP client such as lftp to log in to the SFTP server and download the specified file(s):

lftp -c "connect sftp://$FTP_USER:$FTP_PASS@my.ftp.com/export; mget *$FILE_ID*"

Cleaning up and putting it all together

It’s always good to clean up after ourselves. In this case, we can delete the set that we created using the delete set API.
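
Here’s a sketch of the cleanup call, reusing the $SET_ID we captured when the set was created:

echo "Deleting set $SET_ID"
# Deleting the set removes the set definition, not the records it contained
curl -s -X DELETE \
  "https://api-na.hosted.exlibrisgroup.com/almaws/v1/conf/sets/$SET_ID?apikey=$API_KEY"
echo "Done"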

Putting it all together, the output of our script looks like this:

$ API_KEY=l7xxxxxx FTP_USER=user FTP_PASS=pass ./export-set.sh 
Creating new set
Setting the job payload to use set 296000000000561
Submitting the export job
Checking the job status at https://api-na.hosted.exlibrisgroup.com/almaws/v1/conf/jobs/M26713650000011/instances/2960000000561
Job progress: 100; Job status: COMPLETED_SUCCESS
Downloading the files for BIBLIOGRAPHIC_296020000000561_1.xml
Deleting set 29600000000561
Done

The job and set APIs in Alma enable a wide range of orchestration scenarios.

The entire script is available in this GitHub Gist.
