Tech Blog

Process Orchestration with the Set and Job APIs

Over the past year, Alma’s job and set APIs have been developed to the point where many process orchestration scenarios are now possible. According to Wikipedia, “Orchestration is the automated arrangement, coordination, and management of computer systems, middleware and services.”

Alma performs bulk work on sets of records, such as bibliographic records, users, and items. The work to be performed is defined in jobs of various types (bibliographic record export, remote storage, metadata import, etc.). Alma facilitates orchestration workflows with APIs to manage jobs and sets. Key APIs include:

  • Create, update and delete sets, both itemized and logical
  • Manage set members
  • Retrieve job and job instance details
  • Submit a job

In this blog post, we’ll walk through a common orchestration scenario by creating a bash shell script which performs the following steps:

  • Create a new logical set
  • Submit a job to export the members of the set
  • Monitor the job until it’s complete
  • Download the exported file(s)
  • Delete the set

The natural continuation of this scenario might include editing the records and submitting them to a metadata import job.

Create a new logical set

We use the create set API to create our new set. Alma supports creating both itemized and logical sets. Itemized sets let you specify the member records manually. Logical sets let you define a query which is executed at run time; any records that match the query are included in the set.

To help specify the logical query, the Developer Network includes comprehensive documentation on all available fields and corresponding operators.

For our example, we’ll create a set of bibliographic records which were published later than the year 2000 and where the keywords include “Binghamton”. We can use the query documentation, or we can build the set using the Alma user interface and then retrieve the corresponding XML using the set details API. For our scenario, the set XML looks like this:

  <set>
    <name>Recent Binghamton Material</name>
    <type desc="Logical">LOGICAL</type>
    <content desc="All Titles">BIB_MMS</content>
    <private desc="Yes">true</private>
    <status desc="Active">ACTIVE</status>
    <query>BIB_MMS where BIB_MMS (all CONTAIN "Binghamton" AND main_pub_date GREATER_EQUAL "2000")</query>
  </set>
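The create step can be sketched in bash. This is a minimal sketch, assuming the set XML above is saved as set.xml and the North America API gateway (`BASE` is an assumption; adjust it for your region):

```shell
# Assumed base URL for the Alma API gateway (North America region).
BASE="${BASE:-https://api-na.hosted.exlibrisgroup.com/almaws/v1}"

# POST the set XML to the create set API; API_KEY comes from the environment.
create_set() {
  curl -s -X POST "$BASE/conf/sets?apikey=$API_KEY" \
       -H "Content-Type: application/xml" \
       -d @set.xml
}

# Pull the new set's ID out of the response XML, e.g. <id>296000000000561</id>.
extract_set_id() {
  sed -n 's/.*<id>\([0-9]*\)<\/id>.*/\1/p' | head -1
}

# Usage (network call commented out in this sketch):
# SET_ID=$(create_set | extract_set_id)
```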

Submit the job

After the set is created, we want to submit a job which will export all the records in the set. The submit a job API requires that we include all of the job’s parameters in the payload. Alma provides a shortcut to help compose the payload. In the Alma UI, use the “Run a job” function to select a job and fill out the parameters. In the confirmation screen, expand the API information section. The URL and job payload can be used as a base for our script. We just need to update the set ID parameter with the ID of the set we created in the previous step.
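The submit step might look like the sketch below. The job ID and payload come from the API information section of the confirmation screen; `JOB_ID` here is a hypothetical value, and `SET_ID_PLACEHOLDER` is a token of our own that we put into the saved payload where the set ID belongs:

```shell
# Assumed base URL for the Alma API gateway (North America region).
BASE="${BASE:-https://api-na.hosted.exlibrisgroup.com/almaws/v1}"
JOB_ID="M26714670000011"  # hypothetical job ID; copy yours from the Alma UI

# Swap our placeholder token for the real set ID in the saved payload.
fill_set_id() {  # usage: fill_set_id <set_id> < job_template.xml > job.xml
  sed "s/SET_ID_PLACEHOLDER/$1/"
}

# Submit the job with the completed payload.
submit_job() {
  curl -s -X POST "$BASE/conf/jobs/$JOB_ID?op=run&apikey=$API_KEY" \
       -H "Content-Type: application/xml" \
       -d @job.xml
}
```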

Monitor the job instance & download the file

When a job is submitted, Alma queues the execution. The submit job API returns an “additional_info” node with a link attribute, which points to the job instance created by our API call. To know when the job is complete, we can poll using the job instance details API, looking for a status that begins with COMPLETED (for example, COMPLETED_SUCCESS).
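The polling step can be sketched with a couple of small sed helpers, assuming the standard Alma XML responses described above:

```shell
# Pull the job instance link out of the submit response's additional_info node.
extract_instance_link() {
  sed -n 's/.*<additional_info link="\([^"]*\)".*/\1/p'
}

# Pull the status value (e.g. COMPLETED_SUCCESS) out of the instance XML.
extract_job_status() {
  sed -n 's/.*<status[^>]*>\([A-Z_]*\)<\/status>.*/\1/p' | head -1
}

# Poll the instance link every 10 seconds until the status starts with COMPLETED.
poll_job() {  # usage: poll_job <instance_link>
  while :; do
    STATUS=$(curl -s "$1?apikey=$API_KEY" | extract_job_status)
    case "$STATUS" in
      COMPLETED*) echo "$STATUS"; return ;;
    esac
    sleep 10
  done
}
```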

When the job completes successfully, we want to extract the name of the exported file. The job instance reports this in the value of a counter node; the counter type to look for appears in the job instance XML.
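A sketch of that extraction, assuming the exported file name (e.g. BIBLIOGRAPHIC_296020000000561_1.xml, as in the script output later in this post) appears in a counter’s value node; match on whatever counter type your job instance XML actually shows:

```shell
# Grab the first value that looks like an exported-records file name.
extract_file_name() {
  sed -n 's/.*<value>\(BIBLIOGRAPHIC_[^<]*\)<\/value>.*/\1/p' | head -1
}
```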

We can then use a command-line client such as lftp to log in to the SFTP server and download the specified file(s) (here $FTP_SERVER stands in for your institution’s SFTP host):

lftp -c "connect sftp://$FTP_USER:$FTP_PASS@$FTP_SERVER; cd /export; mget *$FILE_ID*"

Cleaning up and putting it all together

It’s always good to clean up after ourselves. In this case, we can delete the set that we created using the delete set API.
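The cleanup step is a single DELETE call. A sketch, again assuming the North America gateway base URL:

```shell
# Assumed base URL for the Alma API gateway (North America region).
BASE="${BASE:-https://api-na.hosted.exlibrisgroup.com/almaws/v1}"

# Remove the set we created via the delete set API.
delete_set() {  # usage: delete_set <set_id>
  curl -s -X DELETE "$BASE/conf/sets/$1?apikey=$API_KEY"
}
```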

Putting it all together, the output of our script looks like this:

$ API_KEY=l7xxxxxx FTP_USER=user FTP_PASS=pass ./ 
Creating new set
Setting the job payload to use set 296000000000561
Submitting the export job
Checking the job status at
Job progress: 100; Job status: COMPLETED_SUCCESS
Downloading the files for BIBLIOGRAPHIC_296020000000561_1.xml
Deleting set 296000000000561

The job and set APIs in Alma allow any number of orchestration scenarios.

The entire script is available in this GitHub Gist.


4 Replies to “Process Orchestration with the Set and Job APIs”

  1. We have extensively used the techniques outlined in this article with excellent results. We had a case recently where the creation of a set resulted in 0 records (an empty set). The job status was COMPLETED_SUCCESS, so our process continued with the next step, updating holdings using normalization rules. For some reason this step tried to update 3,760,728 records. It aborted at some point, and we can’t tell which records were incorrectly updated. We’ve found at least 20,770 examples but don’t know how many others were updated incorrectly, specifically deleting call number subfield i. The situation is set out in more detail in case number 00701617. We would like Ex Libris’ help in resolving this situation.


  2. I have tested further and realized that the create set API doesn’t return a job status for logical sets. The set is created successfully with 0 members, as I’ve confirmed in testing again. I will change the code to check the number of members. This isn’t an API problem per se, but I can see that I should exit the script if the created set has 0 members. It does seem like an Alma issue, though, that a job run against such a set would actually update records.

Leave a Reply