How to execute Jenkins jobs programmatically, part 2

In the previous post we saw how to trigger a Jenkins job programmatically, with a very simple Groovy snippet to illustrate how to do it. In this post we'll revisit the same topic, but with a Bash script. As you'll see, this script is more sophisticated: it not only runs the job, but also waits for the job to complete and retrieves a hypothetical file hosted as an artifact.

Have a look at the comments for better understanding.

#!/usr/bin/env bash

jenkins_server='https://www.mysuperduperjenkins.com'

# This part is much the same as the method shown in the previous post to retrieve a breadcrumb from Jenkins. Here we use a cookie jar,
# which is useful if you don't need to log in to access Jenkins. If you require login, please use the method from the previous post.
cookie_jar="$(mktemp)"
breadcrumb=$(curl -s --cookie-jar "$cookie_jar" "$jenkins_server/crumbIssuer/api/xml?xpath=concat(//crumbRequestField,%22:%22,//crumb)")
echo "[INFO] Got Breadcrumb: $breadcrumb"

# Launch the job with the obtained breadcrumb and cookie. In this example, we show how to launch a parameterized job:
queued_job_headers="$(curl -s -H "$breadcrumb" -X POST --cookie "$cookie_jar" \
--data 'foo1=bar1' \
--data 'foo2=bar2' \
--data 'foo3=bar3' \
-D - "$jenkins_server/job/mySuperDuperJob/buildWithParameters")"

# When you trigger a job, Jenkins returns the queued job URL in a header named "Location", so we pick it and use this URL
# to monitor the queue status. When the job starts executing, ".executable.url" will be populated and we can continue.
queue_url=$(echo "$queued_job_headers" | grep -i '^Location:' | cut -d' ' -f2 | tr -d '\r')
job_url='null'
while [ "$job_url" == 'null' ]
do
    echo "[INFO] Polling ${queue_url}api/json to retrieve executed job..."
    sleep 3
    job_url="$(curl -s "${queue_url}api/json" | jq -r .executable.url)"
done

# The job is now executing and we have its URL, so we wait for it to finish, again with the API. This is done
# by checking the ".result" field in the returned JSON.
echo "[INFO] Triggered job: $job_url"
result='null'
while [ "$result" == 'null' ]
do
    echo "[INFO] Polling ${job_url}api/json to check whether it's finished..."
    sleep 30
    result="$(curl -s "${job_url}api/json" | jq -r .result)"
done

# The job has finished, so we check the status, which can be SUCCESS, FAILURE, UNSTABLE, ABORTED...
# We can do different things depending on each case.
if [ "$result" != 'SUCCESS' ]
then
    echo "[ERROR] Job finished with status $result, please check: ${job_url}console" 1>&2
    exit 1
fi

# As a last step, we retrieve an artifact called foo.txt from the job execution URL.
# At this point, the job can only be in the SUCCESS state.
id="$(curl -s "${job_url}api/json" | jq -r .id)"
curl -sf "${job_url}artifact/foo.txt" -o "foo_$id.txt"
echo "[INFO] Created foo_$id.txt:"
cat "foo_$id.txt"

# Ta-da!!
echo "[INFO] Finished SUCCESSFULLY!"
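If the "Location" header step looks obscure, here is a small standalone sketch of that same extraction, run against a hypothetical raw response (the queue URL and item number are made up for illustration):

```shell
# Hypothetical response headers, as curl -D - would print them (note the CRLF line endings).
headers=$'HTTP/1.1 201 Created\r\nLocation: https://www.mysuperduperjenkins.com/queue/item/42/\r\n\r\n'

# Same pipeline as in the script: pick the Location value and drop the trailing CR.
queue_url=$(echo "$headers" | grep -i '^Location:' | cut -d' ' -f2 | tr -d '\r')
echo "$queue_url"
```

Without the `tr -d '\r'` the variable would keep an invisible carriage return, and every URL built from it afterwards would be broken.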

I hope this snippet is useful to you as well!
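One last note: the two polling loops work because `jq -r` prints the literal string `null` while the field they ask for is still absent. Here is a minimal sketch of that pattern with the Jenkins call replaced by a hypothetical stub (`fake_jenkins_poll` is made up, and the `sleep` is dropped to keep the demo instant); the stub reports `null` twice before yielding a URL, just as the queue API does while the job waits to start:

```shell
#!/usr/bin/env bash

attempts=0
# Stub standing in for: curl -s "${queue_url}api/json" | jq -r .executable.url
# It sets a global instead of printing, so the attempt counter survives
# (a command substitution would run it in a subshell and lose the increment).
fake_jenkins_poll() {
    attempts=$((attempts + 1))
    if [ "$attempts" -lt 3 ]; then
        poll_result='null'
    else
        poll_result='https://www.mysuperduperjenkins.com/job/mySuperDuperJob/123/'
    fi
}

job_url='null'
while [ "$job_url" = 'null' ]
do
    fake_jenkins_poll
    job_url="$poll_result"
done
echo "Job URL after $attempts polls: $job_url"
```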