Ansible playbook shell output

Capturing and Manipulating Shell Output in Ansible Playbooks

Learn how to effectively capture, process, and utilize shell command output within your Ansible playbooks using various modules and techniques.

Ansible is a powerful automation engine, but sometimes you need to interact with the underlying operating system directly using shell commands. Capturing the output of these commands and using it in subsequent tasks is a common requirement. This article will guide you through the best practices for executing shell commands, capturing their output, and manipulating that output for various automation scenarios.

Executing Shell Commands and Capturing Output

The primary ways to run shell commands in Ansible are the shell and command modules. The command module is safer because it does not pass the command through a shell, so variables, pipes, and redirection are not interpreted; shell is necessary when you need piping, redirection, or other shell-specific functionality. To capture the output of either module, you use the register keyword.

- name: Execute a shell command and capture output
  shell: 'ls -l /tmp'
  register: ls_output

- name: Print the captured output
  debug:
    var: ls_output.stdout_lines

Capturing and printing standard output lines from a shell command.
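
To illustrate the difference between the two modules, here is a minimal sketch: the first task is fine with the safer command module, while the second needs shell because of the pipe.

- name: A plain command works with the command module
  command: 'ls -l /tmp'
  register: ls_plain

- name: A pipe requires the shell module
  shell: 'ls -l /tmp | wc -l'
  register: tmp_entry_count

Contrasting the command and shell modules for the same directory listing.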

The register keyword stores the result of the task in a variable (e.g., ls_output). This variable is a dictionary containing several keys, including stdout, stderr, stdout_lines, stderr_lines, and rc (return code). For most text processing, stdout_lines (a list of lines) is more convenient than stdout (a single string).
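
These keys are what make the registered variable useful in later tasks. The sketch below (the /data mount point is purely illustrative) branches on rc and prints either the matching lines or the return code and error output:

- name: Look for a specific mount point
  shell: 'grep -w "/data" /proc/mounts'
  register: mount_check
  failed_when: false
  changed_when: false

- name: Report the match when the command succeeded
  debug:
    var: mount_check.stdout_lines
  when: mount_check.rc == 0

- name: Otherwise report the return code and any error output
  debug:
    msg: "rc={{ mount_check.rc }}, stderr={{ mount_check.stderr }}"
  when: mount_check.rc != 0

Branching on the return code of a registered shell command.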

Processing Output with sed and awk

When the raw shell output isn't in the desired format, you can leverage powerful text processing tools like sed and awk directly within your shell commands. This allows for on-the-fly manipulation before Ansible even registers the output.

- name: Extract specific lines using grep and sed
  shell: 'cat /etc/passwd | grep "^root" | sed -E "s/^([^:]+):x:([^:]+):.*/User: \1, UID: \2/"'
  register: user_info

- name: Print extracted user info
  debug:
    var: user_info.stdout_lines

Using grep and sed to extract and reformat specific user information from /etc/passwd.

flowchart TD
    A[Ansible Playbook Task] --> B{Execute Shell Command}
    B --> C[Raw Shell Output]
    C --> D[Pipe to grep]
    D --> E[Pipe to sed]
    E --> F[Processed Output]
    F --> G[Register Variable in Ansible]
    G --> H[Use Variable in Subsequent Tasks]

Workflow for processing shell output using grep and sed within an Ansible task.
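
awk is just as useful in this position when the goal is pulling out columns rather than rewriting text. As a small sketch, the task below lists account names in the regular-user UID range (the 1000-64999 bounds follow a common Linux convention and may differ on your distribution):

- name: List regular user accounts with awk
  shell: "awk -F: '$3 >= 1000 && $3 < 65000 {print $1}' /etc/passwd"
  register: regular_users

- name: Print the extracted user names
  debug:
    var: regular_users.stdout_lines

Using awk to extract the username column for accounts in the regular-user UID range.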

Advanced Output Manipulation with Jinja2 Filters

Once the output is captured in an Ansible variable, you can use Jinja2 filters to perform more complex manipulations directly within your playbook. This is often more readable and maintainable than complex sed/awk one-liners, especially for JSON or YAML output.

- name: Get disk usage as JSON
  shell: |
    # Single-quote the awk program so the shell does not expand $1..$5
    df -h --output=source,size,used,avail,pcent | tail -n +2 |
      awk '{printf "{\"filesystem\":\"%s\",\"size\":\"%s\",\"used\":\"%s\",\"available\":\"%s\",\"percent_used\":\"%s\"}\n", $1, $2, $3, $4, $5}' |
      jq -s .
  register: disk_usage_json
  args:
    executable: /bin/bash

- name: Parse JSON output and filter
  set_fact:
    high_usage_disks: "{{ disk_usage_json.stdout | from_json | selectattr('percent_used', 'match', '9[0-9]%|100%') | list }}"

- name: Print disks with high usage
  debug:
    var: high_usage_disks

Capturing df output, converting to JSON, and then filtering with Jinja2 to find high-usage disks.

In the example above, we first use awk and jq to transform the df command's output into a JSON array of objects. Ansible's from_json filter then converts that string into a list of dictionaries. Finally, the selectattr filter with the match test keeps only the disks whose percent_used is 90% or higher.
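
If you would rather avoid the awk and jq plumbing altogether, the same filtering can be done on plain df output with Jinja2 alone. The following is only a sketch of one possible approach (the variable names are arbitrary), but it keeps all of the parsing logic inside the playbook:

- name: Capture plain df output
  shell: 'df -h --output=source,pcent | tail -n +2'
  register: df_plain

- name: Collect filesystems at or above 90% usage using only Jinja2
  set_fact:
    high_usage_disks: "{{ (high_usage_disks | default([])) + [item.split() | first] }}"
  loop: "{{ df_plain.stdout_lines }}"
  when: (item.split() | last | replace('%', '') | int) >= 90

Filtering df output with Jinja2 string methods and filters instead of awk and jq.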