Using Cisco ACI as Inventory for Ansible Tower

May 11, 2020 by Zach Peterson

Managing a software-defined networking (SDN) solution with Ansible can be tricky. In most use cases, Ansible communicates with each managed node individually. In an SDN scenario, however, Ansible is most likely managing policy on a controller appliance, which may ultimately push changes to thousands of network endpoints behind it.

But what about these endpoints behind that controller abstraction? Wouldn't it be best if Ansible Tower had visibility to every node, in addition to the controller? Not to run playbooks directly against those nodes, but for the following reasons:

  • Many organizations need to perform capacity planning against current and projected license usage. Ansible Tower provides license usage data, but it's not accurate if Ansible Tower only knows about the SDN controller(s).
  • Networking teams are still asked for physical node information by other IT areas, even if the focus is on a software-defined fabric. What if the best system to get that data from is Ansible Tower? You may not want a CMDB probe (for example) to communicate directly with the SDN controller, or Ansible Tower may already be home to similar data about other systems.


Inventory plugin for Cisco ACI

I wrote an Ansible inventory plugin that solves these issues for Cisco ACI (Application Centric Infrastructure) implementations, as an example of a way you can develop inventory plugins yourself in situations like these. It queries physical inventory elements (spine switches, leaf switches, and APIC controllers), and saves some basic data about them as Ansible host variables.
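To make that concrete, here is a minimal sketch of the kind of grouping logic such a plugin performs. The attribute names mirror APIC's fabricNode objects (role, name, serial, model, version); the function name, group names, and sample records are illustrative, not the plugin's actual implementation.

```python
def group_fabric_nodes(fabric_nodes):
    """Bucket APIC fabricNode records into inventory groups by role,
    saving a few basic facts per node as host variables."""
    groups = {"spine": [], "leaf": [], "controller": []}
    hostvars = {}
    for node in fabric_nodes:
        role = node["role"]  # 'spine', 'leaf', or 'controller'
        if role not in groups:
            continue  # ignore unexpected roles
        groups[role].append(node["name"])
        hostvars[node["name"]] = {
            "serial": node.get("serial"),
            "model": node.get("model"),
            "version": node.get("version"),
        }
    return groups, hostvars

# Illustrative sample data, shaped like APIC fabricNode attributes:
nodes = [
    {"role": "controller", "name": "apic1", "serial": "FCH1",
     "model": "APIC-SERVER-L2", "version": "4.2(1j)"},
    {"role": "leaf", "name": "leaf101", "serial": "FDO2",
     "model": "N9K-C93180YC-EX", "version": "n9000-14.2(1j)"},
    {"role": "spine", "name": "spine201", "serial": "FDO3",
     "model": "N9K-C9336PQ", "version": "n9000-14.2(1j)"},
]
groups, hostvars = group_fabric_nodes(nodes)
print(groups["leaf"])  # ['leaf101']
```

The real plugin does this against a live APIC, but the shape of the result is the same: role-based groups plus per-host variables.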


Using the plugin with Ansible Tower

Consuming this in Ansible Tower is a multi-step process, involving the use of Collections. Let’s set it up using the Cisco DevNet ACI Sandbox as an example. Your Ansible Tower environment will need to be able to access Ansible Galaxy. You will also need to log in to DevNet and access the sandbox’s Topology page to get access credentials. 


Prepare your Project

Create a requirements file at collections/requirements.yml that looks like this:

collections:
  - zjpeterson.aci

Then, create a YAML inventory called aci.yml that looks like this:

plugin: zjpeterson.aci.aci_inventory
validate_certs: no

Commit these two files in a project using your SCM of choice, and add it as a Project in Ansible Tower.
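The resulting project repository contains just these two files (the repository name is arbitrary):

```
my-aci-project/
├── aci.yml
└── collections/
    └── requirements.yml
```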


Prepare your Credential

First, there needs to be a Credential Type in place that can output ACI_USERNAME and ACI_PASSWORD environment variables for the plugin to use. The collection contains a role that will do this for you, or you can do the following minimum configuration manually:

  • Go to Administration > Credential Types > New Credential Type
  • Name the new Type “Cisco ACI” (or something similar)
  • Input the following for “Input Configuration” (JSON)
 "fields": [
   "id": "username",
   "type": "string",
   "label": "APIC Username",
   "secret": false
   "id": "password",
   "type": "string",
   "label": "APIC Password",
   "secret": true
 "required": [
  • Input the following for “Injector Configuration” (JSON)
 "env": {
  "ACI_PASSWORD": "{{ password }}",
  "ACI_USERNAME": "{{ username }}"
  • Save
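Conceptually, the injector exports those two variables into the job environment, and the plugin reads them at sync time much like this (the function name here is illustrative, not the plugin's actual code):

```python
import os

def get_aci_credentials():
    """Read the APIC credentials that the Tower Credential injects
    into the environment; fail clearly if either is missing."""
    username = os.environ.get("ACI_USERNAME")
    password = os.environ.get("ACI_PASSWORD")
    if not username or not password:
        raise RuntimeError("ACI_USERNAME and ACI_PASSWORD must be set")
    return username, password

# Simulate what the Tower injector does before the sync runs:
os.environ["ACI_USERNAME"] = "admin"
os.environ["ACI_PASSWORD"] = "example-password"
print(get_aci_credentials()[0])  # admin
```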

Now, with the Credential Type in place, you can go to Resources > Credentials and make a new Credential. The end result will look like this:

[Screenshot: the completed Cisco ACI Credential]


Prepare your Ansible Tower Inventory

Create a new Ansible Tower Inventory under Resources > Inventories. Give your inventory a name and organization, then click Save. Go to the Sources tab, and create a new Source.

Do all of the following:

  • Give your Source a name
  • In the Source dropdown, choose “Sourced from a Project”
  • In the Project dropdown, choose the Project that you added earlier
  • Under Inventory File, enter aci.yml. You may or may not see it in the dropdown. If you don’t, you can just type it in.
  • Check the “Overwrite” and “Overwrite Variables” boxes in order to overwrite anything that may be changed or decommissioned upon sync.
  • Save

The result should look like this:

[Screenshot: the completed inventory Source]

Now, when you sync this source, the plugin will do its work. After a successful sync, you have an inventory that looks something like this:

[Screenshot: the inventory host list after a successful sync]

And, under each host, you have some hostvars like this:

[Screenshot: host variables for one of the synced hosts]


Wrapping up

With all of that setup complete, Ansible Tower has a full inventory of the physical switches behind the given APIC, along with some basic information about them stored as host variables.

Without any further intervention, the first problem from above should be solved. The presence of the additional nodes in Ansible Tower appropriately increments the node count shown on the Settings > License screen.

For the second problem, the hostvars data is available for querying via the Ansible Tower REST API. The Collection contains an example of how to do this, written in Python. Whatever technology you choose, here is a suggested approach for API calls to make against Ansible Tower:

  • GET api/v2/inventories/<inventory id>/hosts/
    One time, to get all the hosts in the inventory
  • GET api/v2/hosts/<host id>/variable_data/
    Once for each host found in the above call
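The Collection's example is written in Python; the sketch below shows the same two-call pattern with the HTTP layer abstracted into a callable, so the traversal logic is visible and testable. The function name and the stubbed API are illustrative; in practice `get_json` would wrap an authenticated HTTP client, and note that real Tower list endpoints are paginated via a `next` field, which this loop follows.

```python
def collect_hostvars(get_json, inventory_id):
    """Walk a Tower inventory and return {hostname: variable_data}.

    `get_json` is any callable that takes an API path and returns
    the decoded JSON response for it.
    """
    hostvars = {}
    path = f"api/v2/inventories/{inventory_id}/hosts/"
    while path:
        page = get_json(path)
        for host in page["results"]:
            # One variable_data call per host found in the list call
            hostvars[host["name"]] = get_json(
                f"api/v2/hosts/{host['id']}/variable_data/"
            )
        path = page.get("next")  # follow pagination, if any
    return hostvars

# Example run against a stubbed API (a dict standing in for Tower):
fake_api = {
    "api/v2/inventories/1/hosts/": {
        "results": [{"id": 10, "name": "leaf101"}],
        "next": None,
    },
    "api/v2/hosts/10/variable_data/": {"serial": "FDO2", "role": "leaf"},
}
print(collect_hostvars(fake_api.get, 1))
```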

This project is available on Ansible Galaxy as part of my ACI Ansible Collection, or you can refer to the upstream GitHub source repository. As a disclaimer, please keep in mind that this is community content, per the Ansible Certified Content FAQ: Collections published to Ansible Galaxy are the latest content published by the Ansible community and have no joint support claims associated.

If you're interested in using this, please have a look through the full plugin documentation for some additional usage details.

If you want more information on developing inventory plugins, you can check out the Ansible Developer Guide. I also recommend checking out a presentation from AnsibleFest 2019 called Managing Meaningful Inventories to put the concept in perspective.

Finally, I encourage you to take a look at a recent blog post from Cisco, "What's New and Exciting on Cisco ACI with Red Hat Ansible."



Zach Peterson

Zach Peterson is a Senior Consultant at Red Hat, focused on Ansible Network Automation. He is based in Minneapolis, MN. Zach came to Red Hat after several years at a Fortune 500 enterprise, and holds Red Hat and Cisco certifications.

