Deploying a VM to vCloud Director using the API and Python


Further to my other post about getting started with the vCloud Director API using Python, I thought I would write down how I deployed a VM to a vApp.

Once you have a login function working by obtaining the x-vcloud-authorization token, you need to use the recomposeVApp REST API call. As outlined in the documentation, this is done by appending the URI /vApp/{id}/action/recomposeVApp to your endpoint. Note: your endpoint is your API URL.

Here is my example code, which can be found on my GitHub account:

def recompose_vapp(self):

    # self.headers already carries the x-vcloud-authorization token; add the
    # content type the recomposeVApp call expects.
    post_headers = self.headers
    post_headers['Content-Type'] = 'application/vnd.vmware.vcloud.recomposeVAppParams+xml'

    xml = """<?xml version="1.0" encoding="UTF-8" standalone="yes"?>
              <RecomposeVAppParams xmlns="http://www.vmware.com/vcloud/v1.5">
                  <Description>api deployed vm</Description>
                  <SourcedItem sourceDelete="false">
                      <Source href="%s"/>
                  </SourcedItem>
              </RecomposeVAppParams>""" % (self.vapp_template)

    # Requires the requests library and xml.etree.ElementTree imported as ET.
    post = requests.post(self.vapp + '/action/recomposeVApp',
                         data=xml, headers=post_headers)
    result = ET.fromstring(post.text)

Firstly, you can see I am setting the post_headers value to self.headers; these headers include the x-vcloud-authorization token. You also need to add the content type, which is outlined in the documentation. It just tells the API what XML to expect, which in this case is recomposeVAppParams+xml.
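For reference, the full set of headers ends up looking something like the snippet below. The token value is just a placeholder, and the version in the Accept header depends on your vCloud Director release, so treat this as a sketch rather than a definitive list.

# Assembled earlier in the script after logging in; the token value is a placeholder.
post_headers = {
    'x-vcloud-authorization': '<token returned by the login call>',
    'Accept': 'application/*+xml;version=5.1',
    'Content-Type': 'application/vnd.vmware.vcloud.recomposeVAppParams+xml',
}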

Next, I have specified the XML body. I've cut down what I have specified and kept it simple; the only thing I have provided is the vapp_template value, which is something I obtained earlier in my API script.

Finally, I then post this request as shown again below:

post = requests.post(self.vapp + '/action/recomposeVApp', data=xml, headers=post_headers)

Here I am using the Python requests library and adding the required parameters, which include the self.vapp URL, again obtained earlier in my API script, with /action/recomposeVApp appended. I then provide the XML contained in the function and add the post_headers.

You need to look at the whole API script to understand how I have built up the post request, including how I have obtained the various requirements such as the vApp href and the vapp_template href.
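As a rough illustration of the kind of lookup involved, here is a sketch (not the exact code from my script) of how you might grab a vApp template href using the vCloud query service, assuming your API version supports it:

import requests
import xml.etree.ElementTree as ET

def get_vapp_template_href(endpoint, headers, template_name):
    # Ask the query service for vApp template records and match on name.
    resp = requests.get(endpoint + '/query?type=vAppTemplate&format=records',
                        headers=headers)
    root = ET.fromstring(resp.text)
    ns = '{http://www.vmware.com/vcloud/v1.5}'
    for record in root.findall(ns + 'VAppTemplateRecord'):
        if record.get('name') == template_name:
            return record.get('href')
    return None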

Please do post a comment if you would like more info or use the contact me link on my blog.

Happy scripting. 🙂


Installing Python and Django


Here are my notes on how to set up Python and Django.

UPDATE: These are old notes and I would definitely use virtualenv when setting up Python these days. I will try to do a blog post about this.


# Depending on whether you run as root, you may need to use sudo.


sudo yum install gcc tcl tk sqlite-devel readline-devel gdbm-devel -y
sudo yum install tkinter ncurses-devel libdbi-devel tk-devel zlib-devel -y
sudo yum install openssl-devel bzip2-devel -y
sudo yum install httpd httpd-devel -y
sudo yum install mysql mysql-devel -y
tar zxvf Python-2.7.3.tgz
cd Python-2.7.3
sudo ./configure --prefix=/opt/python2.7 --with-threads --enable-shared
sudo make install
ln -s /opt/python2.7/bin/python /usr/bin/python2.7
echo '/opt/python2.7/lib'>> /etc/
echo "alias python='/opt/python2.7/bin/python'" >> /etc/bashrc
echo "alias python2.7='/opt/python2.7/bin/python'" >> /etc/bashrc

***log out and log back in at this point. This is to ensure your bash profile is updated with the new python location***

sh setuptools-0.6c11-py2.7.egg
tar zxvf mod_wsgi-3.3.tar.gz
cd mod_wsgi-3.3
./configure --with-python=/opt/python2.7/bin/python
make install
curl | python
curl | python
pip install Django

Using the CloudStack API

I’ve recently been working on a project to enhance a testing strategy that involved revisiting the CloudStack API. I dove headfirst into the deep end, which made me realise just how much you can do with it. I’d like to share some of my experiences, in the hope of getting some feedback from people who have used CloudStack or CloudPlatform, or are thinking about it and want to find out more. After plunging into the depths of the API, this is what I discovered.

First, it’s good! Everything you can do in our compute platform can be done via the API and it’s actually quite simple once you get the hang of things. I wrote my test scripts in Python, but there are a few different languages you can use and I know some guys have dived in using PHP and .NET. There is also a Java library available from Jclouds so you can make your choice depending on your comfort zone.

Second, most of the API commands are asynchronous, depending on what you’re trying to do. This means you can call an API command and move on to the next one or, like me, wait for the async command response. This way you can do one task, wait for the response and grab certain criteria for the next command. You can then build up a fairly comprehensive set of commands. For example, you can deploy a virtual machine (VM) using the deployVirtualMachine command together with the following parameters:

  • Serviceofferingid: The serviceofferingid relates to what instance size you wish to use. You can get a list of available service offerings by using the listServiceOfferings command.
  • Templateid: The templateid refers to the ID of either the templates we offer to customers, or one you have preconfigured. These values can be obtained by using the listTemplates command.
  • Zoneid: The zoneid signifies the zone in which you deploy the VM. This is achieved by running the listZones command.

The good news is that once you have these values, you can use them over and over again. Or you can list all the template IDs and service offerings and pass a value from a collection if you want different size instances or want to use different templates.
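To make that concrete, here is a minimal sketch (in Python 2, to match the rest of these posts) of what a signed deployVirtualMachine call might look like. The endpoint, keys and ids are placeholders, and the cloudstack_request helper is my own illustration of the request-signing scheme described in the CloudStack docs rather than code lifted from my test scripts.

import base64
import hashlib
import hmac
import urllib

import requests

# Placeholders: swap in your own endpoint and keys.
ENDPOINT = 'https://cloud.example.com/client/api'
API_KEY = 'your-api-key'
SECRET_KEY = 'your-secret-key'

def cloudstack_request(command, **params):
    # Build the query string, then sign a lowercased, sorted copy of it
    # with HMAC-SHA1 as described in the CloudStack API documentation.
    params.update({'command': command, 'apikey': API_KEY, 'response': 'json'})
    query = '&'.join('%s=%s' % (k, urllib.quote(str(v), safe=''))
                     for k, v in sorted(params.items()))
    digest = hmac.new(SECRET_KEY, query.lower(), hashlib.sha1).digest()
    signature = urllib.quote(base64.b64encode(digest), safe='')
    return requests.get(ENDPOINT + '?' + query + '&signature=' + signature).json()

# Deploy a VM using ids gathered from listServiceOfferings, listTemplates and listZones.
job = cloudstack_request('deployVirtualMachine',
                         serviceofferingid='<service offering id>',
                         templateid='<template id>',
                         zoneid='<zone id>')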

Once you have built up your deployVirtualMachine API command string, you’re ready to move on to the next command – and this is just the beginning. As the deployVirtualMachine is an async command, what you really need before you can move on is the ID of the VM you have deployed.

Then you can do other things like attach disks if required, assign an IP or enable a static NAT, as these commands need a VM ID to work. This is where the queryAsyncJobResult command comes into play. Once you run your command to deploy a VM, it responds with a jobid and jobstatus. The jobid is the asyncjobid number, which you can query using the queryAsyncJobResult command. Once the deployVirtualMachine job finishes, the jobstatus changes to 1, which means you can find out all sorts of information about the VM that has been deployed. One key piece of information is the VM ID, which is required for most commands to work. Once you have this, the world is your oyster.

What I did was create a class in Python, which I was able to reuse to find out the status of the async job. Once I had this working, I was able to call the method in the class every time I wanted to find out the status of the async job.
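I won’t paste the whole class here, but the heart of it is a polling loop along these lines. This is a sketch that reuses the hypothetical cloudstack_request helper from the deploy example above; the response key names follow the usual CloudStack JSON layout, but treat them as illustrative.

import time

def wait_for_job(jobid, poll_interval=5):
    # Poll queryAsyncJobResult until the job finishes, then hand back its result.
    while True:
        resp = cloudstack_request('queryAsyncJobResult', jobid=jobid)
        status = resp['queryasyncjobresultresponse']['jobstatus']
        if status == 1:        # 1 = completed successfully
            return resp['queryasyncjobresultresponse']['jobresult']
        if status == 2:        # 2 = failed
            raise RuntimeError('async job %s failed' % jobid)
        time.sleep(poll_interval)   # 0 = still in progress, so wait and retry

# For example, grab the new VM's id once the deployVirtualMachine job has finished.
vm_id = wait_for_job(job['deployvirtualmachineresponse']['jobid'])['virtualmachine']['id']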

So, let’s quickly look at attaching a disk to a virtual machine. First, you need to create a disk using createVolume and, again, apply the queryAsyncJobResult to find out the ID of the volume you have created. You will also use the diskofferingid to create a disk of a certain size (I created a 50GB disk). Once you have this information and the queryAsyncJobResult comes back, you are ready to attach the disk to the VM. The attachVolume command needs a couple of key parameters, and there is a rough sketch of the whole sequence after the parameter descriptions below.

The virtualmachineid is the ID of the VM, which you can get from the queryAsyncJobResult of the deployVirtualMachine job.

The id refers to the ID of the disk volume, which you can obtain from the queryAsyncJobResult of the createVolume job.
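Put together, the sequence looks roughly like this. It is a sketch reusing the hypothetical helpers from the earlier examples, and the names and ids are placeholders rather than values from my actual scripts.

# Create a volume from a disk offering, wait for the job, then attach it to the VM.
vol_job = cloudstack_request('createVolume',
                             name='api-test-disk',
                             diskofferingid='<id of a 50GB disk offering>',
                             zoneid='<zone id>')
volume_id = wait_for_job(vol_job['createvolumeresponse']['jobid'])['volume']['id']

attach_job = cloudstack_request('attachVolume',
                                id=volume_id,
                                virtualmachineid=vm_id)
wait_for_job(attach_job['attachvolumeresponse']['jobid'])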

Once you have built up your createVolume and have the corresponding async job results, you’re ready to use the attachVolume command. Hey presto! You’ve now deployed a VM and attached a disk. And so it continues…

This isn’t a tutorial or documentation explaining how to use our API but merely a blog on how I did it. I’ve just finished testing all our API commands and yes, there are many ways to skin a cat, but the point is that it can be done and you can achieve a lot with just the click of a button. It’s not all plain sailing – it never is – but once you get involved, you can easily work these things out and it does become quite simple.

I’m keen to hear from people who have used our API and what programming language they used, or from people who are thinking about using the API and want to find out more about its capabilities. I’m happy to share some of the framework of the Python scripts I’ve put together, so get in touch if you’d like a hand getting started with our compute API.