vRealize Automation

vRealize Orchestrator – using Telnet to test port connectivity

vRealize Orchestrator has a TelnetClient scripting class that you can use to test port connectivity. This is far easier than dropping to a telnet client on the command line and testing connectivity that way. Combined with a workflow that loops and waits for the port to open, fails gracefully, or continues while flagging that connectivity wasn’t available, the TelnetClient scripting class gives you some very handy options.

Here’s a screenshot of what the scripting class looks like in the API browser:

Screen Shot 2015-03-08 at 11.19.18

The constructor looks like this:

var telnet = new TelnetClient(string)

The parameter is a string and takes a terminal type, which by default is VT100. Other terminal types include ANSI, VT52, VT102, VTNT and so on. VT stands for video terminal, so in most cases you can use the default value of VT100.

Here is some simple code to test telnet connectivity:

var port = 22; // number
var target = "localhost"; // string
var connectionSuccess = false;

var telnet = new TelnetClient("vt100");

try {

// testing for port connectivity on target host
System.log("Trying to connect to port " + port + " on host " + target);
telnet.connect(target, port);
connectionSuccess = telnet.isConnected();
System.log("Connectivity to target host " + target + " is successful");

} catch (e) {

connectionSuccess = false;
System.log("Connection failed: " + e);

} finally {

// always close the socket, whether or not the connection succeeded
if (telnet.isConnected()) {
    telnet.disconnect();
}

}
The isConnected() method returns a boolean of true or false depending on the connection result, so together with this block of code and a loop workflow, you can test connectivity to a telnet port or wait for a socket to come up. Using a try / catch / finally statement, you catch any errors and disconnect appropriately, although strictly speaking disconnect() can go in the try block, as you’ll only ever connect there so you can only ever disconnect there as well. Several ways to skin a cat on that one. Up to you 🙂
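As a rough sketch of that loop-and-wait idea in plain JavaScript (waitForPort and check are made-up names; the check function stands in for the TelnetClient connect/isConnected test above, and a real vRO loop workflow would use a Sleep element between attempts):

```javascript
// Retry a connectivity check until it succeeds or the attempt
// budget runs out. "check" is a function returning true when the
// port is reachable - a stand-in for the TelnetClient test.
function waitForPort(check, maxAttempts) {
    for (var attempt = 1; attempt <= maxAttempts; attempt++) {
        if (check()) {
            return true;  // port is open
        }
        // a real workflow would sleep here before retrying
    }
    return false;         // gave up after maxAttempts
}
```

The boolean result maps straight onto the decision element in a loop workflow: true continues, false takes the failure branch.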

Here’s a screenshot of a workflow that puts it all together. The decision just checks for true or false which is set depending on the connection response.

Screen Shot 2015-03-08 at 11.27.05

Obviously, in the sample code above, you should configure the target and port inputs appropriately. This is also a good way to check whether a service is available, for example an SMTP or REST API service.

Happy telnetting!






vRealize Orchestrator – connecting to more than one domain using the Active Directory plugin

The vRealize Orchestrator Active Directory plugin only supports connecting to one Active Directory domain, so what happens if you want to connect to more than one domain to create a user or an organizational unit? Well, it is possible to refocus the Active Directory plugin on the fly to do just that, but you need to consider how this will work in your environment and design your AD workflows appropriately.

First, let’s take a look at the vRO Active Directory plugin’s ‘Configure Active Directory server’ workflow, which configures the plugin to connect to a domain.

Screen Shot 2015-03-08 at 14.21.03

When you run the workflow to add a domain, you are presented with the following form:

Screen Shot 2015-03-08 at 14.23.08

Now you can relate these presentation inputs to what is going on inside the workflow. The first scripting element in the workflow determines whether you are using SSL, and the second element imports the certificate if you are.

The reason I mention this is that if you are using SSL for LDAP connectivity, it’s worth importing all your domain SSL certificates into vRO, so that vRO has a valid certificate available if you refocus the plugin to connect to a domain over SSL.

Now if we take a look at the ‘Update configuration’ scripting element, you can see the following code:

var configuration = new AD_ServerConfiguration();
configuration.host = host;
configuration.port = port;
configuration.ldapBase = ldapBase;
configuration.useSSL = useSSL;
configuration.defaultDomain = defaultDomain;
configuration.useSharedSession = useSharedSession;
configuration.sharedUserName = sharedUserName;
configuration.sharedUserPassword = sharedUserPassword;

So what you can do is run this specific block of code to connect to another domain every time you want to perform operations in a specific domain in a multi-domain environment. It’s best to use a configuration element to store your domain information; with some logic you can determine which domain you want to use, then use the values in your configuration element to populate the required domain configuration values in the block of code above.
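For example, the selection logic might look something like this sketch, with a plain object standing in for the vRO configuration element (all domain names and attribute values here are made up):

```javascript
// Per-domain connection details, standing in for attributes of a
// vRO configuration element. All values here are hypothetical.
var domains = {
    "corp.example.com": { host: "dc01.corp.example.com", port: 636, useSSL: true },
    "dev.example.com":  { host: "dc01.dev.example.com",  port: 389, useSSL: false }
};

// Look up the stored connection details for the target domain,
// ready to copy into an AD_ServerConfiguration object
function getDomainConfig(domainName) {
    var config = domains[domainName];
    if (!config) {
        throw new Error("No configuration stored for domain: " + domainName);
    }
    return config;
}
```

Failing fast on an unknown domain is deliberate: you don’t want a workflow silently refocusing the plugin on the wrong directory.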

However, there is one key thing to implement here: design your workflows using the LockingSystem scripting class. Any workflow that needs to configure Active Directory objects will first run the block of code that refocuses the domain connection, then run a workflow to create, update, read or delete AD objects. So you need to let one workflow operation finish before allowing another workflow to change the focus of the domain you are connecting to. You can do this using the LockingSystem scripting class, for example as follows:

Screen Shot 2015-03-08 at 14.39.09

So the set lock would look like this:

LockingSystem.lockAndWait("AdOperationLock", workflow.id);

Once the lock is set, the refocus code runs and you can change to a different domain, then run an AD operation, then remove the lock. The remove lock would look like this:

LockingSystem.unlock("AdOperationLock", workflow.id);

You can have multiple workflows using this locking system, as long as they all use the same lock name, in other words “AdOperationLock” in the example above.

You could also use an action item to lock and refocus the domain connection, and call that action from any workflows that run AD operations, making the locking and refocus aspects modular. Remember to unlock the lock, and to use the same lock ID name, for example “AdOperationLock”.
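The overall shape of that lock–refocus–operate–unlock sequence can be sketched like this in plain JavaScript (withDomainLock is a made-up name, and the four function parameters stand in for LockingSystem.lockAndWait, LockingSystem.unlock, the refocus code block and the AD operation respectively):

```javascript
// Serialize AD workflows: take the shared lock, refocus the plugin,
// run the AD operation, and always release the lock afterwards.
function withDomainLock(lock, unlock, refocus, operate) {
    lock("AdOperationLock");       // blocks until the lock is free
    try {
        refocus();                 // point the plugin at the target domain
        return operate();          // create / read / update / delete AD objects
    } finally {
        unlock("AdOperationLock"); // release even if the operation throws
    }
}
```

The finally block is the important part: if the AD operation fails, the lock is still released, so other workflows waiting on “AdOperationLock” don’t hang.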

There are limitations with this: you can only ever run one workflow that reads, configures or deletes Active Directory objects at any one time, as you can’t refocus while existing AD operation workflows are running. However, Active Directory updates are generally small operations and the workflows should run quickly. So even with a lot of AD operations, a waiting workflow will likely only wait a few seconds: workflows check every 2 seconds to see whether the lock they are waiting on has been released in the vRO database, so the wait will be at most 2 seconds plus the execution time of the currently running workflow. This is still something you need to consider in your environment.

There are other solutions in a multi-domain environment, for example using PowerShell scripts, but if you can keep it all in vRO, you keep all the code in one place and can create modular workflow elements for AD operations; that’s one less programming language and dependency to manage.

The power of vCAC 6.0 and the workflow designer tool


I’ve been working with vCAC 6.0 and wanted to blog about its power and some differences from its market counterparts. Citrix have a great product in CloudPortal Business Manager (CPBM 2.x), and it has a nice look and feel. It is indeed very similar to vCAC 6.0: both have entitlements, you can easily manage access to resources via user and group assignments, and you can add multiple endpoints. But there is one major difference between CPBM 2.x and vCAC 6.x: blueprints. Some might say blueprints are bundles, but there is functionality in vCAC 6.x that isn’t in CPBM 2.x. Blueprints are really just the beginning; for example, you can create build profiles that get assigned to a blueprint and that can run customisation scripts during the build process, e.g. applying a company-standard naming convention or running pre- and post-build scripts on the virtual machine you are provisioning. There is a very good website that details the power of build scripts.

However, it doesn’t stop there. Use service blueprints and you begin to open up the power of vCAC. Service blueprints leverage vCO workflows. vCO (vCenter Orchestrator) provides a powerful, Microsoft SSIS-like workflow designer tool with which you can do pretty much anything. VMware call this XaaS: anything as a service. From running a database backup, to connecting to a CMDB, running a multi-VM provisioning workflow, creating objects in Active Directory, or using one of the many vCO plugins, for example configuring an F5 BIG-IP device using the vCO F5 plugin.

Workflows can be quite complex, or you can use the out-of-the-box workflows. Once you have created your service blueprint, you can use entitlements, the mechanism that allows end users to consume these blueprints.

Here is a quick look at how you can leverage vCO workflows using blueprints:

vco - ad workflow

The above shows you a simple vCO workflow that creates an Active Directory organisational unit.

vco - ad 1

You can see here that the vCO workflow asks for two inputs, ouName and container. These would normally be user driven, but you could use a vCO workflow wrapper to get these values from an external source, i.e. a CMDB that determines the OU name using a site ID prefix.

vco - ad 2

And above you can see the JavaScript code that calls the method that creates the objects in Active Directory. This is a default plugin provided with vCO. Pretty simple code really.

Now you can get vCAC to provide this workflow as a catalogue item that a user can consume. You can see here that when you add a workflow in vCAC, you simply choose the vCO workflow that you want, in this case ‘Create an organizational unit’:

vco - ad 3

Once you have added the vCO workflow as a blueprint, you simply publish this for consumption:

ad- publish vco workflow

Now that the workflow is published, you just need to add it to a catalogue and configure who can consume it using entitlements:

ad - publish 1

Above you can see that you choose a service, in this case a pre-configured ‘generic’ service, and click on manage catalog items:

ad - publish 2

Then simply choose the new catalogue item, which is the vCO workflow discussed earlier. Once you have configured the catalogue item in a service, the users and groups that have been granted access to the service can choose the new catalogue item from their service catalogues when they’re logged in to their vCAC portal, as shown below:

ad - publish 3

When the user requests the new catalogue item, they are prompted for the ouContainer and ouName:

ad - consume

Okay, I’ve skipped over a few of the more intricate configuration aspects, but hopefully you get the drift, AND this is only an Active Directory container! (Yep, I too am thinking of a use case…) Imagine the endless possibilities, and I haven’t even mentioned vCloud Application Director! Wow… powerful stuff.

For information on vCO and vCAC, see these 2 great websites: