Workflow automation tools can mean the difference between efficient operations and disrupted, error-prone manual tasks.
What's on your typical data center to-do list? Provision a server, allocate storage, assign network access rights to users, manage change control, monitor performance for service level agreement (SLA) adherence and countless other operations. Since these processes are discrete, IT staff perform each one manually as needed. This leaves significant potential for errors and oversights, which can lead to poor performance, wasted resources, security vulnerabilities and other problems.
IT process automation allows organizations to standardize certain processes and apply them uniformly and automatically. This speeds process implementation and reduces errors, which gives IT staff more time for complex work, helps lower expenses and generates records that meet regulatory compliance requirements.
An IT workflow is an organized pattern of repeatable activities -- the various discrete steps and decision-making points needed to accomplish a task. For example, provisioning a virtual machine (VM) onto a physical server, testing system performance for SLA compliance or creating a thin-provisioned logical unit number (LUN) for an application are common IT processes that can be distilled into automated workflows.
There are countless permutations to this idea, but every variation in the steps -- no matter how unique -- is a workflow. A workflow, for example, might start with a service request ticket that requires IT staff to assess resource requirements, locate an adequate platform, implement the request, perform testing or verification of the work, and then close the ticket.
The goal of IT workflow automation is to minimize dependence on human interaction. IT administrators should not need to test or verify every step of a workflow, but there may be situations when human interaction is required. For instance, if an automation tool encounters an error that prevents completion of the workflow, such as inadequate resources, it alerts an administrator for intervention.
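The ticket-driven pattern described above -- ordered steps, decision points and escalation to a human only when automation cannot proceed -- can be sketched in a few lines of Python. Every step and ticket field name here is invented for illustration; a real automation tool supplies its own workflow engine.

```python
# Minimal sketch of a ticket-driven workflow: ordered steps, a decision
# point, and escalation to an administrator when a step fails.
# All step and field names are hypothetical.

def assess_requirements(ticket):
    # Decision point: halt if the request lacks resource details.
    if "cpu" not in ticket or "ram_gb" not in ticket:
        raise RuntimeError("inadequate resource details in request")
    return ticket

def locate_platform(ticket):
    ticket["host"] = "host-01"   # placeholder placement decision
    return ticket

def implement_request(ticket):
    ticket["status"] = "provisioned"
    return ticket

def verify_work(ticket):
    if ticket.get("status") != "provisioned":
        raise RuntimeError("verification failed")
    return ticket

def close_ticket(ticket):
    ticket["status"] = "closed"
    return ticket

STEPS = [assess_requirements, locate_platform, implement_request,
         verify_work, close_ticket]

def run_workflow(ticket, alert=print):
    for step in STEPS:
        try:
            ticket = step(ticket)
        except RuntimeError as exc:
            # Automation stops; a human is alerted to intervene.
            alert(f"admin intervention needed at {step.__name__}: {exc}")
            return ticket
    return ticket

result = run_workflow({"id": 42, "cpu": 2, "ram_gb": 8})
print(result["status"])  # closed
```

Note that the happy path runs with no human involvement at all; the `alert` callback only fires when the workflow cannot complete on its own.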
Many automation tools integrate with systems management products for insight into the data center. The reporting might flag potential problems. For example, if a systems management tool reports only 1 terabyte of free storage and the workflow automation system provisions 500 GB of new LUNs each month, the combined data show possible storage exhaustion in two months or less -- prompting IT to add capacity and review existing use.
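That kind of capacity arithmetic is simple enough to automate itself. A back-of-the-envelope sketch, where the free-space and growth figures are purely illustrative inputs:

```python
# Combine monitoring data (free capacity) with workflow data (allocation
# rate) to estimate time to storage exhaustion. Figures are illustrative.

def months_to_exhaustion(free_gb, growth_gb_per_month):
    """Estimate how many months until free storage runs out."""
    if growth_gb_per_month <= 0:
        return float("inf")
    return free_gb / growth_gb_per_month

# ~1 TB free, 500 GB of new LUNs provisioned per month.
print(months_to_exhaustion(1000, 500))  # 2.0
```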
Scripts vs. automation tools
Most organizations rely on scripts for basic automation tasks, but should they?
Scripting is built into the operating system -- PowerShell in Windows Server environments, for example. However, scripts demand programming savvy and a keen knowledge of the infrastructure, directories, server names and other granular data center details. Existing scripts are also difficult to update as the environment changes, particularly when supporting documentation is sparse.
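A deliberately brittle sketch of the ad hoc scripting described above -- every host name, path and threshold is hard-coded (and invented for this example), so any infrastructure change means hunting through the script itself:

```python
# Hypothetical ad hoc monitoring script with hard-coded environment
# details -- the kind that becomes hard to maintain without documentation.
import shutil

FILE_SERVERS = ["fileserver-01", "fileserver-02"]  # hard-coded host names
EXPORT_PATH = "/exports/projects"                  # hard-coded directory
MIN_FREE_GB = 50                                   # hard-coded threshold

def volume_ok(path):
    """Return True if the volume backing 'path' has enough free space."""
    free_gb = shutil.disk_usage(path).free / 1024**3
    return free_gb >= MIN_FREE_GB
```

Renaming a server or moving the export path means editing the code; an automation tool would hold those details in a configurable workflow instead.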
By comparison, automation tools provide codeless, graphics-based, drag-and-drop workflow engines that include checkbox-style options. Modern automation tools generate scripts, convert the workflow to standardized languages (such as XML), integrate annotations and documentation for the staff, and help an organization standardize components and approaches. This makes workflow automation platforms easier and more intuitive than scripts to update and modify over time.
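As a sketch of the standardized output such a tool might generate, here is an invented workflow definition serialized to XML with Python's standard library. The element and attribute names are illustrative, not any product's actual schema:

```python
# Serialize a workflow definition to XML -- the kind of standardized,
# portable output a codeless automation tool might generate.
# All element and attribute names are invented for this example.
import xml.etree.ElementTree as ET

workflow = {
    "name": "provision-vm",
    "steps": [
        {"action": "allocate-storage", "size_gb": "40"},
        {"action": "deploy-vm", "template": "standard-web"},
        {"action": "verify-sla", "max_latency_ms": "20"},
    ],
}

root = ET.Element("workflow", name=workflow["name"])
for step in workflow["steps"]:
    ET.SubElement(root, "step", **step)

xml_text = ET.tostring(root, encoding="unicode")
print(xml_text)
```

Because the result is plain XML, it can be versioned, diffed and annotated like any other document, which is part of what makes tool-generated workflows easier to maintain than raw scripts.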
The inflection point for moving from scripting to automation tools isn't size -- a particular number of servers, users or applications -- as much as it is variation. For example, small companies may manage an astounding number of systems and depend on a myriad of frequently changing tasks, while large organizations might use data center applications and servers where change is infrequent.
Organizations must also consider compliance. Automation tools produce detailed task logs, which can prove compliance with regulations or internal corporate policy. Logs prove that newly provisioned VMs are properly licensed and use the standard set of computing resources, for example.
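A minimal sketch of such an audit trail, assuming invented field names -- real tools define their own log schemas:

```python
# Sketch of the kind of audit record an automation tool emits so that
# provisioning actions can be proven compliant later. Field names
# (vm, license, cpus, ram_gb) are illustrative, not a standard.
import json
import logging

logging.basicConfig(level=logging.INFO, format="%(asctime)s %(message)s")
log = logging.getLogger("provisioning-audit")

def record_provisioning(vm_name, license_key, cpus, ram_gb):
    entry = {"event": "vm_provisioned", "vm": vm_name,
             "license": license_key, "cpus": cpus, "ram_gb": ram_gb}
    log.info(json.dumps(entry))
    return entry

record_provisioning("web-01", "LIC-1234", 2, 8)
```

Structured entries like these let an auditor confirm, for example, that every provisioned VM carried a license and the standard resource allotment.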
Don't set it and then forget it
Evaluate the benefits of a prospective tool against the time, effort and cost of implementing it. For example, if the IT staff time that is freed up to handle more tasks and tackle more complex projects is greater than the time needed to learn and manage the tool, then the investment in data center automation is worthwhile.
Good automation tools should be dynamic -- allowing an organization to define workflows and stick with them, then optimize them when the data center's needs change.
Don't ask "how often?" Ask "how easy?" IT administrators will need to optimize and change workflows on demand, so the automation tool you purchase should make changes easy to implement, test and document.
Look for tools that provide seamless changes with minimal disruption. An updated workflow, for example, should not confuse existing task requests waiting in the queue. Changes should also be well-documented in order to educate IT staff and to ensure adherence to compliance requirements. Test and vet all workflow changes in a lab environment before using them in production.
Workflow automation tools are supposed to make life easier, not harder. That means a minimum of management. Every tool has a required learning curve -- see that it is within reason. And make sure the tool adequately integrates with the existing systems management tools.