Governance, Risk, and Compliance (GRC) 2012 Prediction & Reality
Experton Group's GRC predictions were on target across the board. Government agencies and branches globally are still developing and implementing new policies and directives faster than ever. These regulatory changes, such as the Dodd-Frank Wall Street Reform and Consumer Protection Act and the Patient Protection and Affordable Care Act (Obamacare), will be producing rules and regulations for years to come and will have negative corporate impacts throughout the decade. Moreover, it is unlikely European banks are sufficiently protected against the sovereign risk exposures that still exist globally. The U.S. has delayed the implementation of Basel III, which is designed to limit the risk of too much money concentrated in too few sovereign securities. The EU may delay until 2014 at a minimum; these delays prolong the risk exposure – the opposite effect of what the rules intend. Thus, executives can expect GRC uncertainties and the ripple effects of regulatory change to extend well beyond 2012 and the next five years. This is not good news for the business community or a global economic recovery. Nonetheless, vendors enhanced their GRC products with more analytics, policy-based options, key indicators, and metrics. In addition, the market has matured to the point where industry titans such as IBM Corp., Oracle Corp., SAP AG, and SAS Institute provide stability, integrated solutions, and thought leadership. And even with all the acquisitions, there are still more than 400 small to mid-sized vendors offering GRC point solutions.
The ERM (Enterprise Risk Management) mantra has gained some traction, but because of economic and regulatory uncertainties it has not won executive support for spending more resources in this area. In addition, the majority of companies have a disconnect between their GRC programs and their legal activities. Instead, executives who are focused on this area have been working to drive standardization and process maturity across the enterprise. Overall, as expected, funding was tight and progress was slow.
DR/BC remained a top ERM priority in 2012, with IT executives pursuing traditional and new options (such as cloud computing, deduplication and snapshots) designed to reduce costs, resources and processing time. Still, Experton Group did not see an increased focus on DR/BC, leaving too many companies exposed to the vagaries of man-made or natural disasters. This year it was Hurricane Sandy that caught many unprepared for the extent and duration of the damage, especially the extended loss of electricity.
Companies never expected such long delays in the restoration of electricity, and those without sufficient emergency backups or backup sites were unable to operate for more than a week (some for more than a month). NYU Langone Medical Center's basement (where the Smilow Research Center was located) and lower floors flooded, resulting in the loss of years of research. Meanwhile, the New York Stock Exchange closed for two days as its DR/BC contingency plan proved to be flawed and not fully tested. Even once they were back up, many securities firms struggled to function under backup power and poor waste management capabilities.
Experton Group notes that more than one-third of companies in the Western World, and over two-thirds in the Middle East, do not have an adequate DR/BC plan in place; among those that do, large swaths of business processes remain not effectively covered. According to the 2011 European Disaster Recovery Survey, 74 percent of organizations are not very confident that they can fully recover after a disaster; the top causes of disasters in Europe were hardware failures (61 percent), power failures (42 percent) and data corruption (35 percent). DR/BC is an insurance policy, and companies should be willing to spend a small amount (two to four percent of their IT budgets) to minimize the risk of business failure.
Investments in ERM enhancements were limited again this year, even though most companies have put in place a formal ERM program or structure. One encouraging note is that the ERM program usually includes an IT risk assessment. But Experton Group remains amazed at the number of companies that do not commit to ERM initiatives and fall behind in protecting the entire enterprise from potential disasters. Furthermore, funding remained constrained for most, and projects moved forward at a snail's pace. With 2013 expected to be another mediocre economic year, Experton Group expects a small minority of companies to increase their investments in ERM while most will continue apace or reduce commitments.
Cloud and Outsourcing 2012 Prediction & Reality
Experton Group accurately predicted 2012 as the year cloud computing would gain momentum and become mainstream and that the outsourcing markets would stabilize with limited growth. More than 80 percent of U.S. companies currently deploy cloud computing solutions or services, according to recent surveys. About 70 percent of cloud users take advantage of software-as-a-service (SaaS) offerings. The primary motivator was saving capital and/or operating expenses. The top four cloud application usages are email (47 percent of respondents), Web presence (45 percent), virtual desktop (44 percent) and collaboration tools (42 percent). SMB firms are moving more rapidly to clouds than large enterprises, and they are primarily users of off-premise private and public clouds. The larger firms are concentrating on in-house infrastructure- and platform-as-a-service solutions along with some hybrid off-site offerings. Security, data portability and Internet downtime are the top user concerns with off-premise clouds; automation, integration, management and training are also areas of concern for business and IT executives.
Overall, the outsourcing markets expanded at a rate similar to 2011, at less than 10 percent. To maintain market share, many of the major outsourcing providers reduced their basic and blended rates slightly, which has squeezed their margins as their costs have increased. Outsourcers are also adding cloud computing solutions and services as another opportunity. Cloud is currently a small component of overall revenues, but it should grow as companies seek to cut costs and improve time to market. As predicted, the lack of standard vendor definitions for cloud computing, along with the range and variety of offerings, continues to obscure what "cloud" means. Experton Group expects this to evolve and slowly stabilize as the winners in the various spaces, such as Amazon Inc. and Salesforce.com Inc., set the standards.
Enterprise consolidation and virtualization initiatives gained momentum in 2012, yet there remains a long way to go before most IT shops reach optimization. Nonetheless, IT executives are moving forward and acceptance is gaining speed. Experton Group still finds most enterprise data centers in the early stages of virtualization, with utilization levels averaging less than 20 percent. Moreover, few organizations have a clear picture of their usage levels or of the overall costs of running old, unvirtualized equipment. This is true for storage as well as servers, and it translates to wasted hardware, software and staffing expenses. Until management fully understands the impact of current operational characteristics, data centers will not become optimized. Moving to cloud solutions is not the answer to the problem – building best practices needs to precede the move to the cloud. Those enterprises that are creating cloud environments in-house are primarily building infrastructure-as-a-service (IaaS) and platform-as-a-service (PaaS) solutions. Long term, these platforms will reduce personnel costs, but initially costs will rise as transitional costs, including training, add to existing operating expenses.
Experton Group was correct about this being a decent year for offshoring, as pointed out above. Outsourcing providers invested more in targeted areas such as application verticals, application development, business process outsourcing, call centers/help desks and cloud computing to better differentiate themselves. Experton Group was also on target about outsourcers moving to global sourcing by acquiring more resources local to the markets they serve, so as to operate with a 15-20 percent blended rate, with top talent being local. Pricing models are also starting to change to accommodate cloud computing, slowly moving to "pay for performance" or fixed price and away from the traditional "per day" structure. India still leads in most areas of outsourcing, with China in second place and growing rapidly. Business process outsourcing (BPO) is still dominated by U.S.-based corporations.
Data Center Effectiveness Metrics
Most data center executives are working to improve effectiveness and efficiency within and across their data centers as part of their optimization efforts. However, what holds a number of executives back from making major strides in optimization planning are the questions of where to begin and what the metrics, paybacks, and desired targets are. Experton Group finds IT executives want a baseline of best practices and best-of-breed targets against which to compare their current operations.
Experton Group studies have found that most enterprises are not run at maximum effectiveness levels. As a result, a typical data center operation has the prospect of reducing costs by up to 40 percent and cutting energy expenditures by up to 80 percent within an existing data center. If one includes data center consolidation and use of cloud computing, the savings can be even greater.
IT executives who meet the proposed server metrics can reduce their overall operations costs by 25 percent or more. Depending on the current state of the equipment and software, they can achieve significantly greater savings. For example, IT executives who have used the standard scale-out approach to adding x86-architected servers can potentially reduce the number of servers in operation by 50 percent or more. Server virtualization can push the savings beyond 75 percent.
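The arithmetic behind these reduction rates can be sketched quickly; the 400-server baseline below is hypothetical, with only the 50 and 75 percent figures taken from the text.

```python
# Illustrative consolidation arithmetic using the reduction rates cited above.
# The 400-server starting estate is a hypothetical figure.
baseline_servers = 400
scale_out_reduction = 0.50       # consolidating a scale-out x86 estate: 50%+
virtualization_reduction = 0.75  # adding server virtualization: 75%+

after_consolidation = int(baseline_servers * (1 - scale_out_reduction))
after_virtualization = int(baseline_servers * (1 - virtualization_reduction))

print(after_consolidation)   # 200 servers remain after consolidation
print(after_virtualization)  # 100 servers remain after virtualization
```

Even at these illustrative scales, halving and then quartering the server count compounds into the operations savings described above.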
There have been tremendous advances in storage technologies as they relate to energy conservation as well. In one case study Experton Group examined, a company was able to expand its total storage from less than one petabyte to almost six petabytes over four years while reducing its overall costs by more than $12 million. Since enterprises constantly upgrade storage, and in many cases turn it over every three years, this is another excellent place to look for savings.
IT executives should approach data center optimization as an opportunity to have each manager look closely at his/her area of focus and find innovative ways to reduce the cost of operations through new processes and technologies. This should not be a one-shot effort but an ongoing annual one, done as part of process improvement efforts or spring/fall planning. See Figure - IT Transformational Approach
Through detailed analysis, IT executives should be able to cut operations costs by 10 percent annually for each of the next five years while still expanding capacity to meet business workloads. IT executives should also consider working with outside firms to assist with the baseline and workload analyses as well as "what if" scenarios.
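A 10 percent annual cut compounds year over year; a short sketch (the $10M starting budget is an assumed figure, only the 10 percent rate comes from the text) shows the cumulative effect over five years:

```python
# Compound effect of a 10% annual operations-cost cut over five years.
# The $10M baseline is hypothetical; the 10% rate comes from the text.
baseline_cost = 10_000_000.0
annual_cut = 0.10

cost = baseline_cost
for year in range(1, 6):
    cost *= 1 - annual_cut
    print(f"Year {year}: ${cost:,.0f}")

print(f"Cumulative reduction: {1 - cost / baseline_cost:.1%}")  # about 41%
```

Five successive 10 percent cuts remove roughly 41 percent of the original cost base, not 50 percent, because each cut applies to an already-reduced budget.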
Experton Group POV: Data center architecture and architects are needed to drive maximum efficiency and effectiveness within and across data centers. By using the KPIs, metrics and suggested practices outlined in this research report, IT executives should be able to benchmark their environments, perform a gap analysis, and then drive optimization and process improvement initiatives that can yield incremental benefits year after year. IT executives, working with facilities and finance staffs, should bend the IT cost curves so that each year they are taking at least 10 percent out of the total Opex costs and 15 percent out of the energy expenses. Moreover, IT executives should be able to find some of these projects that are self-funding (i.e., payback in less than 12 months), whose added savings can be used to fund the initiatives that have a longer payback period.
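The self-funding test described above reduces to a simple payback calculation; the project names and figures below are hypothetical, with only the 12-month threshold taken from the text.

```python
# Simple payback screen: a project whose cumulative savings cover its cost
# within 12 months qualifies as self-funding. All figures are hypothetical.
projects = {
    "server virtualization": {"cost": 300_000, "monthly_savings": 40_000},
    "storage refresh":       {"cost": 900_000, "monthly_savings": 50_000},
}

for name, p in projects.items():
    payback_months = p["cost"] / p["monthly_savings"]
    tag = "self-funding" if payback_months < 12 else "longer payback"
    print(f"{name}: payback {payback_months:.1f} months ({tag})")
```

Savings from the quick-payback projects can then be redirected to fund the initiatives with longer payback periods, as the POV suggests.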
High Velocity Hybrid Cloud: CloudVelocity Emerges from Stealth Mode
A new type of hybrid cloud automation platform provider, CloudVelocity, has emerged from stealth mode after its "A" round of funding. The company's primary objective is to enable enterprises to operate hybrid clouds seamlessly. "Cloud cloning, migration and failover are our first steps in that direction," said Rajeev Chawla, CEO of CloudVelocity.
Led by a deeply experienced team of system software, virtualization, storage, security and networking executives, the patent-pending CloudVelocity platform - called the One Hybrid Cloud (OHC) platform - aims to allow data center teams to scale and secure their distributed multi-tier applications and services into and between clouds. Success in doing that would remove many of the major barriers to the enterprise adoption of public clouds.
CloudVelocity automates the time-consuming and risk-laden critical processes required to deploy existing applications from traditional data centers into and between public clouds.
The following steps describe the Developer Edition:
Step 1: Download and install the CloudVelocity software on the servers that make up the application (the Enterprise Edition discovers the servers automatically, so this step is not needed there).
Step 2: The CloudVelocity software creates a blueprint of the servers and starts replicating the OS, binaries, libraries, app stacks, and application data in the cloud.
Step 3: At the click of a button, the client can provision and launch a multi-tier application (made up of a group of systems) in the cloud. CloudVelocity extends the enterprise data center to the cloud so that these distributed multi-tier applications, now running in the cloud, can access any necessary services residing back in the enterprise.
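The three steps above can be sketched as pseudocode. CloudVelocity has not published an API, so every name below (blueprint, replicate, launch, the server names) is invented purely to illustrate the sequence, not to represent the product's actual interface.

```python
# Hypothetical sketch of the blueprint -> replicate -> launch workflow
# described above. None of these names come from CloudVelocity's product.

def blueprint(servers):
    """Step 2: capture OS, binaries, libraries, app stacks, and data."""
    return [{"host": s, "layers": ["os", "binaries", "libs", "app", "data"]}
            for s in servers]

def replicate(plans, cloud):
    """Step 2 (continued): replicate each captured server into the cloud."""
    return [dict(plan, target=cloud) for plan in plans]

def launch(replicas):
    """Step 3: provision and start the multi-tier application."""
    return [dict(r, status="running") for r in replicas]

# Step 1 (Developer Edition): the operator lists the servers manually.
servers = ["web-1", "app-1", "db-1"]  # hypothetical multi-tier application
running = launch(replicate(blueprint(servers), cloud="public-cloud"))
print([(r["host"], r["status"]) for r in running])
```

The point of the sketch is the pipeline shape: the whole multi-tier group is captured and moved as a unit, rather than server by server.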
Experton Group POV: First of all, we love the idea. At first glance this seems to be a dream come true – a plug-and-play solution. When the cloud first emerged as a potential alternative to building internal computing empires that then had to be maintained and kept current at considerable expense, everyone was very skeptical. As time progressed and some cloud-based services matured, organizations started using the cloud on a limited basis. CloudVelocity has opened the door to the idea of a transparent hybrid cloud solution. It is still too early to make a definitive statement as to whether it will be successful.
One challenge here will be to overcome the security/privacy issues that continue to plague the cloud computing industry. This is especially true when you have a solution that admittedly will go into your network and “discover” all of the computing resources so it can back them up.
Call To Action: Most organizations are not in the business of building and maintaining computing capabilities. They realize that computing resources are needed, but IT is viewed as a tool the business uses to become successful.
In recent organizational history, cost cutting has been a primary business objective, and this will continue to be so. Therefore, cloud-based computing resources are becoming a key part of the computing infrastructure.
With a newly emerged service, it is best to let others get the bugs out first. Once an organization decides to use the service, it should try it out on non-critical, segregated business support areas. After a certain level of comfort has been achieved, a decision can be made whether to expand cloud computing relative to internal computing.
2013: The 10 Recommendations for the CIO
For the first time, business executives found technology factors to be the most important external force impacting their organizations. The CIO needs to adjust priorities and required skills accordingly.
- Build Innovation Teams: Nearly every company has put innovation at the top of its agenda. IT needs to build cross-functional teams to proactively address all areas of innovation – product, process, and business model. Almost 75 percent of business people think IT is not providing innovation leadership, and even more feel IT professionals lack the skills to do so.
- Develop Business Process Knowledge: The first challenge is to make existing applications more agile, and to do so IT needs a better understanding of the business processes. Second, with all the new opportunities around SaaS and web services, IT has to take a more proactive role in redesigning business processes and in using this knowledge to drive process innovation.
- Business Process Master Plan, Self-Service, Total Customer Experience (TCE): Business owns the processes, but IT has to build the process knowledge to support and integrate them. Many new opportunities for automation and horizontal process improvement, such as self-service offerings, require a full understanding and redesign by the IT organization. The Total Customer Experience should be the ultimate goal and needs to be measured, since TCE quality is often not very high.
- BI, Big Data, Enterprise Performance Management: Business Intelligence has been at the top of the list for quite some time, but implementation is still relatively slow and most of the time happens in islands. It is necessary to move to the next level and implement an Enterprise Performance Management concept. With increasing amounts of data from many different sources, "Big Data" concepts can provide excellent value for many enterprises.
- Workplace of the Future: Bring Your Own Device (BYOD) has sparked much controversy, but the fact is that it cannot be stopped completely. In most companies these devices are first brought in by the executives themselves. Employees will use their preferred personal productivity tools and make their own decisions about notebooks, tablets, and smartphones. IT has to act and build a strategy, but it needs to go one step further and look at the workplace of the future: how should it look, and how can it deliver value to the business, motivate employees, and differentiate the company?
- Social Collaboration, Strategy & Recommendations: Social collaboration is about decentralizing the organization and its internally oriented communication style. For external and business orientation, the terms Social Enterprise and Social Business are often used, even though there is nothing really "social" about them in the sense of sustainability, responsible investment, or mindfulness. Through the impact of social media, we all know how the communication style between people and organizations has changed. This needs to be included in the overall ICT strategy, and the IT organization is expected to formulate recommendations for the enterprise.
- Adjust IT Strategy with Flexibility and Business Focus: While most companies have achieved a stable IT environment and have improved their efficiency considerably over the last years, most have still not built an agile IT, meaning one able to adjust spending and resources very quickly to changing markets. Similarly, most companies have an IT strategy, but the chapter on business vision, direction, and requirements is empty.
- Security & Data Protection: With cloud, BYOD, and mobility gaining momentum, security and data protection become mandatory for every business. Identity management has already been on the top-priority list for some time, and it will also be fundamental to the rollout of many cloud solutions. Organizations need to build a framework for the cloud that also includes single sign-on, provisioning, chargeback, and security.
- Skill Analysis & HCM Strategy: Many companies have 90% of their IT skills focused on keeping the systems going. They have skills for operational management, help desk, infrastructure management, desktops and mobile devices, and application support, but only a small number of architects and business process experts. In other words, most of the skills exist in areas that are already, or will soon be, commodity offerings and do not deliver any differentiation for the company. This needs to be addressed and changed, but the required skills are difficult to find, and many vendor and user companies are competing for those scarce skills.
- Rework Sourcing Strategy: 80% of all server-based computing will be external by 2020. Companies need to understand this trend and prepare for it. It is important to decide what is commodity and where value for the enterprise can be generated. While data centers have been regularly updated, many still lack support for modern demands on power and cooling as well as DR, and business continuity plans also need updating. Server virtualization penetration has not reached a sufficient level, and in the storage area most companies have not implemented the newest technologies, such as deduplication, thin provisioning, data compression and encryption. The right balance between internal and external services, including cloud computing offerings, needs to be established while taking into account the stability and agility the enterprise requires.
Business Technology – IT is not providing innovation Leadership
An InformationWeek survey of 382 business professionals – IT and non-IT – finds only 43 percent of the non-IT people consider IT teams integral to the business while 54 percent consider IT a maintenance or support organization and not an innovator. However, 59 percent believe technology is becoming critical to the business. On the other hand, 60 percent of the surveyed IT professionals consider IT integral to the business while only 39 percent consider IT a maintenance or support organization. Furthermore, only 18 percent of the non-IT respondents were completely or very satisfied with IT projects based on costs, quality and timeliness. Another 32 percent were moderately satisfied. From the IT perspective the numbers were 29 and 39 percent, respectively. Approximately three-fourths of all respondents viewed IT as not demonstrating innovation leadership. Sadly only 19 percent of respondents viewed their organizations as actively helping their IT professionals develop the business or "soft" skills needed to stay current and aid innovation.
The 2012 IBM Corp. CEO survey showed similar results on technology being critical to the business: for the first time, 71 percent of the CEO study respondents found technology factors to be the most important external force impacting their organizations. Given that almost 75 percent of business people think IT is not providing innovation leadership, and even more feel IT professionals lack the skills to do so, IT executives need to change this perception and invest in correcting the problem. IT executives have a chance to "sit at the table" with their business peers, but if they cannot address the business and innovation issues, they will be shunted aside.