IBM: The Times They Are Changing
IBM Corp. reported poor fourth quarter and fiscal 2013 revenue results and announced heavy investments in the cloud and Watson. The company also announced System x enhancements and its sale of the System x server business to Lenovo Group.
- IBM released its fourth quarter and fiscal year 2013 results, both of which showed year-over-year decreases in revenues. Fourth quarter revenues were $27.7 billion, down five percent from the previous year, while net income on a GAAP basis came in at $6.6 billion, up eight percent. The Americas and Europe/Middle East/Africa saw little change, while Asia Pacific dropped 16 percent and the BRIC (Brazil, Russia, India, and China) countries fell 14 percent from the previous year's fourth quarter. The Services, Software and Financing groups showed little change, but the Systems and Technology Group (STG) experienced a 26 percent decline in revenue year-over-year. Power Systems and System z server revenues each slumped more than 30 percent, while System x and Storage revenues each fell more than 10 percent. IBM's key growth areas fared much better overall: the Smarter Planet initiative saw about 20 percent year-over-year growth, business analytics grew about nine percent, and cloud revenues jumped 69 percent over the previous year's quarter. For fiscal year 2013 IBM had revenues of $99.8 billion, a drop of five percent from 2012, while its full-year net income on a GAAP basis was $16.5 billion, virtually unchanged year-over-year.
- In the cloud arena IBM intends to invest more than $1.2 billion to significantly expand its global cloud footprint. By year-end the company plans on having 40 data centers in 15 countries on five continents; 15 new data centers will be opened, adding to the existing 13 from SoftLayer and 12 from IBM. Over time IBM expects to offer a unified platform with common management and programming interfaces across all sites. IBM will be offering BPO, SaaS, PaaS, and IaaS services (see chart). On the Watson front, the company is opening up its approach to driving its cognitive computing platform. IBM will invest more than $1 billion over the next couple of years in the Watson initiative, dedicate more than 2,000 professionals, and create a $100 million equity investment vehicle to gain developer acceptance and usage. Additionally, the company announced the creation of the Watson Group, its own integrated business unit with headquarters in New York City, and launched three new offerings: Watson Discovery Advisor, Watson Analytics, and Watson Explorer. The NYC Silicon Alley site will bring together research, development, service, and go-to-market teams and will feature an innovation hub. Last November the IBM Watson Ecosystem was announced, and more than 800 applications have been received to date.
- IBM announced the long-speculated deal to sell its System x server business. The definitive agreement calls for Lenovo to acquire all of the System x server business – System x, BladeCenter, and Flex System blade servers and switches, NeXtScale and iDataPlex servers and associated software, blade networking, and maintenance operations. The financial terms call for Lenovo to pay approximately $2 billion in cash and the remaining $300 million in Lenovo shares. Approximately 7,500 IBM employees are expected to be offered jobs by Lenovo when the transaction closes later this year. Lenovo will become IBM's supplier of x86 server technology, and Lenovo will license, OEM, and resell IBM Storwize and tape storage technologies, SmartCloud Entry, elements of the x86 system software portfolio, and some other technology. IBM will retain Power-based Flex servers, storage systems, and PureApplication and PureData appliances. Also this month IBM launched its X6 generation of System x servers. These new systems are more agile, faster, and more resilient than previous generations, offering much lower latency with significant cost savings. IBM introduced eXFlash memory-channel storage, which integrates ultra-low-latency cache directly on the memory bus, and a FlashCache Storage Accelerator.
Experton Group POV: Since 2007 IBM revenues have hovered around $100 billion. The company has managed earnings growth through cost containment and stock buybacks, but senior executives have yet to find the golden egg that will give Big Blue a breakout in revenues. Some of the non-growth issues can be attributed to management and the offerings, while the downturn in the global economies and their poor recoveries are largely responsible for the rest.
Nonetheless, IBM is making some smart moves in getting out of low-margin hardware and focusing on higher-margin software and services and on sales to non-IT departments. The Smarter Planet initiative is paying off and is giving IBM positive visibility in the Business Technology (BT) arena. With the addition of SoftLayer, IBM and Amazon.com Inc.'s Amazon Web Services (AWS) have become two of the largest cloud providers (which is first is hard to tell, since both have been opaque about these revenue streams). IBM's further commitment to cloud and Watson could finally give the company the impetus needed to overcome market headwinds.
The sale of System x to Lenovo is another good move. x86 servers will experience a multi-year revenue compression due to the rise of Atom and ARM hyperscale computers, cloud computing, the drive to virtualization and greater server utilization, the slowdown of workload shifts from Unix and mainframe computers, white boxes, and continued price/performance improvements. While the divestiture may impair IBM revenues initially, it should improve margins and have little impact upon its other commercial and government businesses.
IBM is not fading away or in danger of becoming obsolete. The global economies and disruptive technologies are changing the face of business, and IBM is similarly undergoing change to keep pace and eventually grow. It will continue to be a top IT investor in R&D, new patents, and strategic acquisitions. Experton Group expects Smarter Planet and big data/analytics to gain traction year-over-year, but the surprise for most will be IBM's ability to be one of the top three cloud service providers over the next few years. Cloud will not be the death of IBM (as some predict) but the basis for its next wave up. Business and IT executives should understand IBM's vision and roadmap, determine how these correlate with their own strategies and goals, and then include IBM on the short list of strategic partners and/or vendors where appropriate.
Governance, Risk, and Compliance (GRC) 2012 Prediction & Reality
Experton Group's GRC predictions were on target across the board. Government agencies and branches globally are still developing and implementing new policies and directives faster than ever. These regulatory changes, like the Dodd-Frank Wall Street Reform and Consumer Protection Act and the Patient Protection and Affordable Care Act (Obamacare), will be producing rules and regulations for years to come and will have negative corporate impacts throughout the decade. Moreover, it is unlikely European banks are sufficiently protected against the impacts of sovereign risk exposures that still exist globally. The U.S. has delayed the implementation of Basel III, which is intended to limit the risk of too much money concentrated in too few sovereign securities. The EU may delay as well, until 2014 at a minimum, which would prolong the risk exposure – the opposite of the law's intent. Thus, executives can plan on GRC uncertainties and the ripple effect of regulatory changes flowing well beyond 2012 and the next five years. This is not good news for the business community or a global economic recovery. Nonetheless, vendors enhanced their GRC products with more analytics, policy-based options, key indicators, and metrics. In addition, the market has matured to the point where industry titans such as IBM Corp., Oracle Corp., SAP AG, and SAS Institute provide stability, integrated solutions, and thought leadership. And even with all the acquisitions, there are still more than 400 small to mid-sized vendors offering GRC point solutions.
The ERM (Enterprise Risk Management) mantra has gained some traction, but because of economic and regulatory uncertainties it is not winning executive support for spending more resources in this area. In addition, the majority of companies have a disconnect between their GRC programs and legal activities. Instead, executives who are focused in this area have been working to drive standardization and process maturity across the enterprise. Overall, as expected, funding was tight and progress was slow.
DR/BC (disaster recovery/business continuity) remained a top ERM priority in 2012, with IT executives pursuing traditional and new options (such as cloud computing, deduplication, and snapshots) designed to reduce costs, resources, and processing time. Experton Group did not see an increased focus on DR/BC, leaving too many companies exposed to the vagaries of man-made or natural disasters. This year it was Hurricane Sandy that caught many unprepared for the consequences of the extent and duration of the damage, especially the extended loss of electricity.
Companies never expected such long delays in the restoration of electricity, and those that did not have sufficient emergency backups or backup sites were unable to operate for more than a week (and some even beyond a month). NYU Langone Medical Center's basement (where the Smilow Research Center was located) and lower floors flooded, resulting in the loss of years of research. Meanwhile, the New York Stock Exchange closed for two days as its DR/BC contingency plan proved flawed and not fully tested. Even after reopening, many securities firms struggled to function on backup power and with poor waste management capabilities.
Experton Group notes that more than one-third of companies in the Western World, and over two-thirds in the Middle East, do not have an adequate DR/BC plan in place; amongst those that do, large swaths of business processes remain without effective coverage. According to the 2011 European Disaster Recovery survey, 74 percent of organizations are not very confident that they can fully recover after a disaster; the top causes of disasters in Europe were hardware failures (61 percent), power failures (42 percent), and data corruption (35 percent). DR/BC is an insurance policy, and companies should be willing to spend a small amount (two to four percent of their IT budgets) to minimize the risk of business failure.
Investments in ERM enhancements were limited again this year even though most companies have put a formal ERM program or structure in place. One encouraging note is that the ERM program usually includes an IT risk assessment. But Experton Group remains amazed at the number of companies that do not commit to ERM initiatives and fall behind in protecting the entire enterprise from potential disasters. Furthermore, funding remained constrained for most, and projects moved forward at a snail's pace. With 2013 expected to be another mediocre economic year, Experton Group expects a small minority of companies to increase their investments in ERM while most will continue apace or reduce commitments.
Cloud and Outsourcing 2012 Prediction & Reality
Experton Group accurately predicted 2012 as the year cloud computing would gain momentum and become mainstream, and that the outsourcing markets would stabilize with limited growth. More than 80 percent of U.S. companies currently deploy cloud computing solutions or services, according to recent surveys. About 70 percent of cloud users take advantage of software-as-a-service (SaaS) offerings. The primary motivator was saving capital and/or operating expenses. The top four cloud application usages are email (47 percent of respondents), Web presence (45 percent), virtual desktop (44 percent), and collaboration tools (42 percent). SMB firms are moving to clouds more rapidly than large enterprises, and they are primarily users of off-premise private and public clouds. The larger firms are concentrating on in-house infrastructure- and platform-as-a-service solutions along with some hybrid off-site offerings. Security, data portability, and Internet downtime are the top user concerns with off-premise clouds. However, automation, integration, management, and training are also areas of concern for business and IT executives.
Overall the outsourcing markets expanded at a rate similar to 2011, at less than 10 percent. To maintain market share, many of the major outsourcing providers reduced their basic and blended rates slightly, which has impacted their margins as their costs have increased. Outsourcers are also including cloud computing solutions and services as another opportunity. It is currently a small component of overall revenues, but it should grow as companies seek to cut costs and improve time to market. As predicted, vendors' lack of standard definitions for cloud computing, along with the range and variety of offerings, still obscures what "cloud" actually means. Experton Group expects this to continue to evolve but slowly stabilize as winners in the various spaces, such as Amazon.com Inc. and Salesforce.com Inc., set the standards.
Enterprise consolidation and virtualization initiatives gained momentum in 2012, yet there remains a long way to go before most IT shops reach optimization. Nonetheless, IT executives are moving forward and acceptance is gaining speed. Experton Group still finds most enterprise data centers in the early stages of virtualization, and utilization levels therefore remain less than 20 percent on average. Moreover, few organizations have a clear picture of usage levels or of the overall costs of running old, unvirtualized equipment. This is true for storage as well as servers, and it translates to wasted hardware, software, and staffing expenses. Until management fully understands the impact of current operational characteristics, data centers will not become optimized. Moving to cloud solutions is not the answer to the problem – building best practices needs to precede the move to the cloud. Enterprises creating cloud environments in-house are primarily building infrastructure-as-a-service (IaaS) and platform-as-a-service (PaaS) solutions. Long term these platforms will reduce personnel costs, but initially costs will rise as transitional costs, including training, add to existing operating expenses.
Experton Group was correct about this being a decent year for offshoring, as pointed out above. Outsourcing providers invested more in targeted areas such as application verticals, application development, business process outsourcing, call centers/help desks, and cloud computing to better differentiate themselves. Experton Group was also on target in terms of outsourcers moving to global sourcing by acquiring more resources local to the markets they served, so as to operate with a 15-20 percent blended rate, with top talent being local. Pricing models are starting to change to accommodate cloud computing, slowly moving to "pay for performance" or fixed-price arrangements and away from the traditional per-day structure. India still leads in most areas of outsourcing, with China in second place and growing rapidly. Business process outsourcing (BPO) is still dominated by U.S.-based corporations.
Data Center Effectiveness Metrics
Most data center executives are improving their data center effectiveness and efficiency within and across data centers as part of their optimization efforts. However, what holds a number of executives back from making major strides in optimization planning are the questions of where to begin and what the metrics, paybacks, and desired targets are. Experton Group finds IT executives want a baseline of best practices and best of breed targets against which to compare their current operations.
Experton Group studies have found that most enterprises are not run at maximum effectiveness levels. As a result, a typical data center operation has the prospect of reducing costs by up to 40 percent and cutting energy expenditures by up to 80 percent within an existing data center. If one includes data center consolidation and use of cloud computing, the savings can be even greater.
IT executives that meet the proposed server metrics can reduce their overall operations costs by 25 percent or more. Depending on the current state of the equipment and software, IT executives can achieve significantly greater savings. For example, IT executives that have utilized the standard scale out approach to adding x86-architected servers can potentially reduce the number of servers in operation by 50 percent or more. Server virtualization can enhance the savings to a point greater than 75 percent.
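The consolidation arithmetic above can be sketched in a few lines. All figures below are illustrative assumptions, not Experton Group data:

```python
# Illustrative consolidation math (assumed figures, not Experton Group data).

servers_before = 400              # hypothetical scale-out x86 estate
annual_cost_per_server = 4000.0   # assumed power/space/admin cost per server

# Consolidation alone can retire roughly half of the machines ...
after_consolidation = servers_before * 0.5

# ... and virtualization on top can push the reduction past 75 percent.
after_virtualization = servers_before * 0.25

baseline_cost = servers_before * annual_cost_per_server
savings = (servers_before - after_virtualization) * annual_cost_per_server

print(f"Servers: {servers_before} -> {int(after_virtualization)}")
print(f"Annual savings: ${savings:,.0f} of ${baseline_cost:,.0f}")
```

On these assumed numbers, a 75 percent server reduction removes three-quarters of the per-server run-rate, which is how the savings can exceed the 25 percent overall figure cited for executives who merely meet the metrics.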
There have been tremendous advances in storage technologies as they relate to energy conservation as well. In one case study Experton Group examined, a company expanded its total storage from less than one petabyte to almost six petabytes over four years while reducing its overall costs by more than $12 million. Since enterprises constantly upgrade storage, in many cases turning it over every three years, this is another excellent place to look for savings.
IT executives should approach data center optimization as an opportunity to have each manager look closely at his/her area of focus and find innovative ways to reduce the cost of operations through new processes and technologies. This should not be a one shot deal but an ongoing annual effort that should be done as part of process improvement efforts or spring/fall planning. See Figure - IT Transformational Approach
Through detailed analysis, IT executives should be able to cut operations costs by 10 percent annually for each of the next five years while still expanding capacity to meet business workloads. IT executives should also consider working with outside firms to assist with the baseline and workload analyses as well as "what if" scenarios.
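It is worth noting that five consecutive 10 percent cuts compound rather than add: they remove about 41 percent of the cost base, not 50 percent. A short sketch (the starting Opex is an arbitrary index):

```python
# Compounding effect of a 10 percent annual cost cut over five years.
# Starting Opex is an arbitrary index of 100.

opex = 100.0
for year in range(1, 6):
    opex *= 0.90  # take 10 percent out of the remaining base each year
    print(f"Year {year}: Opex index = {opex:.1f}")

cumulative_cut = 100.0 - opex  # roughly a 41 percent total reduction
```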
Experton Group POV: Data center architecture and architects are needed to drive maximum efficiency and effectiveness within and across data centers. By using the KPIs, metrics, and suggested practices outlined in this research report, IT executives should be able to benchmark their environments, perform a gap analysis, and then drive optimization and process improvement initiatives that yield incremental benefits year after year. IT executives, working with facilities and finance staffs, should bend the IT cost curves so that each year they take at least 10 percent out of total Opex costs and 15 percent out of energy expenses. Moreover, IT executives should be able to find projects that are self-funding (i.e., payback in less than 12 months), whose savings can then fund initiatives with longer payback periods.
High Velocity Hybrid Cloud: CloudVelocity Emerges from Stealth Mode
A new type of hybrid cloud automation platform provider, CloudVelocity, has emerged from stealth after Series A funding. The company's primary objective is to enable enterprises to operate hybrid clouds seamlessly. "Cloud cloning, migration and failover are our first steps in that direction," said Rajeev Chawla, CEO of CloudVelocity.
Led by a deeply experienced team of system software, virtualization, storage, security and networking executives, the patent-pending CloudVelocity platform - called the One Hybrid Cloud (OHC) platform - aims to allow data center teams to scale and secure their distributed multi-tier applications and services into and between clouds. Success in doing that would remove many of the major barriers to the enterprise adoption of public clouds.
CloudVelocity automates the time-consuming and risk-laden critical processes required to deploy existing applications from traditional data centers into and between public clouds.
The following steps describe the Developer Edition:
Step 1: Download and install the CloudVelocity software on the servers that make up the application (the Enterprise Edition discovers the servers automatically, so this step is not needed there).
Step 2: The CloudVelocity software creates a blueprint of the servers and starts replicating the OS, binaries, libraries, app stacks, and application data in the cloud.
Step 3: At the click of a button, the client is ready to provision and launch a multi-tier application (made up of a group of systems) in the cloud. CloudVelocity extends the enterprise data center to the Cloud so that these distributed multi-tier applications now running in the cloud can access any necessary services residing back in the enterprise.
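The three steps above might be sketched as a simple orchestration flow. Every function and data name below is hypothetical, since CloudVelocity has not published an API; this is only an illustration of the described workflow:

```python
# Hypothetical sketch of CloudVelocity's three-step Developer Edition flow.
# All function and data names are illustrative; no public API is assumed.

def install_agent(servers):
    """Step 1: the user installs the software on each application server
    (the Enterprise Edition would discover the servers instead)."""
    return [{"host": s, "agent": "installed"} for s in servers]

def build_blueprint(agents):
    """Step 2: blueprint the servers and replicate OS, binaries,
    libraries, app stacks, and application data into the cloud."""
    return {a["host"]: ["os", "binaries", "libraries", "app_stack", "data"]
            for a in agents}

def launch_in_cloud(blueprint):
    """Step 3: provision and launch the multi-tier application as a
    group, keeping access to services left in the enterprise data center."""
    return [f"cloud-{host}" for host in blueprint]

servers = ["web01", "app01", "db01"]
instances = launch_in_cloud(build_blueprint(install_agent(servers)))
print(instances)  # one cloud instance per on-premises server
```

The point of the sketch is the pipeline shape: the multi-tier application moves as a group of systems, not one server at a time, which is what distinguishes this from simple image migration.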
Experton Group POV: First of all, we love the idea. At first glance this seems to be a dream come true – a plug-and-play solution. When the cloud first emerged as a potential alternative to building internal computing empires that then had to be maintained and kept current at considerable expense, everyone was very skeptical. As time progressed and some cloud-based services matured, organizations started using the cloud on a limited basis. CloudVelocity has opened the door to the idea of a transparent hybrid cloud solution. It is still too early to make a definitive statement as to whether the company will be successful.
One challenge here will be to overcome the security/privacy issues that continue to plague the cloud computing industry. This is especially true when you have a solution that admittedly will go into your network and “discover” all of the computing resources so it can back them up.
Call To Action: Most organizations are not in the business of building and maintaining computing capabilities. They realize computing resources are needed, but IT is viewed as a tool the business uses to become successful.
In recent organizational history, cost cutting has been a primary business objective, and it will continue to be so. Therefore, cloud-based computing resources are becoming a key part of the computing infrastructure.
In the case of a newly emerged service, it is best to let others get the bugs out first. Once an organization decides to use the service, it should try it out on non-critical, segregated business support areas. After a while, a certain level of comfort will be achieved; at that point, a decision can be made whether or not to increase the level of cloud computing relative to internal computing.
2013: The 10 Recommendations for the CIO
For the first time, business executives found technology factors to be the most important external force impacting their organizations. The CIO needs to adjust priorities and required skills accordingly.
- Build Innovation Teams: Nearly every company has put innovation at the top of its agenda. IT needs to build cross-functional teams to proactively address all areas of innovation – product, process, and business model. Almost 75 percent of business people think IT is not providing innovation leadership, and even more feel IT professionals lack the skills to do so.
- Develop Business Process Knowledge: The first task is to make existing applications more agile, and doing so requires IT to understand the business processes better. Second, with all the new opportunities around SaaS and web services, IT has to take a more proactive role in redesigning business processes and in using this knowledge to drive process innovation.
- Business Process Master Plan, Self Service, Total Customer Experience (TCE): Business owns the processes, but IT has to build the process knowledge to support and integrate them. Many new opportunities for automation and horizontal process improvement, such as self-service offerings, require a full understanding and redesign by the IT organization. The Total Customer Experience should be the ultimate goal and needs to be measured, as TCE quality is often not very high.
- BI, Big Data, Enterprise Performance Management: Business Intelligence has been at the top of the list for quite some time, but implementation is still relatively slow and mostly happens in isolated islands. It is necessary to move to the next level and implement an Enterprise Performance Management concept. With the increasing amount of data from many different sources, "Big Data" concepts provide excellent value for many enterprises.
- Workplace of the Future: BYOD (bring your own device) has sparked much controversy, but the fact is it cannot be stopped completely. In most companies these devices are first brought in by the executives themselves. Employees will use their preferred personal productivity tools and make their own decisions about notebooks, tablets, and smartphones. IT has to act and build a strategy, but it needs to go one step further and look into the workplace of the future: how should it look, and how can it deliver value to the business, motivate employees, and differentiate the company?
- Social Collaboration, Strategy & Recommendations: Social collaboration denotes the decentralization of the organization and an internally oriented communication style. For external and business orientation, the terms Social Enterprise/Social Business are often used, even though there is nothing really "social" about them in the sense of sustainability, responsible investments, or mindfulness. Through the impact of social media we all know how the communication style between people and organizations has changed. This needs to be included in the overall ICT strategy, and the IT organization is expected to formulate recommendations for the enterprise.
- Adjust IT Strategy with Flexibility and Business Focus: While most companies have achieved a stable IT environment and improved their efficiency considerably over the last years, most still have not built an agile IT – one able to adjust spending and resources quickly to changing markets. Similarly, most companies have an IT strategy, but the chapter on business vision, direction, and requirements is empty.
- Security & Data Protection: With cloud, BYOD, and mobility gaining momentum, security and data protection become mandatory for every business. Identity management has already been at the top of the priority list for some time, and it will also be fundamental for the rollout of many cloud solutions. Organizations need to build a framework for the cloud that also includes single sign-on, provisioning, chargeback, and security.
- Skill Analysis & HCM Strategy: Many companies have 90 percent of their IT skills focused on keeping the systems going: skills for operational management, help desk, infrastructure management, desktops and mobile devices, and application support, but only a small number of architects and business process experts. In other words, most skills exist in areas that are already or will soon be commodity offerings and deliver no differentiation for the company. This needs to be addressed and changed, but the required skills are difficult to find, and many vendor and user companies are competing for those scarce skills.
- Rework Sourcing Strategy: 80 percent of all server-based computing will be external by 2020. Companies need to understand the trend and prepare for it. It is important to decide what is commodity and where value for the enterprise can be generated. While data centers have been regularly updated, many still lack support for modern demands on power and cooling as well as DR, and business continuity plans also need updating. Server virtualization penetration has not reached a sufficient level. In the storage area most companies have not implemented the newest technologies, such as deduplication, thin provisioning, data compression, and encryption. The right balance between internal and external services, including cloud computing offerings, needs to be established, taking into account the required stability and agility for the enterprise.