
What Are Data Centers and Why Do They Matter? (Part 3 of 3)

18 Aug 2016 by Howard M Cohen

Every cloud service is produced at a purpose-built, special-function data center. Some produce data backup services, others provide productivity tools, and still others deliver general-purpose infrastructure support. But they are all IT data centers. We wrap things up in this final article of the series.

Writer Michael Palmer, posting on the Association of National Advertisers blog on November 3, 2006, may have been the first to say, “Data is the new oil.” A recent whitepaper of the same title from EMC Greenplum states, “Markets are fast disappearing, and being replaced by networks. Networks of intelligence.”

These networks, and the data centers behind them, have much in common with oil. Oil’s most important characteristic has always been availability: reduced availability drives prices up, and no availability creates disasters. The aftermath of natural disasters has made it obvious that a lack of oil is the greatest risk to the ongoing operation of municipalities and nations.

Today, the same can be said of information.

Mitigating risk, maintaining availability

“Over the next five years, the profile, topology and purpose of the data center tiers standard will change dramatically, driven by the digital world and the use of emerging technologies. Between 2013 and the end of 2014, $1 billion of new venture capital funding went into the Internet of Things (IoT), and in 2014, more than $40 billion was spent by enterprises on designing, implementing and operating the IoT (see “Drive Digital Business Using Insights From Symposium's Analyst Keynote” for more details). At the same time, there will be an increase in investment around the Nexus of Forces (cloud, social, mobile and information) as many businesses focus on growth and new opportunities,” according to Gartner’s report, “How to Select the Correct Data Center Option for the Digital World,” published on October 8, 2015.

This desire for change is leading to the development of the digital economy and forcing organizational leaders to focus on agility, innovation and gaining a competitive advantage. However, data centers must also adopt the latest technologies and integrate them through intelligent software layers.

Gartner makes recommendations for a modern data center strategy in the report, including three data center personality models for classifying your workloads. Read about the first of these personalities, which Gartner refers to as “Agility and Innovation,” in part one of this three-part article series. The next personality is the “Intelligent and Integrated” data center. Today, we wrap up the series with the “Availability and Risk” data center personality.

Governance, risk, compliance and security

When we think about risk in the context of the data center, we must also consider its attendant issues. Governance, Risk and Compliance (GRC) have long been considered concerns that must be addressed together, not separately.

It is useful to remember that we have made no mention yet of security. Achieving optimal network and data security is fundamentally unrelated to GRC. Achieving optimal security does not mean that we have achieved any particular level of governance, risk or compliance. Conversely, achieving effective GRC has no bearing on whether our information management environment is secure to any level.

Similar to security, however, GRC is only as strong as its weakest link. This means we must pay close attention to every measure we take to achieve security at every step along the information management value chain.

The largest link, and perhaps the most vulnerable, is the data center, where the broadest selection of components in the chain is housed. With all of the information a data center accesses and shares, risks abound at every level of the International Organization for Standardization’s Open Systems Interconnection (ISO/OSI) seven-layer model. These layers include:

Physical — A truly risk-resistant data center begins at the physical layer, with a well-reinforced structure that can resist penetration. Cables coming into the building from carriers are a point of vulnerability that must be heavily protected. The Department of Defense specified and built a conduit that could survive nuclear attack for the original Advanced Research Projects Agency Network (ARPANET). Some of that infrastructure, protecting the Mobitex network, enabled BlackBerry devices to continue working when everything else failed during the World Trade Center attack in New York on September 11, 2001.

Data link, network and transport — These layers take signals from the physical cables, translate them into manageable packets, and move those packets between devices both within the data center and externally. All three layers are vulnerable to external attackers and must be protected.

Session, presentation and application — These layers connect the information to the user: a session is established (protected in data center operations by at least an ID and password), resources are presented to the user in an efficient manner (with the definition of “efficient” being widely open to interpretation), and applications that take advantage of those resources are run. These layers present an additional set of vulnerabilities.

But at the end of this chain of layers is the one most technologists identify as “the toughest segment to manage” and “the segment between the keyboard and the back of the chair.” They’re referring, of course, to users.

Governing the use of information and information resources

The user level is where the greatest risk is introduced. It is actually both the beginning and the end of the model: information moves from the user into an application, is formatted by the presentation layer, carried within a session, and handed to the transport and network layers, which pass it to the data link layer to be moved across the physical layer to the other end of the transaction. At that point, it climbs back up the stack, from physical to data link to network to transport, through a session that uses the presentation layer to hand the data to an application, which finally conveys it to the user. With users at both ends, there is vulnerability at both ends.
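To make that round trip concrete, here is a minimal sketch written for this discussion (it is not from Gartner’s report or any product): each layer wraps the payload on the way down, and the receiving side unwraps it in reverse order on the way up. Only the layer names come from the OSI model; everything else is illustrative.

```python
# A minimal sketch of how data descends and climbs the OSI stack:
# each layer wraps the payload on the way down, and the receiving
# side unwraps it in reverse order on the way up.

LAYERS = [
    "application",
    "presentation",
    "session",
    "transport",
    "network",
    "data link",
    "physical",
]

def send(payload: str) -> str:
    """Walk down the stack, wrapping the payload with one header per layer."""
    frame = payload
    for layer in LAYERS:
        frame = f"[{layer}]{frame}"
    return frame  # what actually crosses the physical medium

def receive(frame: str) -> str:
    """Walk back up the stack, stripping each layer's header in reverse order."""
    for layer in reversed(LAYERS):
        prefix = f"[{layer}]"
        assert frame.startswith(prefix), f"malformed frame at the {layer} layer"
        frame = frame[len(prefix):]
    return frame  # delivered to the user at the far end

if __name__ == "__main__":
    wire_format = send("quarterly report")
    print(wire_format)           # headers added top-down
    print(receive(wire_format))  # original data recovered bottom-up
```

The point of the sketch is the symmetry: whatever a layer adds on the sending side must be checked and removed on the receiving side, and the user sits just outside both ends of that process.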

Generally, users cannot be controlled, only governed. They are furnished with the rules and regulations regarding their use of the network. When they obey these rules, healthy governance is maintained. The greatest risk of noncompliance, data loss or data exploitation comes when users don’t thoroughly obey the rules. This is why social engineering is one of the most popular forms of hacking today: users are simply easier to exploit than hardware or software.

Well-designed data center systems are created with a keen awareness of the user challenge. Many safeguards are built in to validate data flowing in all directions. Heuristic and behavioral analysis software has added useful tools to the effort, but only the establishment and ongoing enforcement of governance, in the data center and across the user community, can fill in the gaps.
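As a hedged illustration of what such heuristic or behavioral analysis can look like, the sketch below flags activity that falls outside a user’s learned baseline. The event fields, baseline values and thresholds are assumptions made for this example, not any particular product’s logic.

```python
# A hypothetical behavioral check: flag activity that deviates from a
# user's baseline. Fields and thresholds are illustrative assumptions.

from dataclasses import dataclass

@dataclass
class AccessEvent:
    user: str
    hour: int        # hour of day, 0-23
    bytes_out: int   # data transferred out in this event

# Per-user baseline learned from historical activity (assumed values).
BASELINE = {
    "alice": {"usual_hours": range(8, 19), "typical_bytes_out": 5_000_000},
}

def is_suspicious(event: AccessEvent, tolerance: float = 10.0) -> bool:
    """Flag events outside the user's usual hours or far above typical volume."""
    profile = BASELINE.get(event.user)
    if profile is None:
        return True  # unknown users are reviewed by default
    off_hours = event.hour not in profile["usual_hours"]
    oversized = event.bytes_out > tolerance * profile["typical_bytes_out"]
    return off_hours or oversized

if __name__ == "__main__":
    print(is_suspicious(AccessEvent("alice", hour=14, bytes_out=1_000_000)))    # False
    print(is_suspicious(AccessEvent("alice", hour=3, bytes_out=900_000_000)))   # True
```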

High availability, continuity and disaster recovery

You really can’t talk about GRC without discussing the high availability of the network, continuity of business in the event of challenges to the network and disaster recovery in the aftermath of catastrophe.

Modern data center technology provided by large, well-resourced providers is preferable in this context. Most major providers “bake in” continuity and high availability by connecting multiple redundant data centers across multiple redundant circuits with multiple redundant failover capabilities. The sheer expense of this redundancy puts it beyond the budget of most individual companies. But because these large-provider services are managed by experienced professionals who focus solely on this task, customers can count on the benefits of regular failover testing, the lack of which has caused some of the most highly publicized failures.
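The failover testing mentioned above can be as routine as probing each redundant site and confirming that traffic can be served from a healthy one. The sketch below is a hypothetical illustration of that check; the endpoint URLs and the health-check convention are assumptions, not any provider’s actual interface.

```python
# A hypothetical failover check: probe the primary site and, if it is
# unhealthy, confirm a redundant secondary can serve traffic instead.

import urllib.error
import urllib.request

SITES = [
    "https://primary.example.com/health",    # hypothetical primary
    "https://secondary.example.com/health",  # hypothetical secondary
]

def is_healthy(url: str, timeout: float = 3.0) -> bool:
    """Return True if the site's health endpoint answers with HTTP 200."""
    try:
        with urllib.request.urlopen(url, timeout=timeout) as resp:
            return resp.status == 200
    except (urllib.error.URLError, OSError):
        return False

def active_site() -> str:
    """Pick the first healthy site in priority order; fail loudly if none."""
    for url in SITES:
        if is_healthy(url):
            return url
    raise RuntimeError("no healthy site available: failover test failed")

if __name__ == "__main__":
    print("serving from:", active_site())
```

Real deployments add automated routing changes and data replication checks on top of this, but regularly exercising the failover path is the discipline credited above with preventing the most publicized failures.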

Co-location facilities are not quite as well suited to assuring risk mitigation and high availability, because much of the equipment is customer-furnished, which limits the co-location provider’s liability and, with it, the reliability the provider can guarantee. However, customers building their own private cloud capability may have the budget to include the required redundancies, checks and balances.

Regulatory compliance

Customers are well advised to remember that no individual technology product or facility is, by definition, compliant with any specific regulatory act. It is the business itself that must establish and maintain regulatory compliance. Facilities and products may be designed and implemented in ways that facilitate customer achievement of compliance with a specific regulatory act, but any provider claiming to be compliant is simply incorrect.

Compliance is meant to mitigate the risk of information being inappropriately shared, something that is especially true of an enterprise data center. Governance is meant to facilitate and enable compliance. This is why the three have been grouped so closely, and why any solution must address all three. The organization must then determine whether it has achieved adequate security and privacy protection for its network and data.

Gain Insight

Remember that the term “data center” no longer refers as much to a specific facility as it does to a functionality, and most information management environments going forward will combine the work of many data centers to achieve the goals of the organization.

Download Gartner’s report, “How to Select the Correct Data Center Option for the Digital World,” to help you blend the three data center personalities and provide your customers with everything they require from a data center to maximize their profitability and optimize their business operations.