
Data center

A data center (American English)[1] or data centre (Commonwealth English)[2][note 1] is a building, a dedicated space within a building, or a group of buildings[3] used to house computer systems and associated components, such as telecommunications and storage systems.[4][5]

ARSAT data center (2014)

Since IT operations are crucial for business continuity, it generally includes redundant or backup components and infrastructure for power supply, data communication connections, environmental controls (e.g., air conditioning, fire suppression), and various security devices. A large data center is an industrial-scale operation using as much electricity as a small town.[6]

History

 
NASA mission control computer room c. 1962

Data centers have their roots in the huge computer rooms of the 1940s, typified by ENIAC, one of the earliest examples of a data center.[7][note 2] Early computer systems, complex to operate and maintain, required a special environment in which to operate. Many cables were necessary to connect all the components, and methods to accommodate and organize these were devised such as standard racks to mount equipment, raised floors, and cable trays (installed overhead or under the elevated floor). A single mainframe required a great deal of power and had to be cooled to avoid overheating. Security became important – computers were expensive, and were often used for military purposes.[7][note 3] Basic design-guidelines for controlling access to the computer room were therefore devised.

During the boom of the microcomputer industry, and especially during the 1980s, users started to deploy computers everywhere, in many cases with little or no care about operating requirements. However, as information technology (IT) operations started to grow in complexity, organizations grew aware of the need to control IT resources. The availability of inexpensive networking equipment, coupled with new standards for the network structured cabling, made it possible to use a hierarchical design that put the servers in a specific room inside the company. The use of the term "data center", as applied to specially designed computer rooms, started to gain popular recognition about this time.[7][note 4]

The boom of data centers came during the dot-com bubble of 1997–2000.[8][note 5] Companies needed fast Internet connectivity and non-stop operation to deploy systems and to establish a presence on the Internet. Installing such equipment was not viable for many smaller companies. Many companies started building very large facilities, called Internet data centers (IDCs),[9] which provide enhanced capabilities, such as crossover backup: "If a Bell Atlantic line is cut, we can transfer them to ... to minimize the time of outage."[9]

The term cloud data centers (CDCs) has been used.[10] Data centers typically cost a lot to build and maintain.[8] Increasingly, the division of these terms has almost disappeared and they are being integrated into the term "data center".[11]

Requirements for modern data centers

 
Racks of telecommunications equipment in part of a data center

Modernization and data center transformation enhance performance and energy efficiency.[12]

Information security is also a concern, and for this reason, a data center has to offer a secure environment that minimizes the chances of a security breach. A data center must, therefore, keep high standards for assuring the integrity and functionality of its hosted computer environment.

Industry research company International Data Corporation (IDC) puts the average age of a data center at nine years old.[12] Gartner, another research company, says data centers older than seven years are obsolete.[13] The growth in data (163 zettabytes by 2025[14]) is one factor driving the need for data centers to modernize.

Focus on modernization is not new: obsolete equipment was decried as early as 2007,[15] and in 2011 the Uptime Institute was concerned about the age of the equipment therein.[note 6] By 2018 concern had shifted once again, this time to the age of the staff: "data center staff are aging faster than the equipment."[16]

Meeting standards for data centers

The Telecommunications Industry Association's Telecommunications Infrastructure Standard for Data Centers[17] specifies the minimum requirements for telecommunications infrastructure of data centers and computer rooms including single tenant enterprise data centers and multi-tenant Internet hosting data centers. The topology proposed in this document is intended to be applicable to any size data center.[18]

Telcordia GR-3160, NEBS Requirements for Telecommunications Data Center Equipment and Spaces,[19] provides guidelines for data center spaces within telecommunications networks, and environmental requirements for the equipment intended for installation in those spaces. These criteria were developed jointly by Telcordia and industry representatives. They may be applied to data center spaces housing data processing or Information Technology (IT) equipment. The equipment may be used to:

  • Operate and manage a carrier's telecommunication network
  • Provide data center based applications directly to the carrier's customers
  • Provide hosted applications for a third party to provide services to their customers
  • Provide a combination of these and similar data center applications

Data center transformation

Data center transformation takes a step-by-step approach through integrated projects carried out over time. This differs from a traditional method of data center upgrades that takes a serial and siloed approach.[20] The typical projects within a data center transformation initiative include standardization/consolidation, virtualization, automation and security.

  • Standardization/consolidation: Reducing the number of data centers[21][22] and avoiding server sprawl[23] (both physical and virtual)[24] often includes replacing aging data center equipment,[25] and is aided by standardization.[26]
  • Virtualization: Lowers capital and operational expenses,[27] reduces energy consumption.[28] Virtualized desktops can be hosted in data centers and rented out on a subscription basis.[29] Investment bank Lazard Capital Markets estimated in 2008 that 48 percent of enterprise operations would be virtualized by 2012. Gartner views virtualization as a catalyst for modernization.[30]
  • Automating: Automating tasks such as provisioning, configuration, patching, release management, and compliance is needed, and not only when facing a shortage of skilled IT workers.[26]
  • Securing: Protection of virtual systems is integrated with the existing security of physical infrastructures.[31]

Raised floor

 
Perforated cooling floor tile.

A raised floor standards guide named GR-2930 was developed by Telcordia Technologies, a subsidiary of Ericsson.[32]

Although the first raised floor computer room was made by IBM in 1956,[33] and raised floors have "been around since the 1960s",[34] it was in the 1970s that they became common in computer centers, allowing cool air to circulate more efficiently.[35][36]

The first purpose of the raised floor was to allow access for wiring.[33]

Lights out

The "lights-out"[37] data center, also known as a darkened or a dark data center, is a data center that, ideally, has all but eliminated the need for direct access by personnel, except under extraordinary circumstances. Because of the lack of need for staff to enter the data center, it can be operated without lighting. All of the devices are accessed and managed by remote systems, with automation programs used to perform unattended operations. In addition to the energy savings, reduction in staffing costs and the ability to locate the site further from population centers, implementing a lights-out data center reduces the threat of malicious attacks upon the infrastructure.[38][39]

Noise levels

Generally speaking, local authorities prefer noise levels at data centers to be "10dB below the existing night time background noise level at the nearest residence."[40]

OSHA regulations require monitoring of noise levels inside data centers if noise exceeds 85 decibels.[41] The average noise level in server areas of a data center may reach as high as 92-96 dB(A).[42]
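
Sound levels from many machines combine logarithmically rather than additively, which is why a room of individually moderate servers can approach the 92-96 dB(A) range cited above. A minimal sketch of the standard formula for combining incoherent sources; the server count and per-server level are illustrative assumptions:

```python
import math

def combined_db(levels_db):
    """Combine incoherent sound sources: L = 10*log10(sum(10^(Li/10)))."""
    return 10 * math.log10(sum(10 ** (l / 10) for l in levels_db))

# Forty servers at 75 dB(A) each, measured at the same point (illustrative;
# ignores distance and absorption):
print(f"{combined_db([75] * 40):.1f} dB(A)")  # ~91 dB(A)
```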

Residents living near data centers have described the sound as "a high-pitched whirring noise 24/7", saying “It’s like being on a tarmac with an airplane engine running constantly ... Except that the airplane keeps idling and never leaves.”[43][44][45][46]

External sources of noise include HVAC equipment and energy generators.[47][48]

Data center levels and tiers

The two organizations in the United States that publish data center standards are the Telecommunications Industry Association (TIA) and the Uptime Institute.

International standards EN50600 and ISO22237 Information technology — Data center facilities and infrastructures

  • Class 1 single path solution
  • Class 2 single path with redundancy solution
  • Class 3 multiple paths providing a concurrent repair/operate solution
  • Class 4 multiple paths providing a fault tolerant solution (except during maintenance)

Telecommunications Industry Association

The Telecommunications Industry Association's TIA-942 standard for data centers, published in 2005 and updated four times since, defined four infrastructure levels.[49]

  • Rated-1 - basically a server room, following basic guidelines
  • Rated-2 - redundant components; key components are redundant
  • Rated-3 - concurrently maintainable; able to handle maintenance on any part of the distribution path or any single piece of equipment without causing an interruption to data center operations
  • Rated-4 - fault tolerant; able to handle one single fault at a time on any part of the distribution path or any single piece of equipment without causing interruption to data center operations

Uptime Institute – Data center Tier Classification Standard

Four Tiers are defined by the Uptime Institute standard:

  • Tier I - BASIC CAPACITY and must include a UPS (uninterruptible power source)
  • Tier II - REDUNDANT CAPACITY and adds redundant power and cooling
  • Tier III - CONCURRENTLY MAINTAINABLE and ensures that ANY component can be taken out of service without affecting production
  • Tier IV - FAULT TOLERANT allowing any production capacity to be insulated from ANY type of failure.

A fifth tier has been trademarked by Switch, which has used this tier to define The Citadel, billed as the largest data center in the world.[50][51]

Data center design

The field of data center design has been growing for decades in various directions, including new construction big and small along with the creative re-use of existing facilities, like abandoned retail space, old salt mines and war-era bunkers.

  • a 65-story data center has already been proposed[52]
  • the number of data centers as of 2016 had grown beyond 3 million USA-wide, and more than triple that number worldwide[8]

Local building codes may govern the minimum ceiling heights and other parameters. Some of the considerations in the design of data centers are:

 
A typical server rack, commonly seen in colocation
  • Size - one room of a building, one or more floors, or an entire building
  • Capacity - can hold 1,000 or more servers[53]
  • Other considerations - Space, power, cooling, and costs in the data center.[54]
  • Mechanical engineering infrastructure - heating, ventilation and air conditioning (HVAC); humidification and dehumidification equipment; pressurization.[55]
  • Electrical engineering infrastructure design - utility service planning; distribution, switching and bypass from power sources; uninterruptible power source (UPS) systems; and more.[55][56]
 
A CRAC air handler

Design criteria and trade-offs

  • Availability expectations: The costs of avoiding downtime should not exceed the cost of the downtime itself[57] (a rough cost comparison is sketched after this list)
  • Site selection: Location factors include proximity to power grids, telecommunications infrastructure, networking services, transportation lines and emergency services. Other considerations should include flight paths, neighboring power drains, geological risks, and climate (associated with cooling costs).[58]
    • Often, power availability is the hardest to change.
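
To make the availability trade-off concrete, the sketch below compares the amortized annual cost of a redundancy upgrade against the expected annual cost of the downtime it would avoid. All figures are hypothetical and only illustrate the comparison, not real industry data.

```python
# Hypothetical trade-off: is a redundancy upgrade worth it?
# All figures are illustrative assumptions, not industry data.

upgrade_cost_per_year = 120_000.0   # amortized cost of extra UPS/generator capacity ($/yr)
outage_cost_per_hour = 50_000.0     # business cost of downtime ($/hr)
downtime_hours_avoided = 4.0        # expected outage hours per year the upgrade prevents

expected_savings = outage_cost_per_hour * downtime_hours_avoided
if expected_savings > upgrade_cost_per_year:
    print(f"Upgrade pays off: saves ${expected_savings - upgrade_cost_per_year:,.0f}/yr")
else:
    print(f"Upgrade costs ${upgrade_cost_per_year - expected_savings:,.0f}/yr more than it saves")
```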

High availability

Various metrics exist for measuring the data availability that results from data center availability beyond 95% uptime, with the top of the scale counting how many "nines" can be placed after "99%".[59]
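
As a rough illustration of what each additional "nine" buys, the sketch below converts an availability percentage into the maximum downtime it permits per year (a 365.25-day year is assumed):

```python
# Convert availability percentage into maximum downtime per year.
HOURS_PER_YEAR = 365.25 * 24

for availability in (95.0, 99.0, 99.9, 99.99, 99.999):
    downtime_hours = HOURS_PER_YEAR * (1 - availability / 100)
    if downtime_hours >= 1:
        print(f"{availability}% uptime -> {downtime_hours:.1f} hours of downtime/year")
    else:
        print(f"{availability}% uptime -> {downtime_hours * 60:.1f} minutes of downtime/year")
```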

Modularity and flexibility

Modularity and flexibility are key elements in allowing for a data center to grow and change over time. Data center modules are pre-engineered, standardized building blocks that can be easily configured and moved as needed.[60]

A modular data center may consist of data center equipment contained within shipping containers or similar portable containers.[61] Components of the data center can be prefabricated and standardized which facilitates moving if needed.[62]

Environmental control

Temperature and humidity are controlled via:

  • Air conditioning
  • Indirect cooling, such as using outside air,[63][64][note 7] Indirect Evaporative Cooling (IDEC) units, and also using sea water

It is important to keep computers from becoming humid or overheating. High humidity can lead to dust clogging the fans, which causes overheating, or can cause components to malfunction, ruining the board and creating a fire hazard. Overheating can cause components, usually the silicon or copper of the wires or circuits, to melt, loosening connections and creating fire hazards.

Electrical power

 
A bank of batteries in a large data center, used to provide power until diesel generators can start

Backup power consists of one or more uninterruptible power supplies, battery banks, and/or diesel or gas turbine generators.[65]

To prevent single points of failure, all elements of the electrical systems, including backup systems, are typically given redundant copies, and critical servers are connected to both the "A-side" and "B-side" power feeds. This arrangement is often made to achieve N+1 redundancy in the systems. Static transfer switches are sometimes used to ensure instantaneous switchover from one supply to the other in the event of a power failure.
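
A minimal sketch of what N+1 sizing means in practice: N is the smallest number of units that can carry the critical load, and one extra unit is added so that any single unit can fail or be serviced without dropping the load. The load and unit capacity below are illustrative assumptions.

```python
import math

def units_for_n_plus_1(load_kw: float, unit_capacity_kw: float) -> int:
    """Number of UPS units needed for N+1 redundancy at a given load."""
    n = math.ceil(load_kw / unit_capacity_kw)  # units strictly needed (N)
    return n + 1                               # one redundant unit (+1)

# Example: an 800 kW critical load served by 300 kW UPS modules.
print(units_for_n_plus_1(800, 300))  # N = 3, so N+1 = 4 units
```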

Low-voltage cable routing

Options include:

  • Overhead cable trays[66]
  • Raised floor cabling, both for security reasons and to avoid the extra cost of cooling systems over the racks
  • Smaller/less expensive data centers may use anti-static tiles instead as a flooring surface

Air flow

Air flow management addresses the need to improve data center computer cooling efficiency by preventing the recirculation of hot air exhausted from IT equipment and reducing bypass airflow. There are several methods of separating hot and cold airstreams, such as hot/cold aisle containment and in-row cooling units.[67]
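
The cost of recirculation and bypass follows from a basic heat balance: the airflow a rack needs is its heat load divided by the achievable air temperature rise, so any mixing of hot exhaust into the cold intake stream shrinks that rise and forces more airflow. A rough sketch with standard air properties; the rack load and temperature rise are assumed values:

```python
# Airflow needed to carry away IT heat: P = rho * Q * cp * dT  =>  Q = P / (rho * cp * dT)
RHO_AIR = 1.2    # air density, kg/m^3 (approximate, at sea level)
CP_AIR = 1005.0  # specific heat of air, J/(kg*K)

def airflow_m3_per_s(heat_load_w: float, delta_t_k: float) -> float:
    """Volumetric airflow required to remove heat_load_w at a delta_t_k air temperature rise."""
    return heat_load_w / (RHO_AIR * CP_AIR * delta_t_k)

# Example: a 10 kW rack with an 11 K (about 20 degF) intake-to-exhaust rise.
q = airflow_m3_per_s(10_000, 11)
print(f"{q:.2f} m^3/s  (~{q * 2118.88:.0f} CFM)")  # roughly 0.75 m^3/s, about 1600 CFM
```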

Aisle containment

Cold aisle containment is done by exposing the rear of equipment racks, while the fronts of the servers are enclosed with doors and covers. This is similar to how large-scale food companies refrigerate and store their products.

 
Typical cold aisle configuration with server rack fronts facing each other and cold air distributed through the raised floor.

Computer cabinets/Server farms are often organized for containment of hot/cold aisles. Proper air duct placement prevents the cold and hot air from mixing. Rows of cabinets are paired to face each other so that the cool and hot air intakes and exhausts don't mix air, which would severely reduce cooling efficiency.

Alternatively, a range of underfloor panels can create efficient cold air pathways directed to the raised floor vented tiles. Either the cold aisle or the hot aisle can be contained.[68]

Another option is fitting cabinets with vertical exhaust ducts (chimneys).[69] Hot exhaust ducts can direct the air into a plenum space above a dropped ceiling and back to the cooling units or to outside vents. With this configuration, a traditional hot/cold aisle configuration is not a requirement.[70]

Fire protection

 
FM200 fire suppression tanks

Data centers feature fire protection systems, including passive and active design elements, as well as the implementation of fire prevention programs in operations. Smoke detectors are usually installed to provide early warning of a fire at its incipient stage.

Although the main room usually does not allow wet pipe-based systems due to the fragile nature of circuit boards, there still exist systems that can be used in the rest of the facility or in cold/hot aisle air circulation systems that are closed systems, such as:[71]

  • Sprinkler systems
  • Misting, using high pressure to create extremely small water droplets, which can be used in sensitive rooms because of the small droplet size.

However, other means to put out fires also exist, especially in sensitive areas, usually using gaseous fire suppression, of which Halon gas was the most popular until the negative effects of producing and using it were discovered.[1]

Security

Physical access is usually restricted. Layered security often starts with fencing, bollards and mantraps.[72] Video camera surveillance and permanent security guards are almost always present if the data center is large or contains sensitive information. Fingerprint recognition mantraps are starting to become commonplace.

Logging access is required by some data protection regulations; some organizations tightly link this to access control systems. Multiple log entries can occur at the main entrance, entrances to internal rooms, and at equipment cabinets. Access control at cabinets can be integrated with intelligent power distribution units, so that locks are networked through the same appliance.[73]

Energy use

Energy use is a central issue for data centers. Power draw ranges from a few kW for a rack of servers in a closet to several tens of MW for large facilities. Some facilities have power densities more than 100 times that of a typical office building.[74] For higher power density facilities, electricity costs are a dominant operating expense and account for over 10% of the total cost of ownership (TCO) of a data center.[75]

Greenhouse gas emissions

In 2020 data centers (excluding cryptocurrency mining) and data transmission each used about 1% of world electricity.[76] Although some of this electricity was low carbon, the IEA called for more "government and industry efforts on energy efficiency, renewables procurement and RD&D",[76] as some data centers still use electricity generated by fossil fuels.[77] They also said that lifecycle emissions should be considered, that is including "embodied" emissions, such as in buildings.[76] Data centers are estimated to have been responsible for 0.5% of US greenhouse gas emissions in 2018.[78] Some Chinese companies, such as Tencent, have pledged to be carbon neutral by 2030, while others such as Alibaba have been criticized by Greenpeace for not committing to become carbon neutral.[79]

Energy efficiency and overhead

The most commonly used metric of data center energy efficiency is power usage effectiveness (PUE), calculated as the ratio of total power entering the data center divided by the power used by IT equipment:

$$\mathrm{PUE} = \frac{\text{Total Facility Power}}{\text{IT Equipment Power}}$$

It reflects the share of power consumed by overhead (cooling, lighting, etc.). The average US data center has a PUE of 2.0,[80] meaning two watts of total power (overhead + IT equipment) for every watt delivered to IT equipment. State of the art is estimated to be roughly 1.2.[81] Google publishes quarterly efficiency figures from its data centers in operation.[82]
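
A minimal sketch of the PUE calculation, using the figures quoted above:

```python
def pue(total_facility_kw: float, it_equipment_kw: float) -> float:
    """Power usage effectiveness: total facility power over IT equipment power."""
    return total_facility_kw / it_equipment_kw

# At PUE 2.0, every watt of IT load implies one more watt of overhead:
print(pue(total_facility_kw=2000, it_equipment_kw=1000))  # 2.0 (average US facility)
print(pue(total_facility_kw=1200, it_equipment_kw=1000))  # 1.2 (state of the art)
```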

The U.S. Environmental Protection Agency has an Energy Star rating for standalone or large data centers. To qualify for the ecolabel, a data center must be within the top quartile of energy efficiency of all reported facilities.[83] The Energy Efficiency Improvement Act of 2015 (United States) requires federal facilities, including data centers, to operate more efficiently. Title 24 (2014) of the California Code of Regulations mandates that every newly constructed data center must have some form of airflow containment in place to optimize energy efficiency.

The European Union also has a similar initiative: the EU Code of Conduct for Data Centres.[84]

Energy use analysis and projects

The focus of measuring and analyzing energy use goes beyond what's used by IT equipment; facility support hardware such as chillers and fans also use energy.[85]

In 2011, server racks in data centers were designed for more than 25 kW, and the typical server was estimated to waste about 30% of the electricity it consumed. The energy demand for information storage systems was also rising. A high-availability data center was estimated to have a 1 megawatt (MW) demand and to consume $20,000,000 in electricity over its lifetime, with cooling representing 35% to 45% of the data center's total cost of ownership. Calculations showed that in two years the cost of powering and cooling a server could equal the cost of purchasing the server hardware.[86] Research in 2018 showed that a substantial amount of energy could still be conserved by optimizing IT refresh rates and increasing server utilization.[87]
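
The two-year claim can be sanity-checked with rough arithmetic. The server draw, tariff, PUE, and hardware price below are illustrative assumptions, not figures from the cited study:

```python
# Rough check of the claim that two years of power + cooling can equal server cost.
# All inputs are illustrative assumptions.
server_draw_kw = 0.5        # average server power draw
pue = 2.0                   # facility overhead factor (cooling, distribution, ...)
tariff_per_kwh = 0.10       # electricity price, $/kWh
server_price = 1_800.0      # purchase price of the server, $

hours = 2 * 8760            # two years
energy_cost = server_draw_kw * pue * hours * tariff_per_kwh
print(f"Two-year power+cooling cost: ${energy_cost:,.0f} vs purchase ${server_price:,.0f}")
# 0.5 kW * 2.0 * 17520 h * $0.10/kWh = $1,752, comparable to the hardware cost
```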

In 2011 Facebook, Rackspace and others founded the Open Compute Project (OCP) to develop and publish open standards for greener data center computing technologies. As part of the project Facebook published the designs of its server, which it had built for its first dedicated data center in Prineville. Making servers taller left space for more effective heat sinks and enabled the use of fans that moved more air with less energy. By not buying commercial off-the-shelf servers, energy consumption due to unnecessary expansion slots on the motherboard and unneeded components, such as a graphics card, was also saved.[88] In 2016 Google joined the project and published the designs of its 48V DC shallow data center rack. This design had long been part of Google data centers. By eliminating the multiple transformers usually deployed in data centers, Google had achieved a 30% increase in energy efficiency.[89] In 2017 sales for data center hardware built to OCP designs topped $1.2 billion and are expected to reach $6 billion by 2021.[88]

Power and cooling analysis

 
Data center at CERN (2010)

Power is the largest recurring cost to the user of a data center.[90] Cooling it at or below 70 °F (21 °C) wastes money and energy.[90] Furthermore, overcooling equipment in environments with a high relative humidity can expose equipment to a high amount of moisture that facilitates the growth of salt deposits on conductive filaments in the circuitry.[91]

A power and cooling analysis, also referred to as a thermal assessment, measures the relative temperatures in specific areas as well as the capacity of the cooling systems to handle specific ambient temperatures.[92] A power and cooling analysis can help to identify hot spots, over-cooled areas that can handle greater power use density, the breakpoint of equipment loading, the effectiveness of a raised-floor strategy, and optimal equipment positioning (such as AC units) to balance temperatures across the data center. Power cooling density is a measure of how much square footage the center can cool at maximum capacity.[93] The cooling of data centers is the second largest power consumer after the servers themselves. Cooling energy ranges from 10% of total energy consumption in the most efficient data centers up to 45% in standard air-cooled data centers.

Energy efficiency analysis

An energy efficiency analysis measures the energy use of data center IT and facilities equipment. A typical energy efficiency analysis measures factors such as a data center's power use effectiveness (PUE) against industry standards, identifies mechanical and electrical sources of inefficiency, and identifies air-management metrics.[94] However, the limitation of most current metrics and approaches is that they do not include IT in the analysis. Case studies have shown that by addressing energy efficiency holistically in a data center, major efficiencies can be achieved that are not possible otherwise.[95]

Computational fluid dynamics (CFD) analysis

This type of analysis uses sophisticated tools and techniques to understand the unique thermal conditions present in each data center—predicting the temperature, airflow, and pressure behavior of a data center to assess performance and energy consumption, using numerical modeling.[96] By predicting the effects of these environmental conditions, CFD analysis in the data center can be used to predict the impact of high-density racks mixed with low-density racks[97] and the onward impact on cooling resources, poor infrastructure management practices and AC failure or AC shutdown for scheduled maintenance.

Thermal zone mapping

Thermal zone mapping uses sensors and computer modeling to create a three-dimensional image of the hot and cool zones in a data center.[98]

This information can help to identify optimal positioning of data center equipment. For example, critical servers might be placed in a cool zone that is serviced by redundant AC units.

Green data centers

 
This water-cooled data center in the Port of Strasbourg, France, markets itself as green.

Data centers use a lot of power, consumed by two main usages: the power required to run the actual equipment and the power required to cool the equipment. Power efficiency reduces the first category.

Reducing cooling costs through natural means includes location decisions: when proximity to good fiber connectivity, power grid connections, and concentrations of people to manage the equipment is not a priority, a data center can be miles away from its users. 'Mass' data centers like those of Google or Facebook don't need to be near population centers. Arctic locations, which can use outside air for cooling, are becoming more popular.[99]

Renewable electricity sources are another plus. Thus countries with favorable conditions, such as Canada,[100] Finland,[101] Sweden,[102] Norway,[103] and Switzerland,[104] are trying to attract cloud computing data centers.

Bitcoin mining is increasingly being seen as a potential way to build data centers at the site of renewable energy production. Curtailed and clipped energy can be used to secure transactions on the Bitcoin blockchain providing another revenue stream to renewable energy producers.[105]

Energy reuse

It is very difficult to reuse the heat from air-cooled data centers. For this reason, data center infrastructures are more often equipped with heat pumps.[106] An alternative to heat pumps is the adoption of liquid cooling throughout a data center. Different liquid cooling techniques are mixed and matched to allow for a fully liquid-cooled infrastructure that captures all heat in water. The technologies fall into three main groups: indirect liquid cooling (water-cooled racks), direct liquid cooling (direct-to-chip cooling), and total liquid cooling (complete immersion in liquid; see server immersion cooling). This combination of technologies allows the creation of a thermal cascade, as part of temperature chaining scenarios, to produce high-temperature water outputs from the data center.

Dynamic infrastructure

Dynamic infrastructure[107] provides the ability to intelligently, automatically and securely move workloads within a data center[108] anytime, anywhere, for migrations, provisioning,[109] to enhance performance, or for building co-location facilities. It also facilitates routine maintenance on either physical or virtual systems, all while minimizing interruption. A related concept is composable infrastructure, which allows for the dynamic reconfiguration of the available resources to suit needs, only when needed.[110]

Side benefits include

Network infrastructure

 
An operation engineer overseeing a network operations control room of a data center (2006)
 
An example of network infrastructure of a data center

Communications in data centers today are most often based on networks running the Internet protocol suite. Data centers contain a set of routers and switches that transport traffic between the servers and to the outside world,[112] connected according to the data center network architecture. Redundancy of the Internet connection is often provided by using two or more upstream service providers (see multihoming).

Some of the servers at the data center are used for running the basic Internet and intranet services needed by internal users in the organization, e.g., e-mail servers, proxy servers, and DNS servers.

Network security elements are also usually deployed: firewalls, VPN gateways, intrusion detection systems, and so on. Also common are monitoring systems for the network and some of the applications. Additional off-site monitoring systems are also typical, in case of a failure of communications inside the data center.

Software/data backup

Non-mutually exclusive options for data backup are:

  • Onsite
  • Offsite

Onsite is traditional,[113] and one major advantage is immediate availability.

Offsite backup storage

Data backup techniques include having an encrypted copy of the data offsite. Methods used for transporting data are:[114]

  • having the customer write the data to a physical medium, such as magnetic tape, and then transporting the tape elsewhere.[115]
  • directly transferring the data to another site during the backup, using appropriate links
  • uploading the data "into the cloud"[116]

Modular data center

For quick deployment or disaster recovery, several large hardware vendors have developed mobile/modular solutions that can be installed and made operational in a very short time.

See also

Notes

  1. ^ See spelling differences.
  2. ^ Old large computer rooms that housed machines like the U.S. Army's ENIAC, which were developed pre-1960 (1945), are now referred to as "data centers".
  3. ^ Until the early 1960s, it was primarily the government that used computers, which were large mainframes housed in rooms that today we call data centers.
  4. ^ In the 1990s, network-connected minicomputers (servers) running without input or display devices were housed in the old computer rooms. These new "data centers" or "server rooms" were built within company walls, co-located with low-cost networking equipment.
  5. ^ There was considerable construction of data centers during the early 2000s, in the period of expanding dot-com businesses.
  6. ^ In May 2011, data center research organization Uptime Institute reported that 36 percent of the large companies it surveyed expect to exhaust IT capacity within the next 18 months. James Niccolai. "Data Centers Turn to Outsourcing to Meet Capacity Needs". CIO magazine.
  7. ^ instead of chillers/air conditioners, resulting in energy savings

References

  1. ^ "An Oregon Mill Town Learns to Love Facebook and Apple". The New York Times. March 6, 2018.
  2. ^ "Google announces London cloud computing data centre". BBC.com. July 13, 2017.
  3. ^ "Cloud Computing Brings Sprawling Centers, but Few Jobs". The New York Times. August 27, 2016. data center .. a giant .. facility .. 15 of these buildings, and six more .. under construction
  4. ^ "From Manhattan to Montvale". The New York Times. April 20, 1986.
  5. ^ Ashlee Vance (December 8, 2008). "Dell Sees Double With Data Center in a Container". The New York Times.
  6. ^ James Glanz (September 22, 2012). "Power, Pollution and the Internet". The New York Times. Retrieved 2012-09-25.
  7. ^ a b c Angela Bartels (August 31, 2011). . Archived from the original on October 24, 2018. Retrieved October 24, 2018.
  8. ^ a b c Cynthia Harvey (July 10, 2017). "Data Center". Datamation.
  9. ^ a b John Holusha (May 14, 2000). "Commercial Property/Engine Room for the Internet; Combining a Data Center With a 'Telco Hotel'". The New York Times. Retrieved June 23, 2019.
  10. ^ H Yuan. "Workload-Aware Request Routing in Cloud Data Center". doi:10.1109/JSEE.2015.00020. S2CID 59487957.
  11. ^ Quentin Hardy (October 4, 2011). "A Data Center Power Solution". The New York Times.
  12. ^ a b "Mukhar, Nicholas. "HP Updates Data Center Transformation Solutions," August 17, 2011".
  13. ^ "Sperling, Ed. "Next-Generation Data Centers," Forbes, March 15. 2010". Forbes.com. Retrieved 2013-08-30.
  14. ^ "IDC white paper, sponsored by Seagate" (PDF).
  15. ^ "Data centers are aging, unsuited for new technologies". December 10, 2007.
  16. ^ "Data center staff are aging faster than the equipment". Network World. August 30, 2018.
  17. ^ "TIA-942 Certified Data Centers - Consultants - Auditors - TIA-942.org". www.tia-942.org.
  18. ^ . Archived from the original on November 6, 2011. Retrieved November 7, 2011.
  19. ^ "GR-3160 - Telecommunications Data Center - Telcordia". telecom-info.njdepot.ericsson.net.
  20. ^ . Archived from the original on August 10, 2011. Retrieved September 9, 2011.
  21. ^ "the Era of Great Data Center Consolidation". Fortune. February 16, 2017. 'Friends don't let friends build data centers,' said Charles Phillips, chief executive officer of Infor, a business software maker
  22. ^ "This Wave of Data Center Consolidation is Different from the First One". February 8, 2018.
  23. ^ "Start A Fire". startafire.com.
  24. ^ "Stop Virtual Server Sprawl". IBMsystemsMagazine.com.
  25. ^ "Top reasons to upgrade vintage data centers" (PDF).
  26. ^ a b "Complexity: Growing Data Center Challenge". Data Center Knowledge. May 16, 2007.
  27. ^ "Carousel's Expert Walks Through Major Benefits of Virtualization". technews.tmcnet.com.
  28. ^ Stephen Delahunty (August 15, 2011). . InformationWeek. Archived from the original on 2012-04-02.
  29. ^ "HVD: the cloud's silver lining" (PDF). Intrinsic Technology. Archived from the original (PDF) on October 2, 2012. Retrieved August 30, 2012.
  30. ^ "Gartner: Virtualization Disrupts Server Vendors". December 2, 2008.
  31. ^ "Ritter, Ted. Nemertes Research, "Securing the Data-Center Transformation Aligning Security and Data-Center Dynamics"".
  32. ^ "GR-2930 - NEBS: Raised Floor Requirements".
  33. ^ a b "Data Center Raised Floor History" (PDF).
  34. ^ "Raised Floor Info | Tips for Ordering Replacement Raised Floor Tiles". www.accessfloorsystems.com.
  35. ^ Hwaiyu Geng (2014). Data Center Handbook. ISBN 978-1118436639.
  36. ^ Steven Spinazzola (2005). "HVAC: The Challenge And Benefits of Under Floor Air Distribution Systems". FacilitiesNet.com.
  37. ^ "Premier 100 Q&A: HP's CIO sees 'lights-out' data centers". Informationweek. March 6, 2006.
  38. ^ Victor Kasacavage (2002). Complete book of remote access: connectivity and security. The Auerbach Best Practices Series. CRC Press. p. 227. ISBN 0-8493-1253-1.
  39. ^ Roxanne E. Burkey; Charles V. Breakfield (2000). Designing a total data solution: technology, implementation and deployment. Auerbach Best Practices. CRC Press. p. 24. ISBN 0-8493-0893-3.
  40. ^ Clarke, Renaud (2020-07-01). "Acoustic Barriers for Data Centres". IAC Acoustics. Retrieved 2023-02-11.
  41. ^ Thibodeau, Patrick (2007-07-31). "That sound you hear? The next data center problem". Computerworld. Retrieved 2023-02-11.
  42. ^ Sensear. "Data Center Noise Levels". Sensear. Retrieved 2023-02-11.
  43. ^ Weisbrod, Katelyn (2023-02-10). "In Northern Virginia, a Coming Data Center Boom Sounds a Community Alarm". Inside Climate News. Retrieved 2023-02-11.
  44. ^ Judge, Peter (2022-07-19). "Prince William residents complain of "catastrophic noise" from data centers". DCD. Retrieved 2023-02-11.
  45. ^ Judge, Peter (2022-07-27). "Chicago residents complain of noise from Digital Realty data center". DCD. Retrieved 2023-02-11.
  46. ^ Phillips, Mark (2021-11-30). "Chandler to consider banning data centers amid noise complaints". ABC15 Arizona in Phoenix (KNXV). Retrieved 2023-02-11.
  47. ^ "Data Center Soundproofing and Noise Control- Reduce Server Noise". DDS Acoustical Specialties. Retrieved 2023-02-11.
  48. ^ Bosker, Bianca (2019-12-06). "Your "cloud" data is making noise on the ground". Marketplace. Retrieved 2023-02-11.
  49. ^ "Telecommunications Infrastructure Standard for Data Centers". ihs.com. 2005-04-12. Retrieved 2017-02-28.
  50. ^ "The World's Only Tier 5® Data Center Provider". Switch. Retrieved 2022-04-21.
  51. ^ "Citadel". Switch. Retrieved 2022-04-21.
  52. ^ Patrick Thibodeau (April 12, 2016). "Envisioning a 65-story data center". Computerworld.
  53. ^ "Google Container Datacenter Tour (video)". YouTube. Archived from the original on 2021-11-04.
  54. ^ "Romonet Offers Predictive Modeling Tool For Data Center Planning". June 29, 2011.
  55. ^ a b "BICSI News Magazine - May/June 2010". www.nxtbook.com.
  56. ^ "Hedging Your Data Center Power".
  57. ^ Clark, Jeffrey. "The Price of Data Center Availability—How much availability do you need?", Oct. 12, 2011, The Data Center Journal . Archived from the original on 2011-12-03. Retrieved 2012-02-08.
  58. ^ "Five tips on selecting a data center location".
  59. ^ . YouTube. Archived from the original on 2012-08-29.
  60. ^ Niles, Susan. "Standardization and Modularity in Data Center Physical Infrastructure," 2011, Schneider Electric, page 4. (PDF). Archived from the original (PDF) on 2012-04-16. Retrieved 2012-02-08.
  61. ^ "Strategies for the Containerized Data Center". September 8, 2011.
  62. ^ Niccolai, James (2010-07-27). "HP says prefab data center cuts costs in half".
  63. ^ . Reuters. 2009-09-14. Archived from the original on 2009-09-26.
  64. ^ "Air to air combat - indirect air cooling wars".
  65. ^ Detailed explanation of UPS topologies (PDF). Archived from the original (PDF) on 2010-11-22.
  66. ^ "Cable tray systems support cables' journey through the data center". April 2016.
  67. ^ Mike Fox (2012-02-15). . DataCenterFix. Archived from the original on March 1, 2012. Retrieved February 27, 2012.
  68. ^ Hot-Aisle vs. Cold-Aisle Containment for Data Centers, John Niemann, Kevin Brown, and Victor Avelar, APC by Schneider Electric White Paper 135, Revision 1
  69. ^ "US Patent Application for DUCTED EXHAUST EQUIPMENT ENCLOSURE Patent Application (Application #20180042143 issued February 8, 2018) - Justia Patents Search". patents.justia.com. Retrieved 2018-04-17.
  70. ^ "Airflow Management Basics – Comparing Containment Systems • Data Center Frontier". Data Center Frontier. 2017-07-27. Retrieved 2018-04-17.
  71. ^ "Data Center Fire Suppression Systems: What Facility Managers Should Consider". Facilitiesnet.
  72. ^ Sarah D. Scalet (2005-11-01). "19 Ways to Build Physical Security Into a Data Center". Csoonline.com. Retrieved 2013-08-30.
  73. ^ Systems and methods for controlling an electronic lock for a remote device, 2016-08-01, retrieved 2018-04-25
  74. ^ "Data Center Energy Consumption Trends". U.S. Department of Energy. Retrieved 2010-06-10.
  75. ^ J. Koomey, C. Belady, M. Patterson, A. Santos, K.D. Lange: Assessing Trends Over Time in Performance, Costs, and Energy Use for Servers Released on the web August 17th, 2009.
  76. ^ a b c "Data Centres and Data Transmission Networks – Analysis". IEA. Retrieved 2022-03-06.
  77. ^ Kantor, Alice (2021-05-18). "Big Tech races to clean up act as cloud energy use grows". Financial Times. Archived from the original on 2022-12-10. Retrieved 2022-03-06.
  78. ^ Siddik, Md Abu Bakar; Shehabi, Arman; Marston, Landon (2021-05-21). "The environmental footprint of data centers in the United States". Environmental Research Letters. 16 (6): 064017. Bibcode:2021ERL....16f4017S. doi:10.1088/1748-9326/abfba1. ISSN 1748-9326. S2CID 235282419.
  79. ^ James, Greg (2022-03-01). "Tencent pledges to achieve carbon neutrality by 2030". SupChina. Retrieved 2022-03-06.
  80. ^ "Report to Congress on Server and Data Center Energy Efficiency" (PDF). U.S. Environmental Protection Agency ENERGY STAR Program.
  81. ^ (PDF). Silicon Valley Leadership Group. Archived from the original (PDF) on 2011-07-07. Retrieved 2010-06-10.
  82. ^ "Efficiency: How we do it – Data centers". Retrieved 2015-01-19.
  83. ^ Commentary on introduction of Energy Star for Data Centers . Jack Pouchet. 2010-09-27. Archived from the original (Web site) on 2010-09-25. Retrieved 2010-09-27.
  84. ^ . iet.jrc.ec.europa.eu. Archived from the original on 2013-08-11. Retrieved 2013-08-30.
  85. ^ "UNICOM Global :: Home" (PDF). www.gtsi.com.
  86. ^ Daniel Minoli (2011). Designing Green Networks and Network Operations: Saving Run-the-Engine Costs. CRC Press. p. 5. ISBN 9781439816394.
  87. ^ Rabih Bashroush (2018). "A Comprehensive Reasoning Framework for Hardware Refresh in Data Centres". IEEE Transactions on Sustainable Computing. 3 (4): 209–220. doi:10.1109/TSUSC.2018.2795465. S2CID 54462006.
  88. ^ a b Peter Sayer (March 28, 2018). "What is the Open Compute Project?". NetworkWorld.
  89. ^ Peter Judge (March 9, 2016). "OCP Summit: Google joins and shares 48V tech". DCD Data center Dynamics.
  90. ^ a b Joe Cosmano (2009), Choosing a Data Center (PDF), Disaster Recovery Journal, retrieved 2012-07-21
  91. ^ David Garrett (2004), Heat Of The Moment, Processor, archived from the original on 2013-01-31, retrieved 2012-07-21
  92. ^ "HP's Green Data Center Portfolio Keeps Growing - InternetNews". www.internetnews.com. 25 July 2007.
  93. ^ Inc. staff (2010), How to Choose a Data Center, retrieved 2012-07-21
  94. ^ "Siranosian, Kathryn. "HP Shows Companies How to Integrate Energy Management and Carbon Reduction," TriplePundit, April 5, 2011".
  95. ^ Rabih Bashroush; Eoin Woods (2017). "Architectural Principles for Energy-Aware Internet-Scale Applications". IEEE Software. 34 (3): 14–17. doi:10.1109/MS.2017.60. S2CID 8984662.
  96. ^ Bullock, Michael. "Computation Fluid Dynamics - Hot topic at Data Center World," Transitional Data Services, March 18, 2010. January 3, 2012, at the Wayback Machine
  97. ^ (PDF). Archived from the original (PDF) on 2014-04-29. Retrieved 2012-02-08.
  98. ^ "HP Thermal Zone Mapping plots data center hot spots".
  99. ^ "Fjord-cooled DC in Norway claims to be greenest". 23 December 2011. Retrieved 23 December 2011.
  100. ^ Canada Called Prime Real Estate for Massive Data Computers - Globe & Mail Retrieved June 29, 2011.
  101. ^ Finland - First Choice for Siting Your Cloud Computing Data Center.. Retrieved 4 August 2010.
  102. ^ . Archived from the original on 19 August 2010. Retrieved 4 August 2010.
  103. ^ In a world of rapidly increasing carbon emissions from the ICT industry, Norway offers a sustainable solution Retrieved 1 March 2016.
  104. ^ Swiss Carbon-Neutral Servers Hit the Cloud.. Retrieved 4 August 2010.
  105. ^ Bitcoin, Surplus. "Bitcoin Does Not Waste Energy". Surplus Bitcoin. Retrieved 2020-04-19.
  106. ^ "Data Center Cooling with Heat Recovery" (PDF). StockholmDataParks.com. January 23, 2017.
  107. ^ "Method for Dynamic Information Technology Infrastructure Provisioning".
  108. ^ Meyler, Kerrie (April 29, 2008). "The Dynamic Datacenter". Network World.
  109. ^ "Computation on Demand: The Promise of Dynamic Provisioning".
  110. ^ "Just What the Heck Is Composable Infrastructure, Anyway?". IT Pro. July 14, 2016.
  111. ^ Montazerolghaem, Ahmadreza (2020-07-13). "Software-defined load-balanced data center: design, implementation and performance analysis" (PDF). Cluster Computing. 24 (2): 591–610. doi:10.1007/s10586-020-03134-x. ISSN 1386-7857. S2CID 220490312.
  112. ^ Mohammad Noormohammadpour; Cauligi Raghavendra (July 16, 2018). "Datacenter Traffic Control: Understanding Techniques and Tradeoffs". IEEE Communications Surveys & Tutorials. 20 (2): 1492–1525. arXiv:1712.03530. doi:10.1109/comst.2017.2782753. S2CID 28143006.
  113. ^ "Protecting Data Without Blowing The Budget, Part 1: Onsite Backup". Forbes. October 4, 2018.
  114. ^ "Iron Mountain vs Amazon Glacier: Total Cost Analysis" (PDF).
  115. ^ What IBM calls "PTAM: Pickup Truck Access Method." "PTAM - Pickup Truck Access Method (disaster recovery slang)".
  116. ^ "Iron Mountain introduces cloud backup and management service". September 14, 2017. {{cite magazine}}: Cite magazine requires |magazine= (help)

External links

  • Research, development, demonstration, and deployment of energy-efficient technologies and practices for data centers
  • FAQ: 380VDC testing and demonstration at a Sun data center
  • White Paper - Property Taxes: The New Challenge for Data Centers
  • The European Commission H2020 EURECA Data Centre Project - Data centre energy efficiency guidelines, extensive online training material, case studies/lectures (under events page), and tools.

data, center, data, center, american, english, data, centre, commonwealth, english, note, building, dedicated, space, within, building, group, buildings, used, house, computer, systems, associated, components, such, telecommunications, storage, systems, arsat,. A data center American English 1 or data centre Commonwealth English 2 note 1 is a building a dedicated space within a building or a group of buildings 3 used to house computer systems and associated components such as telecommunications and storage systems 4 5 ARSAT data center 2014 Since IT operations are crucial for business continuity it generally includes redundant or backup components and infrastructure for power supply data communication connections environmental controls e g air conditioning fire suppression and various security devices A large data center is an industrial scale operation using as much electricity as a small town 6 Contents 1 History 2 Requirements for modern data centers 2 1 Meeting standards for data centers 2 2 Data center transformation 2 3 Raised floor 2 4 Lights out 2 5 Noise levels 3 Data center levels and tiers 3 1 International standards EN50600 and ISO22237 Information technology Data center facilities and infrastructures 3 2 Telecommunications Industry Association 3 3 Uptime Institute Data center Tier Classification Standard 4 Data center design 4 1 Design criteria and trade offs 4 1 1 High availability 4 1 2 Modularity and flexibility 4 2 Environmental control 4 3 Electrical power 4 4 Low voltage cable routing 4 5 Air flow 4 5 1 Aisle containment 4 6 Fire protection 4 7 Security 5 Energy use 5 1 Greenhouse gas emissions 5 2 Energy efficiency and overhead 5 3 Energy use analysis and projects 5 4 Power and cooling analysis 5 5 Energy efficiency analysis 5 6 Computational fluid dynamics CFD analysis 5 7 Thermal zone mapping 5 8 Green data centers 5 9 Energy reuse 6 Dynamic infrastructure 7 Network infrastructure 8 Software data backup 8 1 Offsite backup storage 9 Modular data center 10 See also 11 Notes 12 References 13 External linksHistory Edit NASA mission control computer room c 1962 Data centers have their roots in the huge computer rooms of the 1940s typified by ENIAC one of the earliest examples of a data center 7 note 2 Early computer systems complex to operate and maintain required a special environment in which to operate Many cables were necessary to connect all the components and methods to accommodate and organize these were devised such as standard racks to mount equipment raised floors and cable trays installed overhead or under the elevated floor A single mainframe required a great deal of power and had to be cooled to avoid overheating Security became important computers were expensive and were often used for military purposes 7 note 3 Basic design guidelines for controlling access to the computer room were therefore devised During the boom of the microcomputer industry and especially during the 1980s users started to deploy computers everywhere in many cases with little or no care about operating requirements However as information technology IT operations started to grow in complexity organizations grew aware of the need to control IT resources The availability of inexpensive networking equipment coupled with new standards for the network structured cabling made it possible to use a hierarchical design that put the servers in a specific room inside the company The use of the term data center as applied to specially designed computer rooms started to gain popular recognition about 
this time 7 note 4 The boom of data centers came during the dot com bubble of 1997 2000 8 note 5 Companies needed fast Internet connectivity and non stop operation to deploy systems and to establish a presence on the Internet Installing such equipment was not viable for many smaller companies Many companies started building very large facilities called Internet data centers IDCs 9 which provide enhanced capabilities such as crossover backup If a Bell Atlantic line is cut we can transfer them to to minimize the time of outage 9 The term cloud data centers CDCs has been used 10 Data centers typically cost a lot to build and maintain 8 Increasingly the division of these terms has almost disappeared and they are being integrated into the term data center 11 Requirements for modern data centers Edit Racks of telecommunications equipment in part of a data center Modernization and data center transformation enhances performance and energy efficiency 12 Information security is also a concern and for this reason a data center has to offer a secure environment that minimizes the chances of a security breach A data center must therefore keep high standards for assuring the integrity and functionality of its hosted computer environment Industry research company International Data Corporation IDC puts the average age of a data center at nine years old 12 Gartner another research company says data centers older than seven years are obsolete 13 The growth in data 163 zettabytes by 2025 14 is one factor driving the need for data centers to modernize Focus on modernization is not new concern about obsolete equipment was decried in 2007 15 and in 2011 Uptime Institute was concerned about the age of the equipment therein note 6 By 2018 concern had shifted once again this time to the age of the staff data center staff are aging faster than the equipment 16 Meeting standards for data centers Edit The Telecommunications Industry Association s Telecommunications Infrastructure Standard for Data Centers 17 specifies the minimum requirements for telecommunications infrastructure of data centers and computer rooms including single tenant enterprise data centers and multi tenant Internet hosting data centers The topology proposed in this document is intended to be applicable to any size data center 18 Telcordia GR 3160 NEBS Requirements for Telecommunications Data Center Equipment and Spaces 19 provides guidelines for data center spaces within telecommunications networks and environmental requirements for the equipment intended for installation in those spaces These criteria were developed jointly by Telcordia and industry representatives They may be applied to data center spaces housing data processing or Information Technology IT equipment The equipment may be used to Operate and manage a carrier s telecommunication network Provide data center based applications directly to the carrier s customers Provide hosted applications for a third party to provide services to their customers Provide a combination of these and similar data center applicationsData center transformation Edit Data center transformation takes a step by step approach through integrated projects carried out over time This differs from a traditional method of data center upgrades that takes a serial and siloed approach 20 The typical projects within a data center transformation initiative include standardization consolidation virtualization automation and security Standardization consolidation Reducing the number of data centers 21 22 and avoiding 
server sprawl 23 both physical and virtual 24 often includes replacing aging data center equipment 25 and is aided by standardization 26 Virtualization Lowers capital and operational expenses 27 reduces energy consumption 28 Virtualized desktops can be hosted in data centers and rented out on a subscription basis 29 Investment bank Lazard Capital Markets estimated in 2008 that 48 percent of enterprise operations will be virtualized by 2012 Gartner views virtualization as a catalyst for modernization 30 Automating Automating tasks such as provisioning configuration patching release management and compliance is needed not just when facing fewer skilled IT workers 26 Securing Protection of virtual systems is integrated with the existing security of physical infrastructures 31 Raised floor Edit Perforated cooling floor tile Main article Raised floor A raised floor standards guide named GR 2930 was developed by Telcordia Technologies a subsidiary of Ericsson 32 Although the first raised floor computer room was made by IBM in 1956 33 and they ve been around since the 1960s 34 it was the 1970s that made it more common for computer centers to thereby allow cool air to circulate more efficiently 35 36 The first purpose of the raised floor was to allow access for wiring 33 Lights out Edit The lights out 37 data center also known as a darkened or a dark data center is a data center that ideally has all but eliminated the need for direct access by personnel except under extraordinary circumstances Because of the lack of need for staff to enter the data center it can be operated without lighting All of the devices are accessed and managed by remote systems with automation programs used to perform unattended operations In addition to the energy savings reduction in staffing costs and the ability to locate the site further from population centers implementing a lights out data center reduces the threat of malicious attacks upon the infrastructure 38 39 Noise levels Edit Generally speaking local authorities prefer noise levels at data centers to be 10dB below the existing night time background noise level at the nearest residence 40 OSHA regulations require monitoring of noise levels inside data centers if noise exceeds 85 decibels 41 The average noise level in server areas of a data center may reach as high as 92 96 dB A 42 Residents living near data centers have described the sound as a high pitched whirring noise 24 7 saying It s like being on a tarmac with an airplane engine running constantly Except that the airplane keeps idling and never leaves 43 44 45 46 External sources of noise include HVAC equipment and energy generators 47 48 Data center levels and tiers EditThe two organizations in the United States that publish data center standards are the Telecommunications Industry Association TIA and the Uptime Institute International standards EN50600 and ISO22237 Information technology Data center facilities and infrastructures Edit Class 1 single path solution Class 2 single path with redundancy solution Class 3 multiple paths providing a concurrent repair operate solution Class 4 multiple paths providing a fault tolerant solution except during maintenance Telecommunications Industry Association Edit Main article TIA 942 The Telecommunications Industry Association s TIA 942 standard for data centers published in 2005 and updated four times since defined four infrastructure levels 49 Rated 1 basically a server room following basic guidelines Rated 2 Redundant component key components are redundant 
Rated 3 Concurrently maintainabile able to handle maintenance on any part of the distribution path or any single piece of equipment without causing an interruption to the data center operations Rated 4 Fault tolerant able to handle one single fault at a time on any part of the distribution path or any single piece of equipment without causing interruption to the data center operationsUptime Institute Data center Tier Classification Standard Edit Four Tiers are defined by the Uptime Institute standard Tier I BASIC CAPACITY and must include a UPS uninterruptible power source Tier II REDUNDANT CAPACITY and adds redundant power and cooling Tier III CONCURRENTLY MAINTAINABLE and ensures that ANY component can be taken out of service without affecting production Tier IV FAULT TOLERANT allowing any production capacity to be insulated from ANY type of failure A fifth tier has been Trademarked by Switch company who have used this tier to define The Citadel the largest data center in the world 50 51 Data center design EditThe field of data center design has been growing for decades in various directions including new construction big and small along with the creative re use of existing facilities like abandoned retail space old salt mines and war era bunkers a 65 story data center has already been proposed 52 the number of data centers as of 2016 had grown beyond 3 million USA wide and more than triple that number worldwide 8 Local building codes may govern the minimum ceiling heights and other parameters Some of the considerations in the design of data centers are A typical server rack commonly seen in colocation Size one room of a building one or more floors or an entire building Capacity can hold up to or past 1 000 servers 53 Other considerations Space power cooling and costs in the data center 54 Mechanical engineering infrastructure heating ventilation and air conditioning HVAC humidification and dehumidification equipment pressurization 55 Electrical engineering infrastructure design utility service planning distribution switching and bypass from power sources uninterruptible power source UPS systems and more 55 56 CRAC Air Handle Design criteria and trade offs Edit Availability expectations The costs of avoiding downtime should not exceed the cost of the downtime itself 57 Site selection Location factors include proximity to power grids telecommunications infrastructure networking services transportation lines and emergency services Other considerations should include flight paths neighboring power drains geological risks and climate associated with cooling costs 58 Often power availability is the hardest to change High availability Edit Main article High availability Various metrics exist for measuring the data availability that results from data center availability beyond 95 uptime with the top of the scale counting how many nines can be placed after 99 59 Modularity and flexibility Edit Main article Modular data center Modularity and flexibility are key elements in allowing for a data center to grow and change over time Data center modules are pre engineered standardized building blocks that can be easily configured and moved as needed 60 A modular data center may consist of data center equipment contained within shipping containers or similar portable containers 61 Components of the data center can be prefabricated and standardized which facilitates moving if needed 62 Environmental control Edit Temperature and humidity are controlled via Air conditioning indirect cooling such as using 
Environmental control

Temperature and humidity are controlled via:

- air conditioning
- indirect cooling, such as using outside air,[63][64][note 7] indirect evaporative cooling (IDEC) units, or sea water

It is important that computers do not become humid or overheat. High humidity can lead to dust clogging the fans, which causes overheating, or can cause components to malfunction, ruining the board and creating a fire hazard. Overheating can cause components, usually the silicon or copper of the wires or circuits, to melt, loosening connections and creating fire hazards.

Electrical power

A bank of batteries in a large data center, used to provide power until diesel generators can start

Backup power consists of one or more uninterruptible power supplies, battery banks, and/or diesel or gas turbine generators.[65]

To prevent single points of failure, all elements of the electrical systems, including backup systems, are typically given redundant copies, and critical servers are connected to both the "A-side" and "B-side" power feeds. This arrangement is often made to achieve N+1 redundancy in the systems; a minimal check of this criterion is sketched below. Static transfer switches are sometimes used to ensure instantaneous switchover from one supply to the other in the event of a power failure.

Low-voltage cable routing

Options include:

- data cabling routed through overhead cable trays;[66]
- raised-floor cabling, both for security reasons and to avoid the extra cost of cooling systems above the racks;
- smaller, less expensive data centers may use anti-static tiles instead as a flooring surface.

Air flow

Air flow management addresses the need to improve data center computer cooling efficiency by preventing the recirculation of hot air exhausted from IT equipment and reducing bypass airflow. There are several methods of separating hot and cold airstreams, such as hot/cold aisle containment and in-row cooling units.[67]

Aisle containment

Cold aisle containment is done by exposing the rear of equipment racks, while the fronts of the servers are enclosed with doors and covers. This is similar to how large-scale food companies refrigerate and store their products.

Typical cold aisle configuration with server rack fronts facing each other and cold air distributed through the raised floor

Computer cabinets and server farms are often organized for containment of hot/cold aisles. Proper air duct placement prevents the cold and hot air from mixing. Rows of cabinets are paired to face each other so that the cool and hot air intakes and exhausts do not mix, which would severely reduce cooling efficiency.

Alternatively, a range of underfloor panels can create efficient cold air pathways directed to the raised-floor vented tiles. Either the cold aisle or the hot aisle can be contained.[68]

Another option is fitting cabinets with vertical exhaust ducts (chimneys).[69] Hot exhaust pipes, vents, and ducts can direct the air into a plenum space above a dropped ceiling and back to the cooling units or to outside vents. With this configuration, a traditional hot/cold aisle arrangement is not a requirement.[70]
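Returning to the N+1 arrangement described under Electrical power above: the criterion simply means the system still carries its load after the loss of any single unit. A minimal sketch in Python (the capacities and load are invented for illustration; this is not a sizing tool):

    def survives_single_failure(unit_capacities_kw, load_kw):
        """True if the remaining units can still carry the load after
        any one unit fails (the N+1 criterion); the worst case is
        losing the largest unit."""
        return sum(unit_capacities_kw) - max(unit_capacities_kw) >= load_kw

    # Four 250 kW UPS units feeding a 700 kW critical load: any three
    # still supply 750 kW, so the configuration is N+1.
    print(survives_single_failure([250, 250, 250, 250], 700))  # True
    print(survives_single_failure([250, 250, 250], 700))       # False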
Fire protection

FM200 fire suppression tanks

Data centers feature fire protection systems, including passive and active design elements, as well as the implementation of fire prevention programs in operations. Smoke detectors are usually installed to provide early warning of a fire at its incipient stage.

Although the main room usually does not allow wet-pipe-based systems, due to the fragile nature of circuit boards, there still exist systems that can be used in the rest of the facility, or in cold/hot aisle air circulation systems that are closed systems, such as:[71]

- sprinkler systems
- misting, which uses high pressure to create extremely small water droplets and can be used in sensitive rooms due to the nature of the droplets

Other means also exist to put out fires, especially in sensitive areas, usually using gaseous fire suppression, of which halon gas was the most popular until the negative effects of producing and using it were discovered.[1]

Security

Main article: Data center security

Physical access is usually restricted. Layered security often starts with fencing, bollards, and mantraps.[72] Video camera surveillance and permanent security guards are almost always present if the data center is large or contains sensitive information. Fingerprint-recognition mantraps are starting to be commonplace.

Logging access is required by some data protection regulations; some organizations tightly link this to access control systems. Multiple log entries can occur at the main entrance, at entrances to internal rooms, and at equipment cabinets. Access control at cabinets can be integrated with intelligent power distribution units, so that locks are networked through the same appliance.[73]

Energy use

Google Data Center, The Dalles, Oregon

Main article: IT energy management

Energy use is a central issue for data centers. Power draw ranges from a few kW for a rack of servers in a closet to several tens of MW for large facilities. Some facilities have power densities more than 100 times that of a typical office building.[74] For higher power density facilities, electricity costs are a dominant operating expense and account for over 10% of the total cost of ownership (TCO) of a data center.[75]

Greenhouse gas emissions

In 2020, data centers (excluding cryptocurrency mining) and data transmission each used about 1% of world electricity.[76] Although some of this electricity was low-carbon, the IEA called for more government and industry efforts on energy efficiency, renewables procurement, and RD&D,[76] as some data centers still use electricity generated by fossil fuels.[77] It also said that lifecycle emissions should be considered, that is, including embodied emissions such as in buildings.[76] Data centers are estimated to have been responsible for 0.5% of US greenhouse gas emissions in 2018.[78] Some Chinese companies, such as Tencent, have pledged to be carbon-neutral by 2030, while others, such as Alibaba, have been criticized by Greenpeace for not committing to become carbon-neutral.[79]

Energy efficiency and overhead

The most commonly used metric of data center energy efficiency is power usage effectiveness (PUE), calculated as the ratio of total power entering the data center divided by the power used by IT equipment:

    PUE = Total Facility Power / IT Equipment Power

It measures the percentage of power used by overhead (cooling, lighting, etc.). The average US data center has a PUE of 2.0,[80] meaning two watts of total power (overhead + IT equipment) for every watt delivered to IT equipment. State of the art is estimated to be roughly 1.2.[81] Google publishes quarterly efficiency figures from data centers in operation.[82]
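A worked example of the PUE formula above, as a small Python function (the wattages are invented for illustration):

    def pue(total_facility_power_kw: float, it_equipment_power_kw: float) -> float:
        """Power usage effectiveness: total facility power divided by
        power delivered to IT equipment; 1.0 is the theoretical ideal."""
        return total_facility_power_kw / it_equipment_power_kw

    # A facility drawing 1,500 kW in total while its IT load is 750 kW
    # has a PUE of 2.0 -- one watt of overhead for every watt delivered
    # to IT equipment, matching the average US figure cited above.
    print(pue(1500, 750))  # 2.0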
The U.S. Environmental Protection Agency has an Energy Star rating for standalone or large data centers. To qualify for the ecolabel, a data center must be within the top quartile of energy efficiency of all reported facilities.[83] The Energy Efficiency Improvement Act of 2015 (United States) requires federal facilities, including data centers, to operate more efficiently. California's Title 24 (2014) of the California Code of Regulations mandates that every newly constructed data center must have some form of airflow containment in place to optimize energy efficiency. The European Union also has a similar initiative: the EU Code of Conduct for Data Centres.[84]

Energy use analysis and projects

The focus of measuring and analyzing energy use goes beyond what is used by IT equipment; facility support hardware, such as chillers and fans, also uses energy.[85]

In 2011, server racks in data centers were designed for more than 25 kW, and the typical server was estimated to waste about 30% of the electricity it consumed. The energy demand for information storage systems was also rising. A high-availability data center was estimated to have a 1 megawatt (MW) demand and to consume $20,000,000 in electricity over its lifetime, with cooling representing 35% to 45% of the data center's total cost of ownership. Calculations showed that in two years, the cost of powering and cooling a server could equal the cost of purchasing the server hardware.[86] Research in 2018 showed that a substantial amount of energy could still be conserved by optimizing IT refresh rates and increasing server utilization.[87]

In 2011, Facebook, Rackspace, and others founded the Open Compute Project (OCP) to develop and publish open standards for greener data center computing technologies. As part of the project, Facebook published the designs of its server, which it had built for its first dedicated data center in Prineville. Making servers taller left space for more effective heat sinks and enabled the use of fans that moved more air with less energy. By not buying commercial off-the-shelf servers, energy consumption due to unnecessary expansion slots on the motherboard and unneeded components, such as a graphics card, was also saved.[88] In 2016, Google joined the project and published the designs of its 48V DC shallow data center rack. This design had long been part of Google data centers. By eliminating the multiple transformers usually deployed in data centers, Google had achieved a 30% increase in energy efficiency.[89] In 2017, sales of data center hardware built to OCP designs topped $1.2 billion and were expected to reach $6 billion by 2021.[88]

Power and cooling analysis

Data center at CERN (2010)

Power is the largest recurring cost to the user of a data center.[90] Cooling at or below 70 °F (21 °C) wastes money and energy.[90] Furthermore, overcooling equipment in environments with a high relative humidity can expose equipment to a high amount of moisture that facilitates the growth of salt deposits on conductive filaments in the circuitry.[91]

A power and cooling analysis, also referred to as a thermal assessment, measures the relative temperatures in specific areas, as well as the capacity of the cooling systems to handle specific ambient temperatures.[92] A power and cooling analysis can help to identify hot spots, over-cooled areas that can handle greater power use density, the breakpoint of equipment loading, the effectiveness of a raised-floor strategy, and optimal equipment positioning (such as AC units) to balance temperatures across the data center. Power cooling density is a measure of how much square footage the center can cool at maximum capacity.[93] The cooling of data centers is the second-largest power consumer after servers: the cooling energy varies from 10% of the total energy consumption in the most efficient data centers up to 45% in standard air-cooled data centers.
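The two-year break-even estimate above is easy to reproduce with rough numbers. The figures below (a 500 W average server draw, US$0.10/kWh, and 0.5 W of cooling per watt of IT load) are this example's own assumptions, not values from the cited studies:

    # Rough annual cost of powering and cooling a single server.
    server_draw_kw = 0.5      # assumed average draw of one server
    price_per_kwh = 0.10      # assumed electricity price, in US$
    cooling_overhead = 0.5    # assumed cooling watts per IT watt

    hours_per_year = 365 * 24
    annual_kwh = server_draw_kw * (1 + cooling_overhead) * hours_per_year
    print(f"~${annual_kwh * price_per_kwh:,.0f} per year")  # ~$657

    # Over two years that is roughly $1,300 -- on the order of the
    # purchase price of a commodity server, consistent with the
    # calculation cited above.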
Energy efficiency analysis

An energy efficiency analysis measures the energy use of data center IT and facilities equipment. A typical energy efficiency analysis measures factors such as a data center's power use effectiveness (PUE) against industry standards, identifies mechanical and electrical sources of inefficiency, and identifies air-management metrics.[94] However, the limitation of most current metrics and approaches is that they do not include IT in the analysis. Case studies have shown that by addressing energy efficiency holistically in a data center, major efficiencies can be achieved that are not possible otherwise.[95]

Computational fluid dynamics (CFD) analysis

Main article: Computational fluid dynamics

This type of analysis uses sophisticated tools and techniques to understand the unique thermal conditions present in each data center, predicting the temperature, airflow, and pressure behavior of a data center to assess performance and energy consumption, using numerical modeling.[96] By predicting the effects of these environmental conditions, CFD analysis of a data center can be used to predict the impact of high-density racks mixed with low-density racks,[97] and the onward impact on cooling resources, poor infrastructure management practices, and AC failure or AC shutdown for scheduled maintenance.

Thermal zone mapping

Thermal zone mapping uses sensors and computer modeling to create a three-dimensional image of the hot and cool zones in a data center.[98] This information can help to identify optimal positioning of data center equipment. For example, critical servers might be placed in a cool zone that is serviced by redundant AC units.

Green data centers

Main article: Green data center

This water-cooled data center in the Port of Strasbourg, France claims the attribute green.

Data centers use a lot of power, consumed by two main usages: the power required to run the actual equipment, and then the power required to cool the equipment. Power efficiency reduces the first category.

Cooling cost reduction through natural means includes location decisions: when the focus is not on being near good fiber connectivity, power grid connections, and people concentrations to manage the equipment, a data center can be miles away from the users. Mass data centers like Google's or Facebook's don't need to be near population centers. Arctic locations, which can use outside air for cooling, are becoming more popular.[99]

Renewable electricity sources are another plus. Thus, countries with favorable conditions, such as Canada,[100] Finland,[101] Sweden,[102] Norway,[103] and Switzerland,[104] are trying to attract cloud computing data centers.

Bitcoin mining is increasingly being seen as a potential way to build data centers at the site of renewable energy production. Curtailed and clipped energy can be used to secure transactions on the Bitcoin blockchain, providing another revenue stream to renewable energy producers.[105]

Energy reuse

It is very difficult to reuse the heat which comes from air-cooled data centers. For this reason, data center infrastructures are more often equipped with heat pumps.[106] An alternative to heat pumps is the adoption of liquid cooling throughout a data center. Different liquid-cooling techniques are mixed and matched to allow for a fully liquid-cooled infrastructure that captures all heat in water. The liquid technologies fall into three main groups: indirect liquid cooling (water-cooled racks), direct liquid cooling (direct-to-chip cooling), and total liquid cooling (complete immersion in liquid; see server immersion cooling). This combination of technologies allows the creation of a thermal cascade, as part of temperature chaining scenarios, to create high-temperature water outputs from the data center.
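To put rough numbers on the heat-reuse idea above, the sketch below estimates the thermal output a fully liquid-cooled facility could make available; the IT load and capture fraction are invented for illustration:

    # Rough estimate of reusable heat from a liquid-cooled data center.
    it_load_mw = 1.0         # assumed IT load
    capture_fraction = 0.8   # assumed share of heat captured in water

    reusable_heat_mw = it_load_mw * capture_fraction
    annual_heat_mwh = reusable_heat_mw * 365 * 24
    print(f"{reusable_heat_mw:.1f} MW thermal, "
          f"~{annual_heat_mwh:,.0f} MWh per year")

Run continuously, such a facility would yield about 0.8 MW of heat, roughly 7,000 MWh per year, that a heat pump or a temperature-chained water loop could deliver to, for example, a district heating network.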
Dynamic infrastructure

Main article: Dynamic infrastructure

Dynamic infrastructure[107] provides the ability to intelligently, automatically, and securely move workloads within a data center[108] anytime, anywhere, for migrations, provisioning,[109] to enhance performance, or for building co-location facilities. It also facilitates performing routine maintenance on either physical or virtual systems, all while minimizing interruption. A related concept is composable infrastructure, which allows for the dynamic reconfiguration of the available resources to suit needs, only when needed.[110]

Side benefits include reducing cost, facilitating business continuity and high availability, and enabling cloud and grid computing.[111]

Network infrastructure

An operation engineer overseeing a network operations control room of a data center (2006)

An example of the network infrastructure of a data center

Communications in data centers today are most often based on networks running the IP protocol suite. Data centers contain a set of routers and switches that transport traffic between the servers and to the outside world,[112] which are connected according to the data center network architecture. Redundancy of the Internet connection is often provided by using two or more upstream service providers (see multihoming).

Some of the servers at the data center are used for running the basic Internet and intranet services needed by internal users in the organization, e.g., e-mail servers, proxy servers, and DNS servers.

Network security elements are also usually deployed: firewalls, VPN gateways, intrusion detection systems, and so on. Also common are monitoring systems for the network and some of the applications. Additional off-site monitoring systems are also typical, in case of a failure of communications inside the data center.

Software/data backup

Non-mutually-exclusive options for data backup are onsite and offsite.

Onsite is traditional,[113] and one major advantage is immediate availability.

Offsite backup storage

Main article: Disaster recovery – offsite backup storage

Data backup techniques include having an encrypted copy of the data offsite. Methods used for transporting data are:[114]

- having the customer write the data to a physical medium, such as magnetic tape, and then transporting the tape elsewhere;[115]
- directly transferring the data to another site during the backup, using appropriate links;
- uploading the data "into the cloud".[116]
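As a minimal sketch of the "encrypted copy offsite" idea above, using the third-party Python cryptography package (Fernet symmetric encryption); the file names are placeholders, and a real pipeline would also need key management and a transport step:

    from cryptography.fernet import Fernet  # pip install cryptography

    # Generate a key and keep it safe: losing it makes the backup
    # unrecoverable.
    key = Fernet.generate_key()
    with open("backup.key", "wb") as key_file:
        key_file.write(key)

    # Encrypt the backup archive before it leaves the site.
    with open("backup.tar", "rb") as plaintext_file:
        ciphertext = Fernet(key).encrypt(plaintext_file.read())
    with open("backup.tar.enc", "wb") as encrypted_file:
        encrypted_file.write(ciphertext)

    # backup.tar.enc can now be shipped on tape, transferred over a
    # link, or uploaded to cloud storage, per the options listed above.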
Modular data center

Main article: Modular data center

A 40-foot Portable Modular Data Center

For quick deployment or disaster recovery, several large hardware vendors have developed mobile/modular solutions that can be installed and made operational in a very short time.

See also

- Colocation centre
- Computer cooling
- Data center management
- Disaster recovery
- Dynamic infrastructure
- Electrical network
- Internet exchange point
- Internet hosting service
- Microsoft underwater data center
- Neher–McGrath method
- Network operations center
- Open Compute Project (by Facebook)
- Peering
- Server farm
- Server room
- Server Room Environment Monitoring System
- Telecommunications network
- Utah Data Center
- Web hosting service

Notes

1. See spelling differences.
2. Old large computer rooms that housed machines like the U.S. Army's ENIAC, which were developed pre-1960 (1945), were now referred to as data centers.
3. Until the early 1960s, it was primarily the government that used computers, which were large mainframes housed in rooms that today we call data centers.
4. In the 1990s, network-connected minicomputers (servers) running without input or display devices were housed in the old computer rooms. These new "data centers" or "server rooms" were built within company walls, co-located with low-cost networking equipment.
5. There was considerable construction of data centers during the early 2000s, in the period of expanding dot-com businesses.
6. In May 2011, data center research organization Uptime Institute reported that 36 percent of the large companies it surveyed expect to exhaust IT capacity within the next 18 months. James Niccolai, "Data Centers Turn to Outsourcing to Meet Capacity Needs", CIO magazine.
7. Instead of chillers/air conditioners, resulting in energy savings.

References

- "An Oregon Mill Town Learns to Love Facebook and Apple". The New York Times. March 6, 2018.
- "Google announces London cloud computing data centre". BBC.com. July 13, 2017.
- "Cloud Computing Brings Sprawling Centers, but Few Jobs". The New York Times. August 27, 2016.
- "From Manhattan to Montvale". The New York Times. April 20, 1986.
- Ashlee Vance (December 8, 2008). "Dell Sees Double With Data Center in a Container". The New York Times.
- James Glanz (September 22, 2012). "Power, Pollution and the Internet". The New York Times.
- Angela Bartels (August 31, 2011). "Data Center Evolution: 1960 to 2000".
- Cynthia Harvey (July 10, 2017). "Data Center". Datamation.
- John Holusha (May 14, 2000). "Commercial Property/Engine Room for the Internet; Combining a Data Center With a 'Telco Hotel'". The New York Times.
- H. Yuan. "Workload-Aware Request Routing in Cloud Data Center". doi:10.1109/JSEE.2015.00020.
- Quentin Hardy (October 4, 2011). "A Data Center Power Solution". The New York Times.
- Nicholas Mukhar (August 17, 2011). "HP Updates Data Center Transformation Solutions".
- Ed Sperling (March 15, 2010). "Next-Generation Data Centers". Forbes.
- "IDC white paper, sponsored by Seagate" (PDF).
- "Data centers are aging, unsuited for new technologies". December 10, 2007.
- "Data center staff are aging faster than the equipment". Network World. August 30, 2018.
- "TIA-942 Certified Data Centers – Consultants – Auditors". tia-942.org.
- "Telecommunications Standards Development".
- "GR-3160 – Telecommunications Data Center". Telcordia.
- Helen Tang (August 3, 2010). "Three Signs it's time to transform your data center". Data Center Knowledge.
- "The Era of Great Data Center Consolidation". Fortune. February 16, 2017.
- "This Wave of Data Center Consolidation is Different from the First One". February 8, 2018.
- "Start A Fire". startafire.com.
- "Stop Virtual Server Sprawl". IBM Systems Magazine.
- "Top reasons to upgrade vintage data centers" (PDF).
- "Complexity: Growing Data Center Challenge". Data Center Knowledge. May 16, 2007.
- "Carousel's Expert Walks Through Major Benefits of Virtualization". technews.tmcnet.com.
- Stephen Delahunty (August 15, 2011). "The New Urgency for Server Virtualization". InformationWeek.
- "HVD: the cloud's silver lining" (PDF). Intrinsic Technology.
- "Gartner: Virtualization Disrupts Server Vendors". December 2, 2008.
- Ted Ritter. "Securing the Data-Center Transformation: Aligning Security and Data-Center Dynamics". Nemertes Research.
- "GR-2930 – NEBS: Raised Floor Requirements".
- "Data Center Raised Floor History" (PDF).
- "Raised Floor Info: Tips for Ordering Replacement Raised Floor Tiles". accessfloorsystems.com.
- Hwaiyu Geng (2014). Data Center Handbook. ISBN 978-1118436639.
- Steven Spinazzola (2005). "HVAC: The Challenge and Benefits of Under-Floor Air Distribution Systems". FacilitiesNet.com.
- "Premier 100 Q&A: HP's CIO sees 'lights-out' data centers". InformationWeek. March 6, 2006.
- Victor Kasacavage (2002). Complete Book of Remote Access: Connectivity and Security. CRC Press. p. 227. ISBN 0-8493-1253-1.
- Roxanne E. Burkey; Charles V. Breakfield (2000). Designing a Total Data Solution: Technology, Implementation and Deployment. CRC Press. p. 24. ISBN 0-8493-0893-3.
- Renaud Clarke (July 1, 2020). "Acoustic Barriers for Data Centres". IAC Acoustics.
- Patrick Thibodeau (July 31, 2007). "That sound you hear? The next data center problem". Computerworld.
- "Data Center Noise Levels". Sensear.
- Katelyn Weisbrod (February 10, 2023). "In Northern Virginia, a Coming Data Center Boom Sounds a Community Alarm". Inside Climate News.
- Peter Judge (July 19, 2022). "Prince William residents complain of 'catastrophic noise' from data centers". DCD.
- Peter Judge (July 27, 2022). "Chicago residents complain of noise from Digital Realty data center". DCD.
- Mark Phillips (November 30, 2021). "Chandler to consider banning data centers amid noise complaints". ABC15 Arizona (KNXV).
- "Data Center Soundproofing and Noise Control". DDS Acoustical Specialties.
- Bianca Bosker (December 6, 2019). "Your cloud data is making noise on the ground". Marketplace.
- "Telecommunications Infrastructure Standard for Data Centers". ihs.com. April 12, 2005.
- "The World's Only Tier 5 Data Center Provider". Switch.
- "Citadel". Switch.
- Patrick Thibodeau (April 12, 2016). "Envisioning a 65-story data center". Computerworld.
- "Google Container Datacenter Tour" (video). YouTube.
- "Romonet Offers Predictive Modeling Tool For Data Center Planning". June 29, 2011.
- "BICSI News Magazine – May/June 2010". nxtbook.com.
- "Hedging Your Data Center Power".
- Jeffrey Clark (October 12, 2011). "The Price of Data Center Availability – How much availability do you need?". The Data Center Journal.
- "Data Center Outsourcing in India projected to grow according to Gartner".
- "Five tips on selecting a data center location".
- "IBM zEnterprise EC12 Business Value Video". YouTube.
- Susan Niles (2011). "Standardization and Modularity in Data Center Physical Infrastructure" (PDF). Schneider Electric. p. 4.
- "Strategies for the Containerized Data Center". September 8, 2011.
- James Niccolai (July 27, 2010). "HP says prefab data center cuts costs in half".
- "tw telecom and NYSERDA Announce Co-location Expansion". Reuters. September 14, 2009.
- "Air-to-air combat – indirect air cooling wars".
- "Detailed explanation of UPS topologies: Evaluating the Economic Impact of UPS Technology" (PDF).
- "Cable tray systems support cables' journey through the data center". April 2016.
- Mike Fox (February 15, 2012). "Stulz announced it has begun manufacturing In Row server cooling units under the name 'CyberRow'". DataCenterFix.
- John Niemann; Kevin Brown; Victor Avelar. "Hot-Aisle vs. Cold-Aisle Containment for Data Centers". APC by Schneider Electric, White Paper 135, Revision 1.
- "US Patent Application for Ducted Exhaust Equipment Enclosure (Application #20180042143)". Justia Patents Search. February 8, 2018.
- "Airflow Management Basics: Comparing Containment Systems". Data Center Frontier. July 27, 2017.
- "Data Center Fire Suppression Systems: What Facility Managers Should Consider". FacilitiesNet.
- Sarah D. Scalet (November 1, 2005). "19 Ways to Build Physical Security Into a Data Center". CSOonline.com.
- "Systems and methods for controlling an electronic lock for a remote device". August 1, 2016.
- "Data Center Energy Consumption Trends". U.S. Department of Energy.
- J. Koomey; C. Belady; M. Patterson; A. Santos; K.-D. Lange (August 17, 2009). "Assessing Trends Over Time in Performance, Costs, and Energy Use for Servers".
- "Data Centres and Data Transmission Networks – Analysis". IEA.
- Alice Kantor (May 18, 2021). "Big Tech races to clean up act as cloud energy use grows". Financial Times.
- Md Abu Bakar Siddik; Arman Shehabi; Landon Marston (May 21, 2021). "The environmental footprint of data centers in the United States". Environmental Research Letters. 16 (6): 064017. doi:10.1088/1748-9326/abfba1.
- Greg James (March 1, 2022). "Tencent pledges to achieve carbon neutrality by 2030". SupChina.
- "Report to Congress on Server and Data Center Energy Efficiency" (PDF). U.S. Environmental Protection Agency, ENERGY STAR Program.
- "Data Center Energy Forecast" (PDF). Silicon Valley Leadership Group.
- "Efficiency: How we do it – Data centers".
- Jack Pouchet (September 27, 2010). "Introducing EPA ENERGY STAR for Data Centers" (commentary).
- "EU Code of Conduct for Data Centres". iet.jrc.ec.europa.eu.
- "UNICOM Global: Home" (PDF). gtsi.com.
- Daniel Minoli (2011). Designing Green Networks and Network Operations: Saving Run-the-Engine Costs. CRC Press. p. 5. ISBN 9781439816394.
- Rabih Bashroush (2018). "A Comprehensive Reasoning Framework for Hardware Refresh in Data Centres". IEEE Transactions on Sustainable Computing. 3 (4): 209–220. doi:10.1109/TSUSC.2018.2795465.
- Peter Sayer (March 28, 2018). "What is the Open Compute Project?". NetworkWorld.
- Peter Judge (March 9, 2016). "OCP Summit: Google joins and shares 48V tech". Data Center Dynamics.
- Joe Cosmano (2009). "Choosing a Data Center" (PDF). Disaster Recovery Journal.
- David Garrett (2004). "Heat of the Moment". Processor.
- "HP's Green Data Center Portfolio Keeps Growing". InternetNews. July 25, 2007.
- Inc. staff (2010). "How to Choose a Data Center".
- Kathryn Siranosian (April 5, 2011). "HP Shows Companies How to Integrate Energy Management and Carbon Reduction". TriplePundit.
- Rabih Bashroush; Eoin Woods (2017). "Architectural Principles for Energy-Aware Internet-Scale Applications". IEEE Software. 34 (3): 14–17. doi:10.1109/MS.2017.60.
- Michael Bullock (March 18, 2010). "Computation Fluid Dynamics – Hot topic at Data Center World". Transitional Data Services.
- Dennis Bouley, ed. (2010). "Impact of Virtualization on Data Center Physical Infrastructure" (PDF). The Green Grid.
- "HP Thermal Zone Mapping plots data center hot spots".
- "Fjord-cooled DC in Norway claims to be greenest". December 23, 2011.
- "Canada Called Prime Real Estate for Massive Data Computers". The Globe and Mail.
- "Finland – First Choice for Siting Your Cloud Computing Data Center".
- "Stockholm sets sights on data center customers".
- "In a world of rapidly increasing carbon emissions from the ICT industry, Norway offers a sustainable solution".
- "Swiss Carbon-Neutral Servers Hit the Cloud".
- "Bitcoin Does Not Waste Energy". Surplus Bitcoin.
- "Data Center Cooling with Heat Recovery" (PDF). StockholmDataParks.com. January 23, 2017.
- "Method for Dynamic Information Technology Infrastructure Provisioning".
- Kerrie Meyler (April 29, 2008). "The Dynamic Datacenter". Network World.
- "Computation on Demand: The Promise of Dynamic Provisioning".
- "Just What the Heck Is Composable Infrastructure, Anyway?". IT Pro. July 14, 2016.
- Ahmadreza Montazerolghaem (July 13, 2020). "Software-defined load-balanced data center: design, implementation and performance analysis" (PDF). Cluster Computing. 24 (2): 591–610. doi:10.1007/s10586-020-03134-x.
- Mohammad Noormohammadpour; Cauligi Raghavendra (July 16, 2018). "Datacenter Traffic Control: Understanding Techniques and Tradeoffs". IEEE Communications Surveys & Tutorials. 20 (2): 1492–1525. arXiv:1712.03530. doi:10.1109/comst.2017.2782753.
- "Protecting Data Without Blowing the Budget, Part 1: Onsite Backup". Forbes. October 4, 2018.
- "Iron Mountain vs Amazon Glacier: Total Cost Analysis" (PDF).
- "PTAM: Pickup Truck Access Method" (what IBM calls it; disaster recovery slang).
- "Iron Mountain introduces cloud backup and management service". September 14, 2017.

External links

- Wikimedia Commons has media related to data centers.
- Wikibooks has a book on the topic of: The Design and Organization of Data Centers
- Look up data center in Wiktionary, the free dictionary.
- Lawrence Berkeley Lab – Research, development, demonstration, and deployment of energy-efficient technologies and practices for data centers
- DC Power For Data Centers Of The Future – FAQ: 380VDC testing and demonstration at a Sun data center
- White Paper – Property Taxes: The New Challenge for Data Centers
- The European Commission H2020 EURECA Data Centre Project – data centre energy efficiency guidelines, extensive online training material, case studies/lectures (under the events page), and tools