Understanding the New ASHRAE 90.4 Standard


The American Society of Heating, Refrigerating and Air-Conditioning Engineers, more commonly known as ASHRAE, has finally released the finished version of its data center efficiency and energy usage standard. According to ASHRAE, the purpose of the 90.4 Standard is “to establish the minimum energy efficiency requirements of Data Centers for: the design, construction, and a plan for operation and maintenance, and utilization of on-site or off-site renewable energy resources.” The standard takes an 80/20 approach, under which most existing data centers already meet its requirements; only the most energy-inefficient facilities will fail to comply. The standard also accounts for the constant innovation and progress in the IT world that continually reshapes data centers.

Part of Standard 90.4 focuses on the efficient use of on-site and off-site renewable energy resources. Instead of relying on PUE (Power Usage Effectiveness) to measure the efficiency of a data center, 90.4 splits the assessment into two parts: the Mechanical Load Component (MLC), a metric that sets the minimum energy efficiency of all mechanical cooling systems for a variety of climate zones, and the Electrical Loss Component (ELC). The values are calculated and compared against the limits specified for each climate zone, and the calculated values must fall below those limits in order for a design to comply.

An alternative compliance path also allows data centers to trade off between the two components, MLC and ELC, so that efficiency in one system can compensate for inefficiency in the other.
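As a rough illustration of how this compliance logic works, here is a minimal sketch that checks hypothetical MLC and ELC design values against placeholder climate-zone limits. The limit values and function names are invented for illustration and are not taken from the 90.4 tables.

```python
# Illustrative 90.4-style compliance check.
# The limits below are placeholders, NOT values from the ASHRAE 90.4 tables.

MLC_LIMIT = 0.28   # assumed mechanical load component limit for one climate zone
ELC_LIMIT = 0.15   # assumed electrical loss component limit

def complies(design_mlc: float, design_elc: float) -> bool:
    """Prescriptive path: each component must be at or below its own limit."""
    return design_mlc <= MLC_LIMIT and design_elc <= ELC_LIMIT

def complies_tradeoff(design_mlc: float, design_elc: float) -> bool:
    """Alternative path: the combined components may not exceed the combined
    limits, so an efficient electrical chain can offset a less efficient
    cooling plant, and vice versa."""
    return (design_mlc + design_elc) <= (MLC_LIMIT + ELC_LIMIT)

# Example: cooling slightly over its limit, electrical well under its limit.
print(complies(0.30, 0.10))           # False on the component-by-component path
print(complies_tradeoff(0.30, 0.10))  # True via the tradeoff path
```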

The older 90.1 Standard applied to almost all kinds of buildings, but the 90.4 Standard is written to cover the design and structure of data centers specifically. ASHRAE now looks to shift compliance from the previous version of the standard to the new one, covering both the efficient operation of data center sites and efficient energy usage.

Jeffrey Dorf is the Editor of The Data Center Blog, President of the Global Data Center Alliance, and oversees the Mission Critical Practice for the M+W Group

Data Center Promotes STEM Education

Wanted to share this video with our 4,000 monthly visitors, as STEM (Science, Technology, Engineering, and Mathematics) education is a priority for us at the Data Center Blog. We were thrilled to learn of H5 Data Centers’ recent opening of a STEM Innovation Center in conjunction with the Colorado Technology Association, Handy Networks and Eaton Corp.

The Innovation Center offers a classroom setting inside an operational data center, providing students focused on science and technology with a hands-on experience.

“Just as it is important for commercial companies to assist in producing our leaders for tomorrow, it is equally vital to promote the development of tomorrow’s engineers and technologists. H5 Data Centers is proud to provide an opportunity to help grow and nurture ambitious students and provide them real-world experience,” said Bill Johnson, vice president of colocation and data center operations at H5 Data Centers.

“Business-led learning experiences are paramount to filling the Colorado tech talent gap,” said Wendy Nkomo, COO and interim CEO of the Colorado Technology Association. “Exposure to technology in a career setting will inspire Colorado youth to pursue exciting and rewarding careers in tech.”

This unique, non-profit program launched on April 21, 2016. H5 and its partners are taking this ground-breaking step to enhance career development of the next generation of technology leaders. Beginning in the 2016-2017 school year, H5 will host STEM students on a monthly basis.

Google’s Transparency Reports – Not That Transparent


No one disputes that the Internet has become a central part of all of our lives. Almost everything and everyone is connected through invisible signals and waves traveling through the air, providing a digital platform for sharing information and staying in touch with one another.

With the exponential growth of the Web, and with more and more people tying their lives to accounts on it, governments have a new tool for surveilling their citizens. As social media usage climbs, governments have increased the number of requests they send to online platforms for data about the services provided to their customers.

Google, currently the most widely used and accessed online platform, released its latest transparency report last month. The report makes clear that requests for the personal data of citizens by the federal government are at an all-time high. These reports are released every six months, with the most recent covering the second half of 2015.

Out of all countries around the world, the United States is at the top of the list, having requested personal data on 27,157 user accounts (a total of 12,523 requests) during the six months that this transparency report covers. All nations, collectively, made 40,677 requests about 81,311 users or accounts, with the United States accounting for the largest share of requested data.
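For context, a quick back-of-the-envelope calculation with the figures quoted above shows the size of the U.S. share; this is simple arithmetic on the numbers in this post, nothing more.

```python
# US share of the worldwide totals reported for the second half of 2015.
us_requests, us_accounts = 12_523, 27_157
world_requests, world_accounts = 40_677, 81_311

print(f"US share of requests: {us_requests / world_requests:.1%}")   # ~30.8%
print(f"US share of accounts: {us_accounts / world_accounts:.1%}")   # ~33.4%
```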

Those concerned with these increasing numbers argue that, while private citizens often take extreme measures to prevent others from hacking their personal accounts, the government’s ability to simply request access to their private data (access is granted more than half the time) is an abuse of power and an invasion of privacy. Of course, governments assert that this intrusion is done only for the ultimate benefit of the country and the citizenry as a whole. They point out that the increased use of social media for communication and coordination by extremist groups makes it necessary to increase surveillance and security in all possible ways. Their ultimate argument rests on the “greater good” – compromising some privacy to save lives.

While I can see both sides of the coin, and would welcome some healthy discussion on the topic, I’d point out that these “transparency reports” really are not all that transparent. Aside from basic quantitative facts and figures regarding the requests made by each country, nothing related to the content of these requests is revealed.

Jeffrey Dorf is the Editor of The Data Center Blog and President of the Global Data Center Alliance

Obama Requires Agency Data Centers to be “Greener”


The Federal Information Technology Acquisition Reform Act (FITARA) requires federal agencies to report annually on their data center inventories, activity timelines, investments made and revenue generated, as well as the plans they have implemented to optimize the data centers they own.

Under President Obama, the Executive Office issued a memorandum that went into effect on August 1, 2016, making it compulsory for agencies to develop, implement, and report strategies that replace physical infrastructure with virtualized and cloud-based alternatives. I believe this is a great step away from data centers that continuously guzzle critical energy and require large investments for expansion and growth.

By outlining the skeleton of a proper data center optimization strategy and the criteria by which one can be deemed successful, the memorandum sets up a framework for making data centers far more power efficient. Under this memorandum, agencies may not expand or build new data sites at their discretion; they must submit a written application that lays out an analytical case for why more cost-efficient, cloud-based alternatives are not the best choice for their needs.

Cloud computing is a primary driver of cost-effective investment. Instead of purchasing hardware and physical infrastructure, an agency can obtain storage and networking as a service from a provider and access it over the internet. More and more organizations are turning to this kind of service as they look to lower the cost sunk into multiple data centers and to bring their PUE, or Power Usage Effectiveness, down to reasonable levels.

This new memorandum also sets limits under which an agency’s PUE (Power Usage Effectiveness) must fall: a higher, more lenient limit for already established facilities and a lower one for new ones. This helps significantly reduce the amount of capital spent on energy and on the expansion of multiple data centers under a single agency’s ownership.
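The PUE metric itself is straightforward: total facility energy divided by the energy delivered to IT equipment. Below is a minimal sketch of that calculation and a compliance check; the 1.5 and 1.4 thresholds are the targets commonly cited for existing and new federal facilities, and are included here as assumptions rather than quotations from the memorandum.

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: 1.0 would mean every watt reaches IT gear."""
    return total_facility_kwh / it_equipment_kwh

EXISTING_LIMIT = 1.5   # assumed limit for already-established facilities
NEW_LIMIT = 1.4        # assumed, stricter limit for new facilities

facility_pue = pue(total_facility_kwh=6_000_000, it_equipment_kwh=4_200_000)
print(f"PUE = {facility_pue:.2f}")                                        # 1.43
print("meets existing-facility limit:", facility_pue <= EXISTING_LIMIT)   # True
print("meets new-facility limit:", facility_pue <= NEW_LIMIT)             # False
```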

At this critical point in time, when continuous industrial and economic momentum is essential for long-term success, it is important that we look into better alternatives. A well-structured strategic plan (which must be put to paper and submitted to the responsible authorities) ensures that agencies optimize on a continuous basis and do not follow a plan full of loopholes. This legislation will, undoubtedly, go a long way toward reducing wasted power by shutting down facilities that are non-tiered or no longer in use.

Jeffrey Dorf is the Editor of The Data Center Blog and President of the Global Data Center Alliance

Project Delivery & Delighting the End User


The process of delivering a new data center build is a multi-faceted one that involves many disciplines. The difference between a successful project where the owner is delighted in the end product and one that is filled with needless difficulties along the way is often related to the atmosphere that exists between the various disciplines involved in the delivery.
The following three principles are characteristic (though not comprehensive) of the interactions that exist between the end users, architects, engineering team, contractors, and suppliers on successful data center projects.
Proactive & Regular Communication
One of the most representative characteristics of successful projects is proactive communication that takes place on a regular basis. There needs to be constant discussion between the key players involved in the various delivery points on the project. One simple yet effective way to ensure clear communication is for the end user/general contractor to set up weekly calls with the subcontractors and various component suppliers. It is very common for the end user, contractors, and subcontractors to be involved in these weekly calls; however, it is not as common as it should be to invite the major component suppliers to the table. Extending the sphere of communication to the key component suppliers helps ensure that everyone is on the same page, that delivery schedules are on track, and that any changes needed on the project can be addressed quickly throughout the supply chain.
Trust & Accountability
Once the communication lines have been opened, it is imperative that a culture of accountability and trust be established. The contractors need to be able to trust all of the players on the project, and the various subcontractors and suppliers need to show themselves to be trustworthy. One aspect of trustworthiness that owners need from their supply chain and contractors is a realistic time frame and delivery schedule for the completion of tasks. If one of the component suppliers has won the job by misleading the owner about lead times, there is a ripple effect throughout the project delivery schedule that can have significant financial consequences. This is why it is imperative that component suppliers and subcontractors be properly vetted with regard to their reputation in the industry. If, on a given project, one of the material suppliers falls behind schedule, they need to communicate this upstream and have a secondary solution to improve the worst-case delivery schedule.
Teamwork
At the end of the day, the most important thing on a complex construction project is making sure that the owner/end user is satisfied and delighted with the final delivered product. This does not happen without the suppliers, contractors, and other disciplines being committed to this end goal. If everyone is not committed to it, suppliers will be quick to shirk responsibility, slow to help the other related disciplines, and there will be plenty of cross-discipline finger pointing. A commitment to the end goal of user delight leads to an atmosphere where the related disciplines make sure that their alliance partners in the project are successful. An example on the HVAC manufacturer side of things has to do with controls. There are times when the end user wants the manufacturer to integrate third-party components and controls into the cooling system. A commitment to end user delight in this case means that the third-party controls supplier does everything in their power to provide the HVAC manufacturer the components on time or ahead of schedule. This gives the HVAC manufacturer time to integrate the controls and deliver to the mechanical contractor on time while eliminating needless field work.
End Result = End User Delight
Proactive communication, trust & accountability, and teamwork are nothing earth shattering or new to those of us who have any business experience. That being said, these characteristics are most likely known by all, talked about by many, and unfortunately only implemented effectively by a few. The end goal needs to always be end user delight and satisfaction. Hopefully, by taking a few minutes to reflect on these characteristics we all will be able to better serve the end user community in the delivery of successful data center projects.

Harold Simmons is the Global Director of Strategy at United Metal Products (UMP) http://www.unitedmetal.com, and serves on the boards of AFCOM and the Global Data Center Alliance

The Rio Olympics, the NFL’s Raiders, & Data Center Incentives


As the world focuses on Rio, and Las Vegas tries to lure an NFL franchise, we hear contrasting opinions about the local economic impact of events like the Olympics and professional sports teams. Construction jobs are great – but they often go to out-of-town specialty contractors. Stadium jobs are seasonal, and often low-wage. Often cited is the impact on local businesses, which benefit from increased traffic on game days, or even contracts with the team or organization. Reviewing the arguments from both sides, one thing is clear – no one has been able to quantify the long-term impact. Despite this lack of clarity, politicians and economic development groups clamor to land these high-profile teams and events through tax breaks and subsidies for construction. Sound familiar?

With New Mexico and Utah reportedly fighting for the opportunity to host a new 550,000 SF data center (Data Center Dynamics suggests Facebook is the force behind shell company Greater Kuda LLC), we see the same type of divide. Detractors point out that a half-million-square-foot facility like the one proposed by Greater Kuda LLC would require a tremendous amount of water (over 5 million gallons daily) while providing only 70 – 90 jobs. Proponents counter that as online networking becomes central to every major business and cloud computing becomes the preferred choice for startups and well-established firms alike, data centers have become one of the most in-demand resources. Because of the central role they play in our contemporary business world, data centers are invaluable revenue generators and, hence, contribute to a prosperous economy wherever they are located. There is a further argument to be made that the prestige of a best-in-class firm choosing to locate in a community attracts other, sought-after companies.

This is why several US states try to attract data centers with offers and policies designed to convince operators that their state is the ideal location for a cluster of facilities. One of the main strategies deployed by state governments is tax incentives – partial or full exemptions from property or equipment taxes. According to one analysis, around 23 states provide some degree of such incentives to attract data center operators.

Here are some states and their tax incentive policies, along with the results they have received.

ALABAMA

Under a data center incentive created and implemented in 2012, Alabama offers data centers a one hundred percent exemption from the equipment taxes they would otherwise pay on purchased infrastructure. The law not only allowed $200 million of capital investment to be made, but also paved the way for at least 20 new jobs in the area. The incentive helped convince Google to build a $600 million facility in the state, in exchange for which it receives more than $81 million in local and state incentives.

ARIZONA

Arizona joined the data center competition in 2013, when a newly passed law allowed any business investing a minimum of $50 million to receive up to one hundred percent in sales and property tax incentives. As a result, more than ten high-profile companies, including eBay and GoDaddy, are hosted in Arizona, where they receive more than $5.5 million in tax breaks over a period of 10 to 20 years.

GEORGIA

Atlanta has been deemed one of the leading markets for data centers thanks to an attractive tax incentive policy. Passed in 2005, Georgia’s incentive program has been a long-running one, requiring only $15 million of investment to be made annually. In return, businesses are exempted from all of the sales or computer equipment tax they would otherwise pay. This is the primary reason the state has been able to land some big fish in the industry (although the names have not been disclosed).

NEBRASKA

Nebraska’s incentive program targets larger data center tiers, requiring at least $200 million in capital investment and a minimum of 30 new jobs. Under the Nebraska Advantage program, qualifying companies receive a full refund of any sales, income, or real estate taxes paid. This large-tier policy has, as a result, attracted larger, well-established companies; Yahoo remains one of the state’s prime tenants.

TEXAS

The state of Texas deployed a new incentive program in 2013, under which a company investing a minimum of $200 million in capital and providing 20 jobs receives a full exemption from all sales taxes: on equipment, infrastructure, cooling systems, fuel, electricity, and software, among others. The program did introduce another limit, however: the exemption applies only to data centers comprising more than 100,000 square feet of space.

POWER PURCHASE AGREEMENT

On the other hand, if a data center owner is not tax-exempt and has entered into a Power Purchase Agreement – a financial arrangement in which a seller arranges for and provides power to a data center’s property – it can make use of federal tax incentives such as the Investment Tax Credit, which offers a 30% credit on the total cost of the power system, or Accelerated Depreciation, under which the Internal Revenue Service provides a cost recovery schedule for commercial power equipment. These attractive policies may play a large part in helping big names such as Apple and Microsoft decide to invest in a particular state.
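As a simple illustration of the 30% credit, consider a hypothetical $2 million on-site power system; the cost figure is invented, and actual eligibility and treatment depend on how the agreement is structured.

```python
# Hypothetical figures for illustration only -- not tax advice.
system_cost = 2_000_000   # assumed cost of an on-site power system, in dollars
itc_rate = 0.30           # the 30% Investment Tax Credit mentioned above

print(f"Federal tax credit: ${system_cost * itc_rate:,.0f}")  # $600,000
```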

Jeffrey Dorf is the Editor of The Data Center Blog and President of the Global Data Center Alliance

The Impact of Mobile Games on Data Centers


These days, watching children, teenagers, and even mature adults walk around with their phones out, scanning the area around them for something that is seemingly supposed to materialize from thin air, is quite a common sight. If the statistics are any indication, you could well be one of those who venture out on the streets in search of virtual critters. Around 21 million people worldwide are thought to be spending roughly 30 minutes a day on the new Pokemon Go app, making it the biggest mobile game in the history of the United States.

Along with unprecedented popularity, the Pokemon Go application has also racked up one more thing: a plethora of complaints. Most of these complaints stem from the fact that the application’s servers crash so often that, for many people, playing the game becomes impossible. And this is not only the case with Pokemon Go; it is a problem faced by several other applications that receive a large amount of traffic. Stalled loading, application crashes, and dropped server connections all arise when the hosting cloud is overloaded.

This has had a huge impact on the world of data centers. Pokemon Go, like many phone applications, is hosted on a cloud platform reportedly provided by Google. With cloud computing involved, the role of the data center becomes that much more important: it needs to continuously link and transfer data to and from many users of the application at the same time.

If the entire application is hosted in one central data center, the primary problem is the overuse of certain assets and an influx of data from consumers that a single data center cannot manage. As a result, every time you come close to catching that rare Pokemon you have walked miles to capture, the application freezes and crashes, leaving you staring in disappointment at the now non-augmented reality before you.

Despite this, the popularity of current augmented reality phone games, and the rapid development of new ones, leads to a realization: edge data centers can prove to be the ultimate solution. As demand on data centers increases with phone games and applications that use cloud computing as their hosting platform, a series of edge centers placed throughout an impact area is a better system than a single data center acting as headquarters and handling everything related to the application.
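Here is a minimal sketch of that idea, assuming a handful of invented edge locations: each user is simply directed to the nearest edge site rather than to one central facility, which keeps both latency and per-site load down.

```python
import math

# Invented edge sites (latitude, longitude) for illustration only.
EDGE_SITES = {
    "us-west": (37.4, -122.1),
    "us-central": (41.3, -96.0),
    "us-east": (39.0, -77.5),
}

def nearest_site(lat: float, lon: float) -> str:
    """Route a player to the edge site with the smallest coordinate distance."""
    return min(EDGE_SITES, key=lambda s: math.dist((lat, lon), EDGE_SITES[s]))

# A player in Denver is served from the central site instead of a single
# faraway headquarters facility.
print(nearest_site(39.7, -104.9))   # "us-central"
```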

Edge computing is expected to play a huge part in supporting firms’ future computing endeavors, as it keeps data transfer local and dramatically improves performance and responsiveness. There is no doubt, though, that applications such as Pokemon Go have greatly impacted the growth of the data center industry and have demonstrated the value of a cloud-based hosting platform. Hopefully, in the near future, you will be able to catch all the Pokemon you wish without the app crashing at a critical moment!

Jeffrey Dorf is the Editor of The Data Center Blog and President of the Global Data Center Alliance