July 27, 2010
Today HP unveiled its new flexible datacenter, which it claims will not only cut upfront costs in half but also deliver a projected 14 percent reduction in carbon footprint. As the release noted, this new center, which is still patent-pending, “offers a standardized, modular approach to designing and building data centers that allows clients to replace traditional data center designs with a flexible solution that can be expanded as needed while conserving resources.”
As Kit Godrich, CTO of Technology Services at HP told CTOEdge this week, “while HP has standardized the physical design of the data center shell, HP offers the customer the option to efficiently customize the final configuration for different power levels by using modular UPSes. Moreover, they can choose from four types of cooling systems designed to be most efficient for different climates, mostly using outside air economizers for primary cooling with DX cooling units (not chilled water) as backup on warmer days.”
Some are suggesting that HP’s shift in data center design marks the beginning of a trend toward simplifying such designs in order to realize as-yet-unseen energy efficiency goals. That would certainly upset the momentum behind mega-datacenters, which do not appear to be going away anytime soon, as each day brings news of datacenter expansion and new construction. While there are more “green” datacenter efforts underway, these remain the anomalies. If HP can prove its claims, however, this might be an attractive option, unless, of course, you’re living in the state of California, where HP is based, by the way.
Full story at CTOEdge
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
Financial institutions are the private-sector industry least likely to adopt public cloud services for data storage. Holding the most sensitive and heavily regulated of data types, personal financial information, banks and similar institutions are mostly moving toward private cloud services, and doing so at great cost.
In this week's hand-picked assortment, researchers explore the path to more energy-efficient cloud datacenters, investigate new frameworks and runtime environments that are compatible with Windows Azure, and design a unified programming model for diverse data-intensive cloud computing paradigms.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so using technologies that emphasize affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.