August 04, 2010
Datacenters, as one might imagine, tend to appear where both power costs and taxes are low. Far from being rare where conditions are so favorable, these centers are expected to multiply in the coming years. Signs pointing to this are numerous, including recent projections from Intel that nearly one-quarter of its chips will go into mega datacenters by the end of next year.
Keeping conditions favorable for datacenter construction is not a top priority for some governments, and some companies that built datacenters in a once-hospitable climate have found the tables turned as taxes rose or previous breaks were overturned. That is what happened last year, when the state of Washington declared that its host of datacenters would no longer receive the sales tax break they had counted on, and that new centers would face higher taxes. Far from taking this in stride, Microsoft pulled its Quincy, Washington Azure facility up by the roots and migrated to Texas. As Laura Smith of TechTarget noted, Google faces a similar decision as it considers moving out of North Carolina.
Smith also stated today that “people can wax poetic about the cloud, but the services flying over the Web touch down on a piece of physical equipment somewhere. Consider Digital Realty Trust, a provider of data centers (move-in or custom) with more than 15 million square feet of space in 70 locations worldwide. Its datacenter facility in Chicago is the city’s second-largest consumer of power, behind O’Hare International Airport.”
Are containerized datacenters the best alternative until the technology catches up with the desire to migrate? Smith says, “when other parts of the country—or world—begin to offer tax incentives for building mega datacenters in their backyards, being able to move workloads from one datacenter to another would make good economic sense. However, this requires a software layer that Google and others are still working on.”
Full story at TechTarget
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
The private industry least likely to adopt public cloud services for data storage is finance. Holding the most sensitive and heavily regulated of data types, personal financial information, banks and similar institutions are mostly moving toward private cloud services, and doing so at great cost.
In this week's hand-picked assortment, researchers explore the path to more energy-efficient cloud datacenters, investigate new frameworks and runtime environments that are compatible with Windows Azure, and design a unified programming model for diverse data-intensive cloud computing paradigms.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – presented by Big Data. Before anyone can exploit these extraordinary data repositories, however, they must first harness and manage their data stores, using technologies that prioritize affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.