June 24, 2011
Tilera unveiled its TILE-Gx 3000 Series processors, which the company touts as offering ten times the performance per watt of Intel's Sandy Bridge, with an emphasis on boosting the power of cloud computing applications and placement in cloud datacenters.
According to Tilera, the new family of processors is designed not only to outpace Intel's performance, but also to deliver a 50 percent reduction in total cost of ownership.
Ihab Bishara, who directs Tilera's server solutions group, says his company has been working alongside some of the world's largest cloud providers for two years to ensure the processor addresses common problem spots, including issues stemming from virtualization support and high processor frequency.
He claims that, in addition to providing 64-bit processing, the Tilera chips mark the end of the era of 20-30 percent incremental gains, noting that the Gx-3000 series will deliver the order-of-magnitude improvements that cloud datacenter operators have been scrambling for.
As the company's release noted, these chips are targeted at a number of common web applications and their back-end needs:
The TILE-Gx 3000 series processors target scale-out datacenters running throughput-oriented applications including:
Tilera is pitching a chip series tailored not only to web applications, but also to those running high-performance computing operations in cloud environments.
Full story at Tilera
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle computational loads at peak times that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
The private industry least likely to adopt public cloud services for data storage is the financial sector. Holding the most sensitive and heavily regulated of data types, personal financial information, banks and similar institutions are mostly moving toward private cloud services, and doing so at great cost.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.