July 04, 2012
PHOENIX, July 2 — Phoenix NAP, a full-service data center, premier Infrastructure-as-a-Service (IaaS) provider and primary network access point (NAP), announced today that its highly anticipated new servers based on the Intel Xeon processor E5-2600 product family are now available. The data center cited customer satisfaction as a significant motivating factor in its decision to offer the new servers.
"We are excited that our new servers based on the Intel Xeon processor E5-2600 family are finally available," said Jordan Jacobs, director of corporate strategy for Phoenix NAP. "Providing leading systems and solutions to our current and future customers is a main priority of Phoenix NAP Secured Servers, and for us, the decision to offer these servers with new Intel Xeon processors was a way to keep up with the latest and greatest."
With server configurations that include a 4-bay chassis, RAM options ranging from 64GB (included) up to 256GB, and 500GB SATA hard drives, the new servers enable clients to leverage the computing power of six-core processors from the Intel Xeon E5-2600 product family to meet rising CPU demands. Additionally, the servers feature 3.2TB of premium bandwidth and 29 allocated IPs, helping to ensure Internet traffic needs are addressed.
"These new servers based on the latest Intel Xeon processors meet a variety of needs, from private cloud to advanced application servers," added Jacobs. "Priced to fit a variety of IT budgets with no required contracts, our new Intel Xeon-based servers offer excellent performance without breaking the bank."
Phoenix NAP is a PCI DSS Validated Service Provider and a SOC Type 2 audited facility.
About Phoenix NAP
Phoenix NAP, a full-service data center, premier Infrastructure-as-a-Service (IaaS) provider and primary network access point (NAP) for the Phoenix, Ariz., metro region, leads the way in technology systems and solutions through its innovation and highly redundant data center systems. Our highly personalized approach ensures that all of your requirements are met. Whether it's high-density colocation, flexible storage, physical servers or Phoenix NAP Secured Cloud services, our enterprise-grade facility and certified NOC technicians supply IT solutions to fit your every need. For more information, visit the company's website at http://www.phoenixnap.com.
Source: Phoenix NAP
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. Thus, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been proven, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluid channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle computational demands at peak times that cannot be met by combined in-house HPC resources alone. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013 |
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what it calls ‘Climate in a Box,’ a system it notes acts as a desktop supercomputer.
May 16, 2013 |
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types featuring both CPU and GPU cores.
May 10, 2013 |
Australian visual effects company, Animal Logic, is considering a move to the public cloud.
May 10, 2013 |
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.