December 04, 2006
Fortune 1000 IT managers named the ONStor Pantera Clustered NAS system one of the industry's most exciting products in use in TheInfoPro's Wave 8 Storage Study, Fall 2006. Introduced in August 2006, Pantera has, ONStor claims, undercut pricing in the competitive NAS market by 50 percent.
Pantera answers user demand for innovative NAS solutions that deliver file virtualization capability. According to TheInfoPro's Wave 8 Storage Study, NAS capacity purchases, which grew 59 percent in the last year at Fortune 1000 companies, have driven a surge in file virtualization, which tops TheInfoPro's Storage Management Technology Heat Index measuring users' reported spending and implementation plans. ONStor Pantera includes both integrated server and storage virtualization, delivering single-pool storage management regardless of the capacity required. Virtualized resources enable non-disruptive growth, free of costly data migration and user disruption.
"The findings in our Wave 8 Storage Study indicate that NAS deployments have been accelerating as installed capacity in the Fortune 1000 has grown from 44 TB in Wave 4 (Fall 2004) to 224 TB in Wave 8," said Rob Stevenson, managing director for Storage at TheInfoPro. "In conjunction with this growth, file virtualization has become a top priority, supplanting ILM, as storage professionals look to decouple the relationship between the filer and the host in order to manage each independently, using file virtualization on the storage side to manage growth transparently and server virtualization on the host to increase application agility."
TheInfoPro's findings are derived from 153 hour-long interviews, conducted between July 2006 and September 2006, with TheInfoPro's TIPNetwork of the world's largest buyers and users of IT from Fortune 1000 enterprises. Users named Pantera one of the top 10 most exciting products in the industry when asked the question, "What emerging or exciting vendor product do you have in use?"
"The latest report from TheInfoPro clearly indicates the importance Fortune 1000 companies are placing on NAS solutions and file virtualization to handle the rapid growth in file storage," said Bob Miller, president and CEO of ONStor. "Cognizant of these needs, we strive to deliver affordable NAS solutions that provide the utmost in performance, scalability and flexibility as demonstrated in our recently announced Pantera systems. We are thrilled that users are excited about the product and we promise them there's more to come!"
By integrating features such as server virtualization and n-way clustering, ONStor Pantera allows customers to start with a complete configuration and then independently grow both performance and capacity by adding either filer nodes or disk. According to the company, expansion is simple, fast, and non-disruptive. Pantera provides the capability to consolidate data from thousands of servers and tens of thousands of users into a single storage environment.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle peak computational demand that cannot be met by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
The private industry least likely to adopt public cloud services for data storage is financial services. Holding the most sensitive and heavily regulated of data types, personal financial information, banks and similar institutions are mostly moving toward private cloud services, and doing so at great cost.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of running CFD modeling in the cloud by benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
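To see why latency, rather than raw compute, becomes the bottleneck, consider a toy communication-cost model for a tightly coupled solver. The numbers below are illustrative assumptions, not figures from the Bonn study: each iteration performs some fixed compute plus a handful of halo exchanges with neighboring ranks, and each exchange pays one network latency plus a bandwidth term.

```python
# Hedged sketch: a toy per-iteration cost model (not from the study) showing
# why WAN-scale latency degrades tightly coupled CFD, even with ample compute.

def iteration_time(compute_s, n_messages, latency_s, bytes_per_msg, bandwidth_bps):
    """One solver iteration: compute time plus halo-exchange communication cost."""
    comm_s = n_messages * (latency_s + bytes_per_msg * 8 / bandwidth_bps)
    return compute_s + comm_s

# Assumed, illustrative numbers: 10 ms of compute per iteration and
# 8 neighbor exchanges of 64 KiB each.
cluster = iteration_time(0.010, 8, 2e-6, 65536, 10e9)   # ~2 us cluster-interconnect latency
wan     = iteration_time(0.010, 8, 50e-3, 65536, 1e9)   # ~50 ms cross-region latency

print(f"cluster: {cluster * 1e3:.1f} ms/iter")
print(f"wan:     {wan * 1e3:.1f} ms/iter")
```

Under these assumptions the in-cluster run stays near 10 ms per iteration while the wide-area run balloons to several hundred milliseconds, dominated entirely by the latency term; this is the effect the EC2 measurements probe with real instance types.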
May 10, 2013
Australian visual effects company, Animal Logic, is considering a move to the public cloud.
May 10, 2013
Program provides cash awards of up to $10,000 for the best open-source end-user applications deployed on a 100G network.
May 08, 2013
For engineers looking to leverage high-performance computing, the accessibility of a cloud-based approach is a powerful draw, but there are costs that may not be readily apparent.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges, and opportunities, afforded by Big Data. Before anyone can exploit these extraordinary data repositories, however, they must first harness and manage their data stores, using technologies that emphasize affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.