September 07, 2010
SCHAUMBURG, IL., September 7, 2010 -- XtremeData, an innovator providing solutions for large data analytics and complex computing problems, today announced that the Data Intensive Discovery Initiative (Di2) at the University at Buffalo, SUNY, has adopted dbX™. XtremeData's dbX offering represents the next generation of database appliances: the only systems created specifically for unconstrained analysis and exploration of very large data sets.
Today's data-intensive challenges in science and engineering are forcing researchers to combine diverse core technologies, such as high-performance computing (HPC) clusters, graphics processing unit (GPU) accelerators, field-programmable gate array (FPGA) acceleration, and massively parallel processing (MPP) database technology, to do their work successfully. Research areas such as microarray data analysis, gene-environment interaction analysis, and combustion/jet engine simulation regularly generate datasets ranging from a few terabytes to more than 100 terabytes (TB) for a single experiment. dbX is the newest and most cost-effective technology for warehousing and analyzing these datasets, allowing fast, unrestricted ad hoc access to the entire dataset.
What People are Saying About dbX
"The Di2 is an academic, government, and industry high-performance computing research consortium. It leads the development of applications that combine novel data-intensive technologies with traditional and new compute-intensive architectures to shorten complex discovery cycles," said Vipin Chaudhary, co-founder and director of the Di2, associate professor of Computer Science at the University at Buffalo, SUNY, and CEO of Computational Research Laboratories (CRL) Ltd., a wholly owned subsidiary of Tata Sons Ltd. "The XtremeData dbX will allow our researchers to perform very complex analytics against massive data sets in multiple science and engineering domains. We strongly believe that exploiting these assets together will create new knowledge sooner and deliver solutions faster."
"Data tsunamis are being created by the new technologies addressing human health, advanced energy systems, the weather and global warming as well as principles of our world and universe," said Todd C. Scofield, founder and co-director of the Di2 and managing director of Big Data Fast LLC. "For this range of research efforts we require cost-effective, high-performance computing technologies. We chose the XtremeData dbX because it delivers best-in-class time-to-knowledge for our researchers. In our production tests dbX performed extremely well and delivered performance that was one to two orders of magnitude faster than traditional architectures."
"Hybrid FPGA-enabled Data Intensive Supercomputer (DISC) appliances are having a major impact on our discovery processes," said Murali Ramanathan, a leading multiple sclerosis and pharmacogenetics researcher, and associate professor of Pharmaceutical Sciences and Neurology at SUNY at Buffalo. "Our algorithms for gene-environment interaction analysis now run incredibly fast: analyses that previously took many hours complete in a few minutes today. The size of the combinatorial problems we can now solve effectively is quite extraordinary."
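The "combinatorial explosion" mentioned above comes from the number of candidate interactions growing combinatorially with the number of genetic markers considered together. A minimal sketch of the counting (not Ramanathan's actual algorithm; the study sizes below are illustrative assumptions):

```python
# Illustrative only: why exhaustive gene-environment interaction
# analysis explodes combinatorially. Testing every k-SNP subset
# against every environmental factor yields C(n_snps, k) * n_env
# candidate tests.
from math import comb

def interaction_tests(n_snps: int, n_env: int, k: int) -> int:
    """Number of candidate tests pairing each k-SNP subset with one
    environmental factor."""
    return comb(n_snps, k) * n_env

# A hypothetical study: 500,000 SNPs, 10 environmental factors.
pairwise = interaction_tests(500_000, 10, 1)   # 5 million tests
three_way = interaction_tests(500_000, 10, 2)  # ~1.25 trillion tests

print(f"{pairwise:,} single-SNP tests")
print(f"{three_way:,} two-SNP tests")
```

Even moving from single-SNP to two-SNP models multiplies the workload by five orders of magnitude, which is why hardware acceleration matters for this class of problem.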
The Di2 has combined an unprecedented amount of computing, storage, and acceleration resources. XtremeData's dbX enterprise-class data warehousing appliance is integrated with the Di2's compute-intensive HPC and GPU infrastructure, enabling complex simulations, detailed analysis of archived datasets, archiving of results to network-attached storage (NAS), and visualization of the output. This new technology is now being applied to computational problems once considered impossible.
"The fields of drug discovery, cancer research, and genomics are of personal interest to me," said Geno Valente, vice president of sales and marketing at XtremeData. "Di2's research will have impacts beyond our wildest dreams on some of the world's worst diseases. Being a part of the computational framework supporting cutting-edge research is particularly gratifying. By partnering with Di2, we have unlocked the powerful FPGAs inside dbX for the worldwide research community via user-defined functions that execute on the FPGA. We are already seeing exciting results from this effort and believe this is only the tip of the iceberg."
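The idea behind FPGA-backed user-defined functions is to push a tight per-row computation into the database engine, next to the data, where the appliance can offload it to hardware. The sketch below is purely conceptual (it does not show dbX's actual UDF interface, which is not documented here); the function name and data are hypothetical:

```python
# Conceptual illustration of a scalar user-defined function (UDF):
# per-row logic that an MPP database applies inside a scan, and that
# an FPGA-enabled appliance such as dbX could offload to hardware.
# This is plain Python standing in for that idea, not dbX's API.

def hamming_distance(a: int, b: int) -> int:
    """Bit-level distance between two genotype bitmasks: the kind of
    tight inner loop that maps naturally onto FPGA logic."""
    return bin(a ^ b).count("1")

# The engine would evaluate the UDF once per row during the scan,
# rather than shipping raw data out to an external program.
rows = [(0b1010, 0b0110), (0b1111, 0b0000)]
distances = [hamming_distance(a, b) for a, b in rows]
print(distances)  # [2, 4]
```

The performance case for doing this in hardware is that the function is applied to every row of a multi-terabyte table, so even a small per-row speedup compounds across billions of rows.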
About University at Buffalo, SUNY
The University at Buffalo is a premier research-intensive public university, a flagship institution in the State University of New York system and its largest and most comprehensive campus. UB's more than 28,000 students pursue their academic interests through more than 300 undergraduate, graduate and professional degree programs. Founded in 1846, the University at Buffalo is a member of the Association of American Universities.
About Data Intensive Discovery Initiative (Di2)
Di2 is dedicated to helping scientists and engineers increase the pace of discovery through the analysis of massive data sets. The computing platforms currently used in science and engineering cannot effectively meet data-analytic challenges of this scale, a serious problem in domains where data is being created at an unprecedented pace. Di2 researchers are developing novel architectures and algorithms for discovery. Di2 also brings vendors and researchers together in ways that were not previously possible.
About Computational Research Laboratories Limited
Computational Research Laboratories Ltd. (CRL) is a wholly owned subsidiary of Tata Sons Ltd. CRL provides High Performance Computing (HPC) cloud research and development services and solutions to the automotive, aerospace, life sciences, media and entertainment, weather, sciences, and other sectors. With 'eka,' CRL's supercomputer (the most versatile, largest, and fastest commercially available HPC system in the world), CRL provides an expanded portfolio of cloud services that allows organizations to access HPC services without the capital expenditure and long turnaround times of cluster setup. CRL has an experienced team of computational experts in multiple domains, including biology, fluid dynamics, materials science, seismic data processing, and reservoir simulation, with a deep understanding of application scaling in highly parallel environments.
About XtremeData, Inc.
XtremeData offers innovative solutions to large data analysis and complex computing problems. Its products combine commodity hardware, open-source software and custom hardware-accelerators in a unique and optimal technology mix. XtremeData offers appliances purpose-built for ad hoc analysis of large data sets. XtremeData is headquartered in Schaumburg, IL, USA. For more information, visit www.xtremedata.com.
Source: XtremeData, Inc.