September 17, 2012
NEW YORK, Sept. 17 — In today's hyper-competitive financial markets, more so than at any time in the past, data is paramount: its quality and comprehensiveness translate directly into trading advantages. Every day, billions of quotes and trades occur on exchanges around the country and around the world, posing an unprecedented computational challenge. In the face of ever-increasing market data volume, storage facilities capable of holding hundreds of terabytes are now required to save and analyze market data. The hardware costs and specialized system administration expertise required to maintain these systems can easily run into hundreds of thousands of dollars a year, putting high-resolution data beyond the reach of all but the largest institutions. The end result has been a lopsided marketplace in which smaller traders and firms are left at the mercy of large hedge funds and banks.
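To put the storage claim in perspective, a rough back-of-envelope calculation is sketched below; the tick volume and record size are illustrative assumptions, not QuantQuote figures:

```python
# Back-of-envelope storage estimate. The tick volume and record size are
# illustrative assumptions, not QuantQuote figures.
RECORDS_PER_DAY = 10e9         # assumed: ~10 billion quotes/trades per day
BYTES_PER_RECORD = 50          # assumed: timestamp, symbol, price, size, flags
TRADING_DAYS_PER_YEAR = 252

daily_gb = RECORDS_PER_DAY * BYTES_PER_RECORD / 1e9
yearly_tb = daily_gb * TRADING_DAYS_PER_YEAR / 1e3
print(f"~{daily_gb:,.0f} GB/day -> ~{yearly_tb:,.0f} TB/year")
# ~500 GB/day -> ~126 TB/year: hundreds of terabytes accumulate within a
# couple of years, consistent with the storage requirements cited above.
```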
On September 1, 2012, QuantQuote launched the QuantCloud service in response to this troubling trend. QuantQuote seeks to level the playing field by creating a shared cloud-based platform for market research, drastically reducing hardware and system administration costs. "In many ways, cloud computing is perfectly suited for finance," says Jerry Goldberg, Director of Sales at QuantQuote. "In addition to being able to share costs among numerous users, there are many economies of scale that appear when you build one large computing cluster versus a dozen medium-sized clusters."
QuantCloud will operate on a subscription pricing model in which each user is given a memory, CPU, disk I/O, and bandwidth allocation. In times of low load, QuantCloud's load balancing system dynamically increases the resources allocated to active users, allowing maximum utilization of resources at all times. Because the hardware is owned and maintained by QuantQuote and costs are split among multiple users, individual costs are greatly reduced.
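The general reallocation technique can be illustrated with a minimal sketch. This is not QuantCloud's actual implementation, which the release does not describe; all names and numbers here are hypothetical:

```python
# Minimal sketch of dynamic reallocation: idle subscribers' base shares are
# redistributed proportionally among active users. Hypothetical, for
# illustration only.

def reallocate(base_allocations, active_users):
    """Split idle users' subscribed CPU shares among the active users.

    base_allocations: dict mapping user -> subscribed CPU share (cores)
    active_users: set of users currently running jobs
    """
    if not active_users:
        return {}
    idle_capacity = sum(
        share for user, share in base_allocations.items()
        if user not in active_users
    )
    active_total = sum(base_allocations[u] for u in active_users)
    # Each active user keeps their base share plus a slice of the idle
    # capacity, proportional to the size of their subscription.
    return {
        user: base_allocations[user]
        + idle_capacity * base_allocations[user] / active_total
        for user in active_users
    }

# Example: three subscribers, only two currently active.
base = {"alice": 8, "bob": 8, "carol": 16}
print(reallocate(base, {"alice", "carol"}))
# {'alice': 10.67, 'carol': 21.33} -- bob's 8 idle cores are shared 1:2
```

Proportional sharing is only one plausible policy; a production load balancer would also have to throttle reallocated resources back as idle users return, which this sketch omits.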
"We estimate yearly cost savings of $50,000-$100,000 for most users" says Kevin Lu, chief system architect at QuantQuote, "most of that from eliminating the need for a full time system administrator and savings from sharing colocation and bandwidth costs. The pilot program run over the past year with a half dozen trial users has been a resounding success." In addition, QuantCloud users will have access to the full range of QuantQuote software tools such as the highly acclaimed TickMAP and QuantEDGE libraries. The availability of these tools will make data processing much easier and save customers months of work. "QuantCloud is an ambitious undertaking", says Jerry Goldberg, "but we are confident the undeniable advantages it provides with pave the way to success. It is really a disruptive innovation that will revolutionize the financial data marketplace in the long run."
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. To address them, we present a novel federation model that provides end-users with the ability to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to absorb computational demand at peak times that cannot be met by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what it calls 'Climate in a Box,' a system it describes as a desktop supercomputer.
May 16, 2013
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by benchmarking a common CFD code on Amazon EC2 HPC instance types, including instances with both CPU and GPU cores.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, from drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can exploit these extraordinary data repositories, however, they must first harness and manage their data stores, using technologies that emphasize affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.