June 06, 2012
Company holds one of the largest cloud computing patent portfolios, receives Utah Genius Award
PROVO, Utah, June 5 — Adaptive Computing, manager of the world's largest private cloud and technical computing environments, today announced that it has been granted its fifteenth cloud computing patent by the United States Patent and Trademark Office: U.S. Patent No. 8,179,490. The patent is the most recent in a long line of core cloud computing patents issued to David Jackson, Adaptive Computing's founder and CTO.
The growing Adaptive Computing patent portfolio covers key cloud computing concepts such as elastic computing, cloudbursting, dynamic provisioning, multi-tenancy, compute resource guarantees, usage billing, and more. These patented techniques have formed the basis of Adaptive's products and are now widely used in the industry. Enterprises today are increasingly moving their computing workloads to public and private clouds, taking advantage of cloud computing's lower costs and higher service levels. Adaptive Computing's innovations have made these advantages possible.
In recognition of its patent portfolio, Adaptive Computing recently received a Utah Genius Award for being one of the top patent companies in Utah. Presented annually by Bateman IP Law Group, KSL, and Zions Bank, the Utah Genius Awards recognize companies that excel in patents and trademarks at both the national and international levels.
The most recent patents awarded cover concepts core to the next generation of cloud computing. These include methods of supporting multi-tier applications, sharing resources in public-private hybrid clouds, guaranteeing service delivery while efficiently utilizing cloud resources, enabling multi-tenant cloud computing, and facilitating time-based policy enforcement. Adaptive Computing, founded in 2001, has long been an innovator in the cloud computing space. "Many of these patents have 2004 priority dates, and were filed when few were even thinking about cloud computing," said Rob Clyde, CEO of Adaptive Computing. "David Jackson's forethought and vision are evident throughout these patents."
From the early days of Adaptive Computing, Dave Jackson realized that businesses would need more dynamic, flexible computing environments. He set to work solving the hard problems around dynamic provisioning, usage billing models, reservations, security, and more. Throughout this process, he maintained a focus on guaranteeing what businesses value most.
"Cloud is hard. Businesses are coming to understand that, done correctly, cloud must not just provide general efficiency and agility, but must do so while modeling the capabilities of compute infrastructure, the needs of applications, and the service level agreements of users and departments," said David Jackson, Adaptive's CTO. "Tomorrow's cloud must allow sites to take advantage of new technologies while preserving the business-critical investment they have already made." Jackson continued, "Our core inventions allow organizations to realize cloud's promise, guaranteeing resource access and service levels with a flexibility, intelligence, and efficiency that is unrivaled anywhere in the industry."
Adaptive has long pioneered innovation in cloud computing. Beyond the 15 core patents now issued, Adaptive Computing has over 130 domestic and international patent filings, most dating to 2007 and earlier. This positions Adaptive with one of the earliest and most extensive portfolios of cloud patents in the industry. Summarizing the situation, Clyde said, "From its earliest days, Adaptive Computing has been committed to pushing the state of the art in computing. Our large and growing patent portfolio is a testament to that commitment."
About Adaptive Computing
Adaptive Computing manages the world's largest private cloud and technical computing environments with its self-optimizing cloud management solutions and HPC workload management systems driven by Moab, a patented multi-dimensional intelligence engine. Moab delivers policy-based governance, allowing customers to consolidate and virtualize resources, allocate and manage applications, optimize service levels, and reduce operational costs. Adaptive Computing offers a portfolio of products and services that accelerate, automate, and self-optimize IT workloads, resources, and services in large, complex, heterogeneous computing environments. Adaptive's products act as a brain on top of diverse infrastructure and middleware, enabling it to self-optimize and deliver a higher return on investment.
For more information, call (801) 717-3700 or visit www.adaptivecomputing.com.
Source: Adaptive Computing, Inc.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. To address them, we present a novel federation model that enables end-users to aggregate heterogeneous resources and scale their problems across them. The feasibility of this federation model was demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle peak computational loads that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013 | The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what it calls 'Climate in a Box,' a system it notes acts as a desktop supercomputer.
May 16, 2013 | When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of running CFD modeling in the cloud by benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges, and opportunities, afforded by Big Data. Before anyone can exploit these extraordinary data repositories, however, they must first harness and manage their data stores, and do so using technologies that emphasize affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.