April 03, 2012
GCE's Big Data framework enables government agencies and organizations to rapidly and cost-effectively store, manage, and analyze growing volumes of data
RESTON, Va., April 3 — GCE today announced its SMART Cloud for Big Data and Analytics – a Big Data framework for storing and managing data and a toolset for performing consumer-grade search and analysis on that data.
GCE, a Cloud pioneer in the Federal market, built the SMART Cloud solution on Apache Hadoop and related open-source tools developed by industry leaders. With this solution, business users are empowered to Search, Mash, Analyze, Research, and Target their data – posing questions and exploring results with unprecedented speed and scale. The GCE solution is ideally suited to both government agencies and commercial enterprises struggling to gain rapid insight from large volumes of data. GCE's Big Data solution is hosted on the GCE Cloud and requires no capital investment.
The Obama Administration announced its "Big Data Research and Development Initiative" last week – reinforcing the role that innovative tools and technologies can play in ushering in new ways to harness, analyze, and extract actionable insights from the growing volume of digital data. GCE has already deployed its SMART Cloud solution for several customers of its financial management and procurement data Clouds.
"Today we all have consumer-grade expectations when it comes to extracting meaningful information from our business systems, however, existing enterprise-grade solutions cannot match these expectations," says Ray Muslimani, GCE president and CEO. "The GCE SMART Cloud delivers the ability to search and analyze the data with the same intuitive tools and performance expectations consumers take for granted."
For more than a decade, GCE has been an established industry leader in the development of innovative cloud solutions. The GCE Cloud offers a wide range of business services, including financial management, asset management, procurement, Big Data, and litigation support services. For more information, visit the company at http://www.GCEcloud.com.
Researchers from the Suddhananda Engineering and Research Centre in Bhubaneswar, India developed a job scheduling system, which they call Service Level Agreement (SLA) scheduling, designed to provision resources with service guarantees comparable to those of in-house systems. They combined it with an on-demand resource provisioner to optimize the utilization of virtual machines.
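The brief doesn't spell out the paper's exact algorithm, but the core idea – admit jobs against their SLA deadlines, and provision virtual machines on demand only when the existing pool would miss one – can be sketched in a few lines of Python. Everything below (the Job and SlaScheduler names, the earliest-deadline-first policy, the max_vms cap) is a hypothetical illustration, not the authors' implementation.

```python
from dataclasses import dataclass

@dataclass
class Job:
    name: str
    runtime: float       # estimated hours of compute
    sla_deadline: float  # hours from now by which the job must finish

@dataclass
class VM:
    vm_id: int
    busy_until: float = 0.0  # hour at which this VM becomes free

class SlaScheduler:
    """Greedy earliest-deadline-first scheduling with on-demand provisioning.
    (Illustrative only; not the scheme from the Suddhananda paper.)"""

    def __init__(self, max_vms: int = 8):
        self.vms = [VM(0)]   # start with one VM to keep utilization high
        self.max_vms = max_vms

    def schedule(self, jobs):
        placements = []
        # Place the tightest SLAs first (earliest deadline first).
        for job in sorted(jobs, key=lambda j: j.sla_deadline):
            vm = min(self.vms, key=lambda v: v.busy_until)
            finish = vm.busy_until + job.runtime
            # On-demand provisioning: add a VM only if the least-loaded
            # existing VM would still miss the job's SLA deadline.
            if finish > job.sla_deadline and len(self.vms) < self.max_vms:
                vm = VM(len(self.vms))
                self.vms.append(vm)
                finish = job.runtime
            vm.busy_until = finish
            placements.append((job.name, vm.vm_id, finish,
                               finish <= job.sla_deadline))
        return placements

jobs = [Job("render", 4, 5), Job("etl", 2, 2), Job("report", 1, 6)]
for name, vm_id, finish, met in SlaScheduler().schedule(jobs):
    print(f"{name}: vm{vm_id}, done at t={finish}h, SLA met: {met}")
```

Growing the pool only on a would-be SLA miss is what keeps utilization up; a fixed pool sized for peak demand would sit idle the rest of the time.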
Experimental scientific HPC applications are continually being moved to the cloud, as covered here in several capacities over the last couple of weeks. Among that coverage, CloudSigma co-founder and CEO Robert Jenkins penned an article for HPC in the Cloud discussing how cloud technologies are emerging to supplement the research capabilities of big scientific initiatives like CERN and the European Space Agency (ESA)...
When considering moving excess or experimental HPC applications to a cloud environment, there will always be obstacles; were that not the case, the cost effectiveness of cloud-based HPC would rule the high performance landscape. Jonathan Stewart Ward and Adam Barker of the University of St. Andrews produced an intriguing report on the state of cloud computing, paying significant attention to the problems the field still faces.
Jun 17, 2013
With that in mind, Datapipe hopes to establish itself as a green-savvy HPC cloud provider with its recently announced Stratosphere platform. Datapipe markets Stratosphere as a green HPC cloud service, partnering with Verne Global, whose Icelandic datacenter is known for its green computing credentials.
Jun 12, 2013
Cloud computing is gaining ground among mid-sized institutions looking to expand their experimental high performance computing resources. To support that movement, IBM released a set of its Redbooks publications, in part to assist institutions in moving high performance computing applications to the cloud.
Jun 06, 2013
The San Diego Supercomputer Center launched a public cloud system for area universities, designed specifically to run on commodity hardware with high-performance solid-state drives. The system, which currently holds 5.5 PB of raw storage, is open to educational and research users across the University of California.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can exploit these extraordinary data repositories, however, they must first harness and manage their data stores, using technologies that deliver affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.
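To make the paradigm concrete: in OpenCL, a single kernel can be dispatched to whatever device the platform exposes, CPU or GPU alike. The sketch below uses the third-party PyOpenCL bindings to offload a vector addition; the bindings, the array size, and the vec_add kernel name are illustrative choices, not anything prescribed by the AMD article.

```python
import numpy as np
import pyopencl as cl

# OpenCL C kernel: runs on whichever device the context selects (GPU or CPU).
KERNEL_SRC = """
__kernel void vec_add(__global const float *a,
                      __global const float *b,
                      __global float *out) {
    int gid = get_global_id(0);
    out[gid] = a[gid] + b[gid];
}
"""

a = np.random.rand(1_000_000).astype(np.float32)
b = np.random.rand(1_000_000).astype(np.float32)

ctx = cl.create_some_context()   # picks an available OpenCL device
queue = cl.CommandQueue(ctx)
mf = cl.mem_flags

# Copy the inputs to device memory and allocate an output buffer.
a_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=a)
b_buf = cl.Buffer(ctx, mf.READ_ONLY | mf.COPY_HOST_PTR, hostbuf=b)
out_buf = cl.Buffer(ctx, mf.WRITE_ONLY, a.nbytes)

# Build the kernel and launch one work-item per array element.
prog = cl.Program(ctx, KERNEL_SRC).build()
prog.vec_add(queue, a.shape, None, a_buf, b_buf, out_buf)

# Read the result back to the host and verify it.
out = np.empty_like(a)
cl.enqueue_copy(queue, out, out_buf)
assert np.allclose(out, a + b)
print(f"Summed {a.size} elements on: {ctx.devices[0].name}")
```

The same kernel source runs unchanged whether the context resolves to a GPU or a multicore CPU, which is the portability argument heterogeneous platforms make.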