January 15, 2013
NEW YORK, N.Y., Jan. 15 – TheInfoPro, a service of 451 Research, released its latest Servers and Virtualization Study, indicating a major refresh of x86 server infrastructure and the associated network, storage and software technologies required to optimize performance in virtualized, cloud-ready datacenters. Conducted during the second half of 2012, the study identifies key initiatives of senior server infrastructure managers and examines market factors and major players. This annual study is based on extensive live interviews with server professionals and primary decision-makers at large and midsize enterprises in North America and Europe.
Highlights from TheInfoPro's Servers and Virtualization Study include:
"Server virtualization projects are still dominating IT activity, creating a one-time spending bubble as organizations lay down the foundation for a cloud-ready infrastructure," said Peter ffoulkes, TheInfoPro's Research Director for Servers and Virtualization. "Complexity is driving interest in converged infrastructure solutions, with 13% of respondents planning to implement the technology for the first time within the next two years."
Research Director and report author Peter ffoulkes will host a 451 Research Innovation webinar on January 31 to discuss the report's findings.
About TheInfoPro Servers and Virtualization Study
TheInfoPro's Servers and Virtualization Study takes an in-depth look at key industry trends and tracks the performance of individual vendors. Now in its ninth year, the study was finalized in December 2012. TheInfoPro's methodology uses extensive interviews with a proprietary network of IT professionals and key decision-makers at large and midsize enterprises. Each interview explores several fundamental areas, including implementation and spending plans for more than 30 technologies, evaluations of vendors from business and product perspectives, macro IT influences transforming the sector, and factors affecting decision-making processes. Results are collated into comprehensive research reports providing business intelligence in the form of technology roadmaps, budget trends, vendor spending plans and performance ratings. A sampling of vendors covered in the Vendor Performance and Technology Roadmap components of the study includes BMC Software, CA, Cisco, Citrix, Dell, EMC, HP, IBM, Microsoft, Novell, Oracle, Red Hat, ServiceNow, SolarWinds, VCE and VMware.
About 451 Research
451 Research, a division of The 451 Group, is focused on the business of enterprise IT innovation. The company's analysts provide critical and timely insight into the competitive dynamics of innovation in emerging technology segments. Business value is delivered via daily concise and insightful published research, periodic deeper-dive reports, data tools, market-sizing research, analyst advisory, and conferences and events. Clients of the company – at vendor, investor, service-provider and end-user organizations – rely on 451 Research's insight to support both strategic and tactical decision-making. 451 Research is headquartered in New York, with offices in key locations, including San Francisco, Washington DC, London, Boston, Seattle and Denver.
Source: 451 Research
Researchers from the Suddhananda Engineering and Research Centre in Bhubaneswar, India, developed a job scheduling system, which they call Service Level Agreement (SLA) scheduling, intended to deliver resource provisioning comparable to that of in-house systems. They combined it with an on-demand resource provisioner to optimize the utilization of virtual machines.
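The description above suggests a simple pattern: admit each job only onto a machine that can still meet its service-level deadline, and provision a new VM on demand when none can. A minimal sketch of that idea follows; the class names and the earliest-deadline-first policy are illustrative assumptions, not details taken from the researchers' paper.

```python
from dataclasses import dataclass, field

@dataclass
class Job:
    name: str
    runtime: float   # estimated runtime in hours
    deadline: float  # SLA deadline, hours from submission

@dataclass
class VM:
    queue: list = field(default_factory=list)

    def finish_time(self) -> float:
        # When this VM would finish everything already queued on it
        return sum(job.runtime for job in self.queue)

def schedule(jobs, vms=None):
    """Earliest-deadline-first assignment; provision a VM on demand only
    when no running VM can still meet the job's SLA deadline."""
    vms = vms if vms is not None else []
    for job in sorted(jobs, key=lambda j: j.deadline):
        fits = [vm for vm in vms if vm.finish_time() + job.runtime <= job.deadline]
        if fits:
            # Pack onto the most loaded VM that still meets the SLA,
            # keeping utilization high on as few machines as possible.
            target = max(fits, key=VM.finish_time)
        else:
            target = VM()  # on-demand provisioning step
            vms.append(target)
        target.queue.append(job)
    return vms
```

Packing onto the busiest SLA-compliant VM, rather than the least loaded one, is what ties the scheduler to utilization optimization: lightly loaded machines stay empty and can be released back to the provider.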
Experimental scientific HPC applications are continually being moved to the cloud, as covered here in several capacities over the last couple of weeks. Included in that rundown, CloudSigma co-founder and CEO Robert Jenkins penned an article for HPC in the Cloud discussing the emergence of cloud technologies to supplement the research capabilities of big scientific initiatives like CERN and ESA (the European Space Agency)...
Moving excess or experimental HPC applications to a cloud environment will always involve obstacles. Were that not the case, the cost effectiveness of cloud-based HPC would rule the high performance landscape. Jonathan Stewart Ward and Adam Barker of the University of St Andrews produced an intriguing report on the state of cloud computing, devoting significant attention to the problems facing it.
Jun 17, 2013 |
With that in mind, Datapipe hopes to establish itself as a green-savvy HPC cloud provider with its recently announced Stratosphere platform. Datapipe markets Stratosphere as a green HPC cloud service and, in doing so, is partnering with Verne Global, whose Icelandic datacenter is known for its green computing credentials.
Jun 12, 2013 |
Cloud computing is gaining ground among mid-sized institutions looking to expand their experimental high performance computing resources. To that end, IBM released what it calls Redbooks, in part to assist institutions in moving high performance computing applications to the cloud.
Jun 06, 2013 |
The San Diego Supercomputer Center launched a public cloud system for universities in the area, designed specifically to run on commodity hardware with high performance solid-state drives. The center, which currently holds 5.5 PB of raw storage, is open to educational and research users within the University of California system.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.