October 05, 2012
Compliance Reference Architecture for PCI now available to help customers ensure business-critical applications are PCI-compliant
PALO ALTO, Calif., Oct. 4 — VMware, Inc., the global leader in virtualization and cloud infrastructure, today introduced new VMware Compliance Reference Architectures, a set of resources including solution guides and design architectures intended to simplify compliance for business-critical applications in the cloud era. These new reference architectures are designed to help customers and partners easily build and deploy cloud infrastructures that allow business-critical workloads in the cloud to meet a variety of compliance standards. The VMware Compliance Reference Architecture for PCI, available today, is the first in this series of reference architectures, and is designed to help customers ensure their virtualized business-critical applications are PCI-compliant.
"More and more of our customers are progressing toward advanced stages of the virtualization journey, and addressing the compliance requirements associated with virtualizing business-critical applications is becoming a key consideration," said Ramin Sayar, vice president and general manager, Virtualization and Cloud Management, VMware. "The new VMware Compliance Reference Architectures will help our customers accelerate the move of compliance-sensitive workloads to the cloud while enabling our partners to develop integrated solutions that will help customers meet compliance standards more quickly."
The VMware Compliance Reference Architecture for PCI includes the VMware Solution Guide for PCI and the VMware Architecture Design Guide for PCI. These documents help customers understand how they can leverage the VMware vCloud® Suite and VMware View® to deploy secure and compliant cloud infrastructures in accordance with PCI requirements. The reference architecture also includes Partner Solution Guides that highlight how solutions from VMware's security and compliance partners can be leveraged alongside VMware technologies in heterogeneous environments. Eight Partner Solution Guides are available today on the VMware Solution Exchange.
VMware is also enabling audit advisory firms to join the VMware Technology Alliance Partner (TAP) Program and the VMware Consulting and Integration Partner Program (CIPP). This will enable audit partners including Accuvant, Coalfire, Forsythe, IBM, IOActive and K3DES to attain VMware accreditations that enable them to combine their audit expertise with VMware product knowledge to meet customer requirements for guidance on the design and operation of VMware environments. Accredited audit partners can then work with VMware's technology partner community to deliver auditor-reviewed Partner Solution Guides that complement the reference architectures.
The VMware Compliance Reference Architecture for PCI is available today. Additional reference architectures addressing the Federal Risk and Authorization Management Program (FedRAMP), Health Insurance Portability and Accountability Act (HIPAA), Health Information Technology for Economic and Clinical Health (HITECH) Act, Sarbanes-Oxley (SOX) Act and other industry and government compliance standards are expected to be available throughout the rest of 2012 and 2013.
VMware (NYSE: VMW) is the leader in virtualization and cloud infrastructure solutions that enable businesses to thrive in the Cloud Era. Customers rely on VMware to help them transform the way they build, deliver and consume Information Technology resources in a manner that is evolutionary and based on their specific needs. With 2011 revenues of $3.77 billion, VMware has more than 400,000 customers and 55,000 partners. The company is headquartered in Silicon Valley with offices throughout the world and can be found online at www.vmware.com.
Source: VMware, Inc.
Researchers from the Suddhananda Engineering and Research Centre in Bhubaneswar, India developed a job scheduling system, which they call Service Level Agreement (SLA) scheduling, intended to deliver resource provisioning comparable to that of in-house systems. They combined it with an on-demand resource provisioner to optimize virtual machine utilization.
Experimental scientific HPC applications are continually being moved to the cloud, as covered here in several capacities over the last couple of weeks. Included in that rundown, CloudSigma co-founder and CEO Robert Jenkins penned an article for HPC in the Cloud discussing the emergence of cloud technologies to supplement the research capabilities of big scientific initiatives like CERN and ESA (the European Space Agency)...
When considering moving excess or experimental HPC applications to a cloud environment, there will always be obstacles. Were that not the case, the cost effectiveness of cloud-based HPC would rule the high performance landscape. Jonathan Stewart Ward and Adam Barker of the University of St Andrews produced an intriguing report on the state of cloud computing, paying particular attention to the problems facing cloud computing.
Jun 19, 2013 |
Ruan Pethiyagoda, Cameron Boehmer, John S. Dvorak, and Tim Sze, trained at San Francisco’s Hack Reactor, an institute built for intensive, fast-paced programming instruction, put together a program based on the N-Queens solver written by the University of Cambridge’s Martin Richards, and modified it to run in parallel across multiple machines.
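The article does not show the Hack Reactor team's code, but the parallelization strategy it describes is a natural fit for N-Queens: the search tree splits cleanly by the first queen's column, so each branch can be counted on a separate machine. A minimal illustrative sketch (the function names and bitmask approach are my own, not Richards's implementation):

```python
def count_from_first_column(n, first_col):
    """Count N-Queens solutions whose row-0 queen sits in `first_col`.

    Uses the standard bitmask backtracking trick: `cols`, `diag1`, and
    `diag2` track attacked columns and diagonals as bit sets.
    """
    full = (1 << n) - 1  # n low bits set: all columns occupied

    def solve(cols, diag1, diag2):
        if cols == full:          # every row placed
            return 1
        total = 0
        free = full & ~(cols | diag1 | diag2)
        while free:
            bit = free & -free    # lowest free square
            free -= bit
            total += solve(cols | bit,
                           ((diag1 | bit) << 1) & full,
                           (diag2 | bit) >> 1)
        return total

    bit = 1 << first_col
    return solve(bit, (bit << 1) & full, bit >> 1)


def count_all(n):
    # Each first-column branch is independent -- the natural unit of
    # work to farm out across multiple machines, as the article describes.
    return sum(count_from_first_column(n, c) for c in range(n))
```

In a distributed setting, each worker would evaluate `count_from_first_column` for its assigned columns and a coordinator would sum the partial counts; for 8 queens the branches sum to the well-known 92 solutions.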
Jun 17, 2013 |
With that in mind, Datapipe hopes to establish itself as a green-savvy HPC cloud provider with its recently announced Stratosphere platform. Datapipe markets Stratosphere as a green HPC cloud service, partnering with Verne Global, whose Icelandic datacenter is known for its green-computing credentials.
Jun 12, 2013 |
Cloud computing is gaining ground among mid-sized institutions looking to expand their experimental high performance computing resources. To assist, IBM released Redbooks publications aimed in part at helping institutions move high performance computing applications to the cloud.
Jun 06, 2013 |
The San Diego Supercomputer Center launched a public cloud system for universities in the area designed specifically to run on commodity hardware with high performance solid-state drives. The center, which currently holds 5.5 PB of raw storage, is open to educational and research users across the University of California system.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.