December 13, 2010
CARY, N.C., December 13, 2010 -- A successful audit of the SAS Solutions OnDemand infrastructure by TRUSTe, the leading online privacy seal and services provider, validates the commitment of SAS, the leader in business analytics, to protecting customer data in the cloud.
The TRUSTe audit was a significant milestone for SAS. The company recently filed for Safe Harbor certification, a framework that lets US companies manage personal data privacy practices in line with the stricter European Union Directive on Data Protection. The Safe Harbor Framework, established by the US Department of Commerce in consultation with the European Commission, allows US companies to satisfy EU privacy directives protecting the personal information of European citizens.
“As a global company with more than 50 percent of sales from non-US customers and substantial operations throughout the EU, Safe Harbor certification is critical for SAS’ hosted applications,” said John Brocklebank, Ph.D., Vice President, SAS Solutions OnDemand.
Brocklebank noted that in addition to the pending Safe Harbor certification, SAS OnDemand customers are reassured that their data resides within facilities owned and operated by SAS. “SAS maintains strict physical infrastructure standards for cloud applications,” he said. “Unlike other independent software providers, we maintain complete control to ensure that our customers’ critical data assets are secure.”
The SAS Solutions OnDemand portfolio comprises 18 offerings including applications for marketing automation, social media analytics, anti-money laundering, drug development and other cross-industry and vertical industry purposes. SAS has provided hosted solutions for more than 10 years.
TRUSTe’s privacy certification of SAS Solutions OnDemand includes ongoing platform monitoring and multi-lingual privacy dispute resolution services for consumers. SAS Solutions OnDemand can now display a TRUSTe privacy seal, a well-recognized symbol that verifies a company’s commitment to transparency, accountability and consumer choice.
“Given abundant consumer privacy concerns it’s no surprise that 71 percent of consumers look for trustmarks before doing business online,” said Fran Maier, President of TRUSTe. “With a TRUSTe privacy seal, SAS sends a clear signal to its SAS Solutions OnDemand customers that it respects their personal information. We’ve found that 82 percent of consumers who see a TRUSTe privacy seal trust it, which translates directly into increased site engagement. That’s the privacy payoff.”
SAS is the leader in business analytics software and services, and the largest independent vendor in the business intelligence market. Through innovative solutions delivered within an integrated framework, SAS helps customers at more than 50,000 sites improve performance and deliver value by making better decisions faster. Since 1976 SAS has been giving customers around the world THE POWER TO KNOW.
Thousands of companies rely on TRUSTe's leading online privacy seal to enhance consumer trust, drive increased registrations and transactions, and comply with complex privacy requirements. Consumers know that when they see the TRUSTe seal they can "Click with Confidence" because the certified web site is responsible with their personal information. TRUSTe has certified more than 40 percent of the top 50 Web sites including Yahoo, Microsoft, eBay, AOL, AT&T, Comcast, Disney, Weather.com, LinkedIn, WebMD and Yelp. For additional information on TRUSTe and its services, please visit http://www.truste.com.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. Thus, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model was demonstrated in the context of the UberCloud HPC Experiment, where it was used to gather the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle peak computational loads that cannot be contained within their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what they call ‘Climate in a Box,’ a system they note acts as a desktop supercomputer.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.