September 28, 2010
Software security and risk-analysis firm Veracode announced that, over the past 18 months, more than half of the applications submitted to it for extensive testing failed to meet minimum security standards, even though the company relaxed some of its criteria for applications that did not face exhaustive security requirements.
According to Veracode's leadership, this is partly because building web or cloud-based applications demands an extra set of skills and a new range of expertise, as well as far more time than some in-house developers are willing to commit to retooling applications that have run without fail on dedicated servers.
As Samskriti King of Veracode noted, “Unfortunately, developers trained with software that’s generated and used in one location with a single set of servers often don’t understand the precautions needed for Web applications that take code, data, and elements of the interface from many servers.”
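A concrete illustration of the kind of precaution King describes (a hypothetical sketch, not an example from the article): a database lookup that seems harmless when the app lives on a single trusted server becomes an SQL-injection risk once it accepts input from the open web. Parameterized queries are the standard fix.

```python
import sqlite3

# Hypothetical single-server app moved to the web.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE users (name TEXT, role TEXT)")
conn.executemany("INSERT INTO users VALUES (?, ?)",
                 [("alice", "admin"), ("bob", "user")])

def find_role_unsafe(name):
    # String concatenation: web input like "' OR '1'='1" rewrites the query.
    return conn.execute(
        "SELECT role FROM users WHERE name = '" + name + "'").fetchall()

def find_role_safe(name):
    # Parameterized query: the driver treats `name` strictly as data.
    return conn.execute(
        "SELECT role FROM users WHERE name = ?", (name,)).fetchall()

malicious = "' OR '1'='1"
print(find_role_unsafe(malicious))  # leaks every row in the table
print(find_role_safe(malicious))    # returns no rows
```

The unsafe version returns all users for the malicious input; the parameterized version returns nothing, because the injected string never reaches the SQL parser as code.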
Thomas Kilbin, CEO of Virtacore Systems, told Network World that his customers are moving back-office apps to private clouds without rebuilding those apps to work in a cloud-based model. That exposes them to a far greater number of threats, because many developers don't want to rewrite applications that run in the cloud only during "bursty" periods and were moved there only to save money.
Kilbin also noted that the cloud "is more threat-rich than the shared hosting model, mainly because in shared hosting the core OS and apps—php, perl, mysql—are kept updated by the service provider. In the cloud, the customer has to keep the core OS updated, along with the application stacks, in addition to their code," which can be a major undertaking that some teams lack the time or expertise to handle.
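The patch burden Kilbin describes can be sketched as a minimal weekly maintenance job (a hypothetical fragment for a Debian-family cloud instance; the file path, package names, and schedule are assumptions, not from the article):

```shell
#!/bin/sh
# /etc/cron.weekly/patch-stack -- hypothetical example.
# In shared hosting the provider runs the equivalent of this;
# on a cloud instance the customer owns both layers.
apt-get update -q
apt-get -y upgrade                   # core OS packages
apt-get -y install --only-upgrade \
        php perl mysql-server        # application stack (php, perl, mysql)
```

Note this covers only the OS and stack layers; the customer's own application code still needs its own update and dependency-review process on top of this.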
Full story at NetworkWorld
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
Financial institutions are the private-sector industry least likely to adopt public cloud services for data storage. Because they hold the most sensitive and heavily regulated type of data, personal financial information, banks and similar institutions are mostly moving toward private cloud services, and doing so at great cost.
In this week's hand-picked assortment, researchers explore the path to more energy-efficient cloud datacenters, investigate new frameworks and runtime environments that are compatible with Windows Azure, and design a unified programming model for diverse data-intensive cloud computing paradigms.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.