MOUNTAIN VIEW, CA., December 10, 2010 -- Symantec Corp. (NASDAQ: SYMC) today announced its security and storage predictions for 2011 based on what its security and storage experts are observing in the information protection landscape. Today's organizations are overloaded with information as data grows exponentially. Almost daily, a new technology is announced or brought to market with the promise of making business cheaper, more convenient and faster to conduct.
"Given today's stagnant and declining IT budgets, it's imperative that organizations achieve more value from their IT spending," said Steve Morton, vice president, enterprise product marketing, Symantec. "By understanding challenges, risks and threats, organizations can plan and implement strategic technology initiatives such as virtualization, mobile security, encryption, backup and recovery, archiving and cloud computing to protect and manage their information more efficiently."
New Technologies, New Challenges
As technologies become smarter and faster, the threats to these technological assets follow suit. For example, the exponential consumer adoption of smart mobile devices will increasingly result in these devices making their way into enterprises through the back door, blurring the lines between business and personal use, and driving new IT security models to market in 2011.
Analyst firm IDC estimates that by year's end new mobile device shipments will have increased by 55 percent, and Gartner projects that in the same timeframe, 1.2 billion people will be using mobile phones capable of rich Web connectivity. Although cyber criminals have shown little interest in mobile devices in the past, as devices grow more sophisticated and as a handful of mobile platforms corner the market, it is inevitable that attackers will target mobile devices in 2011 and that mobile devices will continue to grow as a source of confidential data loss.
Gap in Virtual Machine Protection
A similar challenge exists with the widespread adoption of virtualization. Although many companies believe the information and applications within their virtual infrastructure are protected, in 2011 many IT administrators will face the harsh reality that they are not. The rapid adoption, fragmented implementation and lack of standardization of virtual infrastructures will continue to expose gaps in the security, backup and high availability of virtual environments. Although virtualization decreases server costs, organizations are realizing that it is simultaneously increasing management and storage costs, and without a plan to protect these environments, they may not realize the full return on investment.
Taking Control of Information
The exponential level of data growth is impeding organizations' ability to effectively manage and recover data. In 2011, storage administrators must regain control of information, lose their "pack-rat" mentality and categorize which information is most important for retention purposes. Otherwise, storage costs will continue to skyrocket, and organizations will face extensive recovery times and be unable to meet regulatory standards, including privacy laws and e-Discovery requirements.
Adding to the complexity is the use of social media to improve communication and productivity throughout an organization. Although social media will continue to change the way we collaborate in 2011, IT organizations will also need to understand how to protect and manage these non-standard applications so that business information communicated in these channels can be recovered and discovered. Social media archiving will grow in importance as companies unleash the power of social business while maintaining archives as a control to reduce information risk.
Additionally, as data goes "mobile" and becomes less centralized, regulators will start cracking down in 2011, which will drive organizations to increasingly implement encryption technologies, particularly for mobile devices.
The Next Generation Data Center of 2011
As organizations continue to manage with limited resources in 2011 while facing more intelligent and specific threats, IT will take a more strategic and innovative approach to solving problems. While software will continue to drive innovation, 2011 will bring new delivery models in response to customers' need to ease IT operations. Cloud computing, hosted services and appliances are examples of increasingly attractive delivery models that will change the landscape of today's data center by providing organizations with flexibility and ease of deployment.
Organizations will leverage public and private clouds as they become more widely available in the coming year. Tools will also emerge to manage this new, complex storage environment and to help IT administrators better understand and capture information about the unstructured data that resides within it. This will allow IT to fully utilize the benefits of the cloud and report intelligently to management. Even as customers opt to take advantage of cloud messaging services, they are finding that they can drive greater cost out of the discovery process by keeping their archives in-house. This hybrid cloud archiving model allows organizations to use hosted messaging services while keeping their archives on-premise, so they can combine email with other on-premise content sources relevant to the discovery process, such as PSTs, IM and SharePoint.
Symantec is a global leader in providing security, storage and systems management solutions to help consumers and organizations secure and manage their information-driven world. Our software and services protect against more risks at more points, more completely and efficiently, enabling confidence wherever information is used or stored. More information is available at www.symantec.com.
Experimental scientific HPC applications are continually being moved to the cloud, as we have covered here in several capacities over the last couple of weeks. Among that coverage, CloudSigma co-founder and CEO Robert Jenkins penned an article for HPC in the Cloud discussing the emergence of cloud technologies to supplement the research capabilities of big scientific initiatives like CERN and ESA (the European Space Agency)...
When considering moving excess or experimental HPC applications to a cloud environment, there will always be obstacles. Were that not the case, the cost effectiveness of cloud-based HPC would rule the high performance landscape. Jonathan Stewart Ward and Adam Barker of the University of St Andrews produced an intriguing report on the state of cloud computing, paying significant attention to the problems facing it.
Cloud computing has become mainstream in today’s HPC world. In order to enable HPC researchers who currently work with large distributed computing systems to bring their expertise to cloud computing, it is essential to provide them with easier means of applying their knowledge.
Jun 17, 2013
With that in mind, Datapipe hopes to establish itself as a green-savvy HPC cloud provider with its recently announced Stratosphere platform. Datapipe markets Stratosphere as a green HPC cloud service and has partnered with Verne Global, whose Icelandic datacenter is known for its green computing credentials.
Jun 12, 2013
Cloud computing is gaining ground among mid-sized institutions looking to expand their experimental high performance computing resources. As such, IBM has released one of its Redbooks publications, in part to assist institutions in moving high performance computing applications to the cloud.
Jun 06, 2013
The San Diego Supercomputer Center launched a public cloud system for universities in the area, designed specifically to run on commodity hardware with high performance solid-state drives. The center, which currently holds 5.5 PB of raw storage, is open to educational and research users across the University of California system.
May 10, 2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
Apr 02, 2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.