MOUNTAIN VIEW, CA., December 10, 2010 -- Symantec Corp. (NASDAQ: SYMC) today announced its security and storage predictions for 2011 based on what its security and storage experts are observing in the information protection landscape. Today's organizations are overloaded with information as data grows exponentially. Almost daily, a new technology is announced or brought to market with the promise of making business less costly, more convenient and more timely.
"Given today's stagnant and declining IT budgets, it's imperative that organizations achieve more value from their IT spending," said Steve Morton, vice president, enterprise product marketing, Symantec. "By understanding challenges, risks and threats, organizations can plan and implement strategic technology initiatives such as virtualization, mobile security, encryption, backup and recovery, archiving and cloud computing to protect and manage their information more efficiently."
New Technologies, New Challenges
As technologies become smarter and faster, the threats to these technological assets follow suit. For example, the exponential consumer adoption of smart mobile devices will increasingly result in these devices making their way into enterprises through the back door, blurring the lines between business and personal use, and driving new IT security models to market in 2011.
Analyst firm IDC estimates that by year's end new mobile device shipments will have increased by 55 percent, and Gartner projects that in the same timeframe, 1.2 billion people will be using mobile phones capable of rich Web connectivity. Although cyber criminals have shown little interest in mobile devices in the past, as devices grow more sophisticated and as a handful of mobile platforms corner the market, it is inevitable that attackers will target mobile devices in 2011 and that mobile devices will continue to grow as a source of confidential data loss.
Gap in Virtual Machine Protection
A similar challenge exists with the widespread adoption of virtualization. Although many companies believe the information and applications within their virtual infrastructure are protected, in 2011 many IT administrators will face the harsh reality that they are not. The rapid adoption, fragmented implementation and lack of standardization of virtual infrastructures will continue to expose gaps in the security, backup and high availability of virtual environments. Although virtualization decreases server costs, organizations are realizing that it simultaneously increases management and storage costs, and without a plan to protect these environments, they may not realize the full return on investment.
Taking Control of Information
Exponential data growth is impeding organizations' ability to effectively manage and recover data. In 2011, storage administrators must regain control of information, lose their "pack-rat" mentality and categorize which information is most important for retention purposes. Otherwise, storage costs will continue to skyrocket, recovery times will grow, and organizations will be unable to meet regulatory compliance standards, including privacy laws and e-discovery requirements.
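The retention categorization described above can be pictured as a simple age-based tiering policy. The function, tier names and day thresholds below are hypothetical illustrations of the idea, not a description of any Symantec product:

```python
from datetime import datetime, timedelta

def retention_tier(last_accessed: datetime, now: datetime,
                   hot_days: int = 30, warm_days: int = 365) -> str:
    """Hypothetical policy: recently used data stays on primary storage,
    aging data moves to a cheaper archive tier, and stale data becomes a
    candidate for review and deletion."""
    age = now - last_accessed
    if age <= timedelta(days=hot_days):
        return "primary"
    if age <= timedelta(days=warm_days):
        return "archive"
    return "review-for-deletion"
```

In practice a categorization policy would also weigh business value and regulatory retention periods, not just age, but even a coarse tiering like this breaks the "keep everything forever" habit.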
Adding to the complexity is the use of social media to improve communication and productivity throughout an organization. Although social media will continue to change the way we collaborate in 2011, IT organizations will also need to understand how to protect and manage these non-standard applications so that business information communicated through these channels can be recovered and discovered. Social media archiving will grow in importance as companies unleash the power of social business while maintaining archives as a control that reduces information risk.
Additionally, as data goes "mobile" and becomes less centralized, regulators will start cracking down in 2011, which will drive organizations to increasingly implement encryption technologies, particularly for mobile devices.
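To make the encryption prediction concrete, the sketch below shows the basic shape of symmetric encryption: a secret key plus a per-message nonce turn plaintext into ciphertext that can only be reversed with the same key. This is a toy hash-based stream cipher for illustration only; a real deployment would use a vetted encryption product or library, and every name here is an assumption of this sketch:

```python
import hashlib
import secrets

def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream by hashing key + nonce + block counter.
    out = bytearray()
    counter = 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return bytes(out[:length])

def encrypt(key: bytes, plaintext: bytes) -> bytes:
    # A fresh random nonce per message keeps keystreams from repeating.
    nonce = secrets.token_bytes(16)
    ks = _keystream(key, nonce, len(plaintext))
    return nonce + bytes(p ^ k for p, k in zip(plaintext, ks))

def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ciphertext = blob[:16], blob[16:]
    ks = _keystream(key, nonce, len(ciphertext))
    return bytes(c ^ k for c, k in zip(ciphertext, ks))
```

The point of the prediction is exactly this property: a lost or stolen mobile device holding only ciphertext discloses nothing without the key.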
The Next Generation Data Center of 2011
As organizations continue to manage with limited resources in 2011 while facing more intelligent and targeted threats, IT will take a more strategic and innovative approach to solving problems. While software will continue to drive innovation, 2011 will bring new delivery models in response to customers' need to ease IT operations. Cloud computing, hosted services and appliances are examples of increasingly attractive delivery models that will change the landscape of today's data center by providing organizations with flexibility and ease of deployment.
Organizations will leverage public and private clouds as these environments become highly available in the coming year. Tools will also emerge to manage this new, complex storage environment and to help IT administrators better understand and capture information about the unstructured data that resides within it. This will allow IT to fully utilize the benefits of the cloud and report intelligently to management. Even as customers opt to take advantage of cloud messaging services, they are finding that they can drive greater cost out of the discovery process by keeping their archives in-house. This hybrid cloud archiving model allows organizations to use hosted messaging services while keeping their archives on-premises, where email can be combined with other on-premises content sources relevant to the discovery process, such as PSTs, IM and SharePoint.
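The hybrid archiving model above exists so that one discovery request can span every archived channel, whether the messages originated in a hosted email service or in on-premises sources like PSTs, IM and SharePoint. A minimal sketch of such a cross-source search, with all source names and sample data invented for illustration:

```python
def discovery_search(archives: dict, keyword: str) -> list:
    """Return (source, item) pairs for every archived item matching a keyword,
    regardless of which content source it came from."""
    needle = keyword.lower()
    hits = []
    for source, items in archives.items():
        for item in items:
            if needle in item.lower():
                hits.append((source, item))
    return hits

# Hypothetical unified archive index spanning hosted and on-premises sources.
archives = {
    "hosted-email": ["Q3 merger terms", "lunch plans"],
    "pst": ["merger due diligence notes"],
    "im": ["ping me about the merger call"],
    "sharepoint": ["holiday schedule"],
}
```

A real e-discovery platform would add indexing, custodians, legal holds and export formats, but the value proposition is the same: one query, every channel.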
Symantec is a global leader in providing security, storage and systems management solutions to help consumers and organizations secure and manage their information-driven world. Our software and services protect against more risks at more points, more completely and efficiently, enabling confidence wherever information is used or stored. More information is available at www.symantec.com.