September 21, 2012
CHICAGO, IL, Sept. 20 — CohesiveFT announced it has joined the non-profit Open Networking Foundation (ONF) to help develop, promote, and expand the use of OpenFlow and to advance the standardization of a new network architecture, Software-Defined Networking (SDN). As a member of the consortium, and as a provider of application-controlled SDN, CohesiveFT will actively participate in the ONF's work on a new generation of programmable network services.
"The OpenFlow standard is an essential component of SDNs for the management and control of business-critical enterprise applications that also run in the cloud," said Dan Pitt, executive director, Open Networking Foundation. "By utilizing open standards like OpenFlow, member companies can drive the delivery of SDNs to customers. We are pleased to welcome member companies that help develop and advance this type of networking innovation."
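The core idea behind OpenFlow is that a software controller installs match-action entries in a switch's flow table, so forwarding behavior is programmed centrally rather than baked into each device. The following is a toy sketch of that idea in Python, not the actual OpenFlow protocol; all field names and actions are illustrative.

```python
# Toy model of OpenFlow's match-action flow table. A "controller" installs
# entries; the "switch" data plane applies the first matching entry, and a
# table miss is punted back to the controller (as in real OpenFlow).

flow_table = []  # list of (match_dict, action), in priority order

def install_flow(match, action):
    """Controller side: program a forwarding rule into the switch."""
    flow_table.append((match, action))

def handle_packet(packet):
    """Switch side: first matching entry wins; miss goes to the controller."""
    for match, action in flow_table:
        if all(packet.get(k) == v for k, v in match.items()):
            return action
    return "send-to-controller"

install_flow({"dst_ip": "10.0.0.2"}, "output:port2")
install_flow({"dst_ip": "10.0.0.3"}, "drop")

print(handle_packet({"dst_ip": "10.0.0.2"}))  # output:port2
print(handle_packet({"dst_ip": "10.0.0.9"}))  # send-to-controller (table miss)
```

The separation shown here, with forwarding rules supplied by software rather than fixed device logic, is what the quote above refers to as the management and control that OpenFlow standardizes.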
According to IDC, the OpenFlow and SDN market will grow from $168 million in 2013 to $2 billion by 2016.
"Enabling enterprises to run business operations via the cloud is rapidly becoming an imperative for every organization. We have been working with OpenFlow and SDNs since 2008 and joined ONF to share our knowledge with the community of developers dedicated to creating the next generation enterprise network," said CohesiveFT CEO Patrick Kerpan.
Part of CohesiveFT's plan to share SDN knowledge is its Silver Sponsorship of the upcoming Software-Defined Networking Summit in November. The Summit will take place in London's Canary Wharf and will feature a white paper and a presentation from CohesiveFT's UK sales team and CEO Patrick Kerpan. For more on the Summit, visit http://sdnconference.com/
Cohesive VNS3 3.0:
CohesiveFT was founded in 2006 and by 2008 had delivered the first iteration of its flagship product, VNS3. Now at version 3.0, VNS3 is the only application-controlled SDN product on the market: enterprises can use its security and control features from the application layer rather than from the virtual or physical infrastructure layers. CohesiveFT and VNS3 are provider-, vendor-, application-, OS-, and script-neutral.
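To make the "application-controlled" idea concrete, here is a minimal hypothetical sketch of an application declaring its own overlay network and default-deny firewall rules as data, instead of depending on the infrastructure layer. This is an illustration of the concept only; the class and method names are invented and are not CohesiveFT's actual VNS3 API.

```python
# Hypothetical application-layer network control: the application itself
# defines its overlay peers and which traffic is permitted. Default-deny:
# traffic passes only if the application explicitly allowed it.

class OverlayNetwork:
    def __init__(self, cidr):
        self.cidr = cidr
        self.peers = {}   # logical name -> overlay IP
        self.rules = []   # list of (action, protocol, port)

    def add_peer(self, name, ip):
        self.peers[name] = ip

    def allow(self, protocol, port):
        self.rules.append(("ALLOW", protocol, port))

    def is_allowed(self, protocol, port):
        return ("ALLOW", protocol, port) in self.rules

# The application, not the infrastructure, decides the topology and policy.
net = OverlayNetwork("172.31.0.0/22")
net.add_peer("app-server", "172.31.1.10")
net.add_peer("db-server", "172.31.1.20")
net.allow("tcp", 5432)  # open only the database port the app needs

print(net.is_allowed("tcp", 5432))  # True
print(net.is_allowed("tcp", 22))    # False
```

Because addressing, topology, and policy live in the application's own configuration, the same definition can travel with the application across providers, which is the neutrality claim made above.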
CohesiveFT is the leader in enterprise application-to-cloud migration and provides more software-defined networking for cloud computing than all competitors combined. The CFT team has decades of enterprise management experience and has built a solution-oriented organization that serves cloud customers from virtual server assembly through virtual topology boot time. Everything runs within a secure application network in which the enterprise customer controls the addressing, protocols, topology, and security of its communications in, between, and to the clouds. CohesiveFT's Cloud Container provides everything necessary to achieve "cloud convergence" across a computing portfolio comprising numerous software components, operating systems, and cloud and virtual platform choices. Secure Cloud Containers can be built from many combinations of CohesiveFT's products (Server3, VNS3, and Context3), the products of its Cloud Container Technology Partners, and other customer-defined components. To find out more about CohesiveFT and its SDN product, VNS3, please visit www.cohesiveft.com
About the Open Networking Foundation
Launched in 2011 by Deutsche Telekom, Facebook, Google, Microsoft, Verizon, and Yahoo!, the Open Networking Foundation (ONF) is a growing non-profit organization with more than 70 members whose mission is to promote the commercialization and use of SDN and OpenFlow, and collaboratively bring to market standards and solutions. ONF will accelerate the delivery and use of SDN and OpenFlow technologies and standards while fostering a vibrant market of products, services, applications, customers, and users. For further details visit the ONF website at: http://www.opennetworking.org.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. We therefore present a novel federation model that enables end-users to aggregate heterogeneous resources and tackle large-scale problems. The feasibility of this federation model has been demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle peak computational loads that cannot be met by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a high-energy particle detector at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of using the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds for some of them.
May 23, 2013
The study of climate change is one of those scientific problems where modeling the entire Earth is almost essential to attaining accurate results and making worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what it calls 'Climate in a Box,' a system it describes as a desktop supercomputer.
May 16, 2013
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by benchmarking a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.