February 06, 2012
The webinars will also discuss how to extend the benefits of server virtualization to the storage layer and, in doing so, lower costs, improve availability and simplify management
WATERTOWN, Mass., Feb. 6 —
Wednesday, February 8, 11:00 a.m. – 11:45 a.m. Eastern Time
Sanbolic Webinar: Building a Private Cloud with Microsoft Hyper-V and Sanbolic Melio
For customers building a Microsoft-based private cloud solution, the combination of Sanbolic Melio, Microsoft Hyper-V and Microsoft System Center offers a more flexible, reliable and QoS-aware approach than the market-leading alternatives. Melio's powerful abstraction layer augments a customer's existing application and storage infrastructure to deliver enhanced application availability and scalability for virtualized workloads. This webinar will review the advantages and elements necessary to build a flexible, scalable, highly available private cloud using Microsoft Hyper-V, VMM, DPM and Melio.
Key points covered during the webinar will include:
- Consolidating and providing shared access to storage
- Creating highly available active/active Hyper-V clusters
- Providing a seamless growth path for future infrastructure expansion
- Simplifying the management of storage and virtual machine provisioning
- Using Quality of Service (QoS) assignment to prioritize virtual machine I/O
- Extending shared storage capabilities to virtual machines for application data access
- Comparing the benefits of Melio vs. Cluster Shared Volumes (CSV)
Wednesday, February 15, 11:00 a.m. – 11:45 a.m. Eastern Time
Sanbolic Webinar: Adding Data Management and Application High Availability Capabilities to Local Storage
This webinar will discuss how to use Sanbolic's Melio distributed data management platform with local storage to simplify data and storage management and enable high availability of enterprise workloads such as virtual desktop (VDI), file serving and SQL Server, without the costs and complexities associated with SAN storage or proprietary NAS appliances.
Wednesday, February 22, 11:00 a.m. – 11:45 a.m. Eastern Time
Sanbolic Webinar: SQL Server Virtualization with Sanbolic Melio
Virtualization provides a tool for consolidating and simplifying SQL Server environments by enabling multiple SQL Server instances to run on a single physical machine. However, virtualizing the SQL instances does not address the complexity of provisioning, protecting and managing the performance and utilization of the storage resources where the databases reside. Physical deployments must be managed separately from the virtualized SQL instances, which limits the ability to realize the full benefits of virtualization, consolidation and improved utilization, especially in environments where workloads are variable and unpredictable. In this webinar, learn how Sanbolic Melio software extends the benefits of server virtualization to the storage layer, and in doing so simplifies storage management, improves database availability, and streamlines the task of distributing workloads across the SQL Server resource pool.
Please visit the Sanbolic Events Page often for a continuously updated schedule of webinars and other events.
About Sanbolic, Inc.
Founded in 2000, Sanbolic is a global leader in distributed data management. Its Melio software suite delivers dramatically increased levels of application availability, scalability, protection and performance while decreasing cost and management complexity across enterprise data center applications such as Microsoft SQL, Microsoft SharePoint and Windows file-/web-serving, Citrix XenDesktop virtual desktop (VDI), and Microsoft Hyper-V private cloud environments. For further information, visit the Sanbolic website at: www.sanbolic.com or email: firstname.lastname@example.org.
Join the conversation - follow Sanbolic on Twitter.
Source: Sanbolic, Inc.
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. Thus, we present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model has been proven, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluid channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle computational demand at peak times that cannot be absorbed by their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab's Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013 |
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what they call ‘Climate in a Box,’ a system they note acts as a desktop supercomputer.
May 16, 2013 |
When it comes to cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of running CFD modeling in the cloud by benchmarking a common CFD code on Amazon EC2 HPC instance types, using both CPU and GPU cores.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.