January 31, 2011
For the HPC cloud to gain practical acceptance as a viable decision-support tool in a wide variety of businesses and industries, it must include Remote Interactive 3D Visualization as a fundamental component of its architecture. Without this vital functionality, the HPC cloud runs the risk of being considered a technological novelty with limited commercial success.
Unfortunately, the lack of Remote Interactive 3D Visualization capabilities remains a gaping hole in most HPC cloud implementations. Until recently, technological limitations were responsible for this situation, but today these barriers have largely been overcome.
While implementing this complex functionality can be difficult because of “soft-side” barriers, these roadblocks can be overcome.
Each component of Remote Interactive 3D Visualization is important and can stand by itself, but together they enable users in a multitude of fields to make decisions that are faster, more insightful, and better informed.
To fully appreciate their value, let’s examine what each component contributes to the decisions that drive science, technology and business.
Visualization – One of the core capabilities of computing, visualization is routinely divided into two major groupings: scientific (or data) visualization and information (or knowledge) visualization.
Scientific visualization is the transformation of data from simulations, experiments, or reality into a geometric structure that allows exploration, analysis and understanding of the data. In practice, this generally means a user compares variables within a well-established coordinate system to represent a tangible object, such as an airplane wing or geologic strata.
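As a minimal sketch of that idea, the snippet below builds a hypothetical scalar field (standing in for simulation output) on a regular coordinate grid and thresholds it, the first step a visualization tool would take before drawing a surface or isocontour. The field, grid size, and threshold are all invented for illustration.

```python
import math

# Hypothetical "simulation output": a scalar pressure field sampled on
# a regular 2D coordinate grid -- the geometrically structured data that
# scientific visualization renders as surfaces or contour plots.
n = 50
coords = [-1.0 + 2.0 * i / (n - 1) for i in range(n)]
pressure = [[math.exp(-(x * x + y * y)) for x in coords] for y in coords]

# A visualization pipeline maps values to geometry; thresholding the
# field is the first step of extracting an isocontour to draw.
hot_cells = sum(1 for row in pressure for v in row if v > 0.5)
print(hot_cells, "of", n * n, "grid cells exceed the threshold")
```

The key point is that the data carries spatial coordinates, so every value has a natural place in the rendered scene.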
Information visualization, on the other hand, concentrates on the use of computer-supported tools to explore large amounts of abstract data, generally represented n-dimensionally. Info viz is difficult because it can involve correlating data with no obvious relationships to enhance the perception of patterns and structural relations in the abstract data.
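To make the contrast concrete, here is a small sketch of the info-viz starting point: abstract records with no spatial structure, where correlation is computed to surface candidate relationships worth visualizing. The dataset (ad spend, store traffic, and an unrelated noise column) is entirely hypothetical.

```python
import random
import statistics

# Hypothetical abstract dataset: columns with no geometric structure.
random.seed(42)
ad_spend = [random.uniform(10, 100) for _ in range(200)]
traffic = [3.0 * s + random.gauss(0, 5) for s in ad_spend]  # related column
noise = [random.gauss(0, 1) for _ in range(200)]            # unrelated column

def corr(a, b):
    """Pearson correlation coefficient between two equal-length lists."""
    ma, mb = statistics.fmean(a), statistics.fmean(b)
    cov = sum((x - ma) * (y - mb) for x, y in zip(a, b))
    var_a = sum((x - ma) ** 2 for x in a)
    var_b = sum((y - mb) ** 2 for y in b)
    return cov / (var_a * var_b) ** 0.5

print("ad spend vs. traffic:", round(corr(ad_spend, traffic), 2))
print("ad spend vs. noise:  ", round(corr(ad_spend, noise), 2))
```

A strong correlation flags a pattern worth plotting; a near-zero one tells the analyst that pairing is probably not where the structure lies.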
The outcome of each visualization category is the same: to visually communicate information in ways that are reasonable, sensible, and digestible. By visualizing many variables with computer tools, engineers can determine the best design for a new product and retail executives can select the most efficient supply chain, for example.
3D – Viewing information in three dimensions is a natural extension of visualization because nearly all humans are accustomed to seeing things in 3D. Just as a 3D movie heightens the senses of the viewer, 3D computer visualization makes information more digestible and understandable than it is in two or pseudo-three dimensions, because the user can visually extract a greater amount of insight and intelligence from the data being presented.
Interactive – This enables the user to query and guide the visualization. In a decision-support role, the interactive element allows the user to point and click on a feature or item being viewed to access more information about it, creating a learn-on-demand experience. Interactivity also lets the user “manipulate scenarios and experiences” to steer the data and simulations in (near) real time.
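The point-and-click query described above can be sketched as a simple "pick" operation: a click returns the metadata of the nearest feature in the scene, giving the learn-on-demand experience. The feature names, coordinates, and metadata here are invented purely for illustration.

```python
import math

# Invented scene features: screen coordinates mapped to metadata a user
# could pull up on demand by clicking near them.
features = {
    (120.0, 85.0): "wing spar - aluminum alloy, last inspected 2010",
    (300.0, 210.0): "fuel line - composite sheath",
    (45.0, 160.0): "sensor cluster - 14 channels",
}

def pick(click_x, click_y):
    """Return the metadata of the feature nearest the click point."""
    nearest = min(features, key=lambda p: math.dist(p, (click_x, click_y)))
    return features[nearest]

print(pick(118.0, 90.0))
```

A real viewer would pick against the rendered 3D geometry rather than a flat coordinate list, but the interaction pattern, click, resolve to a feature, fetch its data, is the same.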
Remote – In today’s interconnected world, decisions are often arrived at collaboratively, which makes it critical for everyone in the decision-making process—whether they are on site or around the world—to have the same informational view. Remote access means that individuals and teams, regardless of location, can view and interact with the visualization simultaneously, giving everyone an equal share in the collaborative process that will ultimately drive the decision making.
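The "same informational view" requirement boils down to one authoritative shared state that every participant observes. The sketch below models that with a plain session object; a production system would ship the updates over the network, and all names here are hypothetical.

```python
# Minimal sketch of a shared view: one authoritative camera state that
# every participant, local or remote, reads after any update. Network
# transport is omitted; the "participants" are just labels.
class SharedViewSession:
    def __init__(self):
        self.view = {"zoom": 1.0, "rotation": 0.0}
        self.participants = []

    def join(self, name):
        self.participants.append(name)

    def update_view(self, **changes):
        # One participant steers; the change is applied once, so every
        # participant sees an identical view of the data.
        self.view.update(changes)

session = SharedViewSession()
session.join("boston_surgeon")
session.join("remote_neurologist")
session.update_view(zoom=2.5, rotation=90.0)
print(session.view)
```

Because there is a single view state rather than per-user copies, no participant can drift out of sync with the group.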
There are many real and hypothetical examples of how powerful these components are when integrated. One of the best is a surgeon in Boston examining an onscreen MRI that reveals an anomalous tissue mass. The surgeon can rotate the image to view it from all sides and instantly query a database of possible cancers or diseases. With a life hanging in the balance, the attending physician may request an online collaboration with neurological experts at a lab across the country, giving them real-time access to the same MRI. Together, the best minds in medicine can make a timely, well-informed decision on how best to proceed with diagnosis and treatment.
The value of Remote Interactive 3D Visualization is beyond question; this fusion of capabilities is becoming a crucial aspect of high performance computing (HPC) in the on-demand world. But as noted at the outset of this column, non-technical barriers stand in the way of its implementation as a foundational element of the HPC cloud.
Two of these “soft-side” barriers, HPC cloud software licensing challenges and service-level agreements (SLAs), will be addressed in future columns, where I plan to investigate how they can be resolved to advance HPC cloud implementations.
About the Author
Earl Dodd is the Executive Director of the Rocky Mountain Supercomputing Centers, Inc. (RMSC) in Butte, Montana.
His areas of technical interest include strategy formulation for Peta/ExaScale architectures and UltraScale 3D visualization and collaboration to drive the next generation of computationally steered workflows and applications. He holds BS and MS degrees in Mining Engineering from Montana College of Mineral Science & Technology ("Montana Tech") and an MBA from Tulane University. Earl has 29 years of experience in supercomputing and high performance computing (HPC).