December 15, 2010
Integrated smartphone technology is merging with cloud computing, supercomputing, and mobile applications to provide one of the richest layers of technology widely adopted by the consumer market.
The growing sophistication of mobile applications, and of the hardware they run on across a variety of devices from handhelds to tablets, means that more possibilities will exist to access high-performance resources. A number of enterprises have already seen the value of storing data in the cloud, where it can then be accessed securely in the field or on the go.
One can argue that there is currently more of a drive to integrate mobility into the enterprise than to provide access to scientific or technical computing applications. A number of organizations are already moving to new software that integrates with mobile devices and smartphones. One example is SAP's software, which now includes application integration for mobile devices; smartphones can be used to access web information or back-end SAP data needed for mission-critical business decisions.
Soon all types of supercomputers will be linked to smartphone counterparts for similar capabilities. For example, researchers at MIT have created an experimental system for smartphones that allows engineers to leverage the power of supercomputers for instant computation and analysis.
Autodesk, a maker of design and engineering software, has announced that some of its most widely used applications, including AutoCAD, can be delivered via mobile devices. This means that designers and engineers can now update rendering projects via the cloud on a smartphone or tablet.
As supercomputing continues to advance, so will the need for new mobile applications that allow researchers to access and view supercomputing results and data from a smartphone. Once the data is on the smartphone, researchers can “calculate on the go” to derive critical information for use in a host of other research and business scenarios.
This kind of powerful handheld mobile computing will enable more users to realize the benefits of next-generation supercomputing. Companies like IBM continue to aggressively pursue the development of efficient, optimized supercomputers that support more data processing in a smaller, more efficient footprint. This will in turn provide increased data throughput for integration with smartphones.
Eventually, government agencies will adopt integrated smartphone technology, using new mobile features delivered through integration with supercomputers to share information with the public “on the go”. For example, weather simulations will be run on supercomputers that can send data directly to smartphones to share critical information about changing weather patterns. The smartphones will also be able to calculate, analyze, and chart the results in a meaningful way so that all types of users can evaluate and use the data.
Once the data is used on the hand-held, updated information can be formulated and transmitted back to the supercomputer. This will aid in the timely processing of new weather models and simulations that will help us better understand drastic changes in weather and climate.
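The handset side of the round trip described above can be sketched in a few lines. This is a purely illustrative example with invented data and an invented threshold, not any agency's actual system: the phone receives a feed of temperature readings from a simulation, summarizes them locally, and flags drastic changes worth reporting back.

```python
def summarize_feed(readings, change_threshold=5.0):
    """Summarize a feed of temperature readings on the device.

    Returns (mean, max_jump, alert): the mean reading, the largest
    change between consecutive readings, and whether that change
    exceeds the threshold and should be sent back upstream.
    """
    mean = sum(readings) / len(readings)
    jumps = [abs(b - a) for a, b in zip(readings, readings[1:])]
    max_jump = max(jumps) if jumps else 0.0
    return mean, max_jump, max_jump >= change_threshold

# Hypothetical feed pushed down from a weather simulation:
mean, jump, alert = summarize_feed([21.0, 21.5, 22.0, 28.5, 29.0])
```

In a real deployment the alert and the summarized values, rather than the raw feed, would be transmitted back to the supercomputer to inform the next round of simulation.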
The next step in this direction will be continued improvement of the smartphone technology itself. Companies like Nvidia are working on new approaches: smartphone graphics and computing capabilities, for example, will be more closely aligned with the demands of users integrating with supercomputers. These devices will gain new memory, computing, and storage capacity to handle the feeds from next-generation supercomputers, supporting growing business, scientific, educational, and government demand for smartphone integration with supercomputers.
Cloud computing has opened doors to accessing high-end resources that many organizations were once barred from due to cost. With the convergence of mobile applications that provide interfaces to sophisticated resources, we are on the verge of yet another shift in high-performance computing in the next few years.
While this shift is still under way, there seems to be enough momentum building in research and enterprise alike to make it one of the key trends to watch in 2011.
About the Author
Valery Herrington is CEO of Herrington Technology. She is an enterprise technology leader with Tier 1 Global Technology and a trusted executive advisor and consultant with successful project deployments at over 25 large-scale enterprise organizations. Ms. Herrington specializes in emerging technologies, enterprise architecture, program and project management, and rapid deployments for the finance and pharmaceutical industries, among others. Ms. Herrington can be found on the web at http://www.herringtontechnology.net
Jun 19, 2013
Ruan Pethiyagoda, Cameron Boehmer, John S. Dvorak, and Tim Sze, trained at San Francisco’s Hack Reactor, an institute designed for intense, fast-paced programming instruction, took a program based on the N-Queens algorithm designed by the University of Cambridge’s Martin Richards and modified it to run in parallel across multiple machines.
Jun 17, 2013
With that in mind, Datapipe hopes to establish itself as a green-savvy HPC cloud provider with its recently announced Stratosphere platform. Datapipe markets Stratosphere as a green HPC cloud service and, in doing so, is partnering with Verne Global and its Icelandic datacenter, which is known for its strength in green computing.
Jun 12, 2013
Cloud computing is gaining ground among mid-sized institutions looking to expand their experimental high-performance computing resources. To that end, IBM released what it calls Redbooks, in part to assist institutions in moving high-performance computing applications to the cloud.
Jun 06, 2013
The San Diego Supercomputer Center launched a public cloud system for universities in the area, designed specifically to run on commodity hardware with high-performance solid-state drives. The center, which currently holds 5.5 PB of raw storage, is open to educational and research users in the University of California system.
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore heterogeneous computing, but the potential of this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.