March 04, 2011
HANNOVER, Germany, March 4, 2011 -- Pacific Control Systems LLC, Dubai, announced a joint initiative with the Vienna University of Technology (TU Vienna) to open a cloud computing lab dedicated to research and development on ICT-enabled managed energy services solutions for buildings and infrastructure. The announcement was made on the sidelines of the CeBIT conference in Hannover.
The “Pacific Controls Cloud Computing Research Lab” will enable the Dubai-headquartered world-class developer and provider of global total automation solutions to engage in a range of strategic scientific developments, including R&D on enhanced virtual software services (Gbots), virtual robots designed for energy efficiency.
Dilip Rahulan, Chairman and CEO of Pacific Controls, said, “We are pleased to collaborate with Vienna University of Technology. It is our constant endeavor to support ICT enabled managed services solutions for buildings and infrastructure. Through the use of our virtual software technology (Gbots), a new paradigm can be created to revolutionize managed services for buildings and infrastructure achieving substantial energy savings and reducing operational costs.”
Professor Schahram Dustdar of the Distributed Systems Group will review the design and development strategy of the virtual software services (Gbots) developed by Pacific Controls and conduct scientific research on their use in ICT-enabled managed services solutions for buildings and infrastructure.
Team members of the “Pacific Controls Cloud Computing Research Lab” in Vienna will be given access and training at the company’s R&D facilities in Dubai, India, the USA, Saudi Arabia, and the 24 cities where the company intends to establish Command Control Centers for managed services. Pacific Controls will offer internship opportunities at these sites for students from the Vienna University of Technology.
About Pacific Controls
Pacific Controls is an ISO 9001:2008 company providing ICT-enabled managed services and converged engineering solutions for buildings and infrastructure projects globally. Pacific Controls has pioneered the concept of city-centric management of buildings and infrastructure, and has established the world’s first Global Command Control Center for energy and managed engineering services. The Global Command Control Center serves clients globally by monitoring and managing their assets in real time: carrying out continuous commissioning, measuring and verifying their carbon footprint, and converting it into carbon financial instruments. Pacific Controls offers its managed services using “GALAXY”, its enterprise platform hosted on the Pacific Controls Cloud Computing Network. Global assets are connected to the cloud in real time using M2M hardware developed by Pacific Controls. The company has delivered some of the world’s largest ICT-enabled integration projects.
About Vienna University of Technology
The Vienna University of Technology (TU Vienna) is located in the heart of Europe, in a cosmopolitan city of great cultural diversity. For nearly 200 years, the TU Vienna has been a place of research, teaching and learning in the service of progress. The TU Vienna is among the most successful technical universities in Europe and is Austria’s largest scientific-technical research and educational institution. Visit www.tuwien.ac.at.
The Distributed Systems Group (DSG) of the Vienna University of Technology, directed by Prof. Schahram Dustdar, conducts research and teaching in distributed computing and systems, with particular emphasis on software services, software architectures, and novel paradigms for distributed systems. The DSG investigates foundations and explores novel applications of technologies for the development of distributed services. The focus is on Service-oriented Computing; Autonomic, Complex, and Context-aware Computing; Cloud Computing; and Mobile and Ubiquitous Computing. Visit www.infosys.tuwien.ac.at.
Source: Pacific Controls
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. To address them, we present a novel federation model that enables end-users to aggregate heterogeneous resources and tackle large-scale problems. The feasibility of this federation model was demonstrated in the context of the UberCloud HPC Experiment, where it was used to gather the most comprehensive information to date on the effects of pillars on microfluidic channel flow.
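The federation idea described above can be sketched as a simple broker that pools heterogeneous resources and allocates across them. This is a minimal illustrative sketch; the class names, providers, and greedy allocation policy below are assumptions for illustration, not details of the UberCloud experiment's actual model.

```python
from dataclasses import dataclass

@dataclass
class Resource:
    provider: str   # e.g. an in-house cluster or a public cloud (hypothetical names)
    cores: int
    has_gpu: bool

class Federation:
    """Toy broker that aggregates heterogeneous resources into one pool."""
    def __init__(self):
        self.pool = []

    def register(self, resource):
        self.pool.append(resource)

    def allocate(self, cores_needed, need_gpu=False):
        """Greedily pick the largest matching resources until the core count is met."""
        picked, total = [], 0
        for r in sorted(self.pool, key=lambda r: -r.cores):
            if need_gpu and not r.has_gpu:
                continue
            picked.append(r)
            total += r.cores
            if total >= cores_needed:
                return picked
        return None  # the federation cannot satisfy this request

fed = Federation()
fed.register(Resource("in-house", 64, False))
fed.register(Resource("cloud-a", 128, True))
fed.register(Resource("cloud-b", 32, True))
print([r.provider for r in fed.allocate(150, need_gpu=True)])  # → ['cloud-a', 'cloud-b']
```

The point of the sketch is the aggregation step: a request that exceeds any single provider's capacity is satisfied by combining resources from several, which is what lets in-house HPC capacity be topped up with cloud nodes at peak times.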
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to absorb peak computational loads that exceed their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of the Google Compute Engine in providing computational support to ATLAS, a detector of high-energy particles at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.
May 23, 2013
The study of climate change is one of those scientific problems where it is almost essential to model the entire Earth to attain accurate results and make worthwhile predictions. In an attempt to make climate science more accessible to smaller research facilities, NASA introduced what they call ‘Climate in a Box,’ a system they note acts as a desktop supercomputer.
May 16, 2013
When it comes to the cloud, long distances mean unacceptably high latencies. Researchers from the University of Bonn in Germany examined the latency issues of doing CFD modeling in the cloud by running a common CFD code on Amazon EC2 HPC instance types with both CPU and GPU cores.
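The latency problem examined in that study can be illustrated with a back-of-the-envelope model: each CFD solver iteration pays a fixed communication cost for halo exchanges, so higher inter-node latency quickly erodes parallel speedup. The numbers and the fixed-cost-per-step model below are simplifying assumptions for illustration, not figures from the Bonn study.

```python
def speedup(n_nodes, t_compute=1.0, latency=0.0, msgs_per_step=4):
    """Idealized speedup for one CFD time step: compute time is divided
    across nodes, but each step pays msgs_per_step * latency for
    communication regardless of node count (a simplifying assumption)."""
    t_parallel = t_compute / n_nodes + msgs_per_step * latency
    return t_compute / t_parallel

# Compare a low-latency interconnect (~10 us) with a commodity
# cloud network (~1 ms) at 1, 8, and 64 nodes.
for lat in (1e-5, 1e-3):
    print(f"latency {lat:.0e} s:",
          [round(speedup(n, latency=lat), 1) for n in (1, 8, 64)])
```

Under this toy model, speedup at 64 nodes stays near ideal on the low-latency interconnect but collapses on the millisecond-latency network, which is the qualitative effect that makes tightly coupled CFD workloads sensitive to cloud networking.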
05/10/2013 | Cleversafe, Cray, DDN, NetApp, & Panasas | From Wall Street to Hollywood, drug discovery to homeland security, companies and organizations of all sizes and stripes are coming face to face with the challenges – and opportunities – afforded by Big Data. Before anyone can utilize these extraordinary data repositories, however, they must first harness and manage their data stores, and do so utilizing technologies that underscore affordability, security, and scalability.
04/02/2012 | AMD | Developers today are just beginning to explore the potential of heterogeneous computing, but the potential for this new paradigm is huge. This brief article reviews how the technology might impact a range of application development areas, including client experiences and cloud-based data management. As platforms like OpenCL continue to evolve, the benefits of heterogeneous computing will become even more accessible. Use this quick article to jump-start your own thinking on heterogeneous computing.