March 15, 2011
NASA CTO Chris Kemp formally stepped down from his post today, leaving the public with a parting message that could have been penned by the combined hands of Bill Gates, Stan Lee, and Franz Kafka:
“I now realize my mistake: I was an entrepreneur chained in the kryptonite of bureaucracy—with almost no startup capital.”
Some could argue that NASA’s CTO has been more like a CEO than a technical lead, given his numerous public appearances and direct, widely-aired involvement with a number of large tech companies. In his parting post he lamented that while he once thought he “had the best of both worlds being a Headquarters employee stationed in Silicon Valley [I] actually had the worst of both worlds…no influence when I can’t be in all of those meetings at NASA HQ, with no mandate to manage projects at Ames.”
This “worst of both worlds” was compounded for Kemp because “budgets kept getting cut and continuing resolutions from Congress continued to make funding unavailable.” At that point the CTO decided it was time to leave the place he dreamed “of working at as a kid to find a garage in Palo Alto to do what I love.”
But alas, being chained in the kryptonite of NASA’s endless swirl of initiatives and policies (even with all the seed money in the world) very likely came with a set of rigorous challenges that a new master of IT for the space agency will have to address.
Among his achievements, Kemp notes that when he joined the agency, his initial role was to help catalyze public-private partnerships and the commercialization of NASA data.
That commercialization effort played out in a number of initiatives with tech giants like Google, which made use (to say the least) of NASA data to bring Google Earth to the desktops of millions. Kemp noted that part of the success was the agency’s ability to use its Space Act authority to ensure that no American tax money went toward the effort.
Under Kemp’s directive to explore public-private partnerships, the agency also partnered with Microsoft to let the company use NASA data in its WorldWide Telescope project.
Kemp is also credited with helping the Nebula cloud project gather steam and thrive, making the case that clouds have a big place in the future of IT—both within and outside of the public sector.
Carving out space in a Palo Alto garage to toil away in obscurity sounds rough enough, but if one has spent a few years chained in kryptonite—and without financial backing to boot—even a leaky carport would do.
Full story at NASA Blogs
The ever-growing complexity of scientific and engineering problems continues to pose new computational challenges. We therefore present a novel federation model that enables end-users to aggregate heterogeneous resources for large-scale problems. The feasibility of this federation model was demonstrated, in the context of the UberCloud HPC Experiment, by gathering the most comprehensive information to date on the effects of pillars on microfluid channel flow.
Large-scale, worldwide scientific initiatives rely on cloud-based systems both to coordinate efforts and to handle peak computational loads that exceed their combined in-house HPC resources. Last week at Google I/O, Brookhaven National Lab’s Sergey Panitkin discussed the role of Google Compute Engine in providing computational support to ATLAS, a high-energy particle detector at the Large Hadron Collider (LHC).
Frank Ding, engineering analysis & technical computing manager at Simpson Strong-Tie, discussed the advantages of utilizing the cloud for occasional scientific computing, identified the obstacles to doing so, and proposed workarounds to some of those obstacles.