February 01, 2011
In my last post I mentioned that I am working with a small bio-tech client on an IT strategy that uses the various facets of cloud computing to meet as many of their IT needs as possible. The company has fewer than 50 employees but faces many of the same IT needs and challenges as much larger organizations. They also have a stated goal of creating an IT infrastructure that requires essentially no on-site support, relying on hosted applications and remote infrastructure for their primary applications and systems.
Like many other small bio-techs, they have limited financial resources and are totally focused on getting their primary drug candidates through research and development and into clinical trials as quickly as possible.
Their research generates considerable amounts of data that must be stored and protected, on top of the business, computational, communications, and collaboration needs common to any organization of their size.
Additionally, like any similar company, they must also generate and safeguard the huge number of reports and documents that will be incorporated into the final FDA new drug application (NDA). While doing all of this they must also adhere to the appropriate regulatory guidelines that are a part of the life sciences industry.
So from a tactical standpoint, what has been done to move them toward a cloud/SaaS-based IT strategy?
Here are some example areas where my client has leveraged the cloud to obtain applications and systems needed to run their business:
Document Management - This is a critical function for the overall approval process. Any bio-tech must be able to provide a centralized repository where documents and files related to regulatory approval of their products can be safely stored and assembled into the actual approval applications. My client uses a SaaS-provisioned document management system that provides all of the functionality they need without the overhead and expense of buying and maintaining the system internally. There are many vendors in this space (EMC, LiveLink, etc.) that can fulfill this need.
Collaboration – Many bio-techs need to share files, collect information, and collaborate with outside organizations and agencies. Using a cloud-based application, such as SharePoint, allows my client to perform this function without the need for internal IT support.
Backup/Restore – One of the easiest functions to provision and implement. Since my client is using outside contract research organizations for their clinical trials, they do not have any HIPAA requirements for their data. To facilitate their backups we engaged one of the major vendors, and their data is now backed up into the cloud.
E-mail – This is a no-brainer. More and more companies are turning to outside vendors to provide commodity IT functions such as e-mail. Given the issues around viruses, spam, archival, and e-discovery (especially in the bio-tech industry), the costs and complexity of administering e-mail internally just cannot be justified in the current economic climate. Using a SaaS e-mail system makes perfect sense for those companies that wish to outsource their e-mail applications.
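To make the backup/restore item above concrete, here is a minimal sketch of what a nightly backup job might look like. The directory names are hypothetical, and the actual vendor my client engaged provides its own agent; this just illustrates the pattern of bundling data into a dated archive and recording a checksum so a later restore can be verified.

```python
# A sketch of a nightly cloud-backup step: archive the data directory,
# record a checksum, then hand the archive to the provider's uploader.
# Paths and names below are illustrative assumptions, not a real setup.
import hashlib
import tarfile
from datetime import date
from pathlib import Path

def make_backup_archive(data_dir: str, out_dir: str) -> Path:
    """Bundle data_dir into a dated .tar.gz archive and return its path."""
    archive = Path(out_dir) / f"backup-{date.today().isoformat()}.tar.gz"
    with tarfile.open(archive, "w:gz") as tar:
        tar.add(data_dir, arcname=Path(data_dir).name)
    return archive

def checksum(path: Path) -> str:
    """SHA-256 digest of the archive, kept so a restore can be verified."""
    h = hashlib.sha256()
    with open(path, "rb") as f:
        for chunk in iter(lambda: f.read(65536), b""):
            h.update(chunk)
    return h.hexdigest()

# Shipping the archive off-site is then a single call to the backup
# vendor's SDK or CLI; for an S3-compatible store it could look like:
#   boto3.client("s3").upload_file(str(archive), "backup-bucket", archive.name)
```

The point is how little there is to it: once the data lands in the provider's storage, retention, replication, and restore testing become the vendor's problem rather than an internal IT task.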
The result is that my client has access to the systems and applications they need to run their business without having to build and support the costly, complex infrastructure that would otherwise be required. Currently their on-site IT support needs are met by a technician on site just two days a week, and once we finish their migration those needs will shrink even further.
I am also working with another, even smaller client where we have put so much functionality into cloud-based systems that their office consists of a wireless router and printers, and that's it. All IT functions are provisioned externally, and they access their data and applications via laptops, pad devices, or cell phones, making their IT support requirements essentially non-existent.
As more and more applications and services become available in the cloud, the need for life sciences companies (or any small company, for that matter) to fund and support their own IT infrastructure will become less and less. Those that embrace putting their IT functions in the cloud will be able to direct more resources to getting their products approved and to market.
Posted by Bruce Maches - February 01, 2011 @ 9:15 AM, Pacific Standard Time
Former Director of Information Technology for Pfizer's R&D division, current CIO for BRMaches & Associates.