February 28, 2011
Pharmaceutical research has always been a time-consuming and expensive endeavor, with each new therapeutic advance requiring millions of dollars and years of effort. As I have mentioned in prior blogs, all sectors of the life science community are looking for ways to reduce costs, cut IT complexity, and speed time to market. As the research being done becomes more complex and the amount of data generated grows, these companies simply cannot afford to keep expanding their IT infrastructure and application environments to meet every demand placed upon them.
The advent of genomic sequencing alone has had a dramatic impact on the amount of data being created. Overall, it is estimated that data storage demands at life science companies are doubling about every three months. Companies with hard-pressed research budgets cannot afford to keep throwing money at the problem, let alone at the associated burdens of storage management, backup/recovery, and disaster recovery planning.
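To put that growth rate in perspective, here is a back-of-the-envelope sketch in Python. The starting capacity of 100 TB is purely illustrative; the only input taken from this post is the doubling-every-three-months estimate.

```python
# Back-of-the-envelope projection of storage growth, assuming the
# doubling-every-three-months figure cited above (an illustration,
# not a measured forecast).

initial_tb = 100            # hypothetical starting capacity in terabytes
doubling_period_months = 3  # estimate cited in the post

for year in range(1, 5):
    doublings = (year * 12) // doubling_period_months
    projected_tb = initial_tb * (2 ** doublings)
    print(f"Year {year}: ~{projected_tb:,} TB ({2 ** doublings}x the starting capacity)")
```

Even under this rough assumption, capacity needs grow roughly 16-fold per year, which is why simply buying more storage is not a sustainable answer.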
The same holds true for the types and complexity of the applications being used. Systems for complex data modeling, gene sequencing, target identification, and computational chemistry place an ever-increasing support burden on the IT department. Many of these applications require significant hardware investments and experienced support staff to maintain in-house, not to mention the expense and effort required to validate and implement them to meet FDA regulatory requirements.
In a prior blog I discussed how IaaS is helping many life science organizations deal with some of these infrastructure demand and support issues. In this entry we will take a high-level look at SaaS and how these organizations can leverage it to meet their research objectives in a more cost-effective and timely manner.
Many life science organizations are taking a harder look at SaaS as a means of provisioning the complex applications their research requires. Many vendors already offer specialized life science research applications via the SaaS model, and more industry vendors are moving in this direction. Life science companies can take advantage of systems in functional areas such as:
- genomic sequencing
- molecular modeling
- toxicology studies
- drug stability
- clinical trials
- document management
- regulatory submissions
What advantages can these companies realize from utilizing SaaS-based applications?
- No hardware is required, as storage and compute resources are provisioned at the vendor's data center
- Many of these applications come already validated, or, because a portion of the application environment is pre-validated, the validation process can be completed much more quickly
- Spending less time acquiring and implementing applications can help a life science company shorten time to market for new products
- Professional data centers can usually provide much better security than in-house environments
- The vendor takes care of all support requirements, patching, and administrative duties
- The primary disaster recovery responsibilities are also handled by the vendor
Of course nothing is perfect, so what are some of the downsides of following a SaaS-based strategy?
- Some companies are concerned about the security of their data in a shared environment
- Depending on a SaaS-based application that is critical to your business requires extra care when negotiating contracts and service level agreements
- Using off-premise applications may introduce latency issues (see the latency-check sketch after this list)
- You never own the application, so you pay for it for as long as you use it
- There may be regulatory issues with storing patient data from clinical trials or other private information in a third-party data center
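For the latency concern above, a quick round-trip measurement can put numbers on the discussion before you commit to a vendor. The sketch below is a minimal example in Python; the endpoint URL is a placeholder for whatever hosted application you are evaluating, and it measures only simple HTTP round trips, not application-level response times.

```python
# Minimal round-trip latency check against a SaaS endpoint.
# The URL below is a hypothetical placeholder; substitute the
# health or login page of the hosted application under evaluation.

import time
import urllib.request

ENDPOINT = "https://example-saas-vendor.com/health"  # hypothetical endpoint
SAMPLES = 10

timings_ms = []
for _ in range(SAMPLES):
    start = time.perf_counter()
    try:
        urllib.request.urlopen(ENDPOINT, timeout=10).read()
    except OSError:
        continue  # skip failed requests rather than abort the test
    timings_ms.append((time.perf_counter() - start) * 1000)

if timings_ms:
    print(f"Samples: {len(timings_ms)}")
    print(f"Average round trip: {sum(timings_ms) / len(timings_ms):.1f} ms")
    print(f"Worst round trip:   {max(timings_ms):.1f} ms")
else:
    print("No successful requests; check connectivity to the endpoint.")
```

Running a check like this from each research site, at different times of day, gives a realistic picture of whether an off-premise application will feel responsive to end users.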
While all of these factors require consideration, with careful planning and foresight life science companies can use SaaS-based systems to reduce costs and complexity while enhancing the R&D process and speeding time to market for new therapies.
Posted by Bruce Maches - February 28, 2011 @ 12:27 PM, Pacific Standard Time
Former Director of Information Technology for Pfizer's R&D division, current CIO for BRMaches & Associates.