Pharmaceuticals use big data to expedite production

Scientists have claimed that rising populations combined with industrialization have exacerbated the prevalence of diseases and disorders around the globe. The demand for medicine is higher than ever, compelling those in the health care industry to use cloud computing tools to accelerate the production and delivery of treatments.

A new approach to discovery
Ben Fidler, a contributor to Xconomy, noted that Berg Pharma, a small biotechnology company based out of Framingham, MA, could potentially reduce drug development expenses by nearly 50 percent, according to co-founder Carl Berg. The startup has almost 200 employees and is developing research programs seeking to discover better treatments and cures for cancer, diabetes and Parkinson's disease.

Although many biotechnology companies believe they have the resources to develop a panacea for the world's ailments, Berg's characterization of the company as a "revolutionizing force" in the pharmaceutical industry may not be so far-fetched. The co-founder told the news source that Berg Pharma integrates big data analytics into each of its research programs, providing a unique, algorithmic perspective on every investigative endeavor.

Eric Schadt, director of the Icahn Institute for Genomics and Multiscale Biology at Mount Sinai in New York, initially expressed skepticism regarding Berg's confidence, but believes the corporation is headed in the right direction.

"On one hand, there's lots of literature to support the approach they're taking on the integration of the data," Schadt told Xconomy. "But what there isn't any publication on is that the predictions made, the therapeutics made, actually achieve clinical efficacy in a human population - that's the money shot."

Enhancing production 
According to InformationWeek contributor Doug Henschen, pharmaceutical firm Merck is looking to use a cloud-based data analytics program to gain a more accurate view of its vaccine manufacturing process. Producing these treatments requires assiduous monitoring to ensure that the products aren't compromised. George Llado, vice president of information technology at Merck, said that the company's interest in this kind of investment began two years ago.

Back in 2012, Merck relied on spreadsheets and manual investigative measures to determine why it was encountering abnormally high discard rates on particular vaccines. Data sources varied greatly, ranging from building management systems capturing air pressure and temperature to shop floor tags that tracked each batch of medicine.

Llado noted that manually assembling all of this data onto a single reference point was inefficient and costly. In turn, Merck's staff began experimenting with a public cloud storage service that allowed employees to organize 16 separate data sources more fluidly. As a result, the manufacturing team found conclusive answers regarding production yield variance within three months.
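The consolidation step described above, pulling separate batch-keyed sources onto a single reference point, can be sketched in a few lines. This is purely illustrative and not Merck's actual system: the source names, batch IDs and fields below are invented for demonstration, and the batch ID is assumed to be the shared key.

```python
# Hypothetical sketch: consolidating separate batch-keyed data sources
# (e.g. building management readings, shop floor tags) into one record
# per batch. All names and values here are invented for illustration.

building_mgmt = {  # environmental readings keyed by batch ID
    "B-001": {"air_pressure_kpa": 101.2, "room_temp_c": 21.5},
    "B-002": {"air_pressure_kpa": 100.9, "room_temp_c": 22.1},
}
shop_floor_tags = {  # per-batch traceability records
    "B-001": {"line": "L3", "shift": "day"},
    "B-002": {"line": "L1", "shift": "night"},
}

def consolidate(*sources):
    """Merge any number of batch-keyed sources into one record per batch."""
    merged = {}
    for source in sources:
        for batch_id, fields in source.items():
            merged.setdefault(batch_id, {}).update(fields)
    return merged

records = consolidate(building_mgmt, shop_floor_tags)
print(records["B-001"])
```

With every measurement attached to its batch ID, downstream analysis (such as comparing environmental conditions across high- and low-yield batches) becomes a straightforward query rather than a manual spreadsheet exercise.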

"Through 15 billion calculations and more than 5.5 million batch-to-batch comparisons, Merck discovered that certain characteristics in the fermentation phase of vaccine production were closely tied to yield in a final purification step," noted Henschen. 

Had the company stuck with its previous techniques, it's unlikely it would have reached those conclusions as quickly, if at all.
