The Cure for the Common Cloud-Based Big Data Initiative

Understanding how to work with Big Data

There is no doubt that Big Data holds enormous promise for a range of industries. Better visibility into data across various sources enables everything from saving electricity, to improving agricultural yields, to placing ads more effectively on Google. But when it comes to deriving value from data, no industry has been doing it as long or with as much rigor as clinical researchers.

Unlike other markets that are delving into Big Data for the first time and don't know where to begin, drug and device developers have spent years refining complex processes for asking very specific questions with clear purposes and goals. Whether designing an effective and safe treatment for cholesterol, or collecting and mining data to understand the proper dosage of cancer drugs, life sciences has had to dot every "i" and cross every "t" to keep people safe and to get new therapies past the FDA. Other industries are now marveling at a new ability to uncover efficiencies and cost savings, but - with less rigorous processes in place - they are often shooting in the dark or only scratching the surface of what Big Data offers.

Drug developers today are standing on the shoulders of those who created, tested and secured FDA approval for treatments involving millions of data points (for one drug alone!) without the luxury of the cloud or sophisticated analytics systems. These systems have the potential to make the best data-driven industry even better. This article will outline key lessons and real-world examples of what other industries can and should learn from life sciences when it comes to understanding how to work with Big Data.

What Questions to Ask, What Data to Collect
In order to gain valuable insights from Big Data, two absolute requirements must be met - understanding what questions to ask and knowing what data to collect. The two are symbiotic, and mastering both is difficult, requiring domain expertise as well as practical experience.

In order to know what data to collect, you first must know the types of questions you're going to want to ask - often a chicken-and-egg problem. With appropriate planning and experience-based guesses, you can usually make educated assumptions. The trick is to collect enough data to answer your questions, but not so much that you can no longer distill the specific subset that answers them. Explicit or inherent costs can also prevent you from collecting all possible data, in which case you need to select carefully which areas to collect data about.

Let's take a look at how this is done in clinical trials. Say you're designing a clinical study that will analyze cancer data. You may not have specific questions when the study is being designed, but it's reasonable to assume you'll want to collect readings commonly affected by that type of cancer and by whatever body system it involves, so that you have the right information to analyze when the time comes.

You may also want to collect data that is unrelated to the specific disease but that subsequent questions will likely require, such as demographics and any medications the patient is taking apart from the treatment under study. During post-study data analysis, questions about these areas often arise even though they weren't apparent at the outset, which is why clinical researchers have adopted common processes for collecting demographics and concomitant medications. Through planning and experience, you can also identify areas that don't need to be collected for a given study. For example, if you're studying lung cancer, collecting cognitive function data is probably unnecessary.
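To make that concrete, here is a minimal sketch of such a collection plan in Python. The field names and groupings are hypothetical, chosen purely for illustration; real trials define these domains through standards such as CDISC.

```python
# Illustrative study data dictionary: disease-specific readings plus the
# demographics and concomitant-medication fields that post-study questions
# commonly require. All names are hypothetical.
STUDY_FIELDS = {
    "oncology": [          # readings for a lung-cancer study
        "tumor_size_mm",
        "lesion_count",
        "ecog_performance_status",
    ],
    "demographics": [      # almost always needed for later subgroup questions
        "age", "sex", "race", "smoking_history",
    ],
    "concomitant_meds": [  # non-study medications the patient is taking
        "drug_name", "dose", "start_date", "end_date",
    ],
    # Deliberately omitted: cognitive-function batteries, which planning
    # and experience suggest won't be queried in a lung-cancer study.
}
```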

How can other industries anticipate what questions to ask, as is done in life sciences? Start by determining a predefined set of questions that relate directly to the goal of the data analysis. Since you won't know all of the questions until after data collection has started, it's important to 1) know the domain, and 2) collect whatever data you'll need to answer the likely questions that could come up.

Clinical researchers have also learned that questions can be discovered automatically. Data mining techniques can uncover statistically significant connections, which in effect raise questions that can be explored in more detail afterwards. An analysis can be planned before data is collected but not actually run until afterwards (or even during collection), provided the appropriate data is captured.
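As an illustration of this kind of automated screening, here is a minimal sketch in Python, assuming the collected data is categorical and lives in a pandas DataFrame. It tests every pair of columns for association and surfaces the statistically significant pairs as candidate questions; the function is illustrative, not any specific product's method.

```python
# Screen all pairs of categorical variables for association; each
# significant pair is a candidate question to explore afterwards.
from itertools import combinations

import pandas as pd
from scipy.stats import chi2_contingency

def candidate_questions(df: pd.DataFrame, alpha: float = 0.05) -> list:
    """Return (col_a, col_b, p) pairs that survive a Bonferroni bar."""
    pairs = list(combinations(df.columns, 2))
    threshold = alpha / len(pairs)          # crude multiple-testing correction
    findings = []
    for a, b in pairs:
        table = pd.crosstab(df[a], df[b])   # contingency table for the pair
        _, p, _, _ = chi2_contingency(table)
        if p < threshold:
            findings.append((a, b, p))      # "is a related to b?" is worth asking
    return sorted(findings, key=lambda f: f[2])
```

The output is a list of questions worth asking, not a list of conclusions; each connection still has to be explored by someone who knows the domain.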

One other area that has proven extremely important to collect is metadata - data about the data, such as when it was collected, where, with what instrumentation, and what calibration information was available. All of this can be used later to answer potentially important questions. Maybe a specific instrument was incorrectly configured and all the data it recorded is invalid. If you're running an ad network, maybe a specific web site where your ads run is gaming the system to get you to pay more. If you're running a minor league team, maybe a particular referee is biased, which you can address in subsequent games. Or, if you're plotting oil reserves in the Gulf of Mexico, maybe certain exploratory vessels are taking advantage of you. In all of these cases, without the appropriate metadata it would be impossible to know where the real problems reside.
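A minimal sketch of the first scenario, assuming readings and their instrument metadata live together in a pandas DataFrame (all names here are hypothetical):

```python
# Because each reading carries instrument metadata, data from an
# instrument later found to be misconfigured can be excluded in one step.
import pandas as pd

readings = pd.DataFrame({
    "subject_id":    [101, 102, 103, 104],
    "glucose_mgdl":  [98, 310, 102, 95],
    "instrument_id": ["A-17", "B-42", "A-17", "A-17"],  # metadata
    "collected_at":  pd.to_datetime(["2013-03-01", "2013-03-01",
                                     "2013-03-02", "2013-03-02"]),
})

bad_instruments = {"B-42"}  # found to be incorrectly configured after the fact
valid = readings[~readings["instrument_id"].isin(bad_instruments)]
print(valid)                # analysis proceeds on trustworthy rows only
```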

Identifying Touch Points to Be Reviewed Along the Way
There are ways to specify up front which types of analysis can be performed while data is still being collected - analyses that can affect either how data will continue to be collected or the outcome as a whole.

For example, some clinical studies run what's called an interim analysis while the study is in progress. These interim analyses are planned in advance, and the courses of action that can follow them are well defined, so the final results remain statistically usable. This is called an adaptive clinical trial, and a great deal of research is under way to find more effective and useful ways to run them in the future. The most important aspect is preventing bias, something the pharmaceutical community has understood and tested over the past several decades: simply knowing what's happening during the course of a trial, or how it affects the desired outcome, can itself bias the results.
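One concrete, deliberately simplified illustration is the Haybittle-Peto rule: stop at an interim look only on overwhelming evidence, so the final analysis can still be run at close to the full significance level. The sketch below uses simulated data and is an illustration of the idea, not a substitute for a real group-sequential design:

```python
# A pre-specified interim look: the stopping boundary is fixed before the
# study starts, which is what keeps the final results statistically valid.
import numpy as np
from scipy.stats import ttest_ind

rng = np.random.default_rng(0)
treatment = rng.normal(1.0, 2.0, size=50)  # simulated interim data
control   = rng.normal(0.0, 2.0, size=50)

INTERIM_ALPHA = 0.001  # Haybittle-Peto boundary, chosen up front
_, p = ttest_ind(treatment, control)

if p < INTERIM_ALPHA:
    print(f"p={p:.4g}: boundary crossed; stop the trial per the pre-specified plan")
else:
    print(f"p={p:.4g}: continue; do NOT act on (or even circulate) the estimate")
```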

The other key factor is making the touch points accessible to everybody who needs the data. For example, if you have a person in the field, it's important that he or she can access the data in an easily consumable format - maybe through an iPad or an existing intranet portal. Similarly, if you have an executive who needs to understand something at a high level, getting it to them in an easily consumable executive dashboard is extremely important.

As the life sciences industry has learned, if the distribution channels of the analytics aren't seamless and frictionless, then they won't be utilized to their fullest extent. This is where cloud-based analytics become exceptionally powerful - the cloud makes it much easier to integrate analytics into every user's day. Once each user gets the exact information they need, effortlessly, they can then do their job better and the entire organization will work better - regardless of how and why the tools are being used.

Augmenting Human Intuition
Think about the different types of tools people use on a daily basis. People use screwdrivers to turn screws, cars to get places faster and word processors to write. Sure, we can use our hands or walk, but we're far more efficient when we use tools.

Cloud-based analytics is a tool that enables everybody in an organization to perform more efficiently and effectively. The first example of this type of augmentation in the life sciences industry is alerting. Users tell the system what they want to see - setting rules for the data they care about - and the tools keep watch, notifying them by email or text message the moment matching data becomes available.
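A minimal sketch of such rule-based alerting, with hypothetical rule conditions and a print statement standing in for a real email or text-message integration:

```python
# Users register standing rules up front; every incoming record is checked
# against them, and matches trigger a notification.
from typing import Callable

Rule = Callable[[dict], bool]

ALERT_RULES: list[tuple[str, Rule]] = [
    ("site_7_enrollment", lambda r: r.get("site") == 7 and r.get("event") == "enrolled"),
    ("high_alt_reading",  lambda r: r.get("alt_u_per_l", 0) > 120),
]

def notify(user: str, rule_name: str, record: dict) -> None:
    print(f"ALERT to {user}: {rule_name} matched {record}")  # stand-in for email/SMS

def ingest(record: dict, subscriber: str = "data.manager@example.com") -> None:
    for name, predicate in ALERT_RULES:
        if predicate(record):  # evaluate every standing rule on arrival
            notify(subscriber, name, record)

ingest({"site": 7, "event": "enrolled", "subject_id": 12})
ingest({"subject_id": 9, "alt_u_per_l": 151.0})
```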

Another area the pharmaceutical industry has thoroughly explored is data-driven collaboration. In the clinical trial process there are many different groups of users: those who physically collect the data (investigators), those who review it to make sure it's clean (data managers), and those who sit between the two (clinical monitors). There are many other types of users, but this subset illustrates the point. Each group serves a particular purpose in the overall collection of data and the success of the study. When data looks problematic or unclean, the data managers flag it for review, and the clinical monitors act on those flags.
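Here is a minimal sketch of that flag-and-review loop; the roles, checks and thresholds are illustrative rather than those of any particular clinical data management system:

```python
# Automated checks raise queries on suspect records; data managers confirm
# them and clinical monitors work the resulting queue.
from dataclasses import dataclass, field

@dataclass
class Query:
    record_id: int
    reason: str
    status: str = "open"  # open -> confirmed -> resolved

@dataclass
class ReviewQueue:
    queries: list = field(default_factory=list)

    def flag(self, record_id: int, reason: str) -> None:
        self.queries.append(Query(record_id, reason))  # data manager flags a record

    def open_items(self) -> list:
        return [q for q in self.queries if q.status == "open"]  # monitor's worklist

queue = ReviewQueue()
record = {"id": 42, "visit_date": None, "weight_kg": 540}

if record["visit_date"] is None:
    queue.flag(record["id"], "missing visit date")
if record["weight_kg"] > 300:
    queue.flag(record["id"], "implausible weight")

for q in queue.open_items():  # the clinical monitor acts on each open query
    print(q)
```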

What's unique about the way life sciences handles this is that complex systems and rules have been set up to make sure the whole process runs well. The tools built around these processes augment human intuition through alerting, automated dissemination and automatic feedback. The questions aren't necessarily known at the beginning of a trial, but as data is collected new questions evolve, and the tools and processes in place are built to handle that changing landscape.

No matter what the purpose of Big Data analytics, any organization can benefit from treating cloud-based analytics as a tool that must be continually adjusted and refined to meet the needs of its users.

Ongoing Challenges of Big Data Analytics
Given this history with data, one would expect drug and device developers to be light years ahead when it comes to leveraging Big Data technologies - especially since the collection and analysis of clinical data is often a matter of life and death. But while they have far more experience with data, the truth is that life sciences organizations are only now starting to integrate analytics technologies that will enable them to work with that data in new, more efficient ways - no longer requiring billions of dollars a year, countless statisticians, archaic methods and, if we're being honest, brute force. As new technology becomes available, the industry's use of data will only become more seamless. In the meantime, other industries looking to wrap their heads around the Big Data challenge should look to life sciences as the starting point for best practices: understanding how and when to ask the right questions, monitoring data along the way, and selecting tools that improve the user experience.

More Stories By Rick Morrison

Rick Morrison is CEO and co-founder of Comprehend Systems. Prior to Comprehend Systems, he was the Chief Technology Officer of an Internet-based data aggregator, where he was responsible for product development and operations. Prior to that, he was at Integrated Clinical Systems, where he led the design and implementation of several major new features. He also proposed and led a major infrastructure redesign, and introduced new, streamlined development processes. Rick holds a BS in Computer Science from Carnegie Mellon University in Pittsburgh, Pennsylvania.
