
Healthcare IT and the Cloud

Moving healthcare data into a Cloud Ecosystem

Over the last few weeks I've been hearing a lot of discussion around HIPAA. When we speak about HIPAA, the two components of data security and data privacy invariably arise.

In traditional data centers, database managers and data owners know where their data resides and can implement the necessary processes to preserve privacy and audit access.

When we move to the cloud, however, the servers, network, and storage are abstracted. This raises the concern that data owners may no longer know where their data sets physically reside, and that Cloud Service Provider (CSP) employees will be handling confidential patient data or Personally Identifiable Information (PII).

When it comes to leveraging the cloud ecosystem for healthcare, the foremost concerns are compliance with HIPAA and the HITECH Act and the meaningful use provisions.

So what is meaningful use? According to HealthIT.gov:

"Meaningful use is the set of standards defined by the Centers for Medicare & Medicaid Services (CMS) Incentive Programs that governs the use of electronic health records and allows eligible providers and hospitals to earn incentive payments by meeting specific criteria."

The goal of meaningful use is to promote the spread of Electronic Health Records (EHR) to improve health care in the United States.

Benefits of meaningful use of EHRs include:

  • Complete and accurate information.
  • Better access to information.
  • Patient empowerment.

In the healthcare world, organizations are positioning themselves to attain meaningful use, both to capture the incentives allocated by the federal government and to ensure that reimbursements are not jeopardized for providers who fall out of line with the meaningful use provisions.

As healthcare practitioners and organizations increase their use of technology in delivering clinical care, their IT departments face additional pressure to provide availability on demand and to operate data centers approaching 99.999 percent availability. In most cases this is a major challenge, and falling short carries the risk of unscheduled outages and costly remediation.
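To put that figure in perspective, a quick back-of-the-envelope calculation translates an availability target into the downtime it actually permits. The short Python sketch below is purely illustrative arithmetic, not a vendor SLA definition.

```python
# Illustrative sketch: translate an availability target into a downtime budget.
# Simple arithmetic only; these are not SLA terms from any particular provider.

MINUTES_PER_YEAR = 365 * 24 * 60          # 525,600 minutes
MINUTES_PER_MONTH = MINUTES_PER_YEAR / 12

def downtime_budget(availability: float):
    """Return (allowed downtime minutes per year, per month) for a target."""
    unavailability = 1.0 - availability
    return unavailability * MINUTES_PER_YEAR, unavailability * MINUTES_PER_MONTH

for target in (0.999, 0.9999, 0.99999):
    per_year, per_month = downtime_budget(target)
    print(f"{target:.3%} availability -> {per_year:6.1f} min/year, {per_month:5.2f} min/month")
```

Five nines leaves barely five minutes of unplanned downtime in an entire year, which is what makes the target so hard to meet from a single data center.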

Assuring high availability for healthcare applications means meeting those uptime requirements, and in today's environments that requires access to more than one data center. This can significantly increase the overall capital investment in data center infrastructure for healthcare organizations.

Looking to the cloud is not only the next step in service delivery; it can also support high availability of clinical applications while allowing a healthcare organization to leverage the expertise and financial stability of an established CSP. Another advantage of a cloud ecosystem is rapid provisioning and deployment, with the ability to change compute capacity as demand changes.

Thus, in the event of failure, server instances can be moved seamlessly to alternate hosts, or they can be clustered in advance to provide redundancy.
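As a rough illustration of what that redundancy buys from the consumer side, the minimal sketch below checks a primary endpoint's health and falls back to a standby. The hostnames, health-check path, and timeout are hypothetical placeholders, not any particular CSP's failover mechanism.

```python
# Minimal failover sketch: try a primary endpoint, fall back to a standby.
# The URLs and timeout are placeholders, not a specific provider's configuration.
import urllib.request
import urllib.error

ENDPOINTS = [
    "https://primary.example-ehr.internal/health",
    "https://standby.example-ehr.internal/health",
]

def first_healthy(endpoints, timeout=2.0):
    """Return the first endpoint that answers its health check, else None."""
    for url in endpoints:
        try:
            with urllib.request.urlopen(url, timeout=timeout) as resp:
                if resp.status == 200:
                    return url
        except (urllib.error.URLError, OSError):
            continue  # endpoint unreachable or timed out; try the next one
    return None

active = first_healthy(ENDPOINTS)
print("routing traffic to:", active or "no healthy endpoint")
```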

Some may ask whether it is risky to transfer data from a site to the cloud. For the transfer itself the answer is largely no, as the majority of organizations move data over the Internet via encrypted channels. Where concerns arise is with the hand-off of data into the CSP environment.

In a seamless environment, all data will have site-to-site encryption up to and including storage. Where we can see some separation is in the level of support offered by healthcare application vendors.
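One way to close that hand-off gap is to encrypt records client-side before they ever leave the organization, so the CSP only ever stores ciphertext. The minimal sketch below assumes the third-party Python cryptography package and a deliberately simplified key-handling model; the field names are illustrative, not a complete design.

```python
# Hypothetical sketch: encrypt a record before handing it to a CSP.
# Requires the third-party 'cryptography' package (pip install cryptography).
# Key handling is simplified; a real deployment would use a KMS or HSM.
import json
from cryptography.fernet import Fernet

key = Fernet.generate_key()      # in practice: fetched from a key-management service
cipher = Fernet(key)

record = {"patient_id": "example-123", "note": "illustrative data only"}
plaintext = json.dumps(record).encode("utf-8")

ciphertext = cipher.encrypt(plaintext)        # this is all the provider needs to store
restored = json.loads(cipher.decrypt(ciphertext))

assert restored == record
print(len(ciphertext), "bytes of ciphertext handed to the provider")
```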

In the cloud, it is a given that a number of people have access to the physical servers and storage over which cloud consumers have no control. For an IT security person this raises an uncomfortable tension: complete control is presumed to be relinquished, and assurance can only come from prescriptive precautions defined by the CSP.

The cloud computing ecosystem is still evolving, and as such there is still a lack of industry-wide certifications. As the ecosystem matures, the intent is to drive toward processes, best practices, and certifications that provide legal protection and reduce the complexity of long negotiations and elaborate SLA requirements.

Within a regular data center, or even a small IT shop, one of my first expectations as an IT security leader is some form of centralized logging with automation. Carrying that mindset into the cloud ecosystem (clouds are, after all, data centers), security leaders at healthcare customers expect the assurance that detailed reporting is a given.

Having worked on security strategy and assessments for a few cloud computing projects, I have seen first-hand that access rights were a major focus. In light of this, it is not a complex process to segment solutions for healthcare. As a result, any access to servers and storage dedicated to a healthcare customer by anyone within a CSP organization can be logged, providing assurance of controls around access.
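As an illustration of what that logging and reporting mindset can look like in practice, the minimal sketch below emits a structured audit record for every access to a tenant's storage, using Python's standard logging module. The field names and log destination are assumptions for illustration, not any CSP's actual schema.

```python
# Illustrative audit-logging sketch: every access to a tenant's storage emits a
# structured record that a centralized log pipeline can collect and report on.
# Field names and the log destination are placeholders, not a CSP's actual schema.
import json
import logging
from datetime import datetime, timezone

audit = logging.getLogger("storage.audit")
audit.setLevel(logging.INFO)
audit.addHandler(logging.StreamHandler())  # in practice: a handler shipping to central logging

def log_access(actor: str, tenant: str, resource: str, action: str, allowed: bool) -> None:
    """Emit one structured audit entry for an access attempt."""
    entry = {
        "timestamp": datetime.now(timezone.utc).isoformat(),
        "actor": actor,        # CSP employee or service account
        "tenant": tenant,      # the healthcare customer whose data was touched
        "resource": resource,
        "action": action,
        "allowed": allowed,
    }
    audit.info(json.dumps(entry))

log_access("csp-admin-42", "example-hospital", "volume/ehr-prod-01", "read", allowed=True)
```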

From a legal perspective, and more specifically in contracts, healthcare customers expect provisions for strong financial penalties to indemnify against a breach as well as to hold the CSP accountable.

Some CSPs are moving to provide a HIPAA Business Associate Agreement (BAA) for their healthcare customers. The BAA demonstrates that the provider meets the compliance requirements of the HIPAA and HITECH Acts, enabling the physical, technical, and administrative safeguards they require.

In closing, HIPAA compliance and cloud computing do not have to be in conflict. Rather, healthcare entities can leverage the benefits of the cloud, coupled with the necessary due diligence and legal contracts, to meet their needs.
