
A Cloud Computing Business Intelligence Organization

Moving data warehouses to the cloud

Data Warehousing As A Cloud Candidate
Over the past year, major vendors have shown steadily growing support for the cloud, and it is clear the cloud is here to stay. More importantly, the path is now clearly drawn for enterprises to adopt the cloud. With this in mind, it is time to identify which existing data center applications are good candidates for migration to the cloud.

Most major IT vendors predict that hybrid delivery will be the future: enterprises will need a delivery model in which certain workloads run on clouds while others remain in data centers, together with a way to integrate the two.

Before going further into a blueprint of how data warehouses fit within a hybrid cloud environment, let us look at the salient features of data warehouses and at the cloud tenets that make them a very viable workload to move to the cloud.

A data warehouse is a subject-oriented, integrated, time-variant and non-volatile collection of data in support of management's decision-making process.

Data Warehousing Usage and Cloud Tenet Value Proposition

Each data warehousing usage pattern below is paired with the cloud value proposition that addresses it.

Usage: The ETL (extract, transform, load) process is subject to variable load patterns. Typically, large files arrive over the weekend or at night to be processed and loaded.

Cloud value proposition: It is better to consume compute resources on demand, as the ETL process requires them, than to maintain a fixed capacity.

Usage: OLAP (online analytical processing) and the related processing for MOLAP (multidimensional OLAP) and/or ROLAP (relational OLAP) are highly compute-intensive and require strong processing capacity.

Cloud value proposition: High-performance computing and the ability to scale up on demand are cloud tenets highly aligned with this need.

Usage: Physical architecture needs are complex in a data warehousing environment:

  • MPP Servers (Massively Parallel Processing)
  • Shared Nothing Data Architecture
  • Mirrored Copies of Disk Space
  • High Availability Clustering

Cloud value proposition: Most IaaS and PaaS offerings, such as the Azure platform and Amazon EC2, have built-in provisions for a highly available architecture, with most of the day-to-day administration abstracted from the enterprise.

The following are some of the advantages of the SQL Azure platform:

  • No physical administration required: software installation and patching are included, as this is a platform as a service (PaaS)
  • High availability and fault tolerance are built in

Usage: A data warehousing environment has multiple software and platform needs:

  • Database Design Tools (STAR Schema Modeling)
  • ETL Tools
  • Data Cleansing Tools
  • OLAP Tools
  • Spatial Tools
  • Data Mining Tools
  • BI Reporting Tools

Cloud value proposition: The product stack of a data warehousing environment is huge, and most organizations find it difficult to arrive at an ideal list of software, platforms and tools for their BI platform. SaaS for applications such as data cleansing or address validation, and PaaS for reporting such as Microsoft SQL Azure Reporting, are ideal for cutting through the tools-and-platform maze.

 

The following are the ideal steps for migrating an on-premise data warehouse system to a cloud platform. For the sake of the case study, the Microsoft Windows Azure platform is chosen as the target platform.

1. Create Initial Database / Allocate Storage / Migrate Data
The STAR schema design of the existing data warehousing system can be migrated to the cloud platform as it is, and migrating it to a relational database platform like SQL Azure should be straightforward. To migrate the data, the initial storage allocation of the existing database in the data center needs to be calculated, and the same amount of storage is allocated on the cloud. A minimal sketch of recreating the schema follows.
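
As a hedged illustration of this step, the sketch below recreates a tiny STAR schema on SQL Azure using Python and pyodbc; the server name, credentials, and table design are placeholder assumptions rather than details from the original system.

```python
# A hedged sketch, not the article's code: recreating a simple STAR schema on
# SQL Azure with pyodbc. Server, credentials, and table design are placeholders.
import pyodbc

conn = pyodbc.connect(
    "DRIVER={SQL Server};"
    "SERVER=myserver.database.windows.net;"
    "DATABASE=salesdw;"
    "UID=dwadmin@myserver;"   # SQL Azure logins use the user@server form
    "PWD=<password>"
)
cur = conn.cursor()

# Dimension table: descriptive attributes referenced by the fact table.
cur.execute("""
    CREATE TABLE DimProduct (
        ProductKey  INT PRIMARY KEY,
        ProductName NVARCHAR(100) NOT NULL
    )
""")

# Fact table: measures keyed to the dimension, the same layout as on-premise.
cur.execute("""
    CREATE TABLE FactSales (
        SalesKey   BIGINT PRIMARY KEY,
        ProductKey INT NOT NULL REFERENCES DimProduct(ProductKey),
        SaleDate   DATE NOT NULL,
        Amount     DECIMAL(18, 2) NOT NULL
    )
""")
conn.commit()
```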

You can store any amount of data, from kilobytes to terabytes, in SQL Azure. However, individual databases are limited to 10 GB in size. To create solutions that store more than 10 GB of data, you must partition large data sets across multiple databases and use parallel queries to access the data.
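
As a rough sketch of that partitioning pattern (with hypothetical shard names and schema), the same aggregate query can be issued to each member database in parallel and the partial results combined in the application:

```python
# A hedged sketch of the partitioning approach described above: the same query
# is fanned out in parallel across several member databases (one per <=10 GB
# partition) and the partial results are combined. Names are assumptions.
from concurrent.futures import ThreadPoolExecutor
import pyodbc

SHARDS = ["salesdw_0", "salesdw_1", "salesdw_2"]  # one database per partition

def shard_total(db_name):
    conn = pyodbc.connect(
        "DRIVER={SQL Server};SERVER=myserver.database.windows.net;"
        "UID=dwadmin@myserver;PWD=<password>;DATABASE=" + db_name
    )
    # Each shard holds a disjoint slice of FactSales, so partial sums add up.
    row = conn.cursor().execute("SELECT SUM(Amount) FROM FactSales").fetchone()
    return row[0] or 0

with ThreadPoolExecutor(max_workers=len(SHARDS)) as pool:
    print("grand total:", sum(pool.map(shard_total, SHARDS)))
```

Because cross-database queries are not available here, the fan-out and the final combination both happen in the application tier in this sketch.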

Once a highly scalable database infrastructure is set up on the SQL Azure platform, the following are some of the methods by which data from the existing on-premise data warehouses can be moved to SQL Azure.

Traditional BCP Tool: BCP is a command-line utility that ships with Microsoft SQL Server. It bulk copies data between SQL Azure (or SQL Server) and a data file in a user-specified format. The bcp utility that ships with SQL Server 2008 R2 is fully supported by SQL Azure. You can use BCP to back up and restore your data on SQL Azure, to import large numbers of new rows into SQL Azure tables, or to export data out of tables into data files.
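
For illustration, a bulk load with bcp can also be scripted. The sketch below drives the utility from Python, with the server, login, and file names as placeholder assumptions:

```python
# A hedged example of driving the bcp utility from Python to bulk load a data
# file into a SQL Azure table. Server, login, and file names are placeholders;
# bcp must be installed with the SQL Server client tools.
import subprocess

subprocess.run(
    [
        "bcp", "salesdw.dbo.FactSales", "in", "factsales.dat",
        "-c",                                   # character-format data file
        "-S", "myserver.database.windows.net",  # SQL Azure server
        "-U", "dwadmin@myserver",               # login in user@server form
        "-P", "<password>",
    ],
    check=True,  # raise CalledProcessError if the bulk copy fails
)
```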

The following tools are also useful if your existing data warehouse is in SQL Server within the data center.

You can transfer data to SQL Azure by using SQL Server 2008 Integration Services (SSIS). SQL Server 2008 R2 or later supports the Import and Export Data Wizard and bulk copy for the transfer of data between an instance of Microsoft SQL Server and SQL Azure.

SQL Server Migration Assistant (SSMA for Access v4.2) supports migrating your schema and data from Microsoft Access to SQL Azure.

2. Set Up ETL & Integration with Existing On-Premise Data Sources
After the initial load of the data warehouse on the cloud, it needs to be continuously refreshed with operational data. This process extracts data from different data sources (such as flat files, legacy databases, RDBMSs, and ERP, CRM and SCM application packages).

This process also carries out necessary transformations, such as joining tables, sorting, and applying various filters, as in the sketch below.
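
A minimal, tool-agnostic sketch of such a transform step (with assumed file names and columns) might look like this:

```python
# An illustrative sketch of one transform step, not a specific vendor tool:
# join two extracted flat files, filter out bad rows, and sort before loading.
# File names and columns are assumptions.
import csv

with open("products.csv", newline="") as f:
    # ProductKey -> ProductName lookup built from the product extract.
    products = {r["ProductKey"]: r["ProductName"] for r in csv.DictReader(f)}

with open("sales.csv", newline="") as f:
    sales = list(csv.DictReader(f))  # columns: ProductKey, SaleDate, Amount

# Join, drop non-positive amounts, then sort by date for an ordered load.
transformed = sorted(
    (
        {**row, "ProductName": products.get(row["ProductKey"], "UNKNOWN")}
        for row in sales
        if float(row["Amount"]) > 0
    ),
    key=lambda row: row["SaleDate"],
)

with open("load_ready.csv", "w", newline="") as f:
    writer = csv.DictWriter(
        f, fieldnames=["ProductKey", "ProductName", "SaleDate", "Amount"]
    )
    writer.writeheader()
    writer.writerows(transformed)
```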

The following are typical options available on the SQL Azure platform for building an ETL pipeline between on-premise data sources and a data warehouse hosted on the cloud. The tools mentioned above for the initial load of the data also hold good as ETL tools, but they are not repeated here to avoid duplication.

SQL Azure Data Sync supports:

  • Cloud-to-cloud synchronization
  • Enterprise (on-premise) to cloud
  • Cloud to on-premise
  • Bi-directional, sync-to-hub, or sync-from-hub synchronization

The following diagram, courtesy of the vendor, gives an overview of how SQL Azure Data Sync can be used for ETL purposes.

Windows Azure AppFabric Integration provides common BizTalk Server integration capabilities (e.g., pipeline, transforms, adapters) on Windows Azure, using out-of-box integration patterns to accelerate and simplify development. It also delivers higher-level business user enablement capabilities such as Business Activity Monitoring and Rules, as well as a self-service trading partner community portal and provisioning of business-to-business pipelines. The following diagram, courtesy of the vendor, shows how Windows Azure AppFabric Integration can be used as an ETL platform.

3. Create Cubes & Other Analytics Structures
The multidimensional nature of OLAP requires an analytical engine to process the underlying data and create a multidimensional view. The success of OLAP has resulted in a large number of vendors offering OLAP servers built on different architectures.

MOLAP: Multidimensional OLAP uses a proprietary multidimensional database, with an emphasis on performance.

ROLAP: Relational OLAP provides sophisticated multidimensional analysis performed on open relational databases, and can scale to large data sets in the terabyte range.

HOLAP: Hybrid OLAP combines features of MOLAP and ROLAP technology.

SQL Azure Database does not support all of the features and data types found in SQL Server. Analysis Services, Replication, and Service Broker are not currently provided as services on the Windows Azure platform.

At this time there is no direct support for OLAP and cube processing on SQL Azure; however, with the HPC (high-performance computing) attributes of multiple worker roles, manual aggregation of the data can be achieved, as sketched below.
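
A simplified sketch of that approach, with an assumed row layout standing in for the real fact data:

```python
# A hedged sketch of the manual-aggregation idea: each "worker role" computes a
# partial group-by over its slice of the fact rows, and the partials are merged
# into a cube-like rollup. The data layout and grouping keys are assumptions.
from collections import Counter
from concurrent.futures import ProcessPoolExecutor

def partial_rollup(rows):
    # rows: (product, month, amount) tuples assigned to one worker.
    totals = Counter()
    for product, month, amount in rows:
        totals[(product, month)] += amount
    return totals

if __name__ == "__main__":
    slices = [  # in practice, each worker role would read its own partition
        [("widget", "2011-01", 10.0), ("widget", "2011-02", 5.0)],
        [("widget", "2011-01", 2.5), ("gadget", "2011-01", 7.0)],
    ]
    rollup = Counter()
    with ProcessPoolExecutor() as pool:
        for partial in pool.map(partial_rollup, slices):
            rollup.update(partial)  # Counter.update adds per-cell amounts
    print(dict(rollup))
```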

4. Generate Reports
Reporting consists of analyzing the data stored in the data warehouse along multiple dimensions, generating standard reports for business intelligence, and generating ad-hoc reports. These reports present data in graphical or tabular form, provide statistical analysis features, and should be rendered in Excel, PDF and other formats.

It is better to utilize a SaaS-based or PaaS-based reporting infrastructure than to custom code all the reports.

SQL Azure Reporting enables developers to enhance their applications by embedding cloud-based reports on information stored in a SQL Azure database. Developers can author reports using familiar SQL Server Reporting Services tools and then use these reports in their applications, which may be on-premises or in the cloud.

Note, however, that SQL Azure Reporting currently can connect only to SQL Azure databases.

Summary
The above steps provide a path for migrating on-premise data warehousing applications to the cloud. Because the case study needed broad vendor support in terms of IaaS, PaaS and SaaS, the Microsoft Azure platform was chosen to support it. With several features integrated as part of the platform, the Microsoft cloud platform is positioned to be one of the leading platforms for BI on the cloud.

The following diagram indicates a blueprint of a typical cloud BI organization on the Microsoft Azure platform.

More Stories By Srinivasan Sundara Rajan

Srinivasan is passionate about ownership and driving things on his own; with his breadth and depth in enterprise technology, he can run any aspect of the IT industry and make it a success.

He is a seasoned enterprise IT expert, mainly in the areas of solutions, integration and architecture across structured and unstructured data sources, especially in the manufacturing domain.

He currently works as Technology Head for GAVS Technologies.
