By Srinivasan Sundara Rajan
February 14, 2011 06:00 PM EST
Data Warehousing As A Cloud Candidate
Over the past year we have seen growing support for the Cloud from major vendors, and the Cloud is here to stay. The bigger impact is that the path is now clearly drawn for enterprises to adopt the Cloud. With this in mind, it is time to identify which existing data center applications are candidates for migration to the Cloud.
Most major IT vendors predict that HYBRID delivery will be the future: enterprises will need a delivery model in which certain workloads run on the Cloud while others continue to run in data centers, together with a model that integrates the two.
Before we go further into a blueprint of how data warehouses fit within a hybrid Cloud environment, let us look at the salient features of data warehouses and how the tenets of the Cloud make them a very viable workload to move to the Cloud.
A data warehouse is a subject-oriented, integrated, time-variant and non-volatile collection of data in support of management's decision-making process.
Data Warehousing Usage vs. Cloud Tenant Value Proposition
- Usage: The ETL (Extract, Clean, Transform, Load) process is subject to variable load patterns; large files typically arrive over the weekend or at night to be processed and loaded.
Value proposition: It is better to use compute resources on demand for ETL, as and when required, rather than maintaining fixed capacity.
- Usage: OLAP (Online Analytical Processing) and the related processing for MOLAP (Multidimensional OLAP) and/or ROLAP (Relational OLAP) are highly compute-intensive, with strong processing needs.
Value proposition: High-performance computing and the ability to scale up on demand are Cloud tenets highly aligned with this need.
- Usage: Physical architecture needs are complex in a data warehousing environment.
Value proposition: Most IaaS and PaaS offerings, such as the Azure platform and Amazon EC2, have built-in provisions for a highly available architecture, with most day-to-day administration abstracted from the enterprise.
- Usage: Multiple software and platform needs. The product stack of a data warehousing environment is very large, and most organizations find it difficult to arrive at an ideal list of software, platforms and tools for their BI platform.
Value proposition: SaaS for applications such as data cleansing or address validation, and PaaS for reporting such as Microsoft SQL Azure Reporting, are ideal for solving the tools-and-platforms maze.
The following are the ideal steps for migrating an on-premise data warehouse system to a Cloud platform. For the sake of this case study, the Microsoft Windows Azure platform is chosen as the target.
1. Create Initial Database / Allocate Storage / Migrate Data
The STAR schema design of the existing data warehousing system can be migrated to the Cloud platform as it is, and migrating to a relational database platform like SQL Azure should be straightforward. To migrate the data, the initial storage allocation of the existing database in the data center needs to be calculated, and the same amount of storage is allocated on the Cloud.
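As a hypothetical illustration of the kind of star schema that carries over unchanged, the following sketch uses SQLite as a stand-in relational store (the table and column names are invented for illustration; on SQL Azure the equivalent DDL would be issued through its T-SQL surface):

```python
import sqlite3

# In-memory SQLite stands in for the relational target; the DDL itself is portable.
conn = sqlite3.connect(":memory:")
cur = conn.cursor()

# Dimension tables surround a central fact table -- the classic star layout.
cur.execute("""CREATE TABLE dim_date (
    date_key INTEGER PRIMARY KEY, full_date TEXT, fiscal_quarter TEXT)""")
cur.execute("""CREATE TABLE dim_product (
    product_key INTEGER PRIMARY KEY, name TEXT, category TEXT)""")
cur.execute("""CREATE TABLE fact_sales (
    date_key INTEGER REFERENCES dim_date(date_key),
    product_key INTEGER REFERENCES dim_product(product_key),
    units_sold INTEGER, revenue REAL)""")

cur.execute("INSERT INTO dim_date VALUES (20110214, '2011-02-14', 'Q1')")
cur.execute("INSERT INTO dim_product VALUES (1, 'Widget', 'Hardware')")
cur.execute("INSERT INTO fact_sales VALUES (20110214, 1, 5, 49.95)")

# A typical star join: facts filtered and labeled through the dimensions.
row = cur.execute("""SELECT p.name, d.fiscal_quarter, f.revenue
                     FROM fact_sales f
                     JOIN dim_date d ON f.date_key = d.date_key
                     JOIN dim_product p ON f.product_key = p.product_key""").fetchone()
print(row)  # ('Widget', 'Q1', 49.95)
```

Because the schema is plain relational DDL, the same CREATE TABLE statements run unchanged against the migrated database.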
You can store any amount of data, from kilobytes to terabytes, in SQL Azure. However, individual databases are limited to 10 GB in size. To create solutions that store more than 10 GB of data, you must partition large data sets across multiple databases and use parallel queries to access the data.
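The partition-and-parallel-query approach described above can be sketched as follows; the shard names and the hash-based routing function are illustrative assumptions, not a SQL Azure API:

```python
import hashlib
from concurrent.futures import ThreadPoolExecutor

# Hypothetical shard map: each entry would be a separate 10 GB SQL Azure database.
SHARDS = ["warehouse_db_0", "warehouse_db_1", "warehouse_db_2", "warehouse_db_3"]

def shard_for(partition_key: str) -> str:
    """Route a row to a shard by a stable hash of its partition key."""
    digest = hashlib.md5(partition_key.encode()).hexdigest()
    return SHARDS[int(digest, 16) % len(SHARDS)]

def query_all_shards(run_query):
    """Fan a query out to every shard in parallel and merge the partial results."""
    with ThreadPoolExecutor(max_workers=len(SHARDS)) as pool:
        partials = pool.map(run_query, SHARDS)
    return [row for part in partials for row in part]

# Stand-in for a per-shard query (a real version would open a connection per database).
results = query_all_shards(lambda shard: [(shard, "row")])
print(len(results))  # 4 -- one partial result per shard, flattened
```

A stable hash keeps each key on the same database across loads, which is what makes the parallel fan-out query correct.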
Once a highly scalable database infrastructure is set up on the SQL Azure platform, the following are some of the methods by which data from existing on-premise data warehouses can be moved to SQL Azure.
Traditional BCP tool: BCP is a command-line utility that ships with Microsoft SQL Server. It bulk copies data between SQL Azure (or SQL Server) and a data file in a user-specified format. The bcp utility that ships with SQL Server 2008 R2 is fully supported by SQL Azure. You can use BCP to back up and restore your data on SQL Azure, to import large numbers of new rows into SQL Azure tables, or to export data out of tables into data files.
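A typical bcp bulk-import invocation against SQL Azure can be sketched as below; the server, database, table and credentials are all placeholders, and the command is only assembled here, not executed:

```python
# Sketch of a bcp bulk-import command for SQL Azure; all identifiers are placeholders.
# bcp connects to SQL Azure like a remote SQL Server: a fully qualified table name,
# the *.database.windows.net server, and a login in the user@server form.
def build_bcp_import(table, data_file, server, database, user, password):
    return [
        "bcp", f"{database}.dbo.{table}",
        "in", data_file,           # "in" = import the data file into the table
        "-S", f"{server}.database.windows.net",
        "-U", f"{user}@{server}",  # SQL Azure expects user@server logins
        "-P", password,
        "-c",                      # character-mode data file
    ]

cmd = build_bcp_import("fact_sales", "fact_sales.dat",
                       "myserver", "warehouse", "loader", "secret")
print(" ".join(cmd))
# A real run would hand this list to subprocess.run(cmd, check=True).
```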
The following tools are also useful if your existing data warehouse is in SQL Server within the data center.
You can transfer data to SQL Azure by using SQL Server 2008 Integration Services (SSIS). SQL Server 2008 R2 or later supports the Import and Export Data Wizard and bulk copy for the transfer of data between an instance of Microsoft SQL Server and SQL Azure.
SQL Server Migration Assistant (SSMA for Access v4.2) supports migrating your schema and data from Microsoft Access to SQL Azure.
2. Set Up ETL & Integration With Existing On Premise Data Sources
After the initial load of the data warehouse on the Cloud, it needs to be continuously refreshed with operational data. This process extracts data from different data sources (such as flat files, legacy databases, RDBMS, ERP, CRM and SCM application packages).
This process also carries out the necessary transformations, such as joining tables, sorting and applying various filters.
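The transformation step above can be sketched in a few lines; the record layouts and field names here are invented purely for illustration:

```python
# Toy extracts from two hypothetical source systems.
orders = [
    {"order_id": 1, "cust_id": "C2", "amount": 250.0},
    {"order_id": 2, "cust_id": "C1", "amount": 75.0},
    {"order_id": 3, "cust_id": "C1", "amount": 900.0},
]
customers = {"C1": "Acme Corp", "C2": "Globex"}

# Join: attach the customer name to each order record.
joined = [{**o, "customer": customers[o["cust_id"]]} for o in orders]

# Filter: keep only orders above a threshold.
filtered = [o for o in joined if o["amount"] >= 100.0]

# Sort: order the load file by amount, descending.
load_ready = sorted(filtered, key=lambda o: o["amount"], reverse=True)

print([o["order_id"] for o in load_ready])  # [3, 1]
```

In a production pipeline an ETL tool performs these same join/filter/sort stages, but over staged tables rather than in-memory lists.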
The following are the typical options available on the SQL Azure platform for building an ETL pipeline between on-premise data sources and a data warehouse hosted on the Cloud. The tools mentioned above for the initial data load also hold good as ETL tools; they are not repeated here to avoid duplication.
SQL Azure Data Sync:
- Cloud to cloud synchronization
- Enterprise (on-premise) to cloud
- Cloud to on-premise.
- Bi-directional or sync-to-hub or sync-from-hub synchronization
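As a rough sketch of the hub-and-spoke model behind the directions listed above (SQL Azure Data Sync itself is configured rather than coded, so the conflict rule here is an illustrative assumption), a sync round trip through the hub can be pictured as:

```python
# Minimal hub-and-spoke sync sketch with a last-writer-wins conflict rule.
# Each value is a (payload, version) pair; higher version means a newer edit.

def sync(member: dict, hub: dict) -> None:
    """Bi-directional sync: the newer version of each key wins on both sides."""
    for key in set(member) | set(hub):
        m, h = member.get(key), hub.get(key)
        if m is None or (h is not None and h[1] > m[1]):
            member[key] = h      # sync-from-hub: hub copy is newer
        elif h is None or m[1] > h[1]:
            hub[key] = m         # sync-to-hub: member copy is newer

hub = {"row1": ("hub-edit", 5)}
on_premise = {"row1": ("local-edit", 3), "row2": ("new-row", 1)}
cloud = {}

sync(on_premise, hub)   # enterprise (on-premise) to cloud, via the hub
sync(cloud, hub)        # cloud to cloud: a second member picks up the merged state
print(cloud)            # both rows arrive with the winning versions
```

The hub is what lets any mix of on-premise and cloud members converge without pairwise links between them.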
The following diagram, courtesy of the vendor, gives an overview of how SQL Azure Data Sync can be used for ETL purposes.
Windows Azure AppFabric Integration provides common BizTalk Server integration capabilities (e.g., pipelines, transforms, adapters) on Windows Azure, using out-of-the-box integration patterns to accelerate and simplify development. It also delivers higher-level business-user enablement capabilities such as Business Activity Monitoring and Rules, as well as a self-service trading partner community portal and provisioning of business-to-business pipelines. The following diagram, courtesy of the vendor, shows how Windows Azure AppFabric Integration can be used as an ETL platform.
3. Create Cubes & Other Analytics Structures
The multidimensional nature of OLAP requires an analytical engine to process the underlying data and create a multidimensional view. The success of OLAP has resulted in a large number of vendors offering OLAP servers with different architectures.
MOLAP: Multidimensional OLAP uses a proprietary multidimensional database, with the emphasis on performance.
ROLAP: Relational OLAP is a technology that provides sophisticated multidimensional analysis performed on open relational databases. ROLAP can scale to large data sets in the terabyte range.
HOLAP: Hybrid OLAP is an attempt to combine some of the features of MOLAP and ROLAP technology.
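Of the three, the ROLAP approach maps most directly onto plain SQL: multidimensional questions are answered by aggregating over the relational tables instead of a prebuilt cube. A minimal sketch, using SQLite as a stand-in for the open relational store (schema and data invented for illustration):

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()
cur.execute("CREATE TABLE sales (region TEXT, product TEXT, revenue REAL)")
cur.executemany("INSERT INTO sales VALUES (?, ?, ?)", [
    ("East", "Widget", 100.0), ("East", "Gadget", 50.0),
    ("West", "Widget", 200.0), ("West", "Widget", 25.0),
])

# ROLAP-style aggregation: the "cube cell" for each (region, product) pair
# is computed on demand with GROUP BY, rather than precomputed and stored.
rollup = cur.execute("""SELECT region, product, SUM(revenue)
                        FROM sales
                        GROUP BY region, product
                        ORDER BY region, product""").fetchall()
print(rollup)
```

The trade-off the article describes follows from this: ROLAP scales with the relational engine, while MOLAP pays storage up front for faster reads.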
SQL Azure Database does not support all of the features and data types found in SQL Server. Analysis Services, Replication, and Service Broker are not currently provided as services on the Windows Azure platform.
At this time there is no direct support for OLAP and cube processing on SQL Azure; however, with the HPC (High-Performance Computing) attributes of the platform and multiple Worker roles, manual aggregation of the data can be achieved.
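The manual approach just described amounts to a scatter/gather: each Worker role aggregates its slice of the facts, and the partial results are merged. The sketch below simulates that locally with lists and `Counter` objects; the partitioning and merge logic is the point, not the hosting:

```python
from collections import Counter

# Simulated fact rows: (dimension_key, measure). On Azure, each Worker role
# would pull its partition from SQL Azure or a queue rather than from a list.
facts = [("Q1", 10), ("Q2", 5), ("Q1", 7), ("Q3", 2), ("Q2", 8)]

def partial_aggregate(partition):
    """What each worker would compute over its own slice of the facts."""
    acc = Counter()
    for key, measure in partition:
        acc[key] += measure
    return acc

# Scatter: split the facts across N "workers"; gather: merge the partial counters.
n_workers = 2
partitions = [facts[i::n_workers] for i in range(n_workers)]
totals = sum((partial_aggregate(p) for p in partitions), Counter())
print(dict(totals))  # per-quarter sums, identical to a single-pass aggregation
```

Because addition is associative, the merged totals are independent of how the facts were partitioned, which is what makes scaling out across more workers safe.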
4. Generate Reports
Reporting consists of analyzing the data stored in the data warehouse along multiple dimensions, generating standard reports for business intelligence, and generating ad-hoc reports. These reports present data in graphical and tabular form, provide statistical analysis features, and should be rendered in Excel, PDF and other formats.
It is better to utilize a SaaS- or PaaS-based reporting infrastructure rather than custom-coding all the reports.
SQL Azure Reporting enables developers to enhance their applications by embedding cloud-based reports on information stored in a SQL Azure database. Developers can author reports using familiar SQL Server Reporting Services tools and then use these reports in their applications, which may be on-premises or in the cloud.
SQL Azure Reporting also currently can connect only to SQL Azure databases.
The above steps provide a path for migrating on-premise data warehousing applications to the Cloud. Because the case study needed substantial vendor support across IaaS, PaaS and SaaS, the Microsoft Azure platform was chosen to support it. With several features integrated as part of the platform, Microsoft's Cloud platform is positioned to be one of the leading platforms for BI on the Cloud.
The following diagram indicates a blueprint of a typical Cloud BI organization on the Microsoft Azure platform.