
Data Security Using SQL Azure



One of the major concerns in using SQL Azure is the security of data such as credit card numbers, Social Security numbers, salaries, and bonuses. The degree of protection the data needs is for each business entity to determine, but on-site data is generally regarded as more secure than data stored in the cloud.
This is a simple example of using the SQL Server Integration Services (SSIS) and SQL Server Reporting Services tools to accomplish just that.
We start off with this scenario: the fictitious company SecureAce wants to place one of its Employees tables on SQL Azure, but it does not want to keep any sensitive information there, such as employee salaries. However, from time to time it needs to generate a report of employees and their salaries for management.
The solution to this scenario is divided into two parts.
In the first part, the on-site data in the Employees table is partitioned in such a way that the sensitive information stays on-site while the larger, non-sensitive portion is stored on SQL Azure.
In the second part, SSIS is used to bring the two pieces of data together and load an on-site Access database, which is used as a front end for reporting information to management, an entirely realistic way of managing data. Although a Microsoft Access database is used here, any other destination supported by SSIS would work as well, such as another SQL Server database. MS Access is used because it is a very common product in many small businesses.
Note, however, that Microsoft now supports connecting SQL Azure to MS Access directly; review this link for details: http://social.msdn.microsoft.com/Forums/en-US/ssdsgetstarted/thread/05dd7620-f209-43d2-8c41-63b251c62970. With the availability of Microsoft Office Professional Plus 2010, the author was able to connect directly to SQL Azure using an ODBC connection.
Splitting the data and uploading to SQL Azure
This is a preparation for the SSIS task that follows. We will be using the Northwind database's Employees table and splitting it into two parts, each containing different columns, a vertical partition. One part, containing the salary information of employees, will remain on-site; the other, which is loaded to SQL Azure, will contain most of the other information. In the Northwind database the Employees table does not have a salary column, so an extra column will be added for this simulation. The procedure is described in the following steps.
·         Create a table named Employees in the VerticalPart database using the following statement:
CREATE TABLE [dbo].[Employees](
[EmployeeID] [int] PRIMARY KEY CLUSTERED NOT NULL,
[LastName] [nvarchar](20) NOT NULL,
[FirstName] [nvarchar](10) NOT NULL,
[HomePhone] [nvarchar](24) NULL,
[Extension] [nvarchar](4) NULL,
[Salary] [money] NULL
)
·         Use the Import / Export Wizard to populate the columns (except Salary) of the above table from Northwind's Employees table
·         Modify the table by adding a salary for each employee (a T-SQL sketch follows below)
There are only a few employees, so this should not be a problem. When you try to save the table in the designer, you may not be able to do so unless you have changed the relevant designer option in SSMS (Tools | Options | Designers: uncheck Prevent saving changes that require table re-creation). You will get a confirmation after you save the Employees table, as shown.
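If you prefer a script to the wizard and the table designer, a minimal T-SQL sketch along the following lines will populate the table, assuming Northwind and VerticalPart live on the same server (the salary figures here are invented purely for the simulation):

USE VerticalPart;
GO
-- Copy the non-salary columns from Northwind's Employees table
INSERT INTO dbo.Employees (EmployeeID, LastName, FirstName, HomePhone, Extension)
SELECT EmployeeID, LastName, FirstName, HomePhone, Extension
FROM Northwind.dbo.Employees;
GO
-- Invented salaries for this simulation; substitute real values as needed
UPDATE dbo.Employees
SET Salary = 40000 + EmployeeID * 1000;
GO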

Now run a SELECT query to verify that the Salary column has been populated, as shown.
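A query along these lines (the exact query in the image may differ) confirms the values:

SELECT EmployeeID, LastName, FirstName, Salary
FROM dbo.Employees;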


Copy the script for Northwind's Employees table and modify it by changing the table name and removing some columns, resulting in the following statement:

CREATE TABLE [dbo].[AzureEmployees](
[EmployeeID] [int] PRIMARY KEY CLUSTERED NOT NULL,
[LastName] [nvarchar](20) NOT NULL,
[FirstName] [nvarchar](10) NOT NULL,
[Title] [nvarchar](30) NULL,
[TitleOfCourtesy] [nvarchar](25) NULL,
[HireDate] [datetime] NULL,
[Address] [nvarchar](60) NULL,
[City] [nvarchar](15) NULL,
[Region] [nvarchar](15) NULL,
[PostalCode] [nvarchar](10) NULL,
[Country] [nvarchar](15) NULL
)
Note that the table name has been changed to AzureEmployees. This is the table that will be stored in the Bluesky database on SQL Azure.
Log in to SQL Azure and create the table in the Bluesky database by running the above CREATE TABLE statement.
The table will be created with the above schema, which you can verify in the Object Browser.

Use the Import and Export Wizard to populate the columns of AzureEmployees with data from Northwind. Use the query option to move data from source to destination with the following query:
SELECT EmployeeID, LastName, FirstName,
       Title, TitleOfCourtesy, HireDate,
       Address, City, Region, PostalCode,
       Country
FROM Employees
Save the query results to the AzureEmployees table you created earlier as shown. 

 
Follow the wizard's steps to review the data mapping as shown.


Complete the wizard steps as shown.


Verify the data in AzureEmployees in the Bluesky database on SQL Azure by running a SELECT statement.
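Any simple query will do, for example:

SELECT EmployeeID, LastName, FirstName, Country
FROM dbo.AzureEmployees;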
By following the above steps we have created two tables, one on-site and the other on SQL Azure.
Although the transfer of string data here did not present any length-related errors, problems can arise when strings longer than 8,000 characters are stored in columns of type varchar(max) or text. In such cases, change the columns to nvarchar(max) to overcome the problem. For details review the following link: http://blogs.msdn.com/b/sqlazure/archive/2010/06/01/10018602.aspx
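As a sketch (the table and column names here are illustrative), such a column can be converted before the transfer:

ALTER TABLE dbo.SomeTable
ALTER COLUMN Notes nvarchar(max) NULL;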
Merging data and loading an Access database
In this section we will reconstruct the Employees table on-site by retrieving data from SQL Azure as well as from SQL Server's VerticalPart database and merging the two. After merging, we will place the data in an MS Access database so that simple reports can be authored.
In order to do this we take the following steps.
  1. Open BIDS (Business Intelligence Development Studio) from its shortcut.
  2. Create an Integration Services project, providing a name for the project, and change the default name of the package file.
The project folder should appear as shown in the next image, with the project and package names you provided.

  3. Drag and drop a Data Flow task onto the Control Flow tabbed page of the package designer surface.
  4. In the bottom Connection Managers pane, configure three connection managers: one each for the SQL Azure database, the VerticalPart database on SQL Server 2008, and an MS Access database, as shown.



The next image shows the details of the connection manager Hodentek3\KUMO.VerticalPart. Note that the SqlClient Data Provider is used. The SQL Server instance Hodentek3\KUMO is configured for Windows authentication.



The next image shows the connection xxxxxxxxxx.database.windows.net.Bluesky.mysorian1 for the Bluesky database on SQL Azure. The authentication information is the same as you have used so far; if it is correct, you should be able to see the available databases.
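For reference, the equivalent ADO.NET connection string takes roughly this form (the server name is masked as in the images, the login is assumed from the connection manager name, and the password is a placeholder):

Server=tcp:xxxxxxxxxx.database.windows.net;Database=Bluesky;User ID=mysorian1@xxxxxxxxxx;Password=<your password>;Encrypt=True;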


  5. Create an MS Access database (Access 2003 format) and use it for this connection.
Later we also create a table in this database to receive the merged fields from SQL Azure and the on-site server.
For this connection manager we use the following settings and verify by clicking the Test Connection button:
Provider:                 Native OLE DB\Microsoft Jet 4.0 OLE DB Provider
Database file is at:  C:\Users\Jay\AccessSQLAzure.mdb
User name:              Admin
Password:               <empty>
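The corresponding OLE DB connection string would look roughly like this:

Provider=Microsoft.Jet.OLEDB.4.0;Data Source=C:\Users\Jay\AccessSQLAzure.mdb;User ID=Admin;Password=;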

It is assumed that the reader has some familiarity with SSIS. The author recommends his own book on SSIS for beginners, which may be found here: https://www.packtpub.com/sql-server-integration-services-visual-studio-2005/book.
Each of the above connections can be tested using its Test Connection button.
Merging columns from SQL Azure and SQL Server
You will use two ADO.NET Source data flow sources, one each for SQL Azure and SQL Server. The outputs will be merged.
  1. Add two ADO.NET Source components to the Data Flow tabbed designer pane.
  2. Rename the source components From SQL Azure Database and From SQL Server 2008 Database.



  3. Configure the ADO.NET Source Editor connected to SQL Azure with the following settings, as shown in the next image.
ADO.NET Connection manager: XXXXXXX.database.windows.net.Bluesky.mysorian1
Data access mode: Table or view
Name of the table or view: "dbo"."AzureEmployees"
You must use the server name appropriate for your SQL Azure instance.

Once configured as shown, you should be able to view the data in this table with the Preview… button.


  4. Configure the ADO.NET Source Editor connected to SQL Server as shown in the next image. The details for the From SQL Server 2008 Database source are as follows:
ADO.NET Connection manager: Hodentek3\KUMO.VerticalPart
Data access mode: Table or view
Name of the table or view: "dbo"."Employees"


Again, you should be able to view the data in this table with the Preview… button.
Sorting the outputs of the sources
Since the data coming out of the sources is not sorted, it is important to get the sorting correct and identical in both sources before they can be merged.
  1. Drag and drop two Sort data flow transformations from the Toolbox to the design surface just below the ADO.NET data sources.
  2. Start with the one that will receive its input from the From SQL Azure Database source.
  3. Click From SQL Azure Database and drag and drop its green dangling line onto the Sort control below it as shown.



  4. Double-click the Sort control to display the Sort Transformation Editor and place a check mark against EmployeeID as shown.

  5. Repeat the same procedure for the From SQL Server 2008 Database source. We now have two Sort controls receiving their inputs from the two source controls, with sorted outputs.
  6. Drag and drop a Merge Join data flow transformation from the Toolbox onto the design surface.
  7. Click the Sort transformation on the left (connected to From SQL Azure Database) and drag and drop its green dangling line onto the Merge Join transformation.
The Input Output Selection window will be displayed as shown.



  8. Select the Merge Join Left Input and click OK.
  9. Repeat the same for the other Sort on the right (this time select Merge Join Right Input).
The Merge Join transformation now merges the output from the two Sort controls and provides a single merged output.
You still need to configure the Merge Join.
  10. Double-click Merge Join to open the Merge Join Transformation Editor page as shown.
Read the instructions on this window.



  11. Place a check mark against EmployeeID in both of the sorted input lists shown in the top pane. The bottom pane gets populated with input columns and output aliases. Make sure the join type is Left outer join, as in the above image (use the drop-down handle if needed).
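For reference, the Merge Join configured above is logically equivalent to the following T-SQL, with the SQL Azure data on the left (SSIS performs this join in the data flow rather than on either server, so this query is only a sketch of the result):

SELECT a.EmployeeID, a.LastName, a.FirstName, a.Title,
a.TitleOfCourtesy, a.HireDate, a.Address, a.City,
a.Region, a.PostalCode, a.Country, e.Salary
FROM AzureEmployees a
LEFT OUTER JOIN Employees e
ON a.EmployeeID = e.EmployeeID;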
For each flow path we could add a Data Viewer to monitor the flow of data at run time by momentarily pausing the flow downstream; we skip this diagnostic step here.
Porting output data from Merge Join to an MS Access Database
We will use the merged data from the two sources to fill a table in an MS Access 2003 database.
  1. In the MS Access database you created while setting up the connection managers, create a table named Salary Report with the design parameters shown in the next image (a rough DDL sketch follows).
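Since the design parameters appear only in the image, here is a rough Jet SQL sketch of the table; the column list and types are assumptions based on the merged output:

CREATE TABLE [Salary Report] (
[EmployeeID] INTEGER,
[LastName] TEXT(20),
[FirstName] TEXT(10),
[Title] TEXT(30),
[TitleOfCourtesy] TEXT(25),
[HireDate] DATETIME,
[Address] TEXT(60),
[City] TEXT(15),
[Region] TEXT(15),
[PostalCode] TEXT(10),
[Country] TEXT(15),
[Salary] CURRENCY
);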


  2. Drag and drop an OLE DB Destination component from the Toolbox onto the package designer pane just underneath the Merge Join component.
  3. Drag and drop the green dangling line from Merge Join to the OLE DB Destination component.
  4. Double-click the OLE DB Destination component to open its editor and fill in the details as follows:
OLEDB connection manager:   AccessSQLAzure
Data access mode:                     Table or View
Name of the table or view:        Salary Report


  5. Click Mappings to verify that all the columns are present.
  6. Build the project and execute the package.
The package elements turn yellow and then green, indicating a successful run.
You can verify the transferred values in the table in the Access database; it should have all the merged columns from the two databases. Note that in the image the columns have been rearranged to bring the Salary column into view.
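A quick query in Access (a sketch) will confirm the load:

SELECT EmployeeID, LastName, FirstName, Salary
FROM [Salary Report];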


This is an excerpt from Chapter 6 of my book, published by Packt Publishing (http://www.packtpub.com/).








