
Data Security Using SQL Azure

One of the major concerns in using SQL Azure is the security of data such as credit card numbers, Social Security numbers, salaries, bonuses, etc. The degree to which data needs to be protected is for each business entity to determine, but on-site data is generally considered more secure than data stored in the cloud.
This is a simple example of using SQL Server Integration Services (SSIS) and SQL Server Reporting Services tools to accomplish just that.
We start off with this scenario: The fictitious company SecureAce wants to place one of their Employee tables on SQL Azure, but they do not want to keep any sensitive information, such as employee salaries, in the cloud. However, from time to time they need to generate a report of employees and their salaries for management.
The solution to this scenario is divided in two parts.
In the first part, the on-site data in the employees table is partitioned in such a way that the sensitive information stays on-site and the larger, non-sensitive data is stored on SQL Azure.
In the second part, SSIS is used to bring the two pieces of data together and load an Access database (on-site), which is used as a front end for reporting information to management, an entirely realistic way of managing data. Although a Microsoft Access database is used here, any other destination handled by SSIS, such as another SQL Server database, can also be used. MS Access is used because it is a very common product in many small businesses.
It may be noted, however, that Microsoft now supports connecting MS Access directly to SQL Azure; review this link for details: http://social.msdn.microsoft.com/Forums/en-US/ssdsgetstarted/thread/05dd7620-f209-43d2-8c41-63b251c62970. With the availability of Microsoft Office Professional Plus 2010, the author was able to connect directly to SQL Azure using an ODBC connection.
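For readers who want to try that direct route, an ODBC connection to SQL Azure is typically built on the SQL Server Native Client driver with a connection string along the following lines. This is only a sketch: the server name is masked as it is elsewhere in this article, the Bluesky database and mysorian1 login are the examples used later in this post, and the login follows SQL Azure's user@server convention.

Driver={SQL Server Native Client 10.0};Server=tcp:xxxxxxxxxx.database.windows.net,1433;Database=Bluesky;Uid=mysorian1@xxxxxxxxxx;Pwd=<your password>;Encrypt=yes;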
Splitting the data and uploading to SQL Azure
This is a preparation for the SSIS task that follows. We will be using the Northwind database's Employees table and splitting it into two parts, each containing different columns, a vertical partition. One part, which contains the salary information of the employees, will remain on-site; the other, which is loaded to SQL Azure, will contain most of the other information. In the Northwind database the Employees table does not have a salary column, so an extra column is added for this simulation. The procedure is described in the following steps.
·         Create a table named Employees in the VerticalPart database using the following statement:
CREATE TABLE [dbo].[Employees](
[EmployeeID] [int] NOT NULL,   -- key used later to rejoin the two partitions
[LastName] [nvarchar](20) NOT NULL,
[FirstName] [nvarchar](10) NOT NULL,
[HomePhone] [nvarchar](24) NULL,
[Extension] [nvarchar](4) NULL,
[Salary] [money] NULL
)
·         Use the Import/Export Wizard to populate the columns of the above table (except Salary) from Northwind's Employees table
·         Modify the table by adding a salary for each employee (a T-SQL alternative for these two steps is sketched below)
There are only a few employees, so this should not be a problem. When you try to save the table you may not be able to do so unless you have turned on the relevant option under the Tools menu of SSMS. You will get a confirmation after you save the Employees table, as shown.
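If you prefer T-SQL to the wizard and the table designer, the two bulleted steps above can also be done with a short script. This is only a sketch: it assumes the Northwind and VerticalPart databases sit on the same on-site server, and the salary figures are placeholder values for this simulation.

USE VerticalPart
GO
-- Copy the non-salary columns from Northwind's Employees table
INSERT INTO dbo.Employees (EmployeeID, LastName, FirstName, HomePhone, Extension)
SELECT EmployeeID, LastName, FirstName, HomePhone, Extension
FROM Northwind.dbo.Employees
GO
-- Assign a placeholder salary to each employee (values are illustrative only)
UPDATE dbo.Employees
SET Salary = 50000 + EmployeeID * 1000
GO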

Now run a SELECT query to verify that the salary column has been populated as shown.

Copy the script for Northwind's Employees table and modify it by changing the table name and removing some columns, resulting in the following statement:

CREATE TABLE [dbo].[AzureEmployees](
[EmployeeID] [int] NOT NULL PRIMARY KEY CLUSTERED,   -- SQL Azure tables need a clustered index; this is also the key used later to merge
[LastName] [nvarchar](20) NOT NULL,
[FirstName] [nvarchar](10) NOT NULL,
[Title] [nvarchar](30) NULL,
[TitleOfCourtesy] [nvarchar](25) NULL,
[HireDate] [datetime] NULL,
[Address] [nvarchar](60) NULL,
[City] [nvarchar](15) NULL,
[Region] [nvarchar](15) NULL,
[PostalCode] [nvarchar](10) NULL,
[Country] [nvarchar](15) NULL
)
Note that the table name has been changed to AzureEmployees. This is the table that will be stored in the Bluesky database on SQL Azure.
Log in to SQL Azure and create the table in the Bluesky database by running the above CREATE TABLE statement.
The table will be created with the above schema, which you may verify in the Object Browser.

Use Import and Export Wizard to populate the columns of AzureEmployees with data from Northwind. Use the query option to move data from source to destination using the following query.
SELECT EmployeeID, LastName, FirstName,
Title, TitleOfCourtesy, HireDate,
Address, City, Region, PostalCode,
Country
FROM Employees
Save the query results to the AzureEmployees table you created earlier as shown. 

Follow the wizard's steps to review the data mapping as shown.

Complete the wizard steps as shown.

Verify data in AzureEmployees in Bluesky database on SQL Azure by running a SELECT statement.
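For example, a quick check of the row count and the first few rows will confirm the transfer; a minimal sketch, run while connected to the Bluesky database:

SELECT COUNT(*) AS EmployeeRows FROM dbo.AzureEmployees
SELECT TOP 5 EmployeeID, LastName, FirstName, Title FROM dbo.AzureEmployees ORDER BY EmployeeID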
By following the above we have created two tables, one on-site and the other on SQL Azure.
Although the transformation of string data types did not produce any errors here, it can present problems when string lengths exceed 8000, as with columns of type varchar(max) or text. In those cases, change the columns to nvarchar(max) to overcome the problem. For details review the following link: http://blogs.msdn.com/b/sqlazure/archive/2010/06/01/10018602.aspx
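For instance, if you had to transfer a long text column such as Northwind's Notes column, you could convert it before the transfer. A minimal sketch, assuming a hypothetical on-site staging table named EmployeesStaging that carries the column:

-- EmployeesStaging is a hypothetical on-site staging copy of the data
-- Convert an ntext/text column to nvarchar(max) before moving the data to SQL Azure
ALTER TABLE dbo.EmployeesStaging ALTER COLUMN Notes nvarchar(max) NULL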
Merging data and loading an Access database
In this section we will reconstruct the Employees table on-site by retrieving data from SQL Azure as well as from SQL Server's VerticalPart database and merging them. After merging, we will place the data in an MS Access database so that simple reports can be authored.
In order to do this we take the following steps.
  1. Open BIDS (Business Intelligence Development Studio) from its shortcut.
  2. Create an Integration Services project after providing a name for the project. Change the default name of the package file.
The project folder should appear as shown in the next image; the project name and package name have been provided.

  3. Drag and drop a Data Flow task onto the Control Flow tabbed page of the package designer surface.
  4. In the Connection Managers pane at the bottom, configure three connection managers: one for the SQL Azure database, one for the VerticalPart database on SQL Server 2008, and one for an MS Access database, as shown.

The next image shows the details of the connection manager Hodentek3\KUMO.VerticalPart. Note that SqlClient Data Provider is used. The SQL Server Hodentek3\KUMO is configured for Windows Authentication.

This next image shows the connection xxxxxxxxxx.database.windows.net.Bluesky.mysorian1 for the Bluesky database on SQL Azure. The authentication information is the same you have used so far; if it is correct, you should be able to see the available databases.
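For reference, the settings behind this connection manager typically look like the following (a sketch; the masked server name and the mysorian1 login are this article's examples, and SQL Azure accepts only SQL Server Authentication, with the login commonly written in the user@server form):
Server name:                          xxxxxxxxxx.database.windows.net
Authentication:                       SQL Server Authentication
User name:                            mysorian1@xxxxxxxxxx
Password:                             <your password>
Select or enter a database name:      Bluesky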

  5. Create an MS Access database (Access 2003 format) and use it for this connection.
Later we will also create a table in this database to receive the merged fields from SQL Azure and the on-site server.
For this connection manager we use the following settings and verify by clicking the Test Connection button:
Provider:                 Native OLE DB\Microsoft Jet 4.0 OLE DB Provider
Database file is at:  C:\Users\Jay\AccessSQLAzure.mdb
User name:              Admin
Password:               <empty>

It is assumed that the reader has familiarity with using SSIS. The author recommends his own book on SSIS for beginners, which may be found here: https://www.packtpub.com/sql-server-integration-services-visual-studio-2005/book.
Each of the above connections can be tested using the Test Connection button on them.
Merging columns from SQL Azure and SQL Server
You will use two ADO.NET Source data flow sources, one each for SQL Azure and SQL Server. The outputs will be merged.
  1. Add two ADO.NET data flow sources to the Data Flow tabbed designer pane.
  2. Rename the source components From SQL Azure Database and From SQL Server 2008 database.

  3. Configure the ADO.NET Source Editor connected to SQL Azure to display the following as shown in the next image.
ADO.NET Connection manager: XXXXXXX.database.windows.net.Bluesky.mysorian1
Data access mode: Table or view
Name of the table or view: "dbo"."AzureEmployees"
You must use the server name appropriate for your SQL Azure instance.

Configure as shown and you should be able to view the data in this table with the Preview… button.

  4. Configure the ADO.NET Source Editor connected to SQL Server as shown in the next image, using the following details for the From SQL Server 2008 database source:
ADO.NET Connection manager: Hodentek3\KUMO.Verticalpart
Data access mode: Table or view
Name of the table or view: "dbo"."Employees"

Again, you should be able to view the data in this table with the Preview… button.
Sorting the outputs of the sources
Since the data leaving the sources is not sorted, the outputs must be sorted identically in both sources before they can be merged.
  1. Drag and drop two Sort dataflow controls from the Toolbox to the design surface just below the ADO.NET data sources.
  2. Start with the one that is going to be receiving its input from the From SQL Azure Database source control.
  3. Click From SQL Azure Database and drag and drop the green dangling line on to the Sort control below it as shown.

  4. Double click the Sort control to display the Sort Transformation Editor and place a check mark for EmployeeID as shown.

  5. Repeat the same procedure for the From SQL Server 2008 Database source. Now we have two Sort controls receiving their inputs from the two source controls, with outputs sorted.
  6. Drag and drop a Merge Join data flow transformation from the Toolbox onto the design surface.
  7. Click the Sort data flow transformation on the left (connected to From SQL Azure Database) and drag and drop its green dangling line onto the Merge Join data flow transformation.
The Input Output Selection window will be displayed as shown.

  8. Select the Merge Join Left Input and click OK.
  9. Repeat the same for the other Sort on the right (this time select Merge Join Right Input).
The Merge Join control now merges the output from the two Sort controls and provides a merged output.
You still need to configure the Merge Join.
 10. Double click Merge Join to open the Merge Join Transformation Editor page as shown.
Read the instructions on this window.

 11. Place a check mark for EmployeeID in both of the Sort lists shown in the top pane. The bottom pane gets populated with input columns and output aliases. Make sure the join type is Left outer join as in the above image (use the drop-down if needed).
We could add a Data Viewer to each flow path so that we can monitor the flow of data at run time by momentarily stopping the flow downstream; we are skipping this diagnostic step here.
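Conceptually, the Merge Join produces the same result set that a left outer join on EmployeeID would produce if both tables lived in the same database. The following is only an illustration of the merged output, not a query you can run here, since the two tables sit on different servers, which is exactly why the join is done inside the SSIS pipeline:

SELECT a.EmployeeID, a.LastName, a.FirstName, a.Title, a.TitleOfCourtesy,
       a.HireDate, a.Address, a.City, a.Region, a.PostalCode, a.Country,
       e.Salary
FROM dbo.AzureEmployees AS a              -- from the Bluesky database on SQL Azure
LEFT OUTER JOIN dbo.Employees AS e        -- from the on-site VerticalPart database
    ON a.EmployeeID = e.EmployeeID
ORDER BY a.EmployeeID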
Porting output data from Merge Join to an MS Access Database
We will be using the merged data from the two sources to fill up a table in an MS Access 2003 database. 
  1. In the MS Access database you created while setting up the connection managers, create a table named Salary Report with the design parameters shown in the next image (a possible definition in Access SQL is sketched at the end of this section).

  2. Drag and drop an OLE DB Destination component from the Toolbox onto the package designer pane just underneath the Merge Join component.
  3. Drag and drop the green dangling line from Merge Join to the OLE DB Destination component.
  4. Double click the OLE DB Destination component to open its editor and fill in the details as follows:
OLEDB connection manager:   AccessSQLAzure
Data access mode:                     Table or View
Name of the table or view:        Salary Report

  5. Click Mappings to verify all the columns are present.
  6. Build the project and execute the package.
The package elements turn yellow and then green, indicating a successful run.
You can verify the transferred values in the table in the Access database. It should have all the merged columns from the two databases. Note that in the image the columns have been rearranged to move the Salary column into view.
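If you prefer to create the Salary Report table of step 1 with a query rather than the Access table designer, a minimal definition in Access (Jet) SQL might look like the following sketch; the column list and types here are assumptions, so adjust them to your own design:

CREATE TABLE [Salary Report] (
    EmployeeID INTEGER,
    LastName TEXT(20),
    FirstName TEXT(10),
    Title TEXT(30),
    HireDate DATETIME,
    City TEXT(15),
    Country TEXT(15),
    Salary CURRENCY
)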

This is an excerpt from Chapter 6 of my book, published by Packt Publishing (http://www.packtpub.com/).



More Stories By Jayaram Krishnaswamy

Jayaram Krishnaswamy is a technical writer, mostly writing articles related to the web and databases. He is the author of SQL Server Integration Services (Packt Publishing, UK); Learn SQL Server Reporting Services 2008 (Packt Publishing, Birmingham); Microsoft SQL Azure Enterprise Application Development (Packt Publishing, Dec 2010); Microsoft Visual Studio LightSwitch Business Application Development (Packt Publishing, 2011); and Learning SQL Server Reporting Services 2012 (Packt Publishing, June 2013). Visit his blogs at http://hodentek.blogspot.com, http://hodentekHelp.blogspot.com, http://hodnetekMSSS.blogspot.com and http://hodnetekMobile.blogspot.com. He writes articles on several topics for many sites.

@CloudExpo Stories
SYS-CON Events announced today that Enzu will exhibit at the 19th International Cloud Expo, which will take place on November 1–3, 2016, at the Santa Clara Convention Center in Santa Clara, CA. Enzu’s mission is to be the leading provider of enterprise cloud solutions worldwide. Enzu enables online businesses to use its IT infrastructure to their competitive advantage. By offering a suite of proven hosting and management services, Enzu wants companies to focus on the core of their online busine...
@DevOpsSummit has been named the ‘Top DevOps Influencer' by iTrend. iTrend processes millions of conversations, tweets, interactions, news articles, press releases, blog posts - and extract meaning form them and analyzes mobile and desktop software platforms used to communicate, various metadata (such as geo location), and automation tools. In overall placement, @DevOpsSummit ranked as the number one ‘DevOps Influencer' followed by @CloudExpo at third, and @MicroservicesE at 24th.
WebRTC adoption has generated a wave of creative uses of communications and collaboration through websites, sales apps, customer care and business applications. As WebRTC has become more mainstream it has evolved to use cases beyond the original peer-to-peer case, which has led to a repeating requirement for interoperability with existing infrastructures. In his session at @ThingsExpo, Graham Holt, Executive Vice President of Daitan Group, will cover implementation examples that have enabled ea...
SYS-CON Events announced today that Roundee / LinearHub will exhibit at the WebRTC Summit at @ThingsExpo, which will take place on November 1–3, 2016, at the Santa Clara Convention Center in Santa Clara, CA. LinearHub provides Roundee Service, a smart platform for enterprise video conferencing with enhanced features such as automatic recording and transcription service. Slack users can integrate Roundee to their team via Slack’s App Directory, and '/roundee' command lets your video conference ...
Enterprise IT has been in the era of Hybrid Cloud for some time now. But it seems most conversations about Hybrid are focused on integrating AWS, Microsoft Azure, or Google ECM into existing on-premises systems. Where is all the Private Cloud? What do technology providers need to do to make their offerings more compelling? How should enterprise IT executives and buyers define their focus, needs, and roadmap, and communicate that clearly to the providers?
In the next forty months – just over three years – businesses will undergo extraordinary changes. The exponential growth of digitization and machine learning will see a step function change in how businesses create value, satisfy customers, and outperform their competition. In the next forty months companies will take the actions that will see them get to the next level of the game called Capitalism. Or they won’t – game over. The winners of today and tomorrow think differently, follow different...
SYS-CON Events announced today that Sheng Liang to Keynote at SYS-CON's 19th Cloud Expo, which will take place on November 1-3, 2016 at the Santa Clara Convention Center in Santa Clara, California.
So you think you are a DevOps warrior, huh? Put your money (not really, it’s free) where your metrics are and prove it by taking The Ultimate DevOps Geek Quiz Challenge, sponsored by DevOps Summit. Battle through the set of tough questions created by industry thought leaders to earn your bragging rights and win some cool prizes.
More and more brands have jumped on the IoT bandwagon. We have an excess of wearables – activity trackers, smartwatches, smart glasses and sneakers, and more that track seemingly endless datapoints. However, most consumers have no idea what “IoT” means. Creating more wearables that track data shouldn't be the aim of brands; delivering meaningful, tangible relevance to their users should be. We're in a period in which the IoT pendulum is still swinging. Initially, it swung toward "smart for smar...
DevOps is being widely accepted (if not fully adopted) as essential in enterprise IT. But as Enterprise DevOps gains maturity, expands scope, and increases velocity, the need for data-driven decisions across teams becomes more acute. DevOps teams in any modern business must wrangle the ‘digital exhaust’ from the delivery toolchain, "pervasive" and "cognitive" computing, APIs and services, mobile devices and applications, the Internet of Things, and now even blockchain. In this power panel at @...
SYS-CON Events announced today that SoftNet Solutions will exhibit at the 19th International Cloud Expo, which will take place on November 1–3, 2016, at the Santa Clara Convention Center in Santa Clara, CA. SoftNet Solutions specializes in Enterprise Solutions for Hadoop and Big Data. It offers customers the most open, robust, and value-conscious portfolio of solutions, services, and tools for the shortest route to success with Big Data. The unique differentiator is the ability to architect and ...
"Matrix is an ambitious open standard and implementation that's set up to break down the fragmentation problems that exist in IP messaging and VoIP communication," explained John Woolf, Technical Evangelist at Matrix, in this SYS-CON.tv interview at @ThingsExpo, held Nov 4–6, 2014, at the Santa Clara Convention Center in Santa Clara, CA.
In past @ThingsExpo presentations, Joseph di Paolantonio has explored how various Internet of Things (IoT) and data management and analytics (DMA) solution spaces will come together as sensor analytics ecosystems. This year, in his session at @ThingsExpo, Joseph di Paolantonio from DataArchon, will be adding the numerous Transportation areas, from autonomous vehicles to “Uber for containers.” While IoT data in any one area of Transportation will have a huge impact in that area, combining sensor...
Established in 1998, Calsoft is a leading software product engineering Services Company specializing in Storage, Networking, Virtualization and Cloud business verticals. Calsoft provides End-to-End Product Development, Quality Assurance Sustenance, Solution Engineering and Professional Services expertise to assist customers in achieving their product development and business goals. The company's deep domain knowledge of Storage, Virtualization, Networking and Cloud verticals helps in delivering ...
SYS-CON Events announced today that 910Telecom will exhibit at the 19th International Cloud Expo, which will take place on November 1–3, 2016, at the Santa Clara Convention Center in Santa Clara, CA. Housed in the classic Denver Gas & Electric Building, 910 15th St., 910Telecom is a carrier-neutral telecom hotel located in the heart of Denver. Adjacent to CenturyLink, AT&T, and Denver Main, 910Telecom offers connectivity to all major carriers, Internet service providers, Internet backbones and ...
Extreme Computing is the ability to leverage highly performant infrastructure and software to accelerate Big Data, machine learning, HPC, and Enterprise applications. High IOPS Storage, low-latency networks, in-memory databases, GPUs and other parallel accelerators are being used to achieve faster results and help businesses make better decisions. In his session at 18th Cloud Expo, Michael O'Neill, Strategic Business Development at NVIDIA, focused on some of the unique ways extreme computing is...
In his general session at 18th Cloud Expo, Lee Atchison, Principal Cloud Architect and Advocate at New Relic, discussed cloud as a ‘better data center’ and how it adds new capacity (faster) and improves application availability (redundancy). The cloud is a ‘Dynamic Tool for Dynamic Apps’ and resource allocation is an integral part of your application architecture, so use only the resources you need and allocate /de-allocate resources on the fly.
In the next five to ten years, millions, if not billions of things will become smarter. This smartness goes beyond connected things in our homes like the fridge, thermostat and fancy lighting, and into heavily regulated industries including aerospace, pharmaceutical/medical devices and energy. “Smartness” will embed itself within individual products that are part of our daily lives. We will engage with smart products - learning from them, informing them, and communicating with them. Smart produc...
In his keynote at 19th Cloud Expo, Sheng Liang, co-founder and CEO of Rancher Labs, will discuss the technological advances and new business opportunities created by the rapid adoption of containers. With the success of Amazon Web Services (AWS) and various open source technologies used to build private clouds, cloud computing has become an essential component of IT strategy. However, users continue to face challenges in implementing clouds, as older technologies evolve and newer ones like Docke...
In his session at 19th Cloud Expo, Claude Remillard, Principal Program Manager in Developer Division at Microsoft, will contrast how his team used config as code and immutable patterns for continuous delivery of microservices and apps to the cloud. He will show the immutable patterns helps developers do away with most of the complexity of config as code-enabling scenarios such as rollback, zero downtime upgrades with far greater simplicity. He will also have live demos of building immutable pipe...