S3 Encryption With Porticor

Your Porticor appliance can encrypt data being stored to Amazon’s Simple Storage Service (S3). Porticor is the only system available that offers the convenience of cloud-based hosted key management without sacrificing trust by requiring someone else to manage the keys. Porticor’s split-key encryption technology guarantees that keys remain under customer control and are never exposed in storage; and with homomorphic key encryption, the keys stay protected even while they are in use.

A variety of applications use S3 to store bulk data, and we support two different ways of enabling Porticor encryption with them.

  1. The first and recommended alternative is to configure the application so that data is written to and read from the Porticor appliance.
  2. Where this is not an option, you can instead configure host-to-IP mapping so that the application writing to s3.amazonaws.com actually writes to the appliance, which encrypts and forwards the data to S3.

These two options set up your environment differently, but the end result is the same: your data is encrypted when written to S3 and decrypted when read from it, with no need to change the client application itself.

What follows are detailed instructions for both configuration options.

Alternative 1: Explicit Configuration

How this alternative is set up depends on the individual application; in fact, some S3 clients are hardwired to the Amazon server addresses and cannot be reconfigured. For these clients you must use Alternative 2.

In this alternative you configure the S3 client to connect to a special DNS address for each of your buckets and for the main S3 endpoint. These DNS addresses are installed automatically when you add your S3 “buckets” into the table on the S3 Encryption page. All buckets that are to be encrypted must be listed. If your bucket is called mylittlebucket, it will be mapped to the DNS name mylittlebucket.d.porticor.net.
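As a quick sanity check (not part of the configuration itself), you can verify that a bucket’s Porticor address resolves before reconfiguring any clients. For example, in Python:

import socket

bucket = "mylittlebucket"              # the example bucket from the text
host = bucket + ".d.porticor.net"

try:
    print(host, "resolves to", socket.gethostbyname(host))
except socket.gaierror:
    print(host, "does not resolve yet; check the bucket table on the S3 Encryption page")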

The exact configuration depends on the particular client. Two examples follow.

Configuring s3cmd

Edit the file $HOME/.s3cfg, and replace the host_base and host_bucket lines as below, where the long host_base value is your appliance’s address.

# old:
# host_base = s3.amazonaws.com
# host_bucket = %(bucket)s.s3.amazonaws.com

# new:
host_base = itbetb19zy3-pzjy3yty2zw.d.porticor.net
host_bucket = %(bucket)s.d.porticor.net

Configuring S3QL

Edit $HOME/.s3ql/authinfo2, and add a new section:

[porticor]
storage-url: s3c://itbetb19zy3-pzjy3yty2zw.d.porticor.net/bucket-name/
backend-login: aws-key-id
backend-password: aws-secret-key

To create and mount the file system, run:

# create the file system (unencrypted at the S3QL level; the appliance handles encryption)
mkfs.s3ql --plain --ssl s3c://itbetb19zy3-pzjy3yty2zw.d.porticor.net/bucket-name/

# mount it
mount.s3ql --ssl s3c://itbetb19zy3-pzjy3yty2zw.d.porticor.net/bucket-name/ /mnt/cloud-drive

Note the use of the “s3c” URL scheme, which directs S3QL at the appliance rather than straight at Amazon. Both the --plain and --ssl flags are mandatory.
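Other clients built on the AWS SDKs can usually be pointed at the appliance in the same way, by overriding the S3 endpoint. The following is only a sketch using Python’s boto3, assuming the appliance accepts path-style requests in the same form as the s3c URL above; the appliance address and bucket name are the example values used earlier.

import boto3
from botocore.config import Config

# Example values: use the appliance address shown on your S3 Encryption page
# and one of the buckets you listed there.
APPLIANCE = "https://itbetb19zy3-pzjy3yty2zw.d.porticor.net"
BUCKET = "mylittlebucket"

# Path-style addressing keeps the bucket name in the URL path, mirroring the
# s3c://<appliance>/<bucket>/ form used by S3QL above. AWS credentials are
# picked up from the usual boto3 sources (environment, config files, IAM role).
s3 = boto3.client(
    "s3",
    endpoint_url=APPLIANCE,
    config=Config(s3={"addressing_style": "path"}),
)

s3.put_object(Bucket=BUCKET, Key="hello.txt", Body=b"stored through the appliance")
print(s3.get_object(Bucket=BUCKET, Key="hello.txt")["Body"].read())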

Alternative 2: Host Mapping

With this alternative you configure the host on which your S3 client is running, so that s3.amazonaws.com and bucketname.s3.amazonaws.com are resolved to the IP address of the Porticor appliance. You should not fill in the Bucket table in this alternative.
On Linux, edit the file /etc/hosts and add the lines shown below.
On Windows, add the same lines to %windir%\system32\drivers\etc\hosts.

Appliance-IP bucketname.s3.amazonaws.com # replicate this line for each S3 bucket
Appliance-IP s3.amazonaws.com
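After editing the hosts file, you can confirm that the mapping is in effect before pointing any clients at it. A small sketch in Python (the appliance IP below is a placeholder; use your own):

import socket

APPLIANCE_IP = "10.0.0.5"  # placeholder: your appliance's IP address

for host in ("s3.amazonaws.com", "bucketname.s3.amazonaws.com"):
    resolved = socket.gethostbyname(host)
    print(host, "->", resolved, "(mapped)" if resolved == APPLIANCE_IP else "(NOT mapped)")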

Use the so-called “private” IP address (10.x.x.x) of the Porticor appliance for access from within the EC2 cloud. If you plan to access the appliance from outside the cloud, or even from another AWS region, you will need to use the “public” address instead. Both addresses are listed on the S3 Configuration page of the GUI.
All communication with the virtual appliance is SSL-protected. In Alternative 2, make sure that your project’s CA certificate is installed on the client machine; otherwise, your S3 client might refuse to connect.
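If your client happens to be a Python/boto3 application, one way to trust the project CA without touching the system trust store is the client’s verify option. A minimal sketch (the certificate path is a placeholder for wherever you saved the project CA certificate):

import boto3

# Placeholder path: point it at the project CA certificate from your Porticor appliance.
s3 = boto3.client("s3", verify="/etc/ssl/certs/porticor-project-ca.pem")
print([b["Name"] for b in s3.list_buckets()["Buckets"]])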

Additional Notes

  • You need to wait a few minutes after creating a new bucket before you start using it, so that it is recognized by all S3 servers.
  • Porticor does not support “mixed buckets” containing both protected and unprotected objects.
  • Bucket names must conform to the requirements for DNS names, as recommended by Amazon Web Services; in particular, they must not contain uppercase letters. A quick sanity check is sketched below.
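The following sketch approximates the DNS-name check; it is an illustration only, not the authoritative AWS rule set:

import re

# Rough approximation of the AWS DNS-compatible naming recommendation:
# 3-63 characters, lowercase letters, digits, dots and hyphens, starting and
# ending with a letter or digit.
BUCKET_RE = re.compile(r"^[a-z0-9][a-z0-9.-]{1,61}[a-z0-9]$")

def looks_dns_compatible(name: str) -> bool:
    return bool(BUCKET_RE.match(name)) and ".." not in name

print(looks_dns_compatible("mylittlebucket"))   # True
print(looks_dns_compatible("MyLittleBucket"))   # False: uppercase is not allowed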

