S3 Encryption With Porticor

Your Porticor appliance can encrypt data being stored to Amazon’s Simple Storage Service (S3). Porticor is the only system available that offers the convenience of cloud-based hosted key management without sacrificing trust by requiring someone else to manage the keys. Porticor’s split-key encryption technology protects keys and guarantees they remain under customer control and are never exposed in storage. With homomorphic key encryption, the keys are protected even while they are in use.

A variety of applications use S3 to store bulk data, and we support two different ways of enabling Porticor encryption with them.

  1. The first and recommended alternative is to configure the application so that data is written to and read from the Porticor appliance.
  2. Where this is not an option, you can instead configure host-to-IP mapping so that the application writing to s3.amazonaws.com actually writes to the appliance, which encrypts and forwards the data to S3.

These are two ways to set up your environment, but the end result is one and the same: your data is encrypted when written to S3 and decrypted when read back, with no need to change the client application.

What follows are detailed instructions for both configuration options.

Alternative 1: Explicit Configuration

Whether this alternative is available depends on the individual application; in fact, some S3 clients are hardwired to the Amazon server addresses and cannot be reconfigured. For these clients you must use Alternative 2.

In this alternative you configure the S3 client to connect to a special DNS address for each of your buckets and for the main S3 endpoint. These DNS addresses are installed automatically when you add your S3 “buckets” into the table on the S3 Encryption page. All buckets that are to be encrypted must be listed. If your bucket is called mylittlebucket, it will be mapped to the DNS name mylittlebucket.d.porticor.net.
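You can check the mapping before touching any client configuration. For example, with the hypothetical bucket name above, the mapped name should resolve (typically to your appliance’s address):

dig +short mylittlebucket.d.porticor.net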

The specifics of the configuration depend on the particular client. Two examples follow.

Configuring s3cmd

Edit the file $HOME/.s3cfg, and replace the host_base and host_bucket lines as below, where the long host_base value is your appliance’s address.

# old:
# host_base = s3.amazonaws.com
# host_bucket = %(bucket)s.s3.amazonaws.com

# new:
host_base = itbetb19zy3-pzjy3yty2zw.d.porticor.net
host_bucket = %(bucket)s.d.porticor.net
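After this change, s3cmd commands work as usual, with traffic transparently routed through the appliance. For example (mylittlebucket and backup.tar.gz are placeholders):

s3cmd put backup.tar.gz s3://mylittlebucket/backup.tar.gz
s3cmd ls s3://mylittlebucket/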

Configuring S3QL

Edit $HOME/.s3ql/authinfo2, and add a new section:

[porticor]
storage-url: s3c://itbetb19zy3-pzjy3yty2zw.d.porticor.net/bucket-name/
backend-login: aws-key-id
backend-password: aws-secret-key

To create the file system, run:

mkfs.s3ql --plain --ssl s3c://itbetb19zy3-pzjy3yty2zw.d.porticor.net/bucket-name/

mount.s3ql --ssl s3c://itbetb19zy3-pzjy3yty2zw.d.porticor.net/bucket-name/ /mnt/cloud-drive

Note the use of the “s3c” URL scheme, which matches the storage-url in authinfo2 and points S3QL at the appliance rather than directly at Amazon. Both the --ssl and --plain flags are mandatory: --plain disables S3QL’s own encryption layer, since the appliance already encrypts the data.
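Once mounted, the file system behaves like any local directory, with the appliance encrypting data on its way to S3. A quick sketch (backup.tar.gz is a placeholder):

cp backup.tar.gz /mnt/cloud-drive/
umount.s3ql /mnt/cloud-drive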

Alternative 2: Host Mapping

With this alternative you configure the host on which your S3 client runs, so that s3.amazonaws.com and bucketname.s3.amazonaws.com resolve to the IP address of the Porticor appliance. Do not fill in the Bucket table when using this alternative.
On Linux, edit the file /etc/hosts and add the lines shown below.
On Windows, add the same lines to %windir%\system32\drivers\etc\hosts (not lmhosts, which affects only NetBIOS name resolution).

Appliance-IP bucketname.s3.amazonaws.com # replicate this line for each S3 bucket
Appliance-IP s3.amazonaws.com
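On Linux you can confirm that the override took effect; getent consults /etc/hosts, so it should now report the appliance’s IP for the Amazon names:

getent hosts s3.amazonaws.com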

Use the so-called “private” IP address (10.x.x.x) of the Porticor appliance for access from within the EC2 cloud. If you plan to access the appliance from outside the cloud, or even from another AWS region, you will need to use the “public” address instead. Both addresses are listed on the S3 Configuration page of the GUI.
All communication with the virtual appliance is SSL-protected. In Alternative 2, make sure that your project’s CA certificate is installed on the client machine; otherwise your S3 client may refuse to connect.
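The installation procedure depends on the operating system. As a sketch, on Debian/Ubuntu-style Linux (porticor-ca.crt is a placeholder name for your project’s CA certificate file):

sudo cp porticor-ca.crt /usr/local/share/ca-certificates/
sudo update-ca-certificates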

Additional Notes

  • You need to wait a few minutes after the creation of a new bucket before starting to use it, so that it is recognized by all S3 servers.
  • Note that Porticor does not support “mixed buckets” containing both protected and unprotected objects.
  • Bucket names must conform to the requirements for DNS names, as recommended by Amazon Web Services; in particular, they must not contain uppercase letters.


