The Cloud and Cybersecurity

The cloud-first policy and the incremental approach to cloud adoption will make IT reform real

As part of federal CIO Vivek Kundra’s 25-point plan to reform federal IT management announced last December, federal agencies must adopt a “cloud-first” policy that requires them to move three applications to the “cloud” over the next 12 to 18 months. Agencies must identify the three “must move” services within three months, move one of those services to the cloud within 12 months and the remaining two within 18 months.

This cloud-first policy and the incremental approach to cloud adoption will make IT reform real and should result in huge (30-50%) savings in federal IT budgets. One specific and measurable goal laid out in the plan calls for a reduction in government data centers from the current 2,094 to fewer than 800 by 2015. Already 50 percent of government agencies are moving to private clouds, but to realize the full potential of the cloud, the government needs to move from many small clouds to fewer large, shared clouds. Of course, federal acquisition policies and authorities must be modified before agencies can fully embrace this strategy. The Federal Risk and Authorization Management Program (FedRAMP) begins to address this issue.

FedRAMP provides joint authorization and continuous security monitoring services for government and commercial cloud computing systems intended for multi-agency use. Joint authorization of cloud providers results in a common security risk model that can be leveraged across the federal government. The use of this common security risk model provides a consistent baseline for cloud-based technologies. This common baseline ensures that the benefits of cloud-based technologies are effectively integrated across the various cloud computing solutions currently proposed within the government. The risk model will also enable the government to “approve once, and use often” by ensuring multiple agencies gain the benefit and insight of the FedRAMP authorization and access to service providers’ authorization packages.

There are still a lot of challenges that federal agencies need to work out with the cloud (data sovereignty, privacy and security, funding models, etc.), but it is clear that the cloud model will allow government to operate more efficiently and effectively. Nonetheless, there persists the nagging perception that the cloud is inherently unsafe. Government agencies are uncomfortable handing over control of their data to other agencies, vendors or third parties. They are right to be concerned; reported cyber attacks against federal systems increased by 39 percent during the last fiscal year when compared to the year before, says an annual report on agency implementation of the Federal Information Security Management Act (FISMA). The report, posted online last month by the Office of Management and Budget (FY2010 FISMA Report), finds that federal agencies reported 41,776 cyber incidents during fiscal 2010. In 2009, agencies reported close to 30,000 incidents.
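The 39 percent figure follows from the two incident counts in the report. A quick sanity check (treating the FY2009 figure as a round 30,000, since the report only says “close to 30,000”):

```python
# Verify the year-over-year growth in reported cyber incidents.
fy2009_incidents = 30_000   # approximate; report says "close to 30,000"
fy2010_incidents = 41_776   # per the FY2010 FISMA report

growth = (fy2010_incidents - fy2009_incidents) / fy2009_incidents
print(f"Year-over-year growth: {growth:.0%}")  # prints "Year-over-year growth: 39%"
```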

Despite the grim outlook, we believe the security of the federal enterprise, as well as its functionality, can be significantly enhanced by smartly implementing cloud computing. The following are some key principles that can facilitate this:

  • The importance of mission-focused engineering. Private clouds inside the federal enterprise can enhance mission support, but mission-focused engineering should be a first step in this pursuit.
  • The continual need for security, including data confidentiality, integrity and availability. All federal computing approaches must be engineered to be in total consonance with IA guidelines to assure federal information, information systems and information infrastructure. Cloud Computing, when engineered right, makes dramatic, positive changes to the mission assurance posture of the federal enterprise. Cloud computing enables stronger end point security and better data protection. It also enables the use of thin clients and the many security benefits they provide. Identity management and encryption remain of critical importance.
  • The need for instantaneously available backup of data in the cloud. Ensured availability under all circumstances is a key benefit of smart cloud computing approaches.
  • The continual need for open source and open standards. Most cloud infrastructure today is based on open source (Linux, Solaris, MySQL, Glassfish, Hadoop) and this positive trend will help in net centric approaches. According to IDC, open source software (OSS) is “the most significant, all-encompassing and long-term trend that the software industry has seen since the early 1980s.” Gartner projects that by 2012, 90 percent of the world’s companies will be using open source software. This all indicates open source and open standards should be a key principle for federal cloud computing and other net centric approaches.
  • The continual need to evaluate both low barrier to entry and low barrier to exit. As approaches to cloud computing are evaluated, too frequently the cost of exiting an approach is not considered, resulting in lock-in to a capability that may soon be inefficient. Cloud computing capabilities should be adopted that do not result in lock-in.
  • The need for open standards. Cloud computing contributes to enhanced functionality for the federal workforce and increases interoperability, since the code, APIs and interfaces for cloud computing are secure yet widely published for all participants to interface with. Federal involvement in open source and open standards communities should continue and be accelerated, since increasingly cloud computing open standards are being discussed and designed by open standards bodies like W3C, OASIS, IETF and the Liberty Alliance. Document and other formats used by federal cloud computing activities will be open and available for all authorized users on all devices.
  • The need to understand the cost of “private clouds”. For at least the near term, the federal government will remain a provider of “private cloud” capabilities where security dictates ownership levels of control over compute power. This fact means the federal enterprise must continually engineer for change and technology insertion, which underscores the need for low barriers to exit in design criteria.
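The integrity requirement in the principles above can be illustrated with a minimal sketch: before data is handed to a cloud provider, the agency computes a keyed digest it can later use to verify that nothing was altered in storage or transit. This uses only Python’s standard library; the key handling is deliberately simplified and the key name is illustrative, not any agency’s actual practice.

```python
import hashlib
import hmac

def protect(data: bytes, key: bytes) -> bytes:
    """Compute a keyed digest (HMAC-SHA256) before storing data in the cloud."""
    return hmac.new(key, data, hashlib.sha256).digest()

def verify(data: bytes, key: bytes, tag: bytes) -> bool:
    """On retrieval, recompute the digest and compare in constant time."""
    return hmac.compare_digest(protect(data, key), tag)

key = b"agency-held-secret"   # hypothetical key retained by the agency, never stored with the data
record = b"mission data"
tag = protect(record, key)

assert verify(record, key, tag)                # untampered data passes
assert not verify(b"tampered data", key, tag)  # any modification is detected
```

Because the agency keeps the key, even a compromised provider cannot forge a valid digest; confidentiality would additionally require encryption, which this sketch does not attempt.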

Regarding security, cloud computing holds the potential to dramatically change the continual losing game of workstation patching and IT device remediation by reducing the number of applications on desktops and changing the nature of the desktop device from fat client to thin client. Devices can now have their entire memory and operating system flashed out to the device from private clouds and can have the power of the cloud presented to users as if the user is on an old fashioned desktop. This can be done in a way that never requires IT departments to visit the workstation to patch and configure it. And since all data is stored on private clouds it can be encrypted and access provided only to authorized users. No data can ever be lost when laptops are stolen and no data can ever be lost when desktops are attacked by unauthorized users. With well-engineered use of cloud computing and thin clients, or cloud computing and smart fat clients, security is dramatically enhanced.
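The access model described above — data held centrally and released only to authorized users — can be sketched in a few lines. This is a toy illustration, not any particular product or federal system; the class and names are hypothetical.

```python
class PrivateCloudStore:
    """Toy model of centrally held data released only to authorized users."""

    def __init__(self) -> None:
        self._blobs: dict[str, tuple[bytes, set[str]]] = {}

    def put(self, name: str, data: bytes, authorized: set[str]) -> None:
        # In a real private cloud, data would be encrypted at rest here.
        self._blobs[name] = (data, set(authorized))

    def get(self, name: str, user: str) -> bytes:
        data, authorized = self._blobs[name]
        if user not in authorized:
            raise PermissionError(f"{user} is not authorized for {name}")
        return data

store = PrivateCloudStore()
store.put("budget-report", b"fy2011 figures", authorized={"alice"})
print(store.get("budget-report", "alice"))   # authorized user retrieves the data
# store.get("budget-report", "mallory")      # would raise PermissionError
```

A stolen laptop holds nothing in this model: the data never leaves the store, and every retrieval is mediated by an authorization check.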

This all leads to a key conclusion for the federal enterprise: as we move forward in cloud computing for support to the mission, the federal enterprise should continue to strengthen formal processes to ensure that lessons learned from both industry and the government’s own successful cloud computing initiatives are continually examined and broadly adopted across the enterprise.

Crucial Point associates Dillon Behr, Alex Olesker, Bob Gourley and Chris Barnes contributed to this post.

This post sponsored by the Enterprise CIO Forum and HP.


More Stories By Bob Gourley

Bob Gourley writes on enterprise IT. He is a founder and partner at Cognitio Corp and publisher of CTOvision.com.
