By Bob Gourley
April 5, 2011 12:30 PM EDT
As part of federal CIO Vivek Kundra’s 25-point plan to reform federal IT management, announced last December, federal agencies must adopt a “cloud-first” policy that requires them to move three applications to the cloud over the next 12 to 18 months. Agencies must identify the three “must move” services within three months, move one of those services to the cloud within 12 months, and move the remaining two within 18 months.
This cloud-first policy and its incremental approach to adoption will make IT reform real and should yield large savings, on the order of 30 to 50 percent, in federal IT budgets. One specific and measurable goal in the plan calls for reducing the number of government data centers from the current 2,094 to fewer than 800 by 2015. Fifty percent of government agencies are already moving to private clouds, but to realize the full potential of the cloud the government needs to move from many small clouds to fewer large, shared clouds. Of course, federal acquisition policies and authorities must be modified before agencies can fully embrace this strategy. The Federal Risk and Authorization Management Program (FedRAMP) begins to address this issue.
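For scale, the consolidation target amounts to cutting the data center count by well over half. A quick back-of-the-envelope calculation (figures from the plan; the percentage is our derivation, not stated in the plan itself):

```python
# The plan's data-center consolidation goal, expressed as a percentage cut.
current, target = 2_094, 800
reduction = (current - target) / current * 100
print(f"{reduction:.0f}%")  # roughly a 62% reduction by 2015
```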
FedRAMP provides joint authorizations and continuous security monitoring services for government and commercial cloud computing systems intended for multi-agency use. Joint authorization of cloud providers results in a common security risk model that can be leveraged across the federal government. This common model provides a consistent baseline for cloud-based technologies and ensures that their benefits are effectively integrated across the various cloud computing solutions currently proposed within the government. The risk model also enables the government to “approve once, and use often” by ensuring multiple agencies gain the benefit and insight of a FedRAMP authorization and access to service providers’ authorization packages.
There are still many challenges that federal agencies need to work out with the cloud (data sovereignty, privacy and security, funding models, and so on), but it is clear that the cloud model will allow government to operate more efficiently and effectively. Nonetheless, there persists the nagging perception that the cloud is inherently unsafe. Government agencies are uncomfortable handing over control of their data to other agencies, vendors, or third parties. They are right to be concerned: reported cyber attacks against federal systems increased by 39 percent during the last fiscal year compared with the year before, according to the annual report on agency implementation of the Federal Information Security Management Act (FISMA). The report, posted online last month by the Office of Management and Budget (the FY2010 FISMA Report), finds that federal agencies reported 41,776 cyber incidents during fiscal 2010, up from close to 30,000 in 2009.
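The two incident counts and the 39 percent figure are consistent with each other, as a quick sanity check shows (the FY2009 number is approximate, per the report):

```python
# Rough check of the reported year-over-year growth in cyber incidents.
fy2009 = 30_000   # "close to 30,000" incidents reported in FY2009
fy2010 = 41_776   # incidents reported in FY2010 per the FISMA report
pct_increase = (fy2010 - fy2009) / fy2009 * 100
print(f"{pct_increase:.0f}%")  # about 39%, matching the reported figure
```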
Despite the grim outlook, we believe the security of the federal enterprise, as well as its functionality, can be significantly enhanced by smartly implementing cloud computing. The following are some key principles that can facilitate this:
- The importance of mission-focused engineering. Private clouds inside the federal enterprise can enhance mission support, but mission-focused engineering should be a first step in this pursuit.
- The continual need for security, including data confidentiality, integrity, and availability. All federal computing approaches must be engineered in full consonance with information assurance (IA) guidelines to protect federal information, information systems, and information infrastructure. Cloud computing, when engineered right, dramatically improves the mission assurance posture of the federal enterprise. It enables stronger endpoint security and better data protection, and it enables the use of thin clients and the many security benefits they provide. Identity management and encryption remain of critical importance.
- The need for cloud data backups that are always and instantly available. Ensured availability under all circumstances is a key benefit of smart cloud computing approaches.
- The continual need for open source and open standards. Most cloud infrastructure today is based on open source (Linux, Solaris, MySQL, GlassFish, Hadoop), and this positive trend will help net-centric approaches. According to IDC, open source software (OSS) is “the most significant, all-encompassing and long-term trend that the software industry has seen since the early 1980s.” Gartner projects that by 2012, 90 percent of the world’s companies will be using open source software. All of this indicates that open source and open standards should be a key principle for federal cloud computing and other net-centric approaches.
- The continual need to evaluate both the barrier to entry and the barrier to exit. As approaches to cloud computing are evaluated, the cost of exiting an approach is too frequently left unconsidered, locking agencies into a capability that may soon be inefficient. Cloud computing capabilities should be adopted in ways that do not result in lock-in.
- The need for open standards. Cloud computing enhances functionality for the federal workforce and increases interoperability when the code, APIs, and interfaces for cloud computing are secure yet widely published for all participants to build against. Federal involvement in open source and open standards communities should continue and accelerate, since cloud computing open standards are increasingly being discussed and designed by bodies such as the W3C, OASIS, the IETF, and the Liberty Alliance. Documents and other formats used by federal cloud computing activities should be open and available to all authorized users on all devices.
- The need to understand the cost of “private clouds.” For at least the near term, the federal government will remain a provider of “private cloud” capabilities where security dictates owning levels of control over compute power. This means the federal enterprise must continually engineer for change and technology insertion, which underscores the need for low barriers to exit in design criteria.
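The low-barrier-to-exit principle above has a concrete engineering expression: application code should depend on a provider-neutral interface, so that changing cloud providers means writing one new adapter rather than rewriting every caller. A minimal sketch of the pattern (all names here are hypothetical, not any vendor's actual API):

```python
from abc import ABC, abstractmethod


class ObjectStore(ABC):
    """Hypothetical provider-neutral storage interface."""

    @abstractmethod
    def put(self, key: str, data: bytes) -> None: ...

    @abstractmethod
    def get(self, key: str) -> bytes: ...


class InMemoryStore(ObjectStore):
    # Stand-in for any one provider; a real adapter would wrap a vendor SDK.
    def __init__(self) -> None:
        self._objects: dict[str, bytes] = {}

    def put(self, key: str, data: bytes) -> None:
        self._objects[key] = data

    def get(self, key: str) -> bytes:
        return self._objects[key]


def archive_report(store: ObjectStore, name: str, body: bytes) -> None:
    # Application code depends only on the abstract interface, so swapping
    # providers touches the adapter layer, not the mission logic.
    store.put(f"reports/{name}", body)
```

An agency evaluating two cloud offerings could then price an exit as the cost of one adapter class, which is exactly the kind of exit cost the principle asks acquisitions to weigh up front.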
Regarding security, cloud computing holds the potential to dramatically change the continual losing game of workstation patching and IT device remediation by reducing the number of applications on desktops and changing the desktop device from fat client to thin client. Devices can now have their entire memory and operating system flashed out to them from private clouds, with the power of the cloud presented to users as if they were on an old-fashioned desktop. This can be done in a way that never requires IT departments to visit the workstation to patch and configure it. And since all data is stored in private clouds, it can be encrypted, with access provided only to authorized users. No data need be lost when laptops are stolen, and no data need be exposed when desktops are attacked by unauthorized users. Well-engineered use of cloud computing with thin clients, or with smart fat clients, dramatically enhances security.
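The encrypt-in-the-cloud idea rests on two properties: confidentiality (a stolen device or blob reveals nothing without the key) and integrity (tampering is detected before decryption). A toy, standard-library-only sketch of authenticated encryption shows the shape of both checks; this is illustrative only, and a production system would use a vetted library and NIST-approved algorithms rather than this hash-based construction:

```python
import hashlib
import hmac
import secrets


def _keystream(key: bytes, nonce: bytes, length: int) -> bytes:
    # Derive a pseudorandom keystream by hashing key + nonce + counter
    # (CTR-style). Illustrative only; not a vetted cipher.
    out, counter = b"", 0
    while len(out) < length:
        out += hashlib.sha256(key + nonce + counter.to_bytes(8, "big")).digest()
        counter += 1
    return out[:length]


def encrypt(key: bytes, plaintext: bytes) -> bytes:
    nonce = secrets.token_bytes(16)  # fresh per message, so ciphertexts differ
    ct = bytes(a ^ b for a, b in zip(plaintext, _keystream(key, nonce, len(plaintext))))
    tag = hmac.new(key, nonce + ct, hashlib.sha256).digest()  # integrity tag
    return nonce + ct + tag


def decrypt(key: bytes, blob: bytes) -> bytes:
    nonce, ct, tag = blob[:16], blob[16:-32], blob[-32:]
    expected = hmac.new(key, nonce + ct, hashlib.sha256).digest()
    if not hmac.compare_digest(tag, expected):
        raise ValueError("integrity check failed: wrong key or tampered data")
    return bytes(a ^ b for a, b in zip(ct, _keystream(key, nonce, len(ct))))
```

With data stored only in this encrypted form in the private cloud, a stolen laptop holds no usable plaintext, and any modification of the stored blob fails the integrity check instead of silently corrupting mission data.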
This all leads to a key conclusion for the federal enterprise: as we move forward with cloud computing in support of the mission, the federal enterprise should continue to strengthen formal processes that ensure lessons learned from industry and from the government’s own successful cloud computing initiatives are continually examined and broadly adopted across the enterprise.
Crucial Point associates Dillon Behr, Alex Olesker, Bob Gourley and Chris Barnes contributed to this post.