
Veracode State of Software Security Report Finds That Despite Increasing Global Security Risks From Third-Party Applications, Most Enterprises Still Lack Formal Programs To Secure Them

Veracode, Inc., the leader in cloud-based application security testing, today released the second feature supplement of its annual State of Software Security Report (SoSS), for the first time analyzing third-party software security testing metrics. The data indicates that despite increasing security risks from third-party and externally developed software, few enterprises currently have formal testing programs in place. However, there are signs that more organizations are beginning to recognize and address the security risks associated with externally developed applications.

“The widespread adoption of third-party apps and use of external developers in enterprises brings increased risk,” said Chris Eng, vice president of research, Veracode. “In fact, a typical enterprise has an average of 600 mission-critical applications, about 65 percent of which are developed externally, leaving companies increasingly vulnerable to the security risks found in these apps. We are beginning to see signs that enterprises are recognizing and addressing these risks. However, organizations still assume too much risk when trusting their third-party software suppliers to develop applications that meet industry and organizational standards. There is still much more work to be done to adequately secure the software supply chain.”

The supplement found that some of the most dangerous security flaws in existence, such as SQL injection and Cross-Site Scripting, are among the most prevalent vulnerabilities in third-party vendor applications. The report also showed that while a programmatic approach to software security testing can greatly help enterprises and their vendors mitigate these flaws, few organizations have formal programs in place to manage and secure the software supply chain.
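As background for readers unfamiliar with these flaw categories (and not drawn from the report itself), the sketch below shows the standard remediations in generic Java: a parameterized query for SQL injection and HTML output encoding for cross-site scripting. The class, method, and table names are hypothetical.

```java
import java.sql.Connection;
import java.sql.PreparedStatement;
import java.sql.ResultSet;
import java.sql.SQLException;

public class RemediationExamples {

    // SQL injection: never concatenate untrusted input into a query string.
    // A parameterized query keeps the input in the data channel.
    public static ResultSet findUser(Connection conn, String username) throws SQLException {
        // Vulnerable pattern, shown only for contrast:
        //   String sql = "SELECT id, name FROM users WHERE name = '" + username + "'";
        PreparedStatement stmt =
                conn.prepareStatement("SELECT id, name FROM users WHERE name = ?");
        stmt.setString(1, username);   // input is bound, never interpreted as SQL
        return stmt.executeQuery();
    }

    // Cross-site scripting: encode untrusted data before writing it into HTML.
    public static String toHtmlText(String untrusted) {
        StringBuilder sb = new StringBuilder(untrusted.length());
        for (char c : untrusted.toCharArray()) {
            switch (c) {
                case '<':  sb.append("&lt;");   break;
                case '>':  sb.append("&gt;");   break;
                case '&':  sb.append("&amp;");  break;
                case '"':  sb.append("&quot;"); break;
                case '\'': sb.append("&#x27;"); break;
                default:   sb.append(c);
            }
        }
        return sb.toString();
    }
}
```

Static and dynamic testing of the kind the report describes flags the concatenation pattern shown in the comment; the parameterized form and the output encoder keep untrusted input out of the query and the rendered page, respectively.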

Key findings of the report include:

  • Currently, few enterprises have vendor application security testing programs in place, but the volume of assessments within organizations is growing
    • Fewer than one in five enterprises have requested a code-level security test from at least one vendor
    • However, the volume of vendor-supplied software or application assessments continues to grow with a 49 percent increase from the first quarter of 2011 to the second quarter of 2012
  • There is a gap between enterprise-standard and industry-standard compliance
    • 38 percent of vendor-supplied applications complied with enterprise-defined policies, versus 10 percent with the OWASP Top Ten and 30 percent with the CWE/SANS Top 25 industry-defined standards (a simple illustration of this kind of policy check appears after this list).
  • Some of the most dangerous vulnerabilities in vendor applications are also the most prevalent
    • Four of the top five flaw categories for web applications are also among the OWASP Top Ten most dangerous flaws, and five of the top six flaw categories for non-web applications appear on the CWE/SANS Top 25 list of most dangerous flaws.
    • SQL injection and cross-site scripting affect 40 percent and 71 percent of vendor-supplied web application versions, respectively.
    • Only 10 percent of applications tested complied with the OWASP Top Ten list and 30 percent with the CWE/SANS Top 25 industry standards
  • With 62 percent of applications failing to reach compliance on first submission, procedures for managing non-compliant applications are an important aspect of an enterprise’s security policy
    • 11 percent of vendors resubmitted new versions of applications for testing but remained out of compliance with enterprise policies
  • Structured testing programs promote higher participation
    • Enterprises that relied on an ad hoc approach when requesting application security testing averaged four participating vendors, whereas enterprises with a structured approach had much higher levels of success, averaging participation from 38 vendors.
    • Enterprises with structured programs enabled more vendors to achieve compliance quickly, with 45 percent of vendor applications becoming compliant within one week.
    • By contrast, enterprises with an ad hoc program saw only 28 percent of third-party applications achieve compliance within one week.
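To make the compliance comparison above concrete, the sketch below shows, in generic Java, how findings identified by CWE ID might be evaluated against an enterprise-defined severity policy versus an OWASP Top Ten-style category blocklist. It is purely illustrative: the record type, the CWE-to-category mapping, and the thresholds are hypothetical and do not represent the Veracode Platform or its API.

```java
import java.util.List;
import java.util.Set;

public class PolicyCheck {

    // Hypothetical flaw record: a CWE identifier plus a severity score (1-5).
    record Flaw(int cweId, int severity) { }

    // Example CWE IDs often associated with OWASP Top Ten categories
    // (89 = SQL injection, 79 = cross-site scripting); illustrative only.
    static final Set<Integer> OWASP_STYLE_BLOCKLIST = Set.of(89, 79, 352, 22, 287);

    // An enterprise-defined policy might gate on severity rather than category.
    static boolean meetsEnterprisePolicy(List<Flaw> flaws, int maxAllowedSeverity) {
        return flaws.stream().allMatch(f -> f.severity() <= maxAllowedSeverity);
    }

    // An industry-style policy might reject any flaw in a blocked category.
    static boolean meetsOwaspStylePolicy(List<Flaw> flaws) {
        return flaws.stream().noneMatch(f -> OWASP_STYLE_BLOCKLIST.contains(f.cweId()));
    }

    public static void main(String[] args) {
        List<Flaw> findings = List.of(new Flaw(89, 5), new Flaw(200, 2));
        System.out.println("Enterprise policy (severity <= 3): "
                + meetsEnterprisePolicy(findings, 3));
        System.out.println("OWASP-style policy: " + meetsOwaspStylePolicy(findings));
    }
}
```

The distinction matters because, as the data above shows, an application can satisfy a lenient enterprise-defined policy while still failing an industry-defined standard such as the OWASP Top Ten.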

“Today, every organization is an extended enterprise, with third-party software a fundamental layer in the software supply chain,” said Wendy Nather, research director, 451 Research. “It’s critical that organizations develop security policies when purchasing software from outside vendors because of the risks inherent in using third-party applications, yet few are actually demanding security compliance of their suppliers.”

Report Methodology

This study, Enterprise Testing of the Software Supply Chain, captures data collected from 939 application versions (across 564 distinct applications) submitted to the Veracode Platform during an 18-month period from January 2011 to June 2012. The data comes from actual security analysis of web and non-web applications across industry verticals, languages and platforms, and represents multiple security testing methodologies applied to a wide range of application types and programming languages.

This report assesses the security of software purchased from vendors, examines participation in vendor application security testing programs, and evaluates how different programs affect vendor compliance with application security policies.

Download the Report

Veracode’s report, Enterprise Testing of the Software Supply Chain, examines additional software security topics in the context of application security trends, including the industries that most commonly employ assessment programs, the factors that affect time to compliance, and the types of security programs that enterprises implement. For complete findings, download a copy of the report at: https://info.veracode.com/vast-soss.html.

About Veracode

Veracode is the only independent provider of cloud-based application intelligence and security verification services. The Veracode platform provides the fastest, most comprehensive solution to improve the security of internally developed, purchased or outsourced software applications and third-party components. By combining patented static, dynamic and manual testing, extensive eLearning capabilities, and advanced application analytics, Veracode enables scalable, policy-driven application risk management programs that help identify and eradicate numerous vulnerabilities by leveraging best-in-class technologies from vulnerability scanning to penetration testing and static code analysis. Veracode delivers unbiased proof of application security to stakeholders across the software supply chain while supporting independent audit and compliance requirements for all applications no matter how they are deployed, via the web, mobile or in the cloud. Veracode works with customers in more than 80 countries worldwide representing Global 2000 brands. For more information, visit www.veracode.com, follow on Twitter: @Veracode or read the Veracode Blog.

