Cloud Front Group Capabilities Featured in Geospatial Intelligence Forum


The Geospatial Intelligence Forum highlighted the capabilities of the Cloud Front Group in an assessment titled “Analysis for Action.”

This piece highlights the importance of well-engineered systems that extract and move the right information.

Cloud Front Group is known for its work with the best available American technologies, including capabilities from Saratoga Data, Thetus, piXlogic, MetaCarta and others. Cloud Front Group has also built industry-leading capabilities to create knowledge over raw data and leverage existing enterprise capabilities to serve DoD and IC missions.

Highlights of the article include:

Today, UAVs alone are producing so much video feed that the intelligence community is finding it increasingly challenging to analyze and draw conclusions to determine “actionable intelligence.”

For one, there is the issue of knowing exactly where and when to look, especially since the space beneath an aircraft’s flight path can range from 25 to 100 square kilometers. With traditional EO/IR video, aircraft are limited to watching approximately 1 percent of that area with adequate resolution.
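The arithmetic behind that coverage gap is easy to sketch. The figures below come straight from the article (25 to 100 square kilometers, roughly 1 percent watchable); the helper function itself is purely illustrative:

```python
# Back-of-the-envelope illustration of the EO/IR coverage gap described above.
# Inputs are the article's figures; the function is a sketch, not a sensor model.

def watchable_area(footprint_km2: float, coverage_fraction: float = 0.01) -> float:
    """Area (km^2) watchable at adequate resolution, given the total footprint."""
    return footprint_km2 * coverage_fraction

for footprint in (25, 100):
    print(f"{footprint} km^2 footprint -> {watchable_area(footprint):.2f} km^2 watchable")
```

Even at the small end of the range, 99 percent of the ground beneath the flight path goes unwatched at useful resolution, which is the motivation for the automated triage discussed below.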

With the growing influx of data that is being produced by UAVs, it has become increasingly necessary to bring computers into the loop, noted Jonathan “Michael” Ehrlich, product manager, GEOINT Enterprise Solutions, ITT Exelis Geospatial Systems. “The quantity of data being generated requires computer systems not only to provide a means to collect, store and distribute the data, but also to analyze and catalog the video data for analysts and decision makers,” he said.

“Functionally, information overload is a real concern for operators. All relevant data input must be matched with support for interpreting, correlating, summarizing and visualizing the data against the existing knowledge base, transforming raw bits of data into actionable intelligence,” commented John Mackay, president and chief executive officer of Cloud Front Group.

Video is quickly becoming the most in-demand sensor intelligence on the battlefield, making the ability to transport and mine it a top priority. Video data has been increasing in a variety of ways, including the quantity of sensors and platforms, the types of sensors, and the resolution and frame rate of data acquired.

From Platform to Analyst

The increased adoption of UAVs has reduced the cost of data acquisition operations, allowing more frequent and longer duration missions to be executed. But the challenge with UAV video is to quickly and efficiently get it from the platform to the analyst, while also giving analysts the ability to quickly identify priority intelligence requirements in near real-time or later without having to watch hours of unchanging video.
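One simple way to spare analysts hours of unchanging video is to flag only the moments where the scene actually changes. The sketch below is not Cloud Front Group's implementation; it is a minimal frame-differencing filter, with frames reduced to flat lists of pixel intensities for illustration:

```python
# Minimal sketch of automated triage for long video feeds: compare successive
# frames and flag the indices where the scene changes more than a threshold.
# Mean absolute pixel difference stands in for a real motion-detection algorithm.

def changed_frames(frames, threshold=10.0):
    """Return indices of frames that differ noticeably from their predecessor.

    `frames` is a list of equal-length pixel-intensity lists.
    """
    flagged = []
    for i in range(1, len(frames)):
        diff = sum(abs(a - b) for a, b in zip(frames[i], frames[i - 1]))
        if diff / len(frames[i]) > threshold:
            flagged.append(i)
    return flagged

# A mostly static feed with one abrupt change at frame 3:
feed = [[0] * 4, [1] * 4, [0] * 4, [200] * 4, [200] * 4]
print(changed_frames(feed))  # only frame 3 stands out
```

An analyst reviewing this feed would be pointed at one segment instead of replaying all five frames, which is the essence of the triage problem the article describes.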

In addition, video requires significant bandwidth to deliver, which places demanding network requirements on real-time and tactical applications.

The Cloud Front Group has put together an integrated package of technologies to solve the problem of quickly disseminating relevant video data to tactical or real-time operators. “To achieve this goal, imagery object recognition and search software from piXlogic is used to scan captured video for notions of interest such as certain vehicles or people,” Mackay explained.

Once identified, the segment of video surrounding the identified notion is immediately routed to any subscribed operator using Flume, an advanced file transfer and synchronization software solution from Saratoga Data Systems.
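The dissemination pattern described above is essentially publish/subscribe: a detector flags a segment, and only operators subscribed to that object class receive it. The sketch below is a generic illustration under that assumption; the `Detection` and `Broker` names are hypothetical and do not reflect piXlogic's or Saratoga's actual interfaces:

```python
# Illustrative publish/subscribe routing of detected video segments.
# A flagged segment is pushed only to operators subscribed to its object class.

from collections import defaultdict
from dataclasses import dataclass

@dataclass
class Detection:
    object_class: str   # e.g. "vehicle" or "person"
    segment: bytes      # the clip surrounding the detection

class Broker:
    def __init__(self):
        self._subs = defaultdict(list)  # object_class -> subscriber callbacks

    def subscribe(self, object_class, callback):
        self._subs[object_class].append(callback)

    def publish(self, detection):
        for callback in self._subs[detection.object_class]:
            callback(detection)

broker = Broker()
received = []
broker.subscribe("vehicle", received.append)
broker.publish(Detection("vehicle", b"clip-001"))  # delivered to subscriber
broker.publish(Detection("person", b"clip-002"))   # no subscriber; not routed
print(len(received))  # 1
```

The point of the pattern is that the full feed never has to leave the platform in real time; only the small, relevant segments do.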

“The entire sensor feed continues to be recorded and can be downloaded once its mission completes, but real-time operations can be positively affected by this dissemination of relevant segments, requiring much less bandwidth and providing more resilience to network challenges than actual video streaming,” he said.

The ability to collect, store and distribute metadata, as well as tag or mark events or sequences of interest, greatly assists in working with the large quantity of video feeds collected.
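A concrete way to picture that metadata layer is a per-segment catalog record carrying time, geolocation and tags, so segments can be searched without replaying video. The schema below is an assumption for illustration, not a documented format:

```python
# Illustrative per-segment metadata record: once segments are tagged,
# they can be queried by tag instead of replayed end to end.

from dataclasses import dataclass, field

@dataclass
class SegmentRecord:
    feed_id: str
    start_s: float          # segment start, seconds into the feed
    end_s: float
    lat: float              # approximate ground location
    lon: float
    tags: set = field(default_factory=set)

catalog = [
    SegmentRecord("uav-7", 120.0, 135.0, 34.5, 69.2, {"vehicle"}),
    SegmentRecord("uav-7", 300.0, 310.0, 34.6, 69.1, {"person", "vehicle"}),
]

def find(catalog, tag):
    """Return every cataloged segment carrying the given tag."""
    return [r for r in catalog if tag in r.tags]

print([r.start_s for r in find(catalog, "person")])  # [300.0]
```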

Wide area motion imagery (WAMI) and other large volume data sources require an even larger number of analysts to monitor activities if the current paradigm is extended. WAMI, unlike full motion video (FMV), is high-resolution imagery over large ground footprint areas for long periods of time, allowing persistent surveillance over city-scale regions, and enabling intelligence to be gathered from motion patterns and locations of many simultaneous targets over a large region of interest.

For this concept of operations to work, the video processing must occur as close to real time as possible. This temporal proximity requirement suggests that the video processing needs to be physically close to the collection device in order to reduce transmission delay. The reliability of such recognition improves with the quality of the imagery being processed, which generally translates to being closer to the capture source so that compression, recoding and transmission do not degrade the data. Hence, the video processing system must be deployed on the same local area network as the sensor itself.

“Using our UAV example, the UAV should have on-board object recognition processing as well as video capture capabilities,” Mackay described.

The information provided by the alert must include the sensor data that caused the alert. This video segment then needs to be delivered with the highest-possible resolution to ensure operators can interpret the video accurately and make the best recommendation for the success of the mission.

Given the network challenges in tactical environments, an efficient compression and transmission protocol must be leveraged to provide the maximum possible video resolution over the available network conditions. Such a protocol must be resilient against network latency, intermittency, and the error rates common in tactical conditions so that missions can rely on the alerts being delivered.
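Flume's protocol is proprietary, so as a generic illustration of the resilience requirement, the sketch below sends a file in chunks over a simulated lossy link and retransmits each chunk until it is acknowledged, so a dropped packet costs one chunk rather than the whole transfer:

```python
# Generic sketch of a loss-tolerant chunked transfer (not Flume's protocol).
# Each chunk is retried until acknowledged, so intermittency delays delivery
# rather than corrupting or aborting it.

import random

def send_file(data, chunk_size, link_loss=0.3, seed=0):
    """Send `data` over a link that drops each transmission with probability
    `link_loss`; retry per chunk until acknowledged. Returns (received, attempts)."""
    rng = random.Random(seed)       # deterministic "lossy link" for the demo
    received = bytearray()
    attempts = 0
    for offset in range(0, len(data), chunk_size):
        chunk = data[offset:offset + chunk_size]
        while True:
            attempts += 1
            if rng.random() >= link_loss:   # chunk delivered and acknowledged
                received.extend(chunk)
                break                       # move on; otherwise retransmit
    return bytes(received), attempts

out, attempts = send_file(b"sensor-alert-segment", chunk_size=4)
print(out == b"sensor-alert-segment", attempts)
```

Real tactical protocols add congestion handling, reordering and integrity checks on top of this idea; the sketch only shows why per-chunk acknowledgment makes delivery reliable on a bad link.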

“To address this technology challenge we use Saratoga Data Systems’ Flume, a 100 percent software solution to file transfers,” Mackay reported. “In recent testing by the Air Force and Army, Flume has shown significant improvement over standard network file transfer protocols.”


For more see: “Analysis for Action.”




More Stories By Bob Gourley

Bob Gourley, former CTO of the Defense Intelligence Agency (DIA), is Founder and CTO of Crucial Point LLC, a technology research and advisory firm providing fact-based technology reviews in support of venture capital, private equity and emerging technology firms. He has extensive industry experience in intelligence and security and was awarded an intelligence community meritorious achievement award by AFCEA in 2008. He has also been recognized as an InfoWorld Top 25 CTO and as one of the most fascinating communicators in Government IT by GovFresh.
