The Next Big Thing: WeeData

‘Big Data’ has a problem, and that problem is its name.

Dig deep into the big data ecosystem, or spend any time at all talking with its practitioners, and you will quickly start hitting the Vs. Initially Volume, Velocity and Variety, the Vs rapidly bred like rabbits, and we now have a plethora of new V-words, including Value, Veracity, and more. Every new presentation on big data, it seems, feels obligated to add a V to the pile.

Gartner stands by the original three, stating earlier this year that:

Big Data are high-volume, high-velocity, and/or high-variety information assets that require new forms of processing to enable enhanced decision making, insight discovery and process optimization.

(my emphasis)

But by latching onto the ‘big’ part of the name, and reinforcing it with the ‘volume’ V, we become distracted and risk missing the point. The implication from an entire industry is that size matters: bigger is better. If you don’t collect everything, you’re woefully out of touch. And if you’re not counting in petas, exas, zettas or yottas, how on earth do you live with the shame?

From the outset, though, size was only part of the picture. Streams of data from social networks, traffic management systems or stock control processes raise a lot of challenges because of the speed with which data must be ingested, or the rapidity with which actionable decisions must be taken. Data volumes may only be a few gigabytes or – oh, the embarrassment – megabytes, but the challenge is still very real. Combining data of different types from disparate sources also creates opportunities. Video from traffic cameras, combined with the logs of pressure sensors beneath the roads, weather data from forecasters and historic records, and social networking comments about the smoothness (or otherwise) of the morning commute build a rich picture that no single source can provide. The initial data sets may be large, but the subset that is actually relevant to the problem being studied will often be far smaller. Variety is the challenge here, not volume.

Of all the Vs clamouring to join the initial three, the most compelling to me must surely be Value. What is the value of the insight offered by this data, regardless of how big it is, how fast it’s coming at me, or how many formats it comprises?

The correct interpretation of the right analysis, performed on the optimal set of data: surely that’s what we should celebrate? Sometimes, certainly, that analysis will be performed on mind-bogglingly huge data stores. But sometimes it won’t, and the results can be just as valuable. Is it better to collect everything and then extract the gigabyte or two of data you actually need, or does it make more sense to know what you’re interested in and devise a collection strategy that gets you the right data in the first place?

Big Data is impressive stuff. This fledgling market segment is full of interesting companies doing remarkable things. The ability to hold genomes, city traffic systems, internet search logs, or complex financial models in memory and to manipulate them in order to derive insight in real time is stunning and transformative. The switch from highly structured relational databases to schema-less data stores creates new opportunities, and new challenges.

Let’s celebrate the value of data volume when it’s appropriate to do so, but let’s not create the patently false impression that big is best.

Ben Kepes and I jokingly used Twitter to announce a new company at last year’s Defrag. WeeData (‘wee’ as in small, not the other kind) was to concern itself with everything that big data’s champions did not, and that’s a very large market indeed. We’re hoping that the people who rushed to sign up as customers, investors, and Board members got the joke…

More Stories By Paul Miller

Paul Miller works at the interface between the worlds of Cloud Computing and the Semantic Web, providing the insights that enable you to exploit the next wave as we approach the World Wide Database. He blogs at www.cloudofdata.com.
