By Kumar Srivastava
August 16, 2013 08:30 AM EDT
When talking about Big Data, most people talk about numbers: speed of processing and how many terabytes and petabytes the platform can handle. But deriving deep insights with the potential to change business growth trajectories relies not just on quantity, processing power, and speed, but also on three key "ilities": portability, usability, and quality of the data.
Portability, usability, and quality converge to define how well the processing power of the Big Data platform can be harnessed to deliver consistent, high quality, dependable and predictable enterprise-grade insights.
- Portability: the ability to transport data and insights into and out of the system
- Usability: the ability to use the system to hypothesize, collaborate, analyze, and ultimately derive insights from data
- Quality: the ability to produce highly reliable and trustworthy insights from the system
Portability is measured by how easily data sources (or providers) as well as data and analytics consumers (the primary "actors" in a Big Data system) can send data to, and consume data from, the system.
Data Sources can be internal systems or data sets, external data, data providers, or the apps and APIs that generate your data. A measure of high portability is how easily data providers and producers can send data to your Big Data system as well as how effortlessly they can connect to the enterprise data system to deliver context.
Analytics consumers are the business users and developers who examine the data to uncover patterns. Consumers expect to be able to inspect their raw, intermediate, or output data, not only to define and design analyses but also to visualize and interpret results. A measure of high portability for data consumers is easy access, both manual and programmatic, to raw, intermediate, and processed data. Highly portable systems enable consumers to readily trigger analytical jobs and receive notification when data or insights are available for consumption.
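The portability contract described above can be sketched in miniature: producers push data in, consumers read any layer back out, trigger jobs, and subscribe to notifications. This is an illustrative in-memory model, not a real platform API; all names (`PortableDataStore`, the layer labels) are assumptions for the sketch.

```python
# Hypothetical sketch of a portable Big Data interface: ingest,
# layered read access, consumer-triggered jobs, and notifications.
class PortableDataStore:
    def __init__(self):
        self.layers = {"raw": [], "intermediate": [], "processed": []}
        self.subscribers = []

    def ingest(self, record):
        # Producer-side portability: easy to send data in.
        self.layers["raw"].append(record)

    def read(self, layer):
        # Consumer-side portability: easy access to any layer.
        return list(self.layers[layer])

    def subscribe(self, callback):
        # Consumers are notified when new insights are available.
        self.subscribers.append(callback)

    def run_job(self, transform):
        # Consumers can readily trigger analytical jobs.
        results = [transform(r) for r in self.layers["raw"]]
        self.layers["processed"].extend(results)
        for notify in self.subscribers:
            notify(len(results))
        return results

store = PortableDataStore()
store.subscribe(lambda n: print(f"{n} new insights available"))
store.ingest({"clicks": 10})
store.ingest({"clicks": 4})
totals = store.run_job(lambda r: r["clicks"] * 2)
print(totals)  # [20, 8]
```

A real system would expose the same four verbs over APIs and message queues rather than method calls, but the shape of the contract is the same.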
The usability of a Big Data system is the largest contributor to the perceived and actual value of that system. That's why enterprises need to consider if their Big Data analytics investment provides functionality that not only generates useful insights but also is easy to use.
Business users need an easy way to:
- Request analytics insights
- Explore data and generate hypotheses
- Self-serve and generate insights
- Collaborate with data scientists, developers, and business users
- Track and integrate insights into business critical systems, data apps, and strategic planning processes
Developers and data scientists need an easy way to:
- Define analytical jobs
- Collect, prepare, preprocess, and cleanse data for analysis
- Add context to their data sets
- Understand how, when, and where the data was created, how to interpret it, and who created it
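The last item in the list above is essentially a provenance record attached to each data set. A minimal sketch of such a record might look like the following; the field names are illustrative assumptions, not a standard metadata schema.

```python
# Illustrative provenance record capturing the "how, when, where,
# and who" context a data scientist needs before trusting a data set.
from dataclasses import dataclass, asdict
from datetime import datetime, timezone

@dataclass
class Provenance:
    dataset: str                 # which data set this describes
    created_by: str              # who (or what pipeline) created it
    created_at: str              # when it was created
    source_system: str           # where the data originated
    interpretation_notes: str    # how to interpret the rows

rec = Provenance(
    dataset="web_clickstream_2013_08",
    created_by="etl_pipeline_v2",
    created_at=datetime(2013, 8, 16, tzinfo=timezone.utc).isoformat(),
    source_system="apache_access_logs",
    interpretation_notes="one row per HTTP request; bot traffic not filtered",
)
print(asdict(rec))
```

Keeping this context alongside the data, rather than in a wiki or someone's head, is what makes it usable by the next analyst who picks up the data set.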
The quality of a Big Data system is dependent on the quality of input data streams, data processing jobs, and output delivery systems.
Input Quality: As the number, diversity, frequency, and format of data sources explode, it is critical that an enterprise-grade Big Data platform track the quality and consistency of its inputs. That tracking also drives downstream alerts to consumers about changes in the quality, volume, velocity, or configuration of their data streams.
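One minimal form of input-quality tracking is to compare each incoming batch against a baseline and flag volume drops or schema drift. The sketch below assumes a simple batch-of-dicts model; the 50% volume tolerance and field names are illustrative choices, not recommendations.

```python
# Minimal input-quality check: flag batches whose volume falls far
# below the baseline or whose records are missing expected fields.
def check_batch(batch, expected_fields, baseline_volume, tolerance=0.5):
    alerts = []
    # Volume check: alert if the batch is less than half the baseline.
    if len(batch) < baseline_volume * tolerance:
        alerts.append(f"volume drop: {len(batch)} vs baseline {baseline_volume}")
    # Schema check: alert on any record missing an expected field.
    for i, record in enumerate(batch):
        missing = expected_fields - record.keys()
        if missing:
            alerts.append(f"record {i} missing fields: {sorted(missing)}")
    return alerts

baseline = 1000
batch = [{"user": "a", "ts": 1}, {"user": "b"}]  # short batch, one bad record
alerts = check_batch(batch, {"user", "ts"}, baseline)
print(alerts)
```

In production, these alerts would feed the downstream notifications to consumers that the paragraph above describes, rather than just being printed.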
Analytical Job Quality: A Big Data system should track and notify users about the quality of the jobs (such as MapReduce or event-processing jobs) that process incoming data sets to produce intermediate or output data sets.
Output Quality: Quality checks on the outputs of a Big Data system ensure that transactional systems, users, and apps offer dependable, high-quality insights to their end users. Output needs to be checked for delivery predictability and statistical significance, and accessed within the constraints of the consuming transactional system.
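A concrete example of an output-quality gate is refusing to release an insight until the sample is large enough and the measured effect clears a significance test. The sketch below uses a two-sided z-test on a proportion; the choice of test, the sample-size floor, and the threshold are all assumptions for illustration.

```python
# Sketch of an output-quality gate: release an insight only when the
# sample size and statistical significance meet minimum standards.
import math

def release_insight(successes, trials, baseline_rate, min_n=100):
    if trials < min_n:
        return False, "insufficient sample"
    rate = successes / trials
    # Standard error of the proportion under the baseline rate.
    se = math.sqrt(baseline_rate * (1 - baseline_rate) / trials)
    z = (rate - baseline_rate) / se
    # 1.96 is the two-sided critical value at alpha = 0.05.
    significant = abs(z) > 1.96
    return significant, f"z = {z:.2f}"

ok, detail = release_insight(successes=130, trials=1000, baseline_rate=0.10)
print(ok, detail)  # True z = 3.16
```

Gating delivery this way keeps a noisy or underpowered result from reaching a transactional system that will act on it.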
Though we've explored how portability, usability, and quality separately influence the consistency, quality, dependability, and predictability of your data systems, remember it's the combination of the ilities that determines if your Big Data system will deliver actionable enterprise-grade insights.
This piece is the first in a three-part series on how businesses can squeeze maximum business value out of their Big Data analysis.