CTOvision Big Data Reporting for 2012: CTOs want discipline in the language of sensemaking

By Bob Gourley

This special report provides insights from our reporting over the last 12 months, including summaries of our Government Big Data Newsletter (sign up for this weekly report at http://ctovision.com/newsletter-subscriptions).

Among the many Big Data themes we reported on in 2012, one seemed to resonate the most with our readers: all of us with a techie bent have realized that we need more discipline in our use of the term Big Data. We revisited this need for discipline in our post:

Big Data Defined for 2013: A definition that can help in your interaction with the IT community

In it we suggest everyone follow the lead of the TechAmerica Foundation in defining Big Data. At CTOvision we will use the term this way:

Big Data: A phenomenon defined by the rapid acceleration in the expanding volume of high velocity, complex and diverse types of data. Big Data is often defined along three dimensions: volume, velocity and variety.

Big Data Solutions: Advanced techniques and technologies to enable the capture, storage, distribution, management and analysis of information.

Early in the year we provided insights for program managers who want to get started with Big Data solutions. We gave quickstart tips on how you can stand up your own cluster in the cloud. We followed up with ways you can quickly use Whirr to automate that, along the lines of the sketch below.
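For readers who want to try that path, here is a minimal sketch of the Whirr approach. The cluster name, instance counts and credentials are placeholders, and the exact property and role names should be checked against the Apache Whirr documentation for the version you install.

    # hadoop.properties -- minimal Whirr recipe (placeholder values)
    whirr.cluster-name=myhadoopcluster
    whirr.instance-templates=1 hadoop-namenode+hadoop-jobtracker,3 hadoop-datanode+hadoop-tasktracker
    whirr.provider=aws-ec2
    whirr.identity=${env:AWS_ACCESS_KEY_ID}
    whirr.credential=${env:AWS_SECRET_ACCESS_KEY}
    whirr.private-key-file=${sys:user.home}/.ssh/id_rsa
    whirr.public-key-file=${sys:user.home}/.ssh/id_rsa.pub

    # launch the cluster, then tear it down when finished
    whirr launch-cluster --config hadoop.properties
    whirr destroy-cluster --config hadoop.properties

The point of the recipe is that the cluster topology lives in one small configuration file, so standing up and tearing down a test Hadoop cluster in the cloud becomes a repeatable, scriptable step rather than a manual build.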

Through the year we published several pieces on the ethical issues around Big Data, including a series by Kord Davis.

We reported extensively on new concepts for Big Data involving very large quantities of data in memory. Terracotta CEO Robin Gilthorpe, a leading expert in this field, provided his views on Big Data trends to watch in 2013 in a YouTube video we highlighted to our readers. His view is that requirements will drive the industry to several new highs, including dramatic social change. His five predictions for 2013 are:

  • Big Data will be fast data – Enterprises will profit from Big Data intelligence in proportion to how quickly they can act on it.
  • Rise of the hybrid cloud – It’s no longer about building your own platform; it’s more efficient to play in ecosystems.
  • CIOs and CMOs get a lot closer – Marketing spend on technology is about to eclipse IT spend on technology.
  • The Internet of Things crosses the chasm – In just a few years, over 25 billion data-producing devices will be connected.
  • Social becomes part of life’s fabric – Remember e-business departments? Social will permeate in the same way.

We also wrote about new approaches to the capture, storage, distribution and management of data, such as dispersed compute storage. Solutions like this from Cleversafe (see Cleversafe: How Does It Really Work?) are true game changers, delivering dramatic improvements to security and functionality while providing a quick return on investment.
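Cleversafe's actual information dispersal algorithms are proprietary erasure codes, so the toy sketch below only illustrates the general idea: an object is cut into slices plus redundancy, and the original can be rebuilt even if a slice is lost, without storing full copies. The single XOR parity slice here is an assumption made for brevity, not Cleversafe's real scheme.

    # Toy illustration of dispersed storage: split data into k slices and add
    # one XOR parity slice, so any single lost slice can be rebuilt from the
    # survivors. Real dispersal systems use stronger erasure codes that
    # tolerate the loss of several slices at once.
    from functools import reduce

    def disperse(data: bytes, k: int = 4):
        pad = (-len(data)) % k            # pad so data divides evenly into k slices
        data += b"\x00" * pad
        size = len(data) // k
        slices = [data[i * size:(i + 1) * size] for i in range(k)]
        parity = bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*slices))
        return slices + [parity], pad

    def rebuild(slices, missing_index):
        # XOR of all surviving slices reconstructs the missing one
        survivors = [s for i, s in enumerate(slices) if i != missing_index]
        return bytes(reduce(lambda a, b: a ^ b, col) for col in zip(*survivors))

    slices, pad = disperse(b"some object worth protecting")
    assert rebuild(slices, 2) == slices[2]   # slice 2 lost, rebuilt from the rest

The design appeal is that each node holds only a slice, so no single compromised or failed node exposes or loses the whole object, yet total storage is far less than full replication.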

We reported on many other firms associated with the fielding of high quality Big Data solutions into the federal enterprise, including MarkLogic, Oracle, Datameer, Cloudera, Terracotta, Cleversafe, Splunk, Kapow, Sitscape, CloudFrontGroup, ClearStory, and Thetus. These firms are fielding real, working solutions for Big Data, and we are sure we will be reporting more on them in 2013.

Another clear theme in our 2012 reporting on Big Data was the importance of mission focus. That is why we are all so excited about the new technical capabilities of Hadoop and related technologies: it is about impact to mission. Which leads to the Government Big Data Solutions Award.

Our reporting on Big Data for 2012 included announcing the results of the Government Big Data Solutions Award. The award was established to highlight innovative solutions and facilitate the exchange of best practices, lessons learned and creative ideas for addressing Big Data challenges. The Top Five Nominees of 2012 were evaluated on criteria that included:

  • Focus on current solutions: The ability to make a difference in government missions in the very near term was the most important evaluation factor.
  • Focus on government teams: Industry supporting government was also considered, but this is about government missions.
  • Consideration of new approaches: New business processes, techniques, tools, models for enhancing analysis are key.

The winner of the 2012 Government Big Data Solutions Award was the National Cancer Institute’s Frederick National Laboratory.

The NCI-funded Frederick National Laboratory has been using Big Data solutions in pioneering ways to support researchers working on complex challenges around the relationship between genes and cancers. In a recent example, they built infrastructure capable of cross-referencing the relationships between 17,000 genes and five major cancer subtypes across 20 million biomedical publication abstracts, and of cross-referencing TCGA gene expression data for a simulated 60 million patients and miRNA expression data for a simulated 900 million patients. The result: understanding additional layers of the pathways these genes operate in and the drugs that target them, which will help researchers accelerate their work in areas of importance for all humanity. This solution, based on the Oracle Big Data Appliance with the Cloudera Distribution of Apache Hadoop (CDH), leverages capabilities available from the Big Data community today in pioneering ways that can serve a broad range of researchers. The approach is repeatable across many other Big Data challenges in bioinformatics, making it worthy of selection for the 2012 Government Big Data Solutions Award.
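The laboratory's actual pipeline is not described in detail here, but the core cross-referencing idea scales naturally on a Hadoop platform like CDH. The sketch below is a purely hypothetical Hadoop Streaming mapper, with placeholder gene and subtype term lists, that emits a count every time a gene and a cancer subtype appear in the same abstract; a standard summing reducer would then produce the co-occurrence counts.

    #!/usr/bin/env python
    # Hypothetical Hadoop Streaming mapper: emit (gene, subtype) pairs that
    # co-occur in an abstract (one abstract per input line). Term lists are
    # placeholders, not the NCI lab's actual vocabularies.
    import sys

    GENES = {"brca1", "brca2", "tp53", "egfr"}                         # placeholder gene symbols
    SUBTYPES = {"breast", "ovarian", "lung", "colon", "glioblastoma"}  # placeholder subtypes

    for line in sys.stdin:
        words = set(line.lower().split())
        for gene in GENES & words:
            for subtype in SUBTYPES & words:
                print("%s:%s\t1" % (gene, subtype))                    # key<TAB>1 for the reducer

Run under Hadoop Streaming on a CDH cluster, the same logic fans out across millions of abstracts; in practice the hard parts are the curated vocabularies and the downstream joins against expression data, not the counting itself.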

We also reported on a classification framework for Big Data solutions described in a very insightful post on Classifying Today’s “Big Data Innovators”. It is an approach that is easy to think through, should be repeatable for many vendors in this space, and should help enterprise technologists decide which vendors may be right for their mission needs. The post categorizes the 13 Big Data innovators reported on by InformationWeek. They are:

1.  MongoDB
2.  Amazon (Redshift, EMR, DynamoDB)
3.  Cloudera (CDH, Impala)
4.  Couchbase
5.  Datameer
6.  Datastax
7.  Hadapt
8.  Hortonworks
9.  Karmasphere
10.  MapR
11.  Neo Technology
12.  Platfora
13.  Splunk

The post classifies them into:

1.  Operational data stores that allow flexible schemas
2.  Hadoop distributions
3.  Real-time Hadoop-based analytical platforms
4.  Hadoop-based BI solutions

We will likely return to this classification for reporting in 2013.

What does our reporting over the last 12 months signal for the next 12 months? We believe we will see a continued expansion of the user end of Big Data solutions. It is probably an oversimplification to say it this way, but one way to look at it is that we have an approach to the backend infrastructure, and that is primarily one built on the Apache Hadoop framework of software over commodity IT, integrated into existing but modern enterprise solutions. There is room for innovation here, of course, but in general the path of the backend is set and will continue. The dynamic change to expect now is in the user-facing applications. Brace yourself: the pace of change there will be dramatic.

For reports on Big Data throughout 2013 please sign up for our Government Big Data Newsletter. Find the weekly report at: http://ctovision.com/newsletter-subscriptions/


More Stories By Bob Gourley

Bob Gourley writes on enterprise IT. He is a founder and partner at Cognitio Corp and publisher of CTOvision.com.
