RE: XML not ideal for Big Data

Yes Michael - it is all about the "sweet spot".

XML is **THE** open standard for data and document exchange.  XML (and the
W3C) have pushed the idea of an open data standard and associated tools
beyond what I would ever have imagined, and I see no clear competitor in
this class.  If we move XML into the modern age, I think it should be
simpler and not hung up on any one serialization format - XML is the data
model (the Infoset, the XQuery Data Model) - an idea that has been evolving
for some time.  Seen that way, JSON could simply be another XML
serialization (sort of what Kurt was hinting at).
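To make the serialization point concrete, here is a toy record (element
names invented for illustration, not from any real schema) expressed once
as XML and once as a hypothetical JSON serialization of the same underlying
data model - the same items, names and values, just different syntax:

    <patient id="p42">
      <name>Ada Lovelace</name>
      <dob>1815-12-10</dob>
    </patient>

    {"patient": {"id": "p42",
                 "name": "Ada Lovelace",
                 "dob": "1815-12-10"}}

If what we standardize is the data model (Infoset/XDM) rather than the
angle brackets, a processor could accept either form and hand the
application the same tree.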

There are all kinds of use cases where the ROI of using XML makes sense -
ranging from the ability to exchange data and move across vendors to the
broad range of close-to-free programming tools.  Individual systems that
do not have these needs, however, will use other, more optimized
"internal" formats.

In documents, XML is dominant.  In data, standards like XBRL (accounting)
and HL7 (healthcare) and other data-centric markup languages have promise -
I haven't been swimming in the data-centric lake lately - what say others?
For example, what are people doing in the exchange of healthcare
information?  When I last checked, the one big humming health record data
center was popular in some camps - but even then, that system cannot be
all things to all people - so there has to be an exchange format.  The
question is: should that exchange format be "normalized" in the relational
sense, or not?
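To illustrate the choice (element names made up, not any real HL7 or XBRL
vocabulary): a "normalized" exchange format mirrors relational tables and
joins records by key, while a document-style format nests the related data
in place:

    <!-- normalized: flat rows, joined by key -->
    <patients>
      <patient id="p1"><name>Smith</name></patient>
    </patients>
    <visits>
      <visit patient="p1" date="2009-08-30" reason="checkup"/>
    </visits>

    <!-- document-style: context nested where it is used -->
    <patient id="p1">
      <name>Smith</name>
      <visit date="2009-08-30" reason="checkup"/>
    </patient>

The normalized shape round-trips cleanly to and from a database; the
nested shape is self-contained, which is usually what you want on the
wire.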

If you have a system of heterogeneous systems (a system of systems), as in
healthcare, open data standards often make sense.  If you have only one
system, internal use of XML may be of limited value.

Beware: making the "system of systems" idea work is a road full of detours
and blockades, as can be seen in the failure of enterprise data model
projects, let alone cross-enterprise data models.

So, XML is not a religion but a clearly defined "mark in the sand" whose
application can be evaluated on a case-by-case basis.

Jim

-----Original Message-----
From: Michael Sokolov [mailto:sokolov@ifactory.com] 
Sent: Thursday, September 03, 2009 6:49 PM
To: 'Simon St.Laurent'; xml-dev@lists.xml.org
Subject: RE:  XML not ideal for Big Data

So many points have been made arguing that XML is OK for "big data," and
many of them seem sensible to me.  Just to be clear: I use XML databases
day in and day out, I work with large XML files, and it's all just dandy.
I don't think size is an issue, mostly.

However, I think we need to recognize that there is data for which XML was
not designed and to which it is ill-suited.  Binary and numeric data -
video, images and audio, to say nothing of scientific data (years of
detector readings in a neutrino decay experiment) - is just not the sweet
spot for XML.  I searched for "MP3 to XML converter" but couldn't find
anything.  I have to admit I was surprised: the net is so big that I
thought it had finally reached the stage where enough monkeys typing would
have produced absolutely everything.  Maybe my search skills just weren't
up to the task.
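To see why, compare a few made-up detector readings as CSV against the
obvious element-per-value XML encoding of the same numbers:

    t,counts
    0.001,17
    0.002,23

    <readings>
      <reading><t>0.001</t><counts>17</counts></reading>
      <reading><t>0.002</t><counts>23</counts></reading>
    </readings>

Per value, the markup outweighs the payload several times over - and a
packed binary array of floats would be smaller still.  Multiply that by
years of readings and the overhead *is* the data.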

Now it's hard to tell whether his problem fell into the domain for which
XML is not well-suited.  I don't know what the details of the original
writer's project
were, but I would tend to want to take him at his word that XML was not the
right choice for his data.  It's certainly possible: there is such a domain.
And genomics data sounds to me pretty unlike documents: it probably wouldn't
pass the smell test that was discussed earlier. XML is not for everything.
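For instance, a raw-sequence format like FASTA is almost nothing but
payload:

    >chr1 sample sequence
    ACGTACGTTTAGGC

A hypothetical XML wrapping - say, <sequence id="chr1">ACGT...</sequence> -
adds structure that the tools in that world neither need nor expect.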

As an aside, XML is also not always the right choice for every*one*, either,
regardless of the problem domain.  Even if it might have been possible for
someone else to achieve success with a genomics dataset using XML rather
than CSV and perl or whatever he used, I think his point is still valid.  He
doesn't want to spend time learning XML technologies: he just wants to get
the project done.  So if learning XML (document format, query language,
database technology, etc) was too hard and he managed to find success some
other way, I don't think that's any reason to disparage him.  He found a
tool that suited his purposes and the context in which he was working.

Last point: the only reason people write articles like his is that XML was
touted as the everything/everywhere solution for so long. For me it's still
about (human-readable) documents and data interchange, primarily.  I'm
curious whether there is agreement on that, or do folks see other broad
areas where XML is beneficial?

-Mike

> -----Original Message-----
> From: Simon St.Laurent [mailto:simonstl@simonstl.com] 
> Sent: Thursday, September 03, 2009 11:54 AM
> To: xml-dev@lists.xml.org
> Subject:  XML not ideal for Big Data
> 
> Perhaps there were better ways to have made XML work with his 
> problems... but I think on the whole he's right.
> 
> http://dataspora.com/blog/xml-and-big-data/
> 
> --
> Simon St.Laurent
> http://simonstl.com/
> 

