Big Data On-ramp

Due to the unprecedented volume, variety, and velocity of Big Data, finding a clear path to jumpstart the Big Data journey is neither trivial nor straightforward. The space is overwhelmingly crowded with immature options and evolving solutions, which makes it confusing and daunting. Where can you find an entry point? What is the most effective way to get on board? Which aspects should you be mindful of? How do you avoid missing the paramount concerns? Why should you begin with the basics?

Here are five areas of consideration for Big Data on-ramp: Structure, Location, Objective, Participant, and Event (SLOPE).

  • Structure: The data format is the first and foremost factor. Confining ourselves to traditional structured content is no longer sufficient in this era. We have to pay close attention to how we will deal with the unstructured and semi-structured information that will be imported and analyzed in both the short term and the long run.
  • Location: Where data reside and how they move around inside and outside an enterprise have a profound impact on the overall Big Data ecosystem. A sophisticated messaging platform should be employed in a complex environment entailing heterogeneous data sources and consumers. Data locality is also important for distributed processing with a viable hosting model.
  • Objective: A reasonable goal and the right level of expectations should be set at the very beginning to develop solid business cases. Big Data is not just about moving to Big Data technology for its own sake. Rather, it is an advanced discipline for transforming a problematic environment into a realistic target state. For example, eliminating data silos is a must, but doing so also brings pains and conflicts during execution.
  • Participant: It is crucial to conceive Big Data solutions from a user-centric perspective. All stakeholders involved need to think coherently about the value chain of the data as assets. The priorities and preferences among the end users, partners, data feeders, brokers, etc. must be balanced and harmonized. A RACI or RASCI matrix should be established to specify roles and responsibilities in the governance.
  • Event: The types of interactions and access dictate which Big Data platforms are the most suitable candidates for both transactional and analytical processing. Large-scale data streaming is a feasible option to enable near real-time analytics in scenarios like fraud detection. Quantifiable thresholds ought to be defined to describe explicitly how real "real-time" must be.
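The "Event" point above argues that "real-time" should be pinned to a quantifiable threshold rather than left vague. A minimal Python sketch of such a latency check follows; the 2-second budget, the names, and the fraud-detection framing are illustrative assumptions, not details from the original post:

```python
from dataclasses import dataclass
from datetime import datetime, timedelta

# Hypothetical agreement: "near real-time" means an event is analyzed
# within 2 seconds of occurring. The figure is illustrative only.
NEAR_REAL_TIME = timedelta(seconds=2)

@dataclass
class Event:
    event_id: str
    occurred_at: datetime

def within_threshold(event: Event, analyzed_at: datetime) -> bool:
    """Return True if the event was analyzed inside the agreed latency budget."""
    return analyzed_at - event.occurred_at <= NEAR_REAL_TIME

# Usage: flag a transaction whose analysis lagged past the threshold.
e = Event("txn-42", datetime(2018, 1, 1, 12, 0, 0))
print(within_threshold(e, datetime(2018, 1, 1, 12, 0, 1)))  # prints True (1s lag)
print(within_threshold(e, datetime(2018, 1, 1, 12, 0, 5)))  # prints False (5s lag)
```

Making the threshold an explicit, named constant is the point: once "real-time" is a number, it can be monitored, alerted on, and renegotiated with stakeholders.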


For more information, please contact Tony Shan ([email protected]).


More Stories By Tony Shan

Tony Shan works as a senior consultant and advisor at a global applications and infrastructure solutions firm, helping clients realize the greatest value from their IT. Shan is a renowned thought leader and technology visionary with many years of field experience and guru-level expertise in cloud computing, Big Data, Hadoop, NoSQL, social, mobile, SOA, BI, technology strategy, IT roadmapping, systems design, architecture engineering, portfolio rationalization, product development, asset management, strategic planning, process standardization, and Web 2.0. He has directed the lifecycle R&D and buildout of large-scale, award-winning distributed systems on diverse platforms at Fortune 100 companies and in the public sector, including IBM, Bank of America, Wells Fargo, Cisco, Honeywell, and Abbott.

Shan is an inventive expert with a proven track record of influential innovations such as Cloud Engineering. He has authored dozens of technical papers on next-generation technologies and over ten books that have won multiple awards. He is a frequent keynote speaker and serves as chair, panelist, advisor, judge, and organizing committee member at prominent conferences and workshops; an editor and editorial advisory board member for IT research journals and books; and a founder of several user groups, forums, and centers of excellence (CoE).
