Oracle RDBMS and Very Large Data Set Processing

An overview of very large data set processing technologies with an emphasis on Oracle RDBMS

Oracle Database is a relational database management system that mostly complies with the ACID transaction requirements (atomicity, consistency, isolation, durability), meaning that each database transaction is executed in a reliable, safe, and integral manner. To comply with ACID, the Oracle database software implements a fairly complex and expensive (in terms of computing resources, i.e., CPU, disk, memory) set of mechanisms such as redo and undo logging, memory latching, and metadata maintenance that make concurrent work possible while maintaining data integrity. Every database transaction, or even a SELECT statement, makes a relational database system perform a tremendous amount of work behind the scenes, making it inherently slow and resource intensive.
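The atomicity half of that guarantee can be illustrated with a small sketch. The example below uses SQLite (whose transactions follow the same ACID model) and a hypothetical accounts table: when a simulated failure interrupts a transfer, the whole transaction rolls back rather than leaving a half-applied debit.

```python
import sqlite3

# Hypothetical schema for illustration; any ACID-compliant RDBMS behaves the same way.
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE accounts (id INTEGER PRIMARY KEY, balance INTEGER)")
conn.execute("INSERT INTO accounts VALUES (1, 100), (2, 50)")
conn.commit()

try:
    with conn:  # opens a transaction; commits on success, rolls back on exception
        conn.execute("UPDATE accounts SET balance = balance - 70 WHERE id = 1")
        # Simulate a crash mid-transfer: the debit above must not persist alone.
        raise RuntimeError("crash before credit")
except RuntimeError:
    pass

# Atomicity: the partial update was rolled back, so the balance is unchanged.
balance = conn.execute("SELECT balance FROM accounts WHERE id = 1").fetchone()[0]
print(balance)  # 100
```

The redo/undo logging described above is exactly the machinery that makes this rollback possible, and exactly the overhead the article says you pay for it.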

Oracle is trying to address the scalability and performance problem in a variety of ways:

- by introducing constant performance enhancements to the query optimizer

- Oracle RAC (Real Application Clusters)-based scale-out (much increased complexity with little practical value in terms of performance and functionality)

- appliances (the Exa* line of products: complex, unbalanced architecture, suboptimally utilized hardware, patched-up and repackaged software)

All these attempts still feature a performance and scalability bottleneck in the shape of the Oracle RDBMS itself and its shared-disk, or asymmetric MPP (in the case of Exadata), architecture.

Companies dealing with millions of users, huge volumes of data, and a need for great performance, such as Google, could not use proprietary RDBMSs. They developed their own solutions, relying on commodity hardware and open source software: software that can make thousands of Intel boxes behave like a single system, one that can answer a query searching through petabytes of data in sub-second time. This could never be accomplished with a standard RDBMS like Oracle; RDBMSs were not designed to deal with the problems Google is facing.
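The core idea can be sketched in a few lines. In this toy scatter-gather example, hypothetical in-process "partitions" stand in for the thousands of machines: a search term is fanned out to every partition in parallel and the partial results are merged, which is how a query over an enormous corpus can come back quickly.

```python
from concurrent.futures import ThreadPoolExecutor

# Hypothetical shards of a search index: word -> hit count on that machine.
PARTITIONS = [
    {"oracle": 3, "rdbms": 1},
    {"hadoop": 5, "oracle": 2},
    {"nosql": 4},
]

def search_partition(partition, term):
    # Each "machine" answers only for its own slice of the data.
    return partition.get(term, 0)

def scatter_gather(term):
    # Fan the query out to all partitions in parallel, then merge the results.
    with ThreadPoolExecutor() as pool:
        counts = pool.map(lambda p: search_partition(p, term), PARTITIONS)
    return sum(counts)

print(scatter_gather("oracle"))  # 5
```

Because each partition only scans its own slice, adding machines shrinks per-machine work; the merge step is cheap compared to the scan.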

Data processing technologies that originated at Google and elsewhere (including Hadoop, a direct descendant of Google's research, the NoSQL group of products, etc.) parallelize work and distribute data over thousands of servers, and relax the ACID requirements to BASE (Basically Available, Soft state, Eventual consistency), i.e., they provide weaker consistency guarantees (CAP theorem). They lose the relational data structure, essentially going back to flat files or to distributed, scalable hash tables. By relaxing, modifying, or dropping some properties of RDBMSs and optimizing to run on commodity hardware, they get results that are good enough in terms of data quality and consistency, while achieving great performance and sufficient accuracy for very basic transactions. Ironically, many of these technologies are a step backwards, i.e., they will end up reinventing many RDBMS features as they mature (Hadapt, a Hadoop-based relational database; Hive; HBase).
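Hadoop's MapReduce model, mentioned above, can also be sketched briefly. This toy word count runs the map and reduce phases in a single process over hypothetical documents; Hadoop distributes the same two phases across a cluster of commodity machines, with the framework handling partitioning and fault tolerance.

```python
from collections import Counter
from functools import reduce

# Hypothetical input split across "nodes"; in Hadoop these would be HDFS blocks.
documents = ["oracle rdbms acid", "hadoop mapreduce", "hadoop hdfs"]

def map_phase(doc):
    # Each mapper emits (word, count) pairs for its own input split.
    return Counter(doc.split())

def reduce_phase(a, b):
    # Reducers merge partial counts produced by the mappers.
    return a + b

mapped = [map_phase(d) for d in documents]
word_counts = reduce(reduce_phase, mapped, Counter())
print(word_counts["hadoop"])  # 2
```

Note what is missing compared to an RDBMS: no schema, no indexes, no transactions — which is precisely the trade the article describes.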

Oracle RDBMS has a completely different purpose: it is targeted at corporate/business use where complex transactions must be accurately executed, and the eventually consistent paradigm does not suffice there. While an Oracle database can contain huge volumes of multimedia data, its main purpose is to store and concurrently process structured data sets of relatively limited size, for a limited number of concurrent corporate/business users.

More Stories By Ranko Mosic

Ranko Mosic, BScEng, specializes in Big Data/Data Architecture consulting services (database/data architecture, machine learning). His clients are in the finance, retail, and telecommunications industries. Ranko welcomes inquiries about his availability for consulting engagements and can be reached at 408-757-0053 or [email protected]
