SAP HANA’s Real Time Challenge to the Oracle Empire

Real-time In-Memory platform presents a groundbreaking approach

When the character Maverick from the movie Top Gun exclaimed, "I feel the need, the need for speed," you'd be forgiven for mistaking it for a sound bite from a CIO discussing their transactional databases. Whether it's a financial organization predicting share prices, a bank deciding whether it can approve a loan or a marketing organization reaching consumers with a compelling promotional offer, the need to access, store, process and analyze data as quickly as possible is an imperative for any business looking to gain a competitive edge. Hence, when SAP announced its new in-memory platform HANA for enterprise applications in 2011, everyone took note as it touted the advantage of real-time analytics. SAP HANA promised not just to make databases dramatically faster, as traditional business warehouse accelerator systems do, but to speed up the front end, enabling companies to run arbitrary, complex queries on billions of records in a matter of seconds as opposed to hours. The vendors of legacy databases were facing a major challenge, most notably the king of them all...Oracle.

The Birth and Emergence of Big Data
Back in the days of the mainframe, the application, transactional data and reporting databases were all physically stored on the same system. Applications, operating systems and databases were designed to maximize their hardware resources, which meant you couldn't process transactions and reports simultaneously. The bottleneck here was cost, in that if you wanted to scale you needed another mainframe.

After the advent of the client-server model, where applications ran against a centralized database server from multiple cost-effective application servers, scalability was achieved by simply adding more application servers. Regardless, a new bottleneck quickly emerged: systems relied on a single database server, and requests from an ever-increasing number of application servers caused I/O stagnation. The problem was exacerbated by OLTP (online transaction processing), where report creation required the system to concurrently read multiple tables in the database. On top of this, servers and processors kept getting faster while disks (despite the emergence of SSD) quickly became the bottleneck for automated processes that were producing large amounts of data and, in turn, generating ever more report requests.

The net effect was a downward spiral: more users demanded more reports from the databases, which meant ever larger amounts of data being requested from disks that simply weren't up to the job. Factor in the data proliferation of external users brought by the Internet and pressure-inducing regulations such as Sarbanes-Oxley, and the demand to analyze ever more data, ever faster, reached fever pitch. With data and user volumes increasing by a factor of thousands compared to the I/O capability of databases, the transaction-based industry faced a challenge that required a dramatic shift. Cue the 2011 emergence of SAP's HANA.

Real-Time In-Memory Platform Presents a Groundbreaking Approach
One of the major advantages of SAP HANA's ability to run in real time is that it removes the need for data redundancy, as it's built to run as a single database. With clusters of affordable and scalable servers, transactional and analytical workloads run on the same database, eliminating the need for different types of databases for different application needs. Oracle, on the other hand, has built an empire on exactly the opposite.
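
As a minimal sketch of what that single-database approach looks like in practice, the snippet below issues an analytical aggregation directly against a transactional table. It assumes SAP's hdbcli Python driver; the host, credentials and SALES_ORDERS table are hypothetical placeholders, not details from this article.

```python
# Sketch only: an analytical query run directly on a transactional HANA table.
# Host, credentials and the SALES_ORDERS table are hypothetical placeholders.
from hdbcli import dbapi  # SAP's Python client for HANA

conn = dbapi.connect(
    address="hana.example.com",   # hypothetical host
    port=30015,
    user="ANALYST",
    password="secret",
)

cursor = conn.cursor()
# Aggregate the transactional rows on the fly; no separate warehouse copy,
# materialized view or pre-built aggregate is assumed here.
cursor.execute("""
    SELECT region, SUM(net_amount) AS revenue
    FROM SALES_ORDERS
    WHERE order_date >= ADD_DAYS(CURRENT_DATE, -30)
    GROUP BY region
    ORDER BY revenue DESC
""")
for region, revenue in cursor.fetchall():
    print(region, revenue)

cursor.close()
conn.close()
```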

Oracle has thrived on a model where companies generally start with a simple database that's used for checking sales orders and ensuring product delivery to customers, but as the business grows they need more databases with different and more demanding functions. Functions such as managing customer relationships, complex reporting and analysis drive a need for new databases that are separate from the actual business, requiring data to be moved from one system to another. Eventually you have a sprawl of databases, as existing ones are unable to handle the workloads, making it almost impossible to track data movements, let alone attain real-time updates. So while the Oracle marketing machine is also pitching the benefits of in-memory via its Exalytics appliance and in-memory database, TimesTen, Oracle is certainly in no rush to break this traditional model of database sprawl and the money-spinning licenses that come with it.

Looking closely at the Oracle Exalytics / TimesTen package, despite the hype, it is merely an add-on product, meaning that an end user still needs a license for the transactional database, another license for the data warehouse database and yet another license for TimesTen for Oracle Exalytics.

Moreover, the Oracle bolt-on approach serves to sell more of its commodity hardware and in some ways perversely justify its acquisition of Sun Microsystems, all at the expense of the customer. Because the Exalytics approach continues the traditional requirement for transactional data to be duplicated from the application to the warehouse and once again to Exalytics, the end user not only ends up with three copies of the data but also needs three tiers of storage and servers. In contrast, SAP HANA is designed to be a single database that runs both transactional applications and Business Warehouse deployments. Not only does SAP HANA's single copy of data replace the two or three required by Oracle, it also eliminates the need for materialized views, redundant aggregates and indexes, leaving a significantly reduced data footprint.

Comparing HANA to Oracle's TimesTen and Exalytics
As expected, Oracle has already unleashed its FUD team with bogus claims and untruths against HANA, even pushing TimesTen as a like-for-like comparison. Where this is hugely flawed is that it fails to acknowledge that SAP HANA is a completely groundbreaking design as opposed to a bolt-on approach. With SAP HANA, data is managed and accessed entirely in RAM, doing away with the MOLAP cubes, multiple indexes and other tuning features that Oracle prides itself on.
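
To see why keeping the working set in RAM removes the need for pre-built aggregates, a rough back-of-the-envelope sketch helps; the throughput figures below are illustrative assumptions, not benchmarks of any product.

```python
# Illustrative arithmetic only: the throughput figures are assumptions,
# not measured benchmarks of HANA, TimesTen or any other product.
TABLE_SIZE_GB = 500          # hypothetical transactional table held in memory
DISK_SCAN_GB_PER_S = 0.5     # assumed sequential throughput of a disk array
RAM_SCAN_GB_PER_S = 50.0     # assumed scan rate of an in-memory column store
                             # (compression and parallelism push this higher)

disk_seconds = TABLE_SIZE_GB / DISK_SCAN_GB_PER_S   # ~1,000 s
ram_seconds = TABLE_SIZE_GB / RAM_SCAN_GB_PER_S     # ~10 s

print(f"Disk-bound full scan: roughly {disk_seconds / 60:.0f} minutes")
print(f"In-memory full scan:  roughly {ram_seconds:.0f} seconds")
```

At that kind of gap, aggregating on the fly at query time becomes practical, which is what makes cubes and redundant indexes dispensable.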

Furthermore, despite the Oracle FUD, SAP HANA does indeed handle both unstructured and structured data, as well as utilize parallel queries to scale out across server nodes. Here, Oracle is trying hard to sow confusion and distract the market from realizing that the TimesTen with Exalytics package still can't scale out beyond its 1TB RAM limit, unlike SAP HANA, where each container can store up to 500TB of data, all executable at high speed.

With an aggressive TCO and ROI model compared to a traditional Oracle deployment, SAP HANA also proves a lot more cost-effective. With pricing based on 64GB increments of RAM and the total amount of data held in memory, licenses are fully inclusive of production and test/development requirements, as well as the necessary tools.
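
As a simple illustration of that sizing model, the sketch below rounds an in-memory data volume up to whole 64GB license units; the data volume used is a placeholder, not a figure from this article.

```python
import math

# Hypothetical sizing example for a 64GB-increment licensing model.
# The data volume is a placeholder, not a figure from this article.
DATA_IN_MEMORY_GB = 900      # total data to be held in memory (assumed)
LICENSE_UNIT_GB = 64         # licensing increment described above

units = math.ceil(DATA_IN_MEMORY_GB / LICENSE_UNIT_GB)   # 900 / 64 -> 15 units

print(f"{DATA_IN_MEMORY_GB}GB in memory -> {units} x {LICENSE_UNIT_GB}GB license units")
```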

SAP HANA's Embrace of VMware
Furthermore, in contrast to Oracle's belligerent stance towards VMware and the cost savings it brings to end users, SAP has embraced it. The recent announcement that SAP HANA supports VMware vSphere will provide a vast competitive advantage, as it will enable customers to provision instances of SAP HANA in minutes from VM templates, as well as gain benefits such as the Distributed Resource Scheduler (DRS) and vSphere vMotion. By virtualizing SAP HANA with VMware, end users can quickly have several smaller HANA instances sharing a single physical server, leading to better utilization of existing resources. With certified, preconfigured and optimized converged infrastructures such as the Vblock around the corner, SAP HANA appliances could be shipped with vSphere 5 and SAP HANA pre-installed within days, enabling rapid deployment for businesses.
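
As a rough sketch of what provisioning a HANA instance from a vSphere template might look like, the snippet below uses the pyVmomi library to clone a template into a cluster; the vCenter host, credentials, template, cluster and datacenter names are all hypothetical placeholders.

```python
# Sketch only: clone a hypothetical HANA VM template with pyVmomi.
# All hosts, credentials and object names are placeholders.
import ssl
from pyVim.connect import SmartConnect, Disconnect
from pyVmomi import vim

ctx = ssl._create_unverified_context()            # lab-only: skip cert checks
si = SmartConnect(host="vcenter.example.com",
                  user="administrator@vsphere.local",
                  pwd="secret", sslContext=ctx)
content = si.RetrieveContent()

def find_by_name(vimtype, name):
    """Return the first managed object of the given type with the given name."""
    view = content.viewManager.CreateContainerView(content.rootFolder, [vimtype], True)
    return next(obj for obj in view.view if obj.name == name)

template = find_by_name(vim.VirtualMachine, "hana-vsphere-template")
cluster = find_by_name(vim.ClusterComputeResource, "hana-cluster")
datacenter = find_by_name(vim.Datacenter, "dc01")

# Clone the template into the cluster's resource pool and power it on.
spec = vim.vm.CloneSpec(
    location=vim.vm.RelocateSpec(pool=cluster.resourcePool),
    powerOn=True,
)
template.Clone(folder=datacenter.vmFolder, name="hana-dev-01", spec=spec)
Disconnect(si)
```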

The Business Benefits of Real-Time
With business and transactions taking place in real time, SAP HANA ensures that the data and the analytics that come with them are in real time too. Manually polling data from multiple systems and sorting through it is inadequate at a time when businesses face unpredictable economic conditions, volatile demand and complex supply chains. The need is for real-time metrics aligned to supply and demand, so that a retailer's shelves can be stocked accurately and immediately, eliminating unnecessary inventory costs, lost sales opportunities and failed product launches. Being able to instantly analyze data at any level of granularity enables a business to respond quickly to these market insights and take decisive actions, such as transferring inventory between distribution centers based on expected sales or altering the prices of promotions based on customer demand. Instead of waiting for processes that take hours, days or even weeks, SAP HANA's real-time capabilities enable businesses to react to events as they happen.

Ultimately, SAP HANA is a revolutionary step forward that will empower organizations to focus more on the business and less on the infrastructure that supports them. With the promise of new applications built by SAP to support real-time decision making, as well as the ability to run existing applications, SAP HANA presents the opportunity to transform not only a business but also the underlying technology that supports it.

More Stories By Archie Hendryx

SAN, NAS, Back Up / Recovery & Virtualisation Specialist.
