Big Dollars from Big Data

How to reduce costs and increase performance in the data center

Cloud computing has given birth to a broad range of online services. To maintain a competitive edge, service providers are taking a closer look at their Big Data storage infrastructure in an earnest attempt to improve performance and reduce costs.

Large enterprises hosting their own cloud servers are seeking ways to scale and improve performance while maintaining or lowering expenditures. If the status quo of scaling users and storage infrastructure continues, it will become increasingly difficult to keep cloud services such as online account management and data storage affordable. Service providers will face higher overall energy consumption in their data centers, and many are loath to begin charging for online account access.

Costs vs. Benefits
In response to the trend of growing online account activity, many service providers are transitioning their data centers to a centralized environment whereby data is stored in a single location and made accessible from any location via the Internet. Centralizing the equipment enables service providers to keep costs down while delivering improved Internet connections to their online users and realizing gains in performance and reliability.

Yet with these additional performance improvements, scalability becomes more arduous and cost-prohibitive. Improving functionality within a centralized data center requires the purchase of additional high-performance, specialized equipment, driving up costs and energy consumption that become challenging to control at scale. In an economy where large organizations are seeking cost-cutting measures from every angle, these added expenses are unacceptable.

More Servers, More Problems?
Once a telco moves into providing cloud-based services for its users, such as online account access and management, the demands on its data centers spike dramatically. Although the typical employee on a telco's or service provider's internal network requires high performance, such networks normally serve fewer users, who can access files directly over the network. Moreover, employees typically access, send and save relatively small files such as documents and spreadsheets, consuming less storage capacity and placing less load on the system.

Outside the internal network, however, the service provider's cloud servers are accessed simultaneously over the Internet by far more users, and that shared entry point itself becomes a performance bottleneck. Providers, telcos and other large enterprises offering cloud services must therefore not only scale their storage systems for each additional user, but also sustain performance across all users combined. Because significantly more users are working with online account tools at any given time, cloud workloads place a much greater strain on data center resources.

Combining Best Practices
To remain competitive, cloud service providers must find a way to scale rapidly to accommodate the proliferating demand for more data storage. Service providers seeking data storage options should look for an optimal combination of performance, scalability and cost-effectiveness. The following best practices can help maximize data center ROI in an era of IT cutbacks:

  1. Pick commodity components: Low-energy hardware can make good business sense. Commodity hardware not only costs less to buy, but also uses far less energy, significantly reducing both setup and operating costs in one move.
  2. Look for distributed storage: Although the data center trend has been moving toward centralization, distributed storage is the best way to build at scale, because software-level techniques can now deliver performance gains that counterbalance the advantage of a centralized data storage approach.
  3. Avoid bottlenecks at all costs: A single point of entry easily becomes a performance bottleneck. Adding caches to alleviate it, as most data center architectures presently do, quickly adds cost and complexity to a system. A horizontally scalable system that distributes data among all nodes, by contrast, avoids the single entry point entirely and delivers a high level of redundancy.
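To make the third point concrete, the sketch below shows one common way to distribute data among all nodes with no single entry point: consistent hashing on a hash ring. Every client can compute a key's location locally, so there is no central lookup server to become a bottleneck. The node names, virtual-node count and replication factor are illustrative assumptions, not details from the article.

```python
import bisect
import hashlib

class HashRing:
    """Minimal consistent-hash ring: maps each key to a set of storage
    nodes without any central directory (illustrative sketch only)."""

    def __init__(self, nodes, vnodes=64):
        self._ring = []  # sorted list of (hash, node) pairs
        for node in nodes:
            # Virtual nodes smooth out the key distribution across nodes.
            for i in range(vnodes):
                h = self._hash(f"{node}#{i}")
                bisect.insort(self._ring, (h, node))

    @staticmethod
    def _hash(key):
        return int(hashlib.md5(key.encode()).hexdigest(), 16)

    def nodes_for(self, key, replicas=2):
        """Walk clockwise from the key's ring position, collecting
        distinct nodes until the replica count is met."""
        start = bisect.bisect(self._ring, (self._hash(key),))
        found = []
        for i in range(len(self._ring)):
            node = self._ring[(start + i) % len(self._ring)][1]
            if node not in found:
                found.append(node)
            if len(found) == replicas:
                break
        return found

ring = HashRing(["node-a", "node-b", "node-c", "node-d"])
replica_nodes = ring.nodes_for("user:1234/account.dat")
print(replica_nodes)  # two distinct replica nodes for this key
```

Because keys hash to positions rather than to a fixed node list, adding or removing a node only reshuffles the keys adjacent to it on the ring, which is what makes this style of horizontal scaling practical.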

Conclusion
Big Data storage today consists mainly of high-performance, vertically scaled storage systems. Because these architectures can only scale to about a single petabyte and are expensive, they are neither cost-effective nor sustainable in the long run. Moving to a horizontally scaled data storage model that distributes data evenly onto low-energy hardware can reduce costs and increase performance in the cloud. With these insights, providers of cloud services can take steps to improve the performance, scalability and efficiency of their data storage infrastructure.

More Stories By Stefan Bernbo

Stefan Bernbo is the founder and CEO of Compuverde. For 20 years he has designed and built numerous enterprise-scale data storage solutions designed to store huge data sets cost-effectively. From 2004 to 2010 Stefan worked in this field at Storegate, a wide-reaching Internet-based storage solution for consumer and business markets with the highest possible availability and scalability requirements. Previously, Stefan worked on system and software architecture for several projects at Ericsson, the world-leading provider of telecommunications equipment and services to mobile and fixed network operators.


