By Business Wire
November 22, 2012 09:30 PM EST
HP today unveiled the industry’s first server built to help clients operationalize Big Data, drive new business opportunities and save up to $1 million over three years.(1)
HP ProLiant SL4500 Gen8 (Photo: Business Wire)
With the advent of Big Data software and the promise that it brings, many organizations have tried to deploy these solutions on existing architectures not designed to handle the specific needs of these workloads. As a result, the outcomes from these early deployments have been suboptimal from a performance and cost perspective.
“Big Data application environments such as Hadoop, MPP data warehouses, Big Data analytics and object stores have very different workload requirements,” said Dan Vesset, vice president, Business Analytics Research, IDC. “Given the large and varied amounts of fast-moving data that needs to be stored and accessed quickly and the different requirements of end users, these workloads can be highly varied, complex and inefficient to manage if run on traditional hardware infrastructure. In order to fully embrace the promise of Big Data, it is critical that the underlying infrastructure be optimized for the workload.”
The new HP ProLiant SL4500 server series is the only solution purpose-built for Big Data environments, delivering the maximum performance, productivity and cost-effectiveness these workloads require in an ultradense design. Built on HP Converged Infrastructure, the new server offers a highly efficient design that requires up to 50 percent less space, consumes 61 percent less power, costs 31 percent less and uses 63 percent fewer cables.(1)
Modular architecture optimizes results for workload-specific applications
The modular design of the HP ProLiant SL4500 server series offers varied compute and storage configurations that enable clients to optimize their infrastructure for a workload-specific application, removing the need to piece together incongruent hardware for the supporting infrastructure.
With a single, cost-effective architecture, the HP ProLiant SL4500 server series also supports multiple Apache Hadoop vendors including Cloudera and Hortonworks, as well as additional software including OpenStack Cloud Software and MongoDB.
“Enterprises that leverage Cloudera’s Platform for Big Data to unlock insights across all of their data benefit from deploying infrastructure components optimized for the extreme demands of Big Data workloads,” said Amr Awadallah, chief technology officer, Cloudera. “By designing a server purpose built for Big Data, HP is offering the market a seamless new approach to processing large data sets efficiently and cost-effectively.”
HP innovation delivers greater performance and density
The HP ProLiant SL4500 Gen8 server series, with HP Smart Array technology, delivers industry-leading performance with nearly seven times more input/output operations per second (IOPS) than existing architectures.(2) With the smart analytics of HP SmartCache, the system optimizes storage traffic to ensure the lowest latency response and lowest up-front investment.
Current server offerings cannot keep pace with the rapidly growing storage and server requirements of Big Data, forcing IT leaders to acquire additional, expensive data center space. The new HP ProLiant SL4500 server series solves this problem by delivering industry-leading storage density of up to 240 terabytes (TB) in a single 4.3-rack-unit (U) chassis, or 2.16 petabytes (PB) with nine servers in an industry-standard 42-U rack.(3)
As a result of this extreme density, clients realize significant cost savings, greater performance and increased efficiency.
Safeguard Big Data, simplify management and support with HP ProLiant Gen8
The latest member of the HP ProLiant Generation 8 (Gen8) family, the HP SL4500 server series is built with HP ProActive Insight Architecture, which embeds intelligence and automation capabilities allowing clients to:
- Eliminate downtime and safeguard valuable data with automated data protection and HP Predictive Spare Activation, which moves data to an alternate device before failures occur.
- Ensure maximum server productivity with HP Active Health, and automate firmware updates with HP Smart Update.
- Leverage the industry’s most comprehensive services, support and warranty offering with HP Insight Online.
- Lower data center power costs and improve compute per watt by up to 70 percent compared with previous generations with HP Intelligent Infrastructure.(4)
HP enhances scale-out server portfolio
HP also announced updates to its high-performance computing (HPC) portfolio, enabling clients to maximize the performance benefits of the latest processing technology from Intel and NVIDIA.
The HP ProLiant SL270s Gen8 server offers maximized processor density, with the ability to support up to eight Intel® Xeon Phi™ coprocessors or eight NVIDIA® Kepler™ graphic processing units (GPUs) per server. The HP ProLiant SL270s and SL250s Gen8 servers now support the latest NVIDIA Kepler GPUs and newly announced Intel Xeon Phi coprocessors, enabling clients to select the best accelerator or coprocessor for their specific workloads.
Pricing and availability(5)
The HP ProLiant SL4500 server series in a single node configuration is available immediately worldwide for a starting price of $7,643.
The new HP ProLiant SL270s Gen8 servers will be available next month for a starting price of $6,166. The HP ProLiant SL250s Gen8 servers will be available with NVIDIA Kepler GPUs and Intel Xeon Phi coprocessors early next year for a starting price of $5,659.
HP’s premier Europe, Middle East and Africa client event, HP Discover, takes place Dec. 4-6 in Frankfurt, Germany.
HP creates new possibilities for technology to have a meaningful impact on people, businesses, governments and society. The world’s largest technology company, HP brings together a portfolio that spans printing, personal computing, software, services and IT infrastructure to solve customer problems. More information about HP (NYSE: HPQ) is available at http://www.hp.com.
(1) Based on internal calculations comparing a Dell PowerEdge R510 configuration with four Dell PowerVault MD1200 direct-attach storage units, which requires seven racks, 28 servers, 112 JBOD (just a bunch of disks) enclosures, 14 networking switches, 448 cables and 79 kilowatts of power, to HP ProLiant SL4500 servers requiring three racks, six networking switches, 168 cables and 31 kilowatts of power. Savings of $0.5M in acquisition costs (CAPEX); $0.25M in power, cooling and distribution costs; and $0.2M in data center floor-space rent over a three-year period.
(2) Based on internal testing comparing an HP ProLiant G7 server with a standard 15K SAS drive to an HP ProLiant Gen8 server with a solid-state drive.
(3) Based on internal calculations. Single-node configuration of HP SL4500 server is capable of providing up to 60 LFF (3.5-inch) SAS/SATA/SSD hard drives in a 4.3U chassis. Using the 4TB SAS/SATA hard drives that will be available early FY2013, every SL4500 server can provide 240 TB of storage capacity. A total of up to nine SL4500 servers can be installed in a single 42U rack to provide a maximum storage capacity of 2.16PB by using 540 LFF hard drives of 4TB capacity.
(4) HP internal comparison: HP ProLiant G7 to HP ProLiant Gen8.
(5) Estimated U.S. street prices. Actual prices may vary.
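The headline savings and density claims follow directly from the figures quoted in footnotes (1) and (3). As a quick sanity check (an illustrative sketch, not part of the release), the arithmetic can be reproduced as follows:

```python
# Footnote (1): three-year savings, expressed in thousands of USD
capex_k = 500        # $0.5M acquisition (CAPEX) savings
power_k = 250        # $0.25M power, cooling and distribution savings
space_k = 200        # $0.2M data center floor-space rent savings
total_savings_k = capex_k + power_k + space_k
print(f"Three-year savings: ${total_savings_k}K")  # $950K, i.e. "up to $1 million"

# Footnote (3): storage density
drives_per_chassis = 60    # LFF drives in one 4.3U SL4500 chassis
drive_capacity_tb = 4      # 4TB SAS/SATA drives (available early FY2013)
servers_per_rack = 9       # SL4500 servers per 42U rack

per_server_tb = drives_per_chassis * drive_capacity_tb     # 240 TB per server
rack_pb = per_server_tb * servers_per_rack / 1000          # 2.16 PB per rack
print(f"Per server: {per_server_tb} TB; per 42U rack: {rack_pb} PB")
```

Both results match the figures cited in the announcement: roughly $1 million in savings over three years, 240 TB per chassis and 2.16 PB per rack.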
Intel and Xeon are trademarks of Intel Corporation in the U.S. and other countries.
This news release contains forward-looking statements that involve risks, uncertainties and assumptions. If such risks or uncertainties materialize or such assumptions prove incorrect, the results of HP and its consolidated subsidiaries could differ materially from those expressed or implied by such forward-looking statements and assumptions. All statements other than statements of historical fact are statements that could be deemed forward-looking statements, including but not limited to statements of the plans, strategies and objectives of management for future operations; any statements concerning expected development, performance, market share or competitive performance relating to products and services; any statements regarding anticipated operational and financial results; any statements of expectation or belief; and any statements of assumptions underlying any of the foregoing. Risks, uncertainties and assumptions include macroeconomic and geopolitical trends and events; the competitive pressures faced by HP’s businesses; the development and transition of new products and services (and the enhancement of existing products and services) to meet customer needs and respond to emerging technological trends; the execution and performance of contracts by HP and its customers, suppliers and partners; the protection of HP's intellectual property assets, including intellectual property licensed from third parties; integration and other risks associated with business combination and investment transactions; the hiring and retention of key employees; assumptions related to pension and other post-retirement costs and retirement programs; the execution, timing and results of restructuring plans, including estimates and assumptions related to the cost and the anticipated benefits of implementing those plans; expectations and assumptions relating to the execution and timing of cost reduction programs and restructuring and integration plans; the resolution of pending 
investigations, claims and disputes; and other risks that are described in HP’s Quarterly Report on Form 10-Q for the fiscal quarter ended July 31, 2012 and HP’s other filings with the Securities and Exchange Commission, including HP’s Annual Report on Form 10-K for the fiscal year ended October 31, 2011. HP assumes no obligation and does not intend to update these forward-looking statements.
© 2012 Hewlett-Packard Asia Pacific Pte Ltd. Registration No.: 198703164G. The information contained herein is subject to change without notice. The only warranties for HP products and services are set forth in the express warranty statements accompanying such products and services. Nothing herein should be construed as constituting an additional warranty. HP shall not be liable for technical or editorial errors or omissions contained herein.