
PHD Virtual Provides Easier to Manage and More Efficient Backup Solutions for Enterprise Environments

PHD Virtual Technologies, a pioneer in virtual machine backup and recovery, and innovator of virtualization monitoring solutions, announced today their solutions are well-suited for large enterprise environments as well as small businesses with growing data storage, backup and recovery needs.

Get PHD Virtual Backup v6.0 free for 15 days: Click Here

Tweet This: @PHDVirtual Provides Enterprises with Virtual Machine Backup and Recovery Solutions Fit to Scale http://bit.ly/HyPclv

PHD Virtual Backup 6.0 gives customers of all sizes the scalability and flexibility they need. For large enterprises in particular, it makes backup processes easier to manage and more efficient while keeping data fully recoverable at a moment’s notice. It also consumes less overhead and capital than comparable products on the market and provides the encrypted data protection that larger enterprises require.

"Customers continue to embrace server virtualization and are increasingly deploying multiple hypervisors," said Robert Amatruda, Research Director for Data Protection and Recovery at IDC. "PHD Virtual's Backup 6.0 solution provides customers a cost-effective and easy-to-deploy solution that supports multiple hypervisors and will scale with their virtual environment.”

“In addition to the growth we have experienced within the SMB market, we’ve seen a major uptick in our penetration in the larger enterprise environments as well,” commented Jim Legg, CEO, PHD Virtual. “The simplicity and cost-savings are definitely not exclusive to the smaller organizations – the ease of use and data movement offsite provides a powerful combination for any size corporation.”

PHD Virtual benefits for large enterprise environments include the following:

  • Complete or partial restorations - gives administrators the ability to restore a complete server from scratch by simply selecting a restore point and target. No agents, no operating system install - just restore a complete duplicate of the VM.
  • TrueDedupe technology - true source-side deduplication: data is deduplicated and compressed at the source before it is sent across the WAN/LAN and before it is written to disk. This efficiency is critical for enterprises that must provision disk space to house backups of massive amounts of data, and it also improves scalability and pares down the overall backup window. PHD performs deduplication by comparing against the real data already on the backup target, eliminating duplicate copies across all VMs stored there. This makes the process more robust and avoids the extra job management otherwise needed to achieve storage efficiencies.
  • Parallel processing model – multiple data streams can be used for backing up, restoring, and replicating, so multiple jobs run concurrently. In addition, parallel processing lets the user throttle resources up or down to balance the workload and the timing of the backup window in the data center.
  • Fault tolerant scaling – a 100% virtualized footprint for your backups, running as a Linux-based application that scales up and out simply by deploying more Virtual Backup Appliances (VBAs). This provides fault tolerance, load balancing, and the performance required, without the extra cost of more physical infrastructure or additional licensing.
  • Replication – by backing up VMs once and storing them to disk, PHD Virtual Backup eliminates the need for unnecessary snapshots on your production VMs while maintaining the extra layer of protection of having the replicated VMs located offsite.
  • Disaster recovery planning – PHD Virtual provides:
    • TrueRestore: a verification and self-healing process in which the blocks being backed up are inspected both during the backup and restore functions. By doing this, PHD ensures that the data being backed up is indeed the same data that is going to be restored.
    • Test-Mode: provides the ability to run replicated VMs in a test mode located in a standby environment. This gives peace of mind that the standby VM has been verified, is completely operational and can be properly failed over.
  • Data recovery – PHD Virtual’s Instant Restore lets administrators power on backup VMs immediately while a restore process runs in parallel, providing immediate access to servers and applications. It also leverages concurrent data streams through a technology called “mass restore,” which creates and configures a single restore job that processes multiple VMs at the same time, reducing complexity and the company’s RTO. Because granular restore is more common than a complete data center restore, PHD also supports restoring a single file, a virtual disk within a VM, or an individual application object such as an email, mailbox, datastore, database, or table. This functionality restores only what you need, when you need it, without setting up virtual labs or sandboxes, speeding recovery and preventing unnecessary data loss.
  • Backing up a constantly evolving data center – PHD Virtual allows administrators to plug a backup appliance (VBA) into virtually any environment, including the cloud, a software-defined data center (SDDC), or a remote or branch office, ensuring data is safe and recoverable.
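The source-side deduplication described in the TrueDedupe bullet above can be illustrated with a short sketch: hash each block of data and only transmit blocks whose hashes are not already in the target-side index, which is shared across all VMs. This is a minimal illustration only, assuming fixed-size blocks and SHA-256 hashing; the names, block size, and structure are hypothetical and do not reflect PHD Virtual's actual implementation.

```python
import hashlib

BLOCK_SIZE = 4096  # fixed block size for illustration; real products often chunk differently

def dedupe_blocks(data: bytes, seen_hashes: set) -> list:
    """Return only the blocks not already stored on the backup target.

    `seen_hashes` stands in for the target-side block index consulted
    across ALL VMs, so duplicate blocks are skipped regardless of which
    VM they came from, and dedup happens before anything crosses the wire.
    """
    unique = []
    for i in range(0, len(data), BLOCK_SIZE):
        block = data[i:i + BLOCK_SIZE]
        digest = hashlib.sha256(block).digest()
        if digest not in seen_hashes:
            seen_hashes.add(digest)
            unique.append(block)  # only previously unseen blocks are sent/written
    return unique

# Two VMs sharing a common OS image: the shared blocks are stored only once.
index = set()
vm1 = b"A" * 8192 + b"unique-to-vm1"
vm2 = b"A" * 8192 + b"unique-to-vm2"
sent_for_vm1 = dedupe_blocks(vm1, index)  # shared blocks + vm1's tail
sent_for_vm2 = dedupe_blocks(vm2, index)  # only vm2's tail is new
```

Because the index persists across both calls, the second VM's backup transmits only its unique tail block, which is the cross-VM storage efficiency the bullet describes.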
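The parallel processing and "mass restore" bullets above can be sketched with a thread pool, where the worker count plays the role of the throttle: one job definition fans out over multiple VMs, and raising or lowering the stream count trades resource usage against the backup/restore window. The function names and simulated restore below are illustrative assumptions, not PHD Virtual's actual API.

```python
from concurrent.futures import ThreadPoolExecutor

def restore_vm(name: str) -> str:
    # Placeholder for a per-VM restore stream; a real job would read
    # blocks from the backup target and write them to the hypervisor.
    return f"{name}: restored"

def mass_restore(vm_names, max_streams=4):
    """Run a single restore job that processes multiple VMs concurrently.

    `max_streams` acts as the throttle: increasing it uses more resources
    and shortens the window; decreasing it eases load on the data center.
    """
    with ThreadPoolExecutor(max_workers=max_streams) as pool:
        return list(pool.map(restore_vm, vm_names))

results = mass_restore(["web01", "db01", "app01"], max_streams=2)
```

Note that the results come back in submission order even though the streams run concurrently, so a single job's report stays easy to audit.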

“PHD Virtual does VMware backup better than anyone else, especially for enterprises like us,” said Barry Quiel, SunGard Public Sector, California. “For our large environment, we needed someone that specialized in moving lots of virtual machine data, while storing as little as possible of it. We also need plenty of options to handle how we move data off-site and to tape. PHD Virtual gives us all of the options we needed to make sure our VMware environment meets our enterprise data protection requirements.”

Supporting Resources

PHD Virtual Technologies: http://phdvirtual.com/

More PHD Virtual News: http://phdvirtual.com/newsandevents

Twitter: https://twitter.com/PHDVirtual

Facebook: http://www.facebook.com/PHDVirtualTechnologies

LinkedIn: http://www.linkedin.com/groups?gid=1992663&mostPopular=&trk=tyah

RSS Feeds: PHD Virtual news releases: http://www.phdvirtual.com/rss/news-and-events.xml

About PHD Virtual Technologies

PHD Virtual provides the best value in virtual backup and monitoring for VMware and Citrix platforms. More than 4,500 customers worldwide rely on our products because they are effective, easier to use, and far more affordable than competitive alternatives. A pioneer of Virtual Backup Appliances (VBAs), PHD Virtual Technologies has been transforming data protection for virtual IT environments since 2006, delivering the highest-performance, most scalable cross-platform backup and monitoring solutions on the market. Its PHD Virtual Monitor provides a complete, end-to-end solution for monitoring virtual, physical and application infrastructures in VMware and Citrix environments. For more information, please visit: http://www.phdvirtual.com/


Copyright © 2009 Business Wire. All rights reserved. Republication or redistribution of Business Wire content is expressly prohibited without the prior written consent of Business Wire. Business Wire shall not be liable for any errors or delays in the content, or for any actions taken in reliance thereon.
