Backup to INTERNAP Accelerated IP - XIP

This is part 3 of a three-part tutorial introducing the Gladinet Cloud Access Solutions for Internap XIPCloud storage.

Accelerated IP, or XIP, is Internap’s solution to the weaknesses of the Transmission Control Protocol (TCP). Through XIP, enterprise web services and applications that rely on the TCP layer can improve their performance by up to 400%. (Source: Internap.com)

XIPCloud is Internap’s cloud storage combined with Accelerated IP. The first thing you notice in Internap’s cloud storage marketing materials is high performance, in addition to availability, security and open source.

This tutorial covers how to back up files, folders and SQL Server databases to Internap XIPCloud.

Gladinet Cloud Backup

First you will need a copy of Gladinet Cloud Backup. Cloud Backup is also integrated into Gladinet Cloud Desktop and Gladinet CloudAFS, so as long as you have a Cloud Backup license, you can use it from any of the three products.


Step 1 - Mount Internap XIPCloud

Step 1 is the same across all three products: mounting Internap XIPCloud into the application. Use the same user name and password that you use to log in to the Internap XIPCloud page to mount the XIPCloud storage with Gladinet.
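For readers curious what a mount like this amounts to under the hood, here is a minimal Python sketch, assuming XIPCloud exposes an OpenStack Swift-style v1.0 authentication endpoint. The auth URL and credentials below are placeholders, and this is an illustration, not Gladinet’s actual implementation.

import requests

# Placeholder endpoint; substitute the auth URL for your XIPCloud account.
AUTH_URL = "https://auth.storage.example.com/v1.0"

def authenticate(username, password):
    """Exchange the XIPCloud user name and password for a storage URL and token."""
    resp = requests.get(
        AUTH_URL,
        headers={"X-Auth-User": username, "X-Auth-Key": password},
    )
    resp.raise_for_status()
    return resp.headers["X-Storage-Url"], resp.headers["X-Auth-Token"]

storage_url, token = authenticate("myuser", "mypassword")
print("Authenticated; storage endpoint:", storage_url)

Gladinet performs the equivalent handshake for you when you enter the credentials in the mount dialog.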


Backup By Folders

If you need to back up by folders, click the link above in the user interface. A dialog comes up allowing you to select a source folder. Cloud Backup can back up folders that contain active or locked files by taking snapshots of them.
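As a rough illustration of what backing up a folder to object storage involves (not Gladinet’s snapshot mechanism), the sketch below walks a source folder and uploads each file as an object over the same assumed Swift-style API, reusing the storage_url and token from the mount sketch; the container name and source path are placeholders.

import os
import requests

def backup_folder(storage_url, token, container, source_dir):
    headers = {"X-Auth-Token": token}
    # Create the destination container if it does not exist yet.
    requests.put(f"{storage_url}/{container}", headers=headers).raise_for_status()
    for root, _dirs, files in os.walk(source_dir):
        for name in files:
            path = os.path.join(root, name)
            object_name = os.path.relpath(path, source_dir).replace(os.sep, "/")
            try:
                with open(path, "rb") as f:
                    requests.put(
                        f"{storage_url}/{container}/{object_name}",
                        headers=headers,
                        data=f,
                    ).raise_for_status()
            except OSError:
                # Locked or unreadable file; a real backup tool snapshots these
                # through the Volume Shadow Copy Service instead of skipping them.
                print("Skipped:", path)

# Example usage (with storage_url and token from the mount step):
# backup_folder(storage_url, token, "folder-backup", r"C:\Data")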


The next step is to pick the destination. If you have multiple Internap XIPCloud accounts, you can mount them all; otherwise, you will only see one destination as the backup target.
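If you are curious what that destination list corresponds to on the storage side, a Swift-style account listing returns the containers available to the authenticated account, one per line. Again, this assumes a Swift-compatible API and is not how Gladinet enumerates targets.

import requests

def list_destinations(storage_url, token):
    resp = requests.get(storage_url, headers={"X-Auth-Token": token})
    resp.raise_for_status()
    return resp.text.splitlines()  # plain-text listing: one container per line

# Example usage (with storage_url and token from the mount step):
# print(list_destinations(storage_url, token))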


The next step is to select the backup schedule.


The last step is to set the snapshot schedule and other parameters. That is it; fairly simple and straightforward.
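Conceptually, the schedule just determines when the backup job fires. Gladinet’s scheduler is configured entirely in the UI; the toy stand-in below, a plain Python loop, only illustrates the idea of a daily run time.

import time
from datetime import datetime, timedelta

def run_daily(job, hour=2, minute=0):
    """Run job() once a day at the given local time."""
    while True:
        now = datetime.now()
        next_run = now.replace(hour=hour, minute=minute, second=0, microsecond=0)
        if next_run <= now:
            next_run += timedelta(days=1)
        time.sleep((next_run - now).total_seconds())
        job()

# Example: run the folder backup from the earlier sketch at 2:00 AM every day.
# run_daily(lambda: backup_folder(storage_url, token, "folder-backup", r"C:\Data"))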


Backup SQL Server

If you need to back up SQL Server, select the Backup by Application link. It is called backup by application because it supports applications that integrate with the Windows Volume Shadow Copy Service (VSS), and SQL Server is one of the applications that natively integrates with VSS.
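Because Gladinet drives the backup through VSS, the database does not have to be taken offline. For comparison only, the closest do-it-yourself equivalent is a native T-SQL backup followed by an upload of the .bak file; the sketch below shells out to sqlcmd with Windows authentication. Server, database and paths are placeholders, and this is not the VSS-based path Gladinet uses.

import subprocess

def backup_sql_database(server, database, bak_path):
    """Take a full native SQL Server backup to a local .bak file."""
    sql = f"BACKUP DATABASE [{database}] TO DISK = N'{bak_path}' WITH INIT"
    subprocess.run(
        ["sqlcmd", "-S", server, "-E", "-Q", sql],  # -E: Windows authentication
        check=True,
    )

backup_sql_database(r".\SQLEXPRESS", "MyDatabase", r"C:\backups\MyDatabase.bak")
# The resulting .bak file could then be uploaded like any other file,
# for example with backup_folder() from the earlier sketch.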


After selecting the SQL Server database instance to back up, the rest of the user interface is the same: set the backup schedule and other related parameters.

More Stories By Jerry Huang

Jerry Huang, an engineer and entrepreneur, founded Gladinet with his close friends and is pursuing his interests in cloud computing. He publishes articles on the company blog and follows up on the company’s Twitter activity. He graduated from the University of Michigan in 1998 and has lived in West Palm Beach, Florida ever since.
