Case Study: Accelerate - Academic Research | @CloudExpo @DDN_limitless #Cloud #Storage

UCL transforms research collaboration and data preservation with scalable cloud object storage appliance from DDN

University College London (UCL), consistently ranked among the top five universities in the world, is London's leading multidisciplinary university, with more than 10,000 staff, over 26,000 students, and more than 100 departments, institutes and research centers. With 25 Nobel Prize winners and three Fields Medalists among UCL's alumni and staff, the university has attained a world-class reputation for the quality of its teaching and research across the academic spectrum.

As London's premier research institution, UCL has 5,000 researchers committed to applying their collective strengths, insights and creativity to overcome problems of global significance. The university's innovative, cross-disciplinary research agenda is designed to deliver immediate, medium-term and long-term benefits to humanity. UCL Grand Challenges, which encompass Global Health, Sustainable Cities, Intercultural Interaction and Human Wellbeing, are a central feature of the university's research strategy.

According to Dr. J. Max Wilkinson, Head of Research Data Services for the UCL Information Services Division, sharing and preserving project-based research results is essential to the scientific method. "I was brought in to provide researchers with a safe and resilient solution for storing, sharing, reusing and preserving project-based data," he explains. "Our goal is to remove the burden of managing project data from individual researchers while making it more available over longer periods of time."

The Challenge
The opportunity to improve the sharing and accessibility of project-based research presented several unique technical and cultural challenges. On the technical side, the team had to accommodate many different types of data, growing in both volume and velocity. In some cases, a small amount of data is so valuable to a research team that six discrete copies were retained on separate USB drives or removable hard drives kept in different locations. In other instances, UCL researchers produce copious amounts of very well-defined data that passes between the compute algorithms underpinning their research.

In addition to solving technical problems, the research data services team had the opportunity to support researchers in a new 'data-intensive' world by making it safe and easy to follow best practices in data management and to use best-in-class storage solutions. "We discovered the valuable data underpinning most research projects was stuck on a hard drive or disk, never to be seen again," adds Wilkinson. "If we could provide a framework over which people could share and preserve data confidently, we could minimize this behavior and improve research by making the scholarly record more complete."

To accomplish this, UCL needed to provide an enterprise-class foundation for data management that met the needs of its diverse user community. While some researchers considered 100GB a large amount of data, others clamored for more than 100TB to support a single project. There was also an expectation that up to 3,000 individuals from UCL's total base of 5,000 active researchers and collaborators would require services within the next 18 to 24 months.

"We had a simple services proposition that would eliminate the need for research teams to manage racks of servers and data storage devices," says Wilkinson. "Of course, this meant we'd need a highly scalable storage infrastructure that could grow to 100PB without creating a large storage footprint or excessive administrative overhead."

Additionally, the team had to address long-term data retention needs that extended well beyond the life of individual research projects. UCL, along with many other research-intensive institutes in the UK, faces increasingly stringent requirements from UK Research Councils and other funding bodies for the management of project data outputs. As grant funding in the UK rewards best practice, it was critical to have a proven data management plan documenting how UCL would preserve data, sometimes for decades, while ensuring maximum appropriate access and reuse by third parties.

The Solution
In seeking a scalable, resilient storage foundation, UCL issued an RFP to solicit insight into different approaches for consolidating the university's research data storage infrastructure. Each of the 21 RFP respondents was asked to provide examples of large-scale deployments, which produced far-ranging answers, including how providers addressed sheer data volume, reduced increasingly complex environments or delivered overarching data management frameworks.

UCL's RFP covered a diverse set of requirements to determine each potential solution provider's respective strengths and limitations. "We asked for more than we thought possible from a single vendor: from synchronous file sharing to a high-performance parallel file system to highly scalable, resilient storage that would be simple to manage," notes Daniel Hanlon, Storage Architect for Research Data Services at University College London. "We wanted to cover our bases while determining what was practical and doable for researchers."

Recommendations encompassed a broad storage spectrum, including NAS, SAN, HSM, object storage, asset management solutions, and small amounts of spinning disk in front of large back-end tape libraries. "Because we had such broad requirements, we eliminated any vendor that was bound to a particular hardware platform," explains Wilkinson. "It was important to be both data and storage agnostic so we would have the flexibility to support all data and media types without being locked into any particular hardware platform."

With its ability to support virtually unlimited scalability, object storage appealed to UCL, especially since it would also be much easier to manage than the alternatives. Still, object storage was seen as a relatively new technology, and UCL lacked hands-on experience with large-scale deployments within the university's ecosystem. In addition to evaluating the different technologies, UCL also assessed each provider's understanding of UCL's environment, as it was critically important to accommodate researchers' requirements in order to drive acceptance. "Some of the RFP respondents didn't understand the difference between the corporate and academic worlds, and the fact that universities by nature generally have to avoid being tied into particular closed technologies," adds Hanlon. "Many of the RFP respondents were eliminated, not because of their technical response, but because they didn't really get what we were trying to do."
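
To make that appeal concrete: an object store replaces a POSIX filesystem's deep directory trees with a flat namespace of keys, which is largely what lets it scale with so little administration. The sketch below illustrates the idea in Python with the boto3 client against a hypothetical S3-compatible endpoint; the endpoint, bucket and credentials are invented for illustration, and DDN WOS exposes its own native object interface rather than this generic one.

```python
import boto3

# All names below are hypothetical; any S3-compatible object store would do.
# This generic client merely illustrates flat-namespace PUT/GET semantics.
s3 = boto3.client(
    "s3",
    endpoint_url="https://objectstore.example.ac.uk",
    aws_access_key_id="RESEARCHER_KEY",
    aws_secret_access_key="RESEARCHER_SECRET",
)

# Store a dataset under an opaque key: there is no directory tree to
# create, rebalance or repair as the namespace grows.
with open("results.csv", "rb") as f:
    s3.put_object(Bucket="project-data", Key="trial-42/results.csv", Body=f)

# Any authorized collaborator can retrieve it later with the same key.
obj = s3.get_object(Bucket="project-data", Key="trial-42/results.csv")
data = obj["Body"].read()
```

Because every object is addressed by key alone, capacity can be added behind the same namespace without the administrative overhead of growing and balancing a hierarchical filesystem.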

As a result, the universe of prospective solutions was reduced to a half-dozen recommendations. As the team took a closer look at the finalists, they considered each vendor's academic track record, ability to scale without overburdening administrators and experience with open-source technology. "We wanted to work with a storage solutions provider that took advantage of open-source solutions," Hanlon notes. "This would enable us to partner with them and also with other academic institutions trying to do similar things."

In the final analysis, UCL wanted a partner with equal enthusiasm for freeing researchers from the burden of data storage so they could maximize the impact of their projects. "We were very interested in building a relationship with a strong storage partner to fill our technology gap," says Wilkinson. "After a thorough assessment, DataDirect Networks (DDN) met our technical requirements and shared our data storage vision. In evaluating DDN, we agreed that their solution had a simple proposition, high performance and low administration overhead."

The proposed solution, which included the GRIDScaler massively scalable parallel file system and Web Object Scaler (WOS), also provided the desired scalability and management simplicity. Another plus for WOS storage was its tight integration with the integrated Rule-Oriented Data System (iRODS). This open-source data management platform is well suited to research collaboration, making it easier to organize, share and find collections of data stored in local and remote repositories.
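
As a rough illustration of how iRODS supports that kind of collaboration, the sketch below uses the open-source python-irodsclient to register a result file into a shared project collection and tag it with searchable metadata. The host, zone, account and paths here are hypothetical, not details of UCL's deployment.

```python
from irods.session import iRODSSession

# Hypothetical connection details for an iRODS zone; in a deployment like
# the one described, the zone would front object-storage-backed resources.
with iRODSSession(host="irods.example.ac.uk", port=1247,
                  user="researcher", password="secret",
                  zone="exampleZone") as session:
    # Register a local result file into a shared project collection.
    logical_path = "/exampleZone/projects/grand-challenge/results.csv"
    session.data_objects.put("results.csv", logical_path)

    # Attach searchable metadata so collaborators can find the dataset
    # without knowing its exact path.
    obj = session.data_objects.get(logical_path)
    obj.metadata.add("experiment", "trial-42")
    obj.metadata.add("instrument", "sequencer-a")
```

Collaborators could then locate the dataset through an iRODS metadata query rather than by path, which is what makes collections easy to share and reuse across repositories.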

"It was important that DDN's solution gave us multiple ways to access the same storage, so we could be compatible with existing application codes," says Hanlon. "The tendency with other solutions was to give us bits of technology that had been developed in different spaces and that didn't really fit our problem."

The Benefits
During a successful pilot implementation involving a half-petabyte of storage, UCL gained first-hand insight into the advantages of DDN's turnkey distributed storage and collaboration solution. "The main attraction of DDN WOS is the combination of an efficient object store with edge appliances to ease integration with other storage infrastructure," says Hanlon. Another big plus for UCL is DDN's high-density storage capacity, which will let the university fit far more disks into its existing racks, crucial to growing capacity while maintaining a small footprint in UCL's highly congested, expensive central London location.

As researchers are often reluctant to give up control of their data storage, the team has also been pleased to discover early adopters who see the value of using the new service to protect and preserve current data assets. In fact, the new research data service is already getting high marks for its performance, reliability, data durability, backup and disaster recovery capabilities.

UCL predicts that as traction for the new service increases, there will be greater interest in leveraging it to further extend how current research is reused and exploited to drive more impactful outcomes. By taking this innovative approach, the UCL Research Data Services team is embracing the open data movement while enlisting leading-edge technologies to deliver reliable, flexible data access that maximizes appropriate sharing and re-use of research data.

Additionally, with its plans to add a scalable archive to its dynamic storage service offering, UCL is taking the worry of meeting increasingly stringent expectations from funding organizations off researchers' shoulders. "We'll be able to tell researchers that if they use our services, they'll be compliant with UCL, UK Research Council and other UK and international funding bodies' policies and requirements," Wilkinson says. "They won't have to worry about it because we will."

By providing a framework over which UCL researchers can store and share data confidently, UCL expects to achieve significant bottom-line cost savings. Early projections for the initial phase of the infrastructure build-out run to hundreds of thousands of pounds, simply by eliminating the need for thousands of researchers to procure and maintain their own storage hardware. "DDN is empowering us to deliver performance and cost savings through a dramatically simplified approach; in doing so we support UCL researchers, their collaborators and partners to maintain first-class research at London's global university," concludes Wilkinson. "Add in the fact that DDN's resilient, extensible storage solution provided evidence of seamless expansion from a half-petabyte to 100PB, and we found exactly the foundation we were looking for."
