Fujitsu Develops New Data Transfer Protocol Enabling Improved Transmission Speeds

- Software-only approach enables over 30 times improvement in file transfer speeds between Japan and the US
- Reduces virtual desktop operating latency to less than 1/6 of previous levels

Kawasaki, Japan, Jan 29, 2013 - (JCN Newswire) - Fujitsu Laboratories Limited today announced the development of a new data transfer protocol that, by taking a software-only approach, can significantly improve the performance of file transfers, virtual desktops, and various other communications applications.

Conventionally, when transmission control protocol (TCP)(1) - the standard protocol employed in communications applications - is used in a low-quality communications environment, such as on a wireless network or during periods of line congestion, data loss (packet loss) can occur, and the resulting retransmissions increase latency and significantly degrade transmission performance.

To address this problem, Fujitsu Laboratories has developed a software-only solution consisting of:

1) A new protocol that incorporates an efficient, proprietary retransmission method built on user datagram protocol (UDP)(2) - a protocol optimized for delivering streaming media - reducing the latency caused by retransmitting data when packet loss occurs;
2) A control technology that addresses the problem of UDP transmissions consuming excess bandwidth by measuring available network bandwidth in real time and securing an optimal amount of communications bandwidth without crowding out TCP's share; and
3) A technology that uses the new protocol to accelerate existing TCP applications without requiring any modifications to them.

Through a simple software installation, the new technology will make it possible to speed up TCP applications that previously required costly specialized hardware, and it can also be easily incorporated into mobile devices and other kinds of equipment. Moreover, compared with TCP, the technology enables a greater than 30-fold improvement in file transfer speeds between Japan and the US, in addition to reducing virtual desktop operating latency to less than 1/6 of previous levels. This, in turn, is expected to make it easier to take advantage of applications that rely on international communication lines and wireless networks, which are anticipated to become increasingly widespread.

Background

With the increased popularity of mobile devices and cloud services in recent years, a wide range of applications have come to rely on network communications. Many of these applications, such as file transfer and virtual desktop, employ TCP as a standard communications protocol. One issue with TCP is that data loss (packet loss) can occur in low-quality communications environments, and the resulting retransmissions increase latency and significantly degrade transmission performance (reduced throughput and higher latency). With international communications lines and wireless networks expected to be used ever more widely, it is becoming necessary to ensure that transmission performance does not drop even in a low-quality communications environment.
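The release gives no formulas, but a rough back-of-the-envelope calculation makes the problem concrete. The widely cited Mathis approximation, throughput ≈ (MSS / RTT) × 1/√p, shows how quickly even modest packet loss erodes TCP throughput on a long round-trip path such as Japan-US. The sketch below is illustrative only; the MSS, RTT, and loss values are assumptions, not Fujitsu's measurements.

# Rough illustration (not from the release): the Mathis approximation for
# steady-state TCP throughput, throughput ~= (MSS / RTT) * (1 / sqrt(p)).
# All figures below are assumed values for a long trans-Pacific path.
import math

def tcp_throughput_bps(mss_bytes: float, rtt_s: float, loss_rate: float) -> float:
    """Approximate steady-state TCP throughput in bits per second."""
    return (mss_bytes * 8 / rtt_s) * (1.0 / math.sqrt(loss_rate))

mss = 1460    # typical Ethernet MSS in bytes
rtt = 0.15    # ~150 ms round trip, roughly a Japan-US link (assumed)
for loss in (0.0001, 0.001, 0.01):
    print(f"loss {loss:.4f}: ~{tcp_throughput_bps(mss, rtt, loss) / 1e6:.2f} Mbit/s")

With these assumed numbers, throughput falls from roughly 7.8 Mbit/s at 0.01% loss to under 1 Mbit/s at 1% loss, which is the kind of degradation the new protocol is designed to avoid.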

Technological Challenges

Currently, one well-known method of speeding up application transmission speeds in low-quality communications environments is to employ specialized acceleration hardware. This kind of specialized equipment, however, is expensive and bulky, making it difficult to incorporate into mobile devices. High-speed transmission methods for transferring files using software-based acceleration also exist, but to support a variety of existing TCP applications using these methods, it has been necessary to make modifications to the traffic processing components of each application.

Newly Developed Technology

By developing a proprietary software-based transfer protocol, Fujitsu Laboratories has succeeded in significantly improving the throughput and operating latency of existing TCP applications.

Key features of the new technology are as follows:

1) New protocol improves throughput and latency in low-quality communications environments

Fujitsu Laboratories has developed a new protocol that incorporates an efficient, proprietary retransmission method built on UDP, a protocol optimized for delivering streaming media. As a result, the new protocol reduces the latency caused by retransmitting data when packet loss occurs. It can quickly distinguish between packets that have been lost and packets that simply have not yet arrived at their destination, preventing unnecessary retransmissions and the latency they would add. Implemented as a software layer on top of UDP, the protocol retains UDP's characteristic high speed while compensating for its main weaknesses: packet loss and packets arriving out of order. This, in turn, improves both packet delivery and latency. In a comparison with standard TCP, the new protocol achieved a throughput increase of over 30 times in a simulated file transfer between Japan and the US, and packet delivery latency was reduced to less than 1/6 of previous levels.
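Fujitsu has not published the protocol's internals, so the following is only a minimal sketch of the general idea described above, under stated assumptions: a receiver tracks sequence numbers carried in UDP datagrams, treats a gap as merely late for a short reordering window, and only afterwards reports it as lost so the sender retransmits exactly the missing packets. The class and parameter names (ReliableUdpReceiver, REORDER_WINDOW_S) are hypothetical.

# Minimal sketch (assumptions, not Fujitsu's protocol): distinguish "lost"
# from "merely late" packets. A gap in sequence numbers is only reported
# (via a NACK) after a short reordering window, so the sender retransmits
# just the missing packets instead of stalling the whole stream.
import time

REORDER_WINDOW_S = 0.02   # assumed grace period before a gap counts as loss

class ReliableUdpReceiver:
    def __init__(self):
        self.expected_seq = 0          # next in-order sequence number
        self.buffer = {}               # out-of-order packets, keyed by seq
        self.gap_seen_at = {}          # when each missing seq was first noticed

    def on_packet(self, seq: int, payload: bytes, now: float):
        """Handle one incoming datagram; return payloads now deliverable in order."""
        if seq < self.expected_seq:
            return []                  # duplicate of an already-delivered packet
        self.buffer[seq] = payload
        # Note when each not-yet-received sequence number was first missed.
        for missing in range(self.expected_seq, seq):
            self.gap_seen_at.setdefault(missing, now)
        # Deliver any packets that are now contiguous.
        delivered = []
        while self.expected_seq in self.buffer:
            delivered.append(self.buffer.pop(self.expected_seq))
            self.gap_seen_at.pop(self.expected_seq, None)
            self.expected_seq += 1
        return delivered

    def nacks_due(self, now: float):
        """Sequence numbers missing for longer than the reordering window."""
        return [s for s, t in self.gap_seen_at.items()
                if s not in self.buffer and now - t > REORDER_WINDOW_S]

# Tiny usage example with simulated arrival order 0, 2, 3 (packet 1 delayed).
rx = ReliableUdpReceiver()
t0 = time.monotonic()
rx.on_packet(0, b"a", t0)
rx.on_packet(2, b"c", t0)
rx.on_packet(3, b"d", t0)
print(rx.nacks_due(t0 + 0.05))           # -> [1]: presumed lost, request resend
print(rx.on_packet(1, b"b", t0 + 0.06))  # late arrival releases b, c, d in order

In this toy run, packet 1 is flagged for retransmission only after the reordering window expires, so ordinary reordering alone never triggers an unnecessary retransmission.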

2) Communications bandwidth control technology using real-time measurement of available network bandwidth

Fujitsu Laboratories developed a control technology that, by performing real-time measurement of available network bandwidth, can secure an optimal amount of communications bandwidth without overwhelming the share of bandwidth used by other TCP communications in a mixed TCP environment. For example, when other TCP communications are using relatively little bandwidth, the bandwidth share for the new protocol will increase, and when other TCP communications are taking up a higher percentage of bandwidth, the new protocol will use a smaller share.
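As an illustration rather than Fujitsu's actual algorithm, the sketch below shows one simple way such a controller could behave: given a periodic measurement of the bandwidth used by other traffic, it targets a fraction of the remaining capacity and moves its send rate toward that target gradually, so that competing TCP flows keep their share. The class name, safety factor, and smoothing step are assumptions.

# Minimal sketch (assumed behavior, not Fujitsu's algorithm): a pacing
# controller that keeps the UDP-based flow inside the measured spare
# capacity, so competing TCP flows keep their bandwidth share.
class FairShareRateController:
    def __init__(self, link_capacity_bps: float, safety_factor: float = 0.8):
        self.link_capacity_bps = link_capacity_bps
        self.safety_factor = safety_factor      # headroom left for TCP bursts
        self.send_rate_bps = 1_000_000          # conservative initial rate

    def update(self, measured_other_traffic_bps: float) -> float:
        """Recompute our target rate from a fresh measurement of other traffic."""
        available = max(self.link_capacity_bps - measured_other_traffic_bps, 0.0)
        target = available * self.safety_factor
        # Move gradually toward the target instead of jumping, to avoid
        # oscillation while TCP's own congestion control is still reacting.
        self.send_rate_bps += 0.5 * (target - self.send_rate_bps)
        return self.send_rate_bps

# Usage: as other traffic grows from 10 to 60 Mbit/s on a 100 Mbit/s link,
# the controller backs the UDP-based flow off accordingly.
ctrl = FairShareRateController(link_capacity_bps=100e6)
for other in (10e6, 30e6, 60e6):
    print(f"other traffic {other/1e6:.0f} Mbit/s -> our rate {ctrl.update(other)/1e6:.1f} Mbit/s")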

3) Technology for accelerating existing TCP applications without any modifications

Fujitsu Laboratories has developed a technology that automatically converts the standard TCP traffic of a wide variety of applications into the new protocol described in 1) above. This makes it possible to significantly speed up a host of existing applications - including file transfer, virtual desktop, and web browsing applications - without any modifications to the applications themselves.
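The release does not say how this conversion is implemented. A common pattern for accelerating unmodified TCP applications is a local proxy: the application connects to it over ordinary TCP, and the proxy relays the bytes over the faster transport. The skeleton below shows only that interception pattern - the inner leg is left as a plain TCP connection, with a comment marking where the UDP-based protocol would take over - and the listen and remote addresses are placeholders.

# Illustrative sketch only: a local TCP proxy that unmodified applications
# can connect to. The UDP-based protocol itself is not implemented here;
# this skeleton just shows the interception pattern.
import socket
import threading

LISTEN_ADDR = ("127.0.0.1", 9000)     # unmodified app connects here (placeholder)
REMOTE_ADDR = ("198.51.100.10", 80)   # real destination (example address)

def pump(src: socket.socket, dst: socket.socket) -> None:
    """Copy bytes one way until the source closes."""
    try:
        while True:
            data = src.recv(65536)
            if not data:
                break
            # In a real accelerator this send would travel over the
            # UDP-based protocol instead of a plain TCP socket.
            dst.sendall(data)
    except OSError:
        pass
    finally:
        src.close()
        dst.close()

def handle(client: socket.socket) -> None:
    upstream = socket.create_connection(REMOTE_ADDR)
    threading.Thread(target=pump, args=(client, upstream), daemon=True).start()
    threading.Thread(target=pump, args=(upstream, client), daemon=True).start()

def main() -> None:
    with socket.socket(socket.AF_INET, socket.SOCK_STREAM) as server:
        server.setsockopt(socket.SOL_SOCKET, socket.SO_REUSEADDR, 1)
        server.bind(LISTEN_ADDR)
        server.listen()
        while True:
            client, _ = server.accept()
            handle(client)

if __name__ == "__main__":
    main()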

Results

The new technology is expected to speed up a wide range of communications applications employing international communication lines and wireless networks, which are anticipated to become increasingly widespread. For instance, it can accelerate web browsing and file downloads in mobile communications environments where signal quality deteriorates due to building obstructions or movement. It can also improve data transfer speeds between datacenters in Japan and the US, and is expected to improve the usability of virtual desktops when a remote desktop server is accessed over a low-quality communications environment.

Future Development

During fiscal 2013, Fujitsu Laboratories aims to commercialize the new technology as a communications middleware solution for improving communications speeds without having to modify existing TCP applications.

(1) Transmission Control Protocol (TCP): An Internet protocol that guarantees data delivery through a retransmission mechanism.
(2) User Datagram Protocol (UDP): An Internet protocol that does not guarantee data delivery.

About Fujitsu Limited

Fujitsu is the leading Japanese information and communication technology (ICT) company offering a full range of technology products, solutions and services. Over 170,000 Fujitsu people support customers in more than 100 countries. We use our experience and the power of ICT to shape the future of society with our customers. Fujitsu Limited (TSE:6702) reported consolidated revenues of 4.5 trillion yen (US$54 billion) for the fiscal year ended March 31, 2012. For more information, please see www.fujitsu.com.



Source: Fujitsu Limited

Contact:
Fujitsu Limited
Public and Investor Relations
www.fujitsu.com/global/news/contacts/
+81-3-3215-5259


Copyright 2013 JCN Newswire. All rights reserved. www.japancorp.net

