CloudExpo® Blog: Article

Cloud Transaction Synchronicity

Without single-source timing, financial transactions running in clouds are as random as raindrops

High-Frequency Trading (HFT) and secret algorithms have become the new competitive strategy in today's global financial industry. The faster traders can turn around trades, the faster they can get in and out of fast-moving markets and exploit short-lived pockets of opportunity.

Having an accurate record of when a transaction occurs is critical to processing the trade and valuing it correctly.

There are many articles and white papers discussing cloud computing and shared services. Among the services being touted are SaaS (Software as a Service), PaaS (Platform as a Service), and IaaS (Infrastructure as a Service).

What is missing is the ability to synchronize transactions coming from various points of origination. What is needed is Timing as a Service (TaaS).

A Master Clock Is Needed
A TaaS offering would synchronize every transaction to one master clock. Just as the public switched telephone network (PSTN) derives its timing from a single master (atomic) clock, applications that need precise timing should be able to draw it from one authoritative source. That timing feed could be a whole new network service for carriers to provide.
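To make the master-clock idea concrete, here is a minimal sketch of how a client could estimate its offset from a single master clock using the classic round-trip calculation from NTP-style protocols. The function name and the sample timestamps are illustrative, not part of any real TaaS API.

```python
# Hypothetical sketch: estimating a client's offset from one master clock
# using the standard NTP-style round-trip calculation.

def ntp_offset(t1, t2, t3, t4):
    """Estimate clock offset and round-trip delay.

    t1: client send time (client clock)
    t2: master receive time (master clock)
    t3: master send time (master clock)
    t4: client receive time (client clock)
    Returns (offset, round_trip_delay) in the same units as the inputs.
    """
    offset = ((t2 - t1) + (t3 - t4)) / 2
    delay = (t4 - t1) - (t3 - t2)
    return offset, delay

# Example: the client clock runs 5 units behind the master, with a
# symmetric network latency of 2 units in each direction.
offset, delay = ntp_offset(t1=100, t2=107, t3=107, t4=104)
print(offset, delay)  # → 5.0 4
```

With symmetric network paths, the estimated offset lets every participant correct its local timestamps back to the single master reference before stamping a trade.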

"Timing as a Service"© (TaaS©) should be implemented and offered as a network service for financial institutions and related firms using any cloud technologies. HFT needs TaaS© for accuracy.

Without synchronized timing coming from one source, high-speed financial transactions running through various financial networks cannot be as accurate as they should be.

Synchronized Financial Services Framework
Synchronized, transaction-centric timing should be a mandatory service for all financial transactions transmitted to the exchanges. A well-defined timing standard for financial services (trading, transaction processing) should be developed and accepted by the various financial trading organizations, because it would provide a more accurate layer of processing and oversight for every transaction. A Synchronized Financial Services Framework would go down to nanosecond granularity so that trades executed in microseconds could be reliably sequenced into their proper order.
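The sequencing idea above can be sketched in a few lines. This is a hypothetical illustration, assuming each trade carries a nanosecond timestamp stamped from the single master clock; the record layout and field names are inventions for the example, not an exchange format.

```python
# Hypothetical sketch: sequencing microsecond-scale trades by
# nanosecond timestamps taken from one master clock.

from dataclasses import dataclass

@dataclass(frozen=True)
class Trade:
    trade_id: str
    master_ts_ns: int   # nanoseconds since epoch, from the single master clock
    price: float

def sequence(trades):
    """Return trades in the order they actually occurred.

    Because every timestamp comes from the same clock, a plain sort
    yields a true total order; with per-server clocks that drift by
    tens of milliseconds, the same sort would be meaningless.
    """
    return sorted(trades, key=lambda t: t.master_ts_ns)

trades = [
    Trade("B", 1_650_000_000_000_000_750, 101.02),
    Trade("A", 1_650_000_000_000_000_100, 101.01),
    Trade("C", 1_650_000_000_000_000_420, 101.00),
]
print([t.trade_id for t in sequence(trades)])  # → ['A', 'C', 'B']
```

The point of the single source is exactly this: the sort key is only meaningful because all three timestamps share one reference.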

Why is this so important? Financial transactions have to be processed in a very orderly fashion. Each transaction has a very precise value assigned to it based on the timing of the trade.

In the old days when everything was done manually, each order had to be time-stamped on the floor so that its value could be recorded based on the time of trade. Now, that time of trade has gone from a tenth of a second to a microsecond. (See Chart)

Financial Transactions Timing Chart

Speed of Transaction   Fraction of a Second   Timing From         Observation
Hundredth              1/100                  Multiple servers    Where regulators are today
Millisecond            1/1,000                Multiple servers    Latency (30-40 milliseconds)
Microsecond            1/1,000,000            Multiple servers    Where traders are today
Nanosecond             1/1,000,000,000        One atomic clock    Where regulators should be

Source: James Carlini, 2012

The safeguards and regulatory oversight have not gotten down to that level of granularity. If they had, the explanation of the 2010 Dow flash crash (the 1,000-point drop) would have been more accurate.

20th Century Solutions Cannot Be Applied to 21st Century Challenges
New services mean new revenues for network carriers. Each network circuit used by financial organizations would have to provide timing for all transactions sent through the carriers' networks.

It's too bad the carriers no longer have visionaries on staff to dream up new network services like this. They should have been offering this type of service already. If they had, moving mission-critical applications to cloud computing services would be an easier decision than it is today.

Network carriers seem stuck in their 1960s monopolistic telephony mindset, even though this is a service clearly needed to facilitate cloud applications and opportunities in the 21st century.

•   •   •

Copyright 2012 - James Carlini

More Stories By James Carlini

James Carlini, MBA, a certified Infrastructure Consultant, keynote speaker and former award-winning Adjunct Professor at Northwestern University, has advised on mission-critical networks. Clients include the Chicago Mercantile Exchange, GLOBEX, and City of Chicago’s 911 Center. An expert witness in civil and federal courts on network infrastructure, he has worked with AT&T, Sprint and others.

Follow daily Carlini-isms at www.twitter.com/JAMESCARLINI
