Introduction to the project
This Software Group incubation project, codenamed "Botticelli," focused on the front office within capital markets. The front office involves the buying and selling of securities by traders. This project implemented a simple algorithmic trading use case. Algorithmic trading refers to the programmatic placing of buy and sell orders. In algorithmic trading, a quantitative model automatically generates the timing and size of the orders based on certain parameters and constraints. These trading environments require a high-speed infrastructure that can handle large volumes of data to optimize the firm's trades. In the project scenario, the algorithmic program analyzes and processes live market data in conjunction with other reference data and generates orders to buy or sell securities.
The front office infrastructure needed to support algorithmic trading is different from the mid- and back-office infrastructures. The latter are more business process-driven, with guaranteed delivery of messages being a critical requirement to ensure that none of the orders are lost.
Within an algorithmic trading scenario, there are a number of steps that must be executed, and we applied IBM technology to demonstrate each of these steps.
Algorithmic trading process requirements
Algorithmic trading requires an infrastructure that provides high-speed access to large volumes of data. The trading platform typically requires various capabilities, including the ability to:
- Accept streams of incoming market data
- Distribute market data to applications/processes in microseconds
- Analyze market data in real time
- Pre-load reference and historical data
- Support business rules
- Provide access for human traders to the market data and other financial information
- Monitor the health of the algorithmic trading environment
The Botticelli scenario depicted in Figure 1 represents a simplified algorithmic trading use case.
Figure 1. Botticelli algorithmic trading scenario
The flow is as follows:
- The NYSE and NASDAQ market data is processed through feed handlers, and is made available to subscribers.
- Two subscribers receive the market data: the bargain index algorithmic trading program and the data hub.
- Reference data and threshold parameters are loaded into memory to be used throughout the trading session by the algorithmic trading programs.
- The bargain index program receives the market data over a high-speed connection and does the following:
- Retrieves reference data, such as earnings per share, analyst ratings, and configuration parameters from memory.
- Calculates volume-weighted average price for the trades.
- Performs calculations and generates the orders that it has identified as bargains.
- Determines if the order is an exception (exceeds a threshold on size).
- Sends non-exception orders through the data hub to be routed to the execution venue.
- Invokes business rules to determine to whom to route exception orders.
- Routes the exception orders through the data hub to the appropriate trader.
- The trader desktop subscribes to the data hub and displays:
- A watch list of stocks (Dow 30 for this scenario), various financial widgets, and exception orders for the trader.
- The monitoring dashboard subscribes to the data hub and displays:
- Multi-dimensional views of the order data, order status, output from the bargain index program, system latency information, and exception order breakdown.
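The two core calculations in this flow, the volume-weighted average price and the exception-size check, can be sketched as follows. The function names and the threshold value are illustrative assumptions, not taken from the project code.

```python
# Sketch of the bargain-index flow's core calculations.
# Names and the threshold value are illustrative, not the Botticelli code.

def vwap(trades):
    """Volume-weighted average price over a list of (price, volume) trades."""
    total_volume = sum(v for _, v in trades)
    if total_volume == 0:
        return 0.0
    return sum(p * v for p, v in trades) / total_volume

def is_exception(order_size, size_threshold=10_000):
    """An order is an exception if it exceeds the configured size threshold."""
    return order_size > size_threshold

trades = [(100.0, 200), (101.0, 300), (99.5, 500)]
print(round(vwap(trades), 2))  # → 100.05
```

In the scenario, orders for which `is_exception` returns true are routed through the data hub to a trader for review rather than straight to the execution venue.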
Financial markets front office blueprint
Figure 2 illustrates the suite of IBM middleware that was integrated to provide the capabilities defined in the previous section.
Figure 2. Front office blueprint
Let's take a look at each of the IBM technologies and products that are being used for these capabilities. Each product described below is associated with one of the lettered components (for example, "A") in Figure 2.
- IBM WebSphere® Front Office (A)
- WebSphere Front Office provides over 95 feed handlers covering the major North American, European, and Asian equities feeds, the largest options feed, and some futures and commodities feeds within North America. The feed handlers understand how to interpret the incoming data, normalize it, and make it available to subscribers. The data distribution feature provides various ways to distribute the data, each involving different qualities of service.
- WebSphere MQ Low Latency Messaging (B)
- WebSphere MQ Low Latency Messaging is a low-latency distribution bus that addresses the needs of the financial markets trading infrastructure. It supports features such as high-performance messaging, multicast and unicast transports, component state replication, fault detection and failover, and consistent ordered message delivery.
Low Latency Messaging recently participated in a Securities Technology Analysis Center (STAC) performance benchmark. In terms of speed, at a rate of 50K messages per second, STAC benchmarked WebSphere MQ Low Latency Messaging at eight-microsecond single-hop latency on InfiniBand. In terms of throughput, internal performance tests benchmarked WebSphere MQ Low Latency Messaging at over 13 million 45-byte messages per second on InfiniBand.
- IBM InfoSphere™ Streams (C)
- InfoSphere Streams is a high-performance stream processing technology
that can be used to rapidly analyze data, news, and video in real time
as the data streams from thousands of real-time sources. The Streams
platform can analyze structured and unstructured data and can scale to
over 125 node (server) instances. Within Botticelli, we implemented
the bargain index as a program running on the Streams platform.
Figure 3 depicts the runtime view of the bargain index program. This program is implemented as a one-way flow of data tuples. It starts with the input of market data from WebSphere Front Office, and then splits the data into trades and quotes. The trades are enriched with additional data, and the volume-weighted average price calculation is performed. Orders are eventually generated, and the output is sent to the data hub over a WebSphere MQ Low Latency Messaging connection. The program was created using a stream-processing language that supports declarative composition of operators, which are its basic building blocks. Streams provides code generation for high performance and platform independence. Because of these features, Streams can be a very good technology for an algorithmic trading platform.
Figure 3. Bargain index program
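The real program is written in Streams' declarative stream-processing language; as a rough Python analogy, the one-way tuple flow of Figure 3 (split, enrich, generate orders) might look like this. The field names, the rating-based bargain test, and all values are illustrative assumptions.

```python
# Rough Python analogy of the one-way tuple flow in Figure 3.
# Operator and field names are illustrative, not the actual Streams program.

def split(ticks):
    """Split incoming market data into trade and quote streams."""
    trades = [t for t in ticks if t["type"] == "trade"]
    quotes = [t for t in ticks if t["type"] == "quote"]
    return trades, quotes

def enrich(trades, reference):
    """Join each trade with preloaded reference data (e.g. EPS, rating)."""
    return [{**t, **reference.get(t["symbol"], {})} for t in trades]

def generate_orders(trades, max_rating=2):
    """Emit a buy order for enriched trades whose rating marks a bargain."""
    return [{"symbol": t["symbol"], "side": "BUY", "size": t["volume"]}
            for t in trades if t.get("rating", 99) <= max_rating]

ticks = [{"type": "trade", "symbol": "IBM", "price": 120.0, "volume": 500},
         {"type": "quote", "symbol": "IBM", "bid": 119.9, "ask": 120.1}]
reference = {"IBM": {"eps": 10.0, "rating": 1}}

trades, quotes = split(ticks)
orders = generate_orders(enrich(trades, reference))
print(orders)  # → [{'symbol': 'IBM', 'side': 'BUY', 'size': 500}]
```

In Streams, each of these stages would be a declaratively composed operator, with code generation producing the high-performance runtime.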
- IBM solidDB® (D)
- solidDB provides in-memory retrieval of data using specialized access methods. These access methods make data access and storage up to ten times faster than retrieving cached data from a traditional disk-based database system. Within Botticelli, we preloaded the reference data that the algorithmic programs would need into solidDB so that accessing that data in real time would be as fast as possible.
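As a rough analogy for the preload pattern, the following sketch uses Python's built-in sqlite3 with an in-memory store standing in for solidDB (whose actual ODBC/JDBC and native interfaces differ); the table layout and values are illustrative assumptions.

```python
import sqlite3

# Analogy only: an in-memory SQLite store standing in for solidDB.
# Column names and values are illustrative.
db = sqlite3.connect(":memory:")
db.execute("CREATE TABLE reference (symbol TEXT PRIMARY KEY,"
           " eps REAL, rating INTEGER)")
db.executemany("INSERT INTO reference VALUES (?, ?, ?)",
               [("IBM", 10.0, 1), ("MSFT", 8.5, 2)])

# At session start the algorithmic program preloads the table once...
reference = {sym: (eps, rating)
             for sym, eps, rating in db.execute("SELECT * FROM reference")}

# ...and then does in-memory lookups on the hot path instead of disk I/O.
print(reference["IBM"])  # → (10.0, 1)
```

The point of the pattern is that nothing on the real-time path touches disk: all reference reads during the trading session are served from memory.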
- WebSphere ILOG® Business Rule Management Systems (BRMS) (E)
- ILOG technology specializes in business rule management, optimization,
and visualization. The ILOG BRMS is called JRules. JRules provides the
ability to centralize rule management and administration within an
enterprise, providing auditability and traceability, which are
critical requirements in the financial markets industry. For this
reason, we use JRules to handle the routing of exception orders to
traders for review.
Within Botticelli, we invoked routing rules created in JRules to determine where to route exception orders. These are algorithmically generated orders that exceed thresholds set on order size. The next article in this series addresses how we integrated Streams and ILOG. (To listen to a recorded ILOG demo that shows how JRules can be used in a Financial Markets firm, see Resources.)
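Conceptually, the exception-order routing reduces to an ordered set of condition/destination rules. The sketch below is a hard-coded stand-in to show the idea only; in JRules the rules are authored, centralized, and managed in the BRMS rather than embedded in code, and the conditions and trader names here are invented.

```python
# Illustrative stand-in for externalized routing rules. Real JRules rules
# live in the BRMS; the predicates and destinations below are invented.
ROUTING_RULES = [
    # (predicate, destination) pairs evaluated in order; first match wins.
    (lambda o: o["size"] > 50_000, "senior-trader"),
    (lambda o: o["sector"] == "TECH", "tech-desk"),
    (lambda o: True, "default-desk"),  # fallback rule
]

def route_exception_order(order):
    """Return the trader/desk that should review this exception order."""
    for predicate, destination in ROUTING_RULES:
        if predicate(order):
            return destination

print(route_exception_order({"size": 60_000, "sector": "ENERGY"}))
# → senior-trader
```

Externalizing these rules into a BRMS is what provides the auditability and traceability the article highlights: the rules can be changed, versioned, and reviewed without touching the trading code.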
Botticelli also used ILOG visualization widgets within the trader desktop. IBM ILOG JViews Enterprise provides a wide range of graphical interfaces that can be integrated into desktop, Ajax, and Eclipse displays.
- WebSphere eXtreme Scale (F)
- WebSphere eXtreme Scale is a caching and grid technology that we
utilized as a central data hub for the data that we needed to make
available to client applications. We cached the market data from
WebSphere Front Office, the order data from Streams, and the
historical data required by the trader desktops.
We created different receivers that could process the incoming data over multiple types of messaging protocols and made that data available through a subscription manager.
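A minimal sketch of that subscription-manager pattern follows: receivers publish incoming data by topic, client applications subscribe by topic, and the latest value is cached so late joiners see current state. The class shape, topic names, and caching behavior are assumptions for illustration, not the eXtreme Scale API.

```python
from collections import defaultdict

# Minimal sketch of a data-hub subscription manager; illustrative only,
# not the WebSphere eXtreme Scale API.
class SubscriptionManager:
    def __init__(self):
        self.subscribers = defaultdict(list)
        self.cache = {}  # last value per topic, for late joiners

    def subscribe(self, topic, callback):
        self.subscribers[topic].append(callback)
        if topic in self.cache:          # replay the latest cached value
            callback(self.cache[topic])

    def publish(self, topic, data):
        self.cache[topic] = data
        for callback in self.subscribers[topic]:
            callback(data)

hub = SubscriptionManager()
received = []
hub.subscribe("orders", received.append)
hub.publish("orders", {"symbol": "IBM", "side": "BUY"})
print(received)  # → [{'symbol': 'IBM', 'side': 'BUY'}]
```

In the Botticelli hub, the "receivers" feeding `publish` would each speak a different messaging protocol (Front Office feeds, Low Latency Messaging from Streams), while trader desktops and the monitoring dashboard sit on the `subscribe` side.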
- Desktop clients: IBM Lotus® Expeditor and WebSphere sMash (G)
- There are different requirements for client desktops based on the volume and frequency of data changes. We created a desktop application that exhibits very low latency using Lotus Expeditor, and a Web-based application using WebSphere sMash (that can be accessed by any Web browser).
Lotus Expeditor is a server-managed client platform that provides a single container to integrate desktop applications, Web applications, and legacy applications. You can contextually link the content of the applications through the use of a property broker. In Botticelli, we also integrated the ILOG JViews visualization widgets for our charts and graphs.
WebSphere sMash is a development and execution platform for quickly building and running dynamic Web 2.0-based applications using SOA principles. It supports scripting languages such as PHP and Groovy, and provides an agile Web application development environment.
- IBM Cognos® Now! (H)
- In addition to the trader desktop, we identified a need to provide application monitoring capability. Trading desk managers are interested in:
- How many orders are being generated
- The order breakdown by sector/symbol/time
- How the data feeds are running
- What the order status is
Cognos Now! has a streaming engine that allows it to process feeds of data in near real time. In our scenario, the order data is sent from the algorithmic program into the Cognos Now! dashboard. Multi-dimensional views of the order data are presented on the dashboard in a variety of formats. The Alert Manager capability allows business users to identify data conditions that they want to be alerted about. Watch points can be used to highlight data conditions within the dashboard tables.
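A watch point is essentially a user-defined condition evaluated against each row of a dashboard table. The sketch below shows the idea only; the field names, threshold, and tagging scheme are invented, not Cognos Now! behavior.

```python
# Sketch of a dashboard watch point: tag rows whose values cross a
# user-defined condition. Field names and threshold are illustrative.
def watch_points(rows, field, threshold):
    """Return each row tagged with whether it breaches the watch point."""
    return [{**row, "alert": row[field] > threshold} for row in rows]

orders_per_minute = [{"symbol": "IBM", "count": 120},
                     {"symbol": "MSFT", "count": 40}]
print(watch_points(orders_per_minute, "count", 100))
```

In the dashboard, breaching rows would be visually highlighted, and the Alert Manager would notify the business user when such a condition occurs.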
A future article in this series describes the integration between Cognos Now! and WebSphere MQ Low Latency Messaging as part of this project.
There were multiple integration points that were addressed as part of this effort. Because WebSphere MQ Low Latency Messaging is the primary messaging bus, the products in the blueprint had to be enabled to use it. This required the development of a number of new APIs, and the code for those APIs has been contributed to the product teams. There was also work done to integrate the JRules engine with Streams, enabling dynamic refresh of business rules in a running Streams program. Future articles in this series describe how we addressed these and other integration issues. These integration points include:
- Sending WebSphere Front Office data to InfoSphere Streams
- Receiving solidDB data into InfoSphere Streams
- Sending InfoSphere Streams output to other applications using the WebSphere MQ Low Latency Messaging protocol
- Receiving data into Cognos Now! over WebSphere MQ Low Latency Messaging
- Embedding the ILOG JRules engine into an InfoSphere Streams program
- Extracting latency metrics from WebSphere Front Office, WebSphere MQ Low Latency Messaging, InfoSphere Streams, and solidDB
This article provided you with an overview of some of the IBM capabilities in the financial markets industry, specifically the front office. We used an algorithmic trading scenario to illustrate how IBM middleware technology can be integrated to provide a sample solution. The same technologies can provide a framework for many other solutions in this domain that require high-speed, high-throughput capabilities.
We would like to thank the people who contributed to the Botticelli project:
- Nick Schofield, Wei Tchao, and Wayne Lee developed the prototype that integrates these products and provides a demonstration.
- Folu Okunseinde provided financial markets technical expertise and architectural direction.
- Rajiv Chodhari and Philip Enness provided business direction priorities, financial markets expertise, and customer requirements.
- Jim Caldwell made us aware of this problem domain and provided great management support.
Resources
- IBM financial markets industry expertise: Learn more about IBM financial market solutions.
- WebSphere MQ Low Latency Messaging: Learn more about WebSphere MQ Low Latency Messaging.
- WebSphere Front Office: Learn more about WebSphere Front Office.
- InfoSphere Streams: Learn more about InfoSphere Streams.
- solidDB: Learn more about solidDB product family.
- WebSphere ILOG BRMS: Learn more about WebSphere ILOG Business Rule Management Systems.
- WebSphere sMash: Learn more about WebSphere sMash.
- Cognos Now!: Learn more about Cognos Now!
- WebSphere eXtreme Scale: Learn more about WebSphere eXtreme Scale.
- "Getting Started with WebSphere eXtreme Scale, Part 1: Understanding WebSphere eXtreme Scale and how it works" (developerWorks, November 2009): Gain a technical understanding of what WebSphere eXtreme Scale is, the features it provides, and the vast benefits it offers.
- Demo: "Cognos Now! for the financial industry" (developerWorks, February 2010): See how Cognos Now! can be used by financial markets firms in an algorithmic trading scenario.
- Demo: "IBM WebSphere ILOG Business Rule Management System applied in the financial industry" (developerWorks, February 2010): Understand IBM WebSphere ILOG Business Rule Management Systems (BRMS) capabilities and how they can be applied in the financial market front office.
- developerWorks Information Management zone: Learn more about Information Management. Find technical documentation, how-to articles, education, downloads, product information, and more.
Get products and technologies
- Build your next development project with IBM trial software, available for download directly from developerWorks.
- Lotus Expeditor wiki: Find and contribute to information about installing, administering, and using Lotus Expeditor components.