by Nick Saunders
8 November, 2016 - 5 minute read

How does Winton use technology? And how are advances in technology pushing our business forward?

In the investment industry, Winton is known as a ‘systematic’ manager, as we use computer algorithms to determine what financial products to buy and sell on behalf of our clients.

The absence of human intervention in the day-to-day investment process means that we are heavily reliant on our systems and applications, many of which we have developed in-house. But just like any technology company, we know that we have to keep evolving and enhancing those systems so that we can allow our business to grow and develop, often in ways that haven’t yet been identified.

To provide that flexibility, we need a deep understanding of our software and our infrastructure. One way to appreciate the nature of this challenge is to think of our technology environment as a pipeline, with five key segments performing the primary functions:

Technology Pipeline

Data Ingestion

We are data hungry. Whether we acquire it from vendors or collect it through proprietary means, and whether it is financial or alternative in nature, processing data at massive scale underpins everything we do at Winton. But acquisition is just the beginning.

Data is often inaccurate, incomplete or simply wrong. We can’t afford to assume our data is correct, regardless of whether we are applying it to our live trading or to our long-term research. As a consequence, we continue to advance our tool set to transform, test, clean, combine and validate all aspects of our data universe, every day.

Providing high-quality, accurate, immutable data is the foundation of our business.
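
The article doesn't describe Winton's validation tooling, but the idea of testing data before trusting it can be sketched in a few lines. The record format, thresholds and field names below are illustrative assumptions, not Winton's actual rules:

```python
# Illustrative only: a toy validation step for a daily price series.
# The 25% daily-move threshold is an invented rule of thumb.

def validate_prices(prices, max_daily_move=0.25):
    """Flag suspect observations in a list of (date, price) tuples.

    Returns a list of (date, reason) pairs for a human or a downstream
    process to review before the data is accepted as 'clean'.
    """
    issues = []
    prev_price = None
    for date, price in prices:
        if price is None:
            issues.append((date, "missing price"))
            continue  # keep the last good price as the comparison point
        if price <= 0:
            issues.append((date, "non-positive price"))
        elif prev_price is not None:
            move = abs(price - prev_price) / prev_price
            if move > max_daily_move:
                issues.append((date, "implausible %.0f%% move" % (move * 100)))
        prev_price = price
    return issues

series = [
    ("2016-11-01", 100.0),
    ("2016-11-02", 101.5),
    ("2016-11-03", None),   # missing observation
    ("2016-11-04", 160.0),  # suspicious jump vs. 101.5
]
print(validate_prices(series))
```

Real pipelines add many more checks (cross-source comparison, stale-data detection, corporate-action awareness), but the pattern is the same: every observation must earn its way into the clean data set.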

Signal Discovery

A signal is an indication of whether to buy or sell a particular security. Discovering signals in noisy data sets and translating those discoveries into a repeatable process performed by a computer is our equivalent of ‘Portfolio Management’. But discovering high-quality signals is difficult – if it were easy, everyone would be doing it!

The harder it becomes to experiment with, back-test and verify new signals, the more focus falls on the infrastructure (both on-premises and cloud-based) required to perform the calculation ‘heavy lifting’. And the more signals we generate, the more complex it becomes to combine them and manage their delivery.

This inflection point is where research, data and technology come together to produce Winton’s core output.
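
Winton's signals are proprietary, so purely as an illustration of the signal, position and back-test mechanics described above, here is a toy moving-average crossover on made-up prices. The window lengths, the data and the naive back-test are all assumptions for the sketch:

```python
# Toy example: a moving-average crossover 'signal' and a naive back-test.
# Nothing here reflects Winton's actual models; it only illustrates the
# mechanics of turning a data series into positions and a P&L.

def moving_average(xs, window):
    return [sum(xs[i - window + 1:i + 1]) / window
            for i in range(window - 1, len(xs))]

def crossover_signal(prices, fast=3, slow=5):
    """+1 (long) when the fast average is above the slow average, else -1."""
    fast_ma = moving_average(prices, fast)[slow - fast:]  # align with slow
    slow_ma = moving_average(prices, slow)
    return [1 if f > s else -1 for f, s in zip(fast_ma, slow_ma)]

def backtest(prices, signal, offset):
    """Apply yesterday's signal to today's price change."""
    pnl = 0.0
    for t in range(1, len(signal)):
        pnl += signal[t - 1] * (prices[offset + t] - prices[offset + t - 1])
    return pnl

prices = [100, 101, 103, 102, 104, 106, 105, 107, 109, 108]
sig = crossover_signal(prices)           # one signal per aligned day
print("signal:", sig)
print("toy P&L:", backtest(prices, sig, offset=4))  # offset = slow - 1
```

The hard part, of course, is not the mechanics but everything around them: avoiding look-ahead bias and over-fitting, and running such experiments at scale across thousands of data sets.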

Order Generation

Once we have generated the signals, we need to translate them into their real-world equivalents: financial products and orders (i.e. the amounts of a particular security to buy or sell). Winton advises many ‘funds’ on behalf of many clients, who receive a variety of signal combinations depending on their product choice.

But it is not that simple. We need to consider a range of constraints, whether mandated by our regulators, our clients or by our internal risk management group. Performing these calculations safely, fairly and quickly within an automated framework allows us to operate at scale.
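
The article doesn't detail Winton's order-generation logic, but the basic mechanics can be sketched. In this toy Python function, the security names, the 10% per-position cap and the fund size are all invented for illustration; real constraint handling is far richer:

```python
# Simplified sketch: convert target portfolio weights into share orders,
# subject to an illustrative per-position limit. All names, limits and
# sizes here are assumptions, not Winton's actual rules.

def generate_orders(targets, holdings, prices, nav, max_weight=0.10):
    """For each security, compute the share delta needed to move the current
    holding to the (constraint-clipped) target weight of the fund's NAV."""
    orders = {}
    for sec, weight in targets.items():
        # Enforce the position limit before sizing the order.
        clipped = max(-max_weight, min(max_weight, weight))
        target_shares = clipped * nav / prices[sec]
        delta = target_shares - holdings.get(sec, 0)
        if abs(delta) >= 1:  # ignore sub-share dust
            orders[sec] = round(delta)
    return orders

targets = {"AAA": 0.08, "BBB": 0.15, "CCC": -0.05}  # BBB exceeds the 10% cap
holdings = {"AAA": 500, "BBB": 0, "CCC": -200}
prices = {"AAA": 40.0, "BBB": 25.0, "CCC": 50.0}
print(generate_orders(targets, holdings, prices, nav=1_000_000))
```

In practice each fund applies its own combination of regulatory, client and internal risk constraints, which is why doing this safely and fairly across many funds is an engineering problem in its own right.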


Order Execution

Once we know what we need to buy or sell, we move into the order execution phase. While Winton is not a ‘high-frequency’ trader, we do use algorithms to determine how to place our buy and sell orders in the market. We also perform continuous transaction cost analysis, which provides a feedback loop informing us of our market impact and the effectiveness of our algorithms.

Scale and throughput are key to our execution engine. Our controls and technical resilience, combined with in-house tooling to support human oversight, give us confidence that we are transacting as intended: in a smart and responsible way.
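
Winton's execution algorithms are proprietary, but two of the ideas above (slicing large orders into smaller ones, and measuring transaction costs as a feedback loop) can be illustrated with toy code. The TWAP-style slicer, the fill prices and the basis-point convention below are illustrative assumptions:

```python
# Toy sketch of two execution-phase ideas: slicing a large parent order
# into child orders, and measuring slippage against the price at the
# moment the trading decision was made. All numbers are invented.

def slice_order(quantity, n_slices):
    """Split a parent order into roughly equal child orders (a TWAP-style
    schedule); any remainder goes into the final slice."""
    base = quantity // n_slices
    slices = [base] * n_slices
    slices[-1] += quantity - base * n_slices
    return slices

def implementation_shortfall(decision_price, fills):
    """Average fill price vs. decision price, in basis points (for a buy)."""
    total_qty = sum(q for q, _ in fills)
    avg_price = sum(q * p for q, p in fills) / total_qty
    return (avg_price - decision_price) / decision_price * 1e4

children = slice_order(10_000, 4)
fills = list(zip(children, [50.00, 50.02, 50.05, 50.03]))  # (qty, price)
print("child orders:", children)
print("slippage (bps): %.1f" % implementation_shortfall(50.00, fills))
```

Feeding measurements like this back into the choice and tuning of execution algorithms is what the transaction-cost-analysis feedback loop amounts to in practice.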


Post-Trade Processing

Our job doesn’t end once an order has been executed. While many investment managers view the ‘post-trade’ process (confirmation, settlement, valuation, reporting etc.) as something to outsource so that it is ‘out of sight’, Winton takes a different view.

We have built our operational platform to act as the engine room of our business, providing full control and transparency. But just like any engine, it needs to operate at peak efficiency without excessive manual intervention.

If we can successfully engineer the post-trade process, we can successfully scale our business. This is where many of our most interesting software challenges lie.
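
As one concrete example of the kind of post-trade engineering this involves, here is a minimal reconciliation sketch. The trade-record format is a simplifying assumption; a real reconciliation also handles partial fills, fees and price tolerance bands:

```python
# Minimal sketch of one post-trade control: reconciling our internal record
# of executions against a broker's confirmations. The (security, side,
# quantity, price) record format is an assumption made for this example.

def reconcile(internal, broker):
    """Return (matched, breaks): trades present in both records, versus
    trades present in exactly one, which need follow-up."""
    internal_set, broker_set = set(internal), set(broker)
    matched = internal_set & broker_set
    breaks = (internal_set - broker_set) | (broker_set - internal_set)
    return matched, breaks

ours = [
    ("AAA", "BUY", 1500, 40.01),
    ("BBB", "BUY", 4000, 25.10),
    ("CCC", "SELL", 800, 49.95),
]
theirs = [
    ("AAA", "BUY", 1500, 40.01),
    ("BBB", "BUY", 4000, 25.15),   # price discrepancy -> a 'break'
    ("CCC", "SELL", 800, 49.95),
]
matched, breaks = reconcile(ours, theirs)
print("matched:", len(matched), "breaks:", sorted(breaks))
```

The engineering goal is that breaks like the BBB price discrepancy surface automatically and immediately, so that manual effort goes into resolving exceptions rather than finding them.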

So what tools do we use across our business? Our tech stack below shows the main components, many of which will feature in our blog posts in the coming weeks and months:

Winton Technology Stack

This picture is anything but static as we recognise the need to understand how the technical landscape is evolving. While we go to great lengths to appraise and evaluate new additions prior to inclusion in our stack, we actively incorporate new technologies and solutions when they suit our needs.

So is that the extent of technology at Winton? Far from it.

Technologists work closely with our Venture Capital group to identify and develop exciting early-stage data and tech-focussed businesses. We collaborate with colleagues at our Data Science hub in San Francisco, who are surfacing original data sets from which we can generate value. We support our long-term research initiatives, which have been the beating heart of Winton for the past 20 years. And we actively contribute to a range of open-source projects, many of which will appear in upcoming blog posts. Who knows? Perhaps we will release our own open-source project in the future…!

We hope that you find our blog interesting and enlightening as we share some of our successes as well as our challenges. If you want to get in touch, feel free to connect …