# Difference between revisions of "Liquidity tracker"

## Revision as of 19:53, 5 February 2019

Liquidity tracker is a new indicator developed by Bookmap. It was developed using the Bookmap API, which means it could have been developed by anyone else. The indicator displays either the sum or the average of liquidity on the bid order book, the ask order book, and their difference, which represents the imbalance between buyers and sellers. Traders can configure how deep into the market depth the computation should go, up to the full market depth. It's also possible to apply higher weights to price levels near the market and lower weights to price levels far from it.


## Simplest case of 10 price levels taken uniformly

If we configure the indicator to account for the first 10 price levels uniformly, the displayed values are:

- Bid Liquidity is the sum of the sizes of all Buy orders at the first 10 levels of the Bid book
- Ask Liquidity is the sum of the sizes of all Sell orders at the first 10 levels of the Ask book
- Liquidity Diff is Bid Liquidity minus Ask Liquidity, i.e. the imbalance
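The three values above can be sketched as follows. This is a minimal illustration, not the Bookmap implementation; `bid_book` and `ask_book` are hypothetical lists of per-level sizes, ordered from the level nearest the market outward.

```python
# Sketch of the three displayed values in the uniform 10-level mode.
def liquidity(bid_book, ask_book, levels=10):
    bid_liquidity = sum(bid_book[:levels])  # Buy sizes at the first 10 bid levels
    ask_liquidity = sum(ask_book[:levels])  # Sell sizes at the first 10 ask levels
    diff = bid_liquidity - ask_liquidity    # positive -> more resting bid liquidity
    return bid_liquidity, ask_liquidity, diff

print(liquidity([100] * 15, [80] * 15))  # -> (1000, 800, 200)
```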

## Why use 'Average'

Suppose that we frequently change the number of accounted price levels, for instance from 10 to 50, then to 30, and so on. Using the sum as the output doesn't give us an easy measurement of whether the order book is denser near the market or farther from it. This can be solved by choosing to display the output as the average per accounted price level. Now, if, for instance, the Bid Liquidity of 10 levels is higher than the Bid Liquidity of 30 levels, we know that the bid book is denser near the market. In the uniform computation mode over N price levels, the average is the sum divided by N. In more complicated computation modes the division factor is computed differently, but the principle is the same. Note that displaying the sum can generate illusory patterns.
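The normalization can be sketched like this (a simplified illustration for the uniform mode only; `bid_book` is a hypothetical list of per-level sizes):

```python
# Average output: the sum normalized by the number of accounted levels, so
# values stay comparable when the depth setting changes (10, 30, 50, ...).
def bid_liquidity(bid_book, levels, output="average"):
    total = sum(bid_book[:levels])
    return total / levels if output == "average" else total

# A bid book that is denser near the market: larger sizes at the first levels.
book = [300] * 10 + [100] * 40
assert bid_liquidity(book, 10) > bid_liquidity(book, 30)  # 300.0 > ~166.7
```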

## What are exponential weights and why use them

Suppose that our computation mode is 10 levels uniformly, and suppose that at a certain moment the size at the 11th price level is significantly large. Now, if the price fluctuates just 1 tick up and down, moving that price level between the 11th and 10th positions, its impact on the indicator's values will be significant. Indeed, a price level with a large size is important. But it will look as if the minor change in the current price is what matters. To solve this problem, we can assign higher weights to price levels near the market and lower weights to farther levels. This makes the impact of such a price level much smoother as the price moves towards or away from it. The first level is always given weight 1 (100%), and farther from it, the weights gradually drop by half (the half-life of exponential decay) every K price levels, where K is configurable. The exponential computation mode is always computed on the full market depth (i.e. as many levels as provided by the data vendor). But at a distance of, for instance, 10*K levels the weights are 1/2^10 = 1/1024 ≈ 0.1%, so the impact of those price levels is negligible.
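The decay described above can be sketched as follows. This is an illustrative model, not Bookmap's code; the function name and the `book` list (per-level sizes from the market outward) are assumptions.

```python
# Exponential weights with a configurable half-life K: level 0 gets weight 1,
# and the weight halves every K levels.
def exp_weighted_liquidity(book, K, output="average"):
    weights = [0.5 ** (i / K) for i in range(len(book))]
    total = sum(w * s for w, s in zip(weights, book))
    # For the average, divide by the sum of weights rather than by N.
    return total / sum(weights) if output == "average" else total

# At a distance of 10*K levels the weight is 1/2**10 = 1/1024, i.e. ~0.1%.
print(0.5 ** 10)  # -> 0.0009765625
```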

## What are sigmoid weights

Another way to make the impact of price levels far from the market more gradual is to use the so-called sigmoid computation mode (here sigmoid is a nickname, not the mathematical sigmoid). It allows setting two nodes that define the shape of the weights:

- Uniform weights 100% until node #1
- Linearly decreasing weights from 100% to zero between node #1 and node #2.

This mode also allows setting the weights to exactly zero starting from the distance defined by node #2.
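The two-node shape can be sketched as a simple piecewise function (an illustration under the stated assumptions, with hypothetical names; `node1` and `node2` are distances in price levels):

```python
# 'Sigmoid' weights as described: flat 100% up to node1, a linear ramp down
# to zero between node1 and node2, and exactly zero beyond node2.
def sigmoid_weight(level, node1, node2):
    if level <= node1:
        return 1.0                              # uniform 100% region
    if level >= node2:
        return 0.0                              # weights are precisely zero
    return (node2 - level) / (node2 - node1)    # linear decrease between nodes

print([sigmoid_weight(i, 10, 20) for i in (5, 15, 25)])  # -> [1.0, 0.5, 0.0]
```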

## Advanced tips and tricks

### Computation complexity

The indicator's computation takes little CPU time. The most efficient computation modes are exponential and uniform full depth. In the other computation modes (uniform N levels and 'sigmoid'), the CPU time is proportional to the number of accounted price levels.

### Indicator's lag and time smoothing

By default, there is no lag. At each moment the indicator displays the liquidity and its imbalance based on the state of the order book (market depth) at that same moment. To be precise, there are just two possible causes of lag:

- Computation frequency settings. For instance, if set to 200 milliseconds, the lag is between 0 and 200 milliseconds, plus the latency from the exchange.
- Smoothing the lines obviously adds a lag. In effect, like any other moving average indicator, it displays the average value for the moment T/2 ago, where T is the selected smoothing parameter, i.e. the sliding window.
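The T/2 effect can be demonstrated with a plain sliding-window average (an illustrative sketch, not the indicator's actual smoothing code):

```python
# Sliding-window average over the last T samples; on a steadily changing
# input its output corresponds to the value roughly (T-1)/2 samples ago.
def smoothed(values, T):
    return [sum(values[max(0, i - T + 1): i + 1]) / min(T, i + 1)
            for i in range(len(values))]

# On the ramp 0, 1, 2, ... the smoothed value at index i is i - (T-1)/2.
ramp = list(range(100))
print(smoothed(ramp, T=11)[50])  # -> 45.0, i.e. the input value 5 samples ago
```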

### Filter by order size

The order size filter allows defining which orders participate in the computation. Because this feature requires precise market-by-order (MBO) data, it is enabled only with CME data via Rithmic. The computation is applied to the actual sizes of orders, but the filter is applied to the maximum size of an order during its lifetime. See more details here: Order size of passive orders.

### Avoiding illusionary imbalance

Traders should be careful with the interpretation of the imbalance when the computation mode is full depth and the selected output is the sum; here is why. Imagine that orders are allowed to be placed only in a fixed price range, e.g. between 2600 and 2700. Imagine also that the size at each price level happens to be precisely 100, so by definition there is no bid/ask imbalance. Still, when the price moves up, the indicator will show an order book imbalance in the same direction. This happens simply because there are fewer ask levels left in the range. In fact, the imbalance would show 100% correlation with the price, but it's obviously a useless signal. The situation of a fixed range of permitted prices isn't purely hypothetical. It's common practice for most exchanges (including CME) to set price limits, therefore this illusion will manifest itself when the sum is computed over the full market depth. If the market depth data contains only N price levels, then fixed price limits will also be created by the extended order book feature, which keeps the last known values outside the reported price range.
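The thought experiment above can be put in a few lines (a toy model of the hypothetical 2600-2700 range, with size 100 at every level):

```python
# Illusory imbalance: every level holds the same size, yet the full-depth
# sum of "Bid minus Ask" tracks the price, because a higher price leaves
# fewer ask levels inside the fixed permitted range.
def full_depth_diff(price, low=2600, high=2700, size=100):
    bid_levels = price - low    # bid levels remaining below the price
    ask_levels = high - price   # ask levels remaining above the price
    return size * bid_levels - size * ask_levels

print(full_depth_diff(2650), full_depth_diff(2660))  # -> 0 2000
```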

### Interpretation and trading techniques

The indicator provides true measurements of liquidity, but it isn't a buy / sell signal. Its interpretation is the art of trading. Let's assume a significant imbalance where the bid liquidity is much higher than the ask liquidity. If new aggressive orders are equally distributed between buyers and sellers, the price is likely to move up, i.e. following the path of least resistance. But if there are market participants who wish to sell and look for high bid liquidity, the price may go down.