# Liquidity tracker

Liquidity tracker is a new indicator developed by Bookmap. It was built using the Bookmap API, which means it could have been developed by anyone else. The indicator displays either the sum or the average of liquidity on the bid order book, the ask order book, and their difference, which represents the imbalance between buyers and sellers. Traders can configure how much of the market depth should be accounted for, up to the entire market depth. It is also possible to apply higher weights to price levels near the market and lower weights to price levels far from it.

## Simplest case of 10 price levels computed uniformly

With the simplest configuration (the first 10 price levels, weighted uniformly), the displayed values are:

• Bid Liquidity is the sum of the sizes of all Buy orders at the first 10 levels of the Bid book
• Ask Liquidity is the sum of the sizes of all Sell orders at the first 10 levels of the Ask book
• Liquidity Diff is Bid Liquidity minus Ask Liquidity, i.e. the imbalance
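
The three values above can be sketched in a few lines of code. This is a minimal illustration under assumptions of my own: the function name and the representation of each book as a plain list of sizes per price level are hypothetical, not Bookmap's actual API.

```python
def liquidity_sums(bid_sizes, ask_sizes, levels=10):
    """Sum the sizes at the first `levels` price levels of each book.

    bid_sizes / ask_sizes: hypothetical lists of order sizes, index 0
    being the level closest to the market.
    """
    bid_liquidity = sum(bid_sizes[:levels])
    ask_liquidity = sum(ask_sizes[:levels])
    liquidity_diff = bid_liquidity - ask_liquidity  # the imbalance
    return bid_liquidity, ask_liquidity, liquidity_diff

# Example books: bids are denser than asks near the market.
bids = [50, 40, 30, 20, 10, 10, 10, 10, 10, 10, 5]
asks = [20, 20, 20, 20, 10, 10, 10, 10, 10, 10, 5]
print(liquidity_sums(bids, asks))  # (200, 140, 60)
```

Note that the 11th level (size 5 in both books) is ignored: only the first 10 levels are accounted for.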

## Average versus Sum

Suppose that we frequently change the number of price levels to be accounted for, for instance from 10 to 50, then to 30, and so on. Using the sum as the output doesn't give us an easy way to tell whether the order book is denser near the market or farther from it. This can be solved by choosing the output to be displayed as the average per accounted price level. Now if, for instance, the Bid Liquidity of 10 levels is higher than the Bid Liquidity of 30 levels, we know that the bid book is denser near the market. In the uniform computation mode over N price levels, the average is the sum divided by N. In more complicated computation modes the division factor is computed differently, but the principle is the same. Note that displaying the sum can generate illusory patterns.
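
The uniform average can be sketched as follows (the function name and list-of-sizes representation are assumptions of this sketch, not Bookmap's API):

```python
def liquidity_average(sizes, levels):
    """Average size per accounted price level (uniform mode: sum / N)."""
    return sum(sizes[:levels]) / levels

# A bid book that is dense near the market and thin farther out.
bids = [50, 40, 30, 20, 10] + [5] * 45

avg_10 = liquidity_average(bids, 10)  # 175 / 10 = 17.5
avg_30 = liquidity_average(bids, 30)  # 275 / 30 ≈ 9.17
# avg_10 > avg_30, so the bid book is denser near the market,
# even though the 30-level *sum* (275) is larger than the 10-level sum (175).
```

The last comment shows why the sum alone can mislead: widening the window always grows the sum, while the average reveals where the density actually sits.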

## Exponential weights

Suppose that our computation mode is 10 levels uniformly, and suppose that at a certain moment the size at the 11th price level is significantly large. Now, if price fluctuates just 1 tick up and down, moving that price level between the 11th and 10th positions, its impact on the indicator's values will be significant. Indeed, that price level with large size is important, but it will look as if the minor change in the current price is what matters. To solve this problem, we can assign higher weights to price levels near the market and lower weights to farther levels. This makes the impact of that price level much smoother as price moves towards or away from it. The first level is always given weight 1 (100%), and moving away from it, the weights gradually drop by half (the half-life of the exponential decay) every K-th price level, where K is configurable. The exponential computation mode is always computed on the full market depth (i.e. as many levels as the data vendor provides). But at a distance of, for instance, 10*K, the weight is 1/2^10 = 1/1024 ≈ 0.1%, so the impact of those price levels is negligible.
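
A sketch of the half-life weighting described above (hypothetical function names; the formula `0.5 ** (i / K)` is an assumption consistent with "weight 1 at the first level, halved every K levels"):

```python
def exponential_weights(num_levels, half_life_k):
    """Weight of level i (0-based): 0.5 ** (i / K).

    Level 0 gets weight 1.0; every K levels the weight halves,
    so at distance 10*K the weight is 1/1024 ≈ 0.1%.
    """
    return [0.5 ** (i / half_life_k) for i in range(num_levels)]

def weighted_liquidity(sizes, half_life_k):
    """Weighted sum over the full provided depth."""
    weights = exponential_weights(len(sizes), half_life_k)
    return sum(size * w for size, w in zip(sizes, weights))
```

In this mode a large order drifting from the 11th level to the 10th changes its weight only slightly, instead of flipping between "fully counted" and "ignored". For the average output, the document notes the division factor differs from plain N; a plausible choice (an assumption here, not confirmed by the source) is dividing by the sum of the weights.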

## Sigmoid weights

Another way to make the impact of price levels far from the market more gradual is the so-called sigmoid computation mode (here "sigmoid" is a nickname, not the mathematical sigmoid function). It allows setting 2 nodes that define the shape of the weights:

• Uniform weights 100% until node 1
• Linearly decreasing weights from 100% to zero between node 1 and node 2.

This mode also allows setting the weights to exactly zero starting from the distance defined by node 2.
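
The two-node shape can be sketched as a piecewise-linear weight function (the function name and 0-based level indexing are assumptions of this sketch):

```python
def sigmoid_weights(num_levels, node1, node2):
    """'Sigmoid' (piecewise-linear) weights per price level (0-based):

    - weight 1.0 for levels up to node1,
    - linearly decreasing from 1.0 to 0.0 between node1 and node2,
    - exactly 0.0 from node2 onward.
    """
    weights = []
    for i in range(num_levels):
        if i <= node1:
            weights.append(1.0)
        elif i < node2:
            weights.append((node2 - i) / (node2 - node1))
        else:
            weights.append(0.0)
    return weights
```

For example, with node1=5 and node2=10, levels 0 through 5 are fully counted, level 9 gets weight 0.2, and levels 10 and beyond contribute nothing, unlike the exponential mode, whose weights never reach exactly zero.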