
Tag Archives: algorithm trading

Intermediate Term Consolidation

The short term consolidation score looks at:
Daily Change
Daily Change Relative To ATR
Daily Change Relative To Weekly Change.

The short term represents any movement of less than one week, but in practice it only involves the daily moves because of the lack of data between one and six days. It also scales the measurements so that the less a stock moves, the more points are added to the score.
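For illustration, here is a minimal sketch of how a score along these lines could be computed. The point weights and scaling caps are illustrative assumptions, not the spreadsheet's actual formula.

```python
# Minimal sketch of a short-term consolidation score along the lines described
# above. The weights and scaling caps are illustrative assumptions, not the
# spreadsheet's actual formula.

def short_term_consolidation_score(daily_change_pct, atr_pct, weekly_change_pct):
    """Return a 0-100 score; smaller recent movement earns more points."""
    score = 0.0

    # 1) Daily change: the smaller the absolute daily move, the more points
    #    (assume a 3% move or larger earns nothing).
    score += max(0.0, 1.0 - abs(daily_change_pct) / 3.0) * 40

    # 2) Daily change relative to ATR: a move well inside the average true
    #    range suggests a quiet day for this particular stock.
    if atr_pct > 0:
        score += max(0.0, 1.0 - abs(daily_change_pct) / atr_pct) * 30

    # 3) Daily change relative to the weekly move: a small daily move versus
    #    the weekly move hints that movement is tapering off.
    if abs(weekly_change_pct) > 0:
        score += max(0.0, 1.0 - abs(daily_change_pct) / abs(weekly_change_pct)) * 30

    return round(score, 1)


# Example: down 0.4% today, with a 2.5% ATR and a 3% weekly move.
print(short_term_consolidation_score(-0.4, 2.5, 3.0))
```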

The intermediate term consolidation gives us more data.
Here are a few possible considerations to score:
1) Weekly change vs monthly performance
2) Weekly volatility vs monthly volatility
3) ATR (14-day average daily move) ratios…*
4) Weekly volatility vs beta
5) Weekly volatility overall
6) Weekly performance vs beta
7) Absolute weekly performance not too extreme in either direction
8) Distance from 20 day moving average?
9) Distance from 20 day moving average relative to some volatility metrics**

By layering these measurements against one another, we can get a lot more clarity on whether the stock is undergoing volatility compression or volatility expansion over the last 7-20 days.
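To make these inputs more concrete, here is a rough sketch of how a few of them (weekly vs monthly volatility, ATR ratios, and distance from the 20-day moving average) could be pulled from recent price data. The lookback windows and the simplified ATR calculation are illustrative assumptions, not the spreadsheet's exact method.

```python
# Sketch of a few of the intermediate-term inputs listed above. The lookback
# windows and the simplified ATR calculation are illustrative assumptions.

import statistics

def pct_returns(closes):
    return [(b - a) / a * 100 for a, b in zip(closes, closes[1:])]

def intermediate_term_inputs(closes, highs, lows, beta):
    """closes/highs/lows: the most recent ~30 trading days, oldest first."""
    weekly_vol  = statistics.pstdev(pct_returns(closes[-6:]))    # ~1 week of daily returns
    monthly_vol = statistics.pstdev(pct_returns(closes[-21:]))   # ~1 month of daily returns

    # Simplified ATR(14) as a percent of price (gaps are ignored here).
    ranges = [(h - l) / c * 100 for h, l, c in zip(highs[-14:], lows[-14:], closes[-14:])]
    atr_pct = sum(ranges) / len(ranges)

    sma20 = sum(closes[-20:]) / 20
    dist_from_sma20 = (closes[-1] - sma20) / sma20 * 100

    return {
        "weekly_vs_monthly_vol": weekly_vol / monthly_vol if monthly_vol else None,
        "weekly_vol_vs_atr":     weekly_vol / atr_pct if atr_pct else None,
        "weekly_vol_vs_beta":    weekly_vol / beta if beta else None,
        "dist_from_sma20_pct":   dist_from_sma20,
        "dist_vs_atr":           abs(dist_from_sma20) / atr_pct if atr_pct else None,
    }
```

Ratios below 1 on the first two lines of the output would hint that volatility has been contracting recently.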
———————-
*ATR divided by price gives us a percentage daily move as an average over the last 14 days. While this only refers to daily movement, it is a function of 14-day volatility on a daily basis. If the weekly volatility is small relative to the ATR (14-day volatility), it tends to represent a stock that is contracting in volatility. If the ATR is smaller than the monthly volatility, it tends to represent a stock that has contracted in volatility more over the last 14 days than over the last 30. You can also apply a beta-adjusted bonus so that high-beta stocks (stocks with typically more volatility over a longer time frame) that also have a low ATR (less volatility over the last 14 days) score well.
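A hedged sketch of how those ATR comparisons and the beta-adjusted bonus might be expressed; the thresholds and point values are arbitrary placeholders.

```python
# Sketch of the footnoted ATR comparisons. ATR is taken as a percent of price;
# the thresholds and point values are placeholders, not the actual settings.

def atr_ratio_points(atr_pct, weekly_vol, monthly_vol, beta):
    points = 0
    # Weekly volatility smaller than the 14-day ATR: contracting this week.
    if weekly_vol < atr_pct:
        points += 1
    # ATR smaller than monthly volatility: the last 14 days are quieter
    # than the last 30.
    if atr_pct < monthly_vol:
        points += 1
    # Beta-adjusted bonus: a normally volatile (high-beta) stock that is
    # currently quiet (low ATR) earns extra.
    if beta >= 1.5 and atr_pct <= 2.0:
        points += 1
    return points
```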

**A stock that is closer to its 20-day moving average will tend either to have moved less over the last 20 days OR to be near the "balance" point of its range, suggesting it is close to the apex if the chart is fully consolidated. However, a stock with a higher beta, ATR, or monthly or weekly volatility may be able to tolerate a little more movement away from the 20-day moving average and still have no confirmed breakout or breakdown. Also, a stock that has moved less in the last 14 days or the last week than it has over its 20-day average may be consolidating. The closer a stock is to its 20-day moving average, typically the less it has moved over those 20 days, or at least the more likely it is that it has stayed in a proportionally rangebound area or regressed to the mean recently.
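One possible way to express that idea as a score: measure distance from the 20-day moving average and widen the allowed band for higher-beta or higher-ATR names. The scaling constants below are assumptions.

```python
# Sketch of the second footnote: distance from the 20-day moving average, with
# the allowed "tolerance" widened for higher-beta / higher-ATR names.
# The scaling constants are assumptions.

def sma20_proximity_points(price, sma20, atr_pct, beta):
    dist_pct = abs(price - sma20) / sma20 * 100
    # Allow a wider band for stocks that normally move more.
    tolerance = atr_pct * max(1.0, beta)
    if tolerance <= 0:
        return 0.0
    # Full points right at the average, tapering to zero at the edge of the band.
    return max(0.0, 1.0 - dist_pct / tolerance) * 10
```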

Also: A stock in an uptrend with its 20-day moving average under the 50-day may represent short-term weakness and consolidation within an uptrend. A stock in a downtrend with the 20-day above the 50-day may represent a stock that is consolidating and possibly forming a bottom, particularly if the stock is also above the 20-day moving average. This sort of deduction from the data probably won't be used, but it may be a decent idea to pair with the consolidation data to find chart patterns. It would best be used as a binary (1 for yes, 0 for no) and as an additional filter (e.g., you can set up a table to show you stocks that score over 80 and also pass this filter of "patterning" from an uptrend). Defining the stock as being in a longer-term uptrend can be based upon the stock being above its 200-day moving average, the 50-day being above the 200-day, distance from the 52-week low, distance from the 52-week high, or other data. This is a little off topic here, so I will scratch it.
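Even if this idea doesn't make it into the rank, the binary filter described could look something like this, using the 50-day-above-200-day definition of an uptrend as one of the options mentioned:

```python
# Sketch of the binary "patterning" filter: 1 if the stock is in a longer-term
# uptrend (50-day SMA above the 200-day, or price above the 200-day) while the
# 20-day sits below the 50-day (a short-term rest within the uptrend), else 0.

def uptrend_consolidation_filter(price, sma20, sma50, sma200):
    in_uptrend = sma50 > sma200 or price > sma200
    resting    = sma20 < sma50
    return 1 if (in_uptrend and resting) else 0
```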

A volatility measurement tends to measure a stock's standard deviation of movement over a particular time frame. If its moves are normally distributed, a stock will stay within 1 standard deviation ~68% of the time; 95% of moves will be contained within 2 standard deviations and 99.7% within 3. Whether stock movements are actually normally distributed has famously been questioned by Nassim Taleb in his books "The Black Swan" and "Fooled by Randomness," but for the purpose of measuring volatility this doesn't matter unless we are going to sell option premium expecting moves within a particular range.
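A quick worked example of that rule, using a hypothetical stock whose weekly moves have a 3% standard deviation (an illustrative figure):

```python
# Worked example of the 68/95/99.7 rule for a hypothetical 3% weekly std dev.

weekly_std = 3.0  # percent

for k, coverage in [(1, "~68%"), (2, "~95%"), (3, "~99.7%")]:
    print(f"{coverage} of weekly moves expected within ±{k * weekly_std:.1f}% (±{k} std dev)")
```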

This is a bit different than looking at a stock's average movement, but it still relates to movement over a time period.

So as long as we evaluate each stock with the same measurement, comparing ATR (average daily movement over 14 days) to weekly or monthly volatility (standard deviation of movement over the time period) should still give us an idea of how volatility has changed over time.

By looking for contracting volatility over time and comparing volatility across various time frames, as well as the magnitude of the change, we can look at 1-day, 1-week, 2-week, and 1-month periods and compare volatility relative to a stock's long-term movement relative to the S&P (beta). That gives us not only a good idea of whether a stock is located within the normal bounds of a range, but also whether those ranges are contracting, to what extent they are contracting, and how the recent "quietness" compares to the historical movement of the stock.

[Images: volatility over time; volatility compression]

The pictures above don't fully represent how our scores also consider relative movement, relative volatility, and average daily moves and standard deviations over time periods. Overall, using all the data presented will give consolidation patterns a much greater chance of rising to the top than the images suggest. I covered this in a different post.

Again, once we gather this data, we can have the spreadsheet sort it by certain categories. We can then make adjustments to the final individual score based upon the average of the group. A stock showing signs of consolidation in a group with an amazing consolidation score may be worth more than an "ideal" setup in an average group or worse. When you see multiple stocks in an industry setting up, it's less likely to be just randomness or a fake-out. If every single stock in a group is setting up, it represents a large amount of capital preparing to rotate into a theme. One or two signals can be wrong, but say 20 of 25 names in the group are setting up? The probability of catching an idea before it makes a big move is greater.
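One possible way to fold the group into the individual score; the bonus sizes and the "setting up" threshold below are assumptions.

```python
# Sketch of the group adjustment described above: an individual score gets a
# boost when its industry group scores well and when a large fraction of the
# group is also setting up. Thresholds and bonus sizes are assumptions.

def group_adjusted_score(stock_score, group_scores, setup_threshold=80):
    group_avg   = sum(group_scores) / len(group_scores)
    setup_ratio = sum(s >= setup_threshold for s in group_scores) / len(group_scores)

    adjusted = stock_score
    adjusted += (group_avg - 50) * 0.2          # reward strong groups, penalize weak ones
    if setup_ratio >= 0.8:                      # e.g. 20 of 25 names setting up
        adjusted += 10
    return adjusted


# Example: a stock scoring 75 in a group where 20 of 25 names score 80+.
group = [85, 82, 90, 78, 88] * 5
print(group_adjusted_score(75, group))
```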

I feel similarly about seasonal data. Seasonal data on an individual stock may be due to a few quarters of earnings landing at a certain time of year that skewed the data. But seasonal data suggesting that the individual stock outperforms the industry, the industry outperforms the sector, and the sector outperforms at a particular time of year is more likely due to a causal relationship, such as capital moving in reaction to (or in anticipation of) holiday shopping leading to an increase in earnings, along with a fundamental reason why a particular stock has been more effective at capitalizing on that seasonality than its peers in the industry. This is a little off topic here as well.

I don't necessarily need to understand the cause as long as there is evidence the move will continue to correlate with the time frame, as opposed to a correlation without a cause that will behave randomly going forward.

If you don't understand a cycle and are just selecting a point, your results will be normally distributed, as if they were random. But if instead there is a cycle with waves that expand and contract in duration, and you are able to identify this cycle and buy closer to the low and sell closer to the high, then you will be able to show superior results. Your results would come at the expense of someone on the other side, so again, analyzing the aggregate results would show a normal distribution. If you are an outlier to the upside, someone or some group will be an outlier to the downside, and overall the data set would sit within a range with little evidence that you were actually able to exploit a tendency. That is the nature of an interconnected system where the wins of one person correlate with the losses of another around a collective average. A little off topic again.

So we basically have a good outline of the different things we can consider when building the intermediate-term volatility rank, which I have begun working on. The short-term consolidation rank is finished, unless I decide that something in the intermediate-term rank belongs in the short-term rank, or I come up with something else to add.


2014 Goals Streamlining The Process Part 2

The analysis and grading system discussed in part one will look something like this, but with more in-depth data, calculations, and filtering systems, along with the ability to categorize based upon the data and pull the information to a cover sheet with a clearer summary of the findings.
[Image: industry grading overview]

A more detailed breakdown of how the sub-categories will work:

[Image: stocks overview]

At this point, it is mostly just a concept in my head that I have recently started to get on paper, along with a brief draft of one aspect of what it will look like and how it is possible. I don't even know how far I will be able to take this spreadsheet, or how much can really be automated versus how much I will have to set up manually.

I have a number of rough, general pictures in my head of all these spreadsheets and how they will work together so that I just press a few buttons (ideally as few as possible, but as many as necessary for quality results) and get a result. Some of those results I will manually take into finviz, look over the charts, and assess risk/reward on however many I want. I can then sort those by the best available opportunities (ideally with streaming updates) by expectation per equal unit of risk, and combine them in the risk simulator to see how the broad strategy will help me meet my goals, so I know how those pieces fit within the broad strategy.

With that in mind, the spreadsheet will pull a combination of the possible trades into different categories and make suggestions, which I will be able to confirm by adding them to my trading journal for tracking, categorizing, and reviewing my results in a way that looks at what I did, what condition the market was in, and other variables I want to be able to track and review over the course of many years. That lets me keep looking at areas I need to improve, trades I need to avoid making, trades I should make more of, and strategies that could use some tweaking. My trading journal will then be able to adjust to reflect the "best fit" match relative to the target "allocations," hopefully account for fees, evaluate whether the benefit is worth the cost of "rebalancing" and/or adding new positions, and provide a suggestion on position sizing, or a look at some simulations of how it would all fit together assuming the opportunities are available and reflect reality.
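As one tiny, hypothetical piece of that pipeline, ranking candidates by expectation per equal unit of risk might look something like this. The candidate names, win rates, and reward-to-risk figures are made up, and the expectancy formula (win rate times reward minus loss rate times risk, per unit risked) is one common convention rather than the spreadsheet's actual one.

```python
# Sketch: rank candidate trades by expected gain per equal unit of risk.
# All candidate values are made-up placeholders.

candidates = [
    # name, estimated win rate, reward-to-risk ratio
    ("AAA", 0.45, 3.0),
    ("BBB", 0.60, 1.5),
    ("CCC", 0.35, 4.0),
]

def expectancy_per_unit_risk(win_rate, reward_to_risk):
    return win_rate * reward_to_risk - (1 - win_rate) * 1.0

ranked = sorted(candidates,
                key=lambda c: expectancy_per_unit_risk(c[1], c[2]),
                reverse=True)

for name, p, rr in ranked:
    print(f"{name}: expectancy {expectancy_per_unit_risk(p, rr):+.2f}R")
```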

But to go from a conceptual rough draft to an actual, concrete set of spreadsheets is a huge leap. One step at a time. The first step will be to really get into the specifics of what I want just one of these spreadsheets to accomplish, and work from there.

Since I have done work on the position sizing/trading system simulator, I have a few adjustments I want to make, likely before year end (a rough sketch of the simulation loop follows the list):

1) Allow the spreadsheet to add deposits or withdrawals on a per-trade basis.

2) Allow the spreadsheet to adjust the "drawdown kill switch" AFTER subtracting the amount added after each trade, so the drawdown calculation excludes deposits.

3) Allow the grand total gain to subtract all capital added and the starting amount to get a net gain.

4) A binary yes/no flag for whether the drawdown kill switch was hit, so you can track the percentage chance of hitting the kill switch over X trades or fewer, to potentially simulate the percentage of traders over a time frame who would meet those results.

5) Consider adding a "target goal" that functions as a "reverse kill switch," where trading is halted after the goal is reached.

6) A binary yes/no flag for "target reached," so you can estimate the percentage chance of reaching the target in X trades or fewer, given the assumptions you plugged in about the expectations of the system(s).

7) Secondary portfolio targets and dynamically adjusted risk. Set it up so that IF a particular portfolio target is reached, the risk percentage per trade is adjusted and/or the amount deposited/withdrawn is adjusted, to simulate reaching a goal at which you attempt to retire from a job while managing the sudden need to withdraw from the account and being more conservative in your strategy. OR, to increase the chances of getting to your target, so that if you get really close you don't take unnecessary risk to get there at the cost of volatility you don't need if you have traded well.

8) Experiment with correlated trades held simultaneously within the same trading system (the results of one influence the probability of another).

9) If that works, experiment with correlated trades held simultaneously with DIFFERENT expectations (such as a stock trading system combined with an option trading system) and different risk amounts.

10) …ideally some sort of adjustment will have to be made to allow different average holding periods, so the simulation more accurately reflects the timing of the trades.

11) If 8 and 9 work, it should be possible to set this up for up to 5 simultaneous trades across up to 5 unique "trading systems" within the portfolio, though it may require a lot of busy work.

12) Come up with ideas to test a lot of different assumptions/strategies.

13) Use the spreadsheet to do a lot of testing of those assumptions.
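To make the list concrete, here is a rough sketch of a simulation loop covering several of these items: per-trade deposits, a drawdown kill switch that excludes deposits, a target "reverse kill switch," binary flags for both, and a net gain that backs out deposits. Every parameter below is an illustrative assumption, not the spreadsheet's actual settings.

```python
# Rough sketch of the position sizing / trading system simulator loop with
# several of the adjustments above. All parameters are illustrative assumptions.

import random

def simulate(start=10_000, trades=100, risk_pct=0.01, win_rate=0.45,
             reward_to_risk=2.0, deposit_per_trade=100,
             max_drawdown=0.25, target=20_000, seed=None):
    rng = random.Random(seed)
    equity = start
    deposited = 0.0
    gain_ex_deposits = 0.0       # running gain from trading only
    peak_ex_deposits = 0.0       # running peak of that gain, for drawdown
    hit_killswitch = 0           # binary flag (item 4)
    hit_target = 0               # binary flag (item 6)

    for _ in range(trades):
        risk = equity * risk_pct
        pnl = risk * reward_to_risk if rng.random() < win_rate else -risk

        equity += pnl + deposit_per_trade          # item 1: per-trade deposits
        deposited += deposit_per_trade
        gain_ex_deposits += pnl
        peak_ex_deposits = max(peak_ex_deposits, gain_ex_deposits)

        # Item 2: drawdown measured on trading results only (deposits excluded).
        drawdown = (peak_ex_deposits - gain_ex_deposits) / start
        if drawdown >= max_drawdown:
            hit_killswitch = 1
            break
        if equity >= target:                        # item 5: reverse kill switch
            hit_target = 1
            break

    net_gain = equity - start - deposited           # item 3: back out deposits
    return {"equity": equity, "net_gain": net_gain,
            "hit_killswitch": hit_killswitch, "hit_target": hit_target}


# Run many simulations to estimate the chance of hitting the kill switch or target.
runs = [simulate(seed=i) for i in range(2_000)]
print("P(kill switch):", sum(r["hit_killswitch"] for r in runs) / len(runs))
print("P(target):     ", sum(r["hit_target"] for r in runs) / len(runs))
```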

Update: You can check out the progress of the OA Bot.
