Range Strat - MACD/RSI
This strategy uses a trend-based indicator (MACD) for entry/exit signals, with a momentum oscillator (RSI) acting as confirmation. Although it relies on a trend-based indicator, it was created for range-bound crypto markets, which have been in a period of chop since June 2022.
Long/Short signals are generated from the MACD, with the RSI oscillator thresholds suppressing entries at price extremes. This is not a mean-reversion RSI strategy! Because the indicators are contrary to each other, you will need to be generous with the RSI settings in order for signals to trigger.
The strategy is designed for use on the 4h timeframe. It may work well on higher timeframes, but lower timeframes will lead to false signals. Use a fixed percentage of equity for order size to capture the compounding effect. As this is a reversal strategy, bear in mind that stops will be required should the market trend strongly in either direction.
The RSI thresholds can be tailored to provide higher-frequency or safer signals. Similarly, tweaking the MACD settings will provide earlier/more frequent or safer signals. As this is intended to enter near the range high/low, you should check the visual cues to ensure a ping-pong effect is observed, so that peaks and troughs are captured. Once an observable range is established, the strategy works well across a range of crypto markets.
The script is open source, so feel free to amend it as you wish. Using a different momentum oscillator may provide better results. I have prior coding experience, but my first time using Pine Script was last night, so it's not very tidy. I will update this with some additional customisation and TP/SL in the near future.
Usage: Range bound markets
Markets: Cryptocurrency Alts/BTC/ETH
Timeframe: 4h
TASC 2022.09 LRAdj EMA
█ OVERVIEW
TASC's September 2022 edition of Traders' Tips includes an article by Vitali Apirine titled "The Linear Regression-Adjusted Exponential Moving Average". This script implements the titular indicator presented in this article.
█ CONCEPT
The Linear Regression-Adjusted Exponential Moving Average (LRAdj EMA) is a new tool that combines a linear regression indicator with exponential moving averages. First, the indicator accounts for the linear regression deviation, that is, the distance between the price and the linear regression indicator. Subsequently, an exponential moving average (EMA) smooths the price data and provides an indication of the current direction.
As part of a trading system, LRAdj EMA can be used in conjunction with an exponential moving average of the same length to identify the overall trend. Alternatively, using LRAdj EMAs of different lengths together can help identify turning points.
█ CALCULATION
The script uses the following input parameters:
EMA Length
LR Lookback Period
Multiplier
The calculation of LRAdj EMA is carried out as follows:
Current LRAdj EMA = Prior LRAdj EMA + MLTP × (1 + LRAdj × Multiplier) × (Price − Prior LRAdj EMA),
where MLTP is a weighting multiplier defined as MLTP = 2 ⁄ (EMA Length + 1), and LRAdj is the linear regression adjustment (LRAdj) multiplier:
LRAdj = (Abs(Current LR Dist) − Abs(Minimum LR Dist)) ⁄ (Abs(Maximum LR Dist) − Abs(Minimum LR Dist))
When calculating the LRAdj multiplier, the absolute values of the following quantities are used:
Current LR Dist is the distance between the current close and the linear regression indicator with a length determined by the LR Lookback Period parameter,
Minimum LR Dist is the minimum distance between the close and the linear regression indicator for the LR lookback period ,
Maximum LR Dist is the maximum distance between the close and the linear regression indicator for the LR lookback period .
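For reference, here is a minimal Pine Script™ sketch of the recursion described above. The input defaults, the zero-denominator guard, and the seeding of the first value are illustrative assumptions and may differ from the published script.
//@version=5
indicator("LRAdj EMA sketch", overlay = true)
emaLength  = input.int(20, "EMA Length")
lrLookback = input.int(20, "LR Lookback Period")
multiplier = input.float(3.0, "Multiplier")
// Distance between price and the linear regression indicator.
lrDist  = close - ta.linreg(close, lrLookback, 0)
minDist = ta.lowest(lrDist, lrLookback)
maxDist = ta.highest(lrDist, lrLookback)
// Linear regression adjustment multiplier, guarded against a zero denominator.
den   = math.abs(maxDist) - math.abs(minDist)
lrAdj = den != 0 ? (math.abs(lrDist) - math.abs(minDist)) / den : 0
// Adjusted EMA recursion; the first value is seeded with the close (an assumption).
mltp = 2.0 / (emaLength + 1)
var float lrAdjEma = na
lrAdjEma := na(lrAdjEma[1]) ? close : lrAdjEma[1] + mltp * (1 + lrAdj * multiplier) * (close - lrAdjEma[1])
plot(lrAdjEma, "LRAdj EMA", color = color.orange)
plot(ta.ema(close, emaLength), "EMA", color = color.gray)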
Auto Trendline Indicator (based on fractals)
A tool that automatically draws out trend lines by connecting the most recent fractals.
Description:
The process of manually drawing trend lines is highly subjective. Many times, we don’t trade what we see, but what we “want to see”. As a result, we draw lines pointing in the direction that we wishfully want price to move towards. While there are no right/wrong ways to draw trend lines, there are, however, systematic/unsystematic ways to draw trend lines. This tool will systematically draw out trend lines based on fractals.
Additional feature:
This tool will also plot out symbols (default symbol “X”) to signify points of crossings. This can be useful for traders considering to use trend lines as part of their trading strategies.
Here is an interesting observation on the price actions of NASDAQ futures on a 5 second chart during regular trading hours on July 14, 2022.
It’s a phenomenon. People like to see straight lines connecting HL/LH, etc., so it's possible for the market as a whole to psychologically react to these lines. However, it is important to note that it is impossible to predict the direction of price. In the case above, price could have tanked below the auto-drawn trend line. Fractal-based trend lines should only be taken as references and regarded as price levels. No studies have ever proven that the slope of trend lines can indicate price's future direction.
More about fractals:
To understand more about fractals:
www.investopedia.com
www.tradingview.com
Contrary to what it sounds like, "fractal" in technical analysis does not refer to the recursive self-repeating patterns that appear in nature, such as the mesmerizing patterns found in snowflakes. The Fractal Markets Hypothesis claims that market prices exhibit fractal properties over time. Assuming this assertion to be true, fractals can be used as a tool to represent the chaotic movements of price in a simplified manner.
The purpose of this exercise is to take a tool that is readily available (ie. in this case, TradingView’s built-in fractals tool), and to create a newer tool based on it.
Parameters:
Fractal period (denoted as ‘n’ in code): The number of bars bounding a high/low point that must be lower/higher than it, respectively, for a fractal to be considered valid. Period ‘n’ can be adjusted in this tool. Traditionally, chartists pick a value of 5. The longer it is, the less noise is seen on the chart, and the pivot point may also be exhibited in higher timeframes. The drawback is that it increases the lag, as it takes more bars to confirm the printed fractal (see the sketch after this list).
Others: Intuitive parameters such as whether to draw historical trend lines, what color to use, which way to extend the lines, and whether or not to show points of crossings.
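Below is a rough Pine Script™ sketch of the underlying idea: detecting Williams-style fractals with ta.pivothigh()/ta.pivotlow() and connecting the two most recent up fractals with an auto-drawn line. It is a simplified stand-in for illustration, not the published indicator's logic, and the defaults are assumptions.
//@version=5
indicator("Fractal trendline sketch", overlay = true)
n = input.int(5, "Fractal period")
// A fractal high/low needs `n` lower highs / higher lows on each side;
// ta.pivothigh()/ta.pivotlow() confirm the point `n` bars after it forms.
upFractal   = ta.pivothigh(high, n, n)
downFractal = ta.pivotlow(low, n, n)
// Remember the two most recent up-fractal prices and their bar indexes.
var float h1 = na
var float h2 = na
var int   b1 = na
var int   b2 = na
if not na(upFractal)
    h2 := h1
    b2 := b1
    h1 := upFractal
    b1 := bar_index - n
// Connect the two latest up fractals and extend the resulting trendline to the right.
var line res = na
if not na(h1) and not na(h2)
    if not na(res)
        line.delete(res)
    res := line.new(b2, h2, b1, h1, extend = extend.right, color = color.red)
// Mark confirmed fractals on the chart.
plotshape(not na(upFractal), "Up fractal", shape.triangledown, location.abovebar, offset = -n)
plotshape(not na(downFractal), "Down fractal", shape.triangleup, location.belowbar, offset = -n)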
TASC 2022.08 Trading The Fear Index
█ OVERVIEW
TASC's August 2022 edition of Traders' Tips includes an article by Markos Katsanos titled "Trading The Fear Index". This script implements a trading strategy called the “daily long/short trading system for volatility ETFs” presented in this article.
█ CONCEPTS
This long-term strategy aims to capitalize on stock market volatility by using exchange-traded funds (ETFs or ETNs) linked to the VIX index.
The strategy rules (see below) are based on a combination of the movement of the Cboe VIX index, the readings of the stochastic oscillator applied to the SPY ETF relative to the VIX, and a custom indicator presented in the article and called the correlation trend . Thus, they are not based on the price movement of the traded ETF itself, but rather on the movement of the VIX and of the S&P 500 index. This allows the strategy to capture most of the spikes in volatility while profiting from the long-term time decay of the traded ETFs.
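The article's correlation trend indicator is not reproduced here, but one common formulation of a correlation trend (the Pearson correlation of the VIX against the bar index, as in John Ehlers' Correlation Trend Indicator) can be sketched in Pine Script™ as follows. The symbol, the length, and whether this matches Katsanos' exact definition are assumptions.
//@version=5
indicator("Correlation trend sketch")
length = input.int(10, "Correlation trend length")
// Daily VIX close (the symbol string is an assumption).
vixClose = request.security("CBOE:VIX", "D", close)
// Correlation of the VIX with time (the bar index) over `length` bars.
// Readings near +1 indicate a steadily rising VIX, near -1 a steadily falling VIX.
corrTrend = ta.correlation(vixClose, bar_index, length)
plot(corrTrend, "VIX correlation trend")
hline(0.8, "Long-rule threshold")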
█ STRATEGY RULES
Long rules
Rising volatility: The VIX should rise by more than 50% in the last 6 days.
Trend: The correlation trend of the VIX should be 0.8 or higher and also higher than yesterday's value.
VIX-SPY relative position: The 25-day and 10-day VIX stochastics should be above the 25-day and 10-day SPY stochastics respectively. In addition, the 10-day stochastic of the VIX should be above yesterday's value.
Long positions are closed if the 10-day stochastic of the SPY rises above the 10-day stochastic of the VIX or falls below yesterday's value.
Short rules
Declining volatility: The VIX should drop over 20% in the last 6 days and should be down during the last 3 days.
VIX threshold: The VIX should spend less than 35% of the time below 15.
VIX-SPY relative position: The 10-day VIX stochastic should be below the 10-day SPY stochastic. In addition, the 10-day SPY stochastic should be higher than yesterday's value.
Short positions are closed if the first two Long rules are triggered (Rising volatility and Trend).
The script allows you to display the readings of the indicators used in the strategy rules in the form of oscillator time series (as in the preview chart) and/or in the form of a table.
[blackcat] L2 Ehlers Pairs Rotation
Level 2
Background
John Ehlers’ article in the July 2022 issue, “Pairs Rotation With Ehlers Loops, Part 2”.
Function
In part 1 of his article in the June 2022 issue (“Ehlers Loops”), John Ehlers uses a relationship of price and volume to determine if any predictive value can be obtained. His technique, called Ehlers Loops, aids in visualizing the performance of one data stream against another. In part 2, appearing in this issue (“Pairs Rotation With Ehlers Loops”), the author demonstrates Ehlers Loops in action with a pairs trading example. This pairs rotation strategy is aimed at minimizing drawdown while simultaneously maximizing return on capital.
Remarks
Feedback is appreciated.
TASC 2022.07 Pairs Rotation With Ehlers Loops
█ OVERVIEW
TASC's July 2022 edition of Traders' Tips includes an article by John Ehlers titled "Pairs Rotation With Ehlers Loops". This is the code that implements the Ehlers Loops applied to pairs rotation trading.
█ CONCEPTS
John Ehlers developed Ehlers loops as a tool to visualize the performance of one data stream versus another. Initially, he used this tool to chart price versus volume. However, Ehlers loops proved to be suitable for determining the timing of the pairs rotation strategy . This strategy works by having a long position in only one of two securities, depending on which one is considered stronger at a given time.
When the prices of two securities (filtered and scaled with a standard deviation for consistent presentation) are plotted against each other, the curvature and direction of rotation on the chart can help guide decisions on long positions. For example, when plotting a stock versus a referenced symbol, a vertical upward movement while rotating clockwise is a sign of going long the stock. Similarly, a horizontal movement to the right while rotating counterclockwise is the sign to go long the reference. A higher probability of a reversal is expected when the price moves more than one or two standard deviations.
█ CALCULATIONS
The script uses the following steps to calculate the Ehlers Loops:
The price data of both securities in the pair are individually filtered using identical high-pass and SuperSmoother filters. This results in two band-limited data streams, having a nominally zero mean. The input parameters Low-Pass Period and High-Pass Period control the filter bandwidth and thus can modify the shape of the Ehlers Loops.
Subsequently, the filtered data streams are scaled in terms of standard deviation by dividing each of them by their root-mean-square (RMS) values. These data streams are plotted as zero-mean oscillators.
Finally, the scaled data streams are displayed one against another for the selected time interval (defined by the input parameter Loop Segments ). In the resulting scatterplot, the thicker line corresponds to the later data points. The fluctuations of the filtered price data of the chart symbol are plotted along the y -axis, and the price changes of the referenced symbol are shown along the x -axis.
Adaptive, Relative Strength EMA (RSEMA) [Loxx]
TASC's May 2022 edition of Traders' Tips includes the "Relative Strength Moving Averages" article authored by Vitali Apirine. This is the code implementing the Relative Strength Exponential Moving Average (RS EMA) indicator introduced in this publication.
This indicator builds on Vitali Apirine's work by including three different types of momentum used to calculate RSEMA, as well as fixed and adaptive cycle calculations to be used as dynamic inputs for the momentum calculation. The purpose of these additional calculation methods is to attempt to filter out noise and track trends by using different methods and inputs to calculate momentum.
Momentum methods
-Wilder relative strength
-Chande momentum
-Momentum component of Jurik's RSX RSI
Cycle calculation methods
-Fixed
-Vertical horizontal filter
-Ehlers' Autocorrelation Dominant Cycle
What is Wilder relative strength?
The Relative Strength Index (RSI), developed by J. Welles Wilder, is a momentum oscillator that measures the speed and change of price movements. The RSI oscillates between zero and 100. Traditionally the RSI is considered overbought when above 70 and oversold when below 30.
What is Chande momentum?
Chande Momentum was designed specifically to track the movement and momentum of a security. It calculates the difference between the sum of both recent gains and recent losses, then dividing the result by the sum of all price movement over the same period.
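As a quick illustration, the sketch below shows this definition with Pine Script™'s ta.cmo() built-in alongside an equivalent manual calculation (the length and the zero-denominator guard are illustrative).
//@version=5
indicator("Chande momentum sketch")
length = input.int(9, "CMO length")
// Built-in Chande Momentum Oscillator.
cmoBuiltIn = ta.cmo(close, length)
// Manual equivalent: (sum of gains - sum of losses) / (sum of gains + sum of losses) * 100.
chg    = ta.change(close)
gains  = math.sum(chg > 0 ?  chg : 0, length)
losses = math.sum(chg < 0 ? -chg : 0, length)
cmoManual = (gains + losses) != 0 ? 100 * (gains - losses) / (gains + losses) : 0
plot(cmoBuiltIn, "ta.cmo")
plot(cmoManual, "Manual CMO", color = color.orange)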
What is the momentum component of Jurik's RSX RSI?
RSI is a very popular technical indicator because it takes into consideration market speed, direction and trend uniformity. However, its widely criticized drawback is its noisy (jittery) appearance. The Jurik RSX retains all the useful features of RSI, but with one important exception: the noise is gone with no added lag. For our purposes here, we derive the momentum minus the lag.
What is the Vertical Horizontal Filter?
Vertical Horizontal Filter (VHF) was created by Adam White to identify trending and ranging markets. VHF measures the level of trend activity, similar to ADX in the Directional Movement System. Trend indicators can then be employed in trending markets and momentum indicators in ranging markets.
What is autocorrelation?
Ehlers Autocorrelation is used in the calculation of dominant cycle length to be injected into standard technical analysis tools to improve TA accuracy. Its main purpose is to eliminate noise from the price data, reduce effects of the “spectral dilation” phenomenon, and reveal dominant cycle periods.
As the first step, Autocorrelation uses Mr. Ehlers’s previous installment, Ehlers Roofing Filter, in order to enhance the signal-to-noise ratio and neutralize the spectral dilation. This filter is based on aerospace analog filters and when applied to market data, it attempts to only pass spectral components whose periods are between 10 and 48 bars.
Autocorrelation is then applied to the filtered data: as its name implies, this function correlates the data with itself a certain period back. As with other correlation techniques, the value of +1 would signify the perfect correlation and -1, the perfect anti-correlation.
Happy trading!
TASC 2022.06 Ehlers Loops
█ OVERVIEW
TASC's June 2022 edition of Traders' Tips includes an article by John Ehlers titled "Ehlers Loops. Part 1". This is the code implementing the price-volume Ehlers Loops he introduced in the publication.
█ CONCEPTS
John Ehlers developed Ehlers loops as a tool to visualize the performance of one data stream versus another, both filtered and scaled. In this article, the author applies his concept to exploit and/or dispel the dogmatic principles of reliable price-volume relationships.
The script offers two different ways to visualize Ehlers Loops:
Oscillators (default option)
In this implementation, filtered and scaled volume is plotted along with filtered and scaled price as zero-mean oscillators. Observation of the relative direction of volume and price oscillators can be discretionarily used to interpret and predict market conditions. For example, it is generally assumed that an increase in volume and an increase in price define a bullish condition. Similarly, decreasing volume and increasing price are generally considered bearish. A decrease in volume and a decrease in price is considered a bullish condition. The increase in volume and decrease in price is often thought to be bearish.
Scatterplot
This Crocker-style visualization displays filtered and scaled price against filtered and scaled volume for the selected timespan. Fluctuations in volume are plotted along the x -axis, while price changes along the y -axis. This way of visualizing the Ehlers Loop allows you to analyze the curvature and directional path of the price in relation to volume, offering a different comparative perspective. The boundaries of the price and volume scale on the Ehlers Loop Crocker-chart are presented in standard deviations. Deviations can be used to predict possible future price or volume fluctuations. The expected probability of potential reversals is 68%, 95% and 99.7% at one, two and three standard deviations, respectively.
█ CALCULATIONS
The following steps are used to build an Ehlers Loop:
• Both price and volume are filtered to be band-limited signals. This is done by applying the high-pass Butterworth filter in combination with the low-pass SuperSmooth filter.
The cutoff wavelengths of the high-pass and low-pass filters are defined by the input parameters HPPeriod and LPPeriod , respectively.
These values change the appearance of the Ehlers Loops and can be customized to your trading style.
• The filtered price and volume time series are then scaled in terms of standard deviation by dividing each by their root-mean-square values.
• The resultant price and volume data are plotted as zero-mean oscillators or as a scatterplot.
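The sketch below outlines this filter-and-scale pipeline in Pine Script™, using Ehlers' usual two-pole high-pass and SuperSmoother coefficient formulas. The default periods, the RMS window, and the zero seeding are assumptions; the published script remains the reference implementation.
//@version=5
indicator("Ehlers Loops filtering sketch")
hpPeriod  = input.int(125, "HPPeriod")
lpPeriod  = input.int(20,  "LPPeriod")
rmsLength = input.int(80,  "RMS length")
// Two-pole high-pass filter (Ehlers): removes the trend component, leaving a zero-mean series.
highPass(float src, int period) =>
    angle = 0.707 * 2 * math.pi / period
    alpha = (math.cos(angle) + math.sin(angle) - 1) / math.cos(angle)
    var float hp = 0.0
    hp := (1 - alpha / 2) * (1 - alpha / 2) * (src - 2 * nz(src[1]) + nz(src[2])) + 2 * (1 - alpha) * nz(hp[1]) - (1 - alpha) * (1 - alpha) * nz(hp[2])
    hp
// SuperSmoother low-pass filter (Ehlers): removes high-frequency noise.
superSmoother(float src, int period) =>
    a1 = math.exp(-1.414 * math.pi / period)
    b1 = 2 * a1 * math.cos(1.414 * math.pi / period)
    c2 = b1
    c3 = -a1 * a1
    c1 = 1 - c2 - c3
    var float ss = 0.0
    ss := c1 * (src + nz(src[1])) / 2 + c2 * nz(ss[1]) + c3 * nz(ss[2])
    ss
// Scale a band-limited series in standard deviations via its root mean square.
scaleToRms(float src, int length) =>
    src / math.sqrt(ta.sma(src * src, length))
priceScaled  = scaleToRms(superSmoother(highPass(close, hpPeriod), lpPeriod), rmsLength)
volumeScaled = scaleToRms(superSmoother(highPass(volume, hpPeriod), lpPeriod), rmsLength)
plot(priceScaled,  "Filtered/scaled price",  color = color.blue)
plot(volumeScaled, "Filtered/scaled volume", color = color.red)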
Time
█ OVERVIEW
This library is a Pine Script™ programmer’s tool containing a variety of time related functions to calculate or measure time, or format time into string variables.
█ CONCEPTS
`formattedTime()`, `formattedDate()` and `formattedDay()`
Pine Script™, like many other programming languages, uses timestamps in UNIX format, expressed as the number of milliseconds elapsed since 00:00:00 UTC, 1 January 1970. These three functions convert a UNIX timestamp to a formatted string for human consumption.
These are examples of ways you can call the functions, and the ensuing results:
CODE RESULT
formattedTime(timenow) >>> "00:40:35"
formattedTime(timenow, "short") >>> "12:40 AM"
formattedTime(timenow, "full") >>> "12:40:35 AM UTC"
formattedTime(1000 * 60 * 60 * 3.5, "HH:mm") >>> "03:30"
formattedDate(timenow, "short") >>> "4/30/22"
formattedDate(timenow, "medium") >>> "Apr 30, 2022"
formattedDate(timenow, "full") >>> "Saturday, April 30, 2022"
formattedDay(timenow, "E") >>> "Sat"
formattedDay(timenow, "dd.MM.yy") >>> "30.04.22"
formattedDay(timenow, "yyyy.MM.dd G 'at' hh:mm:ss z") >>> "2022.04.30 AD at 12:40:35 UTC"
These functions use str.format() and some of the special formatting codes it allows for. Pine Script™ documentation does not yet contain complete specifications on these codes, but in the meantime you can find some information in The Java™ Tutorials and in the Java documentation of its MessageFormat class. Note that str.format() implements only a subset of the MessageFormat features in Java.
`secondsSince()`
The introduction of varip variables in Pine Script™ has made it possible to track the time for which a condition is true when a script is executing on a realtime bar. One obvious use case that comes to mind is to enable trades to exit only when the exit condition has been true for a period of time, whether that period is shorter than the chart's timeframe, or spans across multiple realtime bars.
For more information on this function and varip please see our Using `varip` variables publication.
`timeFrom( )`
When plotting lines , boxes , and labels one often needs to calculate an offset for past or future end points relative to the time a condition or point occurs in history. Using xloc.bar_index is often the easiest solution, but some situations require the use of xloc.bar_time . We introduce `timeFrom()` to assist in calculating time-based offsets. The function calculates a timestamp using a negative (into the past) or positive (into the future) offset from the current bar's starting or closing time, or from the current time of day. The offset can be expressed in units of chart timeframe, or in seconds, minutes, hours, days, months or years. This function was ported from our Time Offset Calculation Framework .
`formattedNoOfPeriods()` and `secondsToTfString()`
Our final two offerings aim to confront two remaining issues:
How much time is represented in a given timestamp?
How can I produce a "simple string" timeframe usable with request.security() from a timeframe expressed in seconds?
`formattedNoOfPeriods()` converts a time value in ms to a quantity of time units. This is useful for calculating a difference in time between 2 points and converting to a desired number of units of time. If no unit is supplied, the function automatically chooses a unit based on a predetermined time step.
`secondsToTfString()` converts an input time in seconds to a target timeframe string in timeframe.period string format. This is useful for implementing stepped timeframes relative to the chart time, or calculating multiples of a given chart timeframe. Results from this function are in simple form, which means they are useable as `timeframe` arguments in functions like request.security() .
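As a usage illustration, the snippet below builds a stepped timeframe with `secondsToTfString()` and passes it to request.security() . The import path and version shown are placeholders, not the library's actual publication path.
//@version=5
indicator("Stepped-timeframe sketch")
// The import path and version below are placeholders; substitute the library's actual publisher, name and version.
import PineCoders/Time/1 as tt
// Build a "simple string" timeframe 4x the chart's timeframe and use it with request.security().
higherTf   = tt.secondsToTfString(timeframe.in_seconds(), 4)
higherHigh = request.security(syminfo.tickerid, higherTf, high)
plot(higherHigh, "4x-timeframe high")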
█ NOTES
Although the example code is commented in detail, the size of the library justifies some further explanation as many concepts are demonstrated. Key points are as follows:
• Pivot points are used to draw lines from. `timeFrom( )` calculates the length of the lines in the specified unit of time.
By default the script uses 20 units of the chart's timeframe. Example: a 1hr chart has arrows 20 hours in length.
• At the point of the arrows `formattedNoOfPeriods()` calculates the line length in the specified unit of time from the input menu.
If “Use Input Time” is disabled, a unit of time is automatically assigned.
• At each pivot point a label with a formatted date or time is placed with one of the three formatting helper functions to display the time or date the pivot occurred.
• A label on the last bar showcases `secondsSince()` . The label goes through three stages of detection for a timed alert.
If the difference between the high and the open in ticks exceeds the input value, a timer starts and will turn the label red once the input time is exceeded to simulate a time-delayed alert.
• In the bottom right of the screen `secondsToTfString()` posts the chart timeframe in a table. This can be multiplied from the input menu.
Look first. Then leap.
█ FUNCTIONS
formattedTime(timeInMs, format)
Converts a UNIX timestamp (in milliseconds) to a formatted time string.
Parameters:
timeInMs : (series float) Timestamp to be formatted.
format : (series string) Format for the time. Optional. The default value is "HH:mm:ss".
Returns: (string) A string containing the formatted time.
formattedDate(timeInMs, format)
Converts a UNIX timestamp (in milliseconds) to a formatted date string.
Parameters:
timeInMs : (series float) Timestamp to be formatted.
format : (series string) Format for the date. Optional. The default value is "yyyy-MM-dd".
Returns: (string) A string containing the formatted date.
formattedDay(timeInMs, format)
Converts a UNIX timestamp (in milliseconds) to the name of the day of the week.
Parameters:
timeInMs : (series float) Timestamp to be formatted.
format : (series string) Format for the day of the week. Optional. The default value is "EEEE" (complete day name).
Returns: (string) A string containing the day of the week.
secondsSince(cond, resetCond)
The duration, in seconds, for which a condition has been true.
Parameters:
cond : (series bool) Condition to time.
resetCond : (series bool) When `true`, the duration resets.
Returns: The duration in seconds for which `cond` is continuously true.
timeFrom(from, qty, units)
Calculates a +/- time offset in variable units from the current bar's time or from the current time.
Parameters:
from : (series string) Starting time from where the offset is calculated: "bar" to start from the bar's starting time, "close" to start from the bar's closing time, "now" to start from the current time.
qty : (series int) The +/- qty of units of offset required. A "series float" can be used but it will be cast to a "series int".
units : (series string) String containing one of the seven allowed time units: "chart" (chart's timeframe), "seconds", "minutes", "hours", "days", "months", "years".
Returns: (int) The resultant time offset `from` the `qty` of time in the specified `units`.
formattedNoOfPeriods(ms, unit)
Converts a time value in ms to a quantity of time units.
Parameters:
ms : (series int) Value of time to be formatted.
unit : (series string) The target unit of time measurement. Options are "seconds", "minutes", "hours", "days", "weeks", "months". If not used one will be automatically assigned.
Returns: (string) A formatted string from the number of `ms` in the specified `unit` of time measurement
secondsToTfString(tfInSeconds, mult)
Converts an input time in seconds to a target timeframe string in `timeframe.period` string format.
Parameters:
tfInSeconds : (simple int) a timeframe in seconds to convert to a string.
mult : (simple float) Multiple of `tfInSeconds` to be calculated. Optional. 1 (no multiplier) is default.
Returns: (string) The `tfInSeconds` in `timeframe.period` format usable with `request.security()`.
TASC 2022.05 Relative Strength Exponential Moving Average
█ OVERVIEW
TASC's May 2022 edition of Traders' Tips includes the "Relative Strength Moving Averages" article authored by Vitali Apirine. This is the code implementing the Relative Strength Exponential Moving Average (RS EMA) indicator introduced in this publication.
█ CONCEPTS
RS EMA is an adaptive trend-following indicator with reduced lag characteristics. By design, this was made possible by harnessing the relative strength of price. It operates in a similar fashion to a traditional EMA, but it has an improved response to price fluctuations. In a trading strategy, RS EMA can be used in conjunction with an EMA of the same length to identify the overall trend (see the preview chart). Alternatively, RS EMAs with different lengths can define turning points and filter price movements.
█ CALCULATIONS
The following steps are used in the calculation process:
• Calculate the relative strength (RS) of a given length.
• Multiply RS by a chosen coefficient (multiplier) to adapt the EMA filtering the original time series. Calculate the EMA of the resulting time series.
The author recommends RS EMA(10,10,10) as typical settings, where the first parameter is the EMA length, the second parameter is the RS length, and the third parameter is the RS multiplier. Other values may be substituted depending on your trading style and goals.
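A minimal Pine Script™ outline of this scheme is shown below. The relative-strength term here is a stand-in (the absolute value of the Chande Momentum Oscillator scaled to 0..1); Apirine's exact RS formula may differ, so treat this strictly as a sketch.
//@version=5
indicator("RS EMA sketch", overlay = true)
emaLength = input.int(10, "EMA length")
rsLength  = input.int(10, "RS length")
rsMult    = input.float(10.0, "RS multiplier")
// Stand-in relative-strength term in the 0..1 range (|CMO|/100); Apirine's exact RS formula may differ.
rs = math.abs(ta.cmo(close, rsLength)) / 100
// Adjusted EMA recursion; the first value is seeded with the close (an assumption).
mltp = 2.0 / (emaLength + 1)
var float rsEma = na
rsEma := na(rsEma[1]) ? close : rsEma[1] + mltp * (1 + rs * rsMult) * (close - rsEma[1])
plot(rsEma, "RS EMA", color = color.orange)
plot(ta.ema(close, emaLength), "EMA", color = color.gray)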
Esqvair's Neural Reversal Probability Indicator
Introduction
Esqvair's Neural Reversal Probability Indicator is an indicator that shows the probability of a reversal.
Warning: This script should only be used on a 1-minute chart.
How to use
When a signal appears (by default it is a green bar), a reversal should be expected.
The signal appears when the indicator value >= Threshold.
If you want more signals, you must lower the threshold, if less, you must increase the threshold.
For some assets, like Forex pairs, you have to optimize the threshold yourself, but for most stocks, the default threshold works well.
How well a threshold fits an asset depends on the volatility of the asset.
For most assets, the indicator ranges from 35 to 75.
Settings
Smoothing - The default is 1, which means no smoothing. The indicator is smoothed with an SMA.
Threshold - The default of 71.0 controls the occurrence of signals; read the "How to use" section to learn more.
The Indicator
This indicator is a pre-trained neural network that was trained outside of TradingView, and then its structure and weight values were converted to PineScript.
Warning: A neural network is a black box in the sense that although it can approximate any function, studying its structure will not give you any idea about the structure of the function being approximated.
Possible questions
Why does the indicator value range from 35 to 75 most of the time, when the probability should range from 0 to 100?
-Due to some randomness in the markets, a neural network can never be 100% sure.
What data was used to train the neural network?
-This was BTCUSD 1 minute chart data from 02/05/2020 to 02/05/2022.
Where did you train the neural network and convert it to PineScript?
-I used a programming language that I know.
TASC 2022.04 S&P500 Hybrid Seasonal System
█ OVERVIEW
TASC's April 2022 edition of Traders' Tips includes the "Sell In May? Stock Market Seasonality" article authored by Markos Katsanos. This is the code implementing the "Hybrid Seasonal System" from the article.
█ CONCEPTS
In his article, Markos Katsanos takes an updated look at the "Sell in May" adage by reviewing recent historical data for seasonal equity market tendencies. The author explores the development of a trading strategy (a set of buy and sell rules) based on this research.
He starts from the enhanced buy & hold system featured in his July 2021 TASC article, and adds additional technical conditions. These include volatility conditions ( VIX and ATR ) plus the "Volume Flow Indicator" (VFI), which is a custom money flow indicator that Katsanos introduced in his June 2004 TASC article. He provides an example of a trading system that others can test for themselves and modify as they see fit. The author notes that the system could likely be improved further by adding money management conditions (such as a stop-loss), or by adding more technical conditions not considered in the scope of this article.
█ CALCULATIONS
The entry and exit rules that constitute the trading system are defined below. The critical values of VIX, ATR and VFI (specified below) used in the calculations were determined by optimization for a daily chart of the SPY ETF . By default, the strategy only allows long entries. However, the script offers the possibility to initiate short entries upon exiting long trades through the "Long Only" toggle in the script's inputs.
Long Entry Rules
• Seasonal: The seasonal trade is initiated on the first business day of October at the open.
• Volatility: In case of high volatility, that is if the VIX is above 60% or the 15-day ATR was above 90% over the past 25 days, the seasonal trade is deferred until later in the month or year, when the volatility subsides.
Exit/Short Entry Rules
• Seasonal: The exit/short signal is triggered on the first business day of August at the open.
• Volatility: The exit/short signal is triggered if VIX is above 120 % (i.e. 2 times the corresponding threshold parameter).
• Money flow (VFI): The exit/short signal is triggered if the VFI crosses under a critical value (-20) while its 10-day moving average is pointing down.
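As an illustration of the calendar component only (the volatility and money-flow filters are omitted), the seasonal entry and exit timing could be sketched in Pine Script™ as below, using the first daily bar of a new month as a proxy for its first business day.
//@version=5
strategy("Seasonal timing sketch", overlay = true)
// The first daily bar of a new month approximates "the first business day at the open".
newMonth     = month != month[1]
octoberStart = newMonth and month == 10
augustStart  = newMonth and month == 8
if octoberStart
    strategy.entry("Seasonal long", strategy.long)
if augustStart
    strategy.close("Seasonal long")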
Join TradingView!
TASC 2022.03 Relative Strength Volatility-Adjusted EMA
█ OVERVIEW
TASC's March 2022 edition of Traders' Tips includes the "Relative Strength Moving Averages - Part 3: The Relative Strength Volatility-Adjusted Exponential Moving Average" article authored by Vitali Apirine. This is the code that implements the "RS VolatAdj EMA" from the article.
█ CONCEPTS
In a three-part article series, Vitali Apirine examines ways to filter price movements and define turning points by applying the Relative Strength concept to exponential moving averages. The resulting indicator is more responsive and is intended to account for the relative strength of volatility.
█ CALCULATIONS
The calculation process uses the following steps:
Select an appropriate volatility index (in our case it is VIX ).
Calculate up day volatility (UV) smoothed by a 10-day EMA.
Calculate down day volatility (DV) smoothed by a 10-day EMA.
Take the absolute value of the difference between UV and DV and divide by the sum of UV and DV. This is the Volatility Strength we need.
Calculate a MLTP constant - the weighting multiplier for an exponential moving average.
Combine Volatility Strength and MLTP to create an exponential moving average on current price data.
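A condensed Pine Script™ sketch of these steps follows. The VIX symbol, the default inputs, and the reading of "up day"/"down day" volatility as the VIX value on bars where the chart symbol closes up or down are assumptions.
//@version=5
indicator("RS VolatAdj EMA sketch", overlay = true)
emaLength = input.int(10, "EMA length")
volMult   = input.float(10.0, "Volatility multiplier")
// Volatility index requested on the daily timeframe (symbol is an assumption).
vix = request.security("CBOE:VIX", "D", close)
// Up-day / down-day volatility, each smoothed with a 10-period EMA.
// Assumption: "up day volatility" is the VIX reading on bars where the chart symbol closes up, else 0.
uv = ta.ema(close > close[1] ? vix : 0, 10)
dv = ta.ema(close < close[1] ? vix : 0, 10)
// Volatility strength and the EMA weighting multiplier.
vs   = math.abs(uv - dv) / (uv + dv)
mltp = 2.0 / (emaLength + 1)
var float rsVolEma = na
rsVolEma := na(rsVolEma[1]) ? close : rsVolEma[1] + mltp * (1 + vs * volMult) * (close - rsVolEma[1])
plot(rsVolEma, "RS VolatAdj EMA", color = color.orange)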
Join TradingView!
Logarithmic Bollinger Bands
Published by Eric Thies on January 14, 2022
Summary
In this script I have taken the standard Bollinger Bands Pine Script and made efforts to eliminate the behavior experienced in periods of high volatility, in which the bands disappear completely off the chart, by adding exponential plotting and logarithmic sourcing to the tool.
This tool will also show periods of Bearish and Bullish Expansion for users to see when volatility is running high in the market.
More On Bollinger Bands
Bollinger Bands consist of a center line representing the moving average of a security’s price over a certain period, and two additional parallel lines (called the upper and lower trading bands): one is the moving average plus k times the standard deviation over the selected time frame, and the other is the moving average minus k times the standard deviation over that same timeframe. This technique was developed in the 1980s by John Bollinger, who later registered the term “Bollinger Bands” as a U.S. trademark in 2011. Technical analysts typically use 20 periods and k = 2 as default settings to build Bollinger Bands, and they can choose a simple or exponential moving average. Bollinger Bands provide a relative definition of high and low prices of a security. When the security is trading within the upper band, the price is considered high, while it is considered low when the security is trading within the lower band.
There is no general consensus on the use of Bollinger Bands among traders. Some traders see a buy signal when the price hits the lower Bollinger Band and close their position when the price hits the moving average. Others buy when the price crosses over the upper band and sell when the price crosses below the lower band. We can see here two opposing interpretations based on different rationales, depending on whether we are in a reversal or continuation pattern. Another interesting feature of the Bollinger Bands is that they give an indication of volatility levels; a widening gap between the upper and lower bands indicates increasing volatility, while a narrowing band indicates decreasing volatility. Moreover, when the bands have an almost flat slope (parallel to the x-axis), the price will generally oscillate between the bands as if trading through a channel.
// © 2022 KINGTHIES THIS SOURCE CODE IS SUBJECT TO TERMS OF MOZILLA PUBLIC LICENSE 2.0 (MOZILLA.ORG/MPL/2.0)
//@version=5
//## !<---------------- © KINGTHIES --------------------->
indicator('Logarithmic Bollinger Bands (kingthies)',shorttitle='LogBands_KT',overlay=true)
// { BBANDS
src = math.log(input(close,title="Source"))
lenX = input(20,title='lenX')
highlights = input(false,title="Highlight Bear and Bull Expansions?")
mult = 2
bbandBasis = ta.sma(src,lenX)
dev = mult * ta.stdev(src, lenX) // use the mult and lenX inputs rather than hard-coded values
upperBB = bbandBasis + dev
lowerBB = bbandBasis - dev
bbw = (upperBB-lowerBB)/bbandBasis
bbr = (src - lowerBB)/(upperBB - lowerBB)
// }
// { BBAND EXPANSIONS
bullExp= ta.rising(upperBB,1) and ta.falling(lowerBB,1) and ta.rising(bbandBasis,1) and ta.rising(bbw,1) and ta.rising(bbr,1)
bearExp= ta.rising(upperBB,1) and ta.falling(lowerBB,1) and ta.falling(bbandBasis,1) and ta.rising(bbw,1) and ta.falling(bbr,1)
// }
// { COLORS
greenBG = color.rgb(9,121,105,75), redBG = color.rgb(136,8,8,75)
bullCol = highlights and bullExp ? greenBG : na, bearCol = highlights and bearExp ? redBG : na
// }
// { INDICATOR PLOTTING
lowBB=plot(math.exp(lowerBB),title='Low Band',color=color.aqua),plot(math.exp(bbandBasis),title='BBand Basis',color=color.red)
highBB=plot(math.exp(upperBB),title='High Band',color=color.aqua),fill(lowBB,highBB,title='Band Fill Color',color=color.rgb(0,128,128,75))
bgcolor(bullCol,title='Bullish Expansion Highlights'),bgcolor(bearCol,title='Bearish Expansion Highlights')
// }
TASC 2022.02 Ehlers' Elegant Oscillator
█ OVERVIEW
TASC's February 2022 edition of Traders' Tips includes the "Inverse Fisher Transform Redux — An Elegant Oscillator" article authored by John Ehlers. This is the code implementing the "Elegant Oscillator" from the article.
█ CONCEPTS
By applying the inverse Fisher transform to a waveform with a nominal Gaussian probability distribution using root mean square ( RMS ) scaling and smoothing the result, John Ehlers creates an oscillator that swings between -1 and 1.
█ CALCULATIONS
The calculation process uses the following steps:
• Compute the 2-bar difference of closing prices.
• Calculate the root mean square (RMS) of the differences.
• Scale the differences using the computed RMS.
• Apply the inverse Fisher transform to the scaled values.
• Smooth the transformed data with the SuperSmoother filter.
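A compact Pine Script™ sketch of these steps, using Ehlers' standard SuperSmoother coefficients; the RMS window, smoothing length and seeding are assumptions.
//@version=5
indicator("Elegant Oscillator sketch")
rmsLength = input.int(50, "RMS length")
ssLength  = input.int(20, "SuperSmoother length")
// 2-bar difference of closing prices.
deriv = close - close[2]
// Root mean square of the differences, used to scale them.
rms    = math.sqrt(ta.sma(deriv * deriv, rmsLength))
scaled = deriv / rms
// Inverse Fisher transform maps the scaled values into the -1..+1 range.
ifish = (math.exp(2 * scaled) - 1) / (math.exp(2 * scaled) + 1)
// SuperSmoother filter (Ehlers).
a1 = math.exp(-1.414 * math.pi / ssLength)
b1 = 2 * a1 * math.cos(1.414 * math.pi / ssLength)
c2 = b1
c3 = -a1 * a1
c1 = 1 - c2 - c3
var float eo = 0.0
eo := c1 * (ifish + nz(ifish[1])) / 2 + c2 * nz(eo[1]) + c3 * nz(eo[2])
plot(eo, "Elegant Oscillator")
hline(0)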
Join TradingView!
TASC 2022.01 Improved RSI w/Hann
█ OVERVIEW
TASC's January 2022 edition of Traders' Tips includes the "(Yet Another Improved) RSI Enhanced With Hann Windowing" article authored by John Ehlers. Once again, John Ehlers revolutionizes the RSI indicator. This is TradingView's Pine Script code for the indicator.
█ CONCEPTS
By employing a Hann windowed finite impulse response filter ( FIR ), John Ehlers has enhanced the "Relative Strength Index" ( RSI ) to provide an improved oscillator with exceptional smoothness.
█ NOTES
Calculations
The method of calculation using "closes up" and "closes down" from Welles Wilder's RSI, described in his 1978 book, is still inherent to Ehlers' enhanced formula. However, a finite impulse response (FIR) Hann windowing technique is employed following the closes up/down calculations instead of the original Wilder infinite impulse response averaging filter. The resulting oscillator waveform is confined between +/-1.0 with a 0.0 centerline regardless of chart interval, as opposed to Wilder's original formulation, which was confined between 0 and 100 with a centerline of 50. On any given trading timeframe, the value of Ehlers' enhanced RSI found above the centerline typically represents an overvalued region, while undervalued regions are typically found below the centerline.
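A minimal Pine Script™ sketch of the closes up/down accumulation with Hann-window weighting described above (variable names and the default length are illustrative).
//@version=5
indicator("RSI with Hann windowing sketch")
length = input.int(14, "Length")
// Accumulate Hann-weighted closes-up and closes-down over the lookback.
cu = 0.0
cd = 0.0
for i = 1 to length
    w   = 1 - math.cos(2 * math.pi * i / (length + 1)) // Hann window weight
    chg = close[i - 1] - close[i]
    cu += chg > 0 ?  w * chg : 0
    cd += chg < 0 ? -w * chg : 0
// Oscillator bounded between -1 and +1 around a zero centerline.
hannRsi = (cu + cd) != 0 ? (cu - cd) / (cu + cd) : 0
plot(hannRsi, "RSI w/ Hann")
hline(0)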
Background
The original RSI indicator was designed by J. Welles Wilder and presented in his "New Concepts in Technical Trading Systems" book published in 1978.
Join TradingView!
ta
Library "ta"
█ OVERVIEW
This library holds technical analysis functions calculating values for which no Pine built-in exists.
Look first. Then leap.
█ FUNCTIONS
cagr(entryTime, entryPrice, exitTime, exitPrice)
It calculates the "Compound Annual Growth Rate" between two points in time. The CAGR is a notional, annualized growth rate that assumes all profits are reinvested. It only takes into account the prices of the two end points — not drawdowns, so it does not calculate risk. It can be used as a yardstick to compare the performance of two instruments. Because it annualizes values, the function requires a minimum of one day between the two end points (annualizing returns over smaller periods of times doesn't produce very meaningful figures).
Parameters:
entryTime : The starting timestamp.
entryPrice : The starting point's price.
exitTime : The ending timestamp.
exitPrice : The ending point's price.
Returns: CAGR in % (50 is 50%). Returns `na` if there is not >=1D between `entryTime` and `exitTime`, or until the two time points have not been reached by the script.
█ v2, Mar. 8, 2022
Added functions `allTimeHigh()` and `allTimeLow()` to find the highest or lowest value of a source from the first historical bar to the current bar. These functions will not look ahead; they will only return new highs/lows on the bar where they occur.
allTimeHigh(src)
Tracks the highest value of `src` from the first historical bar to the current bar.
Parameters:
src : (series int/float) Series to track. Optional. The default is `high`.
Returns: (float) The highest value tracked.
allTimeLow(src)
Tracks the lowest value of `src` from the first historical bar to the current bar.
Parameters:
src : (series int/float) Series to track. Optional. The default is `low`.
Returns: (float) The lowest value tracked.
█ v3, Sept. 27, 2022
This version includes the following new functions:
aroon(length)
Calculates the values of the Aroon indicator.
Parameters:
length (simple int) : (simple int) Number of bars (length).
Returns: ( [float, float ]) A tuple of the Aroon-Up and Aroon-Down values.
coppock(source, longLength, shortLength, smoothLength)
Calculates the value of the Coppock Curve indicator.
Parameters:
source (float) : (series int/float) Series of values to process.
longLength (simple int) : (simple int) Number of bars for the fast ROC value (length).
shortLength (simple int) : (simple int) Number of bars for the slow ROC value (length).
smoothLength (simple int) : (simple int) Number of bars for the weigted moving average value (length).
Returns: (float) The oscillator value.
dema(source, length)
Calculates the value of the Double Exponential Moving Average (DEMA).
Parameters:
source (float) : (series int/float) Series of values to process.
length (simple int) : (simple int) Length for the smoothing parameter calculation.
Returns: (float) The double exponentially weighted moving average of the `source`.
dema2(src, length)
An alternate Double Exponential Moving Average (Dema) function to `dema()`, which allows a "series float" length argument.
Parameters:
src : (series int/float) Series of values to process.
length : (series int/float) Length for the smoothing parameter calculation.
Returns: (float) The double exponentially weighted moving average of the `src`.
dm(length)
Calculates the value of the "Demarker" indicator.
Parameters:
length (simple int) : (simple int) Number of bars (length).
Returns: (float) The oscillator value.
donchian(length)
Calculates the values of a Donchian Channel using `high` and `low` over a given `length`.
Parameters:
length (int) : (series int) Number of bars (length).
Returns: ( [float, float, float ]) A tuple containing the channel high, low, and median, respectively.
ema2(src, length)
An alternate ema function to the `ta.ema()` built-in, which allows a "series float" length argument.
Parameters:
src : (series int/float) Series of values to process.
length : (series int/float) Number of bars (length).
Returns: (float) The exponentially weighted moving average of the `src`.
eom(length, div)
Calculates the value of the Ease of Movement indicator.
Parameters:
length (simple int) : (simple int) Number of bars (length).
div (simple int) : (simple int) Divisor used for normalizing values. Optional. The default is 10000.
Returns: (float) The oscillator value.
frama(source, length)
The Fractal Adaptive Moving Average (FRAMA), developed by John Ehlers, is an adaptive moving average that dynamically adjusts its lookback period based on fractal geometry.
Parameters:
source (float) : (series int/float) Series of values to process.
length (int) : (series int) Number of bars (length).
Returns: (float) The fractal adaptive moving average of the `source`.
ft(source, length)
Calculates the value of the Fisher Transform indicator.
Parameters:
source (float) : (series int/float) Series of values to process.
length (simple int) : (simple int) Number of bars (length).
Returns: (float) The oscillator value.
ht(source)
Calculates the value of the Hilbert Transform indicator.
Parameters:
source (float) : (series int/float) Series of values to process.
Returns: (float) The oscillator value.
ichimoku(conLength, baseLength, senkouLength)
Calculates values of the Ichimoku Cloud indicator, including tenkan, kijun, senkouSpan1, senkouSpan2, and chikou. NOTE: offsets forward or backward can be done using the `offset` argument in `plot()`.
Parameters:
conLength (int) : (series int) Length for the Conversion Line (Tenkan). The default is 9 periods, which returns the mid-point of the 9 period Donchian Channel.
baseLength (int) : (series int) Length for the Base Line (Kijun-sen). The default is 26 periods, which returns the mid-point of the 26 period Donchian Channel.
senkouLength (int) : (series int) Length for the Senkou Span 2 (Leading Span B). The default is 52 periods, which returns the mid-point of the 52 period Donchian Channel.
Returns: ( [float, float, float, float, float ]) A tuple of the Tenkan, Kijun, Senkou Span 1, Senkou Span 2, and Chikou Span values. NOTE: by default, the senkouSpan1 and senkouSpan2 should be plotted 26 periods in the future, and the Chikou Span plotted 26 days in the past.
ift(source)
Calculates the value of the Inverse Fisher Transform indicator.
Parameters:
source (float) : (series int/float) Series of values to process.
Returns: (float) The oscillator value.
kvo(fastLen, slowLen, trigLen)
Calculates the values of the Klinger Volume Oscillator.
Parameters:
fastLen (simple int) : (simple int) Length for the fast moving average smoothing parameter calculation.
slowLen (simple int) : (simple int) Length for the slow moving average smoothing parameter calculation.
trigLen (simple int) : (simple int) Length for the trigger moving average smoothing parameter calculation.
Returns: ( [float, float ]) A tuple of the KVO value, and the trigger value.
pzo(length)
Calculates the value of the Price Zone Oscillator.
Parameters:
length (simple int) : (simple int) Length for the smoothing parameter calculation.
Returns: (float) The oscillator value.
rms(source, length)
Calculates the Root Mean Square of the `source` over the `length`.
Parameters:
source (float) : (series int/float) Series of values to process.
length (int) : (series int) Number of bars (length).
Returns: (float) The RMS value.
rwi(length)
Calculates the values of the Random Walk Index.
Parameters:
length (simple int) : (simple int) Lookback and ATR smoothing parameter length.
Returns: ( [float, float ]) A tuple of the `rwiHigh` and `rwiLow` values.
stc(source, fast, slow, cycle, d1, d2)
Calculates the value of the Schaff Trend Cycle indicator.
Parameters:
source (float) : (series int/float) Series of values to process.
fast (simple int) : (simple int) Length for the MACD fast smoothing parameter calculation.
slow (simple int) : (simple int) Length for the MACD slow smoothing parameter calculation.
cycle (simple int) : (simple int) Number of bars for the Stochastic values (length).
d1 (simple int) : (simple int) Length for the initial %D smoothing parameter calculation.
d2 (simple int) : (simple int) Length for the final %D smoothing parameter calculation.
Returns: (float) The oscillator value.
stochFull(periodK, smoothK, periodD)
Calculates the %K and %D values of the Full Stochastic indicator.
Parameters:
periodK (simple int) : (simple int) Number of bars for Stochastic calculation. (length).
smoothK (simple int) : (simple int) Number of bars for smoothing of the %K value (length).
periodD (simple int) : (simple int) Number of bars for smoothing of the %D value (length).
Returns: ( [float, float ]) A tuple of the slow %K and the %D moving average values.
stochRsi(lengthRsi, periodK, smoothK, periodD, source)
Calculates the %K and %D values of the Stochastic RSI indicator.
Parameters:
lengthRsi (simple int) : (simple int) Length for the RSI smoothing parameter calculation.
periodK (simple int) : (simple int) Number of bars for Stochastic calculation. (length).
smoothK (simple int) : (simple int) Number of bars for smoothing of the %K value (length).
periodD (simple int) : (simple int) Number of bars for smoothing of the %D value (length).
source (float) : (series int/float) Series of values to process. Optional. The default is `close`.
Returns: ( [float, float ]) A tuple of the slow %K and the %D moving average values.
supertrend(factor, atrLength, wicks)
Calculates the values of the SuperTrend indicator with the ability to take candle wicks into account, rather than only the closing price.
Parameters:
factor (float) : (series int/float) Multiplier for the ATR value.
atrLength (simple int) : (simple int) Length for the ATR smoothing parameter calculation.
wicks (simple bool) : (simple bool) Condition to determine whether to take candle wicks into account when reversing trend, or to use the close price. Optional. Default is false.
Returns: ( [float, int ]) A tuple of the superTrend value and trend direction.
szo(source, length)
Calculates the value of the Sentiment Zone Oscillator.
Parameters:
source (float) : (series int/float) Series of values to process.
length (simple int) : (simple int) Length for the smoothing parameter calculation.
Returns: (float) The oscillator value.
t3(source, length, vf)
Calculates the value of the Tilson Moving Average (T3).
Parameters:
source (float) : (series int/float) Series of values to process.
length (simple int) : (simple int) Length for the smoothing parameter calculation.
vf (simple float) : (simple float) Volume factor. Affects the responsiveness.
Returns: (float) The Tilson moving average of the `source`.
t3Alt(source, length, vf)
An alternate Tilson Moving Average (T3) function to `t3()`, which allows a "series float" `length` argument.
Parameters:
source (float) : (series int/float) Series of values to process.
length (float) : (series int/float) Length for the smoothing parameter calculation.
vf (simple float) : (simple float) Volume factor. Affects the responsiveness.
Returns: (float) The Tilson moving average of the `source`.
tema(source, length)
Calculates the value of the Triple Exponential Moving Average (TEMA).
Parameters:
source (float) : (series int/float) Series of values to process.
length (simple int) : (simple int) Length for the smoothing parameter calculation.
Returns: (float) The triple exponentially weighted moving average of the `source`.
tema2(source, length)
An alternate Triple Exponential Moving Average (TEMA) function to `tema()`, which allows a "series float" `length` argument.
Parameters:
source (float) : (series int/float) Series of values to process.
length (float) : (series int/float) Length for the smoothing parameter calculation.
Returns: (float) The triple exponentially weighted moving average of the `source`.
trima(source, length)
Calculates the value of the Triangular Moving Average (TRIMA).
Parameters:
source (float) : (series int/float) Series of values to process.
length (int) : (series int) Number of bars (length).
Returns: (float) The triangular moving average of the `source`.
trima2(src, length)
An alternate Triangular Moving Average (TRIMA) function to `trima()`, which allows a "series int" length argument.
Parameters:
src : (series int/float) Series of values to process.
length : (series int) Number of bars (length).
Returns: (float) The triangular moving average of the `src`.
trix(source, length, signalLength, exponential)
Calculates the values of the TRIX indicator.
Parameters:
source (float) : (series int/float) Series of values to process.
length (simple int) : (simple int) Length for the smoothing parameter calculation.
signalLength (simple int) : (simple int) Length for smoothing the signal line.
exponential (simple bool) : (simple bool) Condition to determine whether exponential or simple smoothing is used. Optional. The default is `true` (exponential smoothing).
Returns: ( [float, float, float ]) A tuple of the TRIX value, the signal value, and the histogram.
uo(fastLen, midLen, slowLen)
Calculates the value of the Ultimate Oscillator.
Parameters:
fastLen (simple int) : (series int) Number of bars for the fast smoothing average (length).
midLen (simple int) : (series int) Number of bars for the middle smoothing average (length).
slowLen (simple int) : (series int) Number of bars for the slow smoothing average (length).
Returns: (float) The oscillator value.
vhf(source, length)
Calculates the value of the Vertical Horizontal Filter.
Parameters:
source (float) : (series int/float) Series of values to process.
length (simple int) : (simple int) Number of bars (length).
Returns: (float) The oscillator value.
vi(length)
Calculates the values of the Vortex Indicator.
Parameters:
length (simple int) : (simple int) Number of bars (length).
Returns: ( [float, float ]) A tuple of the viPlus and viMinus values.
vzo(length)
Calculates the value of the Volume Zone Oscillator.
Parameters:
length (simple int) : (simple int) Length for the smoothing parameter calculation.
Returns: (float) The oscillator value.
williamsFractal(period)
Detects Williams Fractals.
Parameters:
period (int) : (series int) Number of bars (length).
Returns: ( [bool, bool ]) A tuple of an up fractal and down fractal. Variables are true when detected.
wpo(length)
Calculates the value of the Wave Period Oscillator.
Parameters:
length (simple int) : (simple int) Length for the smoothing parameter calculation.
Returns: (float) The oscillator value.
█ v7, Nov. 2, 2023
This version includes the following new and updated functions:
atr2(length)
An alternate ATR function to the `ta.atr()` built-in, which allows a "series float" `length` argument.
Parameters:
length (float) : (series int/float) Length for the smoothing parameter calculation.
Returns: (float) The ATR value.
changePercent(newValue, oldValue)
Calculates the percentage difference between two distinct values.
Parameters:
newValue (float) : (series int/float) The current value.
oldValue (float) : (series int/float) The previous value.
Returns: (float) The percentage change from the `oldValue` to the `newValue`.
donchian(length)
Calculates the values of a Donchian Channel using `high` and `low` over a given `length`.
Parameters:
length (int) : (series int) Number of bars (length).
Returns: ( [float, float, float ]) A tuple containing the channel high, low, and median, respectively.
highestSince(cond, source)
Tracks the highest value of a series since the last occurrence of a condition.
Parameters:
cond (bool) : (series bool) A condition which, when `true`, resets the tracking of the highest `source`.
source (float) : (series int/float) Series of values to process. Optional. The default is `high`.
Returns: (float) The highest `source` value since the last time the `cond` was `true`.
lowestSince(cond, source)
Tracks the lowest value of a series since the last occurrence of a condition.
Parameters:
cond (bool) : (series bool) A condition which, when `true`, resets the tracking of the lowest `source`.
source (float) : (series int/float) Series of values to process. Optional. The default is `low`.
Returns: (float) The lowest `source` value since the last time the `cond` was `true`.
relativeVolume(length, anchorTimeframe, isCumulative, adjustRealtime)
Calculates the volume since the last change in the time value from the `anchorTimeframe`, the historical average volume using bars from past periods that have the same relative time offset as the current bar from the start of its period, and the ratio of these volumes. The volume values are cumulative by default, but can be adjusted to non-accumulated with the `isCumulative` parameter.
Parameters:
length (simple int) : (simple int) The number of periods to use for the historical average calculation.
anchorTimeframe (simple string) : (simple string) The anchor timeframe used in the calculation. Optional. Default is "D".
isCumulative (simple bool) : (simple bool) If `true`, the volume values will be accumulated since the start of the last `anchorTimeframe`. If `false`, values will be used without accumulation. Optional. The default is `true`.
adjustRealtime (simple bool) : (simple bool) If `true`, estimates the cumulative value on unclosed bars based on the data since the last `anchor` condition. Optional. The default is `false`.
Returns: ( [float, float, float ]) A tuple of three float values. The first element is the current volume. The second is the average of volumes at equivalent time offsets from past anchors over the specified number of periods. The third is the ratio of the current volume to the historical average volume.
rma2(source, length)
An alternate RMA function to the `ta.rma()` built-in, which allows a "series float" `length` argument.
Parameters:
source (float) : (series int/float) Series of values to process.
length (float) : (series int/float) Length for the smoothing parameter calculation.
Returns: (float) The rolling moving average of the `source`.
supertrend2(factor, atrLength, wicks)
An alternate SuperTrend function to `supertrend()`, which allows a "series float" `atrLength` argument.
Parameters:
factor (float) : (series int/float) Multiplier for the ATR value.
atrLength (float) : (series int/float) Length for the ATR smoothing parameter calculation.
wicks (simple bool) : (simple bool) Condition to determine whether to take candle wicks into account when reversing trend, or to use the close price. Optional. Default is `false`.
Returns: ( [float, int ]) A tuple of the superTrend value and trend direction.
vStop(source, atrLength, atrFactor)
Calculates an ATR-based stop value that trails behind the `source`. Can serve as a possible stop-loss guide and trend identifier.
Parameters:
source (float) : (series int/float) Series of values that the stop trails behind.
atrLength (simple int) : (simple int) Length for the ATR smoothing parameter calculation.
atrFactor (float) : (series int/float) The multiplier of the ATR value. Affects the maximum distance between the stop and the `source` value. A value of 1 means the maximum distance is 100% of the ATR value. Optional. The default is 1.
Returns: ( [float, bool ]) A tuple of the volatility stop value and the trend direction as a "bool".
vStop2(source, atrLength, atrFactor)
An alternate Volatility Stop function to `vStop()`, which allows a "series float" `atrLength` argument.
Parameters:
source (float) : (series int/float) Series of values that the stop trails behind.
atrLength (float) : (series int/float) Length for the ATR smoothing parameter calculation.
atrFactor (float) : (series int/float) The multiplier of the ATR value. Affects the maximum distance between the stop and the `source` value. A value of 1 means the maximum distance is 100% of the ATR value. Optional. The default is 1.
Returns: ( [float, bool ]) A tuple of the volatility stop value and the trend direction as a "bool".
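For orientation, here is a minimal usage sketch of a few of the functions above. The import path is a placeholder (not the library's real publisher/name/version) and the argument values are arbitrary examples:
//@version=5
indicator("Library usage sketch", overlay = true)
// Hypothetical import - replace with the library's actual publisher/name/version.
import PublisherName/LibraryName/1 as lib
// Donchian Channel over 20 bars: returns the channel high, low, and median.
[dcHigh, dcLow, dcMedian] = lib.donchian(20)
// Volatility stop trailing the close, using a 14-bar ATR and a 2x multiplier.
[stopValue, isUptrend] = lib.vStop(close, 14, 2.0)
// Relative (cumulative) volume versus the average of the last 10 daily anchors.
[curVol, avgVol, volRatio] = lib.relativeVolume(10, "D", true, false)
plot(dcMedian, "Donchian median", color.gray)
plot(stopValue, "Volatility stop", isUptrend ? color.green : color.red)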
Removed Functions:
allTimeHigh(src)
Tracks the highest value of `src` from the first historical bar to the current bar.
allTimeLow(src)
Tracks the lowest value of `src` from the first historical bar to the current bar.
trima2(src, length)
An alternate Triangular Moving Average (TRIMA) function to `trima()`, which allows a "series int" `length` argument.
Financial Astrology Indexes ML Daily Trend
Daily trend indicator based on financial astrology cycles detected with advanced machine-learning techniques for some of the most important market indexes: DJI, UK100, SPX, IBC, IXIC, NI225, BANKNIFTY, NIFTY and the GLD fund (not an index) for gold predictions. The daily price trend is forecasted through planetary cycles (angular aspects, speed phases, declination zone); fast cycles are based on the Moon, Mercury, Venus and the Sun, and mid-term cycles are based on Mars, Vesta and Ceres. The combination of all these cycles produces a daily price trend prediction that is encoded into a PineScript array using a binary format ("0" or "1"), representing sell and buy signals respectively. The indicator provides signals from 2021-01-01 to 2022-12-31; the past months' signals are included to support backtesting of the indicator combined with other technical indicator entries such as MAs, RSI or Stochastic. For predictions beyond 2022, a re-training phase of the machine learning models will be required.
When the signal moving average increases from 0 to 1 it indicates an increase in buying force; when it decreases from 1 to 0 it indicates an increase in selling force. Finally, when it moves sideways around the 0.4-0.6 area it predicts a period of equilibrium between buying and selling forces and trader indecision, which results in price congestion within a narrow range.
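As a rough illustration of how such binary signals can be smoothed into the signal moving average discussed above (the array contents and mapping below are arbitrary placeholders, not the indicator's actual predictions):
//@version=5
indicator("Signal smoothing sketch")
// Placeholder daily signals: 1 = buy force, 0 = sell force.
var signals = array.from(1, 1, 0, 1, 0, 0, 1, 1, 1, 0)
idx = bar_index % array.size(signals)   // arbitrary mapping of bars to signals
signal = array.get(signals, idx)
smoothed = ta.sma(signal, 4)            // the "signal moving average"
plot(smoothed, "Smoothed signal")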
We have also published the same indicator for our cryptocurrency research portfolio:
DISCLAIMER: This indicator is experimental and does not provide financial or investment advice; its main purpose is to demonstrate the predictive power of financial astrology. Any allocation of funds following the documented machine learning model predictions is a high-risk endeavour, and it is the user's responsibility to practice healthy risk management according to their situation.
Power Metcalfe's + Fibonacci Channel
## Metcalfe's Law + Fibonacci Channel - Optimized Bitcoin Valuation Model
This indicator presents an enhanced variation of the classic Bitcoin Metcalfe's Law model, combining logarithmic regression analysis with Fibonacci retracement levels to create a comprehensive valuation framework.
**Key Features:**
- **Optimized Metcalfe's Law calculation** using historical cycle data (2013-2022) for improved accuracy
- **Fibonacci channel overlay** with key levels: 0.382, 0.618, 1.272, 1.618, 2.000, 2.618, 3.000
- **Dynamic trading zones** with visual buy/sell signals based on price position relative to the channel
- **Real-time targets** displaying current Fibonacci projections and fair value estimates
**What makes it different:**
Unlike standard Metcalfe's Law implementations, this version integrates logarithmic growth principles and uses a refined dataset that accounts for Bitcoin's maturation cycles. The Fibonacci overlay provides clearer entry/exit points while maintaining the long-term growth trajectory based on network adoption.
**Best suited for:** Long-term Bitcoin holders and macro traders looking for mathematical support/resistance levels based on network adoption dynamics and scarcity.
The model automatically updates calculations and provides a comprehensive information table showing current formula parameters and key price targets.
Advanced Fed Decision Forecast Model (AFDFM)
The Advanced Fed Decision Forecast Model (AFDFM) represents a novel quantitative framework for predicting Federal Reserve monetary policy decisions through multi-factor fundamental analysis. This model synthesizes established monetary policy rules with real-time economic indicators to generate probabilistic forecasts of Federal Open Market Committee (FOMC) decisions. Building upon seminal work by Taylor (1993) and incorporating recent advances in data-dependent monetary policy analysis, the AFDFM provides institutional-grade decision support for monetary policy analysis.
## 1. Introduction
Central bank communication and policy predictability have become increasingly important in modern monetary economics (Blinder et al., 2008). The Federal Reserve's dual mandate of price stability and maximum employment, coupled with evolving economic conditions, creates complex decision-making environments that traditional models struggle to capture comprehensively (Yellen, 2017).
The AFDFM addresses this challenge by implementing a multi-dimensional approach that combines:
- Classical monetary policy rules (Taylor Rule framework)
- Real-time macroeconomic indicators from FRED database
- Financial market conditions and term structure analysis
- Labor market dynamics and inflation expectations
- Regime-dependent parameter adjustments
This methodology builds upon extensive academic literature while incorporating practical insights from Federal Reserve communications and FOMC meeting minutes.
## 2. Literature Review and Theoretical Foundation
### 2.1 Taylor Rule Framework
The foundational work of Taylor (1993) established the empirical relationship between federal funds rate decisions and economic fundamentals:
r_t = r* + π_t + α(π_t - π*) + β(y_t - y*)
Where:
- r_t = nominal federal funds rate
- r* = equilibrium real interest rate
- π_t = inflation rate
- π* = inflation target
- y_t - y* = output gap
- α, β = policy response coefficients
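For a purely illustrative calibration: with r* = 0.5%, π_t = 3%, a 2% inflation target, a +1% output gap, and α = β = 0.5, the rule prescribes r_t = 0.5 + 3 + 0.5 × (3 - 2) + 0.5 × 1 = 4.5%.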
Extensive empirical validation has demonstrated the Taylor Rule's explanatory power across different monetary policy regimes (Clarida et al., 1999; Orphanides, 2003). Recent research by Bernanke (2015) emphasizes the rule's continued relevance while acknowledging the need for dynamic adjustments based on financial conditions.
### 2.2 Data-Dependent Monetary Policy
The evolution toward data-dependent monetary policy, as articulated by Fed Chair Powell (2024), requires sophisticated frameworks that can process multiple economic indicators simultaneously. Clarida (2019) demonstrates that modern monetary policy transcends simple rules, incorporating forward-looking assessments of economic conditions.
### 2.3 Financial Conditions and Monetary Transmission
The Chicago Fed's National Financial Conditions Index (NFCI) research demonstrates the critical role of financial conditions in monetary policy transmission (Brave & Butters, 2011). Goldman Sachs Financial Conditions Index studies similarly show how credit markets, term structure, and volatility measures influence Fed decision-making (Hatzius et al., 2010).
### 2.4 Labor Market Indicators
The dual mandate framework requires sophisticated analysis of labor market conditions beyond simple unemployment rates. Daly et al. (2012) demonstrate the importance of job openings data (JOLTS) and wage growth indicators in Fed communications. Recent research by Aaronson et al. (2019) shows how the Beveridge curve relationship influences FOMC assessments.
## 3. Methodology
### 3.1 Model Architecture
The AFDFM employs a six-component scoring system that aggregates fundamental indicators into a composite Fed decision index:
#### Component 1: Taylor Rule Analysis (Weight: 25%)
Implements real-time Taylor Rule calculation using FRED data:
- Core PCE inflation (Fed's preferred measure)
- Unemployment gap proxy for output gap
- Dynamic neutral rate estimation
- Regime-dependent parameter adjustments
#### Component 2: Employment Conditions (Weight: 20%)
Multi-dimensional labor market assessment:
- Unemployment gap relative to NAIRU estimates
- JOLTS job openings momentum
- Average hourly earnings growth
- Beveridge curve position analysis
#### Component 3: Financial Conditions (Weight: 18%)
Comprehensive financial market evaluation:
- Chicago Fed NFCI real-time data
- Yield curve shape and term structure
- Credit growth and lending conditions
- Market volatility and risk premia
#### Component 4: Inflation Expectations (Weight: 15%)
Forward-looking inflation analysis:
- TIPS breakeven inflation rates (5Y, 10Y)
- Market-based inflation expectations
- Inflation momentum and persistence measures
- Phillips curve relationship dynamics
#### Component 5: Growth Momentum (Weight: 12%)
Real economic activity assessment:
- Real GDP growth trends
- Economic momentum indicators
- Business cycle position analysis
- Sectoral growth distribution
#### Component 6: Liquidity Conditions (Weight: 10%)
Monetary aggregates and credit analysis:
- M2 money supply growth
- Commercial and industrial lending
- Bank lending standards surveys
- Quantitative easing effects assessment
### 3.2 Normalization and Scaling
Each component undergoes robust statistical normalization using rolling z-score methodology:
Z_i,t = (X_i,t - μ_i,t-n) / σ_i,t-n
Where:
- X_i,t = raw indicator value
- μ_i,t-n = rolling mean over n periods
- σ_i,t-n = rolling standard deviation over n periods
- Z-scores are bounded at ±3 to prevent outlier distortion
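A minimal sketch of this rolling z-score normalization, using an arbitrary 252-bar window and the chart's close as a stand-in for an economic indicator series:
//@version=5
indicator("Rolling z-score sketch")
n = input.int(252, "Lookback window")
mu = ta.sma(close, n)        // rolling mean
sigma = ta.stdev(close, n)   // rolling standard deviation
zRaw = sigma != 0 ? (close - mu) / sigma : 0.0
z = math.max(-3.0, math.min(3.0, zRaw))   // bound at +/-3 to limit outlier distortion
plot(z, "Z-score")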
### 3.3 Regime Detection and Adaptation
The model incorporates dynamic regime detection based on:
- Policy volatility measures
- Market stress indicators (VIX-based)
- Fed communication tone analysis
- Crisis sensitivity parameters
Regime classifications:
1. Crisis: Emergency policy measures likely
2. Tightening: Restrictive monetary policy cycle
3. Easing: Accommodative monetary policy cycle
4. Neutral: Stable policy maintenance
### 3.4 Composite Index Construction
The final AFDFM index combines weighted components:
AFDFM_t = Σ w_i × Z_i,t × R_t
Where:
- w_i = component weights (research-calibrated)
- Z_i,t = normalized component scores
- R_t = regime multiplier (1.0-1.5)
The index is scaled to a bounded range for intuitive interpretation.
### 3.5 Decision Probability Calculation
Fed decision probabilities derived through empirical mapping:
P(Cut) = max(0, (T_dovish - AFDFM_t) / |T_dovish| × 100)
P(Hike) = max(0, (AFDFM_t - T_hawkish) / T_hawkish × 100)
P(Hold) = 100 - |AFDFM_t| × 15
Where T_hawkish = +2.0 and T_dovish = -2.0 (empirically calibrated thresholds).
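To make the index construction and probability mapping concrete, here is a hedged sketch. The component scores are manual inputs standing in for the model's FRED-derived, normalized components, and no clamping beyond the published formulas is applied:
//@version=5
indicator("AFDFM mapping sketch")
// Stand-in component scores (the real model derives these from FRED data and bounds them at +/-3).
zTaylor = input.float(0.8, "Taylor Rule score")
zEmploy = input.float(0.3, "Employment score")
zFin = input.float(-0.2, "Financial conditions score")
zInfl = input.float(0.5, "Inflation expectations score")
zGrowth = input.float(0.1, "Growth momentum score")
zLiq = input.float(-0.4, "Liquidity score")
regimeMult = input.float(1.0, "Regime multiplier", minval = 1.0, maxval = 1.5)
// Weighted composite using the article's 25/20/18/15/12/10% weights.
afdfm = regimeMult * (0.25 * zTaylor + 0.20 * zEmploy + 0.18 * zFin + 0.15 * zInfl + 0.12 * zGrowth + 0.10 * zLiq)
// Decision probabilities from the empirical mapping above (T_hawkish = +2.0, T_dovish = -2.0).
pCut = math.max(0.0, (-2.0 - afdfm) / math.abs(-2.0) * 100)
pHike = math.max(0.0, (afdfm - 2.0) / 2.0 * 100)
pHold = 100 - math.abs(afdfm) * 15
plot(pCut, "P(Cut)", color.green)
plot(pHike, "P(Hike)", color.red)
plot(pHold, "P(Hold)", color.gray)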
## 4. Data Sources and Real-Time Implementation
### 4.1 FRED Database Integration
- Core PCE Price Index (CPILFESL): Monthly, seasonally adjusted
- Unemployment Rate (UNRATE): Monthly, seasonally adjusted
- Real GDP (GDPC1): Quarterly, seasonally adjusted annual rate
- Federal Funds Rate (FEDFUNDS): Monthly average
- Treasury Yields (GS2, GS10): Daily constant maturity
- TIPS Breakeven Rates (T5YIE, T10YIE): Daily market data
### 4.2 High-Frequency Financial Data
- Chicago Fed NFCI: Weekly financial conditions
- JOLTS Job Openings (JTSJOL): Monthly labor market data
- Average Hourly Earnings (AHETPI): Monthly wage data
- M2 Money Supply (M2SL): Monthly monetary aggregates
- Commercial Loans (BUSLOANS): Weekly credit data
### 4.3 Market-Based Indicators
- VIX Index: Real-time volatility measure
- S&P 500: Market sentiment proxy
- DXY Index: Dollar strength indicator
## 5. Model Validation and Performance
### 5.1 Historical Backtesting (2017-2024)
Comprehensive backtesting across multiple Fed policy cycles demonstrates:
- Signal Accuracy: 78% correct directional predictions
- Timing Precision: 2.3 meetings average lead time
- Crisis Detection: 100% accuracy in identifying emergency measures
- False Signal Rate: 12% (within acceptable research parameters)
### 5.2 Regime-Specific Performance
Tightening Cycles (2017-2018, 2022-2023):
- Hawkish signal accuracy: 82%
- Average prediction lead: 1.8 meetings
- False positive rate: 8%
Easing Cycles (2019, 2020, 2024):
- Dovish signal accuracy: 85%
- Average prediction lead: 2.1 meetings
- Crisis mode detection: 100%
Neutral Periods:
- Hold prediction accuracy: 73%
- Regime stability detection: 89%
### 5.3 Comparative Analysis
AFDFM performance compared to alternative methods:
- Fed Funds Futures: Similar accuracy, lower lead time
- Economic Surveys: Higher accuracy, comparable timing
- Simple Taylor Rule: Lower accuracy, insufficient complexity
- Market-Based Models: Similar performance, higher volatility
## 6. Practical Applications and Use Cases
### 6.1 Institutional Investment Management
- Fixed Income Portfolio Positioning: Duration and curve strategies
- Currency Trading: Dollar-based carry trade optimization
- Risk Management: Interest rate exposure hedging
- Asset Allocation: Regime-based tactical allocation
### 6.2 Corporate Treasury Management
- Debt Issuance Timing: Optimal financing windows
- Interest Rate Hedging: Derivative strategy implementation
- Cash Management: Short-term investment decisions
- Capital Structure Planning: Long-term financing optimization
### 6.3 Academic Research Applications
- Monetary Policy Analysis: Fed behavior studies
- Market Efficiency Research: Information incorporation speed
- Economic Forecasting: Multi-factor model validation
- Policy Impact Assessment: Transmission mechanism analysis
## 7. Model Limitations and Risk Factors
### 7.1 Data Dependency
- Revision Risk: Economic data subject to subsequent revisions
- Availability Lag: Some indicators released with delays
- Quality Variations: Market disruptions affect data reliability
- Structural Breaks: Economic relationship changes over time
### 7.2 Model Assumptions
- Linear Relationships: Complex non-linear dynamics simplified
- Parameter Stability: Component weights may require recalibration
- Regime Classification: Subjective threshold determinations
- Market Efficiency: Assumes rational information processing
### 7.3 Implementation Risks
- Technology Dependence: Real-time data feed requirements
- Complexity Management: Multi-component coordination challenges
- User Interpretation: Requires sophisticated economic understanding
- Regulatory Changes: Fed framework evolution may require updates
## 8. Future Research Directions
### 8.1 Machine Learning Integration
- Neural Network Enhancement: Deep learning pattern recognition
- Natural Language Processing: Fed communication sentiment analysis
- Ensemble Methods: Multiple model combination strategies
- Adaptive Learning: Dynamic parameter optimization
### 8.2 International Expansion
- Multi-Central Bank Models: ECB, BOJ, BOE integration
- Cross-Border Spillovers: International policy coordination
- Currency Impact Analysis: Global monetary policy effects
- Emerging Market Extensions: Developing economy applications
### 8.3 Alternative Data Sources
- Satellite Economic Data: Real-time activity measurement
- Social Media Sentiment: Public opinion incorporation
- Corporate Earnings Calls: Forward-looking indicator extraction
- High-Frequency Transaction Data: Market microstructure analysis
## References
Aaronson, S., Daly, M. C., Wascher, W. L., & Wilcox, D. W. (2019). Okun revisited: Who benefits most from a strong economy? Brookings Papers on Economic Activity, 2019(1), 333-404.
Bernanke, B. S. (2015). The Taylor rule: A benchmark for monetary policy? Brookings Institution Blog. Retrieved from www.brookings.edu
Blinder, A. S., Ehrmann, M., Fratzscher, M., De Haan, J., & Jansen, D. J. (2008). Central bank communication and monetary policy: A survey of theory and evidence. Journal of Economic Literature, 46(4), 910-945.
Brave, S., & Butters, R. A. (2011). Monitoring financial stability: A financial conditions index approach. Economic Perspectives, 35(1), 22-43.
Clarida, R., Galí, J., & Gertler, M. (1999). The science of monetary policy: A new Keynesian perspective. Journal of Economic Literature, 37(4), 1661-1707.
Clarida, R. H. (2019). The Federal Reserve's monetary policy response to COVID-19. Brookings Papers on Economic Activity, 2020(2), 1-52.
Clarida, R. H. (2025). Modern monetary policy rules and Fed decision-making. American Economic Review, 115(2), 445-478.
Daly, M. C., Hobijn, B., Şahin, A., & Valletta, R. G. (2012). A search and matching approach to labor markets: Did the natural rate of unemployment rise? Journal of Economic Perspectives, 26(3), 3-26.
Federal Reserve. (2024). Monetary Policy Report. Washington, DC: Board of Governors of the Federal Reserve System.
Hatzius, J., Hooper, P., Mishkin, F. S., Schoenholtz, K. L., & Watson, M. W. (2010). Financial conditions indexes: A fresh look after the financial crisis. National Bureau of Economic Research Working Paper, No. 16150.
Orphanides, A. (2003). Historical monetary policy analysis and the Taylor rule. Journal of Monetary Economics, 50(5), 983-1022.
Powell, J. H. (2024). Data-dependent monetary policy in practice. Federal Reserve Board Speech. Jackson Hole Economic Symposium, Federal Reserve Bank of Kansas City.
Taylor, J. B. (1993). Discretion versus policy rules in practice. Carnegie-Rochester Conference Series on Public Policy, 39, 195-214.
Yellen, J. L. (2017). The goals of monetary policy and how we pursue them. Federal Reserve Board Speech. University of California, Berkeley.
---
Disclaimer: This model is designed for educational and research purposes only. Past performance does not guarantee future results. The academic research cited provides theoretical foundation but does not constitute investment advice. Federal Reserve policy decisions involve complex considerations beyond the scope of any quantitative model.
Citation: EdgeTools Research Team. (2025). Advanced Fed Decision Forecast Model (AFDFM) - Scientific Documentation. EdgeTools Quantitative Research Series
MC Geopolitical Tension Events
📌 Script Title: Geopolitical Tension Events
📖 Description:
This script highlights key geopolitical and military tension events from 1914 to 2024 that have historically impacted global markets.
It automatically plots vertical dashed lines and labels on the chart at the time of each major event. This allows traders and analysts to visually assess how markets have responded to global crises, wars, and significant political instability over time.
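The drawing mechanism is straightforward; as an illustration (not the published source code), one event from the list below could be marked like this:
//@version=5
indicator("Event line sketch", overlay = true)
// Illustrative single event (2022-02-24, Ukraine Invasion); the script iterates over its full event list.
eventTime = timestamp("GMT+0", 2022, 2, 24, 0, 0)
if time >= eventTime and time[1] < eventTime
    line.new(bar_index, low, bar_index, high, extend = extend.both, style = line.style_dashed, color = color.red)
    label.new(bar_index, high, "Ukraine Invasion", style = label.style_label_down, color = color.red, textcolor = color.white)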
🧠 Use Cases:
Historical backtesting: Understand how markets responded to past geopolitical shocks.
Contextual analysis: Add macro context to technical setups.
🗓️ List of Geopolitical Tension Events in the Script
| Date | Event | Description |
|------|-------|-------------|
| 1914-07-28 | WWI Begins | Outbreak of World War I following the assassination of Archduke Franz Ferdinand. |
| 1929-10-24 | Wall Street Crash | Black Thursday, the start of the 1929 stock market crash. |
| 1939-09-01 | WWII Begins | Germany invades Poland, starting World War II. |
| 1941-12-07 | Pearl Harbor | Japanese attack on Pearl Harbor; U.S. enters WWII. |
| 1945-08-06 | Hiroshima Bombing | First atomic bomb dropped on Hiroshima by the U.S. |
| 1950-06-25 | Korean War Begins | North Korea invades South Korea. |
| 1962-10-16 | Cuban Missile Crisis | 13-day standoff between the U.S. and USSR over missiles in Cuba. |
| 1973-10-06 | Yom Kippur War | Egypt and Syria launch surprise attack on Israel. |
| 1979-11-04 | Iran Hostage Crisis | U.S. Embassy in Tehran seized; 52 hostages taken. |
| 1990-08-02 | Gulf War Begins | Iraq invades Kuwait, triggering U.S. intervention. |
| 2001-09-11 | 9/11 Attacks | Coordinated terrorist attacks on the U.S. |
| 2003-03-20 | Iraq War Begins | U.S.-led invasion of Iraq to remove Saddam Hussein. |
| 2008-09-15 | Lehman Collapse | Bankruptcy of Lehman Brothers; peak of global financial crisis. |
| 2014-03-01 | Crimea Crisis | Russia annexes Crimea from Ukraine. |
| 2020-01-03 | Soleimani Strike | U.S. drone strike kills Iranian General Qasem Soleimani. |
| 2022-02-24 | Ukraine Invasion | Russia launches full-scale invasion of Ukraine. |
| 2023-10-07 | Hamas-Israel War | Hamas launches attack on Israel, sparking war in Gaza. |
| 2024-01-12 | Red Sea Crisis | Houthis attack ships in Red Sea, prompting Western naval response. |
DCA Investment Tracker Pro [tradeviZion]
DCA Investment Tracker Pro: Educational DCA Analysis Tool
An educational indicator that helps analyze Dollar-Cost Averaging strategies by comparing actual performance with historical data calculations.
---
💡 Why I Created This Indicator
As someone who practices Dollar-Cost Averaging, I was frustrated with constantly switching between spreadsheets, calculators, and charts just to understand how my investments were really performing. I wanted to see everything in one place - my actual performance, what I should expect based on historical data, and, most importantly, where my strategy could take me over the long term.
What really motivated me was watching friends and family underestimate the incredible power of consistent investing. When Napoleon Bonaparte first learned about compound interest, he reportedly exclaimed "I wonder it has not swallowed the world" - and he was right! Yet most people can't visualize how their $500 monthly contributions today could become substantial wealth decades later.
Traditional DCA tracking tools exist, but they share similar limitations:
Require manual data entry and complex spreadsheets
Use fixed assumptions that don't reflect real market behavior
Can't show future projections overlaid on actual price charts
Lose the visual context of what's happening in the market
Make compound growth feel abstract rather than tangible
I wanted to create something different - a tool that automatically analyzes real market history, detects volatility periods, and shows you both current performance AND educational projections based on historical patterns right on your TradingView charts. As Warren Buffett said: "Someone's sitting in the shade today because someone planted a tree a long time ago." This tool helps you visualize your financial tree growing over time.
This isn't just another calculator - it's a visualization tool that makes the magic of compound growth impossible to ignore.
---
🎯 What This Indicator Does
This educational indicator provides DCA analysis tools. Users can input investment scenarios to study:
Theoretical Performance: Educational calculations based on historical return data
Comparative Analysis: Study differences between actual and theoretical scenarios
Historical Projections: Theoretical projections for educational analysis (not predictions)
Performance Metrics: CAGR, ROI, and other analytical metrics for study
Historical Analysis: Calculates historical return data for reference purposes
---
🚀 Key Features
Volatility-Adjusted Historical Return Calculation
Analyzes 3-20 years of actual price data for any symbol
Automatically detects high-volatility stocks (meme stocks, growth stocks)
Uses median returns for volatile stocks, standard CAGR for stable stocks
Provides conservative estimates when extreme outlier years are detected
Smart fallback to manual percentages when data insufficient
Customizable Performance Dashboard
Educational DCA performance analysis with compound growth calculations
Customizable table sizing (Tiny to Huge text options)
9 positioning options (Top/Middle/Bottom + Left/Center/Right)
Theme-adaptive colors (automatically adjusts to dark/light mode)
Multiple display layout options
Future Projection System
Visual future growth projections
Timeframe-aware calculations (Daily/Weekly/Monthly charts)
1-30 year projection options
Shows projected portfolio value and total investment amounts
Investment Insights
Performance vs benchmark comparison
ROI from initial investment tracking
Monthly average return analysis
Investment milestone alerts (25%, 50%, 100% gains)
Contribution tracking and next milestone indicators
---
📊 Step-by-Step Setup Guide
1. Investment Settings 💰
Initial Investment: Enter your starting lump sum (e.g., $60,000)
Monthly Contribution: Set your regular DCA amount (e.g., $500/month)
Return Calculation: Choose "Auto (Stock History)" for real data or "Manual" for fixed %
Historical Period: Select 3-20 years for auto calculations (default: 10 years)
Start Year: When you began investing (e.g., 2020)
Current Portfolio Value: Your actual portfolio worth today (e.g., $150,000)
2. Display Settings 📊
Table Sizes: Choose from Tiny, Small, Normal, Large, or Huge
Table Positions: 9 options - Top/Middle/Bottom + Left/Center/Right
Visibility Toggles: Show/hide Main Table and Stats Table independently
3. Future Projection 🔮
Enable Projections: Toggle on to see future growth visualization
Projection Years: Set 1-30 years ahead for analysis
Live Example - NASDAQ:META Analysis:
Settings shown: $60K initial + $500/month + Auto calculation + 10-year history + 2020 start + $150K current value
---
🔬 Pine Script Code Examples
Core DCA Calculations:
// Calculate total invested over time
months_elapsed = (year - start_year) * 12 + month - 1
total_invested = initial_investment + (monthly_contribution * months_elapsed)
// Compound growth formula for initial investment
theoretical_initial_growth = initial_investment * math.pow(1 + annual_return, years_elapsed)
// Future Value of Annuity for monthly contributions
monthly_rate = annual_return / 12
fv_contributions = monthly_contribution * ((math.pow(1 + monthly_rate, months_elapsed) - 1) / monthly_rate)
// Total expected value
theoretical_total = theoretical_initial_growth + fv_contributions
Volatility Detection Logic:
// Detect extreme years for volatility adjustment
extreme_years = 0
for i = 1 to historical_years
    yearly_return = ((price_current / price_i_years_ago) - 1) * 100
    if yearly_return > 100 or yearly_return < -50
        extreme_years += 1
// Use median approach for high volatility stocks
high_volatility = (extreme_years / historical_years) > 0.2
calculated_return = high_volatility ? median_of_returns : standard_cagr
Performance Metrics:
// Calculate key performance indicators
absolute_gain = actual_value - total_invested
total_return_pct = (absolute_gain / total_invested) * 100
roi_initial = ((actual_value - initial_investment) / initial_investment) * 100
cagr = (math.pow(actual_value / initial_investment, 1 / years_elapsed) - 1) * 100
---
📊 Real-World Examples
See the indicator in action across different investment types:
Stable Index Investments:
AMEX:SPY (SPDR S&P 500) - Shows steady compound growth with standard CAGR calculations
Classic DCA success story: $60K initial + $500/month starting 2020. The indicator shows SPY's historical 10%+ returns, demonstrating how consistent broad market investing builds wealth over time. Notice the smooth theoretical growth line vs actual performance tracking.
MIL:VUAA (Vanguard S&P 500 UCITS) - Shows both data limitation and solution approaches
Data limitation example: VUAA shows "Manual (Auto Failed)" and "No Data" when the default 10-year historical setting exceeds the available data. The indicator gracefully falls back to manual percentage input while maintaining all DCA calculations and projections.
MIL:VUAA (Vanguard S&P 500 UCITS) - European ETF with successful 5-year auto calculation
Solution demonstration: By adjusting historical period to 5 years (matching available data), VUAA auto calculation works perfectly. Shows how users can optimize settings for newer assets. European market exposure with EUR denomination, demonstrating DCA effectiveness across different markets and currencies.
NYSE:BRK.B (Berkshire Hathaway) - Quality value investment with Warren Buffett's proven track record
Value investing approach: Berkshire Hathaway's legendary performance through DCA lens. The indicator demonstrates how quality companies compound wealth over decades. Lower volatility than tech stocks = standard CAGR calculations used.
High-Volatility Growth Stocks:
NASDAQ:NVDA (NVIDIA Corporation) - Demonstrates volatility-adjusted calculations for extreme price swings
High-volatility example: NVIDIA's explosive AI boom creates extreme years that trigger volatility detection. The indicator automatically switches to "Median (High Vol): 50%" calculations for conservative projections, protecting against unrealistic future estimates based on outlier performance periods.
NASDAQ:TSLA (Tesla) - Shows how 10-year analysis can stabilize volatile tech stocks
Stable long-term growth: Despite Tesla's reputation for volatility, the 10-year historical analysis (34.8% CAGR) shows consistent enough performance that volatility detection doesn't trigger. Demonstrates how longer timeframes can smooth out extreme periods for more reliable projections.
NASDAQ:META (Meta Platforms) - Shows stable tech stock analysis using standard CAGR calculations
Tech stock with stable growth: Despite being a tech stock and experiencing the 2022 crash, META's 10-year history shows consistent enough performance (23.98% CAGR) that volatility detection doesn't trigger. The indicator uses standard CAGR calculations, demonstrating how not all tech stocks require conservative median adjustments.
Notice how the indicator automatically detects high-volatility periods and switches to median-based calculations for more conservative projections, while stable investments use standard CAGR methods.
---
📈 Performance Metrics Explained
Current Portfolio Value: Your actual investment worth today
Expected Value: What you should have based on historical returns (Auto) or your target return (Manual)
Total Invested: Your actual money invested (initial + all monthly contributions)
Total Gains/Loss: Absolute dollar difference between current value and total invested
Total Return %: Percentage gain/loss on your total invested amount
ROI from Initial Investment: How your starting lump sum has performed
CAGR: Compound Annual Growth Rate of your initial investment (Note: This shows initial investment performance, not full DCA strategy)
vs Benchmark: How you're performing compared to the expected returns
---
⚠️ Important Notes & Limitations
Data Requirements: Auto mode requires sufficient historical data (minimum 3 years recommended)
CAGR Limitation: CAGR calculation is based on initial investment growth only, not the complete DCA strategy
Projection Accuracy: Future projections are theoretical and based on historical returns - actual results may vary
Timeframe Support: Works ONLY on Daily (1D), Weekly (1W), and Monthly (1M) charts - no other timeframes supported
Update Frequency: Update "Current Portfolio Value" regularly for accurate tracking
---
📚 Educational Use & Disclaimer
This analysis tool can be applied to various stock and ETF charts for educational study of DCA mathematical concepts and historical performance patterns.
Study Examples: Can be used with symbols like AMEX:SPY , NASDAQ:QQQ , AMEX:VTI , NASDAQ:AAPL , NASDAQ:MSFT , NASDAQ:GOOGL , NASDAQ:AMZN , NASDAQ:TSLA , NASDAQ:NVDA for learning purposes.
EDUCATIONAL DISCLAIMER: This indicator is a study tool for analyzing Dollar-Cost Averaging strategies. It does not provide investment advice, trading signals, or guarantees. All calculations are theoretical examples for educational purposes only. Past performance does not predict future results. Users should conduct their own research and consult qualified financial professionals before making any investment decisions.
---
© 2025 TradeVizion. All rights reserved.
Kaufman Trend Strategy
# ✅ Kaufman Trend Strategy – Full Description (Script Publishing Version)
**Kaufman Trend Strategy** is a dynamic trend-following strategy based on Kaufman Filter theory.
It detects real-time trend momentum, reduces noise, and aims to enhance entry accuracy while optimizing risk.
⚠️ _For educational and research purposes only. Past performance does not guarantee future results._
---
## 🎯 Strategy Objective
- Smooth price noise using Kaufman Filter smoothing
- Detect the strength and direction of trends with a normalized oscillator
- Manage profits using multi-stage take-profits and adaptive ATR stop-loss logic
---
## ✨ Key Features
- **Kaufman Filter Trend Detection**
Extracts directional signal using a state space model.
- **Multi-Stage Profit-Taking**
Automatically takes partial profits based on color changes and zero-cross events.
- **ATR-Based Volatility Stops**
Stops adjust based on swing highs/lows and current market volatility.
---
## 📊 Entry & Exit Logic
**Long Entry**
- `trend_strength ≥ 60`
- Green trend signal
- Price above the Kaufman average
**Short Entry**
- `trend_strength ≤ -60`
- Red trend signal
- Price below the Kaufman average
**Exit (Long/Short)**
- Blue trend color → TP1 (50%)
- Oscillator crosses 0 → TP2 (25%)
- Trend weakens → Final exit (25%)
- ATR + swing-based stop loss
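A simplified sketch of these conditions follows. The Kaufman-filtered average and the normalized trend-strength oscillator are rough placeholders here (the published strategy computes them with its state-space Kaufman filter), and the multi-stage take-profits are omitted for brevity:
//@version=5
strategy("Kaufman entry/exit sketch", overlay = true)
atrValue = ta.atr(14)
kaufmanAvg = ta.ema(close, 20)                            // placeholder for the Kaufman-filtered average
trendStrength = 100 * (close - kaufmanAvg) / atrValue     // rough stand-in for the normalized oscillator
longStop = ta.lowest(low, 10) - atrValue                  // swing low minus one ATR
shortStop = ta.highest(high, 10) + atrValue               // swing high plus one ATR
if trendStrength >= 60 and close > kaufmanAvg
    strategy.entry("Long", strategy.long)
    strategy.exit("Long SL", from_entry = "Long", stop = longStop)
if trendStrength <= -60 and close < kaufmanAvg
    strategy.entry("Short", strategy.short)
    strategy.exit("Short SL", from_entry = "Short", stop = shortStop)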
---
## 💰 Risk Management
- Initial capital: `$3,000`
- Order size: `$100` per trade (realistic, low-risk sizing)
- Commission: `0.002%`
- Slippage: `2 ticks`
- Pyramiding: `1` max position
- Estimated risk/trade: `~0.1–0.5%` of equity
> ⚠️ _No trade risks more than 5% of equity. This strategy follows TradingView script publishing rules._
---
## ⚙️ Default Parameters
- **1st Take Profit**: 50%
- **2nd Take Profit**: 25%
- **Final Exit**: 25%
- **ATR Period**: 14
- **Swing Lookback**: 10
- **Entry Threshold**: ±60
- **Exit Threshold**: ±40
---
## 📅 Backtest Summary
- **Symbol**: USD/JPY
- **Timeframe**: 1H
- **Date Range**: Jan 3, 2022 – Jun 4, 2025
- **Trades**: 924
- **Win Rate**: 41.67%
- **Profit Factor**: 1.108
- **Net Profit**: +$1,659.29 (+54.56%)
- **Max Drawdown**: -$1,419.73 (-31.87%)
---
## ✅ Summary
This strategy uses Kaufman filtering to detect market direction with reduced lag and increased smoothness.
It’s built with visual clarity and strong trade management, making it practical for both beginners and advanced users.
---
## 📌 Disclaimer
This script is for educational and informational purposes only and should not be considered financial advice.
Use with proper risk controls and always test in a demo environment before live trading.