Confirmation of Trend using Neural Networks (by kind permission of Leo)
- This topic has 37 replies, 8 voices, and was last updated 6 years ago by GraHal.
Tagged: data mining, machine learning, neural network
09/02/2018 at 12:30 PM #79564
09/02/2018 at 7:03 PM #79578
Right full Neural Network now, huge thanks to Leo!
v4.0 below, spread = 4.
Tweaks and variations welcome (better exit strategy?).
Please post full System Code and Performance Stats.
// Hyperparameters to be optimized
DEFPARAM CUMULATEORDERS = False
ETA=1 // known as the learning rate
candlesback=6 // for the classifier
ProfitRiskRatio=2 // for the classifier
spread=0.9 // for the classifier

///////////////// CLASSIFIER /////////////
myATR=average[20](range)+std[20](range)
ExtraStopLoss=myATR
//ExtraStopLoss=3*spread*pipsize

// for long trades
classifierlong=0
FOR scanL=1 TO candlesback DO
  IF classifierlong[scanL]=1 THEN
    BREAK
  ENDIF
  LongTradeLength=ProfitRiskRatio*(close[scanL]-(low[scanL]-ExtraStopLoss[scanL]))
  IF close[scanL]+LongTradeLength < high-spread*pipsize THEN
    IF lowest[scanL+1](low) > low[scanL]-ExtraStopLoss[scanL]+spread*pipsize THEN
      classifierlong=1
      candleentrylong=barindex-scanL
      BREAK
    ENDIF
  ENDIF
NEXT

// for short trades
classifiershort=0
FOR scanS=1 TO candlesback DO
  IF classifiershort[scanS]=1 THEN
    BREAK
  ENDIF
  ShortTradeLength=ProfitRiskRatio*((high[scanS]-close[scanS])+ExtraStopLoss[scanS])
  IF close[scanS]-ShortTradeLength > low+spread*pipsize THEN
    IF highest[scanS+1](high) < high[scanS]+ExtraStopLoss[scanS]-spread*pipsize THEN
      classifiershort=1
      candleentryshort=barindex-scanS
      BREAK
    ENDIF
  ENDIF
NEXT

///////////////// NEURAL NETWORK /////////////////
// ...INITIAL VALUES...
once a11=1
once a12=-1
once a13=1
once a14=-1
once a21=1
once a22=1
once a23=-1
once a24=1
once a31=-1
once a32=1
once a33=-1
once a34=1
once a41=1
once a42=-1
once a43=1
once a44=-1
once a51=-1
once a52=1
once a53=-1
once a54=1
once a61=1
once a62=-1
once a63=1
once a64=-1
once Fbias1=0
once Fbias2=0
once Fbias3=0
once Fbias4=0
once Fbias5=0
once Fbias6=0
once b11=1
once b12=-1
once b13=1
once b14=-1
once b15=1
once b16=-1
once b21=-1
once b22=1
once b23=-1
once b24=1
once b25=-1
once b26=1
once Obias1=0
once Obias2=0

// ...DEFINITION OF INPUTS...
SMA20=average[min(20,barindex)](close)
SMA200=average[min(200,barindex)](close)
SMA2400=average[min(2400,barindex)](close) // in the 5 min time frame this is the value of the 200-period SMA in hourly
variable1=RSI[14](close) // or to be defined
variable2=(close-SMA20)/SMA20*100 // or to be defined
variable3=(SMA20-SMA200)/SMA200*100 // or to be defined
variable4=(SMA200-SMA2400)/SMA2400*100 // or to be defined

// >>> LEARNING PROCESS <<<
// If the classifier has detected a winning trade in the past
//IF hour > 7 and hour < 21 THEN
IF barindex > 2500 THEN
  IF classifierlong=1 or classifiershort=1 THEN
    IF hour > 7 and hour < 21 THEN
      candleentry=max(candleentrylong,candleentryshort)
      Y1=classifierlong
      Y2=classifiershort
      // >>> INPUT FOR NEURONS <<<
      input1=variable1[barindex-candleentry]
      input2=variable2[barindex-candleentry]
      input3=variable3[barindex-candleentry]
      input4=variable4[barindex-candleentry]
      FOR i=1 TO 10 DO // THIS HAS TO BE IMPROVED
        ETAi=ETA-ETA/10*(i-1) // learning rate, decaying on each pass
        // >>> FIRST LAYER OF NEURONS <<<
        F1=a11*input1+a12*input2+a13*input3+a14*input4+Fbias1
        F2=a21*input1+a22*input2+a23*input3+a24*input4+Fbias2
        F3=a31*input1+a32*input2+a33*input3+a34*input4+Fbias3
        F4=a41*input1+a42*input2+a43*input3+a44*input4+Fbias4
        F5=a51*input1+a52*input2+a53*input3+a54*input4+Fbias5
        F6=a61*input1+a62*input2+a63*input3+a64*input4+Fbias6
        F1=1/(1+EXP(-1*F1))
        F2=1/(1+EXP(-1*F2))
        F3=1/(1+EXP(-1*F3))
        F4=1/(1+EXP(-1*F4))
        F5=1/(1+EXP(-1*F5))
        F6=1/(1+EXP(-1*F6))
        // >>> OUTPUT NEURONS <<<
        output1=b11*F1+b12*F2+b13*F3+b14*F4+b15*F5+b16*F6+Obias1
        output2=b21*F1+b22*F2+b23*F3+b24*F4+b25*F5+b26*F6+Obias2
        output1=1/(1+EXP(-1*output1))
        output2=1/(1+EXP(-1*output2))
        // >>> PARTIAL DERIVATIVES OF COST FUNCTION (OUTPUT LAYER) <<<
        // ...CROSS-ENTROPY AS COST FUNCTION...
        // COST = -(Y1*LOG(output1)+(1-Y1)*LOG(1-output1)) - (Y2*LOG(output2)+(1-Y2)*LOG(1-output2))
        DerObias1 = (output1-Y1) * 1
        DerObias2 = (output2-Y2) * 1
        Derb11 = (output1-Y1) * F1
        Derb12 = (output1-Y1) * F2
        Derb13 = (output1-Y1) * F3
        Derb14 = (output1-Y1) * F4
        Derb15 = (output1-Y1) * F5
        Derb16 = (output1-Y1) * F6
        Derb21 = (output2-Y2) * F1
        Derb22 = (output2-Y2) * F2
        Derb23 = (output2-Y2) * F3
        Derb24 = (output2-Y2) * F4
        Derb25 = (output2-Y2) * F5
        Derb26 = (output2-Y2) * F6
        // >>> PARTIAL DERIVATIVES OF COST FUNCTION (HIDDEN LAYER) <<<
        // these use the current (not-yet-updated) output weights b11..b26
        DerFbias1 = (output1-Y1) * b11 * F1*(1-F1) * 1 + (output2-Y2) * b21 * F1*(1-F1) * 1
        DerFbias2 = (output1-Y1) * b12 * F2*(1-F2) * 1 + (output2-Y2) * b22 * F2*(1-F2) * 1
        DerFbias3 = (output1-Y1) * b13 * F3*(1-F3) * 1 + (output2-Y2) * b23 * F3*(1-F3) * 1
        DerFbias4 = (output1-Y1) * b14 * F4*(1-F4) * 1 + (output2-Y2) * b24 * F4*(1-F4) * 1
        DerFbias5 = (output1-Y1) * b15 * F5*(1-F5) * 1 + (output2-Y2) * b25 * F5*(1-F5) * 1
        DerFbias6 = (output1-Y1) * b16 * F6*(1-F6) * 1 + (output2-Y2) * b26 * F6*(1-F6) * 1
        Dera11 = (output1-Y1) * b11 * F1*(1-F1) * input1 + (output2-Y2) * b21 * F1*(1-F1) * input1
        Dera12 = (output1-Y1) * b11 * F1*(1-F1) * input2 + (output2-Y2) * b21 * F1*(1-F1) * input2
        Dera13 = (output1-Y1) * b11 * F1*(1-F1) * input3 + (output2-Y2) * b21 * F1*(1-F1) * input3
        Dera14 = (output1-Y1) * b11 * F1*(1-F1) * input4 + (output2-Y2) * b21 * F1*(1-F1) * input4
        Dera21 = (output1-Y1) * b12 * F2*(1-F2) * input1 + (output2-Y2) * b22 * F2*(1-F2) * input1
        Dera22 = (output1-Y1) * b12 * F2*(1-F2) * input2 + (output2-Y2) * b22 * F2*(1-F2) * input2
        Dera23 = (output1-Y1) * b12 * F2*(1-F2) * input3 + (output2-Y2) * b22 * F2*(1-F2) * input3
        Dera24 = (output1-Y1) * b12 * F2*(1-F2) * input4 + (output2-Y2) * b22 * F2*(1-F2) * input4
        Dera31 = (output1-Y1) * b13 * F3*(1-F3) * input1 + (output2-Y2) * b23 * F3*(1-F3) * input1
        Dera32 = (output1-Y1) * b13 * F3*(1-F3) * input2 + (output2-Y2) * b23 * F3*(1-F3) * input2
        Dera33 = (output1-Y1) * b13 * F3*(1-F3) * input3 + (output2-Y2) * b23 * F3*(1-F3) * input3
        Dera34 = (output1-Y1) * b13 * F3*(1-F3) * input4 + (output2-Y2) * b23 * F3*(1-F3) * input4
        Dera41 = (output1-Y1) * b14 * F4*(1-F4) * input1 + (output2-Y2) * b24 * F4*(1-F4) * input1
        Dera42 = (output1-Y1) * b14 * F4*(1-F4) * input2 + (output2-Y2) * b24 * F4*(1-F4) * input2
        Dera43 = (output1-Y1) * b14 * F4*(1-F4) * input3 + (output2-Y2) * b24 * F4*(1-F4) * input3
        Dera44 = (output1-Y1) * b14 * F4*(1-F4) * input4 + (output2-Y2) * b24 * F4*(1-F4) * input4
        Dera51 = (output1-Y1) * b15 * F5*(1-F5) * input1 + (output2-Y2) * b25 * F5*(1-F5) * input1
        Dera52 = (output1-Y1) * b15 * F5*(1-F5) * input2 + (output2-Y2) * b25 * F5*(1-F5) * input2
        Dera53 = (output1-Y1) * b15 * F5*(1-F5) * input3 + (output2-Y2) * b25 * F5*(1-F5) * input3
        Dera54 = (output1-Y1) * b15 * F5*(1-F5) * input4 + (output2-Y2) * b25 * F5*(1-F5) * input4
        Dera61 = (output1-Y1) * b16 * F6*(1-F6) * input1 + (output2-Y2) * b26 * F6*(1-F6) * input1
        Dera62 = (output1-Y1) * b16 * F6*(1-F6) * input2 + (output2-Y2) * b26 * F6*(1-F6) * input2
        Dera63 = (output1-Y1) * b16 * F6*(1-F6) * input3 + (output2-Y2) * b26 * F6*(1-F6) * input3
        Dera64 = (output1-Y1) * b16 * F6*(1-F6) * input4 + (output2-Y2) * b26 * F6*(1-F6) * input4
        // Implementing backpropagation
        // all derivatives are computed above before any weight is updated,
        // and each weight is updated from its own previous value
        Obias1=Obias1-ETAi*DerObias1
        Obias2=Obias2-ETAi*DerObias2
        b11=b11-ETAi*Derb11
        b12=b12-ETAi*Derb12
        b13=b13-ETAi*Derb13
        b14=b14-ETAi*Derb14
        b15=b15-ETAi*Derb15
        b16=b16-ETAi*Derb16
        b21=b21-ETAi*Derb21
        b22=b22-ETAi*Derb22
        b23=b23-ETAi*Derb23
        b24=b24-ETAi*Derb24
        b25=b25-ETAi*Derb25
        b26=b26-ETAi*Derb26
        Fbias1=Fbias1-ETAi*DerFbias1
        Fbias2=Fbias2-ETAi*DerFbias2
        Fbias3=Fbias3-ETAi*DerFbias3
        Fbias4=Fbias4-ETAi*DerFbias4
        Fbias5=Fbias5-ETAi*DerFbias5
        Fbias6=Fbias6-ETAi*DerFbias6
        a11=a11-ETAi*Dera11
        a12=a12-ETAi*Dera12
        a13=a13-ETAi*Dera13
        a14=a14-ETAi*Dera14
        a21=a21-ETAi*Dera21
        a22=a22-ETAi*Dera22
        a23=a23-ETAi*Dera23
        a24=a24-ETAi*Dera24
        a31=a31-ETAi*Dera31
        a32=a32-ETAi*Dera32
        a33=a33-ETAi*Dera33
        a34=a34-ETAi*Dera34
        a41=a41-ETAi*Dera41
        a42=a42-ETAi*Dera42
        a43=a43-ETAi*Dera43
        a44=a44-ETAi*Dera44
        a51=a51-ETAi*Dera51
        a52=a52-ETAi*Dera52
        a53=a53-ETAi*Dera53
        a54=a54-ETAi*Dera54
        a61=a61-ETAi*Dera61
        a62=a62-ETAi*Dera62
        a63=a63-ETAi*Dera63
        a64=a64-ETAi*Dera64
        //GradientNorm = SQRT(DerObias1*DerObias1+DerObias2*DerObias2+Derb11*Derb11+Derb12*Derb12+Derb13*Derb13+Derb14*Derb14+Derb15*Derb15+Derb16*Derb16+Derb21*Derb21+Derb22*Derb22+Derb23*Derb23+Derb24*Derb24+Derb25*Derb25+Derb26*Derb26+DerFbias1*DerFbias1+DerFbias2*DerFbias2+DerFbias3*DerFbias3+DerFbias4*DerFbias4+DerFbias5*DerFbias5+DerFbias6*DerFbias6+Dera11*Dera11+Dera12*Dera12+Dera13*Dera13+Dera14*Dera14+Dera21*Dera21+Dera22*Dera22+Dera23*Dera23+Dera24*Dera24+Dera31*Dera31+Dera32*Dera32+Dera33*Dera33+Dera34*Dera34+Dera41*Dera41+Dera42*Dera42+Dera43*Dera43+Dera44*Dera44+Dera51*Dera51+Dera52*Dera52+Dera53*Dera53+Dera54*Dera54+Dera61*Dera61+Dera62*Dera62+Dera63*Dera63+Dera64*Dera64)
      NEXT
    ENDIF
  ENDIF
  //ENDIF

  ///////////////// NEW PREDICTION /////////////////
  // >>> INPUT NEURONS <<<
  input1=variable1
  input2=variable2
  input3=variable3
  input4=variable4
  // >>> FIRST LAYER OF NEURONS <<<
  F1=a11*input1+a12*input2+a13*input3+a14*input4+Fbias1
  F2=a21*input1+a22*input2+a23*input3+a24*input4+Fbias2
  F3=a31*input1+a32*input2+a33*input3+a34*input4+Fbias3
  F4=a41*input1+a42*input2+a43*input3+a44*input4+Fbias4
  F5=a51*input1+a52*input2+a53*input3+a54*input4+Fbias5
  F6=a61*input1+a62*input2+a63*input3+a64*input4+Fbias6
  F1=1/(1+EXP(-1*F1))
  F2=1/(1+EXP(-1*F2))
  F3=1/(1+EXP(-1*F3))
  F4=1/(1+EXP(-1*F4))
  F5=1/(1+EXP(-1*F5))
  F6=1/(1+EXP(-1*F6))
  // >>> OUTPUT NEURONS <<<
  output1=b11*F1+b12*F2+b13*F3+b14*F4+b15*F5+b16*F6+Obias1
  output2=b21*F1+b22*F2+b23*F3+b24*F4+b25*F5+b26*F6+Obias2
  output1=1/(1+EXP(-1*output1))
  output2=1/(1+EXP(-1*output2))
ENDIF

//return output1 as "prediction long", output2 as "prediction short", threshold as "threshold"
IF output1 > 0.9 and output2 < 1 THEN
  BUY AT MARKET
ENDIF
IF output2 > 0.7 and output1 < 0.7 THEN
  SELLSHORT AT MARKET
ENDIF
IF LongOnMarket THEN
  SET TARGET pPROFIT 390
  SET STOP pLOSS 50
ENDIF
IF ShortOnMarket THEN
  SET TARGET pPROFIT 100
  SET STOP pLOSS 60
ENDIF

09/02/2018 at 7:09 PM #79581
09/02/2018 at 7:35 PM #79583
Ha, I think it's a lost cause!?
I have tried in the past on several threads to encourage the use of version numbers etc., but while we still have folks making elementary faux pas like posting unformatted code (not using the Insert PRT Code button), I think we have to accept we may never have Configuration Control?
But I think now this Topic can justifiably be renamed 🙂 …
Confirmation of Trend using Neural Networks (by kind permission of Leo)
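For anyone who wants to study the learning rule outside ProRealTime, here is a rough NumPy port of the 4-6-2 network posted above: sigmoid activations, cross-entropy cost, plain gradient descent with the same decaying learning rate. This is a hypothetical sketch, not Leo's code: all Python names are invented, the input vector is made up, and (unlike the forum code) every gradient is computed before any weight is updated, which is the textbook backprop ordering.

```python
import numpy as np

def sigmoid(z):
    # Logistic activation, as in the ProRealCode: 1/(1+EXP(-z))
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, a, fbias, b, obias):
    # Forward pass of the 4-6-2 net: 4 inputs, 6 hidden neurons, 2 outputs
    f = sigmoid(a @ x + fbias)   # hidden activations F1..F6
    o = sigmoid(b @ f + obias)   # output1 (long), output2 (short)
    return f, o

def cross_entropy(o, y, eps=1e-12):
    # Summed binary cross-entropy over both output neurons
    return -np.sum(y * np.log(o + eps) + (1.0 - y) * np.log(1.0 - o + eps))

def train_step(x, y, a, fbias, b, obias, eta):
    # One gradient-descent step; all gradients computed before any update
    f, o = forward(x, a, fbias, b, obias)
    delta_o = o - y                            # matches DerObias1/DerObias2
    grad_b = np.outer(delta_o, f)              # matches Derb11..Derb26
    delta_f = (b.T @ delta_o) * f * (1.0 - f)  # matches DerFbias1..DerFbias6
    grad_a = np.outer(delta_f, x)              # matches Dera11..Dera64
    b -= eta * grad_b
    obias -= eta * delta_o
    a -= eta * grad_a
    fbias -= eta * delta_f
    return a, fbias, b, obias

# The alternating +/-1 initial weights from the forum code, a made-up input
# vector (an RSI-like value plus three percentage spreads), and a
# "winning long" label (Y1=1, Y2=0).
x = np.array([55.0, 0.4, -0.2, 1.1])
y = np.array([1.0, 0.0])
a = np.array([[1.0, -1, 1, -1], [1, 1, -1, 1], [-1, 1, -1, 1],
              [1, -1, 1, -1], [-1, 1, -1, 1], [1, -1, 1, -1]])
b = np.array([[1.0, -1, 1, -1, 1, -1], [-1, 1, -1, 1, -1, 1]])
fbias, obias = np.zeros(6), np.zeros(2)

loss_before = cross_entropy(forward(x, a, fbias, b, obias)[1], y)
for i in range(1, 11):                    # the inner FOR i=1 to 10 loop
    eta_i = 1.0 - 1.0 / 10 * (i - 1)      # ETAi = ETA - ETA/10*(i-1)
    a, fbias, b, obias = train_step(x, y, a, fbias, b, obias, eta_i)
loss_after = cross_entropy(forward(x, a, fbias, b, obias)[1], y)
```

One thing the port makes visible: with a raw RSI value (~55) among the inputs, the hidden sigmoids saturate and their gradients vanish, so almost all the learning happens in the output layer, which is a good argument for normalising variable1 to the same scale as the other inputs.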
09/03/2018 at 8:36 AM #79605
Leo posted a later version of the Neural Network (v1.2), so here is the System based on Leo's latest self-learning neural algorithm.
Tested with Spread = 4
1 user thanked author for this post.
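The classifier section at the top of the posted code is what manufactures the training labels: it scans the last candlesback bars and marks a bar as a winning long setup when price, after that bar's close, could have reached ProfitRiskRatio times the risk before touching the stop. A hypothetical forward-scan simplification in Python (all names invented; it ignores the spread adjustment and the ATR-based ExtraStopLoss buffer of the original):

```python
def label_long(closes, highs, lows, entry, stop_dist, profit_risk_ratio=2.0):
    """Return 1 if a long entered at closes[entry] would have hit its target
    before its stop over the remaining bars, else 0.

    entry: index of the candidate entry bar.
    stop_dist: distance from the entry bar's low down to the protective stop
    (the forum code uses an ATR-plus-stddev buffer here)."""
    stop = lows[entry] - stop_dist
    risk = closes[entry] - stop
    target = closes[entry] + profit_risk_ratio * risk
    for i in range(entry + 1, len(closes)):
        if lows[i] <= stop:       # stop hit first: losing trade
            return 0
        if highs[i] >= target:    # target hit first: winning trade
            return 1
    return 0                      # neither level reached yet
```

The short-side label is the mirror image (target below, stop above the entry bar's high).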
09/03/2018 at 8:52 AM #79613
Wow GraHal!
It looks awesome!
Thanks for testing it in a strategy.
A lot of work ahead! But such a rustic neural network proves that it works; imagine all the possibilities!
2 users thanked author for this post.
09/09/2018 at 10:39 PM #80056
Hi GraHal,
I posted a slightly improved version of the neural network… but with totally different inputs.
In case you feel curious.
NOTE:
Please add to your codes:
DEFPARAM PreLoadBars = 10000
Then you give the algorithm time to learn.
Thanks in advance
As you know, while I am coding with one hand, in the other one my baby is sleeping, otherwise she will cry, haha 🙂
1 user thanked author for this post.
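Leo's DEFPARAM PreLoadBars = 10000 tip works because this system learns online: the more history ProRealTime feeds the strategy before its first live bar, the further the weights have already converged. The same burn-in idea in plain Python (a hypothetical sketch with invented names, not ProRealTime syntax): keep updating the learner on every bar, but only emit signals once the warm-up history has been consumed.

```python
def signals_with_burn_in(stream, update, predict, preload=10000):
    """Train on every observation; yield predictions only after `preload` bars,
    mimicking the effect of DEFPARAM PreLoadBars."""
    for n, bar in enumerate(stream, start=1):
        update(bar)                  # learning never stops
        if n > preload:
            yield predict(bar)       # trade signals only once warmed up

# Toy usage: a running mean as the "learner", a threshold as the "signal".
state = {"sum": 0.0, "n": 0}

def update(x):
    state["sum"] += x
    state["n"] += 1

def predict(x):
    return x > state["sum"] / state["n"]

out = list(signals_with_burn_in([1, 2, 3, 4, 5, 6], update, predict, preload=4))
```

The first four bars only train the toy learner; signals appear from bar five onwards.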
09/10/2018 at 7:35 AM #80060
in the other one my baby is sleeping
I recall similar scenarios; it can be frustrating at the time, but it passes all too quickly and then the precious moments will only be in your mind, so enjoy it if you can!? 🙂
Make sure she doesn't swallow a memory stick full of code or she may end up like Nicolas!! 🙂