Neural network programming with ProRealTime
This topic has 126 replies, 8 voices, and was last updated 1 year ago by MobiusGrey.
Tagged: data mining, machine learning
11/02/2018 at 5:41 AM #83980
11/09/2018 at 5:46 AM #84453
Hi all,
I am glad to share with you my new version of a neural network.
I am satisfied (so far) with the performance of the neural network. THE CODE I UPLOAD NOW USES AN OLD VERSION OF THE CLASSIFIER.
Here are the changes:
–> Implemented gradient descent in mini-batches of 10 data values (see the sketch just after this list).
–> Only 5 epochs per training pass, with a constant update of the learning rate for faster learning (only 5 because if I increased it to 6 I got an infinite-loop error… I think ProRealTime does not have enough computing precision to run a longer gradient descent).
–> Implemented the learning process with momentum to increase the speed of convergence.
–> The input values come from an indicator I invented to measure the behaviour of different moving averages.
–> If you scroll down in the indicator panel (values from -1 to -10), you will see the different values of the weights, biases, cost and error, which I use to evaluate and verify that the neural network works correctly.
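Conceptually, the mini-batch update with momentum and the adaptive learning rate described above behave like the following Python sketch. This is an illustration outside ProBuilder; grad_fn, the batch layout and the simplified learning-rate rule are assumptions, not the exact code below:

import numpy as np

def train(params, grad_fn, batch, eta=0.5, beta=0.1, epochs=5):
    # 'velocity' holds the previous smoothed gradient (the momentum term)
    velocity = np.zeros_like(params)
    prev_cost = None
    for _ in range(epochs):
        grad, cost = grad_fn(params, batch)  # gradient averaged over the 10-sample batch
        # exponential blending, as in Der = Der/10*(1-betha) + betha*BDer
        velocity = (1.0 - beta) * grad + beta * velocity
        params = params - eta * velocity
        # crude adaptive rate: shrink if the cost rose, grow otherwise
        if prev_cost is not None and cost > prev_cost:
            eta /= 1.5
        else:
            eta *= 1.5
        prev_cost = cost
    return params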
I hope you like it; please test it and share your results. I am also glad to hear your opinions.
Cheers
Neural Network
// Hyperparameters to be optimized
// ETA=0.5 //known as the learning rate
// PipsToTrade=25 //pips movement for classifier
// betha=0.1

///////////////
// CLASSIFIER //
///////////
candlerange=average[200](average[100](average[50](abs(close-open))))
candlesback=round(average[200](round(PipsToTrade*pipsize/candlerange)))
IF barindex > 600 then
//Supports and resistances
Xcandlesback=round(0.6*candlesback)
highest1=highest[candlesback](high)
IF highest1 = highest1[Xcandlesback] then
Re1=highest1
ENDIF
IF high > Re1 then
Re1=high
ENDIF
lowest1=lowest[candlesback](low)
IF lowest1 = lowest1[Xcandlesback] then
S1=lowest1
ENDIF
IF low < S1 then
S1=low
ENDIF
//for long trades
classifierlong=0
IF close - S1 > PipsToTrade*pipsize+candlerange THEN
FOR scanL=1 to candlesback*2 DO
IF classifierlong[scanL]=1 then
BREAK
ENDIF
IF low[scanL]=S1 THEN
classifierlong=1
candleentrylong=barindex-scanL
BREAK
ENDIF
NEXT
ENDIF
//for short trades
classifiershort=0
IF Re1-close > PipsToTrade*pipsize+candlerange THEN
FOR scanS=1 to candlesback*2 DO
IF classifiershort[scanS]=1 then
BREAK
ENDIF
IF High[scanS]=Re1 THEN
classifiershort=1
candleentryshort=barindex-scanS
BREAK
ENDIF
NEXT
ENDIF
ENDIF

///////////////////////
// NEURONAL NETWORK //
///////////////////
// ...INITIAL VALUES...
once a11=1
once a12=-1
once a13=1
once a14=-1
once a21=-1
once a22=1
once a23=-1
once a24=1
once a31=1
once a32=-1
once a33=1
once a34=-1
once a41=-1
once a42=1
once a43=-1
once a44=1
once a51=1
once a52=-1
once a53=1
once a54=-1
once a61=-1
once a62=1
once a63=-1
once a64=1
once Fbias1=1
once Fbias2=1
once Fbias3=1
once Fbias4=1
once Fbias5=1
once Fbias6=1
once b11=1
once b12=1
once b13=1
once b14=1
once b15=1
once b16=1
once b21=1
once b22=1
once b23=1
once b24=1
once b25=1
once b26=1
once Obias1=0
once Obias2=0

// ...DEFINITION OF INPUTS...
//TREND001
SuperPeriod001=7
SuperPeriod002=21
SuperPeriod003=34
SuperPeriod004=77
SuperRange=pipsize*round((average[200](average[100](average[50](average[20](average[5](round(RANGE/pipsize)))))))/2)*2
Curve001=average[SuperPeriod001](close)
Main001=average[SuperPeriod001](Curve001)
Trend001=round(10*(Curve001-Main001)/SuperRange)/10
Curve002=average[SuperPeriod002](close)
Main002=average[SuperPeriod002](Curve002)
Trend002=round(10*(Curve002-Main002)/SuperRange)/10
Curve003=average[SuperPeriod003](close)
Main003=average[SuperPeriod003](Curve003)
Trend003=round(10*(Curve003-Main003)/SuperRange)/10
Curve004=average[SuperPeriod004](close)
Main004=average[SuperPeriod004](Curve004)
Trend004=round(10*(Curve004-Main004)/SuperRange)/10
variable1= Trend001 //or to be defined
variable2= Trend002 //or to be defined
variable3= Trend003 // to be defined
variable4= Trend004 // to be defined

// >>> LEARNING PROCESS <<<
// If the classifier has detected a winning trade in the past
//IF hour > 7 and hour < 21 then
//STORING THE LEARNING DATA
IF classifierlong=1 or classifiershort=1 THEN
candleentry0010=candleentry0009
Y10010=Y10009
Y20010=Y20009
candleentry0009=candleentry0008
Y10009=Y10008
Y20009=Y20008
candleentry0008=candleentry0007
Y10008=Y10007
Y20008=Y20007
candleentry0007=candleentry0006
Y10007=Y10006
Y20007=Y20006
candleentry0006=candleentry0005
Y10006=Y10005
Y20006=Y20005
candleentry0005=candleentry0004
Y10005=Y10004
Y20005=Y20004
candleentry0004=candleentry0003
Y10004=Y10003
Y20004=Y20003
candleentry0003=candleentry0002
Y10003=Y10002
Y20003=Y20002
candleentry0002=candleentry0001
Y10002=Y10001
Y20002=Y20001
candleentry0001=max(candleentrylong,candleentryshort)
Y10001=classifierlong
Y20001=classifiershort
ENDIF
once Keta=1
ETAi=ETA*Keta
IF BARINDEX > 3000 THEN
IF classifierlong=1 or classifiershort=1 THEN
IF hour > 8 and hour < 21 then
FOR ii=1 to 5 DO //EPOCHS
//Backing up the old values of the gradient to implement GRADIENT DESCENT WITH MOMENTUM
BDerObias1 = DerObias1
BDerObias2 = DerObias2
BDerb11 = Derb11
BDerb12 = Derb12
BDerb13 = Derb13
BDerb14 = Derb14
BDerb15 = Derb15
BDerb16 = Derb16
BDerb21 = Derb21
BDerb22 = Derb22
BDerb23 = Derb23
BDerb24 = Derb24
BDerb25 = Derb25
BDerb26 = Derb26
BDerFbias1 = DerFbias1
BDerFbias2 = DerFbias2
BDerFbias3 = DerFbias3
BDerFbias4 = DerFbias4
BDerFbias5 = DerFbias5
BDerFbias6 = DerFbias6
BDera11 = Dera11
BDera12 = Dera12
BDera13 = Dera13
BDera14 = Dera14
BDera21 = Dera21
BDera22 = Dera22
BDera23 = Dera23
BDera24 = Dera24
BDera31 = Dera31
BDera32 = Dera32
BDera33 = Dera33
BDera34 = Dera34
BDera41 = Dera41
BDera42 = Dera42
BDera43 = Dera43
BDera44 = Dera44
BDera51 = Dera51
BDera52 = Dera52
BDera53 = Dera53
BDera54 = Dera54
BDera61 = Dera61
BDera62 = Dera62
BDera63 = Dera63
BDera64 = Dera64
//Resetting error and cost functions
ERROR=0
COST=0
//Resetting the gradient before accumulating it over the mini-batch
DerObias1 = 0
DerObias2 = 0
Derb11 = 0
Derb12 = 0
Derb13 = 0
Derb14 = 0
Derb15 = 0
Derb16 = 0
Derb21 = 0
Derb22 = 0
Derb23 = 0
Derb24 = 0
Derb25 = 0
Derb26 = 0
DerFbias1 = 0
DerFbias2 = 0
DerFbias3 = 0
DerFbias4 = 0
DerFbias5 = 0
DerFbias6 = 0
Dera11 = 0
Dera12 = 0
Dera13 = 0
Dera14 = 0
Dera21 = 0
Dera22 = 0
Dera23 = 0
Dera24 = 0
Dera31 = 0
Dera32 = 0
Dera33 = 0
Dera34 = 0
Dera41 = 0
Dera42 = 0
Dera43 = 0
Dera44 = 0
Dera51 = 0
Dera52 = 0
Dera53 = 0
Dera54 = 0
Dera61 = 0
Dera62 = 0
Dera63 = 0
Dera64 = 0
FOR i=1 to 10 DO // Gradient accumulation and backpropagation over the mini-batch
IF i = 1 THEN
candleentry=candleentry0010
Y1=Y10010
Y2=Y20010
ELSIF i = 2 THEN
candleentry=candleentry0009
Y1=Y10009
Y2=Y20009
ELSIF i = 3 THEN
candleentry=candleentry0008
Y1=Y10008
Y2=Y20008
ELSIF i = 4 THEN
candleentry=candleentry0007
Y1=Y10007
Y2=Y20007
ELSIF i = 5 THEN
candleentry=candleentry0006
Y1=Y10006
Y2=Y20006
ELSIF i = 6 THEN
candleentry=candleentry0005
Y1=Y10005
Y2=Y20005
ELSIF i = 7 THEN
candleentry=candleentry0004
Y1=Y10004
Y2=Y20004
ELSIF i = 8 THEN
candleentry=candleentry0003
Y1=Y10003
Y2=Y20003
ELSIF i = 9 THEN
candleentry=candleentry0002
Y1=Y10002
Y2=Y20002
ELSIF i = 10 THEN
candleentry=candleentry0001
Y1=Y10001
Y2=Y20001
ENDIF
// >>> INPUT FOR NEURONS <<<
input1=variable1[barindex-candleentry]
input2=variable2[barindex-candleentry]
input3=variable3[barindex-candleentry]
input4=variable4[barindex-candleentry]
// >>> FIRST LAYER OF NEURONS <<<
F1=a11*input1+a12*input2+a13*input3+a14*input4+Fbias1
F2=a21*input1+a22*input2+a23*input3+a24*input4+Fbias2
F3=a31*input1+a32*input2+a33*input3+a34*input4+Fbias3
F4=a41*input1+a42*input2+a43*input3+a44*input4+Fbias4
F5=a51*input1+a52*input2+a53*input3+a54*input4+Fbias5
F6=a61*input1+a62*input2+a63*input3+a64*input4+Fbias6
F1=1/(1+EXP(-1*F1))
F2=1/(1+EXP(-1*F2))
F3=1/(1+EXP(-1*F3))
F4=1/(1+EXP(-1*F4))
F5=1/(1+EXP(-1*F5))
F6=1/(1+EXP(-1*F6))
// >>> OUTPUT NEURONS <<<
output1=b11*F1+b12*F2+b13*F3+b14*F4+b15*F5+b16*F6+Obias1
output2=b21*F1+b22*F2+b23*F3+b24*F4+b25*F5+b26*F6+Obias2
output1=1/(1+EXP(-1*output1))
output2=1/(1+EXP(-1*output2))
// >>> PARTIAL DERIVATIVES OF COST FUNCTION <<<
// ... CROSS-ENTROPY AS COST FUNCTION ...
// COST = - ( Y1*LOG(output1)+(1-Y1)*LOG(1-output1) ) - ( Y2*LOG(output2)+(1-Y2)*LOG(1-output2) )
DerObias1 = (output1-Y1) * 1 + DerObias1
DerObias2 = (output2-Y2) * 1 + DerObias2
Derb11 = (output1-Y1) * F1 + Derb11
Derb12 = (output1-Y1) * F2 + Derb12
Derb13 = (output1-Y1) * F3 + Derb13
Derb14 = (output1-Y1) * F4 + Derb14
Derb15 = (output1-Y1) * F5 + Derb15
Derb16 = (output1-Y1) * F6 + Derb16
Derb21 = (output2-Y2) * F1 + Derb21
Derb22 = (output2-Y2) * F2 + Derb22
Derb23 = (output2-Y2) * F3 + Derb23
Derb24 = (output2-Y2) * F4 + Derb24
Derb25 = (output2-Y2) * F5 + Derb25
Derb26 = (output2-Y2) * F6 + Derb26
// >>> PARTIAL DERIVATIVES OF COST FUNCTION (HIDDEN LAYER) <<<
DerFbias1 = (output1-Y1) * b11 * F1*(1-F1) * 1 + (output2-Y2) * b21 * F1*(1-F1) * 1 + DerFbias1
DerFbias2 = (output1-Y1) * b12 * F2*(1-F2) * 1 + (output2-Y2) * b22 * F2*(1-F2) * 1 + DerFbias2
DerFbias3 = (output1-Y1) * b13 * F3*(1-F3) * 1 + (output2-Y2) * b23 * F3*(1-F3) * 1 + DerFbias3
DerFbias4 = (output1-Y1) * b14 * F4*(1-F4) * 1 + (output2-Y2) * b24 * F4*(1-F4) * 1 + DerFbias4
DerFbias5 = (output1-Y1) * b15 * F5*(1-F5) * 1 + (output2-Y2) * b25 * F5*(1-F5) * 1 + DerFbias5
DerFbias6 = (output1-Y1) * b16 * F6*(1-F6) * 1 + (output2-Y2) * b26 * F6*(1-F6) * 1 + DerFbias6
Dera11 = (output1-Y1) * b11 * F1*(1-F1) * input1 + (output2-Y2) * b21 * F1*(1-F1) * input1 + Dera11
Dera12 = (output1-Y1) * b11 * F1*(1-F1) * input2 + (output2-Y2) * b21 * F1*(1-F1) * input2 + Dera12
Dera13 = (output1-Y1) * b11 * F1*(1-F1) * input3 + (output2-Y2) * b21 * F1*(1-F1) * input3 + Dera13
Dera14 = (output1-Y1) * b11 * F1*(1-F1) * input4 + (output2-Y2) * b21 * F1*(1-F1) * input4 + Dera14
Dera21 = (output1-Y1) * b12 * F2*(1-F2) * input1 + (output2-Y2) * b22 * F2*(1-F2) * input1 + Dera21
Dera22 = (output1-Y1) * b12 * F2*(1-F2) * input2 + (output2-Y2) * b22 * F2*(1-F2) * input2 + Dera22
Dera23 = (output1-Y1) * b12 * F2*(1-F2) * input3 + (output2-Y2) * b22 * F2*(1-F2) * input3 + Dera23
Dera24 = (output1-Y1) * b12 * F2*(1-F2) * input4 + (output2-Y2) * b22 * F2*(1-F2) * input4 + Dera24
Dera31 = (output1-Y1) * b13 * F3*(1-F3) * input1 + (output2-Y2) * b23 * F3*(1-F3) * input1 + Dera31
Dera32 = (output1-Y1) * b13 * F3*(1-F3) * input2 + (output2-Y2) * b23 * F3*(1-F3) * input2 + Dera32
Dera33 = (output1-Y1) * b13 * F3*(1-F3) * input3 + (output2-Y2) * b23 * F3*(1-F3) * input3 + Dera33
Dera34 = (output1-Y1) * b13 * F3*(1-F3) * input4 + (output2-Y2) * b23 * F3*(1-F3) * input4 + Dera34
Dera41 = (output1-Y1) * b14 * F4*(1-F4) * input1 + (output2-Y2) * b24 * F4*(1-F4) * input1 + Dera41
Dera42 = (output1-Y1) * b14 * F4*(1-F4) * input2 + (output2-Y2) * b24 * F4*(1-F4) * input2 + Dera42
Dera43 = (output1-Y1) * b14 * F4*(1-F4) * input3 + (output2-Y2) * b24 * F4*(1-F4) * input3 + Dera43
Dera44 = (output1-Y1) * b14 * F4*(1-F4) * input4 + (output2-Y2) * b24 * F4*(1-F4) * input4 + Dera44
Dera51 = (output1-Y1) * b15 * F5*(1-F5) * input1 + (output2-Y2) * b25 * F5*(1-F5) * input1 + Dera51
Dera52 = (output1-Y1) * b15 * F5*(1-F5) * input2 + (output2-Y2) * b25 * F5*(1-F5) * input2 + Dera52
Dera53 = (output1-Y1) * b15 * F5*(1-F5) * input3 + (output2-Y2) * b25 * F5*(1-F5) * input3 + Dera53
Dera54 = (output1-Y1) * b15 * F5*(1-F5) * input4 + (output2-Y2) * b25 * F5*(1-F5) * input4 + Dera54
Dera61 = (output1-Y1) * b16 * F6*(1-F6) * input1 + (output2-Y2) * b26 * F6*(1-F6) * input1 + Dera61
Dera62 = (output1-Y1) * b16 * F6*(1-F6) * input2 + (output2-Y2) * b26 * F6*(1-F6) * input2 + Dera62
Dera63 = (output1-Y1) * b16 * F6*(1-F6) * input3 + (output2-Y2) * b26 * F6*(1-F6) * input3 + Dera63
Dera64 = (output1-Y1) * b16 * F6*(1-F6) * input4 + (output2-Y2) * b26 * F6*(1-F6) * input4 + Dera64
ERROR = 0.5*( (output1-Y1)*(output1-Y1) + (output2-Y2)*(output2-Y2) ) + ERROR
COST = - ( Y1*LOG(output1)+(1-Y1)*LOG(1-output1) ) - ( Y2*LOG(output2)+(1-Y2)*LOG(1-output2) ) + COST
NEXT
//Averaging over the mini-batch and blending with the previous gradient (momentum)
DerObias1 = DerObias1/10*(1-betha) + betha*BDerObias1
DerObias2 = DerObias2/10*(1-betha) + betha*BDerObias2
Derb11 = Derb11/10*(1-betha) + betha*BDerb11
Derb12 = Derb12/10*(1-betha) + betha*BDerb12
Derb13 = Derb13/10*(1-betha) + betha*BDerb13
Derb14 = Derb14/10*(1-betha) + betha*BDerb14
Derb15 = Derb15/10*(1-betha) + betha*BDerb15
Derb16 = Derb16/10*(1-betha) + betha*BDerb16
Derb21 = Derb21/10*(1-betha) + betha*BDerb21
Derb22 = Derb22/10*(1-betha) + betha*BDerb22
Derb23 = Derb23/10*(1-betha) + betha*BDerb23
Derb24 = Derb24/10*(1-betha) + betha*BDerb24
Derb25 = Derb25/10*(1-betha) + betha*BDerb25
Derb26 = Derb26/10*(1-betha) + betha*BDerb26
DerFbias1 = DerFbias1/10*(1-betha) + betha*BDerFbias1
DerFbias2 = DerFbias2/10*(1-betha) + betha*BDerFbias2
DerFbias3 = DerFbias3/10*(1-betha) + betha*BDerFbias3
DerFbias4 = DerFbias4/10*(1-betha) + betha*BDerFbias4
DerFbias5 = DerFbias5/10*(1-betha) + betha*BDerFbias5
DerFbias6 = DerFbias6/10*(1-betha) + betha*BDerFbias6
Dera11 = Dera11/10*(1-betha) + betha*BDera11
Dera12 = Dera12/10*(1-betha) + betha*BDera12
Dera13 = Dera13/10*(1-betha) + betha*BDera13
Dera14 = Dera14/10*(1-betha) + betha*BDera14
Dera21 = Dera21/10*(1-betha) + betha*BDera21
Dera22 = Dera22/10*(1-betha) + betha*BDera22
Dera23 = Dera23/10*(1-betha) + betha*BDera23
Dera24 = Dera24/10*(1-betha) + betha*BDera24
Dera31 = Dera31/10*(1-betha) + betha*BDera31
Dera32 = Dera32/10*(1-betha) + betha*BDera32
Dera33 = Dera33/10*(1-betha) + betha*BDera33
Dera34 = Dera34/10*(1-betha) + betha*BDera34
Dera41 = Dera41/10*(1-betha) + betha*BDera41
Dera42 = Dera42/10*(1-betha) + betha*BDera42
Dera43 = Dera43/10*(1-betha) + betha*BDera43
Dera44 = Dera44/10*(1-betha) + betha*BDera44
Dera51 = Dera51/10*(1-betha) + betha*BDera51
Dera52 = Dera52/10*(1-betha) + betha*BDera52
Dera53 = Dera53/10*(1-betha) + betha*BDera53
Dera54 = Dera54/10*(1-betha) + betha*BDera54
Dera61 = Dera61/10*(1-betha) + betha*BDera61
Dera62 = Dera62/10*(1-betha) + betha*BDera62
Dera63 = Dera63/10*(1-betha) + betha*BDera63
Dera64 = Dera64/10*(1-betha) + betha*BDera64
//Applying the gradient step (backpropagation update)
Obias1=Obias1-ETAi*DerObias1
Obias2=Obias2-ETAi*DerObias2
b11=b11-ETAi*Derb11
b12=b12-ETAi*Derb12
b13=b13-ETAi*Derb13
b14=b14-ETAi*Derb14
b15=b15-ETAi*Derb15
b16=b16-ETAi*Derb16
b21=b21-ETAi*Derb21
b22=b22-ETAi*Derb22
b23=b23-ETAi*Derb23
b24=b24-ETAi*Derb24
b25=b25-ETAi*Derb25
b26=b26-ETAi*Derb26
Fbias1=Fbias1-ETAi*DerFbias1
Fbias2=Fbias2-ETAi*DerFbias2
Fbias3=Fbias3-ETAi*DerFbias3
Fbias4=Fbias4-ETAi*DerFbias4
Fbias5=Fbias5-ETAi*DerFbias5
Fbias6=Fbias6-ETAi*DerFbias6
a11=a11-ETAi*Dera11
a12=a12-ETAi*Dera12
a13=a13-ETAi*Dera13
a14=a14-ETAi*Dera14
a21=a21-ETAi*Dera21
a22=a22-ETAi*Dera22
a23=a23-ETAi*Dera23
a24=a24-ETAi*Dera24
a31=a31-ETAi*Dera31
a32=a32-ETAi*Dera32
a33=a33-ETAi*Dera33
a34=a34-ETAi*Dera34
a41=a41-ETAi*Dera41
a42=a42-ETAi*Dera42
a43=a43-ETAi*Dera43
a44=a44-ETAi*Dera44
a51=a51-ETAi*Dera51
a52=a52-ETAi*Dera52
a53=a53-ETAi*Dera53
a54=a54-ETAi*Dera54
a61=a61-ETAi*Dera61
a62=a62-ETAi*Dera62
a63=a63-ETAi*Dera63
a64=a64-ETAi*Dera64
ERROR=ERROR/10
ERROR=round(ERROR*1000)/1000
COST=COST/10
COST=round(COST*1000)/1000
DRAWTEXT("C= #COST# , ETAi=#ETAi#", candleentry, -0.15*(ii-1)-0.05, Dialog, Bold, 10) COLOURED(50,150,50)
DRAWTEXT("E= #ERROR#", candleentry, -0.15*(ii-1)-0.11, Dialog, Bold, 10) COLOURED(50,150,50)
DRAWTEXT("a1i: #a11# ; #a12# ; #a13# ; #a14# ; Fb1:#Fbias1#", candleentry, -1-0.1*(ii-1), Dialog, Bold, 10) COLOURED(50,150,50)
DRAWTEXT("a2i: #a21# ; #a22# ; #a23# ; #a24# ; Fb2:#Fbias2#", candleentry, -2-0.1*(ii-1), Dialog, Bold, 10) COLOURED(50,150,50)
DRAWTEXT("a3i: #a31# ; #a32# ; #a33# ; #a34# ; Fb3:#Fbias3#", candleentry, -3-0.1*(ii-1), Dialog, Bold, 10) COLOURED(50,150,50)
DRAWTEXT("a4i: #a41# ; #a42# ; #a43# ; #a44# ; Fb4:#Fbias4#", candleentry, -4-0.1*(ii-1), Dialog, Bold, 10) COLOURED(50,150,50)
DRAWTEXT("a5i: #a51# ; #a52# ; #a53# ; #a54# ; Fb5:#Fbias5#", candleentry, -5-0.1*(ii-1), Dialog, Bold, 10) COLOURED(50,150,50)
DRAWTEXT("a6i: #a61# ; #a62# ; #a63# ; #a64# ; Fb6:#Fbias6#", candleentry, -6-0.1*(ii-1), Dialog, Bold, 10) COLOURED(50,150,50)
DRAWTEXT("b1i: #b11# ; #b12# ; #b13# ; #b14# ; #b15# ; #b16# ; Ob1:#Obias1#", candleentry, -7-0.1*(ii-1), Dialog, Bold, 10) COLOURED(50,150,50)
DRAWTEXT("b2i: #b21# ; #b22# ; #b23# ; #b24# ; #b25# ; #b26# ; Ob2:#Obias2#", candleentry, -8-0.1*(ii-1), Dialog, Bold, 10) COLOURED(50,150,50)
//GradientNorm = SQRT(DerObias1*DerObias1+DerObias2*DerObias2 + Derb11*Derb11+Derb12*Derb12+Derb13*Derb13+Derb14*Derb14+Derb15*Derb15+Derb16*Derb16 + Derb21*Derb21+Derb22*Derb22+Derb23*Derb23+Derb24*Derb24+Derb25*Derb25+Derb26*Derb26 + DerFbias1*DerFbias1+DerFbias2*DerFbias2+DerFbias3*DerFbias3+DerFbias4*DerFbias4+DerFbias5*DerFbias5+DerFbias6*DerFbias6 + Dera11*Dera11+Dera12*Dera12+Dera13*Dera13+Dera14*Dera14 + Dera21*Dera21+Dera22*Dera22+Dera23*Dera23+Dera24*Dera24 + Dera31*Dera31+Dera32*Dera32+Dera33*Dera33+Dera34*Dera34 + Dera41*Dera41+Dera42*Dera42+Dera43*Dera43+Dera44*Dera44 + Dera51*Dera51+Dera52*Dera52+Dera53*Dera53+Dera54*Dera54 + Dera61*Dera61+Dera62*Dera62+Dera63*Dera63+Dera64*Dera64)
//DRAWTEXT("GradientNorm: #GradientNorm#", candleentry, 2-0.1*(ii), Dialog, Bold, 10) COLOURED(50,150,50)
COST2=COST1
COST1=COST
IF COST1 > COST2 and ii>=2 THEN //ETAi must be reduced
ETAi=ETAi/1.5
ELSE //ETAi can be increased
ETAi=ETAi*(1.5+0.2*ii)
ENDIF
COST2=COST1
NEXT
//DRAWTEXT("#candleentry0001#,#candleentry0002#,#candleentry0003#,#candleentry0004#,#candleentry0005#,#candleentry0006#,#candleentry0007#,#candleentry0008#,#candleentry0009#,#candleentry0010#", candleentry, -0.7, Dialog, Bold, 10) COLOURED(50,150,50)
once error1=ERROR
error2=error1
error1=ERROR
//IF error1 < error2 THEN
//Keta=Keta+0.05
//ELSE
//Keta=Keta-0.05
//ENDIF
Keta=Keta*(1+min(0.1,abs((error2-error1)/error1))*SGN(error2-error1))
ENDIF
ENDIF
//ENDIF

/////////////////
// NEW PREDICTION //
///////////////
// >>> INPUT NEURONS <<<
input1=variable1
input2=variable2
input3=variable3
input4=variable4
// >>> FIRST LAYER OF NEURONS <<<
F1=a11*input1+a12*input2+a13*input3+a14*input4+Fbias1
F2=a21*input1+a22*input2+a23*input3+a24*input4+Fbias2
F3=a31*input1+a32*input2+a33*input3+a34*input4+Fbias3
F4=a41*input1+a42*input2+a43*input3+a44*input4+Fbias4
F5=a51*input1+a52*input2+a53*input3+a54*input4+Fbias5
F6=a61*input1+a62*input2+a63*input3+a64*input4+Fbias6
F1=1/(1+EXP(-1*F1))
F2=1/(1+EXP(-1*F2))
F3=1/(1+EXP(-1*F3))
F4=1/(1+EXP(-1*F4))
F5=1/(1+EXP(-1*F5))
F6=1/(1+EXP(-1*F6))
// >>> OUTPUT NEURONS <<<
output1=b11*F1+b12*F2+b13*F3+b14*F4+b15*F5+b16*F6+Obias1
output2=b21*F1+b22*F2+b23*F3+b24*F4+b25*F5+b26*F6+Obias2
output1=1/(1+EXP(-1*output1))
output2=1/(1+EXP(-1*output2))
ENDIF
return output1 coloured(0,150,0) style(line,1) as "prediction long", output2 coloured(200,0,0) style(line,1) as "prediction short", classifierlong coloured(0,150,0) STYLE(HISTOGRAM,2) AS "classifier_long", classifiershort coloured(200,0,0) STYLE(HISTOGRAM,2) AS "classifier_short", 0.5 coloured(0,0,200) as "0.5", 0.6 coloured(0,0,200) as "0.6", 0.7 coloured(0,0,200) as "0.7", 0.8 coloured(0,0,200) as "0.8"

6 users thanked author for this post.
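For anyone who wants to sanity-check the hand-written derivatives outside ProRealTime, here is a minimal numpy sketch of the same 4-6-2 architecture (sigmoid hidden layer and outputs, cross-entropy cost). It is an illustration in Python, not ProBuilder; the matrix names are assumptions, with A collecting a11..a64, fb the Fbias terms, B the b11..b26 weights and ob the output biases:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def forward(x, A, fb, B, ob):
    # x: 4 inputs; A: 6x4 hidden weights; fb: 6 hidden biases;
    # B: 2x6 output weights; ob: 2 output biases
    f = sigmoid(A @ x + fb)    # hidden activations F1..F6
    out = sigmoid(B @ f + ob)  # output1 (long), output2 (short)
    return f, out

def gradients(x, y, A, fb, B, ob):
    f, out = forward(x, A, fb, B, ob)
    delta_out = out - y                          # sigmoid + cross-entropy => (output - Y)
    dB = np.outer(delta_out, f)                  # matches the Derb** accumulations
    dob = delta_out                              # matches DerObias1/2
    delta_hid = (B.T @ delta_out) * f * (1 - f)  # backpropagated hidden deltas
    dA = np.outer(delta_hid, x)                  # matches the Dera** accumulations
    dfb = delta_hid                              # matches DerFbias1..6
    return dA, dfb, dB, dob

With sigmoid outputs and a cross-entropy cost, the output delta collapses to (output - Y), which is exactly the factor that opens every Der* line in the listing above.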
11/09/2018 at 11:45 AM #84477
Hi Leo,
I feel like I'm in sci-fi with this indicator ;-) and it is great;
thanks a lot for sharing;
I tried it on stocks but was unable to make it release its prediction! I surely missed something (see picture).
Should we change something?
Can you tell us on which asset you were able to get the graph you mentioned in your post?
1 user thanked author for this post.
11/09/2018 at 12:46 PM #84480
Hi.
Well, first you should know that this is not a universal indicator; it is not as if we both see the same thing when we open it.
You need to load many bars, say 20000, and then give the indicator enough time to learn. If you open it with 20k bars and I open it with 30k, my indicator has learned more than yours.
Another parameter is the number of pips for a hypothetical trade: the classifier looks for possible trades in the past, and the network learns what happened at the moment of entering the trade.
For the prediction, you choose a threshold between 0.5 and 0.999 above which you act on the output (see the sketch at the end of this post).
In any case, there is a lot of work ahead. That is why I share it; alone I cannot progress as fast as I want.
The neural network itself is working properly (within the limitations of ProRealTime). I have rechecked it again and again, corrected and redone things, and checked it again. Phew!
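To illustrate the thresholding described above, a small Python sketch; the 0.7 cut-off is an illustrative assumption, not a recommendation:

def signal(prediction_long, prediction_short, threshold=0.7):
    # act only when an output clears the chosen cut-off in (0.5, 0.999)
    if prediction_long >= threshold and prediction_long > prediction_short:
        return "long"
    if prediction_short >= threshold and prediction_short > prediction_long:
        return "short"
    return "flat"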
11/09/2018 at 1:40 PM #84483
Thank you Leo for your ongoing work and for being so kind to share with us!
I made a System out of your latest NN Code … equity curve attached over 100k bars with spread = 4.
I have it on Demo Fwd Test from today … I’ll report back on here after 10 trades.
11/09/2018 at 2:09 PM #84489
100 bars is far from sufficient for the machine to have learned anything!
1 user thanked author for this post.
11/09/2018 at 2:58 PM #84495thanks Leo for your precautions and Nicolas for your both answers;
still blocked here; I get this message; with all TF;
Best
11/09/2018 at 2:59 PM #84497
Great GraHal, can you share it too?
11/09/2018 at 4:10 PM #84499
Kris75: set the pips-to-trade parameter to a higher number, because you loaded the indicator on the daily timeframe.
1 user thanked author for this post.
11/09/2018 at 5:12 PM #84502
Thanks Leo; I was able to make it work on the daily timeframe with “pips to trade” = 100;
Hope GraHal will share the strategy so I can backtest it;
Best
Chris
11/09/2018 at 6:12 PM #84504
Hope GraHal will share the strategy so I can backtest it;
I will share for sure, but I’d prefer the System to do a few trades first.
All I did was use output1 as a Long entry and output2 as a Short entry and added TP and SL (rough feel for profit / loss) and optimisation, oh and a simple Filter.
Am I the only one who has made Leo's NN Indicator code into Systems? I have about 5 or 6 versions running in Demo Forward Test.
This Thread needs to be kept clean for discussion re Leo’s NN Code.
I have shared a few Systems based on Leo / NN code on my Thread …
Right full Neural Network now, huge thanks to Leo!
What we need is cross-fertilisation of ideas as I’m sure there are better ways (than mine) of using Leo’s code to make profitable Systems? Different TF and Markets even? The wider view? Feedback will likely help Leo with his superlative NN coding??
So there’s a challenge @Kris75 … you show me yours and I’ll show you mine? 🙂
I will post the System (based on Leo 1.3.2 code) on the other Thread (link above), after I’ve seen at least one trade open and closed.
Cheers
GraHal
11/11/2018 at 11:34 AM #84592
11/11/2018 at 11:37 AM #84593
Hi all,
I posted something interesting here:
https://www.prorealcode.com/topic/weekly-seasonality-analysis/#post-84588
What if we do something similar for the weekly timeframe, and then also for the daily?
When the 3 predictions show the same signal, we enter on the daily or hourly timeframe (see the sketch below)…
mmm…. sexy…
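A toy Python sketch of this confluence idea; the timeframes, names and 0.7 threshold are illustrative assumptions:

def confluence_signal(preds, threshold=0.7):
    # preds: dict mapping timeframe -> (prediction_long, prediction_short)
    if all(plong >= threshold for plong, _ in preds.values()):
        return "long"
    if all(pshort >= threshold for _, pshort in preds.values()):
        return "short"
    return "flat"

# e.g. confluence_signal({"weekly": (0.8, 0.1), "daily": (0.75, 0.2), "hourly": (0.9, 0.05)}) -> "long"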
2 users thanked author for this post.
11/12/2018 at 7:13 PM #84703Hi Leo,
Why not use a special type of recurrent neural network called an LSTM network?
A solution to the vanishing gradient problem is to use memory cells that remember long-term dependencies; a minimal sketch follows.
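To make the suggestion concrete, here is a minimal numpy sketch of a single LSTM cell step; the weight packing and function names are illustrative assumptions, not a full trainable network:

import numpy as np

def sigmoid(z):
    return 1.0 / (1.0 + np.exp(-z))

def lstm_step(x, h_prev, c_prev, W, b, nh):
    # W: (4*nh, len(x)+nh) packed gate weights; b: (4*nh,) gate biases
    z = W @ np.concatenate([x, h_prev]) + b
    f = sigmoid(z[0*nh:1*nh])  # forget gate: what to keep of the old cell state
    i = sigmoid(z[1*nh:2*nh])  # input gate: how much new information to write
    o = sigmoid(z[2*nh:3*nh])  # output gate: how much of the cell to expose
    g = np.tanh(z[3*nh:4*nh])  # candidate values
    c = f * c_prev + i * g     # cell state: additive memory path
    h = o * np.tanh(c)         # new hidden state
    return h, c

The additive update of the cell state c is what lets gradients survive over many time steps, which is the vanishing-gradient remedy mentioned above.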
Cordially
Sources:
http://colah.github.io/posts/2015-08-Understanding-LSTMs/
https://github.com/llSourcell/How-to-Predict-Stock-Prices-Easily-Demo
11/12/2018 at 9:41 PM #84720
Oh man! Sorry to disappoint you. What I posted here is, as far as I know, the first attempt ever at creating artificial intelligence in such a basic language as ProBuilder, which has no array support. There is only enough memory to execute 5 epochs per mini-batch; I would not even think of building something like a recurrent neural network.
I am only starting with a feed-forward neural network! With only 1 hidden layer!!! With only 6 neurons! Do you think my gradient descent is vanishing? Really? I have other problems, but definitely not a vanishing gradient.
I will be happy if one of these days I am able to increase the number of neurons… maybe you, Raspoutine, can help us with this endeavour.