Everything posted by darthtrader3.0beta

1. Ahh yea, I see that would work. How would you do that, though? I have a hard time not thinking in loops; the only way I can see to do it would be in OnMarketData, something like: for (int i = 0; i < 500; i++) { if (BarsInProgress == i) { /* do something */ return; } } Have you guys ever run across documentation for all the methods in Ninja? It seems like there should be some kind of GetBarsInProgress() method, but I've never found that level of documentation, or maybe I'm just missing a bigger overall concept. The big things I've read about for NT7 are tick backfill for everything, multi-instrument and multi-timeframe indicators, and plotting the same instrument more than once on one chart (still not sure if this means plotting the same instrument in price panel 1; that would be sick). I did read the other day that the beta has been pushed back to the end of Q1 '09, which kind of sucks, but oh well.

Bump: Here is another question you guys might know the general programming concept for. I would love to eventually override Plot() to produce a kind of 5- or 15-minute distribution profile of TICK, like a volume profile or market profile of TICK values, but plotting a new one every 5 or 15 minutes. I think that would be a far better way to see how the TICK is going off than candles or line-on-close. I've dug into the Volume Profile indicator enough to get the general idea of how they pulled that monster off. What I don't understand, though, is how you would store things to chop it up into different time intervals. Volume Profile uses a sorted dictionary (hash table), so would you just use a custom DataSeries of sorted dictionaries, set the chart to whatever time period you want, and override Plot()?
In a more general sense, I don't understand how to start these kinds of calculations fresh for each bar. An easier example: say you wanted to find the average price of a 5-minute bar on a tick-by-tick basis, so you keep a running total of each tick's price divided by a running count of ticks. If you do this in OnMarketData(), how do you tell it to reset the tick total and the counter to zero when the next bar starts?
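One way to handle the reset question above is a sketch like the following, assuming a NinjaScript indicator running with CalculateOnBarClose = false. NinjaScript's built-in FirstTickOfBar flag is true on the first tick of each new bar; the field names and the AvgTickPrice plot here are made up for illustration:

```csharp
// Hypothetical per-bar accumulator: average traded price of the
// developing bar, reset automatically when a new bar starts.
private double runningSum = 0;  // sum of tick prices this bar (made-up field)
private int    tickCount  = 0;  // number of ticks this bar (made-up field)

protected override void OnBarUpdate()
{
    // With CalculateOnBarClose = false, OnBarUpdate fires on every tick.
    // FirstTickOfBar is true exactly once per bar, so reset here instead
    // of trying to detect the bar boundary by hand in OnMarketData().
    if (FirstTickOfBar)
    {
        runningSum = 0;
        tickCount  = 0;
    }

    runningSum += Close[0];  // Close[0] is the latest tick in this mode
    tickCount++;

    AvgTickPrice.Set(runningSum / tickCount);  // AvgTickPrice: hypothetical plot
}
```

The same reset-on-FirstTickOfBar pattern would presumably work for the TICK distribution idea: keep a SortedDictionary of TICK values, clear it when FirstTickOfBar fires, and let the overridden Plot() draw whatever has accumulated so far.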
2. Sounds like a plan. The only part I'm not sure about is the event triggering the index. Add() is going to create 500 separate DataSeries no matter what, and as long as there is some getter for the index of the data series that had an event, the remaining calculations should be straightforward. Good stuff. What I posted about structs doesn't make any sense; I guess you could even just use a four-field array and store price, bid volume, ask volume, and weight, and then you have everything. One thing to keep in mind with all this is that NT7 might have a different way of handling OnMarketData(), because currently you can't backfill anything you do with it, while NT7 is going to have backfill for bid/ask volume, etc. Not sure how they are going to handle this. Losing everything if you reload the chart midday is the biggest reason I haven't gotten too far into OnMarketData(). That has always annoyed me greatly.
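On the "get function for the index" point, and the GetBarsInProgress() question from the earlier post: in NinjaScript, BarsInProgress is a property, not a method, and inside OnBarUpdate() it already holds the index of the series that raised the event, so no loop over the 500 series is needed. A sketch (the total field is made up):

```csharp
// Sketch: react only to the series that actually ticked.
// 'total' is a hypothetical private int field.
protected override void OnBarUpdate()
{
    if (BarsInProgress == 0)
        return;  // skip the primary instrument; it is just a placeholder

    int i = BarsInProgress;  // index of the Add()-ed series that ticked
    if (CurrentBars[i] < 1)
        return;              // need a prior close to compare against

    if      (Closes[i][0] > Closes[i][1]) total++;  // uptick
    else if (Closes[i][0] < Closes[i][1]) total--;  // downtick
}
```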
3. I think it would be better to do with OnMarketData(), but I don't completely understand it. I guess I wasn't looking at it abstractly enough: I was thinking it was linked to market data on the primary instrument rather than to market data on any data series in the strategy. The calculation is as you said: for every new tick (of each stock loaded as an additional instrument), if tick up then total++, if tick down then total--. The primary instrument is really just a placeholder for the strategy and has nothing to do with the calculations. I'll mess around with it tomorrow and see what happens. One thing I'm not sure about, though, is that things didn't look right until I reset the total to zero on each tick and then ran the loop, so basically taking an uptick/downtick snapshot on each tick of SPY of what the summed upticks/downticks looked like on the underlying instruments. Volume could be really interesting for something like an S&P TICK delta indicator, but I don't see how you can get away with not weighting things: 1,000 shares of XOM at the ask needs to mean more than 1,000 shares of Disney at the ask, since XOM carries more weight in calculating the index itself. How would you go about calculating the cash index from the 500 S&P stocks? I was thinking of eventually trying to have a struct for each stock with price, bid volume, ask volume, and the index weighting, then 500 custom DataSeries holding the structs. If that were possible, you could literally do anything you want as far as breadth goes, but I'm sure that is not the optimal way to do things by a long shot.
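On the "how would you calculate the cash index" question: the S&P is capitalization-weighted, so the level is the sum of price times float-adjusted shares across components, divided by the index divisor. A toy standalone sketch with entirely made-up numbers (the real float counts and divisor are licensed data):

```csharp
using System;

class CapWeightedIndex
{
    static void Main()
    {
        // Hypothetical three-stock "index": prices, float shares, divisor.
        double[] price  = { 80.00, 25.50, 61.25 };
        double[] shares = { 5.0e9, 8.0e9, 2.5e9 };
        double divisor  = 8.9e9;  // made-up; the real divisor is proprietary

        double marketCap = 0;
        for (int i = 0; i < price.Length; i++)
            marketCap += price[i] * shares[i];

        // Each stock's weight is its share of total cap, which is why a
        // 1,000-share print in XOM should count for more than one in DIS.
        Console.WriteLine("Index level: " + marketCap / divisor);
    }
}
```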
4. Well, personally, I wouldn't worry about IB data if you're not on it already. Forget about the data and make your choice broker-wise. I think what gets lost in this is that most people don't trade off 1-tick charts; most of what we do involves snapshots of ticks and throwing out data to make a useful summary at much slower intervals than 0.2 seconds. The real difference between IB and a true tick-level provider is probably nothing unless you're doing some high-frequency autotrader deal. If you like their brokerage services, then just go with them, and if you really need tick-precise data, then add DTN into the mix. DTN is impossible to beat, bang for the buck.
5. Hmm, I'm not totally sure what you mean. Ninja is a bit confusing on this stuff because if CalculateOnBarClose is true, then Close[0] is the close of a candlestick; if it's false, then Close[0] is just the latest incoming tick. But if you want to know whether the tick transacted at the bid or the ask, you have to override OnMarketData(). I don't think what I did is truly tick-precise for each of the stock DataSeries, because that loop only fires on each tick of the primary instrument, but given how heavily SPY trades, this should be more than enough speed. All the loop does in what I posted is march through the added DataSeries and check whether the last tick of each series is higher or lower than one tick back, to add up upticks vs. downticks. I'm not sure it would really be more efficient to avoid the loop, since there is very little calculation going on: just 50 comparisons and 50 additions or subtractions on each tick. I would love to try to get the PREM like this someday, but with the multiplication to compute each stock's index weighting, that might be unrealistic computing-power-wise on a tick basis.
6. Well, never mind: the Ninja guys said it's probably an issue with my data provider, in that I can only use 50 streams at once. This might be worth looking into, as on my average computer 50 instruments only uses 25% of the CPU. That three-strategies idea is interesting, but I don't think Ninja can communicate between strategies currently? Either way, I guess this is a function of the data provider and the computer; there is no limit in the software. If anyone wants to mess with this, here is the code for the S&P 100:

protected override void Initialize()
{
    // 1-minute series for each S&P 100 component. Note the data provider
    // caps concurrent streams at 50, so in practice only the first ~50 load.
    string[] sp100 = {
        "AA","AAPL","ABT","AEP","AES","AIG","ALL","AMGN","AVP","AXP",
        "BA","BAC","BAX","BHI","BK","BMY","BNI","C","CAT","CBS",
        "CI","CL","CMCSA","COF","COP","COV","CPB","CSCO","CVS","CVX",
        "DD","DELL","DIS","DOW","EMC","EP","ETR","EXC","F","FDX",
        "GD","GE","GOOG","GS","HAL","HD","HIG","HNZ","HON","HPQ",
        "IBM","INTC","IP","JNJ","JPM","KFT","KO","MA","MCD","MDT",
        "MER","MMM","MO","MRK","MS","MSFT","NOV","NSC","NYX","ORCL",
        "OXY","PEP","PFE","PG","PM","QCOM","RF","RTN","S","SLB",
        "SLE","SO","T","TGT","TWX","TXN","TYC","UNH","UPS","USB",
        "UTX","VZ","WB","WFC","WMB","WMT","WY","WYE","XOM","XRX"
    };
    foreach (string symbol in sp100)
        Add(symbol, PeriodType.Minute, 1);

    Total = 0;  // Total is a private int declared in the Variables region
    Add(StrategyPlot(0));
    StrategyPlot(0).Plots[0].Pen.Color = Color.Blue;
    StrategyPlot(0).PanelUI = 2;
    CalculateOnBarClose = false;  // fire OnBarUpdate on every tick
}

protected override void OnBarUpdate()
{
    // if (BarsInProgress != 0) return;

    // Net upticks minus downticks across the added series. Only the first
    // 49 added series are checked, given the 50-stream provider limit.
    Total = 0;
    for (int x = 1; x < 50; x++)
    {
        if (CurrentBars[x] < 1) continue;  // need two closes to compare
        // Print(Closes[x][0]);
        if (Closes[x][0] > Closes[x][1]) { Total++; }
        if (Closes[x][0] < Closes[x][1]) { Total--; }
    }
    StrategyPlot(0).Value.Set(Total);
}
7. Well, I think I just got TIKI going. Spitting out the Dow stock data to the Ninja output window, it's updating all the data series extremely fast. I'm not sure if it updates on the first data series or on all of them, but I just loaded up SPY, so that even if it needs a tick from the main data series to update, SPY is more than fast enough. This is certainly better than the 5-second lag I get on DTN. Going to drink some coffee and grunt through adding the S&P 100 right now. Bump: Lame... well, it appears Ninja has a 50-instrument limit as far as Add() goes. Above this the strategy loads but doesn't switch on. Very disappointing, considering how smoothly it handles 50 instruments.
8. Well, I messed around with this over the weekend and just loaded it up. The bad news is my calculations make no sense and the plot is bogus, but the good news is Ninja seems to be handling 30 manually added instruments like a cakewalk. In Initialize() I just did:

Add("AA", PeriodType.Minute, 1);
Add("AXP", PeriodType.Minute, 1);
Add("BA", PeriodType.Minute, 1);
Add("BAC", PeriodType.Minute, 1);
// etc.

then in OnBarUpdate() (with CalculateOnBarClose set to false):

for (x = 0; x < 29; x++)
{
    if (Closes[x][0] > LastTransactedPrice) { Direction = 1; }
    // etc.
}

So 30 instruments work and a for loop works; this should be doable. I couldn't figure out how to do this by overriding OnMarketData(), but hopefully this will be good enough. Once I get the logic right, I'll see if I can crash it with the S&P 100.
9. I guess I would like to go off on this point, because I really don't understand why "conventional wisdom" is so against paper trading. I thought it was cool to see, around the blog/message-board trading world in the past few weeks, the idea that you need 10,000 hours and/or 10 years to become a master trader (something that was posted here well over a year ago) making its way around. Yet "conventional wisdom" tells the fool who comes to this game expecting to make millions overnight that they HAVE to risk their hard-earned cash or they are wasting their time. A fool losing hard-earned cash does not make him less of a fool; the only way a fool becomes less of a fool is through time and trades, getting a feel for how the market breathes. You need rule #11: go on the 10-year plan. I have 4 years of market experience, 3.5 trading real money, but I mark my real march toward becoming a master trader as starting last January, so I don't even consider myself to "actually" have a year of experience yet. Last January was when I started trading futures with real money, having read every book and digested every idea I could over the previous 3 years. So far this week I've taken 3 real-money trades and 94 paper trades. My average hold time on my real-money trades is 2.3 minutes; if you trade real money with patience, you have to do something with your time, and what better to do than experiment with paper trading? Basically, I don't think the problem has anything to do with paper trading. It's simply that 99.9999% of people do not take this business as seriously as they HAVE to. The only difference to me between a paper trade and a real-money trade is the confidence I have in the setup. If I wanted to be a boxer or an MMA guy, I would want thousands of reps in practice before I had to throw a strike in a REAL fight.
10. Have you ever checked out the Mini Nikkei instead of the Hang Seng? While I dream of HAVING to trade ES someday, for my amount of capital I simply see no reason to trade a larger point size than YM. For my style, and considering how highly correlated the two instruments are, I find no real trading-personality difference between them, other than that capital-wise I can't use an ultra-tight stop on ES like I can on YM. I don't use limit orders per se, but I always buy the bid or sell the ask, and if I don't get filled, oh well, next. I use a 12-14 point hard stop depending on ATR, but that is just in case I get in front of the wrong side of a program. Also, since YM is slower, it's easier to read the tape, so my real stop is usually around 7 or 8 points that I'm taking off and moving on before my hard stop is hit. Six YM ticks against me is my discretionary eyebrow-raiser that maybe things aren't working out and I'm ready to bail. The problem with that on ES is that with its tick size and speed/depth, you're barely even talking 2 ticks trading the same way: pure noise. The Mini Nikkei is the only other instrument that interests me currently, because of its YM-like tick size and because the time-zone difference would let me trade all day with a nice 4-hour dinner break. I suppose once I'm comfortable enough to quit my job and go pro, that is what my day will look like, until I have so much capital that I have to trade ES and the meaning of an ultra-tight stop changes, and/or I just don't want to trade all day anymore like I would love to now.
11. Well, I'm not sure what you mean. If you look at discretionary trading from an algo standpoint, the discretionary trader is basically forward-testing and optimizing the most powerful neural-network software ever devised. That is vastly different from your optimizer telling you to use a 21.22 and a 51.054 moving average because it's slightly more profitable than a 20 and a 50. From trying to get into auto trading, I feel that at the retail level we are still stuck in the '90s, while there have been huge advances in auto trading in the last 10-15 years. I totally agree with your multi-strategy statement: our software should open to a page that shows the portfolio of strategies and how they interact against real market data and against purely random data. Instead we get a lot of point-and-click strategy wizards and useless indicator optimizations. This sounds like an interesting new book on the subject; I had been waiting for someone to post a review, which someone did yesterday. Probably worth checking out as far as stuff "filtering down" to our level. http://www.amazon.com/Quantitative-Trading-Build-Algorithmic-Business/dp/0470284889/ref=sr_1_1?ie=UTF8&s=books&qid=1220109693&sr=1-1
12. I think the idea that the VIX measures "fear" at this point is simply inaccurate. If you watch it intraday, it almost exactly inversely tracks the index. It's basically telling you how much guys are willing to pay in SPX options premium, and I think you have to assume SPX options are mostly a hedging vehicle, so it's more of an "uncertainty" index than a fear index. If it's starting to seem artificially high, I imagine that is just because we are getting used to this insane volatility: YM opens today, jumps up 150 points, drops 300, then goes up 200, and we don't even think about it now.
13. Oh, I had not heard that NT7 will support multiple cores; that's fantastic. I'm going to try to build TIKI over the Thanksgiving holiday. I guess the question is whether Ninja can handle 30 custom series like that, even if it's just doing addition and subtraction. I would think at some point memory becomes a bigger issue than actual CPU usage, but I could be totally off there. As far as NeoBreadth goes, doesn't the ease of that come from NeoTicker handling the list of single equities behind the scenes? We could potentially do something like that by pulling from a text file with a loop in Initialize(), calling Add(symbol, PeriodType.Minute, 1) for each symbol in the file.
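The "pull from a text file" idea at the end of the post could be sketched like this inside a NinjaScript strategy (the file path is hypothetical; one ticker per line):

```csharp
// Requires 'using System.IO;' at the top of the strategy file.
protected override void Initialize()
{
    string path = @"C:\Data\symbols.txt";  // hypothetical location
    foreach (string line in File.ReadAllLines(path))
    {
        string symbol = line.Trim();
        if (symbol.Length > 0)             // skip blank lines
            Add(symbol, PeriodType.Minute, 1);
    }
    CalculateOnBarClose = false;
}
```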
14. A real eye-opening experience is to program a system and optimize on things you know are completely bogus and random: a system that only buys at 10:16 on a Tuesday. If you use even the slightest bit of money management to have bigger winners than losers, it's not that hard to find a "profitable system" like this with an "edge". The problem is the system has no edge; it's just data-snooping bias on pure randomness. Even win rate has a ton of randomness in it. Here is a system: long SPY at 10:00 a.m. every day for the last 2 months, stop at 1/2 ATR and target at a full ATR. An 89% win rate, yet it loses 0.13% before commissions. To me the point is that with historical backtesting you simply cannot know whether you are exploiting a fundamental behavior of the market or have simply found some good-looking random patterns in your data. Even with walk-forward testing, there is nothing to say you are not simply getting lucky, especially since if you walk forward for 2 months and get great results, you will probably have a lot of false confidence in the system. I do agree that I don't get why anyone uses optimized parameters in a system. Currently, I simply see no way around using a Monte Carlo simulation; at least then you have a control to test against as far as purely random behavior is concerned. Not to mention that, to me, the retail algo literature focuses far too much on the entry, with trade management and the exit an afterthought.
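The Monte Carlo "control" idea can be sketched in plain C#: simulate many purely random traders with the same trade count, stop, and target as the system, and see where the system's result falls in that null distribution. Under a driftless random walk, the chance of hitting a target of T before a stop of S is S/(S+T), so a 1/2-ATR stop against a 1-ATR target gives a random trader a 1/3 win rate and zero expectancy. All the numbers below are illustrative assumptions, not real market statistics:

```csharp
using System;

class RandomTraderNull
{
    static void Main()
    {
        var rng = new Random(1);
        int trades = 100;                      // same trade count as the system
        double target = 1.0, stop = 0.5;       // in ATR units, as in the example
        double pWin = stop / (stop + target);  // 1/3 under a driftless walk
        int sims = 10000;
        double systemNet = 5.0;                // hypothetical system net, ATR units

        int beatSystem = 0;
        for (int s = 0; s < sims; s++)
        {
            double pnl = 0;
            for (int t = 0; t < trades; t++)
                pnl += rng.NextDouble() < pWin ? target : -stop;
            if (pnl >= systemNet)
                beatSystem++;
        }

        // Fraction of purely random traders doing at least as well:
        // an empirical p-value for the claimed "edge".
        Console.WriteLine((double)beatSystem / sims);
    }
}
```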
15. All I know is this series of threads led me to this book, and this book made me question this entire thread. There is no reason not to follow this path: Probability Theory: The Logic of Science. An interesting side note you will find in that book is that Jaynes believes the entire idea of a "stochastic process" is bogus, which is interesting when you look at the current problems and how much was based on financial engineering that treated markets as "stochastic processes".
16. The problem with this is in the NYSE breakdown of volume statistics: 50% "programs", 40% "institutional", 10% or less "retail". Since no one is really running "programs" at the retail level, the volume is 90% "the big guys" vs. 10% retail. If 90% of volume were truly trying to execute mean reversion at the VWAP, we would not see the trends we currently see. The harsh reality is that the market is made up of big guys trying to front-run the big guys trading mean reversion and catch them at the wrong time, who on the other side of the trade are trying to front-run "the big guys". That's why, to me, if you try to use the VWAP outside of being a tick-precise "mean", you are missing the point of its value. This is exactly why I don't post in the "VSA" threads; it's just bogus, a misunderstanding of the "enemy" at hand.
17. Hey ryker, can you define what counts as too big for Ninja here, in your experience? I've mostly been waiting to see what they do in 7 with the multi-instrument indicator stuff; I'm sure that won't be much different than in strategies. If it's 100 vs. 500, that is cool; if it's 500 vs. 3, not so good. Statistical-software-wise, I've messed around with R but never found anything that wasn't easier to do in Excel, and Excel is probably robust enough, certainly for this kind of thing. Output to the Ninja output window with commas is kind of a pain for large datasets to save/paste/import into Excel, but it's certainly not impossible, and easier than any other solution I've found at my level of programming.
18. Well, I want to say again, I know I'm nitpicking here, and I'm not saying you are not a master trader. I am, though, highly influenced by the concepts in Evidence-Based Technical Analysis, so I start from the idea that whatever I hear does not work at face value, and move on from there. From my experience, this system probably works for you as a way to filter out trades, and the real "meat" is in your understanding of price action; that's beside the point, though. My main point is that this is not the optimal way to calculate things.

"Of course the distribution has an effect on the variance. The variance is computed from it." Of course: the variance that is computing your std-dev bands is based on the distribution you are assuming in your variance calculations. How can you make that calculation without assuming a distribution? My point is YOU CAN'T. So you're assuming a normal distribution to make your variance calculations, which we fundamentally know is far from optimal. The fact that it's good enough for you to trade on does not mean it's optimal.

"We can define an average for a finite data set." Of course we can define a mean for a finite data set; the problem is that you have to define a distribution to calculate the variance from that mean, hence you have to quantify a distribution in order to calculate std-dev bands from the VWAP. You currently define those bands assuming a normal distribution, which we know is wrong if we know anything. Personally, I would think at least a Poisson distribution would be a better assumption, but it's interesting to think about what would be far more optimal.

Also, steve46's point has no place in this discussion. Of course the VWAP is traditionally used as an institutional benchmark and/or execution algorithm; that's where the liquidity is if you're "big". I imagine the VWAP is far more meaningless on futures directly, as futures are a way for the big guys to hedge equity positions taken at VWAP. This is a discussion of mean and variance.

That's why I said it could be interesting to compute an underlying VWAP with something like NeoTicker for the SPX and then throw it over ES.
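For concreteness in this debate, here is how VWAP and volume-weighted standard-deviation bands are commonly computed, as a standalone sketch with made-up prices and volumes. Worth noting: the variance formula itself is just the second moment of the observed data and requires no distributional assumption; the distribution only enters when interpreting the bands (e.g., expecting ±2σ to cover ~95% of volume holds only under normality):

```csharp
using System;

class VwapBands
{
    static void Main()
    {
        // Made-up price/volume prints for illustration.
        double[] price  = { 100.00, 100.50, 101.00, 100.25, 99.75 };
        double[] volume = { 500,    300,    200,    400,    600   };

        double pv = 0, v = 0;
        for (int i = 0; i < price.Length; i++)
        {
            pv += price[i] * volume[i];
            v  += volume[i];
        }
        double vwap = pv / v;  // volume-weighted average price

        // Volume-weighted second moment about the VWAP.
        double sq = 0;
        for (int i = 0; i < price.Length; i++)
            sq += volume[i] * (price[i] - vwap) * (price[i] - vwap);
        double sd = Math.Sqrt(sq / v);

        Console.WriteLine("VWAP  = " + vwap.ToString("F3"));
        Console.WriteLine("Upper = " + (vwap + 2 * sd).ToString("F3"));
        Console.WriteLine("Lower = " + (vwap - 2 * sd).ToString("F3"));
    }
}
```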