
BlowFish

Market Wizard
  • Content Count

    3308
  • Joined

  • Last visited

Everything posted by BlowFish

  1. Best just to use the 'report' button at the top right of the post. By quoting (or even posting) in the thread there is always a chance that the mods will miss your follow-up post, thus actually preserving the infringement (that appears to have happened in this case).
  2. That's weird, I have only ever seen holes of a second or two, never minutes let alone hours. Having said that, I don't spend that much time in front of the PC at the moment. When did the 2h 40m outage occur, and did the client show disconnected? With an outage of that length the forums and chat rooms would usually be incandescent with fury. I wonder if you are experiencing another issue as well.
  3. Sure... You need to be able to operate on historical data and get the same results as from live data. A great example of this is the wealth of market delta type indicators available for TS/MC. They all, of course, only work on live data. There are many, many other cases. Put GBP.USD, USD.EUR & EUR.GBP on a chart: live calculations will be completely different to historical ones, as with history the best resolution you can get is 1 second, completely destroying the sequencing with respect to the other pairs. Basically the current architecture rules out anything that requires precision beyond a second, which further rules out anything where the order that the ticks arrived in is important to you. It shuts down a whole (increasingly important) area of analysis in one fell swoop. TSsupport are aware of the issue, however I do not think they understand its importance. They will when their competitors (under increasing pressure) offer solutions. I had pretty good connections within TSsupport, but I find it frustrating that sometimes (often) they just don't get it (clearly no traders on the team). I wish they had a Lawrence Chan (Neoticker). Even Raymond over at Ninja finally gets it, though sadly it was after they had gone too far down a blind alley with NT 7.0 (the upshot is it will not be seen for a while). Half of the stuff TSsupport introduce is essentially worthless (beyond ticking boxes in feature lists or seducing gullible wannabes) without the internal precision to make the results meaningful. (Note: I am not saying that us guys using it are gullible wannabes, just that if you think you can do any sort of real quantitative work that requires precision in the time domain, you are misguided.) A real shame, and frustrating as hell. I still like the product, but it could be so much better if they really listened and understood some of the issues.
As an aside, I think it is pretty much one guy (best to remain nameless) who is always defensive and intransigent; sadly he seems to carry some sway. One of the things that makes me laugh (and cry) is how they will copy a bug or really badly implemented 'feature' in TS (how volume is reported in P&F charts springs to mind). On the plus side, if Tradestation implement millisecond or better precision you can bet that MC will follow. That's why I am keen to see TS make an announcement: that will carry more weight than any rational argument. If TS blew its foot off, MC would be sure to follow. :D
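The sequencing loss described above can be shown with a toy sketch. The timestamps, pairs and prices here are entirely made up for illustration; the point is only that truncating to 1-second resolution destroys cross-stream ordering.

```python
from datetime import datetime

# Hypothetical ticks from two pairs, with millisecond-level timestamps.
gbpusd = [(datetime(2010, 1, 4, 9, 30, 0, 120000), 1.6101),
          (datetime(2010, 1, 4, 9, 30, 0, 870000), 1.6103)]
usdeur = [(datetime(2010, 1, 4, 9, 30, 0, 450000), 0.6944)]

# With full precision, merging the streams preserves arrival order.
merged = sorted([(t, "GBP.USD", p) for t, p in gbpusd] +
                [(t, "USD.EUR", p) for t, p in usdeur])
print([sym for _, sym, _ in merged])   # GBP, USD.EUR, GBP - true sequence

# Truncate to 1-second resolution (all historical data gives you):
# all three ticks now share one timestamp, so the relative ordering
# across the two streams is unrecoverable.
truncated = {t.replace(microsecond=0) for t, _, _ in merged}
print(len(truncated))                  # 1 distinct timestamp for 3 ticks
```

Any multi-stream calculation that depends on which pair ticked first is therefore guesswork once the data is stored at 1-second resolution.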
  4. Quite so! Feeding an RSI (of price), a MACD (of price) and a CCI (of price) into a net will not end well! Using a normalised MA to pre-process price data before using it as an input might be a step in the right direction. Ranger, keep us posted on how it goes. Incidentally, user forums and blogs for some of these packages are likely to be as good a place as any for practical advice on how to get the best from these sorts of tools. Most of the issues are not unique to financial time series, so no reason not to cast your net wider (if you'll pardon the pun). This is not a recommendation, and it is years since I read it, but I do recall getting a bit out of Neural, Novel & Hybrid Algorithms for Time Series Prediction by Timothy Masters (ISBN 9780471130413) (pretty sure I didn't pay 355 bucks for it though!).
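One way to read "a normalised MA to pre-process price data" is to feed the net the percentage deviation of price from its own moving average, so inputs are scale-free rather than raw price levels. A minimal sketch, with a made-up price series and period:

```python
# Normalise price by its moving average: inputs become small numbers
# centred near zero, independent of the instrument's price level.
def ma_normalise(prices, period=10):
    out = []
    for i in range(period - 1, len(prices)):
        window = prices[i - period + 1:i + 1]
        ma = sum(window) / period
        out.append((prices[i] - ma) / ma)   # e.g. 0.02 = 2% above its MA
    return out

prices = [100, 101, 102, 101, 103, 104, 102, 105, 106, 104, 107]
inputs = ma_normalise(prices, period=5)
# values cluster near zero whether the instrument trades at 1.30 or 1100
```

The same series shifted to a different price level produces identical inputs, which is exactly what you want when training a net across instruments or long histories.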
  5. Me too. Just need to get them to store stuff with millisecond (or better) time stamps and to really make sure the multi-threaded stuff is 101% rock solid. Having an option to build bars from multiple streams of historical data tick by tick would be nice too. One of the reasons I am keen for Tradestation to make an official announcement about some of these things is that it is likely to get TSsupport to accelerate their scheduling! Btw Nate, despite strongly disagreeing about Neo, I am inclined to agree with you about OOP in general. I would have settled for TS introducing more flexibility with functions and procedures, support for libraries and full control over scope of variables. You still need core tools to write manageable monolithic code. As for Ninja, that is a graphic example of how architecture is far, far more important than methodology. Despite Ninja utilising an OOP paradigm within a managed framework, and despite just having been largely 're-written' (v7), Raymond has pretty much admitted that implementing the precision users are demanding will require substantial portions of code to be rewritten (again).
  6. Have you actually used Neoticker for anything? Your comments reveal more about you and your level of understanding than they do about Neoticker. Read less, do more! Love it or hate it, Neoticker is an absolutely remarkable piece of software. It did stuff at launch that most developers are just starting to dream about today (and realising that their architectures don't support it, hence this thread). As an example, properly time-stamping data coupled with the option of reconstructing historical bars tick by tick (across multiple data streams) means you can do full order book analysis on live and historic data with Neo; always could. Even today there are few packages (if any) that can deliver complete precision to those that require it. Sure, it is a little bit cumbersome to learn the architecture and API, but on the plus side you can program it directly using any modern language (as well as their own scripting (formula) language). The architecture allows you to do pretty much whatever you can imagine, and did from day 1. It certainly does not need rewriting as it was architected properly to start with. Whilst many softcos struggle to deliver reliable multi-threading, Tickquest launch grid computing and multi-instancing. You have to chuckle.
  7. If you are using profiles you will need to maintain (or use software that maintains) arrays of volume @ price. There are techniques you can employ to minimise (or even prevent) having to iterate through these arrays to find 'stuff'. That will require some skill at coming up with algorithms. Whilst 'for n = lowprice to highprice step ticksize' might cut it for a handful of instruments, it won't be 'low latency' for scanning larger baskets. Anyway, I digress: you should write an operational requirement and shoot it out to software manufacturers. In the long run it will likely save you time and frustration.
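One common way to avoid the 'for n = lowprice to highprice' scan is to key volume by integer tick counts in a hash map and keep running aggregates, so you only ever touch price levels that actually traded. A sketch under assumed data (the tick size and trades are illustrative):

```python
from collections import defaultdict

TICK = 0.25  # hypothetical tick size (ES-style)

def to_key(price):
    # store prices as integer tick counts to avoid float dictionary keys
    return round(price / TICK)

profile = defaultdict(int)  # volume @ price
for price, vol in [(1100.00, 5), (1100.25, 3), (1100.00, 7)]:
    profile[to_key(price)] += vol

# Query only the levels that traded, rather than sweeping the full
# low-to-high price range tick by tick:
poc_key = max(profile, key=profile.get)    # highest-volume level
print(poc_key * TICK, profile[poc_key])    # 1100.0 12
```

For basket scanning you would additionally maintain the point of control incrementally on each update instead of recomputing the max, but the dictionary-of-tick-counts layout is the core idea.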
  8. I'd say just get some tools and start 'playing'. You don't really need to know too much about exactly how neural networks work; the real skill (and art) comes in picking what inputs to feed in and what you look for in the output. (Predicting price, particularly a long way into the future, is about as tough a task as one could give a net.) Genetic algorithms are a much more straightforward beast; they are pretty good at determining suitable inputs or ranking outputs of nets, incidentally. A lot of the heartache is in pre- and post-processing the data. In fact, as far as I can tell that is the number one issue: normalising the inputs appropriately and trying to predict sensible things. I am not sure it's stuff you can learn from a book (though I would be first in line for a book on practical applications, tips & tricks); you just need to try and see what kinda 'works' and what does not. I have had a copy of NeuroShell Daytrader for years (I used to be a bit of a software junky). If I am honest I never got past the 'dabbling with intent' phase, though I learnt much more through that than anything I read. I must say I was impressed by the vast array of tools in a single package (some are pay-extra modules but the basic package has loads). It has a couple of types of NNs, genetic optimisation, a vast amount of maths stuff, regular TA stuff, plus various 'exotic' bits and pieces. Edit: unlike Ranger I prefer to buy an out and out license. If you want to place an order you learn how to use your broker's platform rather than learn the FIX protocol. Of course the broker's platform has a pretty specific task, so it is much easier to see that it 'works'. Does anyone have recommendations for resources that cover the practical application of these sorts of tools? I do have a few older tomes, though not sure any are worthy of an out and out recommendation.
  9. If you want 'low latency', particularly scanning hundreds or even thousands of symbols in real time, you might want to check out Neoticker. Not sure exactly what you mean by 'reliability'; it kind of implies you want something over and above something 'just working'. Your general tone ('pros respond', 'low latency', etc.) seems to suggest you want something a bit beyond standard retail offerings? Anyway, Neoticker is certainly worth checking out if you need near-instant results from a whole bunch of different symbols; you can also 'cascade' scans and apply basic logic based on different scan results to quickly thin out lists.
  10. Hmmm interesting. If that is the case then they must purposely drop ticks under certain conditions.
  11. <doh> You are quite correct, caching and translation lookaside buffers can present problems. I still don't think that is what is causing what I am seeing, but I will test again. Thanks for the heads up.
  12. Any news on this? It is a pretty fundamental issue that has dragged on for some while now. What's the ETA for a fix or has it slipped through the cracks? For example searching for "search" did not find this thread.
  13. Not so long ago search was not returning the results it should, not sure if it has been fixed?
  14. I am not sure that is a valid assumption. DTN prioritises completeness over timeliness; around news is when Zen might well drop packets, however it is likely to have the freshest prices for those couple of seconds. Not saying that is what happened, just that it may not be a safe assumption.
  15. Chad, enjoying the series, thanks. Seems like you have something pretty robust as the cornerstone. A couple of thoughts. Unless I have missed something, you are not using out of sample data for your performance results? (Maybe IRT automatically splits the data.) Normally you would optimise on months 1 & 2 and look at results on out of sample data, months 3 & 4. Maybe I have misunderstood. It would be good to see out of sample results at the end of each segment; it gives a far more realistic idea of what to expect in trading. It might add the complication that different parameters end up performing better out of sample! By introducing a second set of targets you are essentially running a separate system (well, the same system twice with different parameters). You are blending a 5 point stop / 5 point target system that has a greater % winners but lower total profit with a 5 point stop / 9.75 target system that is more profitable but with fewer winners. You might be able to achieve similar characteristics using a single set of intermediate parameters. Certainly as you mix more money management elements (3 targets, trailing stops, moving to break even) it becomes more difficult to have an intuitive grasp of what's going on. If you know that a certain set of parameters maximises return and another set maximises % winners, it's easier to keep a handle on things. I'd still be inclined to maintain them as 2 'forks'.
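The in-sample / out-of-sample split described here can be sketched in a few lines. The returns and the scoring function below are entirely made up; the point is the workflow, and that the in-sample optimum often looks worse on held-out data:

```python
# Months 1-2 (optimise here) and months 3-4 (report here) - fake data.
in_sample  = [0.5, -0.2, 0.8, 0.1, -0.4, 0.6]
out_sample = [0.3, -0.5, 0.2, 0.4, -0.1, 0.1]

def score(param, returns):
    # stand-in for a real backtest of one parameter setting
    return sum(returns) * param - 0.1 * param ** 2

# Pick the parameter using ONLY the in-sample segment...
best = max(range(1, 11), key=lambda p: score(p, in_sample))

# ...then report performance on the untouched out-of-sample segment.
print("chosen param:", best)
print("in-sample:", score(best, in_sample))
print("out-of-sample:", score(best, out_sample))
```

With these numbers the parameter chosen in-sample scores noticeably worse out-of-sample, which is exactly the "far more realistic idea of what to expect" the split is meant to give you.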
  16. Seems like it would be worth looking at the gap-not-closed days. If one could find some behaviour to filter some of these whilst not keeping you out of too many good trades, you might be on to something. Things to look at (random ideas) are perhaps time (x% of gaps close within y hours), internals (gaps don't close on a directional move with high $TICK or UVOL/DVOL), simple price action (gaps don't close after an NR day); there are loads, to be honest. You could look at other percentages (apart from 50%) but I guess you are getting into curve fitting. One thing that occurs to me: if price goes 10+ points against you before closing a 3 point gap, it might be rather uncomfortable to trade. You might want to take a look at MAE (maximum adverse excursion). Linked to this is gap-and-go days... when do you throw in the towel (stop out)? I would be inclined to look at the time factor beyond simply IB / not IB.
  17. In short: 1) Yes. 2) Yes, you must set the indicator to update every tick. 3) Yup, that line should limit the indicator to intraday charts; not really necessary I guess. Yes, it was code I hacked together, as I said up top. It needs a good tidy-up, comments etc. Cumulative delta has nothing to do with bar closes; it simply compares volume transacted @ bid with that transacted @ ask, print by print.
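The print-by-print comparison can be sketched as follows. This is a minimal illustration, not the poster's actual indicator code; the quote and trade values are hypothetical, and prints inside the spread are simply left unclassified here:

```python
# Cumulative delta: classify each print against the quote in force
# at trade time; at/above the ask adds, at/below the bid subtracts.
def cumulative_delta(prints):
    delta, series = 0, []
    for price, size, bid, ask in prints:
        if price >= ask:
            delta += size          # buyer lifted the offer
        elif price <= bid:
            delta -= size          # seller hit the bid
        series.append(delta)       # running total after each print
    return series

ticks = [(100.01, 5, 100.00, 100.01),   # at ask: +5
         (100.00, 3, 100.00, 100.01),   # at bid: -3
         (100.01, 2, 100.00, 100.01)]   # at ask: +2
print(cumulative_delta(ticks))  # [5, 2, 4]
```

Note that nothing here references bar boundaries; bars only come in when you choose how to plot the running series.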
  18. It's always fallen under 'quant'. Just a fancy new word for what people have been doing for aeons: modelling and managing risk across their portfolios. Maybe there is a bit more emphasis on modelling nowadays. As to algo trading, the retail trader has never had it so good. You can rent a virtual machine in a managed environment for peanuts and you have things like Strategy Runner available at brokers. Even if you (are mad enough to) want to run stuff on a PC in your home/office, infrastructure has never been so cheap. As to software, whilst most has 'features' that frustrate, there is remarkable stuff available to retail guys, from neural nets and genetic algorithms to the same algorithmic toolboxes available to the great and the good in the commercial world. There is stuff for under a couple of thousand bucks that will take data from hundreds of instruments and run optimisations across a grid of networked computers, for example. This thread is all about back testing, not deploying run time systems, btw. Very different propositions with very different requirements, but both quite doable by those with appropriate knowledge. Take a look at Chad's videos for something pretty straightforward that looks like it is already developing into something quite robust. In short, I guess I disagree.
  19. A real simple solution for those that do not have the infrastructure to handle large amounts of order book data would be to simply flag each trade: @BB true/false, @BA true/false. That adds a measly 2 bits of information to each trade packet, pretty trivial. Of course that is not good enough for more intricate studies (like tracking pulled orders) but it would certainly be a decent compromise.
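The "two extra bits" idea is just two boolean flags packed into a byte alongside each trade record. A sketch (the flag names and layout are made up for illustration; no real feed defines this format):

```python
# Pack at-best-bid / at-best-ask flags into the low two bits of a byte.
AT_BEST_BID = 0b01   # trade printed at the best bid
AT_BEST_ASK = 0b10   # trade printed at the best ask

def pack_flags(at_bid, at_ask):
    return (AT_BEST_BID if at_bid else 0) | (AT_BEST_ASK if at_ask else 0)

flags = pack_flags(at_bid=False, at_ask=True)
print(bool(flags & AT_BEST_BID), bool(flags & AT_BEST_ASK))  # False True
```

Two bits per trade is enough to compute delta-style metrics downstream without ever shipping the book itself, which is the whole appeal of the compromise.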
  20. My point is they may be fine for non-cumulative bid/ask work too (e.g. relative work or delta oscillators where inaccuracies become stale quickly). Anyway, let's hope Zenfire make the necessary changes to get things back where they were a year or so ago; it's a shame there is not much real choice in the retail space. Hehe, I'd take those odds too :D though before people get too excited I should stress I was referring to accuracy in using delta to determine direction of order flow. Still a far better starting point than a whole bunch of other metrics - but still some way to go to incorporate that information into one's trading! Edit: enjoy your break btw.
  21. How on earth did you conclude that from what he said? Staggering. I think you must have missed the turn-off for Elite Trader.
  22. I think Underground Trader was post-SOES bandits; I may be wrong about that, it was all a long time ago. I have one of his earlier books (it has 'secrets' in the title... that should be a warning); nothing remarkable in it that I can remember. I'm somewhat sceptical about Mr Yu, to be honest.
  23. However, despite this, you were recommending it as such up until the recent CME reporting changes. It is certainly not "unusable for proper bid/ask data needs". You are assuming that everyone's needs are the same as yours. Sure, if you want to do cumulative work it is probably inappropriate and it would make sense to use something like DTN.IQ. However, if you are a scalper using relative V@B V@A, or if timeliness is of greater concern than completeness, then Zen is likely a better choice. Also consider that using 'delta' as a proxy for order flow/inventory is likely to be only around 80% accurate at best anyway.
  24. We are not compounding annually, are we? We are compounding daily. Maybe that is the source of your confusion. If you have an approach that trades a few times a day and averages a point or two a day, you will take 5k to 1 million+ in about a year without assuming undue risk. (Undue, obviously, is defined by the individual trader.) As another poster pointed out, perhaps you do not have an edge at all? If you are a day trader and not averaging a point or two a day, or you have a couple of good weeks followed by a couple of bad weeks, then what you think is an edge probably is not. You really are doing yourself a disservice to close your mind to what are not just possibilities but mathematical facts. I did a quick Google search for a decent position sizing calculator so you could slot in some figures and see for yourself. I came across this blog that is a good 'meta page' for a whole bunch of position sizing resources (links at the end). Position Sizing
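The daily-versus-annual compounding point is easy to check numerically. A sketch, for illustration only: the 2.2% figure is a hypothetical average daily return (with position size scaled up as equity grows), not a claim that any particular system achieves it.

```python
# Compound a starting stake at a fixed average daily return.
def compound(start, daily_rate, days):
    equity = start
    for _ in range(days):
        equity *= 1 + daily_rate
    return equity

# ~2.2% per day over ~250 trading days takes 5k past a million,
# whereas 2.2% compounded once a year would be unremarkable.
print(round(compound(5_000, 0.022, 250)))
print(round(compound(5_000, 0.022, 1)))
```

The same arithmetic also shows how sensitive the outcome is: dropping the daily figure to 2% lands well short of the million, which is why fixed-fraction position sizing calculators are worth playing with.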
  25. I'd be quite interested in what those are. Are they, as Tams surmises, to do with multi-threading and timing? Whilst they seem to have addressed those, I have still observed what I believe to be some issues even with MC restricted to a single core. I passed them (TSsupport) details of my observations with some sample code (written to highlight potential problems) and results, but I don't really have the energy to take it further. I am interested in what your guy found, as I wonder if it coincides with my suspicions.