
TheNegotiator

Massively Parallel Processing

Recommended Posts

Does anyone know CUDA or CUDA.net? Is anyone integrating it in any way? Are there any good alternatives to it? I think ATI Stream might be one.

 

but the market is a sequential event...


A single market in real time is a sequential event, yes. But what if you were interested in processing real-time data for all the stocks in the S&P 500, for example? How much quicker might you be able to run backtesting and optimisations if you are clever about how you use the programming? I know that CUDA is already used by various financial companies. Why should it be their edge, though, when all we have to do is pop a couple of GPUs in SLI into our own PCs? Not that you could compete at the same level, but you would be on the same playing field.
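A minimal sketch of that idea, assuming a hypothetical run_backtest function and a hand-picked list of symbols standing in for the full S&P 500: each symbol's backtest stays sequential, and the speed-up comes from running many symbols at once on a multi-core CPU (the same pattern would apply to a GPU).

from multiprocessing import Pool

def run_backtest(symbol):
    # placeholder for a normal, sequential single-symbol backtest:
    # load data for `symbol`, step through its bars in order, return stats
    return symbol, 0.0  # e.g. (symbol, net_profit)

if __name__ == "__main__":
    symbols = ["AAPL", "MSFT", "XOM"]  # stand-ins for the full S&P 500 list
    with Pool(processes=4) as pool:    # one worker per CPU core
        results = dict(pool.map(run_backtest, symbols))
    print(results)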

A single market in real time is a sequential event, yes. But what if you were interested in processing real-time data for all the stocks in the S&P 500? ....

 

a backtest is still a sequential event

 

you can use a genetic algorithm to search the optimisation space instead of brute-forcing every parameter combination,

that speeds up processing without added hardware.
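For what it's worth, a tiny sketch of that genetic-algorithm idea, assuming a made-up fitness function standing in for a real backtest score and two SMA-length parameters to search over:

import random

def fitness(params):
    fast, slow = params
    # stand-in objective; a real version would run a backtest with these
    # parameters and return, say, net profit or a Sharpe-like score
    return -(fast - 10) ** 2 - (slow - 50) ** 2

def mutate(params):
    fast, slow = params
    return (max(2, fast + random.randint(-3, 3)),
            max(5, slow + random.randint(-5, 5)))

population = [(random.randint(2, 30), random.randint(20, 200)) for _ in range(20)]
for generation in range(30):
    population.sort(key=fitness, reverse=True)
    parents = population[:5]                                   # keep the fittest
    population = parents + [mutate(random.choice(parents)) for _ in range(15)]

print("best parameters found:", max(population, key=fitness))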

 

 

p.s. CUDA is good for vector processing (look it up on Google)... fine for building 3D financial models, but not for sequential events like trading.

 

One famous charting package (which shall remain nameless) had to turn off multi-core CPU and multi-threaded processing because of problems with data integrity. That should give you an idea of how practical massively parallel processing is for this.


Hmm. Interesting. It seems not to be of huge benefit right now. If you were backtesting/optimising, could the time period you were analysing not be broken up into many constituent parts, tested separately and then recombined to give an output?

 

I am sure there are applications for this, perhaps at a later date as the technology evolves.

Hmm. Interesting. ... could the time period you were analysing not be broken up into many constituent parts, tested separately and then recombined to give an output? ....

 

the basic fact is.... you have to buy before you can exit long -- that is a sequential event.

You cannot chop up a sequential event and process the exit long before a buy is executed.

 

 

furthermore, most trading logic requires something like this:

if price is larger than 50 SMA ... then buy...

if number of loss trades today is smaller than "max_allowed_loss_trade"... then proceed...

 

 

that 50 SMA is a sequential event that must go back 50 bars... you cannot chop up this event for parallel processing.

 

the same goes for any conditional decision that looks back in time.

 

 

Hope these examples help.
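For readers who want to see those two rules written out, here is a bar-by-bar loop over made-up prices; the variable names and the toy exit rule are illustrative assumptions, but it shows the serial, look-back nature being described.

import numpy as np

prices = 100 + np.random.default_rng(0).normal(0, 1, 5_000).cumsum()  # fake closes
max_allowed_loss_trade = 3
loss_trades_today = 0
entry_price = None

for i in range(50, len(prices)):
    sma50 = prices[i - 50:i].mean()        # must look back 50 bars
    if entry_price is None:
        # "if price is larger than 50 SMA ... then buy..."
        # "if number of loss trades today is smaller than max_allowed_loss_trade..."
        if prices[i] > sma50 and loss_trades_today < max_allowed_loss_trade:
            entry_price = prices[i]
    else:
        if prices[i] < sma50:              # toy exit: close back below the SMA
            if prices[i] < entry_price:
                loss_trades_today += 1
            entry_price = None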


Hello guys,

 

I was considering learning CUDA to make a backtester of my own faster, but the problem with GPGPU is that these technologies are greatly hindered by the RAM -> GPU bandwidth. Hence it is beneficial for backtests where a large chunk of data is moved from RAM onto the graphics card once, from where the GPU can perform the whole backtest. However, with today's cards having at most 3 GB of memory, you can't possibly hold more than about 1.5 years of tick data even for some less liquid instruments (say TF).
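As a rough sanity check of that memory argument (the 32 bytes per tick is an assumed record size, not a measured figure):

GPU_MEMORY_BYTES = 3 * 1024**3     # a 3 GB card, as above
BYTES_PER_TICK = 32                # assumed: timestamp + price + size + padding
ticks_that_fit = GPU_MEMORY_BYTES // BYTES_PER_TICK
print(f"roughly {ticks_that_fit:,} ticks fit on the card")   # about 100 million

At a few hundred thousand ticks a day, 100 million ticks is on the order of a year or two of data for a thinner contract, which is roughly consistent with the 1.5-year figure above.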

 

Backtesting is a sequential event, though not in the sense discussed here. The fact that the SMA50 needs to look back 50 periods is valid, but once you precompute the indicator for the whole investigated period, you can have each core investigate a different period of the market (searching for entries). The same can be applied to exits. However, it is the synchronization of entries and exits on patterns/SL/PT that cannot be done in parallel. So, in my view, GPGPU in backtesting is worth it only if you have enormously complicated entry patterns that take a lot of time to evaluate, where having 3000 cores evaluating entries at every tick is what makes the backtest fast. In any other setup, the power of GPGPU can't be used for quicker backtesting, in my view.
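A sketch of that split, under the same assumptions as above (made-up prices, a toy price-above-SMA entry rule, a fixed holding period standing in for real SL/PT logic): the indicator is precomputed once, entry candidates are scanned in parallel chunks, and only the final entry/exit pairing stays sequential.

import numpy as np
from multiprocessing import Pool

rng = np.random.default_rng(1)
prices = 100 + rng.normal(0, 1, 100_000).cumsum()

# 1) precompute the indicator once for the whole period (vectorized)
window = 50
sma = np.convolve(prices, np.ones(window) / window, mode="valid")
px = prices[window - 1:]                    # prices aligned with the SMA values

# 2) scan chunks of the period for entry candidates in parallel
def find_entries(bounds):
    lo, hi = bounds
    return (np.nonzero(px[lo:hi] > sma[lo:hi])[0] + lo).tolist()

if __name__ == "__main__":
    n = len(px)
    chunks = [(i, min(i + 25_000, n)) for i in range(0, n, 25_000)]
    with Pool(4) as pool:
        candidates = sorted(sum(pool.map(find_entries, chunks), []))

    # 3) the part that has to stay sequential: walk candidates in order and
    #    only open a new trade once the previous one has exited
    hold, trades, open_until = 20, [], -1    # toy exit: flat after `hold` bars
    for i in candidates:
        if i > open_until:
            trades.append((i, i + hold))
            open_until = i + hold
    print(len(candidates), "entry candidates,", len(trades), "non-overlapping trades")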

Hello guys,

.... However, it is the synchronization of entries and exits on patterns/SL/PT that cannot be done in parallel. ....

 

.......... LOL ..........

 

there is a genius in every clown.

the basic fact is.... you have to buy before you can exit long -- that is a sequential event. .... you cannot chop up this event for parallel processing. ....

 

Unless you are executing the same process for many tickers ;-) From a simple view of the world and our actions in it - yes, it is not really possible to do stuff in parallel. But when you look a bit more closely at what is really going on - you'll find a lot of opportunities for parallel computing ;)

 

Anyway, back to the topic - CUDA usually means heavy C programming. If you are able to do that - then first try writing a backtest for your data on your CPU in C... maybe you will find the performance sufficient... like me, I was planning to do the same. Then I found that by using optimized libraries and 4 CPU cores my speed up was :haha: 250x
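As a rough illustration of where that kind of library speed-up comes from (the exact ratio depends entirely on the hardware and the code; the 250x above is the poster's own figure), here is the same price-above-SMA count done in a plain Python loop and as NumPy array operations:

import time
import numpy as np

prices = 100 + np.random.default_rng(2).normal(0, 1, 1_000_000).cumsum()
sma = np.convolve(prices, np.ones(50) / 50, mode="valid")
px = prices[49:]

t0 = time.perf_counter()
count_loop = sum(1 for p, m in zip(px, sma) if p > m)   # interpreted, element by element
t1 = time.perf_counter()
count_vec = int(np.count_nonzero(px > sma))             # vectorized, whole array at once
t2 = time.perf_counter()

assert count_loop == count_vec
print(f"loop: {t1 - t0:.3f}s   vectorized: {t2 - t1:.3f}s")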

...Then I found that by using optimized libraries and 4 CPU cores my speed up was :haha: 250x

 

Hello andro, just out of curiosity, what libraries do you use?

Hello andro, just out of curiosity, what libraries do you use?

 

Python and NumPy; for code overfilled with "if"s, Cython.

 

And just to be precise, I'm talking about code "vectorization" and "parallelization". But the same applies to CUDA.
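A small sketch of what "vectorizing the ifs" can look like in practice, again with made-up data and the same toy rules as above: the per-bar branches become boolean masks that NumPy evaluates over the whole array.

import numpy as np

prices = 100 + np.random.default_rng(3).normal(0, 1, 10_000).cumsum()
sma = np.convolve(prices, np.ones(50) / 50, mode="valid")
px = prices[49:]

# loop version would be:  signal[i] = 1 if px[i] > sma[i] else 0
signal = np.where(px > sma, 1, 0)

# conditions combine as boolean algebra instead of nested ifs
loss_trades_today = 2
max_allowed_loss_trade = 3
tradable = (px > sma) & (loss_trades_today < max_allowed_loss_trade)

print(int(signal.sum()), "bars above the 50 SMA,", int(tradable.sum()), "of them tradable")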
