I haven't read the book. From the description on Amazon, it seems the investor/mathematician let his emotions get in the way of his investment decisions, which led to him losing money. I think that's a risk we run by providing that human touch, which sometimes may even help us if the software is making stupid decisions.
My goal is to build a fully-automated system (no human interference) that makes "confident" decisions based on statistics and hard numbers.
Your emotional response is telling you that your algorithms won't be affected by emotion. Your emotional response has already caused you to prefer one course of action (algorithms) over another (humans picking stocks).
Your automated system will require human input. I lost a bunch of money in Jan - I saw a huge drawdown, discovered a fixable flaw in my strategy, panicked and closed my positions. Human input time - do I close positions and shut down?
Had I stuck with the flawed strategy, I'd have made (a small amount of) money.
Don't think that you don't need to manage your psychology simply because you are running an algorithm.
Yummyfajitas, that's exactly why I do not want human input. Also, I'm not trying to make money on every bet, but to make more money on my winning bets (net of fees) than I lose on my losing bets.
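The arithmetic behind that is just expectancy: you can win less than half the time and still come out ahead if the average win outweighs the average loss. A minimal sketch, with made-up numbers (win rate, win/loss sizes, and fees are all assumptions for illustration):

```python
# Hypothetical numbers: a strategy that loses more often than it wins,
# but whose winners are bigger than its losers after fees.
win_rate = 0.40        # assumed fraction of winning bets
avg_win = 150.0        # assumed average profit per winning bet, net of fees
avg_loss = 80.0        # assumed average loss per losing bet, net of fees

# Expected profit per bet. Positive expectancy means the strategy
# makes money on average despite a sub-50% hit rate.
expectancy = win_rate * avg_win - (1 - win_rate) * avg_loss
print(expectancy)  # 0.4*150 - 0.6*80 = 12.0 per bet
```

With these numbers, 60% of bets lose, yet each bet is worth +12 on average - which is why judging the system on any single trade (the thing a panicking human tends to do) is misleading.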
What I'm trying to convey is that you do have human input even for a pure algo strategy. If you acknowledge it exists, attempt to minimize it and manage what remains, you are far less likely to have it cause you problems.
You still have to monitor and read the news, because even if your strategy is event-driven, things can happen that will negate your results.
Example: Elon Musk talking too much before a financial report, or the feud between the Chinese government and Alibaba.
We still have to monitor the news for this kind of problem. Even if you automate everything, you can't prevent this kind of thing. And if you do event-driven trading, you will have to think about this.
http://www.amazon.com/gp/aw/d/0465054811/ref=mp_s_a_1_sc_1?q...