Friday, October 14, 2011

Free auditing of Stanford AI and Machine Learning Courses w/Peter Norvig

Just wanted to notify readers of a few great courses being offered free for auditing and/or participation by well-known industry experts: Peter Norvig, co-author of the classic AI text 'Artificial Intelligence: A Modern Approach,' and Prof. Andrew Ng.

http://www.ai-class.com/
see also,
http://www.sfgate.com/cgi-bin/article.cgi?f=/c/a/2011/10/14/BUFR1LH9JR.DTL

The notice is a bit late, but they are still accepting registrations.

3 comments:

  1. IT,

    I'm new to machine learning, but I've found that GAs and ANNs seem to overfit non-recurring noise. I'm a lot more comfortable with something like the Kalman filter because it has far fewer free parameters.

    I came across a paper that did, however, successfully apply ANNs to FX trading. Do you think that the Kalman filter could be used in place of the ANN described in the paper?

    http://www.jbs.cam.ac.uk/research/working_papers/2004/wp0418.pdf

    ReplyDelete
  2. Hi Jeff,
    I haven't had a chance to read the paper yet, but I'll make a quick comment on your question. As I've observed earlier, a smoother function is easier to extrapolate reliably because it contains less noise. One advantage of the Kalman filter is that its estimate is close to a smooth function. The same, however, can apply to GAs and ANNs, BUT the inputs need to be smoothed before being fed to these learners, rather than passing in very noisy signals.

    IT

    ReplyDelete
  3. One other point I didn't address is whether the KF can be substituted for the ANN or GA in the paper. Glancing over it, I didn't see any direct references to an ANN, but if the outputs are nonlinear, a KF may not be a suitable substitute, since it assumes linear processes.

    ReplyDelete
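
To make the point in comment 2 concrete, here is a minimal sketch of the kind of smoothing a 1-D Kalman filter provides. This is not the method from the linked paper; it's a standard scalar filter assuming a random-walk state model (x_k = x_{k-1} + w) with noisy measurements (z_k = x_k + v). The noise variances `q` and `r` and the demo signal are illustrative assumptions.

```python
import random

def kalman_filter(measurements, q=1e-5, r=0.09, x0=0.0, p0=1.0):
    """Scalar Kalman filter for a random-walk state.

    q: process noise variance (assumed), r: measurement noise variance
    (assumed), x0/p0: initial state estimate and its variance.
    Returns the filtered estimates, one per measurement.
    """
    x, p = x0, p0
    estimates = []
    for z in measurements:
        # Predict: state carries over; uncertainty grows by q.
        p = p + q
        # Update: blend prediction and measurement by the Kalman gain.
        k = p / (p + r)
        x = x + k * (z - x)
        p = (1.0 - k) * p
        estimates.append(x)
    return estimates

# Demo: noisy measurements of a constant true value of 1.0.
random.seed(0)
zs = [1.0 + random.gauss(0.0, 0.3) for _ in range(200)]
est = kalman_filter(zs)
```

The small steady-state gain is what produces the near-smooth estimate mentioned above: each new, noisy measurement only nudges the state slightly, so the output track is far less jagged than the raw input.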