Wednesday, July 24, 2013

Machine Learning in JavaScript

I did a presentation at Ottawa JavaScript on machine learning, which covered a lot of the material in my two recent posts on Bayesian classifiers. This was new to a lot of the audience, so I made slides to step through a very simple example.

Using the "box of chocolates" analogy, the slides demonstrate how to predict if a chocolate contains nuts, depending on its colour and shape.

You can view the slides here:

http://darrenderidder.github.io/talks/MachineLearning
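For anyone who can't click through, here's a minimal sketch of the idea in JavaScript. The training data, feature values, and function names below are hypothetical illustrations, not taken from the slides:

```javascript
// Hypothetical training data: each chocolate has a colour, a shape,
// and a label saying whether it contains nuts.
const chocolates = [
  { colour: "brown", shape: "round",  nuts: true  },
  { colour: "brown", shape: "square", nuts: true  },
  { colour: "white", shape: "round",  nuts: false },
  { colour: "white", shape: "square", nuts: false },
  { colour: "brown", shape: "round",  nuts: true  },
  { colour: "white", shape: "round",  nuts: true  },
];

// P(label) estimated from the training data.
function prior(data, label) {
  return data.filter(c => c.nuts === label).length / data.length;
}

// P(feature = value | label) estimated from the training data.
function likelihood(data, feature, value, label) {
  const subset = data.filter(c => c.nuts === label);
  return subset.filter(c => c[feature] === value).length / subset.length;
}

// Naive Bayes score: prior times the product of per-feature likelihoods.
// (Unnormalised; comparing the scores for the two labels is enough.)
function score(data, colour, shape, label) {
  return prior(data, label) *
         likelihood(data, "colour", colour, label) *
         likelihood(data, "shape", shape, label);
}

function hasNuts(data, colour, shape) {
  return score(data, colour, shape, true) > score(data, colour, shape, false);
}
```

With the made-up data above, a brown round chocolate scores higher under "nuts" and a white square one scores higher under "no nuts".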

2 comments:

camspiers said...

Hi Darren,

The slide that shows Bayes' Theorem as the following is in error:

P(B | A)P(A) / (P(B | A)P(A) + (1 − P(B | A))(1 − P(A)))

since P(B | not A) isn't equal to 1 − P(B | A).

The degree to which evidence is expected on one hypothesis P(B|A) isn't necessarily correlated with the degree to which the evidence is expected on another hypothesis P(B|not A).

For example, a manufacturer makes dice with either (1, 2, 3, 4, 5, 6) or (1, 2, 2, 4, 5, 6) on their faces. Call them type 1 and type 2 respectively.

We choose a die from a bag with a 50% chance of getting either type.

The question is: what is P(roll a 2 | type 1) and what is P(roll a 2 | type 2)? The answers are 1/6 and 1/3 respectively.
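The point can be checked numerically. Here's a small sketch of the full form of Bayes' theorem applied to the dice, where the likelihood under "not A" is supplied separately rather than assumed to be 1 − P(B | A):

```javascript
// Full Bayes' theorem:
// P(A|B) = P(B|A)P(A) / (P(B|A)P(A) + P(B|not A)P(not A))
function posterior(pBgivenA, pA, pBgivenNotA) {
  const pNotA = 1 - pA;
  return (pBgivenA * pA) / (pBgivenA * pA + pBgivenNotA * pNotA);
}

// The dice example: P(roll a 2 | type 1) = 1/6, P(roll a 2 | type 2) = 1/3,
// and each type is drawn with probability 1/2.
const pType1givenRolled2 = posterior(1 / 6, 1 / 2, 1 / 3); // 1/3
```

Substituting the erroneous 1 − P(B | A) = 5/6 in place of P(B | not A) = 1/3 would give 1/6 instead of the correct 1/3, which is exactly the discrepancy being pointed out.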

Regards

Darren said...

Hi Cam,
I should update that slide. Those three dots "..." are meant to indicate that there are some assumptions and calculations leading to a naive Bayesian classifier of the sort used for spam filtering. The title doesn't really make that clear.
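For concreteness, a spam-filter-style classifier of that sort typically combines per-word probabilities under a naive independence assumption. This sketch uses the classic Graham-style combination formula, which is an assumption on my part and not necessarily what the slides derive:

```javascript
// Sketch (assumed formula, not taken from the slides): given
// p_i = P(spam | word_i) for each word, combine them as
// p = prod(p_i) / (prod(p_i) + prod(1 - p_i)),
// which follows from Bayes' theorem with independent words
// and equal priors.
function combinedSpamProbability(wordProbs) {
  const prod = wordProbs.reduce((acc, p) => acc * p, 1);
  const inv  = wordProbs.reduce((acc, p) => acc * (1 - p), 1);
  return prod / (prod + inv);
}
```

Two mildly spammy words reinforce each other: combining 0.9 and 0.9 gives roughly 0.988, while a single neutral word at 0.5 stays at 0.5.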