
I would like to get a different solution to a problem I have solved "symbolically" and through a little simulation. Now I would like to know how I could have done the integration directly in Mathematica.

Please consider a target represented by a disk with r = 1, centered at (0, 0). I want to simulate my probability of hitting this target when throwing darts.

Now, I have no bias when throwing them; that is, on average I hit the center (mu = 0), but my variance is 1.

Considering the coordinates of my dart as it hits the target (or the wall :-) ), I have the following distributions, two Gaussians:

XDistribution : 1/Sqrt[2 \[Pi]\[Sigma]^2] E^(-x^2/(2 \[Sigma]^2))

YDistribution : 1/Sqrt[2 \[Pi]\[Sigma]^2] E^(-y^2/(2 \[Sigma]^2))

With those two distributions centered at 0 with equal variance 1, my joint distribution becomes a bivariate Gaussian:

1/(2 \[Pi]\[Sigma]^2) E^(-((x^2 + y^2)/(2 \[Sigma]^2)))

So I need to know my probability of hitting the target, i.e. the probability that x^2 + y^2 is less than 1.

An integration after a transformation to polar coordinates first gave me my solution: ≈ 0.39. Simulation confirmed it using:

Total@ParallelTable[
   If[EuclideanDistance[{
        RandomVariate[NormalDistribution[0, 1]],
        RandomVariate[NormalDistribution[0, 1]]}, {0, 0}] < 1,
    1, 0], {1000000}]/1000000
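
For completeness, here is a sketch of the polar-coordinate integration I did by hand, translated into Mathematica (the extra factor r is the Jacobian of the change of variables; with \[Sigma] = 1 the exact value should come out as 1 - E^(-1/2) ≈ 0.3935):

(* joint density in polar coordinates, \[Sigma] = 1; r is the Jacobian *)
Integrate[1/(2 \[Pi]) E^(-r^2/2) r, {\[Theta], 0, 2 \[Pi]}, {r, 0, 1}]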

I feel there are more elegant ways to solve this problem using Mathematica's integration capabilities, but I never got either approach to work.


2 Answers


There are actually several ways you can do this.

The simplest would be to use NIntegrate:

JointDistribution = 1/(2 \[Pi] \[Sigma]^2) E^(-((x^2 + y^2)/(2 \[Sigma]^2)));
NIntegrate[JointDistribution /. \[Sigma] -> 1, {y, -1, 1},
    {x, -Sqrt[1 - y^2], Sqrt[1 - y^2]}] // Timing

Out[1]= {0.009625, 0.393469}

This is another way to do it empirically (similar to your example above), but a lot slower than using NIntegrate:

(EuclideanDistance[#, {0, 0}] <= 1 & /@ # // Boole // Total)/
     Length@# &@RandomVariate[NormalDistribution[0, 1], {10^6, 2}] // 
  N // Timing

Out[2]= {5.03216, 0.39281}
abcd

The built-in function NProbability is also fast:

NProbability[ x^2 + y^2 <= 1, {x, y} \[Distributed] 
BinormalDistribution[{0, 0}, {1, 1}, 0]] // Timing

or

NProbability[x^2 + y^2 <= 1, x \[Distributed] 
NormalDistribution[0, 1] && y \[Distributed] 
NormalDistribution[0, 1] ] // Timing

both give {0.031, 0.393469}.

Since the sum of squares of n standard normals is distributed as ChiSquareDistribution[n], you get a more streamlined expression: NProbability[z < 1, z \[Distributed] ChiSquareDistribution[2]], where z = x^2 + y^2 and x and y are each distributed NormalDistribution[0, 1]. The timing is the same as above: {0.031, 0.393469}.
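
Since ChiSquareDistribution[2] is just an exponential distribution with mean 2, the exact closed form can also be read off its CDF directly (a sketch; this should reduce to 1 - E^(-1/2) ≈ 0.393469):

(* exact probability that x^2 + y^2 <= 1 *)
CDF[ChiSquareDistribution[2], 1] // FullSimplify
N[%]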

EDIT: Timings are for a Vista 64-bit Core2 Duo T9600 2.80 GHz machine with 8 GB memory (MMA 8.0.4). Yoda's solution on this machine has timing {0.031, 0.393469}.

EDIT 2: Simulation using ChiSquareDistribution[2] can be done as follows:

(data = RandomVariate[ChiSquareDistribution[2], 10^5]; 
  Probability[w <= 1, w \[Distributed] data] // N) // Timing

yields {0.031, 0.3946}.
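
Equivalently (a sketch, assuming EmpiricalDistribution, built in since version 8), the empirical probability is just the value of the sample's empirical CDF at 1:

(* fraction of the sample at or below 1; fluctuates with the sample *)
data = RandomVariate[ChiSquareDistribution[2], 10^5];
CDF[EmpiricalDistribution[data], 1] // N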

EDIT 3: More details on timings:

For

First@Transpose@Table[Timing@
  NProbability[x^2 + y^2 <= 1, {x, y} \[Distributed] 
  BinormalDistribution[{0, 0}, {1, 1}, 0]], {10}]

I get {0.047, 0.031, 0.031, 0.031, 0.031, 0.016, 0.016, 0.031, 0.015, 0.016}

For

First@Transpose@Table[Timing@
NProbability[x^2 + y^2 <= 1, 
 x \[Distributed] NormalDistribution[0, 1] && 
  y \[Distributed] NormalDistribution[0, 1] ], {10}]

I get {0.047, 0.031, 0.032, 0.031, 0.031, 0.016, 0.031, 0.015, 0.016, 0.031}.

For

First@Transpose@Table[Timing@
NProbability[z < 1, 
 z \[Distributed] ChiSquareDistribution[2]], {10}]

I get {0.047, 0.015, 0.016, 0.016, 0.031, 0.015, 0.016, 0.016, 0.015, 0.}.

For yoda's solution

First@Transpose@Table[Timing@(JointDistribution = 
  1/(2 \[Pi] \[Sigma]^2) E^(-((x^2 + y^2)/(2 \[Sigma]^2))); 
 NIntegrate[
  JointDistribution /. \[Sigma] -> 1, {y, -1, 
   1}, {x, -Sqrt[1 - y^2], Sqrt[1 - y^2]}]), {10}]

I get {0.031, 0.032, 0.015, 0., 0.016, 0., 0.015, 0.016, 0.016, 0.}.

For the empirical estimation

First@Transpose@Table[Timing@(Probability[w <= 1, 
 w \[Distributed] data] // N), {10}]

I got {0.031, 0.016, 0.016, 0., 0.015, 0.016, 0.015, 0., 0.016, 0.016}.

kglr
  • I find it very suspicious that your timings are _exactly_ the same for all three of your solutions _and_ mine... I certainly get very different timings – abcd Dec 20 '11 at 05:20
  • @yoda, curious isn't it? I was about to ask you if you could run the above code on your machine. – kglr Dec 20 '11 at 05:41
  • These are the timings I get for each of your three methods (in the order you've listed) and mine (last): `{0.035673, 0.022273, 0.097494, 0.009067}` – abcd Dec 20 '11 at 05:46
  • @yoda, thank you. I see a lot of variation (and some cycling with several numbers recurring) in my timings. Obviously something non-random in my environment? – kglr Dec 20 '11 at 06:13
  • BTW, I tried the same timings with a fresh kernel session and 100 repetitions, and I get a similar cycling pattern. – kglr Dec 20 '11 at 06:26
  • Hmm... I have no idea why! Thanks though :) – abcd Dec 20 '11 at 06:28