I am coding a simulation of a globular cluster in Python, starting with equal-mass particles at zero initial velocity in a random spherical distribution. I want to track the root-mean-square radius R_rms of the system over time and measure how long it takes for R_rms to stop contracting and settle into an equilibrium state. However, close encounters fling some stars out of the system at very high velocities, and these escapers completely dominate my R_rms calculation. What is the best way to exclude these particles from the calculation?
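For reference, this is roughly how I compute R_rms. Measuring it about the (equal-mass) center of mass rather than the origin is an assumption on my part, made so that bulk drift of the cluster does not inflate the value:

```python
import numpy as np

def r_rms(positions, center=None):
    """Root-mean-square radius of the particle set.

    positions: (N, 3) array of particle coordinates.
    center: reference point; defaults to the mean position,
    which is the center of mass for equal masses.
    """
    if center is None:
        center = positions.mean(axis=0)  # equal-mass center of mass
    r2 = np.sum((positions - center) ** 2, axis=1)
    return np.sqrt(r2.mean())
```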
I have considered two criteria for excluding a particle:
- Its distance from the origin exceeds the original sphere of generation. The problem is that the cluster as a whole can drift until it lies entirely outside the generation sphere.
- Its velocity exceeds the escape velocity of the system. This seems the most reasonable, but I worry about discarding a particle that momentarily exceeds the escape velocity yet has a later encounter that keeps it bound to the system.
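A sketch of the second criterion, phrased as an energy cut: a particle moving faster than the local escape velocity is exactly a particle with non-negative total specific energy, so I could flag particles with kinetic plus potential energy >= 0 at each snapshot. The G=1 units and the softening length `eps` here are assumptions standing in for whatever my force calculation uses:

```python
import numpy as np

def bound_mask(pos, vel, mass, G=1.0, eps=0.0):
    """Boolean mask of particles with negative total specific energy.

    v > v_esc at a particle's location is equivalent to E >= 0,
    so this is the escape-velocity test written as an energy cut.
    eps is a (hypothetical) softening length matching the integrator.
    """
    # pairwise separations (O(N^2); fine for cluster-sized N)
    dr = pos[:, None, :] - pos[None, :, :]
    dist = np.sqrt(np.sum(dr ** 2, axis=-1) + eps ** 2)
    np.fill_diagonal(dist, np.inf)  # exclude self-interaction
    # potential energy per unit mass at each particle
    phi = -G * np.sum(mass[None, :] / dist, axis=1)
    # kinetic energy per unit mass in the center-of-mass frame,
    # so bulk drift of the cluster does not bias the cut
    v_com = np.average(vel, axis=0, weights=mass)
    ke = 0.5 * np.sum((vel - v_com) ** 2, axis=1)
    return ke + phi < 0.0
```

Applying the mask per snapshot (`r_rms` over `pos[bound_mask(pos, vel, mass)]`) would let a temporarily fast particle re-enter the statistics if a later encounter rebinds it, which may address my concern above.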
Here is a plot of the particle positions over time, in which several particles can be seen being flung out.