If I generate n random numbers in the interval [0,1], the mean will be around 0.5 and they will be uniformly distributed. What could an algorithm or formula look like that gives me n random numbers still in the interval [0,1], but with a mean of, e.g., 0.6? They should still be distributed as uniformly as possible, just with numbers greater than 0.5 occurring a bit more frequently.
So far I have only found solutions that assume a different distribution. For example, with a normal distribution it would be quite easy to get numbers around the desired mean, but then numbers much larger or much smaller than the mean become far less frequent, and I'd like to avoid that.
The programming language does not really matter; I am currently trying to do this in R, however.
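For concreteness, here is a sketch of one candidate approach, my own assumption rather than anything established in the question: keep the density as close to flat as possible by tilting it linearly, f(x) = 1 + c(2x - 1) on [0,1], which has mean 0.5 + c/6 and stays nonnegative for |c| <= 1 (so target means in [1/3, 2/3] are reachable). Sampling is done by inverting the CDF. The question says the language doesn't matter, so this is Python; the function name `tilted_uniform` is hypothetical.

```python
import math
import random

def tilted_uniform(n, target_mean=0.6):
    """Draw n samples from a linearly tilted density on [0, 1].

    Density: f(x) = 1 + c*(2x - 1), nonnegative for |c| <= 1,
    with mean 0.5 + c/6, so target means in [1/3, 2/3] are reachable.
    """
    c = 6.0 * (target_mean - 0.5)
    if abs(c) > 1.0:
        raise ValueError("target_mean must lie in [1/3, 2/3]")
    out = []
    for _ in range(n):
        u = random.random()
        if c == 0.0:
            out.append(u)  # plain uniform when the target mean is 0.5
        else:
            # Invert the CDF F(x) = c*x^2 + (1 - c)*x via the quadratic formula.
            out.append((-(1.0 - c) + math.sqrt((1.0 - c) ** 2 + 4.0 * c * u))
                       / (2.0 * c))
    return out
```

For a target mean of 0.6 this gives c = 0.6, i.e. density 0.4 at x = 0 rising linearly to 1.6 at x = 1: every subinterval of [0,1] still gets sampled, just with values above 0.5 somewhat more often.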