
I have an SVN repository that I use for storing both code and data files (binary & text). I have noticed slower performance as the repository has grown. I would like to understand better what contributes to this.

Obviously performance depends on the speed at which new data is transferred. I'm interested in performance independent of this (e.g. the time to execute SVN Update when no files have been changed).

To what extent is this kind of performance affected by (1) the number of files in the repository? (2) the size of files in the repository?

Both will slow things down, but I'm wondering whether one or the other is significantly more important.

  • An additional question I was asking myself is: since subversion only stores the differences between two revisions of a file, will speed be affected by the number of revisions of a file? – M4N Jul 03 '09 at 21:17
  • @Martin: No, the number of revisions a file consists of is not really important, as SVN uses a skip-delta scheme to avoid walking every revision – Peter Parker Jul 03 '09 at 21:36
  • I have noticed slowness in doing an update when the number of files is increased a lot. This is true even if there are no files to update. – Tim Jul 04 '09 at 01:58

1 Answer


If you have an FSFS repository, there should be no performance degradation for common operations, even after 10000+ revisions and gigabytes of data.
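The reason revision count barely matters is Subversion's skip-delta storage: each revision is stored as a delta against an earlier revision chosen so that reconstructing any revision applies only O(log n) deltas. Here is a minimal Python sketch of that idea; the base-selection rule used below (clear the lowest set bit of the revision number) is the textbook illustration of skip-deltas, not a literal transcript of FSFS internals:

```python
def skip_delta_base(rev: int) -> int:
    """Delta base for `rev`: clear the lowest set bit of the revision number."""
    return rev & (rev - 1)

def chain_length(rev: int) -> int:
    """Number of deltas applied to reconstruct `rev` starting from revision 0."""
    n = 0
    while rev > 0:
        rev = skip_delta_base(rev)
        n += 1
    return n

# Even at revision 10000, the delta chain is short:
print(chain_length(10_000))  # 5 (the popcount of 10000)
```

So a file with thousands of revisions still reconstructs in a handful of delta applications, which is why revision history alone doesn't slow reads down.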

RELATED: SVN performance after many revisions

More likely, you are seeing something else happening.

Remember: big working copies = lots of disk space = slower client-side operations.

Make sure you aren't checking out the whole repo when you only need a subset. Organize your stuff into subfolders, and only check out the necessary subfolders.
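A sparse checkout does exactly that; here is a sketch using Subversion's sparse-directory flags (the repository URL and folder names are placeholders, substitute your own):

```shell
# Hypothetical repository URL -- replace with yours.
REPO=https://svn.example.com/repo/trunk

# Check out only the top-level directory, with no children...
svn checkout --depth empty "$REPO" wc
cd wc

# ...then pull in just the subfolders you actually work on.
svn update --set-depth infinity code

# Subsequent updates crawl only what you checked out,
# so a huge data/ folder elsewhere in the repo costs you nothing.
svn update
```

This keeps the working copy (and the crawl `svn update` performs even when nothing changed) proportional to what you use, not to the whole repository.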

myron-semack