tl;dr: This is normal. Don't worry about it.
Run git gc if you like, but that will be run automatically anyway.
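If you want to see the numbers for yourself, git count-objects reports how many loose object files you have and how much is already packed. A quick illustrative check (the figures below are made up; yours will differ):

```
# Summarise the object database: loose objects vs. objects already in packfiles.
$ git count-objects -v
count: 1437        # loose object files under .git/objects
size: 6520         # disk space used by loose objects, in KiB
in-pack: 89204     # objects already stored in packfiles
packs: 3
```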
Many files is hard on the file system
No, only on certain types of file systems. Many files in a single directory can make lookups in that directory slow, particularly on file systems that store the contents of a directory as a linked list: to find one file they have to walk the whole list. This was a problem on FAT32 and ext2.
Modern file systems like NTFS (Windows), ext3 and ext4 (many Linuxes), and HFS+ (OS X) can efficiently handle large numbers of files in a directory by indexing directory entries with a B-tree variant.
Furthermore, Git was developed by kernel developers, and they know what they're doing. Git does not put its objects in a single directory; it breaks them up into subdirectories named after the first two characters of the object ID. Since object IDs are hashes, they will be evenly distributed over many directories.
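You can watch that fan-out happen by hashing a tiny blob by hand. The content here is arbitrary, and the ID shown is what a default SHA-1 repository produces for it; any content works the same way:

```
# Write a small blob into the object database and print its object ID.
$ echo 'hello' | git hash-object -w --stdin
ce013625030ba8dba906f756967f9e9ca394464a

# The file lands in a directory named after the first two characters of the ID.
$ ls .git/objects/ce/
013625030ba8dba906f756967f9e9ca394464a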
Finally, recent versions of Git will periodically reduce the number of individual object files by compressing them into packfiles.
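You can also trigger that packing by hand and see the result; the packfile name below is a placeholder, yours will differ:

```
# Repack loose objects (Git also does this automatically from time to time).
$ git gc

# Loose files are consolidated into a .pack (data) plus .idx (index) pair.
$ ls .git/objects/pack/
pack-<hash>.idx  pack-<hash>.pack
```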
even harder if you have to sync that folder
This implies you've put your Git repository onto a shared drive like Dropbox. Putting a Git repository on Dropbox is like disassembling a truck and mailing it to yourself in the post: it's slow, it's expensive, you're likely to lose pieces, and you could have just driven the truck. Dropbox can kill Git performance and corrupt the repository. Anything with slow seek times, like a network drive, is very bad for Git, which uses the filesystem as a simple object database.
Git is a distributed version control system. If you want to distribute your repository, use Git to do it; it's very efficient at that. You can keep your repo on Dropbox, but use git-remote-dropbox to do it safely. You can use an existing Git hosting service like GitHub or GitLab. Or you can put a bare repository somewhere you have ssh access to.
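For that last option, a minimal sketch, assuming you can ssh to a machine here called myserver.example (the host, paths, and branch name are placeholders; substitute your own):

```
# On the server: create a bare repository (an object database with no working tree).
$ ssh user@myserver.example 'git init --bare ~/repos/myproject.git'

# On your machine: add it as a remote and push over ssh.
$ git remote add origin user@myserver.example:repos/myproject.git
$ git push -u origin main
```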