Memory exhaustion in git garbage collection

Just as I was trying to garbage collect a big repository, I started seeing this problem:

# git gc
Counting objects: 2746, done.
Delta compression using up to 2 threads.
fatal: Out of memory, malloc failed
error: failed to run repack

Looking for a remedy, it seems I can limit the memory used by the repacking process with --window-memory.

# git repack -adf --window-memory=50m
Counting objects: 2746, done.
Delta compression using up to 2 threads.
Compressing objects: 100% (2672/2672), done.
Writing objects: 100% (2746/2746), done.
Total 2746 (delta 910), reused 0 (delta 0)
Removing duplicate objects: 100% (256/256), done.

I think I have to set --window-memory fairly low. It seems to cap the size of individual objects in the delta window, but the total memory depends on the number of objects in that window, which is controlled by --window and defaults to 10 (so a window memory of 50MB could still consume around 500MB). The default for --window-memory is 0, which means unlimited. That is probably fine if your repository only contains small files, but if it holds huge ones you will have to set this limit, or the repack will crash failing to allocate memory, unless you have plenty of RAM. Both limits can be passed explicitly, as in the example below.
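A minimal sketch of bounding both knobs at once; the exact values here (a window of 10 objects, 50MB of window memory, a single thread) are only guesses you would tune for your own repository and hardware, not settings from the original run:

# git repack -adf --window=10 --window-memory=50m --threads=1

Lowering --threads also helps, since each delta-compression thread keeps its own window in memory.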
If this problem is happening during garbage collection, then even after you have run the repack above, subsequent garbage collections will fail again unless you also persist the setting:

# git config pack.windowMemory 50m
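If memory is still tight, a couple of related pack settings can be persisted the same way; the values below are illustrative guesses for a memory-constrained machine, not taken from the original run:

# git config pack.packSizeLimit 100m
# git config pack.threads 1

After that, a plain git gc should pick these limits up from the repository configuration instead of running unbounded.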