Version 3 (modified by Sam Hocevar, 12 years ago): fast-import changes

git-bigfiles - Git for big files

git-bigfiles is our fork of Git. It has two goals:

  • make life bearable for people using Git on projects hosting very large files (hundreds of megabytes)
  • merge back as many changes as possible into upstream Git


git-bigfiles already features the following fixes:

  • config: add core.bigFileThreshold option to treat files larger than this value as big files
  • git-p4: fix abysmal performance when importing medium-sized files (operation is now up to 8000 times faster)
  • fast-import: do not compute deltas on big files and deflate them on-the-fly (the operation is now twice as fast and uses 3.7 times less memory)

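As a minimal sketch of the first fix above, the core.bigFileThreshold option is set with git config; files larger than the given size are then treated as big files. The 512m value and the temporary repository path below are illustrative assumptions, not project recommendations:

```shell
# Create a scratch repository to demonstrate the setting
repo=$(mktemp -d)
git init -q "$repo"

# 512m is an example threshold: files above it are treated as big files
git -C "$repo" config core.bigFileThreshold 512m

# Read the value back to confirm it was stored
git -C "$repo" config core.bigFileThreshold
```

The option can also be set globally with `git config --global core.bigFileThreshold <size>` so it applies to every repository for the current user.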

Clone the git-bigfiles repository:

git clone git://

If you already have a clone of upstream Git, you can save a lot of bandwidth by referencing it:

git clone git:// --reference /path/to/git/

The main Git repository is regularly merged into git-bigfiles. See the git-bigfiles repository on
