[[Image(git-bigfiles.png, right, nolink)]]

= git-bigfiles - Git for big files =

'''git-bigfiles''' is our fork of [http://git-scm.com/ Git]. It has two goals:

 * make life bearable for people using Git on projects hosting very large files (hundreds of megabytes)
 * merge back as many changes as possible into upstream Git

== Features ==

'''git-bigfiles''' already features the following fixes:

 * '''config''': add a `core.bigFileThreshold` option to treat files larger than this value as big files (a configuration sketch appears at the end of this page)
 * '''git-p4''': fix poor string-building performance when importing big files (git-p4 is now only marginally faster on Linux, but 4 to 10 times faster on Win32)
 * '''fast-import''': avoid computing deltas on big files and deflate them on the fly (fast-import is now twice as fast and uses 3.7 times less memory with big files; a stream sketch appears at the end of this page)

== Results ==

Git `fast-import` memory usage and running time when importing a repository with binary files up to 150 MiB:

[[Image(fast-import-memory.png, 200px)]]

== Development ==

git-bigfiles development is centralised in a `repo.or.cz` Git repository. Clone the repository:

{{{
git clone git://repo.or.cz/git/git-bigfiles.git
}}}

If you already have a working copy of upstream Git, you may save a lot of bandwidth by doing:

{{{
git clone git://repo.or.cz/git/git-bigfiles.git --reference /path/to/git/
}}}

The main Git repository is constantly merged into git-bigfiles.

See the [http://repo.or.cz/w/git/git-bigfiles.git git-bigfiles] repository on [http://repo.or.cz/ repo.or.cz].

== To do ==

The following commands are real memory hogs with big files:

 * `git diff-tree --stat --summary -M HEAD`
 * `git gc` (as called by `git merge`) fails with ''warning: suboptimal pack - out of memory''

== Readings ==

 * 2006/02/08: Linus on [http://kerneltrap.org/mailarchive/git/2006/2/8/200594 Handling large files with GIT]
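
== Example: setting the big-file threshold ==

A minimal sketch of using the `core.bigFileThreshold` option described above. The 100 MiB value is an illustration, not a project default, and assumes the option accepts the usual `k`/`m`/`g` size suffixes:

{{{
# Treat files larger than 100 MiB as big files in the current repository
# (illustrative value; pick a threshold that matches your project).
git config core.bigFileThreshold 100m

# Or set it once for every repository of the current user.
git config --global core.bigFileThreshold 100m
}}}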
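
== Example: feeding fast-import a big file ==

To exercise the `fast-import` changes described above, you can pipe it an import stream containing a large blob. The sketch below uses the standard `git fast-import` stream format with a tiny placeholder blob; the file name, identity, and contents are made up for the example, and with a genuinely large blob (and a matching byte count on its `data` line) the import takes the big-file path:

{{{
# Run inside an empty repository created with `git init`.
# Each `data <n>` line announces exactly n bytes of raw content.
git fast-import <<'EOF'
blob
mark :1
data 11
big content
commit refs/heads/master
mark :2
committer Example Author <author@example.com> 1234567890 +0000
data 22
Import one binary file
M 100644 :1 big-file.bin
EOF
}}}

Afterwards `git log master` should show the imported commit.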