Why does git fail on push/fetch with "Too many open files"?
There are two similar error messages:
EMFILE: Too many open files
ENFILE: Too many open files in system
It looks like you're getting EMFILE, which means that the number of open files for an individual process is being exceeded. So, checking whether vi can open files is irrelevant: vi will use its own, separate file table. Check your limits with:
$ ulimit -n
1024
So on my system, there is a limit of 1024 open files in a single process. You shouldn't need to ask your system administrator to raise the limit.
You may wish to check which files Git opens by running it under strace.
This could be a bug in Git or in a library, it could be that you're using an old version of something, or it could be something more bizarre. Try strace first to see which files Git opens, and check whether it closes them.
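As a sketch of that kind of investigation (Linux-only; the strace invocation and the traced subcommand are illustrative):

```shell
# Trace which files a git command opens and whether it closes them
# (requires strace; `git fetch` is a placeholder for the failing command):
#   strace -f -e trace=openat,close git fetch 2>&1 | grep openat
# You can also see how many descriptors a process currently holds by
# listing its /proc fd directory; here, the current shell itself:
ls /proc/self/fd | wc -l
```

Comparing that descriptor count against `ulimit -n` tells you how close a process is to its limit.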
Update from Hazok:
After following the recommendations above, it turned out the error was caused by too many loose objects, which had accumulated because git gc wasn't being run often enough.
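That diagnosis is easy to check: git count-objects -v reports the number of loose objects in its "count" field, and git gc packs them. A minimal demonstration in a scratch repository (the path and committer identity are placeholders):

```shell
set -e
cd "$(mktemp -d)"
git init -q
echo hello > file && git add file
git -c user.name=demo -c user.email=demo@example.com commit -qm "initial"
git count-objects -v | grep '^count:'   # loose objects before gc
git gc --quiet                          # packs the loose objects away
git count-objects -v | grep '^count:'   # count: 0 afterwards
```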
Hg Git Pull Causes Too Many Open Files Error
What does ulimit -n say on your Mac? This is the limit on the number of open files. Try running ulimit -n N for some N larger than the previous value, and run the hg command again.
If you have this problem more than just this once, you may want to put the ulimit -n N command in your ~/.bashrc so that it runs every time you log in.
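For the current session, that looks like the following (the soft limit can be raised at most to the hard limit, which ulimit -Hn shows):

```shell
ulimit -Sn                   # current soft limit on open files
ulimit -Hn                   # hard limit, the ceiling for -Sn
ulimit -Sn "$(ulimit -Hn)"   # raise the soft limit as far as allowed
ulimit -Sn                   # confirm the new value
```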
I get an error while pushing the project to the repo on Github
Pull the changes first with git pull, and then git push. It seems that the remote repository has commits that your local copy does not. Are other people working on the same project together with you?
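This rejected-push, pull-then-push cycle can be reproduced with two clones of one repository (all paths, file names, and identities below are illustrative):

```shell
set -e
cd "$(mktemp -d)"
git init -q --bare remote.git
git clone -q remote.git a
(cd a && echo one > f && git add f \
    && git -c user.name=a -c user.email=a@example.com commit -qm one \
    && git push -q origin HEAD)
git clone -q remote.git b
# The remote now gains a commit that clone b does not have:
(cd a && echo two > g && git add g \
    && git -c user.name=a -c user.email=a@example.com commit -qm two \
    && git push -q origin HEAD)
cd b
echo three > h && git add h
git -c user.name=b -c user.email=b@example.com commit -qm three
git push -q origin HEAD 2>/dev/null || echo "push rejected"   # non-fast-forward
git -c pull.rebase=false -c user.name=b -c user.email=b@example.com \
    pull -q --no-edit                                         # merge the remote commit
git push -q origin HEAD                                       # now accepted
```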
fatal: early EOF fatal: index-pack failed
First, turn off compression:
git config --global core.compression 0
Next, let's do a partial clone to truncate the amount of info coming down:
git clone --depth 1 <repo_URI>
When that works, go into the new directory and retrieve the rest of the clone:
git fetch --unshallow
or, alternately,
git fetch --depth=2147483647
Now, do a regular pull:
git pull --all
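The same sequence can be rehearsed against a local repository to see the shallow and unshallowed states (Git needs a file:// URL for a shallow clone of a local path; the repository and identity here are throwaway placeholders, and the compression setting is skipped because it only matters over a real network):

```shell
set -e
cd "$(mktemp -d)"
git init -q src
(cd src && for i in 1 2 3; do
    echo "$i" > f && git add f
    git -c user.name=demo -c user.email=demo@example.com commit -qm "commit $i"
done)
git clone -q --depth 1 "file://$PWD/src" dst
cd dst
git rev-list --count HEAD    # 1: shallow, only the newest commit
git fetch -q --unshallow     # retrieve the rest of the history
git rev-list --count HEAD    # 3: full history
```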
I think there is a glitch with msysgit in the 1.8.x versions that exacerbates these symptoms, so another option is to try with an earlier version of git (<= 1.8.3, I think).
Git push: fatal 'origin' does not appear to be a git repository - fatal Could not read from remote repository.
First, check that your origin is set by running
git remote -v
This should show you all of the push / fetch remotes for the project.
If this returns no output, skip to the last code block.
Verify remote name / address
If this returns showing that you have remotes set, check that the name of the remote matches the remote you are using in your commands.
$ git remote -v
myOrigin ssh://git@example.com:1234/myRepo.git (fetch)
myOrigin ssh://git@example.com:1234/myRepo.git (push)
# this will fail because `origin` is not set
$ git push origin main
# you need to use
$ git push myOrigin main
If you want to rename the remote or change the remote's URL, you'll want to first remove the old remote, and then add the correct one.
Remove the old remote
$ git remote remove myOrigin
Add missing remote
You can then add in the proper remote using
$ git remote add origin ssh://git@example.com:1234/myRepo.git
# this will now work as expected
$ git push origin main
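If you want to dry-run these commands, they behave the same in a scratch repository, since git remote only edits local configuration and the placeholder URL is never contacted until a fetch or push:

```shell
set -e
cd "$(mktemp -d)"
git init -q
git remote add myOrigin ssh://git@example.com:1234/myRepo.git
git remote remove myOrigin
git remote add origin ssh://git@example.com:1234/myRepo.git
git remote -v               # now shows only `origin`
git remote get-url origin   # prints the configured URL
```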
Not able to push a file larger than 100 MB to GitHub
Generally, no, as this is an intentional limitation of GitHub. The reason for this limitation is that Git (Git in general, not GitHub) stores every version of each file. Having multiple revisions of large files therefore bloats the repository and increases the clone and fetch times for other users of the repository (see GitHub Help - Working with large files).
The way GitHub recommends working with large files is using Git LFS (Large File Storage), generally explained as:
... an open-source extension to Git that allows you to work with large
files the same way as any other text file.
With Git Large File Storage, you and your repository's contributors
can clone large files from the Git command line, open pull requests,
and comment on the diffs. It's the ideal solution for pushing files to
GitHub that are larger than 100 MB.
(see GitHub documentation - versioning large files).
As a general recommendation (not necessarily related to GitHub), the best practice would be not to work with big files, and to consider other alternatives for your specific case.
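Before pushing, you can check whether any tracked file is near the limit. The snippet below is a hypothetical helper using throwaway files and a 2 MB threshold for demonstration (GitHub's actual limit is 100 MB):

```shell
set -e
cd "$(mktemp -d)"
git init -q
dd if=/dev/zero of=big.bin bs=1M count=2 2>/dev/null   # stand-in "large" file
echo small > small.txt
git add .
# List tracked files whose on-disk size meets the threshold (in MB):
limit_mb=2
git ls-files -z | xargs -0 du -m | awk -v l="$limit_mb" '$1 >= l {print $2}'
```

With the real 100 MB limit, set limit_mb=100 and run this from the repository root.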
Why does git say Pull is not possible because you have unmerged files?
What is currently happening is that you have a certain set of files which you tried merging earlier, but they threw up merge conflicts. Ideally, if one gets a merge conflict, one should resolve it manually and commit the changes using git add file.name && git commit -m "removed merge conflicts".
Now, another user has updated the files in question on their repository, and has pushed their changes to the common upstream repo.
It so happens that your merge conflicts from (probably) the last commit were not resolved, so your files are not merged properly, and hence the U (unmerged) flag for the files.
So now, when you do a git pull, Git throws the error, because you have some version of the file which is not correctly resolved. To resolve this, you will have to resolve the merge conflicts in question, and add and commit the changes, before you can do a git pull.
Sample reproduction and resolution of the issue:
# Note: commands below are in the format `CURRENT_WORKING_DIRECTORY $ command params`
Desktop $ cd test
First, let us create the repository structure
test $ mkdir repo && cd repo && git init && touch file && git add file && git commit -m "msg"
repo $ cd .. && git clone repo repo_clone && cd repo_clone
repo_clone $ echo "text2" >> file && git add file && git commit -m "msg" && cd ../repo
repo $ echo "text1" >> file && git add file && git commit -m "msg" && cd ../repo_clone
Now we are in repo_clone, and if you do a git pull, it will throw up conflicts:
repo_clone $ git pull origin master
remote: Counting objects: 5, done.
remote: Total 3 (delta 0), reused 0 (delta 0)
Unpacking objects: 100% (3/3), done.
From /home/anshulgoyal/Desktop/test/test/repo
* branch master -> FETCH_HEAD
24d5b2e..1a1aa70 master -> origin/master
Auto-merging file
CONFLICT (content): Merge conflict in file
Automatic merge failed; fix conflicts and then commit the result.
If we ignore the conflicts in the clone, and make more commits in the original repo now,
repo_clone $ cd ../repo
repo $ echo "text1" >> file && git add file && git commit -m "msg" && cd ../repo_clone
And then we do a git pull, we get:
repo_clone $ git pull
U file
Pull is not possible because you have unmerged files.
Please, fix them up in the work tree, and then use 'git add/rm <file>'
as appropriate to mark resolution, or use 'git commit -a'.
Note that the file is now in an unmerged state, and if we do a git status, we can clearly see the same:
repo_clone $ git status
On branch master
Your branch and 'origin/master' have diverged,
and have 1 and 1 different commit each, respectively.
(use "git pull" to merge the remote branch into yours)
You have unmerged paths.
(fix conflicts and run "git commit")
Unmerged paths:
(use "git add <file>..." to mark resolution)
both modified: file
So, to resolve this, we first need to resolve the merge conflict we ignored earlier
repo_clone $ vi file
and set its contents to
text2
text1
text1
and then add it and commit the changes
repo_clone $ git add file && git commit -m "resolved merge conflicts"
[master 39c3ba1] resolved merge conflicts