Linux Randomly Deleted My File While Compiling What Do I Do

Linux randomly deleted my file while compiling what do I do?

Your problem is here: `-o prog3.c`. gcc's `-o` option tells gcc which name to give the output file it generates. So here, you're asking your compiler to replace your `prog3.c` source file with the executable. What you presumably meant was something like `gcc prog3.c -o prog3`, which compiles `prog3.c` and names the executable `prog3`. Sadly, your code is gone...

Is it possible to recover an overwritten file after compiling with the `-o` flag but a forgotten argument?

Your file was overwritten in place, not unlinked, so file-undelete tools will not help. Unless you have a backup, version control, or an editor backup/swap file, the source can't be restored.

GDB is running a deleted executable file

Once a program, library, etc. is loaded into memory, deleting its file on disk does not stop it from running: unlinking only removes the directory entry, and the data stays available as long as the running process still has it mapped. Once the program is closed and restarted, you will be running the new program.

One case where this is a problem: the times I've deleted a library or executable that my Linux box needed in order to run. Everything keeps working until I restart the computer, at which point I soon realize what I messed up ...

Fun times.


By way of example,

/*
Compile me.
Run the executable.
While the program is running, delete the executable/binary.
The program will continue printing the message.
*/

#include <stdio.h>
#include <unistd.h>

int main(void) {
    while (1) {
        printf("I'm still running.\n");
        sleep(2);
    }

    return 0;
}

To answer your question, close the currently running gdb session. Then start a new one, using your new executable.


PS: If you started it before you deleted it, you're still running it (where "it" is the old program).

Delete files while reading directory with readdir()

Quote from POSIX readdir:

If a file is removed from or added to the directory after the most recent call to opendir() or rewinddir(), whether a subsequent call to readdir() returns an entry for that file is unspecified.

So, my guess is ... it depends.

It depends on the OS, on the time of day, on the relative order of the files added/deleted, ...

And, as a further point: between the time readdir() returns and the time you call unlink(), some other process could already have deleted that file, and your unlink() will fail.


Edit

I tested with this program:

#include <dirent.h>
#include <stdio.h>
#include <stdlib.h>
#include <string.h>
#include <sys/types.h>
#include <unistd.h>

int main(void) {
    struct dirent *de;
    DIR *dd;

    /* create files `one.zip` and `one.log` before entering the readdir() loop */
    printf("creating `one.log` and `one.zip`\n");
    system("touch one.log"); /* assume it worked */
    system("touch one.zip"); /* assume it worked */

    dd = opendir("."); /* assume it worked */
    while ((de = readdir(dd)) != NULL) {
        printf("found %s\n", de->d_name);
        if (strstr(de->d_name, ".zip")) {
            char logname[1200];
            size_t i;
            if (*de->d_name == 'o') {
                /* create `two.zip` and `two.log` when the program finds `one.zip` */
                printf("creating `two.zip` and `two.log`\n");
                system("touch two.zip"); /* assume it worked */
                system("touch two.log"); /* assume it worked */
            }
            printf("unlinking %s\n", de->d_name);
            if (unlink(de->d_name)) perror("unlink");
            strcpy(logname, de->d_name);
            i = strlen(logname);
            logname[i-3] = 'l';
            logname[i-2] = 'o';
            logname[i-1] = 'g';
            printf("unlinking %s\n", logname);
            if (unlink(logname)) perror("unlink");
        }
    }
    closedir(dd); /* assume it worked */
    return 0;
}

On my computer, readdir() finds deleted files and does not find files created between opendir() and readdir(). But it may be different on another computer; it may be different on my computer if I compile with different options; it may be different if I upgrade the kernel; ...

What killed my process and why?

If the user or sysadmin did not kill the program, the kernel may have. The kernel only kills a process under exceptional circumstances, such as extreme resource starvation (think memory + swap exhaustion). When the kernel's OOM killer is responsible, it logs the event; check `dmesg` for a line like "Out of memory: Killed process ...".

How to delete compiled JS files from previous typescript(.ts) files?

I came here because of the title, and adding gulp into the mix was not a solution for me. So here is how I solved it instead.

Prefix your npm build script with rm -rf ./js/ &&

"scripts": {
...
"build": "rm -rf ./js/ && tsc",
...
},

`rm -rf ./js/` forcefully and recursively removes all files and directories below `./js/` (see the bash documentation for `rm`).

`&&` means: if the previous command succeeded, run the next command (see the bash documentation for `&&`).

Title at the time of answering:
"How to delete compiled JS files from previous typescript(.ts) files?"

Why does my script periodically freeze while deleting millions of files?

The details are in the Linux documentation, assuming you're using Linux (other OSes may be different): see for example https://www.kernel.org/doc/Documentation/sysctl/vm.txt.

Linux handles writes to disk by creating "dirty pages", which are sections of memory that are pending a physical copy to the disk. The physical copy comes later. That's why os.remove() is usually very fast: it will just create or modify a page in memory and leave the physical copy for later. (If, soon, we do another os.remove() that needs to change the same page of memory, then we win: no need to write this page several times to disk.)

Normally, a daemon called "pdflush" wakes up periodically to do this write to disk. But if a process generates a very large number of dirty pages, then at some point the kernel will stop it (during a random one of the os.remove() calls) and force the write to disk to occur immediately, for some fraction of the pending pages. It only allows the program to continue once the dirty pages are back below a reasonable threshold. "pdflush" will then likely continue writing out the rest. Obviously, if your program keeps generating dirty pages, it will hit the upper limit again and be paused again.

This is what causes the pauses in your process. It's a side-effect of how the kernel works. You can ignore it: physically, the disk is busy all the time.

XCode: Deleted file still has a reference somewhere

You need to ensure the file reference is removed from the compile sources phase. It is possible for the file to be referenced there but not in the project navigator due to the underlying structure of the pbxproj file.


How do I make Git forget about a file that was tracked, but is now in .gitignore?

.gitignore will prevent untracked files from being added (without an add -f) to the set of files tracked by Git. However, Git will continue to track any files that are already being tracked.

To stop tracking a file, we must remove it from the index:

git rm --cached <file>

To remove a folder and all files in the folder recursively:

git rm -r --cached <folder>

The removal of the file from the head revision will happen on the next commit.

WARNING: While this will not remove the physical file from your local machine, it will remove the file from other developers' machines on their next git pull.


