Memory Overflow Caused by a Memory Leak While the Application Keeps Running and Allocating

C: potential memory leak caused by an abnormally terminating program

You don't have to worry about it on popular OSes like Windows and Linux.

Virtual memory ceases to exist when the process terminates. So it's not possible for it to leak after a process terminates.

Physical memory always belongs to the OS to allocate as it pleases, whether or not your process is still running. (Unless you lock your allocations, in which case the lock ceases to exist when the corresponding virtual memory mapping is destroyed, which happens on process termination anyway.)

There are a few resources that are not cleaned up (like some types of shared memory) but that's pretty exotic.

When you call malloc, typically just backing store (essentially RAM+swap) is reserved and a virtual memory mapping (which is essentially free) is created. When you first write to that virtual memory mapping, physical pages of memory (RAM) are mapped into it to "back" it. That RAM always belongs to the OS to use as it pleases and the OS will re-use the RAM for other purposes if it deems that wise.

When a process terminates, its address space ceases to exist. That means any virtual memory mappings or reservations go away. Unshared physical pages will have their use count drop to zero when the virtual memory mapping goes away, rendering those pages of physical memory free.

It's worth understanding this in detail because you can easily draw the wrong conclusions about edge cases if you don't understand what's going on under the hood. Also, this will give you a framework to plug concepts such as file mappings, memory overcommit, and shared memory into.

Does an Application memory leak cause an Operating System memory leak?

On operating systems with protected memory (Mac OS X and later, all Unix clones such as Linux, and NT-based Windows systems, meaning Windows 2000 and later), the memory gets released when the program ends.

If you run enough instances of any program at the same time, you will eventually run out of memory, whether or not there is a memory leak. Obviously, a program that leaks memory will fill memory faster than an identical program without the leak, but how many instances you can run before memory fills up depends far more on how much memory the program needs for normal operation than on whether it leaks. That comparison is really only meaningful between two otherwise identical programs, one with a memory leak and one without.

Memory leaks become most serious when you have a program running for a very long time. A classic example is server software, such as web servers. With games, spreadsheets, or word processors, memory leaks aren't nearly as serious, because you eventually close those programs, freeing the memory. But of course memory leaks are nasty little beasts that should always be tackled as a matter of principle.

But as stated earlier, all modern operating systems release the memory when the program closes, so even with a memory leak, you won't fill up the memory if you're continuously opening and closing the program.

Is leaked memory freed up when the program exits?

Yes, a "memory leak" is simply memory that a process no longer has a reference to, and thus can no longer free. The OS still keeps track of all the memory allocated to a process, and will free it when that process terminates.

In the vast majority of cases the OS will free the memory - as is the case with normal "flavors" of Windows, Linux, Solaris, etc. However, it is important to note that in specialized environments, such as various real-time operating systems, the memory may not be freed when the program is terminated.

Out of Memory error - possibly due to memory leak?

You are running out of memory because your code runs recursively, creating an unbounded number of BoardActivity, Move, and Player objects.

How? Well, when the BoardActivity class is initialized, the line

Move move = new Move();

will create a new Move object. This, in turn, will result in the execution of the line

Player player = new Player();

from inside the Move class. This will spawn a new Player object, which contains the line

BoardActivity board = new BoardActivity();

Now this creates another BoardActivity object, which takes us back to the beginning, producing an endless cycle of object spawning.

Simply put, BoardActivity->Move->Player->BoardActivity->Move->Player....

I'm guessing AppCompatActivity is an Android activity. Android activities are pretty heavy and allocate a good amount of heap memory when initialized. This is basically why you run out of memory so quickly.

You shouldn't be creating an activity this way. Either create one and start it from the main activity, or declare it as the default activity in the AndroidManifest.

Also, instead of creating a separate OnClickListener for each button, it would be better to create a single OnClickListener and check the button IDs with if statements to determine which button was clicked.

Why would this memory leak occur

When you run pop() as seen in your full program, the only pointer to the qNode previously pointed to by heap->arr[0] is passed back in the return value from pop().

qNode* pop(Heap* heap)
{
    qNode* result = heap->arr[0];

    heap->arr[0] = heap->arr[heap->heapSize - 1];
    heap->heapSize = heap->heapSize - 1;

    heapify(heap, 0);
    return result;
}

This means that the caller of pop() now holds the last pointer to that qNode and is obligated to make sure that the qNode is freed before the pointer goes out of scope.

Your code contains several places where the caller of pop() fails to do this. Here is an example from buildTree (last line shown). The qNode is leaked when buildTree returns, because no pointer to that qNode remains.

Node* buildTree(Heap* heap, int* tab) {
    for (int i = 0; i < 256; ++i) {
        if (tab[i]) {
            Node* node = createNode(tab[i], i, NULL, NULL);
            push(heap, node);
        }
    }

    if (heap->heapSize == 1) {
        return pop(heap)->data;

Memory leak in C/C++: forgot to call free/delete

It's per-process. Once your process exits, the allocated memory is returned to the OS for use by other processes (new or existing).

To answer your edited question, there's only a finite amount of memory in your machine. So if you have a memory leak, the major problem is that the memory isn't available for other processes to use. A secondary, but not negligible, effect is that as your process image grows, you'll swap to disk and performance will suffer. Finally your program will exhaust all the memory in the system and fail, since it's unable to allocate anything more for itself.

It's arguable that for a small process with a short lifetime, memory leaks are tolerable, since the leaked memory will be small in quantity and short-lived.

Take a look at this resource for possibly more info than you'll ever need. What we're discussing here is dynamic or heap allocation.

Is this considered memory leak?

In a hosted environment (e.g. your typical Unix / Windows / Mac OS X, even DOS, machine) when the application terminates all the memory it occupied is automatically reclaimed by the operating system. Therefore, it doesn't make sense to worry about such memory leaks.

In some cases, before an application terminates, you may want to release all the dynamic memory you allocated in order to detect potential memory leaks through a leak detector, like valgrind. However, even in such a case, the example you describe wouldn't be considered a memory leak.

In general, failing to call a destructor is not the same as causing a memory leak. Memory leaks stem from memory allocated on the heap (with new or malloc or container allocators). Memory allocated on the stack is automatically reclaimed when the stack is unwound. However, if an object holds some other resource (say a file or a window handle), failing to call its destructor will cause a resource leak, which can also be a problem. Again, modern OSs will reclaim their resources when an application terminates.

What happens if memory is leaking?

When your process allocates memory from the OS on an ongoing basis, and never frees any of it, you will eventually be using more memory than is physically in the machine. At this point, the OS will first swap pages out to disk (degrading performance) if it has swap space, and at some point your process will reach a point where the OS can no longer grant it more memory, because you've exceeded the maximum amount of addressable space (4 GB on a 32-bit OS).

There are basically two reasons this can happen. You've allocated memory and lost the pointer to it (it has become unreachable to your program), so you can no longer free it; that's what most people call a memory leak. Alternatively, you may just be allocating memory and never freeing it because your program is lazy. That's not so much a leak, but in the end the problems you run into are the same.

Anatomy of a Memory Leak

The best explanation I've seen is in Chapter 7 of the free Foundations of Programming e-book.

Basically, in .NET a memory leak occurs when referenced objects are rooted and thus cannot be garbage collected. This occurs accidentally when you hold on to references beyond the intended scope.

You'll know that you have leaks when you start getting OutOfMemoryExceptions or your memory usage goes up beyond what you'd expect (PerfMon has nice memory counters).

Understanding .NET's memory model is your best way of avoiding it. Specifically, understanding how the garbage collector works and how references work — again, I refer you to chapter 7 of the e-book. Also, be mindful of common pitfalls, probably the most common being events. If object A is registered to an event on object B, then object A will stick around until object B disappears because B holds a reference to A. The solution is to unregister your events when you're done.

Of course, a good memory profiler will let you see your object graphs and explore the nesting/referencing of your objects to see where references are coming from and what root object is responsible (Red Gate's ANTS profiler, JetBrains dotMemory, and .NET Memory Profiler are really good choices, or you can use the text-only WinDbg and SOS, but I'd strongly recommend a commercial/visual product unless you're a real guru).

I believe unmanaged code is subject to its typical memory leaks, except that shared references are managed by the garbage collector. I could be wrong about this last point.


