Check What Files Are Open in Python

check what files are open in Python

I ended up wrapping the built-in file object at the entry point of my program. I found out that I wasn't closing my loggers.

import io
import sys
import builtins
import traceback
from functools import wraps

def opener(old_open):
    @wraps(old_open)
    def tracking_open(*args, **kw):
        file = old_open(*args, **kw)

        old_close = file.close
        @wraps(old_close)
        def close():
            old_close()
            open_files.discard(file)  # discard, not remove: close() may be called twice
        file.close = close
        file.stack = traceback.extract_stack()

        open_files.add(file)
        return file
    return tracking_open

def print_open_files():
    print(f'### {len(open_files)} OPEN FILES: [{", ".join(f.name for f in open_files)}]', file=sys.stderr)
    for file in open_files:
        print(f'Open file {file.name}:\n{"".join(traceback.format_list(file.stack))}', file=sys.stderr)

open_files = set()
io.open = opener(io.open)
builtins.open = opener(builtins.open)

check if a file is open in Python

I assume that you're writing to the file, then closing it (so the user can open it in Excel), and then, before re-opening it for append/write operations, you want to check that the file isn't still open in Excel?

This is how you could do that:

while True:   # repeat until the try statement succeeds
    try:
        myfile = open("myfile.csv", "r+")  # or "a+", whatever you need
        break                              # exit the loop
    except IOError:
        input("Could not open file! Please close Excel. Press Enter to retry.")
        # restart the loop

with myfile:
    do_stuff()

List all currently open file handles?

The nice way of doing this would be to modify your code to keep track of when it opens a file:

def log_open(*args, **kwargs):
    print("Opening a file...")
    print(args, kwargs)   # print the arguments, don't pass them to print()
    return open(*args, **kwargs)

Then, use log_open instead of open to open files. You could even do something more hacky, like modifying the File class to log itself. That's covered in the linked question above.

There's probably a disgusting, filthy hack involving the garbage collector or looking in __dict__ or something, but you don't want to do that unless you absolutely really truly seriously must.

Check if a file is not open nor being used by another process

An issue with trying to find out if a file is being used by another process is the possibility of a race condition. You could check a file, decide that it is not in use, then just before you open it another process (or thread) leaps in and grabs it (or even deletes it).

OK, let's say you decide to live with that possibility and hope it does not occur. Checking which files are in use by other processes is operating system dependent.

On Linux it is fairly easy, just iterate through the PIDs in /proc. Here is a generator that iterates over files in use for a specific PID:

import errno
import os
import re

def iterate_fds(pid):
    dir = '/proc/' + str(pid) + '/fd'
    if not os.access(dir, os.R_OK | os.X_OK):
        return

    for fd in os.listdir(dir):
        full_name = os.path.join(dir, fd)
        try:
            file = os.readlink(full_name)
            if file == '/dev/null' or \
               re.match(r'pipe:\[\d+\]', file) or \
               re.match(r'socket:\[\d+\]', file):
                file = None
        except OSError as err:
            if err.errno == errno.ENOENT:
                # the fd was closed between listdir() and readlink()
                file = None
            else:
                raise

        yield (fd, file)
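Applying the same /proc layout to every process is just a matter of walking the numeric directories under /proc. Here is a rough, self-contained sketch along those lines (the helper names `pids` and `open_files_for` are my own, and this is Linux-only):

```python
import os

def pids():
    # Numeric directory names under /proc are process IDs.
    return [int(name) for name in os.listdir('/proc') if name.isdigit()]

def open_files_for(pid):
    fd_dir = '/proc/%d/fd' % pid
    try:
        fds = os.listdir(fd_dir)
    except OSError:
        return []  # process exited, or we lack permission
    result = []
    for fd in fds:
        try:
            target = os.readlink(os.path.join(fd_dir, fd))
        except OSError:
            continue  # the fd was closed while we were looking
        # keep only regular paths; skip pipes, sockets and /dev entries
        if target.startswith('/') and not target.startswith('/dev/'):
            result.append((int(fd), target))
    return result
```

Processes you cannot read (other users' processes, unless you are root) are silently skipped, which mirrors the `os.access` guard in the generator above.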

On Windows it is not quite so straightforward; the APIs are not published. There is a Sysinternals tool (handle.exe) that can be used, but I recommend the PyPI module psutil, which is portable (i.e., it runs on Linux as well, and probably on other operating systems):

import psutil

for proc in psutil.process_iter():
    try:
        # this returns the list of files opened by the current process
        flist = proc.open_files()
        if flist:
            print(proc.pid, proc.name())
            for nt in flist:
                print("\t", nt.path)

    # This catches a race condition where a process ends
    # before we can examine its files
    except psutil.NoSuchProcess as err:
        print("****", err)
    except psutil.AccessDenied:
        pass  # we may not be allowed to inspect other users' processes

How to check if a file is already opened (in the same process)

You can open the same file and assign it to a different variable, then check its state, like so:

file_obj = open(filename, "wb+")

if not file_obj.closed:
    print("File is already opened")

Note that .closed only reflects the state of a file object within the same Python process; it tells you nothing about other processes.
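A quick illustration of that limitation (the file name here is just for the demo): each file object tracks only its own state, even within one process.

```python
import os
import tempfile

# .closed reflects only whether *this* file object was closed; it knows
# nothing about other handles, even ones in the same process.
path = os.path.join(tempfile.gettempdir(), 'closed_demo.txt')  # demo file name
first = open(path, 'w')
second = open(path)  # an independent handle to the same file

first.close()
print(first.closed)   # True: this handle was closed
print(second.closed)  # False: the other handle is still open

second.close()
os.remove(path)
```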

Check for open files with Python in Linux

Update as of April, 2018 for more recent versions of psutil:

Newer versions of psutil now use a slightly different function name for this, open_files, as shown in the following example from their docs:

>>> import psutil
>>> f = open('file.ext', 'w')
>>> p = psutil.Process()
>>> p.open_files()
[popenfile(path='/home/giampaolo/svn/psutil/file.ext', fd=3)]

Original Answer / older version of psutil:

If you look in the documentation for the psutil python module (available on PyPI) you'll find a method that checks for open files on a given process. You'll probably want to get a list of all active PIDs as described in the related stack overflow response. Then use the method below:

get_open_files()
Return regular files opened by process as a list of namedtuples including file absolute path name and file descriptor. Example:

>>> f = open('file.ext', 'w')
>>> p = psutil.Process(os.getpid())
>>> p.get_open_files()
[openfile(path='/home/giampaolo/svn/psutil/file.ext', fd=3)]

Changed in 0.2.1: OSX implementation rewritten in C; no longer requiring lsof.
Changed in 0.4.1: FreeBSD implementation rewritten in C; no longer requiring lsof.

Edit: for iterating over the active PIDs psutil has another method (which is also referenced in the above previous stack overflow response):

psutil.process_iter()

Return an iterator yielding Process class instances for all running processes on the local machine. Every new Process instance is created only once and then cached in an internal table, which is updated every time this method is used.
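Putting process_iter() and open_files() together, a small helper can report which PIDs currently hold a given path open. This is my own sketch (the name `who_has_open` is made up), not part of psutil's API:

```python
import psutil

def who_has_open(path):
    # Return the PIDs of all processes that currently have `path` open.
    owners = []
    for proc in psutil.process_iter():
        try:
            if any(f.path == path for f in proc.open_files()):
                owners.append(proc.pid)
        except (psutil.NoSuchProcess, psutil.AccessDenied):
            continue  # process exited, or we are not allowed to inspect it
    return owners
```

Scanning every process is slow (it can take seconds on a busy machine), so this is best reserved for diagnostics rather than a hot path.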

How to check whether a file is_open and the open_status in python

This is not quite what you want, since it just tests whether a given file is write-able. But in case it's helpful:

import os

filename = "a.txt"
if not os.access(filename, os.W_OK):
    print("Write access not permitted on %s" % filename)

(I'm not aware of any platform-independent way to do what you ask)
