Monitoring for Changes in File(s) in Real Time

Similar to the suggestion to use a system API, this can also be done with qtbase, which provides a cross-platform way to do it from within R:

dir_to_watch <- "/tmp"

library(qtbase)
fsw <- Qt$QFileSystemWatcher()
fsw$addPath(dir_to_watch)

id <- qconnect(fsw, "directoryChanged", function(path) {
message(sprintf("directory %s has changed", path))
})

# writing a file into the watched directory triggers the callback:
cat("abc", file = "/tmp/deleteme.txt")

How to monitor a text file in realtime

  • Tail for Win32
  • Apache Chainsaw - I have used this with log4net logs; it may require the file to be in a certain format
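If you would rather not install a tool, the core of what Tail for Win32 does can be sketched in a few lines of Python. This is a minimal polling `tail -f`: it assumes the file exists and grows by appended lines, and yields each complete new line as it arrives:

```python
import time

def follow(path, poll_interval=1.0):
    """Yield complete lines appended to the file at `path`, like `tail -f`."""
    with open(path, "r") as f:
        f.seek(0, 2)  # start at the end of the file: report only new lines
        buf = ""
        while True:
            chunk = f.readline()
            if not chunk:
                # nothing new yet; poll again shortly
                time.sleep(poll_interval)
                continue
            buf += chunk
            if buf.endswith("\n"):
                # only yield once the line is complete
                yield buf.rstrip("\n")
                buf = ""

# usage (runs forever):
# for line in follow("app.log"):
#     print(line)
```

Buffering partial reads until a newline arrives avoids emitting half-written lines when the writer is flushed mid-line.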

watchdog monitoring file for changes

Instead of LoggingEventHandler define your handler:

#!/usr/bin/python
import time
from watchdog.observers import Observer
from watchdog.events import FileSystemEventHandler

class MyHandler(FileSystemEventHandler):
    def on_modified(self, event):
        print(f'event type: {event.event_type}  path: {event.src_path}')

if __name__ == "__main__":
    event_handler = MyHandler()
    observer = Observer()
    observer.schedule(event_handler, path='/data/', recursive=False)
    observer.start()

    try:
        while True:
            time.sleep(1)
    except KeyboardInterrupt:
        observer.stop()
    observer.join()

on_modified is called when a file or directory is modified.

Can vim monitor realtime changes to a file

You can :set autoread so that vim reads the file when it changes. However (depending on your platform), you have to give it focus.

From the help:

When a file has been detected to have been changed outside of Vim and it
has not been changed inside of Vim, automatically read it again. When
the file has been deleted this is not done.

How to watch a file, and whenever it changes, take the newly added line and perform some actions on it with PowerShell

What you are doing only displays the file in real time on the screen; you cannot act on the output that way. The command you are using is not meant for interactive use cases.

You can monitor for file updates instead by using a FileSystemWatcher, which lets you watch for file actions and then respond to them.

For example, searching for 'PowerShell filesystemwatcher monitor file' turns up this article:

https://powershell.one/tricks/filesystem/filesystemwatcher

Monitoring Folders for File Changes

With a FileSystemWatcher, you can monitor folders for file changes and
respond immediately when changes are detected. This way, you can
create “drop” folders and respond to log file changes.

Specifically, as per your use case:

Advanced Mode (asynchronous)

If you expect changes to happen in rapid succession or even
simultaneously, you can use the FileSystemWatcher in asynchronous
mode: the FileSystemWatcher now works in the background and no longer
blocks PowerShell. Instead, whenever a change occurs, an event is
raised. So with this approach, you get a queue and won’t miss any
change.

On the downside, this approach has two challenges:

  • Handling events: since PowerShell is single-threaded by nature, it is
    not trivial to respond to events, and even more cumbersome to debug
    event handler code.

  • Keeping PowerShell running: ironically, because the FileSystemWatcher
    now no longer blocks PowerShell, this leads to another problem. You
    need to keep PowerShell waiting for events, but you cannot use
    Start-Sleep or an endless loop, because as long as PowerShell is busy
    - and it is considered busy even if it sleeps - no events can be
    handled.

Implementation

The script below does the exact same thing as the synchronous version
from above, only it is event-based and won’t miss any events anymore:

# find the path to the desktop folder:
$desktop = [Environment]::GetFolderPath('Desktop')

# specify the path to the folder you want to monitor:
$Path = $desktop

# specify which files you want to monitor
$FileFilter = '*'

# specify whether you want to monitor subfolders as well:
$IncludeSubfolders = $true

# specify the file or folder properties you want to monitor:
$AttributeFilter = [IO.NotifyFilters]::FileName, [IO.NotifyFilters]::LastWrite

try
{
  $watcher = New-Object -TypeName System.IO.FileSystemWatcher -Property @{
    Path = $Path
    Filter = $FileFilter
    IncludeSubdirectories = $IncludeSubfolders
    NotifyFilter = $AttributeFilter
  }

  # define the code that should execute when a change occurs:
  $action = {
    # the code is receiving this to work with:

    # change type information:
    $details = $event.SourceEventArgs
    $Name = $details.Name
    $FullPath = $details.FullPath
    $OldFullPath = $details.OldFullPath
    $OldName = $details.OldName

    # type of change:
    $ChangeType = $details.ChangeType

    # when the change occurred:
    $Timestamp = $event.TimeGenerated

    # save information to a global variable for testing purposes
    # so you can examine it later
    # MAKE SURE YOU REMOVE THIS IN PRODUCTION!
    $global:all = $details

    # now you can define some action to take based on the
    # details about the change event:

    # let's compose a message:
    $text = "{0} was {1} at {2}" -f $FullPath, $ChangeType, $Timestamp
    Write-Host ""
    Write-Host $text -ForegroundColor DarkYellow

    # you can also execute code based on change type here:
    switch ($ChangeType)
    {
      'Changed' { "CHANGE" }
      'Created' { "CREATED" }
      'Deleted' { "DELETED"
        # to illustrate that ALL changes are picked up even if
        # handling an event takes a lot of time, we artificially
        # extend the time the handler needs whenever a file is deleted
        Write-Host "Deletion Handler Start" -ForegroundColor Gray
        Start-Sleep -Seconds 4
        Write-Host "Deletion Handler End" -ForegroundColor Gray
      }
      'Renamed' {
        # this executes only when a file was renamed
        $text = "File {0} was renamed to {1}" -f $OldName, $Name
        Write-Host $text -ForegroundColor Yellow
      }

      # any unhandled change types surface here:
      default { Write-Host $_ -ForegroundColor Red -BackgroundColor White }
    }
  }

  # subscribe your event handler to all event types that are
  # important to you. Do this as a scriptblock so all returned
  # event handlers can be easily stored in $handlers:
  $handlers = . {
    Register-ObjectEvent -InputObject $watcher -EventName Changed -Action $action
    Register-ObjectEvent -InputObject $watcher -EventName Created -Action $action
    Register-ObjectEvent -InputObject $watcher -EventName Deleted -Action $action
    Register-ObjectEvent -InputObject $watcher -EventName Renamed -Action $action
  }

  # monitoring starts now:
  $watcher.EnableRaisingEvents = $true

  Write-Host "Watching for changes to $Path"

  # since the FileSystemWatcher is no longer blocking PowerShell
  # we need a way to pause PowerShell while being responsive to
  # incoming events. Use an endless loop to keep PowerShell busy:
  do
  {
    # Wait-Event waits for a second and stays responsive to events
    # Start-Sleep in contrast would NOT work and would ignore incoming events
    Wait-Event -Timeout 1

    # write a dot to indicate we are still monitoring:
    Write-Host "." -NoNewline

  } while ($true)
}
finally
{
  # this gets executed when the user presses CTRL+C:

  # stop monitoring
  $watcher.EnableRaisingEvents = $false

  # remove the event handlers
  $handlers | ForEach-Object {
    Unregister-Event -SourceIdentifier $_.Name
  }

  # event handlers are technically implemented as a special kind
  # of background job, so remove the jobs now:
  $handlers | Remove-Job

  # properly dispose the FileSystemWatcher:
  $watcher.Dispose()

  Write-Warning "Event Handler disabled, monitoring ends."
}

So, with the above, you tweak it to watch for updates/modifications, then use

$CaptureLine = Get-Content -Path 'UNCToTheLogFile' | Select-Object -Last 1

or

$CaptureLine = Get-Content -Path 'D:\temp\book1.csv' -Tail 1

and do what you want with that.
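The same "react to a modification, then grab the current last line" idea can be sketched in plain Python. This is a naive polling version that watches the file size (an append changes it) and re-reads the whole file each time, which is fine for small log files; the path in the usage note is a placeholder:

```python
import os
import time

def watch_last_line(path, poll_interval=1.0):
    """Whenever the file changes size, yield its current last line."""
    last_size = os.path.getsize(path)
    while True:
        time.sleep(poll_interval)
        size = os.path.getsize(path)
        if size != last_size:
            last_size = size
            # naive approach: re-read the whole file and take the last line
            with open(path, "r") as f:
                lines = f.read().splitlines()
            if lines:
                yield lines[-1]

# usage (runs forever):
# for line in watch_last_line("book1.csv"):
#     print("new last line:", line)
```

For large files you would seek to the previously known offset instead of re-reading, but the structure stays the same.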

Constantly scanning for new files in R working directory

I would recommend putting this in a while loop.

setwd("path_you're_interested_in")
old_files <- character(0)
while (TRUE) {
  new_files <- setdiff(list.files(pattern = "\\.csv$"), old_files)
  sapply(new_files, function(x) {
    # do stuff
  })
  old_files <- c(old_files, new_files)
  Sys.sleep(30)  # wait half a minute before trying again
}

How to monitor a complete directory tree for changes in Linux?

To my knowledge, there's no other way than recursively setting an inotify watch on each directory.

That said, you won't run out of file descriptors because inotify does not have to reserve an fd to watch a file or a directory (its predecessor, dnotify, did suffer from this limitation). inotify uses "watch descriptors" instead.

According to the documentation for inotifywatch, the default limit is 8192 watch descriptors, and you can increase it by writing the new value to /proc/sys/fs/inotify/max_user_watches.
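If recursive inotify watches are not an option (for instance when the tree is huge and you would exhaust the watch limit), the application-level alternative is a snapshot diff over the tree. This is a minimal Python sketch, not the inotify approach the answer describes: take a snapshot of every file's modification time, take another one later, and compare:

```python
import os

def snapshot(root):
    """Map every file under `root` (recursively) to its modification time."""
    state = {}
    for dirpath, _dirnames, filenames in os.walk(root):
        for name in filenames:
            path = os.path.join(dirpath, name)
            try:
                state[path] = os.path.getmtime(path)
            except OSError:
                pass  # file vanished between listing and stat; skip it
    return state

def diff(old, new):
    """Return (created, deleted, modified) paths between two snapshots."""
    created = [p for p in new if p not in old]
    deleted = [p for p in old if p not in new]
    modified = [p for p in new if p in old and new[p] != old[p]]
    return created, deleted, modified
```

In a loop with a sleep between snapshots this catches every create, delete, and modification in the tree, at the cost of re-scanning it each pass; inotify avoids that cost, which is why it is preferred when the watch limit allows.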


