File Glob Patterns in Linux Terminal

A nice way to do this is to use extended globs, which give Bash patterns much of the expressive power of regular expressions.

To start you have to enable the extglob feature, since it is disabled by default:

shopt -s extglob

Then, write a pattern with the required condition: stuff + ka + either v or bh + i + stuff. All together:

ls -l *ka@(v|bh)i*

The syntax is a bit different from normal regular expressions, so you need to know from the Extended Globs documentation that:

@(list): Matches one of the given patterns.

Test

$ ls
a.php AABB AAkabhiBB AAkabiBB AAkaviBB s.sh
$ ls *ka@(v|bh)i*
AAkabhiBB AAkaviBB
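The same test can be run without touching the filesystem by matching against [[ ]], which is handy for experimenting with the other extglob operators as well (operator names per the bash manual):

```shell
#!/usr/bin/env bash
shopt -s extglob

# extglob operators:
#   ?(list)  zero or one occurrence of the given patterns
#   *(list)  zero or more occurrences
#   +(list)  one or more occurrences
#   @(list)  exactly one of the given patterns
#   !(list)  anything except one of the given patterns

for name in AAkabhiBB AAkabiBB AAkaviBB AABB; do
  [[ $name == *ka@(v|bh)i* ]] && echo "$name matches"
done
# prints:
#   AAkabhiBB matches
#   AAkaviBB matches
```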

Test whether a glob has any matches in Bash

Bash-specific solution:

compgen -G "<glob-pattern>"

Quote the pattern, or it will be pre-expanded into matches by the shell before compgen sees it.

Exit status is:

  • 1 for no match,
  • 0 for one or more matches.

stdout is a list of files matching the glob.
I think this is the best option in terms of conciseness and minimizing potential side effects.

Example:

if compgen -G "/tmp/someFiles*" > /dev/null; then
    echo "Some files exist."
fi
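Wrapped in a tiny function, the check becomes reusable. A minimal sketch; the function name glob_exists is my own invention, not part of Bash, and the test files are created in a scratch directory for the demo:

```shell
#!/usr/bin/env bash

# Return success if the glob in $1 matches at least one file.
glob_exists() {
  compgen -G "$1" > /dev/null
}

dir=$(mktemp -d)                      # scratch directory for the demo
touch "$dir/someFiles-1" "$dir/someFiles-2"

if glob_exists "$dir/someFiles*"; then
  echo "Some files exist."            # prints: Some files exist.
fi

rm -rf "$dir"
```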

Globbing with ls to find all files matching a certain pattern

You may want to consider find

find . -name '*.pdf' -exec ls -l {} \;

or

find . -name '*.pdf' -ls

where . is your current working directory. Note that the extended glob features discussed elsewhere on this page are Bash extensions (globstar, for example, requires Bash 4.0+); in other words, the glob extensions are not portable, while find is.

find files with glob pattern

If I understood correctly, you want to find vim-.gz but not vim-runtime.gz. If that's not right, please make the question clearer. The answer would then be:

Regex is the wrong track: find's -name option doesn't use regexes. It uses file glob patterns, like the ones you specify on the command line. Try

find /var/cache/pacman/pkg -name 'vim-[0-9]*.gz'

Edit: vim-[0-9]* is not good because it also finds packages that merely have a number somewhere in the package name. But what about this?

find /var/cache/pacman/pkg -name 'vim-*-*-*.pkg.tar.xz'

Edit: sorry, my fault. That doesn't work either, because * matches runtime-7.3.754. Maybe the regex idea was better than I thought (I didn't know about the -regex option). How about this?

find /var/cache/pacman/pkg -regex '.*/vim-[^-]*-[^-]*-[^-]*\.pkg\.tar\.xz'
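A quick way to sanity-check the -regex pattern is to run it against sample filenames in a scratch directory (the package names below are illustrative, not real packages from the question):

```shell
#!/usr/bin/env bash
dir=$(mktemp -d)
touch "$dir/vim-7.3.754-1-x86_64.pkg.tar.xz" \
      "$dir/vim-runtime-7.3.754-1-x86_64.pkg.tar.xz"

# -regex matches against the whole path, so the pattern starts with .*/
# The three [^-]* groups are name-free version, release, and architecture;
# "runtime" fails because it adds a fourth hyphen-separated field.
find "$dir" -regex '.*/vim-[^-]*-[^-]*-[^-]*\.pkg\.tar\.xz'
# prints only the vim-7.3.754-1-x86_64.pkg.tar.xz path

rm -rf "$dir"
```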

Using a glob expression passed as a bash script argument

Addressing the "why"

Assignments, as in var=foo*, don't expand globs -- that is, when you run var=foo*, the literal string foo* is put into the variable var, not the list of files matching foo*.

By contrast, unquoted use of foo* on a command line expands the glob, replacing it with a list of individual names, each of which is passed as a separate argument.

Thus, running ./yourscript foo* doesn't pass foo* as $1 unless no files matching that glob expression exist; instead, it becomes something like ./yourscript foo01 foo02 foo03, with each argument in a different spot on the command line.

The reason running ./yourscript "foo*" works as a workaround is that the unquoted expansion inside the script allows the glob to be expanded at that later time. However, this is bad practice: glob expansion happens concurrently with string-splitting (meaning that relying on this behavior removes your ability to pass filenames containing characters found in IFS, typically whitespace), and it also means that you can't pass literal filenames when they could be interpreted as globs (if you have a file named [1] and a file named 1, passing [1] would always be replaced with 1).
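The difference is easy to see concretely. Here a small function stands in for the script and simply reports the arguments it received (the function name show_args and the foo01..foo03 filenames are made up for the demo):

```shell
#!/usr/bin/env bash

# Stand-in for ./yourscript: report how many arguments arrived and the first one.
show_args() { printf 'argc=%d first=%s\n' "$#" "$1"; }

cd "$(mktemp -d)"           # empty scratch directory
touch foo01 foo02 foo03

show_args foo*              # unquoted: glob expands first -> argc=3 first=foo01
show_args "foo*"            # quoted: the literal pattern is passed -> argc=1 first=foo*
```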


Idiomatic Usage

The idiomatic way to build this would be to shift away the first argument, and then iterate over subsequent ones, like so:

#!/bin/bash
out_base=$1; shift

shopt -s nullglob # avoid generating an error if a directory has no .status files

for dir; do                       # iterate over directories passed in $2, $3, etc.
  for file in "$dir"/*.status; do # iterate over files ending in .status within those
    grep -e "string" "$file"      # match a single file
  done
done >"${out_base}.extension"

If you have many .status files in a single directory, all this can be made more efficient by using find to invoke grep with as many arguments as possible, rather than calling grep individually on a per-file basis:

#!/bin/bash
out_base=$1; shift

find "$@" -maxdepth 1 -type f -name '*.status' \
  -exec grep -h -e "string" -- /dev/null '{}' + \
  >"${out_base}.extension"

Both scripts above expect the globs passed not to be quoted on the invoking shell. Thus, usage is of the form:

# being unquoted, this expands the glob into a series of separate arguments
your_script descriptor dir_*_map

This is considerably better practice than passing globs to your script (which then is required to expand them to retrieve the actual files to use); it works correctly with filenames containing whitespace (which the other practice doesn't), and files whose names are themselves glob expressions.


Some other points of note:

  • Always put double quotes around expansions! Failing to do so results in the additional steps of string-splitting and glob expansion (in that order) being applied. If you want globbing, as in the case of "$dir"/*.status, then end the quotes before the glob expression starts.
  • for dir; do is precisely equivalent to for dir in "$@"; do, which iterates over arguments. Don't make the mistake of using for dir in $*; do or for dir in $@; do instead! These latter invocations combine each element of the list with the first character of IFS (which, by default, contains the space, the tab and the newline in that order), then splits the resulting string on any IFS characters found within, then expands each component of the resulting list as a glob.
  • Passing /dev/null as an argument to grep is a safety measure: It ensures that you don't have different behavior between the single-argument and multi-argument cases (as an example, grep defaults to printing filenames within output only when passed multiple arguments), and ensures that you can't have grep hang trying to read from stdin if it's passed no additional filenames at all (which find won't do here, but xargs can).
  • Using lower-case names for your own variables (as opposed to system- and shell-provided variables, which have all-uppercase names) is in accordance with POSIX-specified convention; see fourth paragraph of the POSIX specification regarding environment variables, keeping in mind that environment variables and shell variables share a namespace.
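The "$@" vs $* point from the list above is worth seeing in action. A minimal demo with two arguments that contain spaces:

```shell
#!/usr/bin/env bash
set -- "first arg" "second arg"     # two positional parameters with spaces

printf '<%s>' "$@"; echo   # <first arg><second arg>   (each argument intact)
printf '<%s>' $*;   echo   # <first><arg><second><arg> (word-split on IFS)
```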

Bash glob pattern behaviour different between terminal and shell script

Thanks @Aserre and @anubhava, it was indeed the combination of the Bash path and making sure globstar was enabled (on macOS). The full script is:

#!/usr/local/bin/bash

shopt -s globstar

echo {,**/}*.*

And yes ./** would suffice but that wasn't my problem :)
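For reference, here is a minimal demonstration of what globstar's ** actually does; the directory layout is invented for the demo:

```shell
#!/usr/bin/env bash
cd "$(mktemp -d)"                 # scratch directory
mkdir -p sub/deeper
touch top.txt sub/mid.txt sub/deeper/low.txt

shopt -s globstar                 # without this, ** behaves like a single *
printf '%s\n' **/*.txt            # lists top.txt plus the .txt files in
                                  # sub/ and sub/deeper/, recursively
```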

Match two integers using glob expression

What's wrong with:

$ ls /dev/ttyS[1-9]*

I mean, I know it doesn't force the subsequent characters to be digits, but because of how we know those files are named, it will always work.

Update: It seems you're slightly confused. These are glob patterns, not regexes. But I'm not sure how Python fits in with this.

If you want an exact glob pattern for your requirements, then use:

$ ls /dev/ttyS{[1-9],[1-9][0-9]}
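If extglob is enabled, the same 1-99 range can also be written as a single pattern rather than a brace expansion. A sketch matching against sample names with [[ ]], since the actual /dev/ttyS* devices may not exist on a given machine:

```shell
#!/usr/bin/env bash
shopt -s extglob

# @( | ) matches exactly one of the alternatives: a single digit 1-9,
# or a two-digit number 10-99.
pat='ttyS@([1-9]|[1-9][0-9])'

for name in ttyS0 ttyS1 ttyS9 ttyS10 ttyS99 ttyS100; do
  [[ $name == $pat ]] && echo "$name matches"
done
# prints: ttyS1, ttyS9, ttyS10, ttyS99 (not ttyS0 or ttyS100)
```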

Recursively matching filenames with glob argument

The shell performs glob expansion before it even thinks of invoking the command. Programs such as grep don't do anything to prevent globbing: they can't. You, as the caller of these programs, must tell the shell that you want to pass the special characters such as * and ? to the program, and not let the shell interpret them. You do that by putting them inside quotes:

grep -E 'ba(na)* split' *.txt

(look for ba split, bana split, etc., in all files called <something>.txt) In this case, either single quotes or double quotes will do the trick. Between single quotes, the shell expands nothing. Between double quotes, $, ` and \ are still interpreted. You can also protect a single character from shell expansion by preceding it with a backslash. It's not only wildcard characters that need to be protected; for example, above, the space in the pattern is in quotes so it's part of the argument to grep and not an argument separator. Alternative ways to write the snippet above include

grep -E "ba(na)* split" *.txt
grep -E ba\(na\)\*\ split *.txt

With most shells, if an argument contains wildcards but the pattern doesn't match any file, the pattern is left unchanged and passed to the underlying command. So a command like

grep b[an]*a *.txt

has a different effect depending on what files are present on the system. If the current directory doesn't contain any file whose name begins with b, the command searches the pattern b[an]*a in the files whose name matches *.txt. If the current directory contains files named baclava, bnm and hello.txt, the command expands to grep baclava bnm hello.txt, so it searches the pattern baclava in the two files bnm and hello.txt. Needless to say, it's a bad idea to rely on this in scripts; on the command line it can occasionally save typing, but it's risky.
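The pass-through behavior described above can be changed with shell options. A sketch in a scratch directory; note that with failglob set, a non-matching glob instead makes the command fail with an error, which is often the safest behavior in scripts:

```shell
#!/usr/bin/env bash
cd "$(mktemp -d)"       # directory with no file beginning with b
touch hello.txt

echo b[an]*a            # no match: the pattern is passed through literally

shopt -s nullglob
echo b[an]*a            # nullglob: the non-matching pattern expands to nothing
shopt -u nullglob

# shopt -s failglob would make the echo above fail with an error instead.
```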

When you run ack .* in a directory containing no dot file, the shell runs ack . .. (since .* matches the entries . and ..). The behavior of the ack command is then to print out all non-empty lines (the pattern . matches any one character) in all files under .. (the parent of the current directory) recursively. Contrast with ack '.*', which searches the pattern .* (which matches anything) in the current directory and its subdirectories (due to the behavior of ack when you don't pass any filename argument).

How can I use inverse or negative wildcards when pattern matching in a unix/linux shell?

In Bash you can do it by enabling the extglob option, like this (replace ls with cp and add the target directory, of course)

~/foobar> shopt extglob
extglob off
~/foobar> ls
abar afoo bbar bfoo
~/foobar> ls !(b*)
-bash: !: event not found
~/foobar> shopt -s extglob # Enables extglob
~/foobar> ls !(b*)
abar afoo
~/foobar> ls !(a*)
bbar bfoo
~/foobar> ls !(*foo)
abar bbar

You can later disable extglob with

shopt -u extglob
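As the answer notes, the same negated pattern works with cp and a target directory. A sketch using scratch directories (the abar/afoo/bbar/bfoo filenames mirror the transcript above):

```shell
#!/usr/bin/env bash
shopt -s extglob

src=$(mktemp -d)                  # scratch source and destination
dest=$(mktemp -d)
touch "$src"/{abar,afoo,bbar,bfoo}

cp "$src"/!(b*) "$dest"           # copy everything that does NOT start with b
ls "$dest"                        # abar afoo

rm -rf "$src" "$dest"
```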

