Linux command line: How can I simply feed arbitrary strings to a pipe?
The answer is the <<<${my_string} (here-string) syntax:
jq <<<'{"errorMessage": "....", "key1": "some message...", "key2": "message 2 ..."}'
It can be used to send anything to a command's stdin, with no superfluous echoing needed. I use it all the time in scripts to parse the contents of variables.
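As a quick sketch (with tr as a stand-in command and a hypothetical greeting variable), a here-string feeds a variable's contents straight to a command's stdin:

```shell
# greeting is a hypothetical example variable.
greeting="hello world"
# The here-string sends $greeting to tr's stdin -- no echo needed.
upper=$(tr 'a-z' 'A-Z' <<< "$greeting")
echo "$upper"
```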
How to write bash output to an arbitrary program's stdin
You can use a named pipe. Whatever starts Sum
is responsible for creating the named pipe some place convenient, then starting Sum
with the named pipe as its standard input.
mkfifo /tmp/pipe
Sum < /tmp/pipe
Then your script could take the name of the pipe as an argument, then treat it as a file it can write to.
#!/bin/bash
p=$1
echo 2 > "$p"
echo 5 > "$p"
Then you could call your script with client /tmp/pipe.
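A minimal end-to-end sketch of the same idea, with awk standing in for Sum (the temp directory and file names are just for illustration). One subtlety: the writer holds a single file descriptor open across both writes, so the reader doesn't see EOF after the first echo closes the pipe:

```shell
dir=$(mktemp -d)            # scratch directory for the demo
pipe="$dir/pipe"
mkfifo "$pipe"

# Reader: awk stands in for Sum, summing the numbers it reads.
awk '{ s += $1 } END { print s }' < "$pipe" > "$dir/sum.out" &

# Writer: keep fd 3 open across both writes so the reader
# doesn't get EOF between them.
exec 3> "$pipe"
echo 2 >&3
echo 5 >&3
exec 3>&-                   # close the write end; the reader now sees EOF

wait
cat "$dir/sum.out"
```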
CMake's execute_process and arbitrary shell scripts
You can execute any shell script, using your shell's support for taking in a script within a string argument.
Example:
execute_process(
COMMAND bash "-c" "echo -n hello | sed 's/hello/world/;'"
OUTPUT_VARIABLE FOO
)
will result in FOO containing world.
Of course, you would need to escape quotes and backslashes with care. Also remember that running bash only works on platforms that have bash; i.e., it won't work on Windows.
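To sanity-check the quoting before embedding it in CMake, the same COMMAND line can be run directly in a shell:

```shell
# The exact pipeline that execute_process hands to bash -c above.
FOO=$(bash -c "echo -n hello | sed 's/hello/world/;'")
echo "$FOO"
```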
Regex for counting/validating the pipes
File input.txt
:
a b c|a 1 b|a 1 b|a 1 b|a 1 b|a 1 b|a 1 b
a 1 b|a 1 b|a 1 b|a 1 b|a 1 b|a 1 b|a 1 b|a 1 b|a 1 b|a 1 b| 2 S
a 1 b|a 1 b|a 1 b|a 1 b|a 1 b|a 1 b|a 1 b|a 1 b|a 1 b|a 1 b|a 1 b|a 1 b|a 1 b|a 1 b|a 1 b
The script could be:
#!/bin/bash

inputfile="input.txt"
if [[ ! -f "$inputfile" ]]; then
    echo "The input file does not exist."
    exit 1
fi

while read -r line; do
    echo "LINE=$line"
    pipe_count=$(echo "$line" | awk -F'|' '{print NF-1}')
    if [[ $pipe_count -eq 10 ]]; then
        echo "OK, 10 |"
    else
        echo "NOT OK, only $pipe_count |"
    fi
    echo ""
done < "$inputfile"
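If awk isn't available, the same count can be done with plain parameter expansion (line here is a hypothetical sample value):

```shell
line='a b c|a 1 b|a 1 b'
only_pipes=${line//[^|]/}   # delete every character that is not a |
pipe_count=${#only_pipes}   # length of what remains = number of pipes
echo "$pipe_count"
```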
Get command output in pipe, C for Linux
Is this it?
NAME
popen, pclose - process I/O
SYNOPSIS
#include <stdio.h>
FILE *popen(const char *command, const char *type);
int pclose(FILE *stream);
DESCRIPTION
The popen() function opens a process by creating a pipe, forking,
and invoking the shell. Since a pipe is by definition unidirectional, the
type argument may specify only reading or writing, not both; the resulting
stream is correspondingly read-only or write-only.
The command argument is a pointer to a null-terminated string
containing a shell command line. This command is passed to /bin/sh
using the -c flag; interpretation, if any, is performed by the shell.
The type argument is a pointer to a null-terminated string which must be
either 'r' for reading or 'w' for writing.
The return value from popen() is a normal standard I/O stream in
all respects save that it must be closed with pclose() rather than fclose().
Writing to such a stream writes to the standard input of the command; the
command’s standard output is the same as that of the process that called
popen(), unless this is altered by the command itself. Conversely, reading
from a "popened" stream reads the command's standard output, and the
command’s standard input is the same as that of the process that called
popen().
Note that output popen() streams are fully buffered by default.
The pclose() function waits for the associated process to terminate
and returns the exit status of the command as returned by wait4().
Shell - Write variable contents to a file
Use the echo command:
var="text to append";
destdir=/some/directory/path/filename
if [ -f "$destdir" ]
then
echo "$var" > "$destdir"
fi
The if tests that $destdir represents a file.
The > redirection truncates the file and then writes the text. If you only want to append the text in $var to the file's existing contents, then use >> instead:
echo "$var" >> "$destdir"
The cp command is used for copying files (to files), not for writing text to a file.
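A quick sketch of the difference between the two redirections (using a throwaway temp file):

```shell
f=$(mktemp)                  # throwaway file for the demo
var="first line"
echo "$var" > "$f"           # > truncates the file, then writes
echo "second line" >> "$f"   # >> appends to the existing contents
cat "$f"
```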
Simulating ENTER keypress in bash script
echo -ne '\n' | <yourfinecommandhere>
or taking advantage of the implicit newline that echo generates (thanks Marcin)
echo | <yourfinecommandhere>
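For instance, with read standing in for any command that pauses for ENTER:

```shell
# read stands in for a command that waits for a keypress/ENTER;
# the plain echo supplies the implicit newline.
result=$(echo | { read -r _ && echo "got ENTER"; })
echo "$result"
```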
Now we can simply use the --sk option:

--sk, --skip-keypress
    Don't wait for a keypress after each test

e.g. sudo rkhunter --sk --checkall
How to split a string into an array in Bash?
IFS=', ' read -r -a array <<< "$string"
Note that the characters in $IFS are treated individually as separators, so that in this case fields may be separated by either a comma or a space rather than by the sequence of the two characters. Interestingly though, empty fields aren't created when a comma-space appears in the input, because the space is treated specially.
To access an individual element:
echo "${array[0]}"
To iterate over the elements:
for element in "${array[@]}"
do
echo "$element"
done
To get both the index and the value:
for index in "${!array[@]}"
do
echo "$index ${array[index]}"
done
The last example is useful because Bash arrays are sparse. In other words, you can delete an element or add an element and then the indices are not contiguous.
unset "array[1]"
array[42]=Earth
To get the number of elements in an array:
echo "${#array[@]}"
As mentioned above, arrays can be sparse so you shouldn't use the length to get the last element. Here's how you can in Bash 4.2 and later:
echo "${array[-1]}"
And here's how you can in any version of Bash (from somewhere after 2.05b):
echo "${array[@]: -1:1}"
Larger negative offsets select farther from the end of the array. Note the space before the minus sign in the older form. It is required.
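Putting the pieces together (string is just a sample value):

```shell
string="one, two, three"
# Split on comma and/or space into an array.
IFS=', ' read -r -a array <<< "$string"

echo "${#array[@]}"        # number of elements
echo "${array[1]}"         # second element
echo "${array[@]: -1:1}"   # last element (note the space before the minus)
```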
Running multiple commands with xargs
cat a.txt | xargs -d $'\n' sh -c 'for arg do command1 "$arg"; command2 "$arg"; ...; done' _
...or, without a Useless Use Of cat:
<a.txt xargs -d $'\n' sh -c 'for arg do command1 "$arg"; command2 "$arg"; ...; done' _
To explain some of the finer points:

- The use of "$arg" instead of % (and the absence of -I in the xargs command line) is for security reasons: passing data on sh's command-line argument list instead of substituting it into code prevents content that the data might contain (such as $(rm -rf ~), to take a particularly malicious example) from being executed as code.
- Similarly, the use of -d $'\n' is a GNU extension which causes xargs to treat each line of the input file as a separate data item. Either this or -0 (which expects NULs instead of newlines) is necessary to prevent xargs from trying to apply shell-like (but not quite shell-compatible) parsing to the stream it reads. (If you don't have GNU xargs, you can use tr '\n' '\0' <a.txt | xargs -0 ... to get line-oriented reading without -d.)
- The _ is a placeholder for $0, such that other data values added by xargs become $1 and onward, which happens to be the default set of values a for loop iterates over.
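A runnable sketch of the pattern, where args.txt and the two echo commands are stand-ins for a.txt, command1, and command2 (-d requires GNU xargs):

```shell
printf '%s\n' alpha beta > args.txt    # stand-in for a.txt
# echo "one:..." and echo "two:..." stand in for command1 and command2.
out=$(xargs -d $'\n' sh -c 'for arg do echo "one:$arg"; echo "two:$arg"; done' _ < args.txt)
echo "$out"
```

Each input line is handed to the inner for loop as a positional parameter, so both commands run once per line, in order.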
Sorting on a numerical value using csvfix for linux - turns numbers to strings
According to the online documentation for csvfix, sort has an N option for numeric sorts:
csvfix sort -f 2:N file.csv
Having said this, CSV isn't a particularly good format for text manipulation. If possible, you're much better off choosing DSV (delimiter-separated values) such as Tab- or Pipe-separated, so that you can simply pipe the output to sort, which has ample capability to sort by field, using whatever collation method you need.
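For example, with pipe-separated data, sort can key on any field (data.psv is a hypothetical sample file):

```shell
printf 'b|2\na|10\nc|1\n' > data.psv
# -t'|' sets the field delimiter; -k2,2n sorts numerically on field 2 only.
sorted=$(sort -t'|' -k2,2n data.psv)
echo "$sorted"
```

Note the n flag on the key: without it, 10 would sort before 2 lexicographically.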