Compress files while reading data from STDIN
Yes, use gzip for this. The best way is to read the data on standard input and redirect the compressed output to a file, i.e.
cat test.csv | gzip > test.csv.gz
cat test.csv
sends the data to stdout, and via the pipe gzip reads that data on stdin. Make sure to redirect gzip's output to a file; gzip refuses to write compressed data to a terminal.
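A quick way to verify the round trip, assuming gzip and zcat are available (the sample CSV is generated here for illustration):

```shell
# Generate sample data standing in for the test.csv from the answer.
printf 'a,b,c\n1,2,3\n' > test.csv

# gzip reads the piped data on stdin and writes compressed bytes
# to stdout, so we must redirect into test.csv.gz.
cat test.csv | gzip > test.csv.gz

# Verify: zcat decompresses back to the original contents.
zcat test.csv.gz
```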
compress data into existing zipfile subdir from stdin
You speak of a zipfile and also of gzip, but gzip creates neither a .zip file nor an archive file with a subdirectory, so let's concentrate on zip.
Unfortunately, zip has no option to specify a target subdirectory for the file from standard input, so we would have to resort to a temporary file, e.g.:
( cd /tmp
mkdir mysubdirinzip
echo thisshouldbezipped >mysubdirinzip/-
zip mycurzipfile.zip mysubdirinzip/-
rm -r mysubdirinzip
)
mv /tmp/mycurzipfile.zip .
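The same workaround can be wrapped so the stdin data is actually captured, rather than simulated with echo. A sketch, assuming the zip CLI is installed; the file and directory names (myfile.txt, mysubdirinzip, mycurzipfile.zip) are examples:

```shell
#!/bin/sh
# Sketch: store stdin under a subdirectory inside a zip archive,
# using a temporary directory since zip cannot name a target
# subdirectory for data read from standard input.
tmp=$(mktemp -d)
mkdir -p "$tmp/mysubdirinzip"
cat > "$tmp/mysubdirinzip/myfile.txt"            # capture stdin into the temp tree
(cd "$tmp" && zip -q mycurzipfile.zip mysubdirinzip/myfile.txt)
mv "$tmp/mycurzipfile.zip" .
rm -r "$tmp"
```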
How to decompress a compressed file read from standard input (stdin) in a shell script
Just call zcat; this will un-gzip from the standard input.
#!/bin/sh
zcat | pax -r /tmp
...
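The same pattern works with any consumer of the decompressed stream, not just pax. A minimal round trip, assuming gzip and zcat are available:

```shell
#!/bin/sh
# zcat decompresses whatever arrives on stdin and writes the plain
# bytes to stdout, so it can feed any downstream command in a pipe.
printf 'hello archive\n' | gzip | zcat
```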
How to read int from stdin but print text before?
Your partial output is being buffered (hence not displayed until later). You should get what you want if you change the first printf to this:
printf "%s? %!" name
The %! specifier asks to flush the output buffer at that point.
To see the other strings you need to flush at those later points also.
However, I wonder why you're using functions from the Format module. Unless your project has complicated requirements for the format of the output, you would probably be better off using the simpler Printf module. Since you're just using printf, the code will be the same. However, the Printf module tries to be more interactive with its buffering. If you change open Format to open Printf, your code will work with almost no change (or at least it does for me). You still probably want to change "%s?\n" to "%s? ".
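To illustrate the fix, here is a minimal sketch (the prompt string and the int-reading step are assumptions about the rest of the program):

```ocaml
(* Minimal sketch: prompt on stdout, flush with %!, then read input.
   Without %! the prompt may stay in the buffer until the program exits. *)
let () =
  let open Printf in
  printf "%s? %!" "name";        (* %! flushes so the prompt appears now *)
  let n = read_int () in
  printf "You entered %d\n%!" n
```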
Is it possible to compress a piece of already-compressed-data by encrypting or encoding it?
Double compression is like perpetual motion: an oft-discussed idea that never works. If it worked, you could compress and compress and compress and get the file down to 1 bit. See
How many times can a file be compressed?
The fundamental problem is that most files are NOT compressible; random or encrypted data is even less so.
To answer your questions:
1) Yes! See Burrows-Wheeler compression.
2) no.
3) no.
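You can observe this directly: gzip shrinks redundant data on the first pass, but compressing the compressed output again only adds container overhead. A quick check (exact sizes will vary by gzip version):

```shell
#!/bin/sh
# Redundant input compresses well once; the compressed output is
# effectively random bytes, so a second pass cannot shrink it.
seq 1 10000 > data.txt                  # highly redundant input
gzip -c data.txt > data.txt.gz          # first pass: large reduction
gzip -c data.txt.gz > data.txt.gz.gz    # second pass: no further gain
wc -c data.txt data.txt.gz data.txt.gz.gz
```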