Read Command Output Inside Su Process

Ok, I've found a solution. It should look like this:

// BUFF_LEN is just a buffer size; any reasonable value works
final int BUFF_LEN = 1024;

Process p = Runtime.getRuntime().exec(new String[]{"su", "-c", "/system/bin/sh"});
DataOutputStream stdin = new DataOutputStream(p.getOutputStream());
// from here on, all commands are executed with su permissions
stdin.writeBytes("ls /data\n"); // \n executes the command
stdin.flush();
InputStream stdout = p.getInputStream();
byte[] buffer = new byte[BUFF_LEN];
int read;
String out = "";
// read() blocks forever if there is nothing left in the stream,
// so we cannot simply loop with while ((read = stdout.read(buffer)) > 0)
while (true) {
    read = stdout.read(buffer);
    if (read == -1) {
        // end of stream
        break;
    }
    out += new String(buffer, 0, read);
    if (read < BUFF_LEN) {
        // we have read everything currently available
        break;
    }
}
//do something with the output

Hope it will be helpful for someone
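
If you can close the shell once the command has been sent, an alternative that avoids guessing when the output ends is to write exit and then read until end of stream. A minimal, untested sketch of that idea (exception handling omitted; "ls /data" is just the example command from above):

// uses java.io.BufferedReader, java.io.DataOutputStream, java.io.InputStreamReader
Process p = Runtime.getRuntime().exec(new String[]{"su", "-c", "/system/bin/sh"});
DataOutputStream stdin = new DataOutputStream(p.getOutputStream());
stdin.writeBytes("ls /data\n"); // the command to run as root
stdin.writeBytes("exit\n");     // closing the shell lets the reader reach end of stream
stdin.flush();

BufferedReader reader = new BufferedReader(new InputStreamReader(p.getInputStream()));
StringBuilder out = new StringBuilder();
String line;
while ((line = reader.readLine()) != null) { // readLine() returns null at end of stream
    out.append(line).append('\n');
}
p.waitFor();
// out.toString() now contains the complete output of "ls /data"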

How to read stdout from a sub process in bash in real time

As you correctly observed, $(command) waits for the entire output of command, splits that output, and only then does the for loop start.

To read output as soon as is available, use while read:

./script-test | while IFS= read -r line; do
echo "do stuff with $line"
done

or, if you need to access variables from inside the loop afterwards, and your system supports process substitution with <():

while IFS= read -r line; do
echo "do stuff with $line"
done < <(./script-test)
# do more stuff, that depends on variables set inside the loop

libsuperuser get command output in real time

You must use Shell.Interactive.addCommand() with its callback:

Shell.Interactive rootSession = new Shell.Builder().useSU().open(/*...*/);

Once the session is open, you can add commands:

rootSession.addCommand(new String[] { "ls -l /sdcard" },
        1, // a command id
        new Shell.OnCommandLineListener() {
            @Override
            public void onCommandResult(int commandCode, int exitCode) {
                // ...
            }

            @Override
            public void onLine(String line) {
                // ...
            }
        });

The onLine(String line) callback is what you are looking for.

See sample code "InteractiveActivity" from Chainfire's libsuperuser repository.

How do I store the output of a bash command in a variable?

PROCESS=$(echo "$LINE" | awk '{print $2}')

or

PROCESS=$(ps aux | grep "$1" | awk '{print $2}')

I don't know why you're getting the error you quoted. I can't reproduce it. When you say this:

PROCESS=$LINE | awk '{print $2}'

the shell expands it to something like this:

PROCESS='mayoff  10732 ...' | awk '{print $2}'

(I've shortened the value of $LINE to make the example readable.)

The first subcommand of the pipeline sets the variable PROCESS; this variable assignment produces no output, so awk reads EOF immediately and prints nothing. And since each subcommand of the pipeline runs in a subshell, the assignment to PROCESS happens only in that subshell, not in the parent shell running the script, so PROCESS is still not set for later commands in your script.

(Note that some versions of bash can run the last subcommand of the pipeline in the current shell instead of in a subshell, but that doesn't affect this example.)

Instead of setting PROCESS in a subshell and feeding nothing to awk on standard input, you want to feed the value of LINE to awk and store the result in PROCESS in the current shell. So you need to run a command that writes the value of LINE to its standard output and connect that standard output to the standard input of awk. The echo command can do this (or the printf command, as chepner pointed out in his answer).

For the su sub-process, should I write commands to the output or input stream?

Maybe the naming is a little weird, but Process.getOutputStream() returns an OutputStream connected to the standard input of the process.

The names are from the point of view of the parent process. The parent process's output is the subprocess's input. The parent process's input is the subprocess's output.
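
To make the direction concrete, here is a minimal sketch under the same assumptions (the "id" and "exit" commands are just placeholders; exception handling omitted):

Process p = Runtime.getRuntime().exec("su");

// parent writes here -> arrives on the subprocess's standard input
DataOutputStream toProcess = new DataOutputStream(p.getOutputStream());
toProcess.writeBytes("id\n");   // any shell command; "id" is only an example
toProcess.writeBytes("exit\n");
toProcess.flush();

// parent reads here <- whatever the subprocess prints on its standard output
BufferedReader fromProcess = new BufferedReader(new InputStreamReader(p.getInputStream()));
String line;
while ((line = fromProcess.readLine()) != null) {
    System.out.println(line);
}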

How would I store the shell command output in a variable in Python?

By using the subprocess module. It is included in Python's standard library and is intended as a replacement for os.system. (Note that the capture_output parameter of subprocess.run was introduced in Python 3.7.)

>>> import subprocess
>>> subprocess.run(['cat', '/etc/hostname'], capture_output=True)
CompletedProcess(args=['cat', '/etc/hostname'], returncode=0, stdout=b'example.com\n', stderr=b'')
>>> subprocess.run(['cat', '/etc/hostname'], capture_output=True).stdout.decode()
'example.com\n'

In your case, just:

import subprocess

v = subprocess.run(['cat', '/etc/redhat-release'], capture_output=True).stdout.decode()

Update: you can split the shell command easily with shlex.split, which is provided by the standard library.

>>> import shlex
>>> shlex.split('cat /etc/redhat-release')
['cat', '/etc/redhat-release']
>>> subprocess.run(shlex.split('cat /etc/hostname'), capture_output=True).stdout.decode()
'example.com\n'

Update 2: os.popen, as mentioned by @Matthias.

However, it is impossible for this function to separate stdout from stderr.

import os

v = os.popen('cat /etc/redhat-release').read()

