Node.Js Execute System Command Synchronously


Node.js (since version 0.12 - so for a while) supports execSync:

child_process.execSync(command[, options])

You can now directly do this:

const { execSync } = require('child_process');
const output = execSync('node -v'); // returns the command's stdout, as a Buffer

and it will do what you expect, blocking until the command finishes. (By default the child's I/O is piped to the parent process.) Note that you can also use spawnSync now.

node.js -- execute command synchronously and get result

So I have a solution working, but don't exactly like it... Just posting here for reference:

I'm using the node-ffi library referenced in the other SO post. I have a function that:

  • takes in a given command
  • appends >> run-sync-output
  • executes it
  • reads run-sync-output synchronously and stores the result
  • deletes this tmp file
  • returns result

There's an obvious issue where if the user doesn't have write access to the current directory, it will fail. Plus, it's just wasted effort. :-/

Run a few exec() commands one-by-one

The reason the commands don't execute in the same order every time is that they are launched one after the other, but from then on JavaScript has no control over how long each one runs. So, for a program like yours, which is basically this:

launch cmd1, then do callback1
launch cmd2, then do callback2
respond to the client

you have no control over when callback1 and callback2 will get executed. According to your description, you are facing this:

launch cmd1
launch cmd2
respond to the client
callback2
(something else happens in your program)
callback1

and that's why you only see what you see.


So, let's try to force their order of execution! You could use child_process's execSync, but I wouldn't recommend it for production, because it keeps your server program idle the whole time your child processes are executing.

However, you can get a very similar syntax by using async/await and turning exec into a promise-returning function:

const { exec: execWithCallback } = require('child_process');
const { promisify } = require('util');
const exec = promisify(execWithCallback);

async function myFunc1() {
  try {
    const { stdout, stderr } = await exec('command 1');
  } catch (error) {
    console.error(`exec error: ${error}`);
    throw error;
  }
}

// same for myFunc2

and for your server:

app.get('/my_end_point', async (req, res) => {
  try {
    await myFunc1();
    await myFunc2();
    res.send('Hello World, from express');
  } catch (error) {
    res.send(error);
  }
});

Meteor execute system command synchronously

You could use a future:

function callToEngine(argument) {
  var Future = Npm.require('fibers/future');

  var fut = new Future();

  ///do stuff

  engine.on('close', function (code) {
    var result = JSON.parse(answerBuffer);
    console.log('child process exited with code ' + code);
    fut.return(result);
  });

  return fut.wait();
}

Then simply use:

var result = callToEngine(argument);

The future ensures that fut.wait() only returns once fut.return has been called.

More info on other designs in the Meteor Async Guide: https://gist.github.com/possibilities/3443021

Execute multiple shell commands synchronously in Node

There is a lot you could do. The simplest option would be to use execSync and execute one command after the other.

Since you need the commands to be executed one after the other, that is the solution I would go with.

If you want to stick to an asynchronous pattern, use a control-flow library like step or async.

You could even promisify exec with a lib like bluebird.

It all depends on personal coding preferences and the task at hand.

Executing shell command using child process

If you take a look at the child process docs, they say exec:

spawns a shell and runs a command within that shell, passing the stdout and stderr to a callback function when complete.

However, Ganache is a process that continues running and doesn't "complete" until you kill it. This allows you to send multiple requests to Ganache without it shutting down on you.


