Using Node.js, How to Read a JSON File into (Server) Memory

Using Node.js, how do I read a JSON file into (server) memory?

Sync:

var fs = require('fs');
var obj = JSON.parse(fs.readFileSync('file', 'utf8'));

Async:

var fs = require('fs');
var obj;
fs.readFile('file', 'utf8', function (err, data) {
  if (err) throw err;
  obj = JSON.parse(data);
});

Reading a JSON file uses a lot of memory in node

This is about the expected size. At least half a gigabyte is taken up by the string content of big.json: JavaScript strings are UTF-16, so each character takes at least 2 bytes. Note that you are still holding a reference to the parsed result, so it cannot be garbage-collected when you take the measurement.
It is hard to say exactly what the memory layout of an "array of objects" is, but objects, being hash maps, carry some overhead of their own. If they contain strings, count each character double again.
All in all, this memory usage is realistic and not entirely unexpected.
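You can observe the effect yourself with `process.memoryUsage()`. A rough sketch (the numbers are illustrative, not exact; V8 may store Latin-1-only strings with 1 byte per character):

```javascript
// Measure heap growth while a large string stays referenced.
const before = process.memoryUsage().heapUsed;
const big = 'x'.repeat(10 * 1024 * 1024); // 10 million characters
const after = process.memoryUsage().heapUsed;
// The string cannot be collected as long as `big` is referenced.
console.log(`~${((after - before) / (1024 * 1024)).toFixed(1)} MB held for 10M chars`);
```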

How to send data read from file as response from an express route in NodeJS?

This line is the problem:

app.get("/getJsonData", (err, res) =>

The first parameter is not an error, it is the request. It should be:

app.get("/getJsonData", (req, res) => {
  res.send(JSON.stringify(jsonData));
});

This is shown in the Hello world example on the Express site, where you can see that they use req:

https://expressjs.com/de/starter/hello-world.html

The reason you always get a 500 error is that err, which is actually the request object, is always truthy.

You also don't need to stringify it; you could just do res.json(jsonData)
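To see the difference without spinning up a server, the corrected handler can be written as a plain function and exercised with a tiny mock response object (`jsonData` here is placeholder data, not from the original question):

```javascript
// The corrected route handler, extracted as a plain function.
const jsonData = { greeting: 'hello' };

function getJsonData(req, res) {
  // res.json() both stringifies the body and sets Content-Type: application/json.
  res.json(jsonData);
}

// A minimal mock of Express's res object, just to show what gets sent:
const sent = [];
getJsonData({}, { json: (body) => sent.push(JSON.stringify(body)) });
console.log(sent[0]); // '{"greeting":"hello"}'
```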

How do I read a Huge Json file into a single object using NodeJS?

Using big-json solves this problem.

npm install big-json

const fs = require('fs');
const json = require('big-json');

const readStream = fs.createReadStream('file.json');
const parseStream = json.createParseStream();

parseStream.on('data', function (pojo) {
  // => receive the reconstructed POJO
});

readStream.pipe(parseStream);

Read xlsx file and json file with url signed on azure blob storage using nodejs

• Thank you @Motra for your question about reading ‘.xlsx’ and ‘.json’ files from blob storage with URLs signed with a limited time-to-live. It will surely be a valuable addition to the SO community. Posting your comment as an answer so it will be easier for other people with the same issue to find it.

Rather than passing ‘urlSignedBlobStorageForJSON’ to ‘readFileSync()’, which only reads local files, use ‘(await axios.get(urlBlobStorageData)).data’ to read the content from the blob storage account.

This lets you read the contents of ‘.xlsx’ and ‘.json’ files with URLs signed for a limited time-to-live.

• Also, please refer to the SO community thread below for more detailed information on using axios this way in Node.js:

node.js axios download file stream and writeFile

JSON file not found

A possible solution is to use the full path (starting right from C:\, for example, if you are on Windows).


To do this, you first need to import path in your code.

const path = require("path");

Next, we need to join the directory the JavaScript file is in with the JSON filename. To do this, we will use the code below.

const jsonPath = path.resolve(__dirname, "email_templates.json");

The resolve() function combines the two segments into one complete, absolute path.

Finally, you can use this path to pass into readFileSync().

fs.readFileSync(jsonPath);

This should help if the problem was that the relative path could not be resolved; the absolute path removes the ambiguity.

NodeJS: For each json file in directory, read file and run function in another JS file

You can change your index.js file to the code snippet below,
and also require the run.js file to access the login function.

const taskFolder = './tasks/';
const run = require('./run');
const fs = require('fs');

async function getTasks(file) {
  // readFileSync is synchronous, so awaiting it does nothing;
  // use the promise-based API instead.
  const data = await fs.promises.readFile(`${taskFolder}${file}`);
  const parsedData = JSON.parse(data);
  run.login(parsedData.email);
}

fs.readdir(taskFolder, (err, files) => {
  if (err) throw err;
  files.forEach(getTasks);
});

Modify the run.js file to export the login function so it can be used inside the index.js file:

function login(email) {
  console.log(email);
}

module.exports = {
  login,
};

