How to Have Each Record of JSON on a Separate Line

How to have each record of JSON on a separate line?

You can use Jackson's pretty-print JSON output.

Below are some examples:

  1. Convert an object and print it as pretty-printed JSON.

     import com.fasterxml.jackson.databind.ObjectMapper;

     User user = new User();
     //...set user data
     ObjectMapper mapper = new ObjectMapper();
     String json = mapper.writerWithDefaultPrettyPrinter().writeValueAsString(user);
     System.out.println(json);
  2. Pretty print a JSON string.

     String test = "{\"age\":29,\"messages\":[\"msg 1\",\"msg 2\",\"msg 3\"],\"name\":\"myname\"}";
     Object json = mapper.readValue(test, Object.class);
     System.out.println(mapper.writerWithDefaultPrettyPrinter().writeValueAsString(json));


How to format JSON objects to be on individual lines in bash

As always when it comes to working with JSON in scripts and from the command line, jq to the rescue:

$ jq -c . input.json
{"filename":"./","line":5,"rule":"MD009","aliases":["no-trailing-spaces"],"description":"Trailing spaces"}
{"filename":"./","line":6,"rule":"MD009","aliases":["no-trailing-spaces"],"description":"Trailing spaces"}
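If jq isn't available, the same one-object-per-line conversion can be sketched in Python (the sample records below are taken from the output above; json.dumps with compact separators matches jq -c's spacing):

```python
import json

# Sample records; in practice these would be loaded from input.json.
records = [
    {"filename": "./", "line": 5, "rule": "MD009"},
    {"filename": "./", "line": 6, "rule": "MD009"},
]

# One compact JSON object per line (NDJSON), like `jq -c`.
ndjson = "\n".join(json.dumps(r, separators=(",", ":")) for r in records)
print(ndjson)
```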

Save R JSON object with new lines for each record

Use jsonlite::stream_out:

df <- mtcars[1:5,]
jsonlite::stream_out(df, file('tmp.json'))

That gives newline-delimited JSON, or "ndjson".

How to write each JSON objects in a newline of JSON file? (Python)

I use a trick to rewrite each object onto its own line: I write what I want to keep into a new file.

import json

with open('verbes_lowercase.json', 'r', encoding='utf-8-sig') as json_data:
    data = json.load(json_data)

length = len(data)
with open("verbes.json", 'w', encoding="utf-8-sig") as file:
    file.write("[")  # open the JSON array
    for k in range(length):
        del data[k]["attribute1"]
        if k != length - 1:
            file.write(json.dumps(data[k], ensure_ascii=False) + ",\n")
    # the last record closes the array instead of adding a comma
    file.write(json.dumps(data[length - 1], ensure_ascii=False) + "]")

Python json.dumps() outputs all my data into one line but I want to have a new line for each entry

You can write a newline after each f.write(json.dumps(value, sort_keys=True, indent=0)) call, so that each entry starts on its own line.
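A minimal sketch of that approach (the file name and data are hypothetical; the indent argument is omitted here, since any indent value makes json.dumps spread a single record over several lines):

```python
import json

# Hypothetical entries to write, one per line.
entries = [{"b": 2, "a": 1}, {"d": 4, "c": 3}]

with open("out.json", "w") as f:
    for value in entries:
        # sort_keys=True gives a stable key order; the trailing
        # newline puts each entry on its own line.
        f.write(json.dumps(value, sort_keys=True) + "\n")
```

out.json then contains one JSON object per line.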

Pandas to_json in separate lines

Use the indent parameter:

import pandas as pd

data = [{'a': 1, 'b': 2},
        {'a': 3, 'b': 4}]

df = pd.DataFrame(data)

print(df.to_json(orient="records", indent=1))
# prints the records as indented JSON, one field per line
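Note that indent pretty-prints each record over several lines. If the goal is strictly one record per line (newline-delimited JSON), to_json also accepts lines=True together with orient="records":

```python
import pandas as pd

data = [{'a': 1, 'b': 2},
        {'a': 3, 'b': 4}]
df = pd.DataFrame(data)

# lines=True emits newline-delimited JSON: one compact
# record per line, with no surrounding array brackets.
out = df.to_json(orient="records", lines=True)
print(out)
```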
