Save JSON Outputted from a Url to a File

This is easy in any language, but the mechanism varies. With wget and a shell:

wget 'http://search.twitter.com/search.json?q=hi' -O hi.json

To append:

wget 'http://search.twitter.com/search.json?q=hi' -O - >> hi.json

With Python:

urllib.urlretrieve('http://search.twitter.com/search.json?q=hi', 'hi.json')

To append:

hi_web = urllib2.urlopen('http://search.twitter.com/search.json?q=hi')
with open('hi.json', 'ab') as hi_file:
    hi_file.write(hi_web.read())
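Both urllib.urlretrieve and urllib2 are Python 2 APIs; on Python 3 the same calls live under urllib.request. A sketch of the equivalents, using a local file:// URL as a stand-in for the (long-retired) Twitter endpoint so it runs offline:

```python
import json
import pathlib
import urllib.request

# create a local JSON file and a file:// URL pointing at it,
# standing in for the remote endpoint
src = pathlib.Path("hi_source.json")
src.write_text(json.dumps({"q": "hi"}))
url = src.resolve().as_uri()

# Python 3 equivalent of urllib.urlretrieve
urllib.request.urlretrieve(url, "hi.json")

# Python 3 equivalent of the urllib2.urlopen append version
hi_web = urllib.request.urlopen(url)
with open("hi.json", "ab") as hi_file:
    hi_file.write(hi_web.read())
```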

cURL - execute a command and save the JSON result to a file?

Okay solved it. Thanks to @GabrielSantos's answer, this is the working code:

curl -H "Accept: application/json+v3" -H "x-api-key: <my_api_key>" https://beta.check-mot.service.gov.uk/trade/vehicles/mot-tests?registration=X182XCD -o myresponse.json

The \ before the registration parameter was stopping it from being accepted, so the returned JSON stayed blank.

If I remove -o myresponse.json, the result is printed in the CMD window instead, which is also useful.

Lesson: if the entire request is going to be a single CMD command, remove every line-continuation \.

Hope this helps!
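The same request can be sketched from Python with the standard-library urllib.request; the URL, header names, and the <my_api_key> placeholder all come from the curl command above, so substitute a real key before running it for real:

```python
import urllib.request

url = ("https://beta.check-mot.service.gov.uk/trade/vehicles/mot-tests"
       "?registration=X182XCD")

# build the request with the same Accept and x-api-key headers as the curl call
req = urllib.request.Request(url, headers={
    "Accept": "application/json+v3",
    "x-api-key": "<my_api_key>",
})

# With a valid key, saving the response body would look like:
# with urllib.request.urlopen(req) as resp, open("myresponse.json", "wb") as f:
#     f.write(resp.read())
```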

Pretty-Print JSON Data to a File using Python

You should use the optional argument indent.

header, output = client.request(twitterRequest, method="GET", body=None,
                                headers=None, force_auth_header=True)

# now write output to a file; the indent argument makes it pretty-printed
with open("twitterData.json", "w") as twitterDataFile:
    twitterDataFile.write(simplejson.dumps(simplejson.loads(output), indent=4, sort_keys=True))
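The standard-library json module works the same way, and json.dump writes and pretty-prints in one step. A sketch, with a small literal string standing in for the raw Twitter response:

```python
import json

# stand-in for the raw JSON string returned by the API
output = '{"results": [{"text": "hi"}], "query": "hi"}'

# parse, then write pretty-printed in one call; no simplejson needed
with open("twitterData.json", "w") as twitter_data_file:
    json.dump(json.loads(output), twitter_data_file, indent=4, sort_keys=True)
```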

Save json string to client pc (using HTML5 API)

You can use a Blob and the HTML5 a[download] feature to provide a JSON backup download:

var data = {a: 1, b: 2, c: 3};
var json = JSON.stringify(data);
var blob = new Blob([json], {type: "application/json"});
var url = URL.createObjectURL(blob);

var a = document.createElement('a');
a.download = "backup.json";
a.href = url;
a.textContent = "Download backup.json";
document.body.appendChild(a); // the link must be in the DOM for the user to click it

Here is a jsfiddle example: http://jsfiddle.net/potatosalad/yuM2N/

Writing Scrapy Python Output to JSON file

There is actually a scrapy command to do this (see the feed exports documentation):

scrapy crawl <spidername> -o <outputname>.<format>
scrapy crawl quotes -o quotes.json

But since you asked for the Python code, I came up with this:

    # requires `import json` at the top of the spider module
    def parse(self, response):
        with open("data_file.json", "w") as filee:
            filee.write('[')
            quotes = response.css('div.quote')
            for index, quote in enumerate(quotes):
                json.dump({
                    'text': quote.css('span.text::text').extract_first(),
                    'author': quote.css('.author::text').get(),
                    'tags': quote.css('.tag::text').getall()
                }, filee)
                if index < len(quotes) - 1:
                    filee.write(',')
            filee.write(']')

This does the same thing as the scrapy -o command for JSON files.

Output a list of JSON data to a JSON array and write them to file

To turn a list of JSON strings into a list of parsed objects, you have a few options for applying json.loads to each element. Exhibit A, a list comprehension:

parsed_data = [json.loads(s) for s in a]

Exhibit B, map:

parsed_data = map(json.loads, a)

Or, since map returns a lazy iterator on Python 3, wrap it in list to materialize the results:

parsed_data = list(map(json.loads, a))

EDIT: to output this new data structure into the desired format, you can do:

with open('output.json', 'w') as fs:
    json.dump(parsed_data, fs, indent=4)
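End to end, with a small stand-in list in place of the original data:

```python
import json

# stand-in for the list of JSON strings from the question
a = ['{"x": 1}', '{"y": 2}']

# parse each string into a Python object
parsed_data = [json.loads(s) for s in a]

# write them out as a single pretty-printed JSON array
with open('output.json', 'w') as fs:
    json.dump(parsed_data, fs, indent=4)
```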

Retrieving JSON from URL on Android

What you receive is a series of characters from the InputStream that you append to a StringBuffer and convert to a String at the end - so the resulting String is fine :)

What you want is to post-process this String via the org.json.* classes, like

String page = new Communicator().executeHttpGet("Some URL");
JSONObject jsonObject = new JSONObject(page);

and then work on jsonObject. As the data you receive is an array, you can actually say

String page = new Communicator().executeHttpGet("Some URL");
JSONArray jsonArray = new JSONArray(page);
for (int i = 0; i < jsonArray.length(); i++) {
    // get(i) returns a plain Object; getJSONObject(i) avoids the cast
    JSONObject entry = jsonArray.getJSONObject(i);
    // now get the data from each entry
}
