Psycopg2 Insert Python Dictionary as Json

Psycopg2 insert Python dictionary as JSON

cur.execute("INSERT INTO product(store_id, url, price, charecteristics, color, dimensions) VALUES (%s, %s, %s, %s, %s, %s)", (1,  'http://www.google.com', '$20', json.dumps(thedictionary), 'red', '8.5x11'))

That will solve your problem. However, you really should be storing keys and values in their own separate columns. To retrieve the dictionary, do:

cur.execute('select charecteristics from product where store_id = 1')
dictionary = json.loads(cur.fetchone()[0])

(Note: if the column is of type json or jsonb, psycopg2 already decodes it into a Python dict, so the json.loads() call is only needed when the column is plain text.)
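As an aside, psycopg2 also ships a Json adapter in psycopg2.extras that makes the manual json.dumps() call unnecessary; a minimal sketch of the same insert:

from psycopg2.extras import Json

# Json() wraps the dict so psycopg2 serializes it to a JSON literal itself;
# no explicit json.dumps() call is needed.
cur.execute(
    "INSERT INTO product(store_id, url, price, charecteristics, color, dimensions) "
    "VALUES (%s, %s, %s, %s, %s, %s)",
    (1, 'http://www.google.com', '$20', Json(thedictionary), 'red', '8.5x11'),
)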

Create/Insert Json in Postgres with requests and psycopg2

It seems like you want to create a table with one column named "data". The type of this column is JSON. (I would recommend creating one column per field, but it's up to you.)

In this case the variable data (read from the request) is a list of dicts. As I mentioned in my comment, you can loop over data and do the inserts one at a time, since psycopg2's executemany() is not faster than repeated calls to execute(). (If insert speed matters, see the execute_values() sketch after the full example.)

What I did was the following:

  1. Create a list of fields that you care about.
  2. Loop over the elements of data
  3. For each item in data, extract the fields into my_data
  4. Call execute() and pass in json.dumps(my_data), which converts my_data from a dict into a JSON string.

Try this:

#!/usr/bin/env python
import requests
import psycopg2
import json

conn = psycopg2.connect(database='NHL', user='postgres', password='postgres', host='localhost', port='5432')

req = requests.get('http://www.nhl.com/stats/rest/skaters?isAggregate=false&reportType=basic&isGame=false&reportName=skatersummary&sort=[{%22property%22:%22playerName%22,%22direction%22:%22ASC%22},{%22property%22:%22goals%22,%22direction%22:%22DESC%22},{%22property%22:%22assists%22,%22direction%22:%22DESC%22}]&cayenneExp=gameTypeId=2%20and%20seasonId%3E=20172018%20and%20seasonId%3C=20172018')

# data here is a list of dicts
data = req.json()['data']

cur = conn.cursor()
# create a table with one column of type JSON
cur.execute("CREATE TABLE t_skaters (data json);")

fields = [
    'seasonId',
    'playerName',
    'playerFirstName',
    'playerLastName',
    'playerId',
    'playerHeight',
    'playerPositionCode',
    'playerShootsCatches',
    'playerBirthCity',
    'playerBirthCountry',
    'playerBirthStateProvince',
    'playerBirthDate',
    'playerDraftYear',
    'playerDraftRoundNo',
    'playerDraftOverallPickNo'
]

for item in data:
    my_data = {field: item[field] for field in fields}
    cur.execute("INSERT INTO t_skaters VALUES (%s)", (json.dumps(my_data),))


# commit changes
conn.commit()
# Close the connection
conn.close()

I am not 100% sure if all of the postgres syntax is correct here (I don't have access to a PG database to test), but I believe that this logic should work for what you are trying to do.
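If insert speed ever becomes a concern, psycopg2.extras.execute_values() batches many rows into a single INSERT statement and is typically much faster than a plain loop. A sketch under the same one-column schema (page_size=100 is just the default made explicit):

from psycopg2.extras import execute_values, Json

# One statement per batch instead of one per row; execute_values()
# expands %s into the list of (json,) tuples itself.
execute_values(
    cur,
    "INSERT INTO t_skaters (data) VALUES %s",
    [(Json({field: item[field] for field in fields}),) for item in data],
    page_size=100,
)
conn.commit()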

Update For Separate Columns

You can modify your create statement to handle multiple columns, but it would require knowing the data type of each column. Here's some pseudocode you can follow:

# same boilerplate code from above
cur = conn.cursor()
# create a table with one column per field
cur.execute(
    """CREATE TABLE t_skaters (seasonId INTEGER, playerName VARCHAR, ...);"""
)

fields = [
    'seasonId',
    'playerName',
    'playerFirstName',
    'playerLastName',
    'playerId',
    'playerHeight',
    'playerPositionCode',
    'playerShootsCatches',
    'playerBirthCity',
    'playerBirthCountry',
    'playerBirthStateProvince',
    'playerBirthDate',
    'playerDraftYear',
    'playerDraftRoundNo',
    'playerDraftOverallPickNo'
]

for item in data:
    my_data = [item[field] for field in fields]
    # need a placeholder (%s) for each variable
    # refer to postgres docs on INSERT statement on how to specify order
    cur.execute("INSERT INTO t_skaters VALUES (%s, %s, ...)", tuple(my_data))


# commit changes
conn.commit()
# Close the connection
conn.close()

Replace the ... with the appropriate values for your data.
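As a sketch of one way to avoid hand-writing the column list and placeholders (assuming the column names in t_skaters match the entries of fields exactly):

# Safe to interpolate here because `fields` is a hard-coded list, not user
# input; the values themselves still go through %s parameters.
columns = ', '.join(fields)
placeholders = ', '.join(['%s'] * len(fields))
insert_sql = 'INSERT INTO t_skaters ({}) VALUES ({})'.format(columns, placeholders)

for item in data:
    cur.execute(insert_sql, tuple(item[field] for field in fields))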

How to insert an array of JSON with psycopg2

This is the answer from the psycopg team.

Looking at mogrify:

print(cur.mogrify("%(data)s", insertdata).decode()) 
ARRAY['{"foo": 1}','{"foo": 2}']

The adapter for the json wrapper doesn't add a cast to the snippet it generates, so it is passed to the parser as "unknown". Postgres can cast unknown -> json, but there is no such type as an "unknown array" (there is anyarray, but it is only a pseudotype used as a parameter type and cannot really be instantiated), so the array is temporarily converted to a text array. Because there is no implicit text[] -> json[] cast, the automatic cast fails.

You can cast the expression, as suggested, using

cur.execute("insert into testtable values(%(id)s, %(data)s::json[])", insertdata)

Note that psycopg 3 has a more refined adaptation system that is able to figure out the type of the parameter, so the query above doesn't require an explicit cast.
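For comparison, a minimal sketch of the same insert under psycopg 3 (assuming the same testtable exists):

import psycopg
from psycopg.types.json import Json

# psycopg 3 infers that the parameter is a json[] value,
# so no ::json[] cast is required.
with psycopg.connect('dbname=test') as conn:
    conn.execute(
        'insert into testtable values (%(id)s, %(data)s)',
        {'id': 2, 'data': [Json({'foo': 1}), Json({'foo': 2})]},
    )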

Psycopg2 insert python dictionary in postgres database

Two solutions:

from psycopg2.extensions import AsIs

d = {'k1': 'v1', 'k2': 'v2'}
cursor = conn.cursor()

# Solution 1: interpolate the column list with AsIs and pass the
# values as a tuple, which psycopg2 adapts to a record literal.
insert = 'insert into table (%s) values %s'
l = list(d.items())
columns = ','.join(t[0] for t in l)
values = tuple(t[1] for t in l)
print(cursor.mogrify(insert, [AsIs(columns), values]).decode())

# Solution 2: build one named placeholder per key and pass the
# dict itself as the parameter mapping.
keys = d.keys()
columns = ','.join(keys)
values = ','.join('%({})s'.format(k) for k in keys)
insert = 'insert into table ({0}) values ({1})'.format(columns, values)
print(cursor.mogrify(insert, d).decode())

Output:

insert into table (k1,k2) values ('v1', 'v2')
insert into table (k1,k2) values ('v1','v2')
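If the dictionary keys can come from untrusted input, the plain format()/AsIs approach is open to SQL injection; psycopg2's sql module composes identifiers safely. A sketch of the same insert (keeping the example table name):

from psycopg2 import sql

d = {'k1': 'v1', 'k2': 'v2'}
keys = list(d)

# sql.Identifier quotes each name, so hostile keys cannot break out of
# the statement the way they could with plain format() or AsIs.
query = sql.SQL('insert into {} ({}) values ({})').format(
    sql.Identifier('table'),
    sql.SQL(',').join(map(sql.Identifier, keys)),
    sql.SQL(',').join(sql.Placeholder(k) for k in keys),
)
cursor.execute(query, d)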

