Python CSV to SQLite

Importing a CSV file into a sqlite3 database table using Python

import csv, sqlite3

con = sqlite3.connect(":memory:") # change to 'your_filename.db' to write to a file on disk
cur = con.cursor()
cur.execute("CREATE TABLE t (col1, col2);") # use your column names here

with open('data.csv','r') as fin: # `with` statement available in 2.5+
    # csv.DictReader uses first line in file for column headings by default
    dr = csv.DictReader(fin) # comma is default delimiter
    to_db = [(i['col1'], i['col2']) for i in dr]

cur.executemany("INSERT INTO t (col1, col2) VALUES (?, ?);", to_db)
con.commit()
con.close()

Is there a way to import CSV data into an SQLite database without explicitly typing the columns out?

If you don't want to store the data in a file (since you don't want to create a permanent table), you can use a shared in-memory database:

conn = sqlite3.connect("file::memory:?cache=shared")

and then create the temp table (which gives a result similar to SELECT INTO); notice that you don't need to specify the column types:

# Build the CREATE TABLE and INSERT statements with one column per CSV field
crt = 'CREATE TEMP TABLE t('
ins = 'INSERT INTO t VALUES('
for n in range(len(csvrows[0])):
    if n:
        crt += ', '
        ins += ', '
    crt += f'c{n}'
    ins += '?'
crt += ');'
ins += ');'
sqlcursor.execute(crt)
sqlcursor.executemany(ins, csvrows)
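The snippet above assumes csvrows is already a list of row tuples and sqlcursor is an open cursor; a minimal sketch of how you might set those up from a file (the file name data.csv is an assumption) could look like this:

import csv, sqlite3

# Shared in-memory database, as above
conn = sqlite3.connect("file::memory:?cache=shared", uri=True)
sqlcursor = conn.cursor()

# Read every row of the CSV into a list of tuples
with open('data.csv', newline='') as f:
    reader = csv.reader(f)
    next(reader, None)  # drop the header row; remove this line if there is none
    csvrows = [tuple(row) for row in reader]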

SQLite has a VALUES keyword that you can use in a SELECT UNION or similar, for example:

sqlite> VALUES (0,1,2,3,4),(5,6,7,8,9) UNION SELECT * FROM sqlite_master;
0|1|2|3|4
5|6|7|8|9

but I don't think that helps here.
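For reference, the same statement can be run from Python through sqlite3 (a minimal sketch; requires SQLite 3.8.3 or newer):

import sqlite3

con = sqlite3.connect(":memory:")
cur = con.cursor()
# VALUES can be used as a standalone query in SQLite
rows = cur.execute("VALUES (0,1,2,3,4),(5,6,7,8,9)").fetchall()
print(rows)  # [(0, 1, 2, 3, 4), (5, 6, 7, 8, 9)]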

Create a sqlite table from a csv file

As per the sqlite3 API reference, just pass the database filename and the error will go away:

con = sqlite3.connect("data.db")
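Building on that, here is a minimal sketch that derives the column names from the CSV header row instead of typing them out (the file name data.csv and the table name data are assumptions):

import csv, sqlite3

con = sqlite3.connect("data.db")
cur = con.cursor()

with open('data.csv', newline='') as f:
    reader = csv.reader(f)
    headers = next(reader)                        # first row holds the column names
    cols = ', '.join(f'"{h}"' for h in headers)   # quote names in case they contain spaces
    cur.execute(f'CREATE TABLE IF NOT EXISTS data ({cols})')
    marks = ', '.join('?' for _ in headers)
    cur.executemany(f'INSERT INTO data VALUES ({marks})', reader)

con.commit()
con.close()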

python import csv to sqlite

You need to open the file before you pass it to csv.reader. Here's a basic runnable example that works. I added a class so your existing methods can be used as is.

import sqlite3
import csv

class csvrd(object):
    def csvFile(self):
        self.readFile('Labels.csv')

    def readFile(self, filename):
        conn = sqlite3.connect('Unicommerce.db')
        cur = conn.cursor()
        cur.execute("""CREATE TABLE IF NOT EXISTS unicom(
                       products varchar, channel varchar,
                       regulatory_forms varchar, shipment varchar)""")
        print("test1")
        with open(filename) as f:
            reader = csv.reader(f)
            for field in reader:
                cur.execute("INSERT INTO unicom VALUES (?,?,?,?);", field)

        conn.commit()
        conn.close()

c = csvrd().csvFile()

Import .csv files into SQL database using SQLite in Python

This works for me on Windows 10, but should work under Linux/Unix too. There are several problems:

  1. The last two rows of person.csv are not in the correct format, but this does not prevent the program from working. You can fix this with a text editor.
  2. person.csv uses tabs as the delimiter, not commas.
  3. There is a typo (spelling) in the line that starts with "to_db ="
  4. There is a mismatch in the number of columns to import (2 instead of 11)
  5. Wrong table name on executemany.

In addition, I create the database in a file rather than in memory. It is small enough that performance should not be a problem, and any changes you make will be saved.

Here is my corrected file (you can do the other table yourself):

import sqlite3, csv

# con = sqlite3.connect(":memory:")
con = sqlite3.connect("person.db")
cur = con.cursor()
cur.execute("CREATE TABLE person (personid STR,age STR,sex STR,primary_voting_address_id STR,state_code STR,state_fips STR,county_name STR,county_fips STR,city STR,zipcode STR, zip4 STR, PRIMARY KEY(personid))")

with open('person.csv','r') as person_table:
    dr = csv.DictReader(person_table, delimiter='\t') # person.csv is tab-delimited
    to_db = [(i['personid'], i['age'], i['sex'], i['primary_voting_address_id'], i['state_code'], i['state_fips'], i['county_name'], i['county_fips'], i['city'], i['zipcode'], i['zip4']) for i in dr]

cur.executemany("INSERT INTO person VALUES (?,?,?,?,?,?,?,?,?,?,?);", to_db)
con.commit()

import csv data to sqlite database using python

By default, a route only answers to GET, so you need to add other accepted methods explicitly:

@app.route('/file_read', methods=['POST'])
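Put together with the rest of this page, a minimal sketch of such a route that accepts an uploaded CSV and loads it into SQLite might look like this (the form field name file, the database name uploads.db, and the two-column table are all assumptions):

import csv, io, sqlite3
from flask import Flask, request

app = Flask(__name__)

@app.route('/file_read', methods=['POST'])
def file_read():
    # Decode the uploaded file and parse it as CSV
    text = request.files['file'].read().decode('utf-8')
    rows = list(csv.reader(io.StringIO(text)))
    con = sqlite3.connect('uploads.db')
    cur = con.cursor()
    cur.execute("CREATE TABLE IF NOT EXISTS uploaded (col1, col2)")
    cur.executemany("INSERT INTO uploaded VALUES (?, ?)", rows)
    con.commit()
    con.close()
    return f"{len(rows)} rows inserted"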

how to insert values from csv into sqlite3 using python

You can simply use the DataFrame.to_sql() method:

import sqlite3

conn = sqlite3.connect('c:/temp/test.sqlite')

#...

df.to_sql('rduWeather', conn, if_exists='append', index_label='id')

Demo:

In [100]: df.to_sql('rduWeather', conn, if_exists='append', index_label='id')

In [101]: pd.read_sql('select * from rduWeather', conn)
Out[101]:
   id        date  temperaturemin  temperaturemax
0   0  2007-01-13            48.0            69.1
1   1  2007-01-19            34.0            54.0
2   2  2007-01-21            28.0            35.1
3   3  2007-01-25            30.9            46.9
4   4  2007-01-27            32.0            64.0
5   5  2007-02-05            19.9            39.9

In [102]: pd.read_sql('select * from rduWeather', conn, index_col='id')
Out[102]:
          date  temperaturemin  temperaturemax
id
0   2007-01-13            48.0            69.1
1   2007-01-19            34.0            54.0
2   2007-01-21            28.0            35.1
3   2007-01-25            30.9            46.9
4   2007-01-27            32.0            64.0
5   2007-02-05            19.9            39.9
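
For completeness, a minimal sketch of the whole CSV-to-SQLite pipeline with pandas (the file name rduWeather.csv is an assumption):

import sqlite3
import pandas as pd

df = pd.read_csv('rduWeather.csv')  # read the CSV into a DataFrame

conn = sqlite3.connect('c:/temp/test.sqlite')
df.to_sql('rduWeather', conn, if_exists='append', index_label='id')
conn.close()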

