How to Export a MySQL Database to JSON

How to export a MySQL database to JSON?

It may be asking too much of MySQL to expect it to produce well-formed JSON directly from a query. Instead, consider producing something more convenient, like CSV (using the INTO OUTFILE '/path/to/output.csv' FIELDS TERMINATED BY ',' snippet you already know), and then transforming the results into JSON in a language with built-in support for it, like Python or PHP.
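For instance, here is a minimal sketch of the CSV-to-JSON step. The file path and the column list are assumptions; match them to whatever your INTO OUTFILE actually produced:

import csv
import json

# read the CSV produced by INTO OUTFILE and emit a JSON array of objects
# (path and column names are placeholders -- adjust to your export)
with open('/path/to/output.csv', newline='') as f:
    reader = csv.reader(f)
    rows = [dict(zip(['name', 'email', 'enrolled'], record)) for record in reader]

print(json.dumps(rows, default=str))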

Edit: a Python example, using the fine SQLAlchemy:

class Student(object):
    '''The model, a plain ol' Python class'''
    def __init__(self, name, email, enrolled):
        self.name = name
        self.email = email
        self.enrolled = enrolled

    def __repr__(self):
        return "<Student(%r, %r)>" % (self.name, self.email)

    def make_dict(self):
        return {'name': self.name, 'email': self.email}

import sqlalchemy

metadata = sqlalchemy.MetaData()
students_table = sqlalchemy.Table('students', metadata,
    sqlalchemy.Column('id', sqlalchemy.Integer, primary_key=True),
    sqlalchemy.Column('name', sqlalchemy.String(100)),
    sqlalchemy.Column('email', sqlalchemy.String(100)),
    sqlalchemy.Column('enrolled', sqlalchemy.Date)
)

# connect to the database; substitute the needed values.
engine = sqlalchemy.create_engine('mysql://user:pass@host/database')

# if needed, create the table:
metadata.create_all(engine)

# map the model to the table (classical mapping; note that SQLAlchemy 2.0
# replaces mapper() with registry().map_imperatively())
import sqlalchemy.orm
sqlalchemy.orm.mapper(Student, students_table)

# queries go through a session, not the engine itself
Session = sqlalchemy.orm.sessionmaker(bind=engine)
session = Session()

# now you can issue queries against the database using the mapping:
non_students = session.query(Student).filter_by(enrolled=None)

# and let's make some JSON out of it (json.dumps needs a list, not a
# generator, so build the dicts up front):
import json
non_students_dicts = [student.make_dict() for student in non_students]
students_json = json.dumps(non_students_dicts)

How to export a MySQL database to JSON using mysql-connector from Python?

If you have objects that are not serializable to JSON (such as DATETIME values), you can pass default=str to json.dumps:

import json

cursor.execute("SELECT * FROM registros")
data = cursor.fetchall()
# dump the whole result set once (the original looped over the rows but
# printed the entire list on every iteration)
print(json.dumps(data, sort_keys=True, indent=4, separators=(',', ': '), default=str))
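For context, a complete sketch with the connection boilerplate; the credentials and table name are placeholders, and the dictionary cursor makes each row serialize as a JSON object rather than a bare array:

import json
import mysql.connector

# placeholder credentials -- substitute your own
connection = mysql.connector.connect(
    host='localhost', user='user', password='pass', database='mydb')

# dictionary=True returns rows as dicts, so the JSON output has column names
cursor = connection.cursor(dictionary=True)
cursor.execute("SELECT * FROM registros")
data = cursor.fetchall()
print(json.dumps(data, sort_keys=True, indent=4, default=str))

connection.close()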

Export MySQL database / MySQL tables in JSON format through the command line or PHP

MySQL cannot output directly in JSON format (at least not in older versions; 5.7+ adds JSON functions such as JSON_OBJECT, used in a later answer below),

so you have two options:

  1. export in XML, then use a tool to convert from XML to JSON (a tool that can deal with large tables, of course)

  2. write a small script (for example in PHP) that fetches the data from the DB and writes it to a file in JSON
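For option (1), getting the XML is the easy half: mysqldump --xml dbname (or mysql --xml with a query) already emits well-formed XML; the XML-to-JSON converter is the part you have to supply.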

Important note:

If you choose option (2), you may have trouble loading all the table data, converting it to JSON, and saving it to a file in a single "atomic" step if you have a lot of records.

However, you can break the task into steps.

Basically, a table turned into JSON is an array of objects, each object representing a single record.

  1. Open a connection to the DB.

  2. Start writing the output file and open the array by writing an opening square bracket [.

  3. Execute your query, fetching n (1 < n < 1000) records at a time. (To do that, ORDER the table by some field, e.g. id, and use the LIMIT clause.)

  4. Convert each record with json_encode, write the string to the file, and write a comma , unless you have written the last record.

  5. Go back to 3 until you reach the last record.

  6. Write a closing square bracket ] to the file (closing the array).

  7. Close the file and the DB connection.

This requires a little more coding, but it's not rocket science; a sketch of the loop follows below.
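The answer suggests PHP, but the loop is the same in any language. A minimal Python sketch of the steps above, using mysql-connector; the table name, key column, output path, and connection details are all placeholders:

import json
import mysql.connector

# placeholders -- substitute your own connection details and table
conn = mysql.connector.connect(
    host='localhost', user='user', password='pass', database='mydb')
cursor = conn.cursor(dictionary=True)

chunk = 500          # step 3: 1 < n < 1000 records at a time
offset = 0
first = True

with open('table.json', 'w') as f:
    f.write('[')     # step 2: open the array
    while True:
        # step 3: a stable ORDER BY plus LIMIT/OFFSET pages through the table
        cursor.execute(
            "SELECT * FROM mytable ORDER BY id LIMIT %s OFFSET %s",
            (chunk, offset))
        rows = cursor.fetchall()
        if not rows:
            break
        for row in rows:
            if not first:
                f.write(',')                      # step 4: comma between records
            f.write(json.dumps(row, default=str))
            first = False
        offset += chunk                           # step 5: back to step 3
    f.write(']')     # step 6: close the array

conn.close()         # step 7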

...and maybe you'll find something online that already does that.

Update:

A script that fetches the data from the DB and writes it to a file in JSON can be found on GitHub: Dump-table-to-JSON (last updated 2016).

How can I export all tables and all records in a database into JSON using C#?

JSON has only a limited number of scalar data types (string, floating-point number, Boolean, null); you may lose type fidelity by exporting your MySQL data to JSON (because DATETIME, TIMESTAMP, GUID, BLOB, etc. will have to be converted to strings).

But if you still want to export a database to JSON, first you need to find all the tables in the database (by querying the information_schema.tables table), then iterate over each table, selecting all the rows and dumping them to JSON. Because this may be a lot of data, to avoid running out of memory you'll need to stream the results to your output file (instead of creating a large number of objects in memory then converting them to JSON). This requires using a low-level JSON writing API, so you need to ensure that WriteStartObject and WriteEndObject calls are paired correctly to create valid JSON.

The following program snippet demonstrates this technique:

// requires: using System.Collections.Generic; using System.IO;
//           using MySql.Data.MySqlClient; using Newtonsoft.Json;
using (var connection = new MySqlConnection("Server = localhost; Database = app_erp_suneka; Uid = root; Pwd = ;"))
{
    connection.Open();

    // get the names of all tables in the chosen database
    var tableNames = new List<string>();
    using (var command = new MySqlCommand(@"SELECT table_name FROM information_schema.tables WHERE table_schema = @database", connection))
    {
        command.Parameters.AddWithValue("@database", "app_erp_suneka");
        using (var reader = command.ExecuteReader())
        {
            while (reader.Read())
                tableNames.Add(reader.GetString(0));
        }
    }

    // open a JSON file for output; use the streaming JsonTextWriter interface to avoid high memory usage
    using (var streamWriter = new StreamWriter(@"C:\Temp\app_erp_suneka.json"))
    using (var jsonWriter = new JsonTextWriter(streamWriter) { Formatting = Newtonsoft.Json.Formatting.Indented, Indentation = 2, IndentChar = ' ' })
    {
        // one array to hold all tables
        jsonWriter.WriteStartArray();

        foreach (var tableName in tableNames)
        {
            // an object for each table
            jsonWriter.WriteStartObject();
            jsonWriter.WritePropertyName("tableName");
            jsonWriter.WriteValue(tableName);
            jsonWriter.WritePropertyName("rows");

            // an array for all the rows in the table
            jsonWriter.WriteStartArray();

            // select all the data from each table
            using (var command = new MySqlCommand($@"SELECT * FROM `{tableName}`", connection))
            using (var reader = command.ExecuteReader())
            {
                while (reader.Read())
                {
                    // write each row as a JSON object
                    jsonWriter.WriteStartObject();
                    for (int i = 0; i < reader.FieldCount; i++)
                    {
                        jsonWriter.WritePropertyName(reader.GetName(i));
                        jsonWriter.WriteValue(reader.GetValue(i));
                    }
                    jsonWriter.WriteEndObject();
                }
            }

            jsonWriter.WriteEndArray();
            jsonWriter.WriteEndObject();
        }

        jsonWriter.WriteEndArray();
    }
}
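One design note on the snippet: the table name is interpolated into the SELECT rather than passed as a parameter, because SQL parameters can hold values but not identifiers; the backtick quoting, plus the fact that the names come straight from information_schema, is what keeps that interpolation safe here.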

Export MySQL JSON data type field directly to CSV

Well, it's a bit like when the data is in columns: you have to extract each value you want from each row's data column.

SELECT JSON_EXTRACT(data, '$.email') AS email,
       JSON_EXTRACT(data, '$.user_id') AS user_id,
       . . .
FROM users_health
WHERE foo = 'bar'
INTO OUTFILE '/var/lib/mysql-files/orders.csv'
FIELDS TERMINATED BY ','
ENCLOSED BY '"'
LINES TERMINATED BY '\n';
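One caveat: JSON_EXTRACT returns values still in JSON form, so strings keep their double quotes; wrap each extraction in JSON_UNQUOTE(...) (or use the ->> operator, available since MySQL 5.7.13) if you want bare values in the CSV.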

How to export in JSON format, for all databases that have a conversations table, from said date to said date

Bill Karwin's answer was right on. I used what I could and created the following script:

#!/bin/bash

days="8"
group_name="C4"
wdir="/home/superfly/conversations"
backup_dir="${wdir}/.backups"
db_list="database_list.txt"
table="conversations"
#--
backup_date="`date +"%m-%d-%y"`"
tarball_file="${group_name}__${backup_date}"
email1="one@gmail.com"
email2="two@gmail.com"
email3="three@gmail.com"

#-- clean up any old files that may exist
cd $backup_dir && rm -rf ${tarball_file}* ${db_list} *.json;

#-- fetch database list minus the default system database names
mysql --login-path=master --skip-column-names -e "SELECT schema_name AS DatabaseList \
FROM information_schema.schemata WHERE schema_name NOT IN ('mysql', 'Database', \
'information_schema', 'performance_schema', 'sys', 'schema_name')" > $db_list

#-- fetch the last 8 days of the conversations table in json format
for database in `cat $db_list`;
do
    mysql --login-path=master -BN --database=${database} --execute="SELECT \
        JSON_OBJECT('id', id, 'sender_id', sender_id, 'recipient_id', recipient_id, \
        'timestamp', timestamp, 'content', content, 'quick_reply', quick_reply, 'type', \
        type, 'seq', seq, 'meta', meta) FROM $table WHERE timestamp >= DATE_SUB(NOW(), \
        INTERVAL ${days} day)" > ${database}.json;
done

#-- remove empty JSON files - compress populated JSON files
find $backup_dir -size 0 -delete;
tar czf ${tarball_file}.tar.gz *.json;
rm -rf *.json $db_list;

#-- email compressed file
echo "Conversations Logs From The Last 8 days" | mutt -a ${tarball_file}.tar.gz \
-s"Conversations Logs" -- $email1 $email2 $email3
exit 0;

