Importing JSON into MySQL

Importing a JSON file into MySQL 5.7

A couple of options are:

  1. 13.2.6 LOAD DATA Syntax
  2. 7.2 JSON Import Utility

Test:

$ mysql
Welcome to the MySQL monitor. Commands end with ; or \g.
Your MySQL connection id is 1
Server version: 5.7.26 MySQL Community Server (GPL)

mysql> SELECT VERSION();
+-----------+
| VERSION() |
+-----------+
| 5.7.26    |
+-----------+
1 row in set (0.00 sec)

mysql> DROP TABLE IF EXISTS `test`.`example_table`;
Query OK, 0 rows affected, 1 warning (0.00 sec)

mysql> CREATE TABLE IF NOT EXISTS `test`.`example_table` (
-> `id` INT NOT NULL AUTO_INCREMENT,
-> `json_data` JSON NOT NULL,
-> PRIMARY KEY (`id`)
-> );
Query OK, 0 rows affected (0.01 sec)

1. LOAD DATA

File: /path/to/file/sample.json

{"price": null, "sale_list": [{"buyer": "SmackMe089", "date": "April 29th 2019 21:20:50", "id": "1234", "item_desc": ""}]}

MySQL Command-Line Client

mysql> LOAD DATA INFILE '/path/to/file/sample.json'
-> INTO TABLE `test`.`example_table` (`json_data`);
Query OK, 1 row affected (0.01 sec)
Records: 1 Deleted: 0 Skipped: 0 Warnings: 0

mysql> SELECT `id`, `json_data`
-> FROM `test`.`example_table`\G
*************************** 1. row ***************************
id: 1
json_data: {"price": null, "sale_list": [{"id": "1234", "date": "April 29th 2019 21:20:50", "buyer": "SmackMe089", "item_desc": ""}]}
1 row in set (0.00 sec)
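Note that LOAD DATA reads one row per line by default, so this works only because sample.json holds the whole document on a single line. A pretty-printed file would need collapsing first; a minimal sketch in Python (the file names are hypothetical, adjust to your environment):

```python
import json

# Hypothetical paths; adjust to your environment.
SRC = "sample_pretty.json"
DST = "sample.json"

def collapse_to_single_line(src: str, dst: str) -> None:
    """Rewrite a (possibly pretty-printed) JSON file as one compact
    document on a single line, the shape LOAD DATA expects."""
    with open(src) as f:
        doc = json.load(f)
    with open(dst, "w") as f:
        f.write(json.dumps(doc, separators=(",", ":")) + "\n")
```
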

2. JSON Import Utility

File: /path/to/file/sample.json

{
    "price": null,
    "sale_list": [
        {
            "buyer": "SmackMe089",
            "date": "April 29th 2019 21:20:50",
            "id": "1234",
            "item_desc": ""
        }
    ]
}

MySQL Shell: As @TimBiegeleisen says, you can use MySQL Shell (even with MySQL 5.7), but you must enable the X Plugin:

$ mysqlsh --sql
MySQL Shell 8.0.16

Your MySQL connection id is 2 (X protocol)
Server version: 5.7.26 MySQL Community Server (GPL)
No default schema selected; type \use <schema> to set one.

MySQL 127.0.0.1:15726+ SQL > SELECT VERSION();
+-----------+
| VERSION() |
+-----------+
| 5.7.26    |
+-----------+
1 row in set (0.0004 sec)

MySQL 127.0.0.1:15726+ SQL > SELECT `id`, `json_data`
-> FROM `test`.`example_table`\G
*************************** 1. row ***************************
id: 1
json_data: {"price": null, "sale_list": [{"id": "1234", "date": "April 29th 2019 21:20:50", "buyer": "SmackMe089", "item_desc": ""}]}
1 row in set (0.00 sec)

MySQL 127.0.0.1:15726+ SQL > \js
Switching to JavaScript mode...

MySQL 127.0.0.1:15726+ JS > util.importJson('/path/to/file/sample.json', {schema: 'test', table: 'example_table', tableColumn: 'json_data'});
Importing from file "/path/to/file/sample.json" to table `test`.`example_table` in MySQL Server at 127.0.0.1:15726

.. 1.. 1
Processed 204 bytes in 1 document in 0.0007 sec (1.36K documents/s)
Total successfully imported documents 1 (1.36K documents/s)

MySQL 127.0.0.1:15726+ JS > \sql
Switching to SQL mode... Commands end with ;

MySQL 127.0.0.1:15726+ SQL > SELECT `id`, `json_data`
-> FROM `test`.`example_table`\G
*************************** 1. row ***************************
id: 1
json_data: {"price": null, "sale_list": [{"id": "1234", "date": "April 29th 2019 21:20:50", "buyer": "SmackMe089", "item_desc": ""}]}
1 row in set (0.0007 sec)
*************************** 2. row ***************************
id: 2
json_data: {"price": null, "sale_list": [{"id": "1234", "date": "April 29th 2019 21:20:50", "buyer": "SmackMe089", "item_desc": ""}]}
2 rows in set (0.0006 sec)

How to import a JSON file from an HTTPS web page into a MariaDB or MySQL database?

Just a thought, but you could write a simple Bash script, e.g.

#!/bin/sh


cd "$(dirname "$0")"
export PATH=/bin:/usr/bin:/usr/local/bin
TODAY=$(date +"%d%b%Y-%H%M")

MYSQL_HOST='127.0.0.1'
MYSQL_PORT='3306'
MYSQL_USER='root'
MYSQL_PASSWORD='root'


url='https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_week.geojson'
DATA=$(curl "${url}" 2>/dev/null)
printf '%s' "$DATA" | awk '{print $0}'
exit

...
...
...

and then use a tool like jq (https://webinstall.dev/jq/), Python, or whatever other tools you have on your system to extract the data you want and update the DB.
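As one illustration of the "extract with Python" route: the USGS feed is GeoJSON, so pulling out the magnitude, place, and time is a few lines. This is a sketch assuming the field names of that feed (`features`, `properties.mag`, `properties.place`, `properties.time` in epoch milliseconds); the conversion mirrors gmdate() in the PHP example below:

```python
import json
from datetime import datetime, timezone

def rows_from_geojson(doc):
    """Extract (mag, place, time) rows from a USGS GeoJSON feed;
    `time` in the feed is epoch milliseconds, converted here to a
    UTC string like gmdate("Y-m-d H:i:s", ...) in PHP."""
    rows = []
    for feature in doc.get("features", []):
        props = feature["properties"]
        when = datetime.fromtimestamp(props["time"] / 1000, tz=timezone.utc)
        rows.append((props["mag"], props["place"],
                     when.strftime("%Y-%m-%d %H:%M:%S")))
    return rows

if __name__ == "__main__":
    from urllib.request import urlopen
    url = ("https://earthquake.usgs.gov/earthquakes/feed/v1.0/"
           "summary/all_week.geojson")
    doc = json.load(urlopen(url))
    for row in rows_from_geojson(doc)[:5]:
        print(row)
```

The resulting tuples can then be fed to whatever DB client you prefer.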

I am more familiar with PHP, which would also work. The script below could be tidied up a bit, but it does work.


earthquake.php

<?php
$database = false;
try {
    $options = array(
        PDO::ATTR_DEFAULT_FETCH_MODE => PDO::FETCH_OBJ,
        PDO::ATTR_ERRMODE => PDO::ERRMODE_WARNING,
        PDO::ATTR_EMULATE_PREPARES => true
    );
    $conn = new PDO('mysql:host=127.0.0.1;dbname=test;port=3306;charset=utf8', 'root', 'root', $options);
} catch (PDOException $e) {
    // Echo a custom message; the error code gives you some info.
    echo '[{"error":"Database connection can not be established. Please try again later. Error code: ' . $e->getCode() . '"}]';
    exit;
}

$url = "https://earthquake.usgs.gov/earthquakes/feed/v1.0/summary/all_week.geojson";
$ch = curl_init();
curl_setopt($ch, CURLOPT_RETURNTRANSFER, 1);
curl_setopt($ch, CURLOPT_URL, $url);
$result = curl_exec($ch);
$features = json_decode($result)->features;

foreach ($features as $feature) {
    echo 'Mag: ' . $feature->properties->mag
        . ', Place: ' . $feature->properties->place
        . ', ' . gmdate("Y-m-d H:i:s", $feature->properties->time / 1000) . PHP_EOL;
    $query = 'INSERT INTO features (mag, place, time) VALUES (?, ?, ?)';
    $params = [
        $feature->properties->mag,
        $feature->properties->place,
        gmdate("Y-m-d H:i:s", $feature->properties->time / 1000)
    ];
    $stmt = $conn->prepare($query) or die('[{"error":"Prepare Statement Failure","query":"' . $query . '"}]');
    $stmt->execute($params) or die('[{"error":"' . $stmt->errorInfo()[2] . '","query":"' . $query . '","params":' . json_encode($params) . '}]');
}
?>

Create a local table called features (in the test database) with mag, place and time columns. If you have PHP on your system, just run the script from the CLI: php earthquake.php

See also, e.g.: Insert into MySQL from a Bash script

I use Laravel a bit, and would probably actually build my own model and use Eloquent and a little UI to handle that, but using a script seems like an option.

How can I import a JSON file into a MySQL database, using a simple query, without first converting it to another file format such as CSV?

I hope that in the near future MySQL will offer native functionality for this.

One option (not a simple query) is something like the following script (adapt as needed). Depending on the number of items, it may have performance problems.

File: /path/to/file/loadsetProfile.json:

[
    {
        "executionDateTime": "2017-07-07 15:21:15",
        "A": 1,
        "B": 1
    },
    {
        "executionDateTime": "2017-07-07 15:21:15",
        "A": 2,
        "B": 2
    },
    {
        "executionDateTime": "2017-07-07 15:21:15",
        "A": 3,
        "B": 3
    },
    {
        "executionDateTime": "2017-07-07 15:21:15",
        "A": 4,
        "B": 4
    }
]

MySQL Command-Line:

mysql> SELECT VERSION();
+-----------+
| VERSION() |
+-----------+
| 5.7.18    |
+-----------+
1 row in set (0.00 sec)

mysql> DROP PROCEDURE IF EXISTS `import_from_json`;
Query OK, 0 rows affected (0.00 sec)

mysql> DROP FUNCTION IF EXISTS `uuid_to_bin`;
Query OK, 0 rows affected (0.00 sec)

mysql> DROP TABLE IF EXISTS `temp_my_table`, `my_table`;
Query OK, 0 rows affected (0.00 sec)

mysql> CREATE TABLE IF NOT EXISTS `temp_my_table` (
-> `id` BINARY(16) NOT NULL PRIMARY KEY,
-> `content` JSON NOT NULL
-> );
Query OK, 0 rows affected (0.00 sec)

mysql> CREATE TABLE IF NOT EXISTS `my_table` (
-> `executionDateTime` TIMESTAMP,
-> `A` BIGINT UNSIGNED,
-> `B` BIGINT UNSIGNED
-> );
Query OK, 0 rows affected (0.00 sec)

mysql> CREATE FUNCTION `uuid_to_bin` (`id` VARCHAR(36))
-> RETURNS BINARY(16)
-> DETERMINISTIC
-> RETURN UNHEX(REPLACE(`id`, '-', ''));
Query OK, 0 rows affected (0.00 sec)

mysql> DELIMITER //

mysql> CREATE PROCEDURE `import_from_json`(`_id` VARCHAR(36))
-> BEGIN
-> DECLARE `_id_current_json` BINARY(16) DEFAULT `uuid_to_bin`(`_id`);
-> DECLARE `_items_length`,
-> `_current_item` BIGINT UNSIGNED DEFAULT 0;
-> DECLARE `_content` JSON DEFAULT (SELECT `content`
-> FROM `temp_my_table`
-> WHERE `id` = `_id_current_json`);
->
-> IF JSON_VALID(`_content`) THEN
-> SET `_items_length` := JSON_LENGTH(`_content`),
-> @`insert_import_from_json` := NULL;
-> WHILE `_current_item` < `_items_length` DO
-> SET @`insert_import_from_json` := CONCAT('
'> INSERT INTO `my_table` (
'> `executionDateTime`,
'> `A`,
'> `B`
'> )
'> SELECT
'> `content` ->> \'$[', `_current_item`, '].executionDateTime\',
'> `content` ->> \'$[', `_current_item`, '].A\',
'> `content` ->> \'$[', `_current_item`, '].B\'
'> FROM `temp_my_table`
'> WHERE `id` = \'', `_id_current_json`, '\'
'> ');
-> PREPARE `stmt` FROM @`insert_import_from_json`;
-> EXECUTE `stmt`;
-> SET `_current_item` := `_current_item` + 1;
-> END WHILE;
->
-> IF `_current_item` > 0 THEN
-> SET @`insert_import_from_json` := NULL;
-> DEALLOCATE PREPARE `stmt`;
-> END IF;
-> END IF;
-> END//
Query OK, 0 rows affected (0.00 sec)

mysql> DELIMITER ;

mysql> SET @`UUID` := UUID();
Query OK, 0 rows affected (0.00 sec)

mysql> LOAD DATA LOCAL INFILE '/path/to/file/loadsetProfile.json'
-> INTO TABLE `temp_my_table`
-> LINES TERMINATED BY '\r'
-> (`content`)
-> SET `id` = `uuid_to_bin`(@`UUID`);
Query OK, 1 row affected (0.00 sec)
Records: 1 Deleted: 0 Skipped: 0 Warnings: 0

mysql> CALL `import_from_json`(@`UUID`);
Query OK, 0 rows affected (0.00 sec)

mysql> SELECT
-> `executionDateTime`,
-> `A`,
-> `B`
-> FROM
-> `my_table`;
+---------------------+------+------+
| executionDateTime | A | B |
+---------------------+------+------+
| 2017-07-07 15:21:15 | 1 | 1 |
| 2017-07-07 15:21:15 | 2 | 2 |
| 2017-07-07 15:21:15 | 3 | 3 |
| 2017-07-07 15:21:15 | 4 | 4 |
+---------------------+------+------+
4 rows in set (0.01 sec)
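The stored procedure above prepares one INSERT per array element on the server. If a client-side script is acceptable, the same import can be sketched in a few lines of Python; the connector call in the comment assumes mysql-connector-python and is illustrative only:

```python
import json

def rows_from_json_array(text):
    """Turn a JSON array of flat objects into parameter tuples for a
    single INSERT ... VALUES (%s, %s, %s) executemany() call."""
    return [(item["executionDateTime"], item["A"], item["B"])
            for item in json.loads(text)]

# Hypothetical usage with mysql-connector-python (cursor not shown):
# with open('/path/to/file/loadsetProfile.json') as f:
#     cursor.executemany(
#         "INSERT INTO my_table (executionDateTime, A, B) VALUES (%s, %s, %s)",
#         rows_from_json_array(f.read()))
```

One batched executemany() call avoids both the temporary table and the per-item prepared statements.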

Using LOAD DATA to import JSON data and specify date data type on import

I understand that you are using MySQL 8.0.4 or higher, because you are using the JSON_TABLE() function.

One option is to use the STR_TO_DATE() function:

SELECT
    `test`.`buyer`,
    STR_TO_DATE(`test`.`date`, '%M %D %Y') `date`,
    `test`.`id`,
    `test`.`item_desc`
FROM
    `example_table`,
    JSON_TABLE(
        `json_data`,
        '$.sale_list[*]' COLUMNS(
            `buyer` VARCHAR(40) PATH '$.buyer',
            `date` VARCHAR(40) PATH '$.date',
            `id` VARCHAR(40) PATH '$.id',
            `item_desc` VARCHAR(40) PATH '$.item_desc'
        )
    ) `test`;

See dbfiddle.

UPDATE

With LOAD DATA it is possible to convert the date to the desired format at import time; however, the statement is somewhat more complicated, and the file to load must keep the expected structure. In addition, with the new date format, information such as the time will be lost:

LOAD DATA INFILE '/path/to/file/sample.json'
INTO TABLE `example_table` (@`json`)
SET `json_data` =
    JSON_REPLACE(
        @`json`,
        '$.sale_list[0].date',
        LEFT(
            STR_TO_DATE(
                JSON_UNQUOTE(
                    JSON_EXTRACT(@`json`, '$.sale_list[0].date')
                ),
                '%M %D %Y %H:%i:%s'
            ),
            10
        )
    );
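For comparison, parsing the same ordinal date format outside MySQL takes an extra step: Python's strptime(), for example, has no equivalent of the %D ordinal-day specifier, so the st/nd/rd/th suffix has to be stripped first. A minimal sketch:

```python
import re
from datetime import datetime

def parse_ordinal_date(text):
    """Parse dates like 'April 29th 2019 21:20:50'. strptime() has no
    ordinal-day directive, so strip the st/nd/rd/th suffix first."""
    cleaned = re.sub(r"(\d{1,2})(st|nd|rd|th)", r"\1", text)
    return datetime.strptime(cleaned, "%B %d %Y %H:%M:%S")
```

This keeps the time component, unlike the LEFT(..., 10) truncation above.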

How to import JSON values inside MySQL (10.2.36-MariaDB) table?

A solution that should work on 10.2.36-MariaDB (all constructions used are legal in this version):

CREATE TABLE test (ID INT, CUSTOMER_ID INT, CUSTOMER_NAME VARCHAR(255))
WITH RECURSIVE
cte1 AS ( SELECT LOAD_FILE('C:/ProgramData/MySQL/MySQL Server 8.0/Uploads/json.txt') jsondata ),
cte2 AS ( SELECT 1 level, CAST(jsondata AS CHAR) oneobject, jsondata
          FROM cte1
          UNION ALL
          SELECT level + 1,
                 TRIM(SUBSTRING(jsondata FROM 1 FOR 2 + LOCATE('}', jsondata))),
                 TRIM(SUBSTRING(jsondata FROM 1 + LOCATE('}', jsondata) FOR LENGTH(jsondata)))
          FROM cte2
          WHERE jsondata != '' )
SELECT oneobject->>"$.ID" ID,
       oneobject->>"$.CUSTOMER_ID" CUSTOMER_ID,
       oneobject->>"$.CUSTOMER_NAME" CUSTOMER_NAME
FROM cte2 WHERE level > 1;

Tested on MySQL 8.0.16 (I have no available MariaDB now):


The content of the json.txt file matches the one shown in the question (a misprint in an attribute name has been edited).


PS. Of course, the SELECT itself may be used to insert the data into an existing table.
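The recursive CTE splits the file textually on '}', which works here because the objects are flat; it would break on nested objects. If a client-side step is an option, the same split can be done with a proper parser. A sketch using Python's json.JSONDecoder.raw_decode, which reads one document at a time from a concatenated stream:

```python
import json

def split_concatenated_json(text):
    """Split a stream of concatenated (or whitespace-separated) JSON
    objects, the file shape the recursive CTE handles, into dicts.
    Unlike splitting on '}', this is safe for nested objects."""
    decoder = json.JSONDecoder()
    objects, idx = [], 0
    text = text.strip()
    while idx < len(text):
        obj, end = decoder.raw_decode(text, idx)
        objects.append(obj)
        # Skip any whitespace (spaces, newlines) between documents.
        while end < len(text) and text[end].isspace():
            end += 1
        idx = end
    return objects
```
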
