Load JSON Data with LOAD DATA
Data from JSON files can be loaded using the LOAD DATA command.
JSON LOAD DATA
Syntax
LOAD DATA [LOCAL] INFILE 'file_name'
    [REPLACE | SKIP { CONSTRAINT | DUPLICATE KEY } ERRORS]
    INTO TABLE tbl_name
    FORMAT JSON
    subvalue_mapping
    [SET col_name = expr,...]
    [WHERE expr,...]
    [MAX_ERRORS number]
    [ERRORS HANDLE string]

subvalue_mapping:
    ( {col_name | @variable_name} <- subvalue_path [DEFAULT literal_expr], ...)

subvalue_path:
    {% | [%::]ident [::ident ...]}
Semantics
Error Logging and Error Handling are discussed in the LOAD DATA page.
Extract specified subvalues from each JSON value in file_name. Assign them to specified columns of a new row in tbl_name, or to variables used for a column assignment in a SET clause. If a specified subvalue cannot be found in an input JSON value, assign the literal from its DEFAULT clause instead. Discard rows that do not match the WHERE clause.
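As a minimal sketch of these clauses working together (the table, file name, and JSON key names below are hypothetical), the following statement maps a top-level id subvalue and a nested name subvalue into columns, supplies a DEFAULT for missing names, routes one subvalue through a variable for a SET assignment, and filters rows with WHERE:

CREATE TABLE people(id BIGINT, name VARCHAR(64), city VARCHAR(64));

LOAD DATA LOCAL INFILE 'people.json'
    INTO TABLE people
    FORMAT JSON
    (
        id <- %::id,                               -- top-level subvalue
        name <- %::profile::name DEFAULT 'unknown', -- nested subvalue with fallback
        @city <- %::profile::city DEFAULT NULL      -- captured into a variable
    )
    SET city = UPPER(@city)   -- column assignment computed from the variable
    WHERE id IS NOT NULL;     -- discard rows without an id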
To specify the compression type of an input file, use the COMPRESSION clause.
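As an illustrative sketch (assuming the COMPRESSION clause follows the INFILE clause and accepts a GZIP argument, as in CSV LOAD DATA; the file and table names are placeholders), this forces decompression for a file whose name carries no compression extension:

LOAD DATA INFILE 'events_json_dump'  -- no .gz suffix, so state the compression explicitly
    COMPRESSION GZIP
    INTO TABLE events
    FORMAT JSON
    (id <- %::id);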
The file named by file_name must consist of concatenated UTF-8 encoded JSON values, optionally separated by whitespace.
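For example, a file with the following contents is valid input: three JSON values concatenated, separated by a newline and a space respectively (the keys shown are arbitrary):

{"id": 1, "name": "alice"}
{"id": 2, "name": "bob"} {"id": 3, "name": "carol"}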
Non-standard JSON values like NaN, Infinity, and -Infinity must not occur in file_name.
If file_name ends in .gz or .lz4, it will be decompressed.
JSON LOAD DATA supports a subset of the error recovery options allowed by CSV LOAD DATA.
Like CSV LOAD DATA, JSON LOAD DATA allows you to use globbing to load data from multiple files.
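For instance (a hypothetical sketch; the path, table, and key names are placeholders), a wildcard in the file name loads every matching file:

LOAD DATA INFILE '/data/logs/day-*.json'  -- glob pattern: loads all matching files
    INTO TABLE logs
    FORMAT JSON
    (ts <- %::ts, message <- %::message);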
Writing to multiple databases in a transaction is not supported.
LOAD DATA from an AWS S3 Source
JSON files that are stored in an AWS S3 bucket can be loaded via a LOAD DATA query without a pipeline.
LOAD DATA S3 '<bucket name>'
    CONFIG '{"region" : "<region_name>"}'
    CREDENTIALS '{"aws_access_key_id" : "<key_id>",
                  "aws_secret_access_key": "<access_key>"}'
    INTO TABLE <table_name>;
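As a hypothetical, filled-in variant (bucket, region, table, and key names are placeholders; FORMAT JSON and a subvalue mapping are added here to match the syntax above):

LOAD DATA S3 'my-bucket/events.json'
    CONFIG '{"region" : "us-east-1"}'
    CREDENTIALS '{"aws_access_key_id" : "<key_id>",
                  "aws_secret_access_key": "<access_key>"}'
    INTO TABLE events
    FORMAT JSON
    (id <- %::id,  -- map one subvalue to a column
     payload <- %); -- % maps the entire JSON value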
For more information on loading data from an AWS S3 bucket, refer to the LOAD DATA documentation.