Load CSV Files with LOAD DATA

Data from CSV files can be loaded using the LOAD DATA command. For more information on the LOAD DATA command, refer to LOAD DATA. To see a few examples of using LOAD DATA with CSV files, refer to Load CSV Files Examples.

CSV LOAD DATA

Syntax

LOAD DATA [LOCAL] INFILE '<file_name>'
[REPLACE | IGNORE | SKIP { ALL | CONSTRAINT | DUPLICATE KEY | PARSER } ERRORS]
INTO TABLE <table_name>
[CHARACTER SET <character_set_name>]
[{FIELDS | COLUMNS}
[TERMINATED BY '<string>']
[[OPTIONALLY] ENCLOSED BY '<char>']
[ESCAPED BY '<char>']
]
[LINES
[STARTING BY '<string>']
[TERMINATED BY '<string>']
]
[TRAILING NULLCOLS]
[NULL DEFINED BY <string> [OPTIONALLY ENCLOSED]]
[IGNORE <number> LINES]
[ ({<column_name> | @<variable_name>}, ...) ]
[SET <column_name> = <expression>,...]
[WHERE <expression>,...]
[MAX_ERRORS <number>]
[ERRORS HANDLE <string>]
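
For illustration, a minimal statement that loads a comma-delimited file with a header row might look like the following sketch. The file path and table name are placeholders, not values from your deployment.

-- Hypothetical file path and table; adjust the clauses to match the CSV layout.
LOAD DATA INFILE '/tmp/orders.csv'
INTO TABLE orders
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'  -- comma-separated, optionally quoted fields
LINES TERMINATED BY '\n'
IGNORE 1 LINES;  -- skip the header row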

Remarks

  • Error Logging and Error Handling are discussed on the LOAD DATA page.

  • To specify the compression type of an input file, use the COMPRESSION clause. See LOAD DATA for more information.

  • If a CSV file appears to have the incorrect number of fields in any line, you can use the SKIP PARSER ERRORS option to skip that line. LOAD DATA reports a warning for every line that is skipped. See the examples after this list.

    Important

    Lines in a CSV file may appear to have the wrong number of fields if the FIELDS TERMINATED BY, FIELDS ENCLOSED BY, or ESCAPED BY clauses are incorrectly configured. If LOAD DATA incorrectly finds the start of the next line in a CSV after a parser error, it may parse all the subsequent lines incorrectly. For these reasons, investigate the CSV input and configuration settings mentioned above before using SKIP PARSER ERRORS.

  • The SKIP ALL ERRORS option is inclusive of the SKIP PARSER ERRORS, SKIP DUPLICATE KEY ERRORS, and SKIP CONSTRAINT ERRORS options; that is, specifying SKIP ALL ERRORS in a LOAD DATA query applies the behavior of the other three options.

  • The TERMINATED BY clause allows you to define field, column, and line delimiters so that the input data is interpreted and read correctly. For example, use the FIELDS TERMINATED BY ',' clause to load a CSV file where the fields are delimited by commas. Additionally, use the LINES TERMINATED BY '\r\n' clause if the lines in the CSV file are terminated by carriage return/newline pairs. See the examples after this list.

  • The ENCLOSED BY clause, or its OPTIONALLY ENCLOSED BY variant, allows you to specify a character that encloses the field values. For example, use the ENCLOSED BY '"' clause to load a CSV file where the fields are enclosed within double quotation marks. Note that LOAD DATA will still load a field value even if it is not enclosed.

  • The ESCAPED BY clause allows you to specify the escape character. For example, if the input data contains special characters, you may need to escape those characters to avoid misinterpretation. You may also need to redefine the default escape character to load a data set that contains it.

  • Any character can be designated as the escape character. If the FIELDS ESCAPED BY clause is empty, no escape sequence interpretation is performed.

  • The STARTING BY clause allows you to load only those lines of data that include a specified string (or prefix). While loading data, the STARTING BY clause skips the specified prefix and anything before it. It also skips the lines that do not contain the specified prefix.

    If no FIELDS or LINES clause is specified, then SingleStore uses the following defaults:

    FIELDS TERMINATED BY '\t'
    ENCLOSED BY ''
    ESCAPED BY '\\'
    LINES TERMINATED BY '\n'
    STARTING BY ''
  • The TRAILING NULLCOLS clause allows the input file to contain rows having fewer than the number of columns in the table. These missing fields must be trailing columns in the row; they are inserted as NULL values in the table. See Load CSV Files Examples.

  • The NULL DEFINED BY clause inserts NULL field values in the table for fields in the input file that have the value specified by <string>. The OPTIONALLY ENCLOSED option ensures that a quoted field is also treated as NULL rather than as an empty string. Refer to Load CSV Files Examples for more information, and see the examples after this list.

    Note

    If the string value 'NULL' is passed to a number-type column (for example, DECIMAL), it is parsed as a string and converted to 0. To insert NULL values instead, use the NULL DEFINED BY 'NULL' OPTIONALLY ENCLOSED clause. You can use the ENCLOSED BY clause in conjunction with it to specify the string that encloses the NULL values.

  • The IGNORE <number> LINES clause ignores the specified lines from the beginning of the input file. For example, use IGNORE 1 LINES to skip the header line that contains the column names.
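
The following examples sketch how the clauses above are typically combined; all file paths and table names are placeholders. To load a file whose fields are comma-separated and optionally double-quoted, and whose lines end in carriage return/newline pairs:

-- Placeholder path and table; the delimiters must match the actual CSV layout.
LOAD DATA INFILE '/tmp/products.csv'
INTO TABLE products
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"' ESCAPED BY '\\'
LINES TERMINATED BY '\r\n';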
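
To skip lines that cannot be parsed instead of failing the load, and to record the skipped lines under a handle for later review, the error options can be added along these lines. The handle name and error limit are placeholders, and whether you combine MAX_ERRORS with SKIP depends on your error-handling needs.

-- SKIP PARSER ERRORS discards malformed lines and logs a warning for each one.
LOAD DATA INFILE '/tmp/products.csv'
SKIP PARSER ERRORS
INTO TABLE products
FIELDS TERMINATED BY ','
MAX_ERRORS 1000
ERRORS HANDLE 'products_load';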
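
To treat the literal value NULL as a SQL NULL even when it is quoted, accept rows with missing trailing columns, and skip a header line, the remaining clauses can be combined as in this sketch (again with placeholder names):

-- Missing trailing fields become NULL; quoted or unquoted 'NULL' is inserted as NULL.
LOAD DATA INFILE '/tmp/ratings.csv'
INTO TABLE ratings
FIELDS TERMINATED BY ',' OPTIONALLY ENCLOSED BY '"'
LINES TERMINATED BY '\n'
TRAILING NULLCOLS
NULL DEFINED BY 'NULL' OPTIONALLY ENCLOSED
IGNORE 1 LINES;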

LOAD DATA from an AWS S3 Source

CSV files that are stored in an AWS S3 bucket can be loaded via a LOAD DATA query without a pipeline.

LOAD DATA S3 '<bucket name>'
CONFIG '{"region" : "<region_name>"}'
CREDENTIALS '{"aws_access_key_id" : "<key_id>",
"aws_secret_access_key": "<access_key>"}'
INTO TABLE <table_name>;
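
For example, with placeholder values for the bucket, region, credentials, and table, and assuming the CSV formatting clauses described earlier are combined with the S3 source in the same way as with a local file, a statement might look like this sketch:

-- Bucket, region, credentials, and table are placeholders.
LOAD DATA S3 'my-bucket'
CONFIG '{"region" : "us-east-1"}'
CREDENTIALS '{"aws_access_key_id" : "<key_id>",
"aws_secret_access_key": "<access_key>"}'
INTO TABLE orders
FIELDS TERMINATED BY ','
IGNORE 1 LINES;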

For more information on loading data from an AWS S3 bucket, refer to LOAD DATA.
