UtilityKit

500+ fast, free tools. Most run in your browser only; Image & PDF tools upload files to the backend when you run them.

JSON to SQL Converter

Convert a JSON array of objects into INSERT statements with type-inferred CREATE TABLE. PostgreSQL, MySQL, SQLite.

About JSON to SQL Converter

JSON to SQL converts a JSON array of objects directly into INSERT statements suitable for PostgreSQL, MySQL, or SQLite. Each object becomes a row, every distinct key across all rows becomes a column, and column types are inferred from JavaScript types: number, boolean, string, null, or nested object/array (serialised to JSON and stored as TEXT). Identifier quoting uses backticks for MySQL and double quotes for PostgreSQL/SQLite. Boolean literals follow each dialect's convention (TRUE/FALSE for PostgreSQL, 0/1 for MySQL). The tool emits either a single bulk INSERT statement for fast imports or one INSERT per row when fine-grained control matters. Optional CREATE TABLE generation produces a complete schema definition with type-aware columns so the script is self-contained. All conversion runs in your browser, so JSON containing tokens or customer records stays local.
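The widening behaviour of the type-inference step can be sketched as follows. This is a minimal illustration, assuming hypothetical helper names (`inferSqlType` is not the tool's actual internal API):

```javascript
// Sketch of per-column SQL type inference with widening.
// Hypothetical helper, not the tool's actual implementation.
function inferSqlType(values) {
  let type = null; // widened as values are observed
  for (const v of values) {
    if (v === null || v === undefined) continue; // nulls don't constrain the type
    let t;
    if (typeof v === "boolean") t = "BOOLEAN";
    else if (typeof v === "number") t = Number.isInteger(v) ? "INTEGER" : "NUMERIC";
    else if (typeof v === "object") t = "TEXT"; // objects/arrays serialised to JSON text
    else t = "VARCHAR"; // strings
    if (type === null) {
      type = t;
    } else if (type !== t) {
      // Mixed integer/decimal widens to NUMERIC; any other mix falls back to TEXT.
      const nums = new Set(["INTEGER", "NUMERIC"]);
      type = nums.has(type) && nums.has(t) ? "NUMERIC" : "TEXT";
    }
  }
  return type ?? "TEXT"; // an all-null column falls back to TEXT
}
```

A column holding `[1, 2.5]` widens to NUMERIC, while `[1, "a"]` falls back to TEXT, matching the mixed-type behaviour described in the FAQ below.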

Why use JSON to SQL Converter

  • Three Dialects, Correct Output: PostgreSQL, MySQL, and SQLite each get appropriate identifier quoting (backticks vs double quotes) and boolean literal syntax (TRUE/FALSE vs 0/1) without manual fix-up.
  • Type Inference From JSON Types: number → INTEGER or NUMERIC, boolean → BOOLEAN, string → VARCHAR/TEXT sized to fit, object/array → TEXT with JSON content. The CREATE TABLE matches the data exactly.
  • Bulk INSERT or Per-Row Statements: Choose a fast multi-row INSERT for normal imports, or single-row mode when you need to filter, debug, or roll back individual inserts.
  • Stable Column Order From All Rows: Columns are derived from the union of keys across all objects, in the order they first appear — no missing columns when later rows add new fields.
  • JSON Object Storage: Nested objects and arrays in JSON values are serialised back to JSON strings and stored as TEXT, so deep structures are preserved for later parsing.
  • Browser-Only: API responses, customer data, and exports with secrets stay in your tab. Conversion runs in JavaScript with no server round-trip.
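A dialect table along these lines could drive the quoting and boolean rules. The names are illustrative, and the SQLite 0/1 choice is an assumption (SQLite has no native BOOLEAN type), not a statement about the tool's actual code:

```javascript
// Sketch of dialect-aware identifier quoting and boolean literals.
// Embedded quote characters are escaped by doubling, per each dialect's rules.
const DIALECTS = {
  mysql:      { quote: id => "`" + id.replace(/`/g, "``") + "`",
                bool:  b  => (b ? "1" : "0") },
  postgresql: { quote: id => '"' + id.replace(/"/g, '""') + '"',
                bool:  b  => (b ? "TRUE" : "FALSE") },
  sqlite:     { quote: id => '"' + id.replace(/"/g, '""') + '"',
                bool:  b  => (b ? "1" : "0") }, // assumed: SQLite has no BOOLEAN type
};
```

With a table like this, generating for a different dialect is a lookup rather than a rewrite, which is why switching the target never requires manual fix-up.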

How to use JSON to SQL Converter

  1. Paste a JSON array of objects (e.g. [{"id":1,"name":"Alice"}, {"id":2,"name":"Bob"}]) into the input panel.
  2. Set the target table name — any identifier works; non-alphanumerics are sanitised.
  3. Choose the dialect: PostgreSQL, MySQL, or SQLite. Identifier quoting and boolean literals adapt automatically.
  4. Pick the mode: Bulk INSERT for a single fast statement, or one INSERT per row for finer rollback granularity.
  5. Toggle Include CREATE TABLE to prepend a schema definition with column types inferred from the JSON values.
  6. Click Generate SQL and copy or download the result. Paste directly into psql, mysql, sqlite3, or any database client.
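The sanitisation in step 2 can be sketched like this. The exact rule the tool applies may differ; replacing runs of non-alphanumerics with underscores is one plausible behaviour:

```javascript
// Plausible table-name sanitisation (assumed behaviour, not the tool's exact rule):
// collapse runs of characters outside [A-Za-z0-9_] into underscores.
function sanitizeTableName(name) {
  const cleaned = name
    .trim()
    .replace(/[^A-Za-z0-9_]+/g, "_")
    .replace(/^_+|_+$/g, "");
  return cleaned || "my_table"; // fall back to a default when nothing survives
}
```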

When to use JSON to SQL Converter

  • Loading API response data into PostgreSQL for ad-hoc analytics: copy the JSON, paste, generate INSERTs, run.
  • Bootstrapping a SQLite test database from a fixtures.json file used by application tests.
  • Importing a JSON export from MongoDB or Firestore into a relational MySQL table.
  • Generating seed data for a Postgres dev environment from a JSON file in version control.
  • Migrating data between two services where the source emits JSON and the destination only accepts SQL.
  • Producing a database snapshot script from a JSON backup for archiving in Git or S3.

Examples

PostgreSQL with CREATE TABLE

Input: [{"id":1,"name":"Alice","active":true},{"id":2,"name":"Bob","active":false}]

Output: CREATE TABLE "my_table" ( "id" INTEGER, "name" VARCHAR(50), "active" BOOLEAN ); INSERT INTO "my_table" ("id", "name", "active") VALUES (1, 'Alice', TRUE), (2, 'Bob', FALSE);

MySQL with nested object as JSON

Input: [{"id":1,"meta":{"role":"admin"}},{"id":2,"meta":{"role":"user"}}]

Output: INSERT INTO `my_table` (`id`, `meta`) VALUES (1, '{"role":"admin"}'), (2, '{"role":"user"}');

SQLite single-row mode

Input: [{"k":"a","v":1},{"k":"b","v":2}]

Output: INSERT INTO "my_table" ("k", "v") VALUES ('a', 1); INSERT INTO "my_table" ("k", "v") VALUES ('b', 2);

Tips

  • Always run CREATE TABLE separately first, verify the inferred types match the application schema, then run the INSERTs — type mismatches caught mid-import are painful to fix.
  • For nested objects/arrays, the column type defaults to TEXT containing the JSON-stringified value; cast to JSONB in PostgreSQL or use MySQL's JSON_EXTRACT to query the structure after import.
  • Bulk INSERT is much faster than single-row mode on large arrays because of reduced round-trip and parsing overhead, but a single bad row aborts the entire statement.
  • JSON null values become SQL NULL, distinct from empty strings — make sure your CREATE TABLE column constraints allow NULL, or pre-process the JSON to replace nulls with defaults before converting.
  • MySQL boolean output is 0/1 (not TRUE/FALSE) for maximum cross-version compatibility — works on MySQL 5.x, 8.x, and MariaDB without surprises.
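The NULL and boolean rules in the tips above come together when each JSON value is rendered as a SQL literal. A minimal sketch, with an illustrative helper name and the standard quote-doubling escape for strings:

```javascript
// Sketch of JSON-value -> SQL-literal rendering (hypothetical helper).
function toSqlLiteral(v, dialect) {
  if (v === null || v === undefined) return "NULL"; // JSON null -> SQL NULL
  if (typeof v === "boolean")
    return dialect === "postgresql" ? (v ? "TRUE" : "FALSE") : (v ? "1" : "0");
  if (typeof v === "number") return String(v);
  // Strings, objects, and arrays become quoted strings;
  // single quotes are escaped by doubling, per SQL convention.
  const s = typeof v === "object" ? JSON.stringify(v) : String(v);
  return "'" + s.replace(/'/g, "''") + "'";
}
```

Note the `''` doubling: a value like `O'Brien` must be emitted as `'O''Brien'` or the generated script breaks on the first apostrophe in the data.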

Frequently Asked Questions

Is my JSON uploaded anywhere?
No. JSON is parsed and converted to SQL entirely in your browser using JavaScript. API responses with tokens, customer records, and other sensitive data never leave your tab.
What if my JSON objects have different keys?
All distinct keys across all objects become columns, in first-appearance order. Objects missing a key get NULL in that column, so schemas with optional fields are handled correctly.
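That union-of-keys pass is a short loop. A sketch under the same first-appearance ordering described above (the helper name is illustrative):

```javascript
// Collect the union of keys across all rows, in first-appearance order.
function columnOrder(rows) {
  const cols = [];
  const seen = new Set();
  for (const row of rows) {
    for (const key of Object.keys(row)) {
      if (!seen.has(key)) {
        seen.add(key);
        cols.push(key); // first appearance fixes the column position
      }
    }
  }
  return cols;
}
```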
How are nested objects and arrays stored?
They are JSON-stringified and stored as TEXT (or VARCHAR if short enough). To query the structure later, use PostgreSQL JSONB casts or MySQL JSON_EXTRACT — the round-trip is lossless.
Can it handle JSON with thousands of rows?
Yes, up to ~50K rows comfortably. Beyond that, browser memory and string-concat speed become limiting — split into chunks or use a server-side import tool.
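Splitting a very large array into chunks before pasting is straightforward; a sketch of the kind of pre-processing you could run yourself (not a feature of the tool):

```javascript
// Split a large row array into fixed-size chunks, each suitable
// for a separate bulk INSERT conversion.
function chunk(rows, size) {
  const out = [];
  for (let i = 0; i < rows.length; i += size) {
    out.push(rows.slice(i, i + size));
  }
  return out;
}
```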
What about duplicate keys within a single object?
JSON.parse keeps the last value when keys are duplicated, as the ECMAScript specification requires. Each input object effectively has unique keys after parsing, so the SQL output has no ambiguity.
Why are MySQL booleans 0/1 instead of TRUE/FALSE?
MySQL's BOOLEAN is a synonym for TINYINT(1), and 0/1 literals work universally across MySQL versions and storage engines. TRUE/FALSE are also accepted in modern versions, but 0/1 is the safer default across MySQL and MariaDB.
How do I get DATE or TIMESTAMP types?
The tool emits date-like strings as VARCHAR/TEXT to preserve the exact source value. Cast to DATE inside the database with column::DATE in PostgreSQL or CAST(col AS DATE) in MySQL after import.
What if a column has mixed numbers and strings?
Type inference falls back to VARCHAR or TEXT — the most permissive type that holds all observed values. No data loss; you can re-type the column after import if you have a stricter schema in mind.

Glossary

JSON array of objects
The standard tabular JSON shape: an array where every element is an object with the same (or overlapping) keys. Each object becomes one row in SQL output.
Type inference
Determining each column's SQL type from the JavaScript types of the values in the input. Numbers become INTEGER/NUMERIC, booleans become BOOLEAN, strings become VARCHAR/TEXT.
Identifier quoting
Wrapping table and column names in delimiter characters so SQL parsers treat them as identifiers. MySQL uses backticks (`name`); PostgreSQL and SQLite use double quotes ("name").
Bulk INSERT
A single INSERT statement with multiple VALUES rows. Faster than one INSERT per row because of reduced parsing and network overhead, but atomic — one bad row aborts the entire statement.
Boolean literal
How true/false values are written in SQL. PostgreSQL accepts TRUE/FALSE keywords; MySQL prefers 0/1 integers because BOOLEAN is internally TINYINT(1).
NULL
The SQL absence-of-value marker, distinct from an empty string or zero. JSON null values are translated to SQL NULL, allowing IS NULL queries downstream.