Free CSV to JSON Converter
Convert CSV, TSV, and PSV files to JSON instantly. Supports nested JSON, NDJSON, keyed objects, data type detection, and bidirectional JSON↔CSV conversion.
🔒 Parsed entirely in your browser — no data uploaded
CSV Input
Output Format
The selected column becomes the object key
Use dot notation — column names matching the path will be nested
Filter Columns
▾
No columns detected yet
JSON Output
Paste CSV text or upload a file above to see JSON output here.
Showing first 200 rows — download for full output
CSV parsing is performed entirely in your browser: all conversion logic runs locally via JavaScript, no data is uploaded to any server, and nothing leaves your device. Note that very large files may impact browser performance.
Frequently Asked Questions
CSV (Comma-Separated Values) is a plain-text tabular format where each row is a record and columns are separated by a delimiter (usually a comma). JSON (JavaScript Object Notation) is a hierarchical key-value format that supports nested structures, arrays, and typed values (numbers, booleans, null). CSV is compact and easy to open in spreadsheets like Excel or Google Sheets; JSON is richer and widely used in APIs, web applications, and data pipelines.
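To make the mapping concrete, here is a minimal sketch of how one CSV table becomes a JSON array of objects. The helper below is purely illustrative (it splits on commas and ignores quoting, which the real parser handles):

```javascript
// Hypothetical helper: naive CSV → JSON, header row becomes object keys.
// Ignores quoting and type detection; illustration only.
const csv = "name,age\nAda,36\nGrace,45";

function naiveCsvToJson(text) {
  const [headerLine, ...rows] = text.split("\n");
  const headers = headerLine.split(",");
  return rows.map((row) =>
    Object.fromEntries(row.split(",").map((cell, i) => [headers[i], cell]))
  );
}

console.log(JSON.stringify(naiveCsvToJson(csv)));
// [{"name":"Ada","age":"36"},{"name":"Grace","age":"45"}]
```

Each CSV row becomes one flat JSON object; nested structures only appear when features like dot-notation nesting are used.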
When data type detection is enabled, the converter inspects each cell value: strings that parse as finite numbers become JSON numbers (e.g. '123' → 123, '3.14' → 3.14); 'true' and 'false' (case-insensitive) become JSON booleans; empty strings and the literal string 'null' become JSON null. Everything else remains a string. Disable the toggle to keep all values as strings.
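The detection rules above can be sketched as a single coercion function. This is an assumed implementation that follows the stated rules, not the converter's actual source:

```javascript
// Sketch of the type-detection rules: null, boolean, number, else string.
function coerce(value) {
  if (value === "" || value.toLowerCase() === "null") return null; // empty / 'null' → JSON null
  if (/^(true|false)$/i.test(value)) return value.toLowerCase() === "true"; // booleans
  const n = Number(value);
  if (value.trim() !== "" && Number.isFinite(n)) return n; // finite numbers
  return value; // everything else stays a string
}

coerce("123");   // 123
coerce("3.14");  // 3.14
coerce("TRUE");  // true
coerce("null");  // null
coerce("hello"); // "hello"
```

The whitespace check before the number branch matters: `Number(" ")` is 0 in JavaScript, so a blank-but-non-empty cell would otherwise silently become a number.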
Yes. The parser fully implements RFC 4180: fields containing the delimiter, double quotes, or newlines must be enclosed in double quotes. For example, John,"Smith, Jr.",30 correctly parses as three fields. Literal double quotes inside a quoted field are escaped by doubling them: "He said ""hello""" becomes He said "hello". This works for any supported delimiter.
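The quoting behavior can be sketched as a small character-by-character field splitter. This is a simplified illustration of the RFC 4180 rules (single record, no embedded newlines), not the converter's code:

```javascript
// Minimal RFC 4180-style record splitter: handles quoted fields,
// delimiters inside quotes, and doubled quotes ("") as literal quotes.
function parseRecord(line, delimiter = ",") {
  const fields = [];
  let field = "";
  let inQuotes = false;
  for (let i = 0; i < line.length; i++) {
    const ch = line[i];
    if (inQuotes) {
      if (ch === '"') {
        if (line[i + 1] === '"') { field += '"'; i++; } // "" → literal quote
        else inQuotes = false;                          // closing quote
      } else field += ch;
    } else if (ch === '"') inQuotes = true;             // opening quote
    else if (ch === delimiter) { fields.push(field); field = ""; }
    else field += ch;
  }
  fields.push(field);
  return fields;
}

parseRecord('John,"Smith, Jr.",30'); // ["John", "Smith, Jr.", "30"]
parseRecord('"He said ""hello"""');  // ['He said "hello"']
```

Passing a different `delimiter` (e.g. `"\t"` for TSV or `"|"` for PSV) applies the same quoting rules to the other supported formats.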
NDJSON (Newline-Delimited JSON) outputs one JSON object per line with no enclosing array. It is ideal for streaming large datasets, log ingestion pipelines (Elasticsearch, Splunk, Logstash), and tools that process records line by line, such as jq, Python's json.loads in a loop, or AWS Kinesis. Standard JSON Array output is better for REST APIs, JavaScript applications, and small to medium datasets.
The converter uses chunked processing: rows are parsed in batches using setTimeout to yield to the browser's event loop and avoid freezing the UI. A progress bar shows parsing status. Only the first 200 rows are rendered in the syntax-highlighted panel; the full dataset is included when you download the .json file. For very large files (50MB+), consider splitting them first, as browser memory limits may apply.
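The batching approach can be sketched as follows. This is an assumed shape of the technique described above (function and parameter names are illustrative, not the tool's source):

```javascript
// Sketch of chunked parsing: process rows in fixed-size batches and
// yield via setTimeout(…, 0) so the browser can repaint between batches.
function parseInChunks(lines, parseLine, onProgress, onDone, batchSize = 1000) {
  const results = [];
  let index = 0;
  function step() {
    const end = Math.min(index + batchSize, lines.length);
    for (; index < end; index++) results.push(parseLine(lines[index]));
    onProgress(index / lines.length);       // drives the progress bar
    if (index < lines.length) setTimeout(step, 0); // yield to the event loop
    else onDone(results);
  }
  step();
}
```

Each setTimeout call returns control to the browser, which is what keeps the progress bar animating and the page responsive while a large file is parsed.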