Are you using PostgreSQL on Heroku? Then you should know about Heroku Dataclips, a no-hassle, web-based tool you can use to access and export your data using SQL. And it even supports JSON export! Sadly, the JSON exported by Heroku is not as nicely formatted as you might wish for in some situations.

Instead of an array of objects whose keys match the columns you exported, Heroku gives you an array-of-arrays style representation that uses an array of values for every row.1
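To illustrate, here is a minimal sketch of that shape, assuming a hypothetical dataclip with two columns, id and name (a real export may contain additional metadata keys; only fields and values matter here):

    {
      "fields": ["id", "name"],
      "values": [
        [1, "Alice"],
        [2, "Bob"]
      ]
    }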

Technically, this is JSON, but it is rather unfriendly to read and not useful at all for many use cases.

But there is help - jq to the rescue!

Using a small jq script, you can combine the metadata in the JSON export with the data to create a more friendly representation. Here is the jq script to do that:

    # Zip an array of keys with the current row (an array of values)
    # and merge the resulting key/value pairs into one object.
    def make_object(keys):
      [keys, .] | transpose | map({(.[0]): .[1]}) | add;

    # Capture the column names, then build one object per row of values.
    [
      (.fields) as $fieldnames
      | .values[]
      | make_object($fieldnames)
    ]
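The trick is in make_object: it pairs the column names with the row values via transpose and merges the resulting one-entry objects with add. You can watch it work on a single hypothetical row:

    echo '[1, "Alice"]' | jq -c '[["id", "name"], .] | transpose | map({(.[0]): .[1]}) | add'
    # => {"id":1,"name":"Alice"}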

Save the script as convert-export-into-friendly-json.jq and invoke it:

    jq --from-file convert-export-into-friendly-json.jq -- your-exported-json-file.json

This converts the Heroku JSON file your-exported-json-file.json and sends the result to STDOUT, so you can redirect it into another file:

    jq --from-file convert-export-into-friendly-json.jq -- your-exported-json-file.json > friendly-formatted.json
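With the hypothetical two-column sample from above, friendly-formatted.json would then contain:

    [
      {
        "id": 1,
        "name": "Alice"
      },
      {
        "id": 2,
        "name": "Bob"
      }
    ]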

Additionally, jq offers further options, such as compact output via -c / --compact-output.
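For example, to get the whole result on a single line (handy for large exports or for piping into other tools):

    jq --compact-output --from-file convert-export-into-friendly-json.jq -- your-exported-json-file.json > friendly-formatted.json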

In case you need some sample data to use in a template or to fill a wireframe with meaningful (real) data, you might find this handy.

  1. Of course, Heroku most probably just wants to cut down the size of the export.