For longer than I’d care to admit I operated on the false belief that jq, the command-line JSON processor, couldn’t handle JSON blobs larger than a few thousand bytes.

When I ran jq on these larger files it would just hang, and if I cancelled the operation I would get an error about malformed JSON.

I searched for other people complaining about this extreme limitation and found plenty of hits on Google, but those users were discussing limits in the terabytes, not kilobytes. And even then the complaints were about slow runtimes, not outright failure.

I learned this week that the problem I was hitting was actually a limitation of the pipe operator (the | character). It seems that on macOS the maximum pipe buffer size is 64 kB.
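You can get a rough feel for this limit yourself. The sketch below (assuming Linux with GNU dd; on macOS, BSD dd reports its count on SIGINFO rather than SIGUSR1, and the file paths are just illustrative) fills a pipe whose reader never consumes anything, then asks dd how many bytes it managed to write before blocking:

```shell
# Rough probe of the kernel pipe buffer (assumes Linux + GNU dd).
mkfifo /tmp/probe.fifo
sleep 10 < /tmp/probe.fifo &           # open the read end, but never read
dd if=/dev/zero of=/tmp/probe.fifo bs=1024 2> /tmp/probe.stats &
ddpid=$!
sleep 1                                # let dd fill the pipe and block
kill -USR1 "$ddpid"                    # GNU dd prints a transfer summary
sleep 1
kill "$ddpid" 2> /dev/null
cat /tmp/probe.stats                   # e.g. "65536 bytes ... copied"
```

On a stock Linux box this typically reports 65536 bytes (64 kB), matching the default pipe capacity documented in pipe(7).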

I was hitting this issue because I became accustomed to running jq like this:

cat myfile.json | jq '.'

My queries started returning right away once I switched to running jq on large files like this:

jq . myfile.json

In a more complicated scenario, such as pulling data from a remote endpoint, you could try running:

jq . <(curl https://fake-json-api.mock.beeceptor.com/users) 

Rather than the slightly more intuitive:

curl https://fake-json-api.mock.beeceptor.com/users | jq .
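What process substitution does here is expand <(...) into a file-descriptor path, so jq receives a path argument rather than reading stdin. You can see the expansion directly with a quick illustration (nothing jq-specific; requires bash or zsh, and the exact path varies by OS):

```shell
# <(...) expands to a path like /dev/fd/63 before the command runs.
bash -c 'echo <(printf "{}")'
```

On Linux this prints something under /dev/fd or /proc, which is why jq can treat it like a named file.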