✨ Use gron to grep JSON
Written by Brie Carranza

One of my absolute favorite tools is gron: a CLI for making JSON greppable. I’ve found myself excitedly discussing gron often enough that I wanted to take a moment to share what’s so cool about it here. From the README:
gron transforms JSON into discrete assignments to make it easier to grep for what you want and see the absolute ‘path’ to it. It eases the exploration of APIs that return large blobs of JSON.
A quick demonstration of how gron “flattens” JSON:
# cat cat.json
{"key1": "value1", "key2": "value2"}
# gron cat.json
json = {};
json.key1 = "value1";
json.key2 = "value2";
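Each leaf ends up on its own line with its full path, so a plain grep pulls out exactly the key you care about:
# gron cat.json | grep key2
json.key2 = "value2";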
💡 Those examples alone might be enough for you to see where gron fits into your workflow. You can install gron or keep reading for some of the cool features and an asciinema demo of how to use gron to interact directly with GitLab’s REST API!
✨ What you can do with gron
There are some really nice gron features besides the ability to flatten JSON:
- -s — treat each line of input as a separate JSON object
  - 👉 This is useful for parsing things like log files where each line is a JSON object.
- --ungron — after you are done with grep, turn the data back into JSON!
- make HTTP requests with gron
I’ll share an example to illustrate the utility of each of these features.
🌊 -s, --stream
-s, --stream Treat each line of input as a separate JSON object
Let’s say you have a service that uses JSON for logging where each line in the log file is a JSON object. If you do not pass -s, gron will only look at the first line. Use -s to parse each line in the log file. An example of a service that logs in this manner is Sidekiq. Let’s use a slightly edited version of the example Sidekiq log entries to see how this works.
# file sidekiq.json
sidekiq.json: JSON data
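For context, the file is newline-delimited JSON: one log entry per line. Reconstructed from the gron output below (so the key order and the exact sample lines are my approximation), the first few entries look roughly like this:
# head -n 3 sidekiq.json
{"ts":"2019-09-01T22:34:59.778Z","pid":90069,"tid":"104v8ph","lvl":"INFO","msg":"Running in ruby 2.5.1p57 [x86_64-darwin17]"}
{"ts":"2019-09-01T22:34:59.778Z","pid":90069,"tid":"104v8ph","lvl":"INFO","msg":"See LICENSE and the LGPL-3.0 for licensing details."}
{"ts":"2019-09-01T22:34:59.778Z","pid":90069,"tid":"104v8ph","lvl":"INFO","msg":"Upgrade to Sidekiq Pro"}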
# gron sidekiq.json
json = {};
json.lvl = "INFO";
json.msg = "Running in ruby 2.5.1p57 [x86_64-darwin17]";
json.pid = 90069;
json.tid = "104v8ph";
json.ts = "2019-09-01T22:34:59.778Z";
Without -s, we just get the first JSON object that is detected. Passing -s parses every line:
# gron -s sidekiq.json
json = [];
json[0] = {};
json[0].lvl = "INFO";
json[0].msg = "Running in ruby 2.5.1p57 [x86_64-darwin17]";
json[0].pid = 90069;
json[0].tid = "104v8ph";
json[0].ts = "2019-09-01T22:34:59.778Z";
json[1] = {};
json[1].lvl = "INFO";
json[1].msg = "See LICENSE and the LGPL-3.0 for licensing details.";
json[1].pid = 90069;
json[1].tid = "104v8ph";
json[1].ts = "2019-09-01T22:34:59.778Z";
json[2] = {};
json[2].lvl = "INFO";
json[2].msg = "Upgrade to Sidekiq Pro";
json[2].pid = 90069;
json[2].tid = "104v8ph";
json[2].ts = "2019-09-01T22:34:59.778Z";
...
Let’s say you want to get the msg field for every JSON object in the log file:
# gron -s sidekiq.json | grep msg | cut -d'=' -f2-
"Running in ruby 2.5.1p57 [x86_64-darwin17]";
"See LICENSE and the LGPL-3.0 for licensing details.";
"Upgrade to Sidekiq Pro";
⏪ -u, --ungron
-u, --ungron Reverse the operation (turn assignments back into JSON)
Let’s say you have a big JSON object. You want a subset of the data inside and you want it to remain in JSON format. You could edit the JSON by hand but that’s tedious and error-prone. I think --ungron is perfect for these situations.
- Send the JSON to gron
- Use grep to extract the bits you want
- Send what’s left to gron with --ungron to get JSON output again
# cat cat.json
{"name": "plop", "type": "tabby"}
# gron cat.json
json = {};
json.name = "plop";
json.type = "tabby";
# gron cat.json | grep name
json.name = "plop";
# gron cat.json | grep name | gron --ungron
{
"name": "plop"
}
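These pieces compose nicely. To pull every msg out of the Sidekiq log from earlier while keeping the result as JSON, something like this should work (a sketch; the remaining entries are elided just like above):
# gron -s sidekiq.json | grep msg | gron --ungron
[
  {
    "msg": "Running in ruby 2.5.1p57 [x86_64-darwin17]"
  },
  {
    "msg": "See LICENSE and the LGPL-3.0 for licensing details."
  },
  {
    "msg": "Upgrade to Sidekiq Pro"
  },
  ...
]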
The examples I have included are simplified to demonstrate how gron works. Consider the potential complexity of API responses and the length of the log files that support engineers (and the like) must analyze. In those situations, gron can be super helpful for finding the necessary information quickly and effectively. The learning curve is fairly gentle, which makes gron well worth adding to your toolbox.
Inspired by this blog post by Jimmy Ray, I added alias norg="gron --ungron" to my public dotfiles.
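With that alias loaded in your shell, the last example reads a little more naturally:
# gron cat.json | grep name | norg
{
  "name": "plop"
}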
🥌 Make HTTP requests with gron
gron can also fetch URLs directly, making the HTTP request for you. That means you can retrieve the JSON and convert it into a greppable format all in one go.
Instead of doing this:
# curl --silent -L https://gitlab.com/api/v4/projects/44051429 \
| gron | grep web_url
json.namespace.web_url = "https://gitlab.com/brie";
json.web_url = "https://gitlab.com/brie/pastebin-bisque";
You can do this:
# gron https://gitlab.com/api/v4/projects/44051429 | grep web_url
json.namespace.web_url = "https://gitlab.com/brie";
json.web_url = "https://gitlab.com/brie/pastebin-bisque";
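And because the features compose, you can go one step further and turn the matching lines back into JSON in the same pipeline (same URL and fields as above):
# gron https://gitlab.com/api/v4/projects/44051429 | grep web_url | gron --ungron
{
  "namespace": {
    "web_url": "https://gitlab.com/brie"
  },
  "web_url": "https://gitlab.com/brie/pastebin-bisque"
}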
📹 Demo Time!
🐾 Next Steps
If you are interested in more on this topic, I would recommend:
- tomnomnom/gron on GitHub
- Advanced Usage - a walkthrough of the examples from the jq tutorial with gron
- fastgron for using gron with big (hundreds of MB or larger) JSON files
- Grepping through API payloads with Gron
- JSON command-line toolbox (jq, gron, jc, etc)
- my log parsing bookmarks
- this StackExchange answer showing how to emulate gron with jq using to_entries (a rough sketch of the idea follows this list)
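If you don’t have gron handy, jq can approximate the flattening. This sketch uses paths and getpath rather than the linked answer’s to_entries approach, skips null and false leaves, and renders array indices as .0 instead of [0], but the idea is the same:
# jq -r 'paths(scalars) as $p | "json.\($p | map(tostring) | join(".")) = \(getpath($p) | tojson);"' cat.json
json.name = "plop";
json.type = "tabby";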
💖 From the Heart
I do a lot of troubleshooting at work and it’s really neat how often people say “wow, that’s so cool!” when I show them how gron works. ✨ I hope you also get a bit of that magic, dear reader. I’d love to hear about it if you find gron useful.