Guide to Passing Bash Variables to jq
Last updated: March 18, 2024
1. Overview
jq is a de facto standard utility for working with JSON data in Linux.
In this tutorial, we’ll learn how to make a jq program reusable by passing Bash variables to it.
2. Using the --arg Option
We can pass Bash variables to a jq program as pre-defined named variables using the --arg option:
First, let’s add sample JSON data in the fruits_template.json file:
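The original file isn’t reproduced in the source; a minimal stand-in with the fruit, color, and size fields that the rest of the tutorial refers to could look like this:

```shell
# Hypothetical contents for fruits_template.json (the original isn't shown)
cat > fruits_template.json <<'EOF'
{
  "fruit": "Apple",
  "color": "Red",
  "size": "Medium"
}
EOF
```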
We must note that we plan to reuse the fruits_template.json file throughout this tutorial.
Further, let’s assume that we want to override the value of the fruit field with a user-defined value that’s available through the bash_fruit_var Bash variable:
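For example:

```shell
bash_fruit_var=Banana
```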
Next, let’s go ahead and write a jq program using the --arg option to display the fruit field from the bash_fruit_var variable and all other field values from the fruits_template.json file using the .field operator:
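A sketch of such a program, assuming fruits_template.json holds a single object with a fruit field:

```shell
bash_fruit_var=Banana

# --arg binds the Bash value to the jq variable $jq_fruit_var (always a string)
jq --arg jq_fruit_var "$bash_fruit_var" '.fruit = $jq_fruit_var' fruits_template.json
```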
We must note that we’ve added the bash_ and jq_ prefixes to the variable names to indicate that the former is a Bash variable while the latter is a jq variable. Further, we should remember that we need to reference a jq variable by adding the $ prefix.
Finally, let’s also see an alternate approach to accessing the jq_fruit_var through $ARGS.named:
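A sketch under the same assumptions about fruits_template.json; every --arg binding also appears under $ARGS.named, keyed by its name:

```shell
bash_fruit_var=Banana

# Access the named argument through the $ARGS.named object instead of $jq_fruit_var
jq --arg jq_fruit_var "$bash_fruit_var" \
   '.fruit = $ARGS.named.jq_fruit_var' fruits_template.json
```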
Great! We got this right. Further, we must note that $ARGS.named only holds the named argument variables, as applicable in this case.
3. Using $ENV or the env Function
We can also pass the Bash variables as environment variables to the jq program. Subsequently, we can reference them within the jq program using either the $ENV variable or the env function.
Let’s start by using the $ENV variable to access the bash_fruit_var from the environment:
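A sketch, assuming fruits_template.json exists; here the variable is placed in the environment of the jq command only, rather than exported:

```shell
# The inline prefix puts bash_fruit_var into jq's environment for this command
bash_fruit_var=Banana jq '.fruit = $ENV.bash_fruit_var' fruits_template.json
```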
We must note that we could also export the bash_fruit_var to pass it as an environment variable to the jq program.
Next, let’s use the env function to access the bash_fruit_var from the environment:
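The same sketch using the env function instead of the $ENV variable:

```shell
# env.bash_fruit_var reads the variable from jq's current environment
bash_fruit_var=Banana jq '.fruit = env.bash_fruit_var' fruits_template.json
```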
Great! We’ve got the same result. Further, we must note that $ENV as an object represents the environment variables set when the jq program started, whereas env outputs an object representing jq’s current environment.
4. Using the --args Option
Alternatively, we can also pass the Bash variables as positional arguments to the jq program using the --args option:
We must note that jq will treat all the arguments after --args as positional arguments, so we must place it after the filename.
Let’s go ahead and define two Bash variables, namely bash_fruit_var and bash_color_var:
Moving on, let’s pass the bash_fruit_var and bash_color_var variables to the jq program to override the fruit and color fields in the fruits_template.json file:
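A sketch, assuming fruits_template.json holds an object with fruit and color fields:

```shell
bash_fruit_var=Banana
bash_color_var=Yellow

# --args must come after the filename; each value lands in $ARGS.positional
jq '.fruit = $ARGS.positional[0] | .color = $ARGS.positional[1]' \
   fruits_template.json --args "$bash_fruit_var" "$bash_color_var"
```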
We must note that $ARGS.positional only holds positional argument variables with zero-based indexing.
5. Using the --argjson Option
We can also pass JSON-encoded Bash variables to the jq program by using the --argjson option:
Let’s define the bash_fruit_json Bash variable to hold a JSON object with two fields, namely fruit and color:
In continuation, let’s pass the Bash variable bash_fruit_json as the jq variable jq_fruit_json and use it to override the values of the fruit and color fields in the fruits_template.json file:
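A sketch; merging with `+` overrides the fruit and color fields while keeping the rest of the template object:

```shell
bash_fruit_json='{"fruit": "Banana", "color": "Yellow"}'

# --argjson parses the value as JSON rather than treating it as a string
jq --argjson jq_fruit_json "$bash_fruit_json" '. + $jq_fruit_json' fruits_template.json
```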
Finally, let’s rewrite our program to access the jq_fruit_json variable through $ARGS.named :
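The same sketch rewritten to read the binding from $ARGS.named:

```shell
bash_fruit_json='{"fruit": "Banana", "color": "Yellow"}'

# --argjson bindings are also exposed under $ARGS.named
jq --argjson jq_fruit_json "$bash_fruit_json" '. + $ARGS.named.jq_fruit_json' fruits_template.json
```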
Perfect! The result is as we expected. Further, we must note that enclosing the Bash variable within double quotes is mandatory when it contains JSON-encoded values.
6. Using the --jsonargs Option
We can use the --jsonargs option for passing JSON-encoded Bash variables as positional arguments:
Similar to the --args option, we must remember to place --jsonargs after the filename because jq will interpret all the values after it as positional arguments.
Let’s go ahead and pass bash_fruit_json as a positional argument to the jq program:
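A sketch; --jsonargs parses each positional argument as JSON:

```shell
bash_fruit_json='{"fruit": "Banana", "color": "Yellow"}'

# The parsed object is available at $ARGS.positional[0]
jq '. + $ARGS.positional[0]' fruits_template.json --jsonargs "$bash_fruit_json"
```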
The output looks correct. Further, we must note that we used the $ARGS.positional object to access the JSON-encoded data passed to the program.
7. Using the --rawfile Option
We can use the --rawfile option to pass the contents of a file as a variable to the jq program:
We must note that while bash_fruits_file is a Bash variable representing a valid file path, jq_raw is a jq variable containing the contents of that file.
Let’s start by taking a look at the raw_fruits_data file containing the name and color information for multiple fruits:
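The file’s exact contents aren’t shown in the source; assume a simple comma-separated name,color layout:

```shell
# Hypothetical contents for raw_fruits_data
cat > raw_fruits_data <<'EOF'
Banana,Yellow
Cherry,Red
EOF
```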
Next, let’s define the bash_fruits_file Bash variable and initialize it to the raw_fruits_data file path:
We must note that storing the file path in a Bash variable allows us to reuse the same jq program in a Bash script.
Moving on, let’s write a jq program to transform the raw data available in the raw_fruits_data file into multiple JSON fruit objects by passing its content in the jq_raw variable:
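A sketch consistent with the description that follows, assuming raw_fruits_data holds comma-separated name,color lines and fruits_template.json holds an object with a size field:

```shell
bash_fruits_file=raw_fruits_data

# Bind the template object to $fruit_json, split the raw lines, drop empties,
# and build one JSON fruit object per line
jq --rawfile jq_raw "$bash_fruits_file" \
   '. as $fruit_json
    | $jq_raw | split("\n") | .[] | select(. != "")
    | split(",") | {fruit: .[0], color: .[1], size: $fruit_json.size}' \
   fruits_template.json
```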
We must note that we stored the content of the fruits_template.json file in the fruit_json variable and applied the split function over the jq_raw variable to get an array of strings. Further, we used the iterator operator (.[]) and the select function to filter out the empty values.
Generally speaking, using the --rawfile option, we can pass non-JSON data directly from a file to the jq program.
8. Using the --slurpfile Option
We can use the --slurpfile option to pass the contents of a file containing JSON-encoded data as an array to the jq program:
Let’s take a look at the my_fruit_jsons file containing multiple JSON objects:
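The file’s exact contents aren’t shown in the source; assume one JSON object per line, with no enclosing array:

```shell
# Hypothetical contents for my_fruit_jsons
cat > my_fruit_jsons <<'EOF'
{"fruit": "Banana", "color": "Yellow"}
{"fruit": "Cherry", "color": "Red"}
EOF
```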
We must note that the my_fruit_jsons file itself isn’t a valid JSON array.
Next, let’s define the bash_fruits_json_file Bash variable and initialize it to the my_fruit_jsons file path:
We must note that storing the file path in a Bash variable allows us to reuse the same jq program in a script.
Next, let’s see this in action by binding the JSON objects to the my_fruits array object in the jq program:
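A sketch (the exact program isn’t shown in the source); --slurpfile reads every object in the file into a single array bound to $my_fruits:

```shell
bash_fruits_json_file=my_fruit_jsons

# Wrap the slurped array in an object so the output is a single valid JSON object
jq -n --slurpfile my_fruits "$bash_fruits_json_file" '{fruits: $my_fruits}'
```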
We can notice that the output is a valid JSON object now. Further, we must note that the size field is missing, as it’s not present in the individual JSON objects.
Moving on, let’s write a jq program to transform the JSON objects available in the my_fruit_jsons file into raw text separated by a | character:
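A simplified sketch of such a program (the original isn’t shown, and this version skips the template file): reduce builds one newline-terminated, pipe-separated string, which split and select then break back into non-empty lines:

```shell
jq --raw-output -n --slurpfile my_fruits my_fruit_jsons \
   'reduce $my_fruits[] as $f (""; . + $f.fruit + "|" + $f.color + "\n")
    | split("\n") | .[] | select(. != "")'
```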
We must note that the content of the fruits_template.json file is available in the $fruit_json variable. Moreover, we used the reduce function to generate pipe-separated values, each separated with the newline character ("\n").
Later, we used the split function followed by the select function to filter out the empty lines by iterating over individual string values. Additionally, we used the --raw-output option to show the output as raw text.
9. Using Variables Within a Filter
Let’s start by defining the Bash variable bash_field_var and using it intuitively to get a specific field from the JSON object in fruits_template.json:
We can notice that we get a syntax error. However, the error also suggests the right way to access the field using the variable.
Next, let’s use the .[$jq_field] syntax to access the field represented by the bash_field_var Bash variable:
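A sketch, assuming fruits_template.json holds an object with a color field:

```shell
bash_field_var=color

# .[$jq_field] is the generic object index, so the field name can be dynamic
jq --arg jq_field "$bash_field_var" '.[$jq_field]' fruits_template.json
```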
Great! We got the correct result this time.
10. Conclusion
In this tutorial, we learned multiple ways to pass Bash variables to a jq program. Additionally, we explored different options with the jq command, such as --arg, --args, --argjson, --jsonargs, --rawfile, and --slurpfile, along with a few functions such as split and reduce.
Working with JSON in bash using jq
jq is a powerful tool that lets you read, filter, and write JSON in bash.
![The logos for bash and JSON next to each other](https://cameronnokes.com/images/json-bash.png)
Want the TL;DR? See the jq cheatsheet.
Perhaps you’ve seen or even written bash that looks like this:
(Note: the above code was taken from https://hackernoon.com/a-crude-imessage-api-efed29598e61, which is a great article).
That’s tough to read and even tougher to write. You have to pipe to 4 different utilities just to get to a property in the JSON response body! Bash doesn’t understand JSON out of the box, and using typical text manipulation tools like grep, sed, or awk gets difficult. Luckily there’s a better way, using a tool called jq.
jq can simplify the above bash to this:
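The original snippets aren’t reproduced here; the shape of the fix is to pipe the JSON response body straight into jq (the data and field names below are placeholders):

```shell
# Stand-in for a curl response, e.g.: response=$(curl -s https://api.example.com/me)
response='{"name": "Cameron", "id": 123}'

# One jq invocation replaces the whole grep/sed/awk chain
echo "$response" | jq -r '.name'
```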
That’s much nicer 😎. By making JSON easy to work with in bash, jq opens up a lot of automation possibilities that otherwise required me to write something in node.js (which isn’t bad, it just takes longer generally).
Why not just use node.js when you need to deal with JSON?
Sometimes node.js is the right tool. For most automation tasks, I like to use bash whenever possible because it’s faster and even more portable (I can share a bash script with team members that don’t have node.js installed). To me, bash is more expressive and succinct for certain tasks than node is.
jq isn’t a built-in command in any environment, so you have to install it. Run brew install jq on macOS. See jq’s install help page for how to install on other environments.
Basics of jq
jq works similarly to sed or awk — like a filter that you pipe to and extract values from. Also like sed or awk, it basically has its own domain-specific language (DSL) for querying JSON. Luckily, it’s really intuitive (unlike awk 😜).
Get a property
Let’s say we have JSON that looks like this:
To print out the foo property, we use the . operator followed by the property name.
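Reconstructed example (the JSON object is a minimal stand-in; the foo property and the 123 output come from the surrounding text):

```shell
echo '{"foo": 123, "bar": 456}' | jq '.foo'
# → 123
```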
That will print out 123, as expected.
This works with nesting too. So .a.b.c.d will traverse down nested objects’ properties.
This, all by itself, is pretty useful. For a realistic and totally useful example, let’s write a script that gets the Astronomy Picture of the Day and sets it as our wallpaper (this is macOS only).
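The script itself isn’t included in the source; a sketch assuming NASA’s public APOD endpoint (whose response carries the image URL in a url field, with DEMO_KEY as the public demo API key) and macOS’s osascript for setting the wallpaper:

```shell
#!/bin/bash
# Fetch the APOD metadata, extract the image URL with jq, download the image,
# and set it as the desktop picture (macOS only)
url=$(curl -s 'https://api.nasa.gov/planetary/apod?api_key=DEMO_KEY' | jq -r '.url')
curl -s -o /tmp/apod.jpg "$url"
osascript -e "tell application \"Finder\" to set desktop picture to POSIX file \"/tmp/apod.jpg\""
```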
Yay! All this astronomy stuff makes it feel like the right time for a Neil deGrasse Tyson gif.
Note that if a property has spaces or weird characters in it, you’ll have to use quotes. For example:
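A quick illustration with a hypothetical key:

```shell
echo '{"foo bar": 42}' | jq '."foo bar"'
# → 42
```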
Also, be sure to always wrap your jq selector in single quotes, otherwise bash tries to interpret symbols like ., whereas we want jq to do that.
Iteration
Now let’s see how iteration works. The array or object value iterator operator, .[], is what makes it possible.
Here’s a really basic example:
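Reconstructed from the described output:

```shell
echo '[1, 2, 3]' | jq '.[]'
```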
That will output 1, 2, 3 on separate lines.
In an array of objects, you can access a property on each item in the array like so:
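For example (the property name here is a placeholder):

```shell
echo '[{"name": "foo"}, {"name": "bar"}]' | jq '.[].name'
```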
Or on an object, .[] will output the value of each key/value pair:
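Reconstructed to match the described output of 1 2:

```shell
echo '{"a": 1, "b": 2}' | jq '.[]'
```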
So that will return 1 2.
Note that you can also pass an index to .[], so
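the command is missing from the source, but given the output described next it presumably looked something like:

```shell
echo '["foo", "bar"]' | jq '.[1]'
```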
will return just bar.
Now how do we do something for each line? In the same way you’d handle anything that outputs multiple lines of information in bash: xargs , for loops, or some commands just handle multiple input items, etc. For example:
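One plausible shape of the xargs approach (the echoed text is a placeholder):

```shell
# xargs runs the command once per line of jq output; {} is the placeholder
echo '[1, 2, 3]' | jq '.[]' | xargs -I {} echo "got: {}"
```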
jq Functions
jq also has built-in “functions”. Returning to the previous object iteration example — let’s say we wanted to get the keys of the object (not the values) as an array:
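Reconstructed to match the described a b output; -r strips the quotes:

```shell
echo '{"a": 1, "b": 2}' | jq -r 'keys | .[]'
```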
which will return a b . Note that we’re also using the pipe | operator, which works the same in jq as it does in bash — it takes the output from the left and passes it as input to the right.
Another handy function for arrays and objects is the length function, which returns the array’s length property or the number of properties on an object.
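For example:

```shell
echo '[1, 2, 3]' | jq 'length'        # → 3
echo '{"a": 1, "b": 2}' | jq 'length' # → 2
```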
You can get even fancier and create intermediary arrays and objects on the fly. Here, I’m combining the keys of the dependencies and devDependencies objects (from a package.json file) into an array, flattening it, and then getting the length.
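A sketch with hypothetical package.json contents piped in for illustration:

```shell
# Collect the keys of both objects, flatten into one array, then count
echo '{
  "dependencies": {"react": "1.0.0"},
  "devDependencies": {"jest": "1.0.0", "eslint": "1.0.0"}
}' | jq '[(.dependencies, .devDependencies) | keys] | flatten | length'
# → 3
```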
That returns the number of dependencies and devDependencies a package.json contains.
Creating objects
You can also create objects on the fly. This can be useful for re-shaping some JSON. For example:
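A small illustration with placeholder data:

```shell
# Build a brand-new object from pieces of the input
echo '{"first": "Ada", "last": "Lovelace"}' | jq '{full_name: (.first + " " + .last)}'
```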
Let’s use it for real now
What if I wanted to audit my package.json dependencies and remove anything that’s not being used? Unused dependencies slow down npm installs for everyone and are just messy. I could manually grep for usages of each dependency (via grep or in my IDE), but if you have a lot of dependencies that gets tedious fast, so let’s automate it.
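The script itself is missing from the source; here is a hedged reconstruction that matches the footnoted description (the file patterns and the search string are assumptions):

```shell
#!/bin/bash
# [1] grep for usages of a single dependency name
grep_dep() {
  grep --include="*.js" --exclude-dir="node_modules" -Rn --color "'$1'" .
}
# [2] export so the bash subshell spawned by xargs can see the function
export -f grep_dep

# [3] -r gives raw dependency names, one per line
# [4] xargs: -t echoes each constructed command, -I {} is the placeholder,
#     -P 4 runs 4 greps concurrently
jq -r '.dependencies | keys | .[]' package.json \
  | xargs -t -I {} -P 4 bash -c "grep_dep {}"
```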
[1] Here’s how the grep flags work:
--include and --exclude-dir narrow the files that get searched
-R means recursive, tells it to grep all matching files
--color colorizes the output
-n displays line numbers
[2] I have to export it so that a subshell can see it. If you want xargs to call a custom function, you have to call it in a subshell for some reason
[3] -r is for “raw-output”, so no quotes around values, which makes it suitable for processing in other bash commands. We get the dependency names as an array (this is equivalent to Object.keys(require('./package.json').dependencies) in node.js)
[4] Then we pipe that to xargs, which handles setting up a grep for each line. Here’s how the xargs flags work:
-t tells it to echo the constructed command; useful for debugging
-I {} defines the replacement string where the dependency string will get placed
-P 4 defines the concurrency, so 4 concurrent greps
we tell it to start a bash subshell where our grep_dep function is called with its args
There’s more that could be done to the grep-ing in that script to make it more robust, but that’s the basic gist.
I used something similar to this recently at work to prune unused dependencies. We have a huge front-end monolith with a single package.json that has 250 dependencies 🙀, so some automated assistance was necessary.
jq is awesome and makes working with JSON in bash easy. The DSL for filtering, querying, and creating JSON goes much deeper than what I’ve covered here, so see https://stedolan.github.io/jq/manual/ for the full documentation.
Using bash variables in jq
Due to the special characters used in JSON, the easiest way to use jq with inline scripts is by putting the script between single quotes. That makes it impossible to use bash variables inside your script. Fortunately, jq has an --arg parameter to create a predefined variable from an external source.
You can use it to define a $foo variable with the contents of the bash variable $FOO, for example:
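A minimal sketch (the value is a placeholder):

```shell
FOO="hello world"

# -n: no input; --arg binds the Bash value to the jq variable $foo
jq -n --arg foo "$FOO" '$foo'
# → "hello world"
```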
You might notice the -n parameter: this tells jq to use null as input instead of reading JSON from stdin like it normally would. I'm using it in all examples on this page as it makes them a bit shorter. In real-world usage you probably won't need it.
As usual in bash scripts you can also execute a command inline:
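For instance (the date format is arbitrary):

```shell
# The inline command's output becomes the value of $today
jq -n --arg today "$(date +%F)" '{generated: $today}'
```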
Every variable created with --arg is treated as a string. That might cause some unexpected behaviour when you work with numbers, like when you attempt to add a number to an argument. Comparisons between strings and numbers seem to work fine, but I wouldn't count on it (pun intended).
There are two ways to solve this: either convert the variable to a number with tonumber or use --argjson instead. I prefer to use tonumber because that way I'm sure jq forces it to be a number. As my argument may be the result of an earlier command, I may be using an error message as input instead of the expected numeric result.
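Both approaches side by side (the value is a placeholder):

```shell
NUM=5

# --arg makes $num the string "5"; tonumber converts it before the arithmetic
jq -n --arg num "$NUM" '($num | tonumber) + 1'
# → 6

# --argjson parses the value as JSON, so $num is already the number 5
jq -n --argjson num "$NUM" '$num + 1'
# → 6
```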
Variables from files
It's also possible to load variables from files. With --slurpfile we can read a file containing JSON objects. An array containing the separate objects is then made available as a predefined variable. Note that even if your file contains only a single JSON object, the variable is still an array, so the [0] is required to access it.
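A sketch with a hypothetical single-object file:

```shell
echo '{"name": "test"}' > single.json

# $data is an array of all objects in the file, hence the [0]
jq -n --slurpfile data single.json '$data[0].name'
# → "test"
```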
A second option is --rawfile, creating a string variable with the exact contents of the file. Note that as jq outputs JSON, reading and printing a plain-text file this way results in an escaped string.
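A sketch with a hypothetical text file; the newlines come back JSON-escaped:

```shell
printf 'line one\nline two\n' > notes.txt

# The file's exact bytes become one string; jq prints it JSON-escaped
jq -n --rawfile content notes.txt '$content'
# → "line one\nline two\n"
```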