This is my first bash script and I have run into a little problem with setting the decimal precision. I have been tasked with creating a bash script that calculates the area and circumference of a circle given a diameter of 20. This is what I currently have:
#!/bin/bash
clear
diameter=$1 # storing first argument
radius=$(echo "scale=5;$diameter / 2" | bc) # setting radius
# echo "$radius"
# calculate area of a circle
area=$(echo "scale=5;3.14 * ($radius * $radius)" | bc -l) # A = pi(r^2)
# calculate circumference of a circle
circum=$(echo "scale=5;2 * 3.14 * ($radius)" | bc -l) # C = 2(pi)(r)
echo "Circumference: $circum"
echo "Area: $area"
When I run the script, it prints out:
Circumference: 62.80000
Area: 314.00000
It should be printing out:
Circumference: 62.83185
Area: 314.15926
I do not understand why it is not displaying the correct decimal values. I have set scale=5 to display five decimal places, which it is doing. I am confused about why the zeros are showing up instead of the true decimal values. Any help or suggestions would be greatly appreciated.
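As far as I can tell, the same values come out if the expressions are run directly in bc, with the radius written out as 10.00000 (which is what the division above produces), so it does not seem to be something about how the script captures the output:

$ echo "scale=5; 20 / 2" | bc
10.00000
$ echo "scale=5; 2 * 3.14 * 10.00000" | bc -l
62.80000
$ echo "scale=5; 3.14 * (10.00000 * 10.00000)" | bc -l
314.00000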