"$brew install apache-spark' gets me version 2.3.x. '$brew search apache-spark' and '$brew info apache-spark' do not provide a an option to install a different version. is it possible to get a different version with homebrew?
-
Does this answer your question? https://stackoverflow.com/questions/3987683/homebrew-install-specific-version-of-formula?rq=1 – Ross Apr 13 '18 at 03:41
6 Answers
Run these commands (assuming you already have apache-spark installed via Homebrew):
cd "$(brew --repo homebrew/core)"
git log Formula/apache-spark.rb
E.g. for the 2.2.0 version:
...
commit bdf68bd79ebd16a70b7a747e027afbe5831f9cc3
Author: ilovezfs
Date: Tue Jul 11 22:19:12 2017 -0700
apache-spark 2.2.0 (#15507)
....
git checkout -b apache-spark-2.2.0 bdf68bd79ebd16a70b7a747e027afbe5831f9cc3
brew unlink apache-spark
HOMEBREW_NO_AUTO_UPDATE=1 brew install apache-spark
Cleanup
git checkout master
git branch -d apache-spark-2.2.0
Check / switch:
brew list apache-spark --versions
brew switch apache-spark 2.2.0
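Optionally, you can pin the formula so a later brew upgrade does not bump it back to the newest version (a small optional extra, not strictly required):
brew pin apache-spark    # brew upgrade will now skip it; undo with: brew unpin apache-spark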

-
Does this still work? When I run `git log Formula/apache-spark.rb` I only get _commit af3b5be4e824ee1168a449138c8069695c942d5b (grafted) Author: BrewTestBot Date: Thu Jul 5 15:46:17 2018 +0000 gron: update 0.6.0 bottle._ – ac2051 Jul 17 '18 at 09:52
-
Apparently brew uses shallow clones, so the git log appears grafted (fake history). You could run `git fetch --unshallow` first, or `git fetch --depth=???`, to update the tree history on your machine. I haven't tried it, but it should work – Tom Lous Jul 18 '18 at 06:05
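For example, un-grafting the tap history before searching it might look like this (untested sketch based on the suggestion above):
cd "$(brew --repo homebrew/core)"
git fetch --unshallow    # pulls the full history; this can take a while
git log --oneline -- Formula/apache-spark.rb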
-
-
There is no branch if you don't create it from a specific commit. `git checkout -b apache-spark-2.2.0 [commit]` creates the branch. You can call it anything you like. You just have to find the correct commit – Tom Lous Jul 23 '20 at 10:59
-
I'm getting the following error: `Error: Failed to download resource "apache-spark" Download failed: https://downloads.apache.org/spark/spark-2.4.5/spark-2.4.5-bin-hadoop2.7.tgz` when I run: `HOMEBREW_NO_AUTO_UPDATE=1 brew install apache-spark`, any chance I could specify that file process? – ahajib Aug 19 '20 at 17:11
-
Apparently the package has been removed from https://downloads.apache.org/spark/ ; you should edit the Homebrew formula to download from http://archive.apache.org/dist/spark/ instead – Tom Lous Aug 21 '20 at 07:00
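A sketch of that edit (the 2.4.5 URL here is only an assumption based on the error in the earlier comment):
brew edit apache-spark
# replace the downloads.apache.org URL in the formula with the archive, e.g.:
#   url "https://archive.apache.org/dist/spark/spark-2.4.5/spark-2.4.5-bin-hadoop2.7.tgz"
# the sha256 should be able to stay the same, since it is the same tarball on a different host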
-
The `switch` command was deprecated, so now this is not so nice and tidy: you just install the old formula version by checking it out with git. Apart from that, everything works. – ANDgineer Mar 09 '21 at 09:24
-
You don't need the switch command now; this process still works as of today. The only thing is you need to edit the formula URL. – user1735921 Jul 26 '21 at 04:50
-
Does anyone get `Error: apache-spark: wrong number of arguments (given 1, expected 0)` after running `HOMEBREW_NO_AUTO_UPDATE=1 brew install apache-spark` ? – John Jiang Feb 10 '23 at 20:18
-
I had a small fix to make it work for me: `brew install Formula/apache-spark.rb`. Pay attention if you have set `SPARK_HOME` to a specific version. – Saeed Mohtasham Jul 19 '23 at 16:37
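If `SPARK_HOME` is still pointing at an old install, checking and repointing it might look like this (the libexec path is an assumption based on how the formulas below install their files; adjust to your setup):
echo "$SPARK_HOME"
export SPARK_HOME="$(brew --prefix apache-spark)/libexec"    # hypothetical path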
I had the same problem: when installing through Homebrew, by default it could only find the apache-spark 2.3.0 formula and could not find 2.2.0, not even in deleted repos.
So I backed up the existing apache-spark.rb (version 2.3.0) from /usr/local/Homebrew/Library/Taps/homebrew/homebrew-core/Formula and then overwrote it with the formula below:
class ApacheSpark < Formula
  desc "Engine for large-scale data processing"
  homepage "https://spark.apache.org/"
  url "https://www.apache.org/dyn/closer.lua?path=spark/spark-2.2.0/spark-2.2.0-bin-hadoop2.7.tgz"
  version "2.2.0"
  sha256 "97fd2cc58e08975d9c4e4ffa8d7f8012c0ac2792bcd9945ce2a561cf937aebcc"
  head "https://github.com/apache/spark.git"

  bottle :unneeded

  def install
    # Rename beeline to distinguish it from hive's beeline
    mv "bin/beeline", "bin/spark-beeline"
    rm_f Dir["bin/*.cmd"]
    libexec.install Dir["*"]
    bin.write_exec_script Dir["#{libexec}/bin/*"]
  end

  test do
    assert_match "Long = 1000", pipe_output(bin/"spark-shell", "sc.parallelize(1 to 1000).count()")
  end
end
Then I followed the process above to re-install, after which I have both 2.2.0 and 2.3.0 with the switch facility.
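For reference, checking and switching between the two versions should look like this (same commands as the first answer, plus a version check):
brew list apache-spark --versions
brew switch apache-spark 2.2.0
spark-shell --version    # should report 2.2.0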
Hope it helps.

-
You will also need to update the URL to: `http://archive.apache.org/dist/spark/spark-2.2.0/spark-2.2.0-bin-hadoop2.7.tgz`. The URL in the snippet doesn't work anymore. – juanpaolo Sep 14 '18 at 09:08
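To confirm the archived tarball still matches the sha256 in the formula above, a manual check might look like this (untested sketch):
curl -fLO http://archive.apache.org/dist/spark/spark-2.2.0/spark-2.2.0-bin-hadoop2.7.tgz
shasum -a 256 spark-2.2.0-bin-hadoop2.7.tgz
# should print 97fd2cc58e08975d9c4e4ffa8d7f8012c0ac2792bcd9945ce2a561cf937aebcc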
I needed to install Apache Spark version 2.4.0 specifically on my MacBook. It is not available in the brew listing any more, but you can still make it work.
Install the latest Spark with:
brew install apache-spark
Let's say it installed apache-spark 3.0.1. Once completed, do:
brew edit apache-spark
and edit the apache-spark.rb as follows:
class ApacheSpark < Formula
  desc "Engine for large-scale data processing"
  homepage "https://spark.apache.org/"
  url "https://archive.apache.org/dist/spark/spark-2.4.0/spark-2.4.0-bin-hadoop2.7.tgz"
  mirror "https://archive.apache.org/dist/spark/spark-2.4.0/spark-2.4.0-bin-hadoop2.7.tgz"
  version "2.4.0"
  sha256 "c93c096c8d64062345b26b34c85127a6848cff95a4bb829333a06b83222a5cfa"
  license "Apache-2.0"
  head "https://github.com/apache/spark.git"

  bottle :unneeded

  depends_on "openjdk@8"

  def install
    # Rename beeline to distinguish it from hive's beeline
    mv "bin/beeline", "bin/spark-beeline"
    rm_f Dir["bin/*.cmd"]
    libexec.install Dir["*"]
    bin.install Dir[libexec/"bin/*"]
    bin.env_script_all_files(libexec/"bin", JAVA_HOME: Formula["openjdk@8"].opt_prefix)
  end

  test do
    assert_match "Long = 1000",
      pipe_output(bin/"spark-shell --conf spark.driver.bindAddress=127.0.0.1",
                  "sc.parallelize(1 to 1000).count()")
  end
end
Now uninstall Spark again using brew uninstall apache-spark
Install it again using brew install apache-spark
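If brew auto-updates the tap during the install and reverts your edit, disabling auto-update for that one command (as the first answer does) may help:
HOMEBREW_NO_AUTO_UPDATE=1 brew install apache-spark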
Result
% spark-shell
2021-02-09 19:27:11 WARN NativeCodeLoader:62 - Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Spark context Web UI available at http://192.168.0.17:4040
Spark context available as 'sc' (master = local[*], app id = local-1612927640472).
Spark session available as 'spark'.
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/ '_/
   /___/ .__/\_,_/_/ /_/\_\   version 2.4.0
      /_/
Using Scala version 2.11.12 (OpenJDK 64-Bit Server VM, Java 1.8.0_282)
Type in expressions to have them evaluated.
Type :help for more information.

-
I followed your solution, but after uninstalling the latest version, when I try to install again with brew, it gives me this error: `Error: apache-spark: Failed to download resource "apache-spark_bottle_manifest" Download failed: https://ghcr.io/v2/homebrew/core/apache-spark/manifests/3.1.1` – wawawa Oct 03 '22 at 18:17
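One possible workaround (untested): force brew to use the URL from the edited formula instead of looking for a prebuilt bottle:
brew install --build-from-source apache-spark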
For posterity: there's no point in trying to resurrect an older brew commit because the url in the formula (https://www.apache.org/dyn/closer.lua?path=spark/spark-2.2.0/spark-2.2.0-bin-hadoop2.7.tgz) is no longer valid. This also means the brew formula for 2.2.1 will not work as-is either.
At the very least, you need to update the url to http://archive.apache.org/dist/spark/spark-2.2.0/spark-2.2.0-bin-hadoop2.7.tgz (as noted by @juanpaolo).
In order to install Spark 2.2.0 via Homebrew today:
- Grab the 2.2.0 formula (https://github.com/Homebrew/homebrew-core/blob/bdf68bd79ebd16a70b7a747e027afbe5831f9cc3/Formula/apache-spark.rb)
- Update the url in line 4 from https://www.apache.org/dyn/closer.lua?path=spark/spark-2.2.0/spark-2.2.0-bin-hadoop2.7.tgz to http://archive.apache.org/dist/spark/spark-2.2.0/spark-2.2.0-bin-hadoop2.7.tgz
- Run brew install <path-to-updated-formula> (a command-line sketch follows below)
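A sketch of those three steps as shell commands (the raw.githubusercontent.com link is just the raw form of the formula URL above; `sed -i ''` assumes macOS/BSD sed):
curl -fLo apache-spark.rb https://raw.githubusercontent.com/Homebrew/homebrew-core/bdf68bd79ebd16a70b7a747e027afbe5831f9cc3/Formula/apache-spark.rb
sed -i '' 's#https://www.apache.org/dyn/closer.lua?path=spark/#http://archive.apache.org/dist/spark/#' apache-spark.rb
brew install ./apache-spark.rb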
TLDR/for the lazy:
brew install https://gist.githubusercontent.com/eddies/bc148d83b1fc5555520d0cdf2dff8553/raw/c7ce091a083cacb3519502860695b56b0b806070/apache-spark.rb
Or, via brew tap:
brew tap eddies/spark-tap
brew install apache-spark@2.2.0

You can even search for a list of formulae available for apache-spark:
brew search apache-spark
Then tap:
brew tap eddies/spark-tap
Then install whichever specific version is available:
brew install apache-spark@2.3.2


You can simply uninstall any version of Scala that you have on your Mac first.
Then, from the terminal on your MacBook, type brew install apache-spark@2.2.0
and that would install Spark version 2.2.0 on your Mac.

-
This did not work for me ==> bin % brew install apache-spark@2.4.8 Warning: No available formula with the name "apache-spark@2.4.8". Did you mean apache-spark? ==> Searching for similarly named formulae... This similarly named formula was found: apache-spark To install it, run: brew install apache-spark – sai May 01 '22 at 13:54
-
@sai Can you try with some other version of Spark and see if that works? – Nikunj Kakadiya May 02 '22 at 05:35