Crashing on Mac M1 (with appropriate changes to code) #1
The JSL on Mac M1 issue above has been solved and closed. I thought that would solve this problem. It did solve this

I know @DevinTDHa and you have worked on the "My Azul 1.8 java failed w/ wrong architecture" problem, so I tried building this starter app with Java 11, Spark 3.3.1, and Spark NLP 4.2.3. (The above error is with the env I described in the closing comment of the above issue.)

```diff
--- a/build.sbt
+++ b/build.sbt
@@ -7,11 +7,11 @@ val scalaTestVersion = "3.2.9"

 name := "spark-nlp-starter"

-version := "4.2.1"
+version := "4.2.3"

 scalaVersion := "2.12.15"

-javacOptions ++= Seq("-source", "1.8", "-target", "1.8")
+javacOptions ++= Seq("-source", "11", "-target", "11")

 licenses := Seq("Apache-2.0" -> url("https://opensource.org/licenses/Apache-2.0"))
@@ -22,15 +22,15 @@ developers in ThisBuild := List(
       email = "maziyar.panahi@iscpif.fr",
       url = url("https://github.com/maziyarpanahi")))

-val sparkVer = "3.3.0"
-val sparkNLP = "4.2.1"
+val sparkVer = "3.3.1"
+val sparkNLP = "4.2.3"

 libraryDependencies ++= {
   Seq(
     "org.apache.spark" %% "spark-core" % sparkVer % Provided,
     "org.apache.spark" %% "spark-mllib" % sparkVer % Provided,
     "org.scalatest" %% "scalatest" % scalaTestVersion % "test",
-    "com.johnsnowlabs.nlp" %% "spark-nlp" % sparkNLP)
+    "com.johnsnowlabs.nlp" %% "spark-nlp-m1" % sparkNLP)
 }
```
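As a side note on the "wrong architecture" symptom mentioned above: one quick check, offered here as an assumption rather than something stated in this thread, is to ask the running JVM which architecture it was built for. An x86_64 JDK running under Rosetta cannot load arm64 native libraries.

```scala
// Hedged diagnostic sketch (not from this issue): print the JVM's view of the
// host. On an Apple Silicon native JDK, os.arch is "aarch64"; an x86_64 JDK
// running under Rosetta reports "x86_64"/"amd64" and fails to load arm64
// native libraries with "wrong architecture" style errors.
object ArchCheck {
  def main(args: Array[String]): Unit = {
    println(s"os.name = ${System.getProperty("os.name")}")
    println(s"os.arch = ${System.getProperty("os.arch")}")
  }
}
```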
```diff
--- a/src/main/scala/Main.scala
+++ b/src/main/scala/Main.scala
@@ -8,6 +8,7 @@ object Main {
   val spark: SparkSession = SparkSession.builder
     .appName("spark-nlp-starter")
     .master("local[*]")
+    .config("spark.jars.packages", "com.johnsnowlabs.nlp:spark-nlp-m1_2.12:4.2.3")
     .getOrCreate

   def main(args: Array[String]): Unit = {
```

For some reason, two versions of
But dependencies only show one version:
Any thoughts? Thank you.
Could you please create a new issue regarding RocksDB support on M1 in https://github.com/JohnSnowLabs/spark-nlp so we can officially have it on JIRA and follow up on it? Many thanks.
The answer to this issue is on the main repo:
@DevinTDHa found the solution (see thread in
Thank you, @DevinTDHa and @maziyarpanahi.
(This is probably more of an issue with https://github.com/JohnSnowLabs/spark-nlp than with this code, but this repo is a convenient testing ground for Mac M1 experiments. I am cross-referencing this issue with the JSL issue JohnSnowLabs/spark-nlp#13079 I submitted.)
This code crashes when using `spark-nlp-m1` on a Mac M1. My edits:
I got that last line from SparkNLP.scala.
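For what it's worth, the choice between the plain and the `-m1` artifact can also be made at runtime from the JVM's reported OS and architecture. The object and method below are purely illustrative sketches, not part of SparkNLP.scala:

```scala
// Illustrative sketch only: NlpArtifact is a hypothetical helper, not part of
// Spark NLP. It picks the Maven coordinate by inspecting the running JVM, so
// an arm64 JVM on macOS gets the -m1 build and everything else gets the
// default artifact.
object NlpArtifact {
  def coordinate(version: String): String = {
    val os   = System.getProperty("os.name").toLowerCase
    val arch = System.getProperty("os.arch")
    val suffix = if (os.contains("mac") && arch == "aarch64") "-m1" else ""
    s"com.johnsnowlabs.nlp:spark-nlp${suffix}_2.12:$version"
  }

  def main(args: Array[String]): Unit =
    println(coordinate("4.2.3"))
}
```

On an M1-native JVM this would produce `com.johnsnowlabs.nlp:spark-nlp-m1_2.12:4.2.3`, matching the coordinate passed to `spark.jars.packages` in the Main.scala edit.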