
Build and Test Locally


Spark Install

Download a Spark binary distribution from https://spark.apache.org/downloads.html, unpack it, and add it to your shell profile:

$ vi .bash_profile

export SPARK_HOME=/Users/gounna/spark-4.0.1-bin-hadoop3
PATH=$PATH:/Library/Frameworks/Python.framework/Versions/3.11/bin:$SPARK_HOME/bin
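After editing the profile, reload it and check that the Spark binaries resolve from PATH (the version printed should match the downloaded distribution):

$ source ~/.bash_profile
$ spark-submit --version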

Build for Local Test

Build the snapshot jar with sbt, then launch spark-shell with it on the classpath (the exact snapshot filename will differ per build):

$ sbt package
$ spark-shell --jars "/Users/gounna/git/graphframes/target/scala-2.12/graphframes_2.12-0.0.0+558-266964f9+20251010-2126-SNAPSHOT.jar"

WARNING: Using incubator modules: jdk.incubator.vector
25/10/10 21:47:01 WARN NativeCodeLoader: Unable to load native-hadoop library for your platform... using builtin-java classes where applicable
Using Spark's default log4j profile: org/apache/spark/log4j2-defaults.properties
Setting default log level to "WARN".
To adjust logging level use sc.setLogLevel(newLevel). For SparkR, use setLogLevel(newLevel).
Welcome to
      ____              __
     / __/__  ___ _____/ /__
    _\ \/ _ \/ _ `/ __/  '_/
   /___/ .__/\_,_/_/ /_/\_\   version 4.0.1
      /_/
         
Using Scala version 2.13.16 (OpenJDK 64-Bit Server VM, Java 17.0.8)
Type in expressions to have them evaluated.
Type :help for more information.
Spark context Web UI available at http://192.168.1.3:4040
Spark context available as 'sc' (master = local[*], app id = local-1760100428041).
Spark session available as 'spark'.

scala> 
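To confirm the locally built jar actually loaded, a small smoke test in the REPL is enough. This is a minimal sketch; the toy vertices and edges below are made up for illustration, not part of the repository:

import org.graphframes.GraphFrame

// Toy graph purely for a smoke test; any small DataFrames with the
// expected column names ("id" for vertices, "src"/"dst" for edges) will do.
val vertices = spark.createDataFrame(Seq(
  ("a", "Alice"),
  ("b", "Bob"),
  ("c", "Carol")
)).toDF("id", "name")

val edges = spark.createDataFrame(Seq(
  ("a", "b", "friend"),
  ("b", "c", "follow")
)).toDF("src", "dst", "relationship")

val g = GraphFrame(vertices, edges)

// A couple of calls that exercise the library end to end,
// including the motif-finding API covered by the pattern test suites below.
g.inDegrees.show()
g.find("(x)-[e]->(y)").show()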

SBT Test

sbt scalafmtAll
sbt "testOnly org.graphframes.pattern.PatternSuite" 
sbt "testOnly org.graphframes.PatternMatchSuite"
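The two testOnly invocations above target only the pattern-matching suites; sbt test runs the full suite if needed (it takes noticeably longer).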
