diff --git a/docs/dynamicform.md b/docs/dynamicform.md
index f30938ace55..adbc563f144 100644
--- a/docs/dynamicform.md
+++ b/docs/dynamicform.md
@@ -48,7 +48,7 @@ Also you can separate option's display name and value, using _${formName=default
### Creates Programatically
-Some language backend use programtic way to create form. for example, scala language backend. Function to create form is provided by [ZeppelinContext](./zeppelincontext.html).
+Some language backends offer a programmatic way to create forms. For example, [ZeppelinContext](./interpreter/spark.html#zeppelincontext) provides a form creation API.
Here're some examples.
diff --git a/docs/index.md b/docs/index.md
index c74f4f28752..e90ef61fa74 100644
--- a/docs/index.md
+++ b/docs/index.md
@@ -14,9 +14,12 @@ group: nav-right
* [Tutorial](./tutorial/tutorial.html)
+### Interpreter
+
+* [Spark](./interpreter/spark.html)
+
### Manual
-* [Zeppelin Context](./zeppelincontext.html)
* [Dynamic Form](./dynamicform.html)
* [Display System](./display.html)
diff --git a/docs/interpreter/spark.md b/docs/interpreter/spark.md
new file mode 100644
index 00000000000..3bab5cc3411
--- /dev/null
+++ b/docs/interpreter/spark.md
@@ -0,0 +1,163 @@
+---
+layout: page
+title: "Spark Interpreter Group"
+description: ""
+group: manual
+---
+{% include JB/setup %}
+
+
+## Spark
+
+[Apache Spark](http://spark.apache.org) is supported in Zeppelin with
+Spark Interpreter group, which consists of 4 interpreters.
+
+
+
+| Name | Class | Description |
+| ---- | ----- | ----------- |
+| %spark | SparkInterpreter | Creates SparkContext and provides a Scala environment |
+| %pyspark | PySparkInterpreter | Provides a Python environment |
+| %sql | SparkSQLInterpreter | Provides a SQL environment |
+| %dep | DepInterpreter | Dependency loader |
+
+
+
+
+
+
+### SparkContext, SQLContext, ZeppelinContext
+
+SparkContext, SQLContext and ZeppelinContext are automatically created and exposed as the variables `sc`, `sqlContext` and `z`, respectively, in both the Scala and Python environments.
+
+Note that the Scala and Python environments share the same SparkContext, SQLContext and ZeppelinContext instances.
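+
+As a quick sketch (the RDD contents here are arbitrary), a `%spark` paragraph can use these variables immediately, with no setup:
+
+```scala
+%spark
+// sc and sqlContext are already created by the interpreter
+val numbers = sc.parallelize(1 to 100)
+println(numbers.sum())  // computed on the cluster
+```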
+
+
+
+
+
+### Dependency loading
+
+When your code requires an external library, the %dep interpreter lets you load it without the download/copy/restart-Zeppelin cycle. With %dep you can:
+
+ * Load libraries recursively from a Maven repository
+ * Load libraries from the local filesystem
+ * Add additional Maven repositories
+ * Automatically add libraries to the Spark cluster (this can be turned off)
+
+The dep interpreter runs in the Scala environment, so you can write any Scala code here.
+
+Here are some usage examples.
+
+```scala
+%dep
+z.reset() // clean up previously added artifact and repository
+
+// add maven repository
+z.addRepo("RepoName").url("RepoURL")
+
+// add maven snapshot repository
+z.addRepo("RepoName").url("RepoURL").snapshot()
+
+// add artifact from filesystem
+z.load("/path/to.jar")
+
+// add artifact from maven repository, with no dependency
+z.load("groupId:artifactId:version").excludeAll()
+
+// add artifact recursively
+z.load("groupId:artifactId:version")
+
+// add artifact recursively except comma separated GroupID:ArtifactId list
+z.load("groupId:artifactId:version").exclude("groupId:artifactId,groupId:artifactId, ...")
+
+// exclude with pattern
+z.load("groupId:artifactId:version").exclude("*")
+z.load("groupId:artifactId:version").exclude("groupId:artifactId:*")
+z.load("groupId:artifactId:version").exclude("groupId:*")
+
+// local() skips adding artifact to spark clusters (skipping sc.addJar())
+z.load("groupId:artifactId:version").local()
+```
+
+Note that the %dep interpreter should be used before %spark, %pyspark and %sql.
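+
+For example, a note might load an artifact with `%dep` first and then use it from `%spark` (the artifact chosen here is just an illustration):
+
+```scala
+%dep
+z.load("org.apache.commons:commons-csv:1.1")
+```
+
+```scala
+%spark
+// classes from the loaded artifact are now on the interpreter classpath
+import org.apache.commons.csv.CSVFormat
+```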
+
+
+
+
+
+### ZeppelinContext
+
+
+Zeppelin automatically injects ZeppelinContext as the variable `z` in your Scala/Python environment. ZeppelinContext provides some additional functions and utilities.
+
+
+#### Object exchange
+
+ZeppelinContext extends a map that is shared between the Scala and Python environments,
+so you can put an object in from Scala and read it from Python, and vice versa.
+
+Put object from scala
+
+```scala
+%spark
+val myObject = ...
+z.put("objName", myObject)
+```
+
+Get object from python
+
+```python
+%pyspark
+myObject = z.get("objName")
+```
+
+
+#### Form creation
+
+ZeppelinContext provides functions for creating forms.
+In the Scala and Python environments, you can create forms programmatically.
+
+```scala
+%spark
+/* Create text input form */
+z.input("formName")
+
+/* Create text input form with default value */
+z.input("formName", "defaultValue")
+
+/* Create select form */
+z.select("formName", Seq(("option1", "option1DisplayName"),
+ ("option2", "option2DisplayName")))
+
+/* Create select form with default value*/
+z.select("formName", "option1", Seq(("option1", "option1DisplayName"),
+ ("option2", "option2DisplayName")))
+```
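+
+The same functions are available from `%pyspark` (a sketch, assuming the Python API mirrors the Scala one shown above):
+
+```python
+%pyspark
+# Create text input form with default value
+z.input("formName", "defaultValue")
+```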
+
+In the SQL environment, you can create a form with a simple template.
+
+```
+%sql
+select * from ${table=defaultTableName} where text like '%${search}%'
+```
+
+To learn more about dynamic forms, check out [Dynamic Form](../dynamicform.html).
\ No newline at end of file
diff --git a/docs/zeppelincontext.md b/docs/zeppelincontext.md
deleted file mode 100644
index b6f27dd4b3d..00000000000
--- a/docs/zeppelincontext.md
+++ /dev/null
@@ -1,78 +0,0 @@
----
-layout: page
-title: "ZeppelinContext"
-description: ""
-group: manual
----
-{% include JB/setup %}
-
-
-### Zeppelin Context
-
-ZeppelinContext is automatically created and injected into Scala language backend.
-It provies following function and references.
-
-
-#### SparkContext, SQLContext
-
-ZeppelinContext provides reference to SparkContext and SQLContext with some shortcut function.
-
-```scala
-/* reference to SparkContext */
-z.sc
-
-/* reference to SQLContext */
-z.sqlContext
-
-/* Shortcut to z.sqlContext.sql() */
-z.sql("select * from ...")
-```
-
-
-
-#### Dependency loader (Experimental)
-
-ZeppelinContext provides series of functions that loads jar library from local FS or Remote Maven repository. Loaded library is automatically added into Scala interpreter and SparkContext.
-
-```scala
-/* Load a library from local FS */
-z.load("/path/to/your.jar")
-
-/* Load a library from Maven repository */
-z.load("groupId:artifactId:version")
-
-/* Load library from Maven repository with dependencies */
-z.load("groupId:artifactId:version", true)
-
-/* Load a library from Local FS and add it into SparkContext */
-z.loadAndDist("/path/to/your.jar")
-
-/* Load a library with dependencies from Maven repository and add it into SparkContext*/
-z.loadAndDist("groupId:artifactId:version")
-
-/* Load library with dependencies from maven repository and add it into SparkContext*/
-z.loadAndDist("groupId:artifactId:version", true)
-```
-
-
-
-#### Form creation
-
-ZeppelinContext also provides functions for creating forms. To learn more about dynamic form, checkout [Dynamic Form](./dynamicform.html).
-
-
-```scala
-/* Create text input form */
-z.input("formName")
-
-/* Create text input form with default value */
-z.input("formName", "defaultValue")
-
-/* Create select form */
-z.select("formName", Seq(("option1", "option1DisplayName"),
- ("option2", "option2DisplayName")))
-
-/* Create select form with default value*/
-z.select("formName", "option1", Seq(("option1", "option1DisplayName"),
- ("option2", "option2DisplayName")))
-```
\ No newline at end of file
diff --git a/index.md b/index.md
index 229e78f65bb..4811e323480 100644
--- a/index.md
+++ b/index.md
@@ -45,7 +45,7 @@ Zeppelin provides built-in Apache Spark integration. You don't need to build a s
Zeppelin's Spark integration provides
- Automatic SparkContext and SQLContext injection
-- Runtime jar dependency loading from local filesystem or maven repository. Learn more about [dependency loader](./docs/zeppelincontext.html).
+- Runtime jar dependency loading from local filesystem or maven repository. Learn more about [dependency loader](./docs/interpreter/spark.html#dependency-loading).
- Canceling job and displaying its progress