diff --git a/docs/interpret/faq.ipynb b/docs/interpret/faq.ipynb
index 8a5087e53..b4906c099 100644
--- a/docs/interpret/faq.ipynb
+++ b/docs/interpret/faq.ipynb
@@ -234,7 +234,38 @@
"source": [
"
Can we enforce monotonicity for individual EBM terms?
\n",
"\n",
- "We currently cannot do this through training, but it is possible to enforce monotonicity through postprocessing a graph. We generally recommend using [isotonic regression](https://scikit-learn.org/stable/modules/generated/sklearn.isotonic.IsotonicRegression.html#sklearn.isotonic.IsotonicRegression) on each graph output to force positive or negative monotonicity. Code examples coming soon!"
+ "It is possible to enforce monotonicity for individual terms in an EBM via two methods:\n:",
+ "\n",
+ "- By setting `monotone_constraints` in the `ExplainableBoostingClassifier` or `ExplainableBoostingRegressor` constructor. This parameter is a list of integers, where each integer corresponds to the monotonicity constraint for the corresponding feature. A value of -1 enforces decreasing monotonicity, 0 enforces no monotonicity, and 1 enforces increasing monotonicity. For example, `monotone_constraints=[1, -1, 0]` would enforce increasing monotonicity for the first feature, decreasing monotonicity for the second feature, and no monotonicity for the third feature.\n",
+ "\n",
+ "- Through postprocessing a graph. We generally recommend using [isotonic regression](https://scikit-learn.org/stable/modules/generated/sklearn.isotonic.IsotonicRegression.html#sklearn.isotonic.IsotonicRegression) on each graph output to force positive or negative monotonicity. This can be done by calling the `monotonize` method on the EBM object.",
+ "\n",
+ "Postprocessing is the recommended method, as it prevents the model from compensating for the monotonicity constraints by learning non-monotonic effects in other highly-correlated features."
]
},
{