
The backend is currently throwing `groq.BadRequestError` because the `gemma2-9b-it` model has been decommissioned by Groq.

**Impact:** All Bias, Chat, and Fact-Check modules are failing.

**Solution:** Upgrade to `llama-3.3-70b-versatile` immediately.
I have raised a PR to fix this.
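
For reference, a minimal sketch of the one-line change, assuming the standard Groq Python SDK and that the model name is passed at the chat-completion call site (the exact call sites are covered in the PR; the `messages` payload here is illustrative):

```python
# Sketch of the model swap, assuming the standard Groq Python SDK
# (`pip install groq`) with GROQ_API_KEY set in the environment.
from groq import Groq

client = Groq()  # picks up GROQ_API_KEY from the environment

resp = client.chat.completions.create(
    model="llama-3.3-70b-versatile",  # was "gemma2-9b-it" (decommissioned)
    messages=[{"role": "user", "content": "ping"}],
)
print(resp.choices[0].message.content)
```

Since `llama-3.3-70b-versatile` is a drop-in replacement in the same chat-completions API, only the `model` string should need to change wherever the old model is referenced.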