fix: SEO Fixes, broken link fixes #2787
Conversation
Add robots.txt to middleware to stop 500 errors
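A minimal sketch (not this PR's actual diff) of one way to exclude robots.txt from a Next.js middleware matcher so the static file is served directly instead of falling through to handler logic that can 500:

```ts
// middleware.ts (illustrative sketch, not the real apps/dashboard/middleware.ts)
import { NextResponse } from "next/server";
import type { NextRequest } from "next/server";

export function middleware(_request: NextRequest) {
  // Auth/tenant logic would normally run here; excluded paths never reach it.
  return NextResponse.next();
}

export const config = {
  // Exclude robots.txt (plus Next.js internals and the favicon) from the
  // middleware so those requests are served statically instead of erroring.
  matcher: ["/((?!robots.txt|_next/static|_next/image|favicon.ico).*)"],
};
```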
📝 Walkthrough
This pull request introduces several changes across the Unkey application, focusing on web crawling configuration, routing, and content updates. The modifications include adding robots.txt files, a sitemap, and a redirect for the policies pages.
Thank you for following the naming conventions for pull request titles! 🙏
This adds robots.txt, a sitemap, and a redirect for the policies page.
Remove my email and fix a link that was 404ing
Force-pushed from 6db7e1d to 17a4486.
nice
Actionable comments posted: 1
🧹 Nitpick comments (4)
apps/www/next.config.js (1)
40-43: Consider using a redirect for SEO tracking.
Rewriting /terms to /policies/terms is valid, but using a redirect may improve SEO by informing search engines that the route has permanently moved. For large directories or high-traffic endpoints, this can help preserve link equity.
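For reference, a hedged sketch of what the suggested permanent redirect could look like; the actual next.config.js may structure this differently:

```ts
// Sketch of a redirects() entry; next.config.js uses the same object shape in plain JS.
import type { NextConfig } from "next";

const config: NextConfig = {
  async redirects() {
    return [
      {
        source: "/terms",
        destination: "/policies/terms",
        permanent: true, // 308 — signals a permanent move so link equity carries over
      },
    ];
  },
};

export default config;
```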
apps/www/app/sitemap.ts (3)
1-3: Consider validating content arrays from external sources.
If allPosts, allPolicies, allChangelogs, or allGlossaries are empty or undefined, it might cause unexpected behavior in your sitemap. You could provide a fallback (e.g., an empty array) or log a warning for better resilience.
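A minimal sketch of the suggested guard; the import path is an assumption, not necessarily the project's real one:

```ts
// sitemap.ts (sketch): guard the content collections before building entries.
// "@/content" is a hypothetical import path; the real sitemap.ts imports these elsewhere.
import { allPosts, allChangelogs } from "@/content";

const posts = allPosts ?? []; // fall back to an empty array instead of crashing the build
const changelogs = allChangelogs ?? [];

if (posts.length === 0) {
  console.warn("sitemap: no posts found — generated sitemap may be incomplete");
}
```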
4-5: Allow configuration of the base URL.
Currently, the baseUrl is hardcoded as "https://unkey.com". If you ever need to generate sitemaps for other environments (e.g., testing or staging), consider making baseUrl configurable via environment variables to avoid accidental misconfiguration.
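One possible way to make this configurable (the variable name is hypothetical):

```ts
// sitemap.ts (sketch): NEXT_PUBLIC_BASE_URL is a hypothetical variable name;
// the production domain remains the default when it is unset.
const baseUrl = process.env.NEXT_PUBLIC_BASE_URL ?? "https://unkey.com";
```

Keeping the production URL as the fallback means a missing variable degrades gracefully rather than emitting broken sitemap entries.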
26-61: Potential improvement: standardize lastModified values.
Currently, some entries have new Date() while others derive lastModified from content fields. For consistency and best SEO practices, consider standardizing how lastModified is assigned (e.g., always using the content's own date, or always using new Date() plus a fallback if missing).
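A small helper along these lines could centralize the fallback (a sketch, not the PR's code):

```ts
// Sketch: prefer the entry's own date when present, otherwise fall back to build time.
function lastModifiedOf(entry: { date?: string | Date }): Date {
  return entry.date ? new Date(entry.date) : new Date();
}

// e.g. { url: `${baseUrl}/blog/${post.slug}`, lastModified: lastModifiedOf(post) }
```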
📜 Review details
Configuration used: CodeRabbit UI
Review profile: CHILL
Plan: Pro
📒 Files selected for processing (8)
- apps/dashboard/app/robots.txt (1 hunks)
- apps/dashboard/middleware.ts (1 hunks)
- apps/www/app/policies/[slug]/page.tsx (0 hunks)
- apps/www/app/robots.txt (1 hunks)
- apps/www/app/sitemap.ts (1 hunks)
- apps/www/content/blog/how-unkey-treats-marketing.mdx (1 hunks)
- apps/www/content/changelog/2024-03-01.mdx (1 hunks)
- apps/www/next.config.js (1 hunks)
💤 Files with no reviewable changes (1)
- apps/www/app/policies/[slug]/page.tsx
✅ Files skipped from review due to trivial changes (4)
- apps/www/content/blog/how-unkey-treats-marketing.mdx
- apps/www/app/robots.txt
- apps/dashboard/app/robots.txt
- apps/www/content/changelog/2024-03-01.mdx
🔇 Additional comments (1)
apps/www/app/sitemap.ts (1)
16-19: Avoid using anchors (#${changelog.slug}) in sitemap URLs.
Many crawlers treat fragment identifiers as non-navigable references to sections within a page, making them less useful for indexing distinct content. If each changelog entry truly has its own page, consider using unique routes or query params.
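For illustration, a sketch of a per-entry route instead of a fragment, assuming such a route exists:

```ts
// Sketch: crawlers treat "#fragment" URLs as part of /changelog itself,
// whereas a per-entry route is indexed as its own page.
declare const baseUrl: string;
declare const changelog: { slug: string; date?: string };

// Instead of `${baseUrl}/changelog#${changelog.slug}` ...
const entry = {
  url: `${baseUrl}/changelog/${changelog.slug}`, // assumes a dedicated /changelog/[slug] route
  lastModified: changelog.date ? new Date(changelog.date) : new Date(),
};
```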
What does this PR do?
Add robots.txt to middleware to stop 500 errors.
Fixes # (issue)
If there is not an issue for this, please create one first. This is used for tracking purposes and also helps us understand why this PR exists.
Type of change
How should this be tested?
Checklist
Required
- Ran pnpm build
- Ran pnpm fmt
- Removed console.logs
- Ran git pull origin main
Appreciated
Summary by CodeRabbit
Release Notes
New Features
- Rewrite of the /terms page
Content Updates
SEO Configuration
- Added robots.txt files for the dashboard and www applications