Seo improvements #4
base: master
Conversation
Pull request overview
This PR implements comprehensive SEO improvements for the ReChrome application by adding structured data, sitemap generation, robots.txt configuration, and enhanced metadata.
- Added robots.txt with bot-specific crawl rules and sitemap reference
- Implemented dynamic sitemap generation using the Next.js MetadataRoute API (see the sketch after this list)
- Added JSON-LD structured data (WebApplication and FAQPage schemas) for rich search results
- Enhanced metadata with Open Graph, Twitter cards, and improved SEO-focused titles/descriptions
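For orientation, here is a minimal sketch of what a MetadataRoute-based sitemap could look like. The route list, the changeFrequency value, and the NEXT_PUBLIC_URL fallback are illustrative assumptions, not the PR's exact code.

```ts
// app/sitemap.ts — illustrative sketch; the PR's actual routes and values may differ
import type { MetadataRoute } from "next";

// NEXT_PUBLIC_URL is the environment variable referenced in the review below;
// the fallback URL is an assumption for local development
const baseUrl = process.env.NEXT_PUBLIC_URL ?? "https://rechrome.vercel.app";

export default function sitemap(): MetadataRoute.Sitemap {
  return [
    {
      url: baseUrl,
      lastModified: new Date(),
      changeFrequency: "daily",
      priority: 1,
    },
  ];
}
```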
Reviewed changes
Copilot reviewed 5 out of 6 changed files in this pull request and generated 6 comments.
Summary per file:
| File | Description |
|---|---|
| public/robots.txt | New robots.txt file with crawler directives, sitemap reference, and bot-specific rules including blocking certain SEO crawlers |
| package.json | Version bump from 0.2.0 to 0.3.0 to reflect the SEO enhancements |
| app/sitemap.ts | New sitemap generation function using Next.js MetadataRoute with dynamic base URL support |
| app/schema.ts | New file with JSON-LD structured data generators for WebApplication and FAQPage schemas |
| app/layout.tsx | Enhanced metadata with SEO-optimized titles/descriptions, Open Graph/Twitter cards, robots configuration, and manual script injection for JSON-LD schemas |
app/layout.tsx

```tsx
<head>
  <script
    type="application/ld+json"
    dangerouslySetInnerHTML={{
      __html: JSON.stringify(generateSchemaMarkup()),
    }}
  />
  <script
    type="application/ld+json"
    dangerouslySetInnerHTML={{
      __html: JSON.stringify(generateFAQSchema()),
    }}
  />
</head>
```
Copilot AI (Dec 16, 2025):
Next.js automatically manages head elements in the app router layout through the metadata API. Manually adding a <head> element in the JSX can cause conflicts or duplicate head elements. Consider moving the JSON-LD scripts out of the manual <head>, or verify that this manual approach doesn't conflict with Next.js's automatic head management.
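One possible resolution, sketched below under the assumption that generateSchemaMarkup and generateFAQSchema are named exports of app/schema.ts: render the JSON-LD script tags inside the body and leave <head> entirely to the metadata API. Search engines accept application/ld+json anywhere in the document, so no manual <head> element is needed.

```tsx
// app/layout.tsx — sketch: JSON-LD rendered in the body, <head> left to the metadata API
import type { ReactNode } from "react";
import { generateSchemaMarkup, generateFAQSchema } from "./schema";

export default function RootLayout({ children }: { children: ReactNode }) {
  return (
    <html lang="en">
      <body>
        {/* JSON-LD is valid anywhere in the document, so these scripts can live in the body */}
        <script
          type="application/ld+json"
          dangerouslySetInnerHTML={{ __html: JSON.stringify(generateSchemaMarkup()) }}
        />
        <script
          type="application/ld+json"
          dangerouslySetInnerHTML={{ __html: JSON.stringify(generateFAQSchema()) }}
        />
        {children}
      </body>
    </html>
  );
}
```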
app/layout.tsx

```tsx
type: "website",
images: [
  {
    url: "/og-image.png",
```
Copilot AI (Dec 16, 2025):
The Open Graph image file '/og-image.png' referenced in the metadata does not exist in the public directory. This will cause broken image links when the page is shared on social media platforms. Either create the og-image.png file in the public directory with dimensions 1200x630, or remove the Open Graph image configuration.
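If the image is kept, the metadata could declare the asset and its dimensions explicitly. A sketch, assuming a 1200×630 image is added at public/og-image.png; this also covers the Twitter card flagged below.

```tsx
// app/layout.tsx (metadata excerpt) — sketch assuming public/og-image.png exists at 1200×630
import type { Metadata } from "next";

export const metadata: Metadata = {
  openGraph: {
    type: "website",
    images: [
      {
        url: "/og-image.png",
        width: 1200,
        height: 630,
        alt: "ReChrome - Download Old Chrome Versions",
      },
    ],
  },
  twitter: {
    card: "summary_large_image",
    images: ["/og-image.png"],
  },
};
```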
app/layout.tsx

```tsx
title: "ReChrome - Download Old Chrome Versions",
description: "Find and download any Chrome version you need. Complete Chrome version archive.",
creator: "@itzcodex24",
images: ["/og-image.png"],
```
Copilot AI (Dec 16, 2025):
The Twitter image file '/og-image.png' referenced in the metadata does not exist in the public directory. This will cause broken image links when the page is shared on Twitter/X. Either create the og-image.png file in the public directory, or remove the Twitter image configuration.
| "@type": "WebApplication", | ||
| "name": "ReChrome", | ||
| "description": "Download old Chrome versions for Windows, macOS, and Linux. Access the complete Chrome version history and archive.", | ||
| "url": "https://rechrome.vercel.app/", |
Copilot AI (Dec 16, 2025):
The schema markup URL is hardcoded to 'https://rechrome.vercel.app/' instead of using the dynamic base URL from the environment variable. This is inconsistent with sitemap.ts, which correctly uses the NEXT_PUBLIC_URL environment variable. Consider using the environment variable here as well for consistency across environments.
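A sketch of how the generator could derive both URLs from the environment variable; the fallback value and the exact export shape are assumptions, and the same change would resolve the sourceOrganization comment below.

```ts
// app/schema.ts (sketch) — one baseUrl shared by the WebApplication and Organization entries
const baseUrl = process.env.NEXT_PUBLIC_URL ?? "https://rechrome.vercel.app";

export function generateSchemaMarkup() {
  return {
    "@context": "https://schema.org",
    "@type": "WebApplication",
    name: "ReChrome",
    description:
      "Download old Chrome versions for Windows, macOS, and Linux. Access the complete Chrome version history and archive.",
    url: baseUrl,
    sourceOrganization: {
      "@type": "Organization",
      name: "ReChrome",
      url: baseUrl,
    },
  };
}
```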
| "sourceOrganization": { | ||
| "@type": "Organization", | ||
| "name": "ReChrome", | ||
| "url": "https://rechrome.vercel.app/" |
Copilot AI (Dec 16, 2025):
The sourceOrganization URL is hardcoded to 'https://rechrome.vercel.app/' instead of using a dynamic environment variable. This is inconsistent with sitemap.ts, which correctly uses the NEXT_PUBLIC_URL environment variable. Consider using the environment variable here as well for consistency across environments.
public/robots.txt

```
Disallow: /

User-agent: SemrushBot
Disallow: /
```
Copilot AI (Dec 16, 2025):
Blocking AhrefsBot and SemrushBot prevents these legitimate SEO audit tools from crawling your site. While they do consume bandwidth, these bots are commonly used by website owners for SEO analysis and competitive research. Consider allowing these bots with a reasonable crawl-delay instead of completely blocking them, unless there's a specific reason to prevent SEO auditing of the site.
Suggested change:

```diff
-Disallow: /
-User-agent: SemrushBot
-Disallow: /
+Allow: /
+Crawl-delay: 5
+User-agent: SemrushBot
+Allow: /
+Crawl-delay: 5
```