Tools in This Collection
Meta Tag Generator
Generate HTML meta tags for SEO, Open Graph, and Twitter Cards with live preview
Robots.txt Generator
Build robots.txt files visually with user-agent rules, sitemaps, and crawl-delay settings
Robots.txt Tester
Test robots.txt rules against any URL to verify what Googlebot and other crawlers can access
URL Slug Generator
Convert any title or phrase into a clean, SEO-friendly, hyphenated URL slug
JSON-LD Schema Generator
Generate Schema.org JSON-LD structured data for Articles, Products, FAQs, and Events
CORS Header Generator
Generate correct CORS response headers for nginx, Apache, Express, and Flask configurations
.htaccess Generator
Generate .htaccess files with HTTPS redirects, GZIP compression, caching, and IP blocking
HTTP Cache Header Builder
Build Cache-Control headers by resource type with strategy, directives, and ready-to-use code
OG Image Generator
Create properly sized Open Graph images (1200x630px) for social media sharing
Guides & Articles
Web Development and SEO Workflow
SEO and web performance both depend on a set of configuration files and HTML elements that are easy to get wrong. Missing meta tags reduce click-through rates, an incorrect robots.txt can block search engines entirely, and misconfigured CORS headers break cross-origin requests. The tools in this collection generate correct output from a form interface.
Meta Tags: The Complete Required Set
A fully SEO-optimized page needs: title (under 60 chars), meta description (under 160 chars), canonical URL, plus a full Open Graph block (og:title, og:description, og:image, og:url, og:type) for social sharing. The Meta Tag Generator outputs all of these in the correct format with character counting and live preview of what the Google result and social card will look like.
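As a sketch, the complete set for one page looks like the following (the domain, image URL, and copy are placeholders):

```html
<!-- Core SEO tags -->
<title>Example Page Title, Under 60 Characters</title>
<meta name="description" content="A concise summary of the page, kept under 160 characters.">
<link rel="canonical" href="https://example.com/page">

<!-- Open Graph block for social sharing -->
<meta property="og:title" content="Example Page Title">
<meta property="og:description" content="A concise summary of the page.">
<meta property="og:image" content="https://example.com/og-image.png">
<meta property="og:url" content="https://example.com/page">
<meta property="og:type" content="article">
```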
Robots.txt and Crawler Control
Robots.txt tells search engine crawlers which pages they may access. Common use cases: blocking admin areas, preventing crawling of filtered and sorted URLs (which create duplicate content), and blocking AI training crawlers (GPTBot, CCBot, Google-Extended, anthropic-ai). The Robots.txt Generator produces properly formatted files, and the Robots.txt Tester verifies that your existing file actually blocks the URLs you intend to block, which matters because robots.txt syntax errors are easy to make.
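The same kind of verification can be scripted with Python's standard-library robots.txt parser. A minimal sketch, using a made-up rule set rather than a live file:

```python
from urllib.robotparser import RobotFileParser

# Hypothetical robots.txt content for illustration
rules = """\
User-agent: *
Disallow: /admin/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# Googlebot falls under the "*" group, so /admin/ is blocked
print(parser.can_fetch("Googlebot", "https://example.com/admin/login"))  # False
print(parser.can_fetch("Googlebot", "https://example.com/blog/post"))    # True
```

This checks path rules only; it does not confirm that the live file is reachable at /robots.txt, which the hosted tester also verifies.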
Structured Data for Rich Results
JSON-LD structured data (Schema.org) enables rich results in Google Search: FAQ carousels, product pricing, event times, and article bylines. Incorrect JSON-LD either fails silently or generates Search Console errors. The JSON-LD Generator produces valid markup for the most common schema types — Article, Product, FAQ, Event, and LocalBusiness.
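For illustration, a minimal FAQ block embedded in a page's head or body (the question and answer text are placeholders):

```html
<script type="application/ld+json">
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [{
    "@type": "Question",
    "name": "What is JSON-LD?",
    "acceptedAnswer": {
      "@type": "Answer",
      "text": "A machine-readable format that describes the content of a page to search engines."
    }
  }]
}
</script>
```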
CORS and HTTP Headers
CORS errors (blocked cross-origin requests) and caching misconfiguration are two of the most common web performance and API issues. The CORS Header Generator outputs the correct response headers for your specific cross-origin policy. The HTTP Cache Header Builder generates the right Cache-Control directives based on resource type (static assets, API responses, HTML pages) and your caching strategy.
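The generated output for nginx looks roughly like this sketch, where the allowed origin, path, methods, and headers are placeholder policy choices:

```nginx
# Illustrative nginx location block for an API under a single-origin CORS policy
location /api/ {
    add_header Access-Control-Allow-Origin "https://app.example.com" always;
    add_header Access-Control-Allow-Methods "GET, POST, OPTIONS" always;
    add_header Access-Control-Allow-Headers "Content-Type, Authorization" always;

    # Answer preflight requests without hitting the backend
    if ($request_method = OPTIONS) {
        return 204;
    }
}
```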
Frequently Asked Questions
What meta tags does every page need for SEO?
Minimum required: title (under 60 chars), meta description (under 160 chars), canonical URL. For social sharing: og:title, og:description, og:image (1200x630px), og:url, og:type. For Twitter: twitter:card, twitter:title, twitter:description, twitter:image. The Meta Tag Generator outputs the complete set in one go.
How do I block AI scrapers in robots.txt?
Add a separate User-agent block for each known AI training crawler (GPTBot, CCBot, Google-Extended, anthropic-ai, Bytespider, Meta-ExternalAgent), each followed by Disallow: / to block site-wide access. Note: robots.txt is advisory, and disreputable scrapers may ignore it. The Robots.txt Generator includes these blocks in its output.
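For example, the blocks for the first three crawlers look like:

```txt
User-agent: GPTBot
Disallow: /

User-agent: CCBot
Disallow: /

User-agent: Google-Extended
Disallow: /
```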
What is JSON-LD and how does it affect SEO?
JSON-LD is a machine-readable format that tells search engines what type of content is on your page. Adding FAQ JSON-LD to a page can trigger the FAQ rich result in Google Search (expandable Q&A directly in search results). Product markup enables pricing, availability, and review stars. Article markup enables Top Stories integration for news content.
What causes CORS errors?
CORS errors occur when a browser makes a request to a domain other than the page's origin (a cross-origin request) and the server does not include the required Access-Control-Allow-Origin response header. APIs called from frontend JavaScript are the most common trigger. The CORS Header Generator outputs the correct server-side headers for your specific allowlisting policy.
What Cache-Control settings should I use for static assets?
For content-hashed static assets (CSS and JS with a hash in the filename): Cache-Control: public, max-age=31536000, immutable. These files never change at a given URL, so they can be cached indefinitely. For HTML pages: Cache-Control: public, max-age=0, must-revalidate. For API responses that change: Cache-Control: no-store. The HTTP Cache Header Builder generates the right directive for each resource type.
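Mapped onto an nginx config, those three policies look roughly like this sketch (the /assets/ and /api/ paths are placeholder conventions):

```nginx
# Content-hashed static assets: safe to cache for a year
location /assets/ {
    add_header Cache-Control "public, max-age=31536000, immutable";
}

# Mutable API responses: never cache
location /api/ {
    add_header Cache-Control "no-store";
}

# HTML pages: always revalidate with the server
location / {
    add_header Cache-Control "public, max-age=0, must-revalidate";
}
```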