Content & AI Visibility Tools
🤖
AEO Checker Silver+
Technical audit for AI citation readiness. Checks AI bot access, FAQ/HowTo schema, structured data density, question coverage, and generates ready-to-deploy llms.txt and robots.txt files. Optional competitor comparison.
Answer Engine Optimisation
🏆
E-E-A-T Checker Silver+
Analyse Experience, Expertise, Authoritativeness and Trust signals. Each category has benchmarks so you know what "good" looks like, with effort labels on every fix.
Google Quality Signals
📖
Readability Checker Bronze+
Flesch score, grade level and sentence complexity — with specific long sentences flagged so you know exactly which lines to rewrite.
Content Quality Analysis
🔑
Keyword Analysis Bronze+
Keyword density, frequency and stuffing detection. Checks whether your primary keyword appears in title, H1 and meta description.
On-Page Keyword Density
📍
Google Business Profile Audit Silver+
Full GBP audit via live Google data — rating, reviews, completeness score, opening hours, photos, categories, Maps 3-pack position and competitor comparison. More data unlocked at Gold+.
Local SEO & GBP Signals
📄
LLMs.txt Auditor Silver+
Fetch, score and fix your llms.txt file against the llmstxt.org spec. Validates structure, links and content quality. Generates a corrected file using real page data. Optional competitor comparison.
AI Crawler Guidance
Why Content & AI Visibility Matters in 2026

Traditional SEO focused on ranking in Google's blue links. In 2026, a growing share of search traffic is answered directly by AI — ChatGPT, Perplexity, Google AI Overviews and Microsoft Copilot. These AI engines don't just rank pages — they cite sources. If your content doesn't meet AI citation criteria, you're invisible to a growing chunk of your audience.

🤖
AEO Score
Technical readiness for AI citation — bot access, schema, FAQ coverage, llms.txt
🏆
E-E-A-T
Google's trust framework — with benchmarks showing what a good score looks like
📖
Readability
Specific long sentences flagged — not just a score, but exactly what to fix
🔑
Keywords
Density, stuffing detection, and keyword presence in title / H1 / meta
ℹ️ What is AEO and how does this checker work?
Answer Engine Optimisation (AEO) — also called GEO (Generative Engine Optimisation) — is about making your site technically accessible and trustworthy enough for AI search tools to cite it.

What this tool checks: We fetch your page and analyse the real HTML — schema markup, robots.txt, structured data density, heading structure, and whether your content answers questions in a format AI engines can extract.

Important caveat: This is a technical readiness audit, not a live citation check. The engine cards show whether your site is technically optimised for each AI engine — not whether it is being cited right now. Think of it as "are the doors open?" rather than "are they visiting?"

Page vs domain: Enter a specific page URL to audit that page's schema and content. Enter your homepage URL to also check domain-level signals like robots.txt and llms.txt.
📄 What is llms.txt and where does it go?
llms.txt is a proposed open standard (inspired by robots.txt) that tells AI language models how to interact with your site. It lives at https://yoursite.com/llms.txt in your domain root.

What goes in it: Your site's purpose, key content areas, preferred citation name, and any sections you want AI to prioritise or avoid. Plain text format.

How to deploy:
WordPress: Upload via FTP/SFTP to your public_html root folder
Shopify: Settings → Files → Upload, then add URL redirect from /llms.txt
cPanel / shared hosting: File Manager → public_html → Upload
Vercel / Netlify: Place in the /public folder of your project
Custom server: Drop the file in your webroot, ensure it's served as text/plain

This tool generates a customised llms.txt based on your page content — just copy and upload.
❓ Why does FAQ schema matter so much for AEO?
AI search engines are fundamentally question-answering machines. When a user asks a question, the AI looks for pages that explicitly answer it — and FAQPage, HowTo and QAPage schema are direct signals that your content does exactly that.

Pages with FAQPage schema are consistently cited more often because:
• The question/answer structure maps directly to how AI extracts content
• It signals that the content was written to answer specific user queries
• Google may also display your FAQ as rich results in standard search

Quick implementation: Add 4–8 genuine questions about your topic with clear, direct answers. Wrap them in FAQPage JSON-LD. Use our Schema Builder tool to generate the code in 2 minutes.
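As a minimal sketch of what that JSON-LD wrapping looks like — the two Q&A pairs below are placeholders to replace with your own questions:

```json
{
  "@context": "https://schema.org",
  "@type": "FAQPage",
  "mainEntity": [
    {
      "@type": "Question",
      "name": "What is Answer Engine Optimisation?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "AEO makes your content technically accessible and easy for AI search tools to extract and cite."
      }
    },
    {
      "@type": "Question",
      "name": "Where does llms.txt live?",
      "acceptedAnswer": {
        "@type": "Answer",
        "text": "At your domain root, e.g. https://yoursite.com/llms.txt."
      }
    }
  ]
}
```

Paste the block into a script tag with type="application/ld+json" anywhere in the page's HTML.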
🚫 What do common AEO errors mean — and how do I fix them?
❌ AI bot access blocked — Your robots.txt is blocking GPTBot, PerplexityBot, ClaudeBot or Bingbot. These bots cannot crawl your page, so AI engines cannot cite it. Fix: add a User-agent: GPTBot block with Allow: / to your robots.txt. Robots.txt guide →
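For illustration, a robots.txt that explicitly allows the four AI crawlers this audit checks might look like this (the sitemap URL is a placeholder):

```text
User-agent: GPTBot
Allow: /

User-agent: PerplexityBot
Allow: /

User-agent: ClaudeBot
Allow: /

User-agent: Bingbot
Allow: /

Sitemap: https://yoursite.com/sitemap.xml
```

A bot not matched by any User-agent group falls back to the User-agent: * rules, so also check those for a blanket Disallow: / line.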

❌ No FAQPage schema — AI engines are question-answering machines. Without FAQPage or QAPage JSON-LD, your content is much harder for them to extract and cite. Fix: add 4–8 Q&A pairs wrapped in FAQPage schema. FAQPage schema guide →

❌ No llms.txt file — llms.txt tells AI language models how to interact with your site, what to prioritise, and how to credit you. Without it, AI engines make their own assumptions. Fix: create the file at your domain root and upload it. Use the generated file from this tool. llms.txt deployment guide →

❌ Low structured data density — Very few schema types detected relative to page length. AI engines rely on structured data to understand entities, topics and relationships on your page. Fix: add Article, BreadcrumbList, Organization and FAQPage schema at minimum. Structured data guide →

❌ Sitemap missing or not referenced in robots.txt — Without a sitemap, AI bots may not discover all your pages. Fix: generate an XML sitemap and add Sitemap: https://yoursite.com/sitemap.xml to your robots.txt. Sitemap guide →
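A minimal XML sitemap, following the sitemaps.org protocol, looks like this — the URL and date are placeholders:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://yoursite.com/</loc>
    <lastmod>2026-01-15</lastmod>
  </url>
</urlset>
```

Most CMSs and SEO plugins generate this automatically — the key step is referencing it from robots.txt so crawlers can find it.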

❌ Poor heading structure — AI engines use heading hierarchy (H1 → H2 → H3) to understand page structure. Missing H1 or illogical nesting makes content harder to extract. Fix: ensure one H1 per page, logical H2 sections, and keyword-relevant subheadings. Heading structure guide →
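For example, a logically nested outline in HTML (the headings are illustrative):

```html
<h1>Emergency Plumber in Chesterfield</h1>
<h2>How does an emergency call-out work?</h2>
<h3>Call-out fees</h3>
<h3>Response times</h3>
<h2>What areas do we cover?</h2>
```

One H1, H2s for the major sections, H3s only beneath the H2 they belong to — never skipping a level.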

❌ Low question coverage — Your page does not appear to answer specific user questions. AI engines look for explicit question–answer patterns. Fix: add a dedicated FAQ section or HowTo steps, and use question-form H2/H3 headings like "How does X work?" or "What is Y?". Question coverage guide →
✅ What do passing AEO checks mean — and how do I maintain them?
✅ AI bots confirmed — GPTBot, PerplexityBot and other AI crawlers can access your page. This is the baseline requirement for citation. Maintain it by reviewing robots.txt whenever you make changes — a single disallow rule can accidentally block all bots. Learn more →

✅ FAQPage schema present — Your Q&A pairs are machine-readable. Maintain this by keeping answers current and adding new questions as your topic evolves. Stale or inaccurate FAQs can hurt trust signals over time. Learn more →

✅ llms.txt found — AI engines have explicit guidance about your site. Review and update your llms.txt whenever you add major new content areas or change your site's purpose. Learn more →

✅ Sitemap found and valid — Your sitemap is discoverable. Maintain it by regenerating it when you publish new pages and keeping the lastmod dates accurate. Learn more →

✅ Strong structured data — Multiple schema types detected. Each new page type you add (product, recipe, event, how-to) should have matching schema. Use our Schema Builder to generate it in under 2 minutes. Structured data guide →
🎯 How to act on your AEO score — step by step
Step 1 — Fix bot access first. If GPTBot, PerplexityBot or ClaudeBot are blocked in your robots.txt, nothing else matters — they cannot see your page at all. Check the Bot Access table in your results and remove any blocking rules. Robots.txt guide →

Step 2 — Add FAQPage schema. This is the single highest-impact change for AEO. Write 4–8 genuine questions your audience asks and wrap them in FAQPage JSON-LD. Use our Schema Builder → to generate the code in 2 minutes. FAQPage guide →

Step 3 — Deploy your llms.txt. Use the generated file from this tool's results. Upload it to your domain root so AI engines know how to interact with your site. llms.txt deployment guide →

Step 4 — Use the optimised AI snippet. The snippet this tool generates is already structured for AI citation. Replace your opening paragraph or meta description with it to maximise extraction probability. Snippet guide →

Step 5 — Work through Improvements in impact order. The Improvements card in your results is already sorted by impact. Tackle HIGH items first — these move the score most. Each has a specific action attached. AEO improvements guide →

Step 6 — Re-audit after changes. Re-run this tool after implementing fixes to confirm your score has improved and no regressions have appeared.
Check Your AI Visibility Score

Enter any page URL. We'll fetch the real content and run a full technical AEO audit. Optionally add a competitor URL to compare scores side by side.

AEO Improvement Checklist
Work through these in order. Bot access must be confirmed before anything else matters. FAQPage schema has the highest single impact on AI citation probability.
🤖 Confirm AI bot access — check robots.txt allows GPTBot, PerplexityBot, ClaudeBot and Bingbot. The single hardest blocker to citation. ⚡ Easy Guide →
❓ Add FAQPage schema — 4–8 Q&A pairs in JSON-LD. Highest-impact schema type for AI citation probability. ⚡ Easy Guide →
📄 Deploy llms.txt — upload the generated file to your domain root. Gives AI engines explicit guidance about your site. ⚡ Easy Guide →
🗂 Add Article + Organization schema — tells AI engines who wrote the content and who publishes it. Supports trust signals alongside E-E-A-T. ⚡ Easy Guide →
✍️ Use the AI-optimised snippet — replace your opening paragraph with the generated snippet. Direct, question-answering format is what AI extracts first. ⚡ Easy Guide →
📋 Submit sitemap to Google & Bing — ensures all pages are discoverable. Reference it in robots.txt and resubmit after new content is published. ⚡ Easy Guide →
Run these next → 🏆 E-E-A-T Checker 📖 Readability 🗂 Add Schema Markup 🔍 Full Site Audit
What is E-E-A-T?
E-E-A-T stands for Experience, Expertise, Authoritativeness, Trust. Used by Google's Quality Raters to evaluate content quality. Critical for YMYL topics (health, finance, legal). Not a direct ranking signal, but it heavily influences which sites Google trusts to rank.
🎯 Experience First-hand experience — does the author have real-world experience with the topic?
Good: Personal case studies, photos, dates, specific anecdotes
Poor: Generic content that could apply to anyone
Benchmark: 2/3 checks = adequate · 3/3 = strong
📚 Expertise Knowledge depth — is the content written by someone with genuine expertise?
Good: Credentials, citations, technical depth, author schema
Poor: Shallow overviews, no author information
Benchmark: 2/3 checks = adequate · 3/3 = strong
🏆 Authoritativeness Recognition by others — is this site cited and trusted?
Good: Backlinks, mentions, social proof, reviews, awards
Poor: New site, no external citations, no reviews
Benchmark: 2/3 checks = adequate · 3/3 = excellent
🛡 Trust Accuracy and transparency — is the site trustworthy?
Good: HTTPS, privacy policy, contact page, security headers
Poor: Missing pages, no SSL, noindex errors
Benchmark: 3/4 checks = adequate · 4/4 = fully trusted
Analyse E-E-A-T Signals
Run these next → 🤖 AEO Checker 📖 Readability 🔍 Full Site Audit
Analyse Readability
🌐 From URL
📋 Paste Text
Flesch Reading Ease Guide
Score 0–100, calculated from average sentence length and average syllables per word. Higher = easier to read. Most web content should target 60–70. This directly affects dwell time, bounce rate and AI citation probability.
Score | Level | Typical Reader | SEO Target
0–30 | Very Difficult | University graduate | ❌ Too hard for most audiences
30–50 | Difficult | College level | ⚠️ Acceptable for academic/legal only
50–60 | Fairly Difficult | 10th–12th grade | ⚠️ OK for B2B, improve where possible
60–70 | Standard | 8th–9th grade | ✅ Ideal for most web content
70–80 | Fairly Easy | 7th grade | ✅ Great for blogs and landing pages
80–100 | Easy | 5th–6th grade | ✅ Best for broad consumer audiences
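The scores above come from a fixed formula: 206.835 − 1.015 × (words ÷ sentences) − 84.6 × (syllables ÷ words). A rough Python sketch — the syllable counter is a simple vowel-group heuristic, not the dictionary-based counting a production tool would use:

```python
import re

def count_syllables(word: str) -> int:
    """Approximate syllables as vowel groups, with a silent-e adjustment."""
    word = word.lower()
    groups = len(re.findall(r"[aeiouy]+", word))
    if word.endswith("e") and groups > 1 and not word.endswith(("le", "ee")):
        groups -= 1  # drop a typical silent final 'e'
    return max(groups, 1)

def flesch_reading_ease(text: str) -> float:
    """Flesch score: 206.835 - 1.015*(words/sentences) - 84.6*(syllables/words)."""
    sentences = [s for s in re.split(r"[.!?]+", text) if s.strip()]
    words = re.findall(r"[A-Za-z']+", text)
    if not sentences or not words:
        return 0.0
    syllables = sum(count_syllables(w) for w in words)
    return (206.835
            - 1.015 * (len(words) / len(sentences))
            - 84.6 * (syllables / len(words)))
```

Short words in short sentences score high; polysyllabic words in long sentences drag the score down fast, which is why jargon-heavy B2B copy so often lands below 50.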
Run these next → 🔑 Keyword Analysis 🏆 E-E-A-T Check 🤖 AEO Score
📊 What is keyword density and why does it matter?
Keyword density is the percentage of times a keyword appears relative to the total word count. Formula: (occurrences ÷ total words) × 100.

Why it matters: Google uses keyword frequency as a relevance signal — but only up to a point. Too low and the page lacks relevance. Too high and Google's spam systems may flag it.

Target range:
0.5–2.5% — Healthy. Appears naturally without forcing.
2.5–4% — Getting high. Diversify with synonyms and LSI terms.
4%+ — Stuffing risk. May trigger a penalty.
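The formula and thresholds can be sketched in a few lines of Python — a simple whitespace-token illustration, not the exact tokenisation this tool uses:

```python
def keyword_density(text: str, keyword: str) -> float:
    """Density = (occurrences ÷ total words) × 100."""
    words = text.lower().split()
    kw = keyword.lower().split()
    n = len(kw)
    if not words or not kw:
        return 0.0
    # Slide a window of len(kw) words to count phrase matches
    occurrences = sum(1 for i in range(len(words) - n + 1)
                      if words[i:i + n] == kw)
    return 100 * occurrences / len(words)

def density_verdict(pct: float) -> str:
    if pct >= 4.0:
        return "stuffing risk"
    if pct >= 2.5:
        return "getting high"
    if pct >= 0.5:
        return "healthy"
    return "too low"
```

For a 100-word page mentioning the keyword twice, density is 2.0% — inside the healthy band.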

Full keyword density guide in the Learning Hub →
⚠️ What is keyword stuffing and how does Google detect it?
Keyword stuffing is the practice of overloading a page with a keyword to manipulate rankings. Google's Spam Policy explicitly prohibits it and it can result in a manual action or demotion.

How Google detects it: Google's language models understand natural writing patterns. Unnatural repetition stands out clearly and is flagged as manipulation.

Common mistakes:
• Repeating the exact keyword phrase instead of using natural variations
• Hiding keywords in white text or off-screen elements
• Loading keywords into meta tags, alt text or footer links unnaturally

The fix: Use synonyms, related terms and LSI keywords. Write for the reader, not the algorithm.

Keyword stuffing penalty guide in the Learning Hub →
🎯 How to act on keyword analysis data — step by step
Step 1 — Fix stuffing first. Any keyword over 4% density should be reduced immediately. Replace exact-match repetitions with synonyms. Learn more →

Step 2 — Check primary keyword placement. Your most important keyword must appear in the title tag, H1 and ideally the meta description. Title tag guide →

Step 3 — Close keyword gaps. Run the tool with a competitor URL to see which terms they rank for that you are missing. Gap analysis guide →

Step 4 — Add LSI keywords. Related terms and semantic variations strengthen topical authority without risking over-optimisation. LSI keywords guide →

Step 5 — Re-audit after changes. Re-run this tool after updating your content to confirm density has moved into the healthy range.
Analyse Page Keywords
Keyword Density Guide
Keyword density is the percentage of times a keyword appears relative to total word count: (occurrences ÷ total words) × 100. Over-optimisation is penalised — use synonyms and related terms (LSI keywords) to diversify rather than repeating the exact phrase.

✅ 0.5–2.5% — Healthy. Appears naturally without forcing.
⚠️ 2.5–4% — Getting high. Use synonyms and LSI keywords instead of repeating the exact phrase.
❌ 4%+ — Keyword stuffing risk. Google may flag this as manipulative. Diversify your language.
Run these next → 📖 Readability 🤖 AEO Score 🏷 Meta Analyser
ℹ️ What does the Google Maps Audit check?
Google Maps Audit pulls live Maps SERP data for any keyword and location — showing you exactly who ranks, why they rank, and what you need to do to outrank them.

Keyword search mode: Enter a search term (e.g. "web designer") and location to see the full Maps rankings for that query — ranked list of all businesses, ratings, reviews, photos, claimed status, hours and categories.

Business name mode: Add your business name to find exactly where you appear in the results and compare yourself directly against the top 3.

Silver+ checks: Top 10 Maps results · Ranked list with full business data · 3-pack gap analysis · Review/rating/photos comparison · Hours & claimed status audit · Competitor categories

Gold+ additions: Top 20 results · Local Finder results · Competitor chart · Full data for all results

Platinum+: Popular times data per business
🎯 How to use this tool
Step 1 — Enter your keyword. This is what your customers search in Google Maps. Use your primary category: "web designer", "plumber", "pizza restaurant". Be specific — "emergency plumber" returns different results to "plumber".

Step 2 — Enter location as City,Region,Country. Example: Chesterfield,England,United Kingdom

Step 3 — Optionally add your business name. This finds your position in the results and generates a personalised gap analysis — how many more reviews and what rating you need to reach position 1.

Step 4 — Read the ranked list. Every business in the top 10 (Silver) or top 20 (Gold+) is shown with rating, review count, photos, claimed status, hours and category. Look for patterns in what the top 3 have that you don't.

Step 5 — Use the competitor chart. The bar chart shows review counts for all ranked businesses at a glance. Your bar is highlighted in cyan — the gap between your bar and the top bars is your review target.
📊 Why Maps ranking matters for local businesses
The Google Maps 3-pack appears above all organic results and captures the majority of local search clicks. Studies consistently show the top 3 local listings receive over 75% of all clicks on local search results pages.

Google's three ranking factors:
Relevance — does your category and description match the search query?
Distance — how close is your business to the searcher?
Prominence — reviews, rating, photos, links, mentions. This is what you can actively improve.

What this audit tells you: It shows you the exact businesses Google currently ranks above you, what their prominence signals look like (reviews, rating, photos, claimed status), and the precise gap you need to close. No guesswork — real live data.
Search Google Maps Rankings
ℹ️ Enter the keyword your customers search, your location, and optionally your business name to find your position and get a personalised gap analysis.
Related: 📍 GBP Audit 📍 Local SEO Checker 🤖 AEO Checker
ℹ️ What does this tool check and how does it work?
Google Business Profile (GBP) — formerly Google My Business — is the single most important local SEO asset for any business with a physical location or service area. It directly controls how you appear in Google Maps, the Local 3-pack and AI-powered local results.

How it works: Enter your business name and location. We pull live public data directly from Google — the same data Google displays in Maps and Search. No Google login required.

What we check (Silver+): Profile completeness score · business name · address · phone · website · primary & secondary categories · category IDs · opening hours with 24-hour handling · current open/closed status · business description · photos count · claimed status · book online URL · contact URL · domain · contributor URL · place ID · CID Maps link · service attributes · ratings with star distribution · review response rate · individual reviews with owner reply tracking.

Gold+ additions: Maps 3-pack position, rating gap vs top 3, review count gap, completeness gap, gap analysis chart, popular times heatmap by day, Q&A audit with unanswered question detection.

Platinum+ additions: Full 100-review history with detailed response rate analysis.
🚫 Common GBP errors and what they mean
❌ Opening Hours missing — Your GBP listing has no hours set. According to Google, businesses with hours are significantly more likely to appear in local search results. Customers actively filter by businesses that are open now. Fix: Go to your Google Business dashboard → Edit profile → Hours. If you're open 24 hours, select "Open 24 hours". Learn more →

❌ No photos uploaded — Your listing has zero photos. Google's own data shows businesses with photos receive 42% more requests for directions and 35% more website clicks than those without. Fix: Add at minimum a logo, cover photo and one interior photo. Photos must be JPG or PNG, at least 720×720px, between 10KB and 5MB. Learn more →

❌ Profile not claimed — This listing exists on Google but has not been claimed by the business owner. An unclaimed profile cannot be edited, updated or optimised. Fix: Search for your business on Google Maps, click "Claim this business" and complete Google's verification process. Verification typically takes 1–5 business days by postcard. Learn more →

❌ No business description — Your profile has no "From the business" description. This is free text you control and it appears prominently in your listing. Google uses it for relevance. Fix: Go to Edit profile → Add business description. Keep it under 750 characters, describe what you do, who you serve and what makes you different. Do not include URLs or promotional language like "best" or "#1". Learn more →

❌ No book online URL — Your profile has no booking link. Businesses with booking links get significantly higher click-through rates because customers can take action immediately from Maps. Fix: Go to Edit profile → Booking → add your booking or appointment URL. Works with most booking platforms. Learn more →

❌ No reviews yet — New listings with no reviews have very low visibility in Maps results. Google's ranking algorithm weighs review count heavily under "Prominence". Fix: Ask every satisfied customer for a review using your short review link from GBP dashboard → Get more reviews. Learn more →

❌ Low review response rate — You're not responding to reviews. Google treats review responses as an active management signal and it directly affects ranking. Fix: Respond to every review within 24 hours, especially negative ones. Be specific and professional — never copy-paste generic responses. Learn more →
✅ What passing checks mean and how to maintain them
✅ Profile claimed and verified — You own and control this listing. Maintain it by keeping your business information current — Google penalises listings with outdated information over time. Learn more →

✅ Hours set — Customers and Google can see when you're open. Maintain this by setting special hours for bank holidays and seasonal changes. Businesses that don't update holiday hours can be marked as "Permanently Closed" by Google users. Learn more →

✅ Photos present — Your listing is visually complete. Maintain by adding new photos regularly. Google prioritises recently updated profiles. Aim for at least one new photo per month. Learn more →

✅ Good review response rate — You're actively managing your reputation. Maintain by responding to every new review promptly. Set a reminder to check GBP weekly. Learn more →

✅ In the Maps 3-pack — You're visible to searchers without them scrolling. Maintain this position by continuing to build reviews, keeping your profile 100% complete, and adding GBP posts weekly to signal activity. Learn more →
🎯 How to use this tool — Grabzies example
Step 1 — Enter your business name exactly as it appears on Google Maps.
Example: Grabzies — not "Grabzies Ltd" or "Grabzies Web Design" unless that's the exact Maps listing name.

Step 2 — Enter location as City,Region,Country.
Example: Chesterfield,England,United Kingdom
Spaces around commas are fine — we clean them automatically. Do not enter a full street address.

Step 3 — Read your completeness score.
Grabzies scores 80/100 — good overall. Missing items: business description and book online URL. Each gap is listed below the profile card with a direct link to fix it in your GBP dashboard.

Step 4 — Check the review response rate.
Grabzies has no reviews yet. The first 5 reviews are the highest priority action for any new GBP listing — they unlock the star rating display and significantly boost visibility.

Step 5 — Check attributes.
The attributes section shows what services Google has recorded for your business (delivery, wifi, outdoor seating etc). If attributes are missing, add them in GBP → Edit profile → More → Attributes.

Step 6 (Gold+) — Use the gap analysis.
The gap analysis compares your review count and rating against the current top 3 businesses for your category in your location. If you need 17 more reviews to reach the top 3 average — that's your target. The popular times chart shows when your business is busiest, so you can time review requests and posts for peak activity.
📸 Why photos matter — Google's own guidelines
According to Google's Business Profile guidelines, businesses with photos receive 42% more requests for directions and 35% more website clicks than businesses without photos.

Photo types Google recommends:
Logo — helps customers recognise your business. Appears in Maps listings. Google recommends a square image, at least 720×720px.
Cover photo — the main image at the top of your profile. Should best represent your business overall.
Business photos — storefront exterior, interior, team, products and services. Google says exterior photos help customers recognise your business when they visit.

Google's technical requirements:
Format: JPG or PNG · Size: 10KB–5MB · Minimum resolution: 250×250px · Recommended: 720×720px or larger · No excessive filters or AI alterations — images must represent reality.

For Grabzies: Add a logo, a photo of the shopfront, photos of your work/services, and a team photo. Even 5–10 good photos will significantly improve your Maps visibility before you have any reviews.

Note: After upload, photos go through Google review. It can take up to 24–48 hours before they appear on your profile. Your business must be verified for photos to show publicly. Photos guide →
⭐ How to get more Google reviews — what actually works
Reviews are Google's primary "Prominence" signal in local ranking. More reviews, higher average rating = higher Maps position. Here's what works:

1. Get your review link. Go to your GBP dashboard → Get more reviews → copy the short link. Put it in every email signature, invoice footer, WhatsApp reply and receipt. Make it as easy as possible.

2. Ask immediately after a positive interaction. The best time to ask is right after a customer expresses satisfaction — in person, by phone or by email. Timing matters enormously. A same-day request converts 3–5× better than a request sent a week later.

3. Email follow-up. Send a simple email 24 hours after a job is complete: "Hope you're happy with [service]. A 60-second Google review helps us enormously: [link]." This converts at 10–15% with warm customers.

4. Respond to every review — especially negative ones. According to Google, responding to reviews shows you value your customers. Specific, professional responses to negative reviews often convert undecided customers better than 5-star reviews alone. Never use generic "Thank you for your review!" — be specific about what they mentioned.

5. Never offer incentives. Google's review policy explicitly prohibits offering discounts, gifts or payments in exchange for reviews. This can result in your listing being suspended. You can ask — you cannot pay.

For Grabzies: Start with past customers you know were happy with your web design work. A personal message asking for a review converts far better than a mass email. Reviews guide →
🗺️ How the Maps 3-pack works and how to get in it
The Maps 3-pack shows the top 3 local businesses for a search query. It appears above organic results and captures the majority of local search clicks. Google uses three factors to determine who appears:

Relevance — Does your GBP category and description match the search query? This is why choosing the most specific primary category is critical. "Web Designer" ranks differently to "Website Designer" or "Digital Marketing Agency".

Distance — How close is your business to the searcher? You can't change your physical location, but you can ensure your address is 100% accurate in GBP.

Prominence — How well-known is your business? This includes reviews and rating (the biggest variable you control), links from other websites, mentions in directories, and overall online presence. This is where most businesses can make the biggest gains.

What the Gold+ gap analysis shows: The gap analysis compares your review count and rating against the current top 3 businesses for your category in your exact location. If the top 3 average 17 reviews and you have 0, you need 17 more to match. If they average 5.0★ and you have 4.2★, you have a rating gap to close. These are your concrete targets.

Other ranking factors you control:
• Profile completeness — every empty field is a missed signal
• Review response rate — signals active management
• Consistent NAP (Name, Address, Phone) across all directories
• GBP posts — regular posting signals an active business
• LocalBusiness schema on your website — use our Local SEO Checker to audit this
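As an illustrative sketch of that LocalBusiness JSON-LD — the Grabzies details are placeholders (the phone number and URL in particular are invented for the example):

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Grabzies",
  "url": "https://yoursite.com",
  "telephone": "+44 1246 000000",
  "address": {
    "@type": "PostalAddress",
    "addressLocality": "Chesterfield",
    "addressRegion": "England",
    "addressCountry": "GB"
  }
}
```

The name, address and phone here should match your GBP listing character-for-character — this is the same NAP consistency Google cross-references across directories.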

3-pack ranking guide →
🏷️ What are GBP attributes and why do they matter?
Attributes are factual details about your business that customers can filter by in Google Maps. Examples include Wi-Fi, outdoor seating, delivery, wheelchair accessible entrance, online appointments, LGBTQ+ friendly, women-led, and more.

Two types of attributes:
Editable attributes — factual things you set yourself, like "Has Wi-Fi" or "Offers delivery". These appear on your profile immediately after you set them.
Subjective attributes — things like "Popular with locals" or "Cosy atmosphere" — set by Google based on customer reviews and visits. You cannot edit these directly.

Available attributes vary by business category. A restaurant gets delivery/dine-in/takeaway options. A service business gets online appointments/same-day service options. A retailer gets in-store shopping options.

How to edit attributes: GBP dashboard → Edit profile → More → Attributes. Google says attribute reviews usually take about 10 minutes but can take up to 30 days.

Why they matter for ranking: Customers searching for "web designer with same-day service near me" or "accessible website designer Chesterfield" will see your listing filtered by attributes. Missing attributes = missing customers. Attributes guide →
⏰ Understanding your popular times data
Popular times shows when your business receives the most visits, calls and website clicks — broken down by hour and day of week. Google collects this from anonymised location data of users who have visited your business.

How to use popular times for SEO and marketing:
Time your review requests — ask for reviews immediately after your peak hours when customers are most engaged with your business
Schedule GBP posts — post offers and updates just before your busiest periods to catch customers when they're actively thinking about your business
Plan staffing — popular times data is more accurate than guesswork for scheduling
Identify slow periods — low activity hours are opportunities for targeted promotions

Note: Popular times data is only available for businesses with sufficient visit history. New businesses or those with very low footfall may not have this data. It updates over time as more visits are recorded.

The chart below (Gold+) shows your popular times as a heatmap — the darker the colour, the busier that hour. Popular times guide →
📋 NAP consistency — what it is and why it matters
NAP stands for Name, Address, Phone. Google cross-references your NAP data across your GBP listing, your website and third-party directories (Yell, Yelp, Bing Places, Apple Maps etc). Inconsistencies confuse Google about your business identity and reduce your local ranking.

Common NAP mistakes:
• "Grabzies" in GBP vs "Grabzies Ltd" on your website
• "Sheepbridge Lane" vs "Sheepbridge Ln" vs "Sheepbridge Lane, Chesterfield"
• Different phone numbers on different platforms
• Old address still on directory sites after a move

How to fix NAP inconsistencies:
1. Decide on your canonical NAP — exactly how your name, address and phone should appear everywhere
2. Update your GBP listing first — this is Google's primary source
3. Update your website footer, contact page and schema markup
4. Search "[business name] [town]" and update every directory listing you find
5. Use our Local SEO Checker to audit your website's LocalBusiness schema

Our Local SEO Checker (on the Audit Tools page) specifically checks whether your on-page NAP matches your schema markup — run that alongside this GBP audit for a complete local SEO picture. NAP guide →
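Once you've settled on your canonical NAP, mirror it exactly in your LocalBusiness schema markup so Google's structured-data view matches your visible footer. A minimal sketch — the business name here reuses the example above, and the street number, phone and URL are placeholders to swap for your own details:

```json
{
  "@context": "https://schema.org",
  "@type": "LocalBusiness",
  "name": "Grabzies Ltd",
  "address": {
    "@type": "PostalAddress",
    "streetAddress": "1 Sheepbridge Lane",
    "addressLocality": "Chesterfield",
    "addressRegion": "Derbyshire",
    "addressCountry": "GB"
  },
  "telephone": "+44 1246 000000",
  "url": "https://example.com"
}
```

Every field should match your GBP listing character-for-character — "Lane" not "Ln" if that's what your canonical NAP uses.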
Audit Your Google Business Profile
ℹ️ Enter your business name exactly as it appears on Google Maps, and location as City,Region,Country — e.g. Chesterfield,England,United Kingdom
Related: 🤖 AEO Checker 🏆 E-E-A-T Checker 📍 Local SEO Checker
📄 What is llms.txt and why does it matter?
llms.txt is a plain-text Markdown file at yoursite.com/llms.txt. Proposed by Jeremy Howard of Answer.AI in September 2024, it guides AI language models to your most important content — similar to how robots.txt guides search crawlers, except that where robots.txt sets restrictions, llms.txt is a curated content guide.

The spec (llmstxt.org) requires:
• First line must be # Your Site Name (H1, nothing before it)
• A blockquote summary: > Brief description of your site
• Sections using ## H2 headings
• Links in format: - [Page Title](url): description
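Putting those four rules together, a minimal spec-compliant file looks like this (site name, sections and URLs are illustrative):

```markdown
# Example Site

> Brief description of what the site offers and who it's for.

## Key Pages

- [Pricing](https://example.com/pricing): plans and feature comparison
- [Docs](https://example.com/docs): setup guides and API reference

## About

- [About us](https://example.com/about): company background and team
```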

Where to deploy it: Upload to your web root so it's accessible at https://yourdomain.com/llms.txt.
Apache/Nginx: place the file in your document root (e.g. /var/www/html/).
WordPress: upload via FTP or a file-manager plugin to the site root — the Media library stores files under /wp-content/uploads/, not the root. You can also point AI crawlers at it from robots.txt with a Sitemap: https://yoursite.com/llms.txt line, though this is a nonstandard hint rather than part of either spec.

Full llms.txt implementation guide →
🚫 What do common llms.txt errors mean — and how do I fix them?
❌ First line is not # Title — The spec requires line 1 to be an H1 heading with nothing before it. Even a blank line breaks compliance. Fix: make # Your Site Name the very first line. Spec guide →

❌ No blockquote summary — The > description blockquote on line 2 or 3 is required — it's what AI models read first to understand your site. Fix: add > A site providing expert SEO tools for digital marketers. Spec guide →

❌ No ## H2 sections — Sections organise your content for AI parsing. Without them the file is an unstructured block of text. Fix: add ## Key Pages, ## Docs, ## About etc. Spec guide →

❌ Links not in spec format — Plain URLs don't give AI models context. The spec requires - [Title](url): description. Fix: reformat every link with a title and brief description of what the page covers. Spec guide →

❌ Links point to wrong domain — External links in your llms.txt confuse AI crawlers about what your site actually covers. Fix: all links should point to your own domain only. Links guide →

❌ X-Robots-Tag conflict — Your server is sending X-Robots-Tag: noindex while your llms.txt is trying to guide AI crawlers. These are contradictory signals. Fix: remove noindex from the X-Robots-Tag header if you want AI indexing. Robots guide →
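The first four checks above are mechanical enough to script yourself. A simplified Python sketch of the same rules — an illustration, not our checker's actual implementation:

```python
import re

def lint_llms_txt(text: str) -> list[str]:
    """Return a list of spec problems found in an llms.txt body."""
    errors = []
    lines = text.splitlines()

    # Rule 1: the very first line must be an H1 — even a leading blank line fails.
    if not lines or not lines[0].startswith("# "):
        errors.append("First line is not # Title")

    # Rule 2: a blockquote summary should appear near the top of the file.
    if not any(line.startswith("> ") for line in lines[:4]):
        errors.append("No blockquote summary")

    # Rule 3: at least one ## H2 section to organise the content.
    if not any(line.startswith("## ") for line in lines):
        errors.append("No ## H2 sections")

    # Rule 4: list links must match "- [Title](url): description".
    link_pattern = re.compile(r"^- \[[^\]]+\]\([^)]+\): .+")
    for line in lines:
        if line.startswith("- ") and not link_pattern.match(line):
            errors.append(f"Link not in spec format: {line!r}")

    return errors
```

A compliant file returns an empty list; each violation appears as a human-readable error string you can surface directly in a report.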
✅ What do passing checks mean — and how do I maintain them?
✅ File exists and returns 200 — Your llms.txt is live and publicly accessible. Maintain it by not accidentally blocking the path in .htaccess or your CDN. Learn more →

✅ Correct H1 first line — AI models correctly identify your site name. Update it only if your brand name changes. Learn more →

✅ Blockquote summary present — AI models can immediately understand your site purpose. Keep the description accurate as your site evolves. Learn more →

✅ Spec-compliant links — Your pages are properly described and linkable by AI crawlers. Update the file whenever you add major new content areas or restructure your site. Learn more →

✅ llms-full.txt present — You have the extended version providing full site context. Keep it in sync with llms.txt. Learn more →
Audit Your llms.txt
Related: 🤖 AEO Checker 🏆 E-E-A-T Checker 🤖 Robots & Sitemap