What a Good Website Audit Should Include

If you own or manage a small-business website, a good audit should answer one practical question: what should be fixed first, and why?

A useful review does not begin with a list of favorite tactics. It begins with the job the site is supposed to do, then tests that job against evidence from tools such as Google Analytics 4, Google Search Console, PageSpeed Insights, Lighthouse, Screaming Frog SEO Spider, and form or CRM records. Recommendations should come only after goals, analytics, search visibility, content, conversion paths, technical health, accessibility, and user experience have been reviewed together.

In short, a good website audit includes the business goal, measurement setup, search and crawl evidence, performance data, content quality, conversion paths, accessibility, structured data, and a prioritized action plan. It should explain not just what is wrong, but which fixes are most likely to improve leads, sales, support, or qualified traffic.

  • Business goal and primary conversion action
  • Analytics, events, and tracking gaps
  • Search visibility, indexing, and crawl health
  • Page speed, Core Web Vitals, and mobile experience
  • Content fit against real searcher questions
  • Forms, calls to action, checkout, and follow-up paths
  • Accessibility, structured data, and technical cleanup

Clarify The Business Goal

The same site can pass or fail for different reasons depending on its purpose. A local HVAC company may need phone calls and service-area landing pages. A B2B SaaS site may need demo requests, pricing-page engagement, and documentation visits. An ecommerce store may need product-detail views, add-to-cart events, checkout completion, and repeat purchase signals. Name the primary goal before scoring pages.

Write the goal in a measurable sentence: “Increase qualified consultation requests from organic search,” “reduce checkout drop-off on mobile,” or “help existing customers find support without opening a ticket.” Then map that goal to one or two events in Google Analytics 4[1], such as form submissions, phone-link clicks, file downloads, trial starts, or checkout steps. If those events are not configured, the first recommendation is measurement, not redesign.

This step also prevents false positives. A blog post with a low conversion rate may be doing its job if it introduces the brand to search visitors near the top of the funnel. A service page with heavy traffic but no leads is a stronger candidate for copy, trust, speed, or form work. In real audits, this is one of the most common misreads: teams chase the page with the most sessions instead of the page closest to revenue.

Review Evidence Before Opinions

Evidence should come before design opinions. Start with Google Search Console for queries, impressions, clicks, indexing status, and page experience signals. Use Google Search Essentials[2] to check whether the site can be crawled, indexed, and understood. Use PageSpeed Insights[3] or Lighthouse for lab and field performance signals, then compare those findings with actual conversions in GA4.

Core Web Vitals give the review concrete performance thresholds. According to web.dev[4], a good Largest Contentful Paint is under 2.5 seconds, a good Interaction to Next Paint is under 200 milliseconds, and a good Cumulative Layout Shift score is under 0.1. Google also states that Interaction to Next Paint replaced First Input Delay as a Core Web Vital on March 12, 2024. If a key landing page misses those marks on mobile, name the affected template, the metric, and the likely cause, such as oversized hero images, render-blocking scripts, or unstable ad slots.
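A small helper can turn those thresholds into a pass/fail check per page. The threshold values below come from web.dev; the function and dictionary names are illustrative, not a real API.

```python
# Flag Core Web Vitals field metrics that miss the "good" thresholds
# published on web.dev (LCP under 2.5 s, INP under 200 ms, CLS under 0.1).
# Names here are illustrative, not part of any measurement tool's API.

GOOD_THRESHOLDS = {
    "lcp_seconds": 2.5,   # Largest Contentful Paint
    "inp_ms": 200,        # Interaction to Next Paint
    "cls": 0.1,           # Cumulative Layout Shift
}

def flag_core_web_vitals(metrics: dict) -> list[str]:
    """Return the names of metrics that exceed the 'good' threshold."""
    return [name for name, limit in GOOD_THRESHOLDS.items()
            if metrics.get(name, 0) > limit]

# A page with a 3.4-second mobile LCP fails on that metric alone.
print(flag_core_web_vitals({"lcp_seconds": 3.4, "inp_ms": 180, "cls": 0.05}))
# → ['lcp_seconds']
```

Running this per template, rather than per URL, keeps the report short: one oversized hero image usually fails every page built on the same layout.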

Technical checks should be specific enough that a developer can act on them. Crawl the site with a tool such as Screaming Frog SEO Spider[5], then inspect indexable URLs, canonical tags, title duplication, 404 responses, noindex tags, and redirect chains. Google’s redirect documentation says Googlebot follows up to 10 redirect hops before Search Console may report a redirect error, but a clean review should still prefer one direct hop for users and crawlers when a URL has permanently moved.
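The redirect-chain check can be sketched over rows exported from a crawl. The mapping and function below are illustrative stand-ins for crawler output, not a Screaming Frog API; the 10-hop cap mirrors Google's documented limit for Googlebot.

```python
# Count redirect hops for a URL using crawl-export data. Chains longer
# than one hop are cleanup candidates even though Googlebot follows up
# to 10. The `redirects` mapping stands in for crawler rows.

def redirect_hops(url: str, redirects: dict[str, str], max_hops: int = 10) -> int:
    """Follow url through the redirect map and return the hop count."""
    hops = 0
    seen = set()
    while url in redirects and hops < max_hops:
        if url in seen:          # redirect loop: stop counting
            break
        seen.add(url)
        url = redirects[url]
        hops += 1
    return hops

chain = {
    "/old-service": "/services-2023",
    "/services-2023": "/services",   # two hops: should be one direct 301
}
print(redirect_hops("/old-service", chain))   # → 2
```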

  • If a service page gets 1,000 organic sessions in GA4 but only 2 form submissions, review the offer, proof, form placement, and mobile load experience before writing more blog posts.
  • If the XML sitemap, robots.txt file, canonical tags, and indexed pages disagree, compare them with Google Search Central sitemap guidance[6] and the site’s actual crawl results.
  • If Search Console queries mention price, availability, location, support, or proof, check whether the page copy, FAQs, headings, and internal links answer those questions directly.
  • If a page ranks but does not convert, separate intent mismatch from page-quality problems before changing the content.
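The first check in that list is easy to automate across every landing page at once. The rows below stand in for sessions and conversions joined from GA4 exports; the field names and the 0.5% rate threshold are illustrative choices, not a GA4 API.

```python
# Flag pages with meaningful organic traffic but very low conversion.
# Rows are (page, sessions, conversions) joined from analytics exports;
# the traffic floor and rate threshold are illustrative, tune per site.

def flag_leaky_pages(rows, min_sessions=500, min_rate=0.005):
    """Return (page, conversion_rate) for pages that draw qualified
    traffic but convert below the threshold."""
    flagged = []
    for page, sessions, conversions in rows:
        if sessions >= min_sessions:
            rate = conversions / sessions
            if rate < min_rate:
                flagged.append((page, round(rate, 4)))
    return flagged

rows = [
    ("/roof-repair", 1000, 2),        # 0.2%: review offer, form, speed
    ("/blog/gutter-guide", 300, 0),   # below the traffic floor, skip
    ("/contact", 800, 40),            # 5%: healthy
]
print(flag_leaky_pages(rows))   # → [('/roof-repair', 0.002)]
```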

Accessibility belongs in the evidence review, not as an optional polish pass. The W3C Web Content Accessibility Guidelines 2.2[7] define conformance levels A, AA, and AAA. For normal text, WCAG Success Criterion 1.4.3 sets a contrast ratio of at least 4.5:1; for large text, the ratio is at least 3:1. If a lead form uses pale gray labels, missing focus states, or error messages that only rely on color, treat that as a usability and accessibility issue, not a style preference.
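The contrast requirement is fully mechanical, so it can be checked in code rather than by eye. The sketch below implements the relative-luminance formula from WCAG 2.x; the color values are illustrative.

```python
# Compute the WCAG contrast ratio between two sRGB colors using the
# relative-luminance formula from WCAG 2.x. A ratio below 4.5:1 fails
# Success Criterion 1.4.3 for normal-size text.

def _luminance(rgb):
    def channel(c):
        c = c / 255
        return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    lighter, darker = sorted((_luminance(fg), _luminance(bg)), reverse=True)
    return (lighter + 0.05) / (darker + 0.05)

# A pale gray label (#999999) on a white background misses 4.5:1.
ratio = contrast_ratio((153, 153, 153), (255, 255, 255))
print(round(ratio, 2), "passes AA for normal text:", ratio >= 4.5)
```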

Structured data should also be checked against primary documentation. Schema.org defines vocabulary such as Organization, LocalBusiness, Product, BreadcrumbList, FAQPage, and Article, while Google Search Central structured data documentation[8] explains which markup can qualify for Google Search features. A good recommendation does not say “add schema” as a blanket fix. It names the page type, the eligible markup, the required properties, and whether the visible page content supports the markup.
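A recommendation written at that level of detail might look like the following sketch, which emits a minimal JSON-LD block for a local-service page. All property values are placeholders, and the exact required and recommended properties for a type should be confirmed against Google's structured data documentation.

```python
# Build a minimal JSON-LD block for a LocalBusiness page. Values are
# placeholders; markup must be backed by visible page content, and the
# property list should be checked against Google's documentation.
import json

markup = {
    "@context": "https://schema.org",
    "@type": "LocalBusiness",
    "name": "Example Roofing Co.",    # must match the visible business name
    "telephone": "+1-555-0100",
    "address": {
        "@type": "PostalAddress",
        "addressLocality": "Springfield",
        "addressRegion": "IL",
    },
    "areaServed": "Springfield metro area",
}

snippet = f'<script type="application/ld+json">{json.dumps(markup)}</script>'
print(snippet)
```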

Audit The Buyer Journey

A website is not just a collection of pages. Visitors move from first impression to proof, details, comparison, objection handling, and action. The review should test whether the homepage, service pages, product pages, blog posts, forms, thank-you pages, and follow-up paths work together for the main goal named at the start.

Use a simple path test. Start with one high-intent query from Search Console, open the ranking page on a mobile device, and follow the path a visitor would take.

  • Does the page match the query?
  • Does it state who the offer is for?
  • Does it show proof near the decision point?
  • Does it answer price, process, timing, or risk objections?
  • Does the call to action appear before the visitor gets tired?
  • Does the form, phone link, cart, or checkout work without confusion?

Here is a small worked example for a service business. Search Console shows that “emergency roof repair near me” brings impressions to a roof repair page, but GA4 shows weak form completion. On mobile, the page has an LCP of 3.4 seconds, the phone number is plain text instead of a clickable tel link, the form asks for nine fields before showing an expected response time, and the page has no service-area proof.


  • Evidence: mobile LCP is 3.4 seconds. Likely fix: compress or replace the hero image and retest the template.
  • Evidence: the phone number is plain text. Likely fix: make it a clickable tel link and measure phone-link clicks.
  • Evidence: the form asks for nine fields up front. Likely fix: reduce step one to name, phone, ZIP code, and issue type.
  • Evidence: no service-area proof near the CTA. Likely fix: add served areas, local proof, and relevant internal links.

The fastest gains usually come from removing friction on pages that already have qualified demand. A strong page can still underperform if the next step is unclear, slow, hidden, or broken. A pricing page may answer the right questions but send every visitor to a long generic contact form. A blog post may attract qualified readers but fail to link to the matching service page. A product page may have strong copy but lose mobile users when the cart drawer shifts the layout or blocks the checkout button.

Prioritize Recommendations

The final report should separate urgent fixes, high-impact improvements, and nice-to-have changes. A long list without priority does not help a busy team. Each recommendation should state the affected page or template, the evidence, the expected business effect, the owner, and the measurement plan.

Use a decision table when the team has more fixes than time. Score each issue from 1 to 5 for impact, confidence, and effort. Impact means the likely effect on leads, sales, support load, or qualified traffic. Confidence means how strong the evidence is. Effort means the expected cost or difficulty. A simple priority score is impact multiplied by confidence, divided by effort. A broken lead form on the top service page might score 5 impact, 5 confidence, and 1 effort, for a priority score of 25. Rewriting a low-traffic blog post might score 2 impact, 3 confidence, and 3 effort, for a score of 2.
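The scoring rule from the paragraph above is small enough to run as a script when the backlog is long. The finding names below are illustrative; the two worked scores match the examples in the text.

```python
# Score audit findings with the impact x confidence / effort rule and
# sort the backlog by priority. Finding names are illustrative.

def priority(impact: int, confidence: int, effort: int) -> float:
    """Each input is a 1-5 score; a higher result means fix it sooner."""
    return impact * confidence / effort

findings = [
    ("Broken lead form on top service page", 5, 5, 1),   # 25.0
    ("Rewrite low-traffic blog post", 2, 3, 3),          #  2.0
    ("Compress hero image on mobile landing page", 4, 4, 2),
]

ranked = sorted(findings, key=lambda f: priority(*f[1:]), reverse=True)
for name, impact, confidence, effort in ranked:
    print(f"{priority(impact, confidence, effort):5.1f}  {name}")
```

The division by effort is what keeps cheap, high-confidence fixes at the top: the broken form outranks everything else even before any redesign work is considered.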

  • Finding: mobile landing page misses Core Web Vitals. Evidence: LCP is 3.4 seconds, while web.dev marks good LCP as under 2.5 seconds. Recommendation: reduce hero image weight, defer noncritical scripts, and retest in PageSpeed Insights. Success measure: LCP under 2.5 seconds with no decline in form starts.
  • Finding: lead form is too heavy for urgent visitors. Evidence: nine required fields before the visitor sees a response expectation. Recommendation: move optional details to step two and keep the first step to contact and need. Success measure: higher form-start-to-submit rate in GA4.
  • Finding: service page does not answer location intent. Evidence: Search Console query includes “near me,” but the page has no service-area proof. Recommendation: add served areas, local proof, and relevant internal links. Success measure: higher qualified clicks and assisted conversions from the page.

Recommendations should avoid fake precision. Google has not published a formula saying that a certain percent of SEO is backlinks, content, speed, or structured data. Do not invent weights. Tie each next step to observed evidence, public technical guidance, and the site’s business goal.

A useful audit is finished when the team can decide what to do tomorrow. If the top issue blocks measurement, fix analytics first. If it blocks discovery, fix crawl, indexation, sitemap, robots.txt, canonical, or redirect problems first. If it blocks action, fix the call to action, form, checkout, trust proof, page speed, or accessibility issue first. Everything else belongs lower on the list.

If you want a quick baseline before a deeper manual review, start from the Website Advisor home page and enter a URL. Use the result as a starting point for deciding whether measurement, technical health, content, or conversion flow deserves attention first.

FAQ

Should a website audit start with SEO?

Not by itself. SEO matters, but the first step is the business goal. A page can rank and still fail if the offer, proof, form, speed, or follow-up path does not support the visitor’s next decision.

Which tools should a small team use first?

Start with Google Search Console, Google Analytics 4, PageSpeed Insights, and a crawl tool such as Screaming Frog SEO Spider. Add Google Tag Manager if event tracking needs cleanup, and use Bing Webmaster Tools when Bing visibility matters for the site’s audience.

How many recommendations should an audit include?

Include enough findings to explain the site’s condition, but separate the first 3 to 5 actions from the backlog. A 60-item report is useful only if the owner can see which fixes affect revenue, leads, support, or search visibility first.

What makes an audit recommendation credible?

A credible recommendation names the page, the evidence, the technical rule, the proposed change, and the measurement plan.

Editor’s Note

The standards and thresholds referenced here were checked on 2026-04-23 against Google Search Central, web.dev, and W3C WCAG documentation. Google periodically updates thresholds, documentation, and ranking systems, so verify current guidance before acting on audit results.

Sources

  1. Google Analytics 4 events: https://support.google.com/analytics/answer/9322688
  2. Google Search Essentials: https://developers.google.com/search/docs/essentials
  3. PageSpeed Insights: https://pagespeed.web.dev/
  4. web.dev Core Web Vitals: https://web.dev/vitals/
  5. Screaming Frog SEO Spider user guide: https://www.screamingfrog.co.uk/seo-spider/user-guide/
  6. Google Search Central sitemap guidance: https://developers.google.com/search/docs/crawling-indexing/sitemaps/overview
  7. W3C Web Content Accessibility Guidelines 2.2: https://www.w3.org/TR/WCAG22/
  8. Google Search Central structured data documentation: https://developers.google.com/search/docs/appearance/structured-data/intro-structured-data