Broken Links

Links that lead to 404 error pages. How they harm SEO, crawl budget, and user experience.

In brief

A broken link is a hyperlink that points to a non‑existent page, returning an HTTP 4xx or 5xx status (most commonly 404). Such links appear when pages are deleted, URLs are changed, or coding errors occur. They degrade UX, waste crawl budget, and can negatively affect rankings.

Broken links are hyperlinks that point to pages unavailable to users and search bots. Most often the server returns a 404 (Not Found) status, but 410 (Gone), 500 (Internal Server Error), and other errors also count. Broken links can be internal (pointing to other pages on the same site) or external (pointing to other sites).

Examples of broken links:
- Internal: https://site.ru/deleted-page (404)
- External: from your site to a deleted page on another site
- Images: src="image.jpg" but the file is missing
- CSS/JS: linked files that return 404
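The status check behind these examples can be sketched with Python's standard library. This is a minimal illustration, not a specific tool's API: the function names are made up, and it uses HEAD requests (a few servers reject HEAD, in which case a GET fallback would be needed).

```python
from typing import Optional
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def link_status(url: str, timeout: float = 10.0) -> Optional[int]:
    """Return the HTTP status code for url, or None if no response at all."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return resp.status
    except HTTPError as e:
        return e.code              # 404, 410, 500, ...
    except URLError:
        return None                # DNS failure, connection refused, ...

def is_broken(status: Optional[int]) -> bool:
    """Any 4xx/5xx response -- or no response -- counts as a broken link."""
    return status is None or status >= 400
```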

Why Broken Links Are Harmful

The SEO impact of broken links can be significant:

  • Poor UX — users click and see an error, increasing bounce rate and reducing trust.
  • Wasted crawl budget — bots spend time crawling non‑existent pages instead of useful content.
  • Drained link equity — if a broken link receives PageRank (e.g., from an important page), that equity is lost or sent to an error page.
  • Negative authority signals — a large number of broken links can lower the site’s quality assessment (E‑E‑A‑T).
Broken links on the homepage, main navigation, or high‑equity pages are especially dangerous. They not only annoy users but also signal to Google that the site is poorly maintained.

How to Find Broken Links

  • Screaming Frog SEO Spider — crawls the site and shows all response codes. Filter by 4xx/5xx status.
  • Google Search Console → Pages → 404 errors. GSC lists pages Google discovered that returned a 404.
  • Sitebulb, Netpeak Spider, Ahrefs Site Audit — similar tools with advanced reports.
  • External broken links — check via Ahrefs (Broken Backlinks) or Semrush (Backlink Audit).
Quick check via command line (Linux, macOS):
curl -I https://site.ru/page 2>/dev/null | head -n 1 | grep -E "404|410|500"

For mass checking, write a Python script or use wget with the --spider flag.
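Such a mass check can be sketched with the standard library alone (the `find_broken` helper and its defaults are assumptions for illustration, not an existing tool):

```python
from concurrent.futures import ThreadPoolExecutor
from urllib.request import Request, urlopen
from urllib.error import HTTPError, URLError

def status_of(url, timeout=10):
    """HEAD one URL; return (url, status), with None when unreachable."""
    try:
        with urlopen(Request(url, method="HEAD"), timeout=timeout) as resp:
            return url, resp.status
    except HTTPError as e:
        return url, e.code
    except URLError:
        return url, None

def find_broken(urls, workers=10):
    """Check URLs in parallel; keep those answering 4xx/5xx or nothing."""
    with ThreadPoolExecutor(max_workers=workers) as pool:
        results = list(pool.map(status_of, urls))
    return [(u, s) for u, s in results if s is None or s >= 400]
```

Feed it the URL list exported from your crawler; a modest `workers` value keeps the load on the target server polite.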

How to Fix Them

  • For internal broken links:
    - If a page is permanently deleted — keep the 404 or 410, but update all links pointing to it.
    - If the content was moved — set up a 301 redirect to a relevant new page.
    - If it’s a typo — fix the URL.
  • For external broken links (pointing to your site):
    - If the page should exist — restore the content or set a 301 to the current version.
    - If the page is deleted — return 410 (to speed up removal from the index) and, where possible, ask the webmasters of the linking sites to update their links.
  • Conduct regular audits — every 1–3 months, especially after redesigns or mass deletions.
Do not simply ignore broken links: a large number of 404s can read as a sign of neglect. Where a page is truly gone, a 410 states that explicitly; where the content moved, a 301 redirect preserves the link equity.
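As a sketch, the redirect and 410 cases might look like this inside an nginx `server` block (the paths are hypothetical; Apache's `Redirect` directives from mod_alias work similarly):

```nginx
# Content moved: send users and bots to the new URL with a 301.
location = /old-page { return 301 /new-page; }

# Content permanently removed: an explicit 410 speeds up deindexing.
location = /deleted-page { return 410; }
```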

Common questions

Does Google penalise a site for 404 errors?
No, Google does not penalise 404 errors if they are natural (the page really is gone). The problem is a large number of broken links combined with missing redirects — that hurts UX and wastes crawl budget.

Can I hide broken links with robots.txt?
No. robots.txt controls crawler access; it doesn’t fix links. Set up redirects or update the links instead.

How often should I check for broken links?
Monthly for dynamic sites (e‑commerce, news portals) and quarterly for static blogs.

What about broken images?
Find them via an audit (Screaming Frog reports broken images). Replace them with working URLs or remove the img tags — missing images count as broken links too.

Do broken external links harm my site?
Linking to a broken external resource doesn’t directly penalise your site, but it degrades UX. If others link to your site with broken links, you just lose potential traffic — no penalty.