{"id":223797,"date":"2020-03-13T07:10:29","date_gmt":"2020-03-13T01:40:29","guid":{"rendered":"https:\/\/www.qcsglobal.com\/marketing\/nine-site-audit-issues-we-always-see-and-tips-to-tackle-them-search-engine-watch\/"},"modified":"2020-04-18T17:10:43","modified_gmt":"2020-04-18T11:40:43","slug":"nine-site-audit-issues-we-always-see-and-tips-to-tackle-them-search-engine-watch","status":"publish","type":"post","link":"https:\/\/qcsglobal.com\/blogs\/nine-site-audit-issues-we-always-see-and-tips-to-tackle-them-search-engine-watch\/","title":{"rendered":"Nine site audit issues we always see and tips to tackle them Search Engine Watch"},"content":{"rendered":"\n<div>\n<div class=\"wp-caption\"><\/div>\n<p><strong>After carrying out thousands of site audit activities across varying industries and site sizes, there are some standout issues that appear over and over again. <\/strong><\/p>\n<p>Certain CMS platforms have their downfalls and cause the same technical issues repeatedly, but most of the time these issues are caused by sites being managed by multiple people, knowledge gaps, and simple lack of time.<\/p>\n<p>We tend to use two crawlers at Zazzle Media, both of which are mentioned throughout this post. The first is Screaming Frog, which we use when we need raw exports or need to be very specific about what we are crawling. The second is Sitebulb, which is more of a site audit tool than a pure crawler. We tend to use Sitebulb more because it lets us manage projects and track the overall progress a site is making.<\/p>\n<p>So let\u2019s get started with the issues we see time and time again.<\/p>\n<h2>1. Broken internal links<\/h2>\n<p>This is one of the simpler issues, but something that can be missed if you aren\u2019t looking out for it. 
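<\/p>
<p>What a crawler reports here can be sketched in a few lines of Python. The snippet below is illustrative only (hypothetical helper names, standard library only): given a set of crawled URLs with their status codes and HTML, it lists each parent URL alongside the broken internal links it contains.<\/p>

```python
from html.parser import HTMLParser
from urllib.parse import urljoin

class LinkExtractor(HTMLParser):
    """Collects the href target of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def find_broken_links(pages):
    """pages maps URL -> (status_code, html). Returns (parent, target)
    pairs for internal links whose target returned a 4xx/5xx status."""
    broken = []
    for url, (status, html) in pages.items():
        parser = LinkExtractor()
        parser.feed(html)
        for href in parser.links:
            target = urljoin(url, href)
            if target in pages and pages[target][0] >= 400:
                broken.append((url, target))
    return broken

demo = {
    "https://example.com/": (200, '<a href="/about">About</a> <a href="/old">Old</a>'),
    "https://example.com/about": (200, "<p>About us</p>"),
    "https://example.com/old": (404, ""),
}
# "/old" returns a 404, so the homepage link to it is reported.
broken = find_broken_links(demo)
```

<p>A real crawl fetches live status codes rather than using a prepared dictionary \u2013 which is exactly the export a tool such as Screaming Frog provides.<\/p>
<p>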
Broken links can disrupt the user journey, and for the <a href=\"https:\/\/www.searchenginewatch.com\/2018\/05\/21\/no-need-for-google-12-alternative-search-engines-in-2018\/\" target=\"_blank\" rel=\"noopener noreferrer\">search engines<\/a>, they prevent crawl bots from connecting pieces of content.<\/p>\n<p>Internal links are mainly used to connect pieces of content, and in terms of Google\u2019s algorithm, they allow link equity to be distributed from one page to another. A broken link disrupts this, preventing equity from being transferred from one page to the next. In terms of PageRank, Google\u2019s algorithm evaluates the number of high-quality links to a page in order to determine page authority.<\/p>\n<p>Put simply, a broken internal link can negatively affect page authority and stop the flow of link equity.<\/p>\n<p>The scale of this issue will vary dramatically depending on the type of site you are running. However, most sites will have some broken links.<\/p>\n<h3>Quick tip<\/h3>\n<p>A simple crawl will pick these up: running a tool such as Screaming Frog with a basic configuration will provide a full list of broken links alongside each parent URL.<\/p>\n<h2>2. Meta title length<\/h2>\n<p>Depending on how often it occurs, this can be a very minor problem or something that affects click-through rates across a whole business.<\/p>\n<p>Short meta titles can indicate a lack of targeting, while long titles are truncated in the results and, in turn, lower click-through rates.<\/p>\n<h3>Quick tip<\/h3>\n<p>To write perfect meta titles and descriptions that maximize pixel usage and CTR, we recommend using the\u00a0Sistrix SERP generator tool.<\/p>\n<h2>3. Redirecting internal links<\/h2>\n<p>Redirecting internal links can cause problems for your site architecture, as it takes slightly longer for users and search engines to find content. 
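<\/p>
<p>To see why, consider how redirects chain together. A minimal sketch (assumed redirect map, standard library only) that follows a link target through its redirects to the final destination:<\/p>

```python
def resolve_redirect(url, redirects, max_hops=10):
    """Follow a URL through a redirect map ({old: new}) and return
    (final_url, hops). Raises ValueError on a loop or an overly
    long chain."""
    seen = {url}
    hops = 0
    while url in redirects:
        url = redirects[url]
        hops += 1
        if url in seen or hops > max_hops:
            raise ValueError("redirect loop or chain too long")
        seen.add(url)
    return url, hops

# A chain: /a -> /b -> /c. Every internal link still pointing at /a
# costs two extra requests on each visit.
final, hops = resolve_redirect("/a", {"/a": "/b", "/b": "/c"})
```

<p>Each extra hop is an extra request for both users and search bots, which is the delay described above.<\/p>
<p>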
When content changes or products sell out, either a permanent (301) or temporary (302) redirection is used. A 302 redirection tells a search engine to keep the old page indexed, as the redirection is simply a temporary measure. A <a href=\"https:\/\/www.searchenginewatch.com\/2019\/07\/08\/six-http-status-codes-seo\/\" target=\"_blank\" rel=\"noopener noreferrer\">301 redirection<\/a> instructs the search engine that the page has permanently moved and is replaced by the new location.<\/p>\n<p>Redirection chains occur when a URL redirects to a page that in turn redirects to another page \u2013 which can happen over and over again until the final destination is reached. In the worst case, the chain circles back on itself and becomes a redirection loop that never resolves. Both should be avoided at all costs, as they increase crawl time and can send mixed signals to search bots.<\/p>\n<p>The problem isn\u2019t with redirecting a URL (if completed correctly); the issue lies with the links that still point at the redirecting URL. For example, URL A redirects to a new URL B, but URL C still points to URL A \u2013 which is incorrect.<\/p>\n<p>Sitebulb can crawl and find all the URLs that currently link to the redirecting URL, where you can then change the href target to point to the new URL via the CMS.<\/p>\n<h3>Quick tip<\/h3>\n<p>Redirecting URLs should be avoided where possible, as they increase a search bot\u2019s crawl time, potentially leading to some of the website\u2019s URLs being skipped within the allocated crawl.<\/p>\n<h2>4. Outdated sitemaps<\/h2>\n<p>XML sitemaps do not have to be static; on larger websites, continuously updating the XML file by hand would be very time-consuming. It is recommended to use a dynamic XML sitemap, as this ensures that every time a piece of content or media is added, your CMS automatically updates the file. 
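<\/p>
<p>The generation step itself is straightforward. A sketch (standard library only, illustrative URLs) of building a sitemaps.org-style XML file from whatever URL list the CMS currently holds:<\/p>

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a sitemap XML string from (loc, lastmod) pairs,
    following the sitemaps.org protocol."""
    urlset = ET.Element(
        "urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

# Regenerating this on every content change keeps the sitemap fresh.
xml_out = build_sitemap([("https://example.com/", "2020-03-13")])
```

<p>A dynamic sitemap is simply this regeneration wired into the CMS publish hook, so the file can never go stale.<\/p>
<p>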
A Sitebulb audit will highlight whether your website is missing a sitemap.<\/p>\n<p>It is really important to configure dynamic XML sitemaps correctly, as in some cases they can end up including URLs you do not want in the sitemap.<\/p>\n<h3>Quick tip<\/h3>\n<p>If you are using a standard CMS such as WordPress, adding \/sitemap.xml to the end of your domain should show your website\u2019s sitemap.<\/p>\n<h2>5. Orphan URLs<\/h2>\n<p>Orphan pages, otherwise known as \u201cfloating pages\u201d, are URLs that are indexed and published but cannot be reached by users or search engines by following internal links. This means that an orphan page can end up never being crawled. A typical example is a winter sale page that was once needed but, now the season is over, no longer is \u2013 yet remains published with no links pointing to it.<\/p>\n<p>Essentially, a few orphan pages are not harmful; however, a large number can bloat your website. The result is poor link equity distribution, keyword cannibalization (for which we have a separate<a href=\"https:\/\/www.zazzlemedia.co.uk\/blog\/fixing-keyword-cannibalisation\/\" target=\"_blank\" rel=\"noopener noreferrer\">\u00a0guide here<\/a>) and a poor internal linking experience for both search bots and users.<\/p>\n<h3>Quick tip<\/h3>\n<p>As this is a specific type of crawl, Zazzle Media uses Screaming Frog to crawl the sitemap data. At the same time, we run another crawl with either Screaming Frog or Sitebulb, and find the orphan pages by comparing the two data sets.<\/p>\n<p>For a more in-depth approach, read our quick guide on <a href=\"https:\/\/www.zazzlemedia.co.uk\/blog\/fix-orphaned-pages-seo\/#gref\" target=\"_blank\" rel=\"noopener noreferrer\">orphan URLs<\/a> and how to deal with them.<\/p>\n<h2>6. 
Site speed<\/h2>\n<p>Google has previously indicated that<a href=\"https:\/\/searchengineland.com\/google-speed-update-page-speed-will-become-ranking-factor-mobile-search-289904\" target=\"_blank\" rel=\"noopener noreferrer\">\u00a0site speed is a crucial ranking factor<\/a>, and more specifically is a part of its ranking algorithm for search engine results. This is because site speed is closely related to good user experience, slow websites have high bounce rates due to content taking a long time to load. A benefit from improving your websites site speed is that it will better the user experience, but also could reduce website bounce rate too.<\/p>\n<p><img loading=\"lazy\" decoding=\"async\" class=\"aligncenter wp-image-140360 size-full\" src=\"https:\/\/23i69d6p0gw1zwz4y3smspc1-wpengine.netdna-ssl.com\/wp-content\/uploads\/2020\/03\/page-speed-checking-during-site-audits.png\" sizes=\"(max-width: 612px) 100vw, 612px\" srcset=\"https:\/\/23i69d6p0gw1zwz4y3smspc1-wpengine.netdna-ssl.com\/wp-content\/uploads\/2020\/03\/page-speed-checking-during-site-audits.png 612w, https:\/\/23i69d6p0gw1zwz4y3smspc1-wpengine.netdna-ssl.com\/wp-content\/uploads\/2020\/03\/page-speed-checking-during-site-audits-300x208.png 300w\" alt=\"Page speed checking during site audits\" width=\"612\" height=\"424\" \/><\/p>\n<p><em>Source: Search Influence, 2017<\/em><\/p>\n<p>Additionally, as site speed is directly related to lowering bounce rate, this should in turn boost revenues \u2013 as users are actively remaining engaged on your website for longer.<\/p>\n<h3>Quick tip<\/h3>\n<p>To check your website\u2019s site speed, we recommend using Google\u2019s very own<a href=\"https:\/\/developers.google.com\/speed\/pagespeed\/insights\/\" target=\"_blank\" rel=\"noopener noreferrer\">\u00a0page speed insights tool<\/a>, where this will not only give you a page speed score, but also a host of recommendations on how to best improve your site speed and how you compare to search 
competition!<\/p>\n<h2>7. Hierarchy\/structure<\/h2>\n<p>A website\u2019s Hierarchy structure, otherwise known as information architecture, is essentially how your website\u2019s navigation is presented to a search engine or user. The fundamental issue that most websites suffer from is page rank distribution.<\/p>\n<p>Websites\u2019 main pages or most profitable pages should be within three clicks from the homepage. Pages that are more than three clicks away from the homepage, subsequently receive less page rank distribution, and in other scenarios will only occasionally be crawled (if ever).<\/p>\n<p>Without an effective hierarchy, crawl budget can be wasted. This can mean for pages within the depths of your website (more than three clicks away from the root) could rank poorly as Google is unsure of the importance of the page and link equity could be spread thinly.<\/p>\n<h3>Quick tip<\/h3>\n<p>An SEO and user-friendly site architecture is all about allowing search bots and users to seamlessly navigate your website. Flattening your site architecture can increase indexation, allow more keyword rankings, and in turn boost organic traffic.<\/p>\n<h2>8. Internal linking<\/h2>\n<p>Internal linking is an important feature of a website as this allows users to navigate your website, and most importantly (from an SEO perspective) allows search engine crawlers to understand the connections between content. An effective internal linking strategy could have a big impact on rankings.<\/p>\n<p>It is no surprise to us when a Sitebulb audit states to review your internal linking strategy, as complex sites, with thousands of pages can get messy. A typical example of a messy internal linking structure could be anchor texts that do not contain a keyword, URL linking inconsistencies in volume (for PageRank distribution), and links not always pointing to the canonical version of a URL. 
Issues such as these create mixed signals for search engine crawlers and ultimately confuse them when it comes to indexing your content.<\/p>\n<h3>Quick tip<\/h3>\n<p>A Sitebulb audit highlights any issues with link distribution, shows which pages receive the most internal links, flags any broken or incorrectly used internal links, and much more. We then digest this data to devise a strategy for how best to optimize your website\u2019s internal linking.<\/p>\n<h2>9. Thin content<\/h2>\n<p>Writing unique pieces of content that provide value to a user can be incredibly challenging and, most importantly, time-consuming! Hence, this is one of the most frequent issues we see on website audits. More importantly, thin content is directly against Google\u2019s guidelines and, in the worst-case scenario, can result in a penalty.<\/p>\n<p>When crawling your website, search engines are looking for functional pieces of content to understand your business services and product offerings. Search bots also want to see your expertise, quality, and trustworthiness. Google has a huge 166-page \u2018<a href=\"https:\/\/static.googleusercontent.com\/media\/guidelines.raterhub.com\/en\/searchqualityevaluatorguidelines.pdf\" target=\"_blank\" rel=\"noopener noreferrer\">Search Quality Guidelines<\/a>\u2019 document that explains what constitutes search quality. We recommend familiarizing yourself with this document to ensure that the content you write for your website is in line with Google\u2019s search guidelines.<\/p>\n<p>This is a regular issue that many websites overlook, but fixing it is a critical route to organic success.<\/p>\n<h3>Quick tip<\/h3>\n<p>A Sitebulb audit will identify any URLs with thin content and prioritize the severity of the issue. Aim for about 350 \u2013 500 words per page to succinctly communicate your information. 
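<\/p>
<p>A rough version of this check can be scripted by stripping markup and counting words \u2013 a naive sketch only, since a real audit measures the rendered page:<\/p>

```python
import re

def word_count(html):
    """Strip tags and count the words in a page's HTML (a rough
    approximation of its visible copy)."""
    text = re.sub(r"<[^>]+>", " ", html)
    return len(text.split())

def thin_pages(pages, minimum=350):
    """Return URLs whose copy falls below the word-count floor."""
    return [url for url, html in pages.items()
            if word_count(html) < minimum]
```

<p>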
However, the quality of this content is still a very important factor.<\/p>\n<h2>In conclusion<\/h2>\n<p>These are just some of the most common issues discovered in an SEO audit, and technical changes can be tricky as well as incredibly time-consuming to implement <span style=\"font-style: normal; font-weight: 400;\">at times<\/span>. Completing a technical audit of your website and correcting any issues can improve keyword rankings and organic traffic and, if the products\/services are right, win more sales.<\/p>\n<p>The sky\u2019s the limit when it comes to search engine optimization, and with the landscape constantly changing, regular auditing is a strong strategy for achieving long-term competitive advantage.<\/p>\n\n<\/div>\n<p><a href=\"https:\/\/www.searchenginewatch.com\/2020\/03\/12\/nine-issues-on-site-audits-and-tips-to-tackle\/\">Source link <\/a><\/p>\n","protected":false},"excerpt":{"rendered":"<p>After carrying out thousands of site audit activities across varying industries and site sizes, there are some standout issues that appear over and over again. 
Certain CMS platforms have their downfalls and cause the same technical issues repeatedly but most of the time these issues are caused by the sites being managed by [&hellip;]<\/p>\n","protected":false},"author":1,"featured_media":223798,"comment_status":"open","ping_status":"open","sticky":false,"template":"","format":"standard","meta":{"_et_pb_use_builder":"","_et_pb_old_content":"","_et_gb_content_width":"","footnotes":""},"categories":[10,13],"tags":[],"_links":{"self":[{"href":"https:\/\/qcsglobal.com\/blogs\/wp-json\/wp\/v2\/posts\/223797"}],"collection":[{"href":"https:\/\/qcsglobal.com\/blogs\/wp-json\/wp\/v2\/posts"}],"about":[{"href":"https:\/\/qcsglobal.com\/blogs\/wp-json\/wp\/v2\/types\/post"}],"author":[{"embeddable":true,"href":"https:\/\/qcsglobal.com\/blogs\/wp-json\/wp\/v2\/users\/1"}],"replies":[{"embeddable":true,"href":"https:\/\/qcsglobal.com\/blogs\/wp-json\/wp\/v2\/comments?post=223797"}],"version-history":[{"count":1,"href":"https:\/\/qcsglobal.com\/blogs\/wp-json\/wp\/v2\/posts\/223797\/revisions"}],"predecessor-version":[{"id":223800,"href":"https:\/\/qcsglobal.com\/blogs\/wp-json\/wp\/v2\/posts\/223797\/revisions\/223800"}],"wp:featuredmedia":[{"embeddable":true,"href":"https:\/\/qcsglobal.com\/blogs\/wp-json\/wp\/v2\/media\/223798"}],"wp:attachment":[{"href":"https:\/\/qcsglobal.com\/blogs\/wp-json\/wp\/v2\/media?parent=223797"}],"wp:term":[{"taxonomy":"category","embeddable":true,"href":"https:\/\/qcsglobal.com\/blogs\/wp-json\/wp\/v2\/categories?post=223797"},{"taxonomy":"post_tag","embeddable":true,"href":"https:\/\/qcsglobal.com\/blogs\/wp-json\/wp\/v2\/tags?post=223797"}],"curies":[{"name":"wp","href":"https:\/\/api.w.org\/{rel}","templated":true}]}}