Ever wanted to make your website really cool?
Meet Screaming Frog, your website superhero!
This guide is like a friendly chat about how Screaming Frog can help your site be the best it can be, without any tricky tech talk.
Let’s get started on turning your website into a digital masterpiece!
Imagine a superhero checking out every corner of your site—that’s what Screaming Frog does. I’ll guide you on what it finds and how you can use that info to make your site even better.
Think of your website like a library, and Screaming Frog helps organize it. I’ll explain simple tricks to make sure people can find your site easily.
Now, let’s talk about the fun stuff—pictures and videos! We’ll guide you on making them look great and work smoothly on your website.
After its adventure, Screaming Frog leaves you a detailed map (reports) of your website. I’ll help you read it so you can understand what needs attention and what’s already doing well.
Time to put on your digital construction hat!
I’ll show you how to use Screaming Frog’s findings to fix things like broken links and make your website faster.
Just like a superhero checking in on the city regularly, I’ll guide you on how to keep an eye on your website’s health and make small improvements over time.
By the end of this guide, you’ll be the superhero of your website, making it a fantastic place for everyone who visits.
Ready to start the adventure?
Let’s go!
Getting Started with Screaming Frog
Starting to make your website better is like going on an adventure. And for this journey, we have a special helper called Screaming Frog.
In this part, we’re going to help you get this tool, set it up on your computer, and make it work just right for what you need.
Think of it as the first steps to make Screaming Frog your website’s superhero!
Download and Installation
Before we get into making your website better, you need a special tool called Screaming Frog. It’s like getting a superhero tool for your website. First, we’ll help you find it and put it on your computer.
Imagine downloading Screaming Frog as grabbing your superhero suit. Here’s how you do it:
Find the Official Website: Visit Screaming Frog’s official website to get the tool your website needs. Just type “Screaming Frog SEO Spider” into your browser and gear up!
Get the Right Version:
Select the suitable version for your computer:
- Windows
- Mac
- Linux
Just like choosing the right fit for your clothing, this ensures everything works smoothly.
Installation Instructions:
Now that you have your superhero suit, let’s put it on your computer:
Start the Installation: Open the superhero suit package and follow the instructions. It’s like putting on your suit step by step.
Key Settings: Sometimes, superhero suits have special settings. We’ll tell you about any important ones so your Screaming Frog works just right.
Basic Setup and Configuration
Now that Screaming Frog is on your computer, let’s get to know it better. It’s like meeting your new helper for the first time. This section helps you get ready to use Screaming Frog to make your website amazing.
Open the Tool: Just like waking up your superhero, we’ll show you how to start Screaming Frog.
Overview of the Interface: Take a quick tour of the buttons and features. It’s like learning what each button on your superhero suit does.
Configuring Basic Settings:
Setting Up the Helper: Every superhero has preferences. We’ll help you set up Screaming Frog just the way you like it. Adjust the speed, choose how it looks, and more.
Basic Parameters: Understand the basic settings, like how fast your superhero tool should work and what it should look for on your website.
Setting Up a New Project:
Start a New Adventure: In Screaming Frog, your website project is like a new adventure. We’ll guide you on how to tell the superhero tool where to go (entering your website’s address) and any special things it should know.
This way, you’ll be all set to start your journey of making your website the best it can be with Screaming Frog!
Crawling Your Website
Crawling is like sending a website detective to check every nook and cranny of your site. Think of it as a digital explorer using a tool called Screaming Frog.
This explorer goes through your website, checking each page and collecting important information. It’s a bit like a superhero searching for clues to make your website work even better.
Initiating a Crawl with Screaming Frog
Now that you have Screaming Frog set up, it’s time to make it explore your website, just like a superhero searching for clues.
This process is called a “crawl.”
I’ll guide you on how to start this exploration.
Step-by-Step Crawl:
Launch the Crawl: Think of it as telling your superhero to start looking around. I’ll show you where to click and what settings to choose.
Start by entering your website’s URL in the box at the top of the Screaming Frog window and click Start.
Wait for the Magic: While Screaming Frog is crawling, it’s like your superhero checking every corner of your website for important information.
The free version of Screaming Frog lets you crawl up to 500 URLs per crawl.
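Curious what a “crawl” actually looks like under the hood? Here’s a minimal Python sketch of the idea: fetch a page, collect its internal links, and repeat. It uses the third-party requests and beautifulsoup4 packages and a placeholder start URL, and it only scratches the surface of what Screaming Frog does for you.

```python
# A tiny illustration of what a crawler does: fetch a page, collect its
# internal links, and repeat. Screaming Frog does this (and much more) for you.
# Assumes `requests` and `beautifulsoup4` are installed; the start URL is a placeholder.
from urllib.parse import urljoin, urlparse

import requests
from bs4 import BeautifulSoup

START_URL = "https://www.example.com/"   # replace with your own site
MAX_PAGES = 20                           # keep the demo small

def crawl(start_url, max_pages):
    seen, queue = set(), [start_url]
    site = urlparse(start_url).netloc
    while queue and len(seen) < max_pages:
        url = queue.pop(0)
        if url in seen:
            continue
        seen.add(url)
        try:
            response = requests.get(url, timeout=10)
        except requests.RequestException as error:
            print(f"{url} -> failed ({error})")
            continue
        print(f"{url} -> {response.status_code}")
        soup = BeautifulSoup(response.text, "html.parser")
        for link in soup.find_all("a", href=True):
            absolute = urljoin(url, link["href"])
            if urlparse(absolute).netloc == site:   # stay on the same site
                queue.append(absolute.split("#")[0])
    return seen

if __name__ == "__main__":
    crawl(START_URL, MAX_PAGES)
```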
Understanding Crawl Results
After Screaming Frog has explored your website, it’s as if someone returned from a journey with a bag of interesting discoveries. Now we’re in the phase of figuring out what those discoveries mean.
Think of it as solving a little puzzle based on what we’ve found.
We want to understand what’s going well on your website and if there are places that might need a bit of attention.
Reviewing the Data
Once Screaming Frog completes its exploration of your website, it compiles a detailed report—a bit like a comprehensive document summarizing its findings.
I’ll guide you through this report, helping you navigate and interpret it.
Reading the Report:
Picture this as going through a detailed document that outlines everything discovered during the website exploration. I’ll assist you in understanding and interpreting this comprehensive overview.
Key Metrics:
Within the report, there are key metrics—important numbers and details that provide insights into how your website is performing.
Discover which pages on your website attracted the most attention. It’s akin to understanding the high-traffic areas of your digital space.
If there are any problems or challenges with your website, the report highlights them. Think of it as finding clues that help you address potential issues and enhance your website.
I’ll walk you through each of these steps below.
Identifying Common Issues and Errors
When the crawl completes, it’s time to explore each finding one by one. I’ll help you spot and tackle the common issues your website might be facing.
I’ll show you how to find and understand any errors Screaming Frog discovered. It’s like finding clues to fix things.
Internal Links:
This feature ensures that the links within your website are working correctly, providing a seamless navigation experience for users. It also helps identify and fix any broken internal links.
- Importance: Examines links within your website, ensuring smooth navigation.
- Action: Identifies and fixes broken internal links.
External Links:
This function analyzes the external links on your site for relevance. It checks for broken external links and provides recommendations for fixing them to maintain a healthy link profile.
- Importance: Analyzes links to external sites, verifying relevance.
- Action: Checks for broken external links, ensuring valuable connections.
Security:
The security check examines your website for potential security issues. It identifies threats and suggests solutions to enhance the overall security of your site.
- Importance: Checks for security issues, ensuring user data safety.
- Action: Identifies potential threats and suggests solutions.
Response Codes:
This feature reviews the responses from your server, addressing errors and redirects. It helps in optimizing server responses for a better user experience.
- Importance: Examines server responses, ensuring proper functionality.
- Action: Addresses server errors and redirects for a seamless user experience.
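If you ever want to spot-check a few response codes outside of Screaming Frog, here’s a small Python sketch (assuming the requests package is installed; the URLs are placeholders):

```python
# Quick status-code check for a handful of URLs (assumes `requests` is installed;
# the URLs are placeholders - swap in pages from your own crawl).
import requests

urls = [
    "https://www.example.com/",
    "https://www.example.com/old-page",
]

for url in urls:
    response = requests.get(url, allow_redirects=False, timeout=10)
    if response.status_code in (301, 302, 307, 308):
        print(f"{url} redirects to {response.headers.get('Location')}")
    elif response.status_code >= 400:
        print(f"{url} returns an error: {response.status_code}")
    else:
        print(f"{url} looks fine: {response.status_code}")
```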
URLs:
The URL analysis assesses the structure of your website’s URLs, considering their impact on search ranking and user experience. It helps optimize URLs for better visibility.
- Importance: Analyzes URL structure, impacting search ranking and user experience.
- Action: Identifies issues like URL length and structure for optimization.
Blocked URLs:
This function allows you to view and audit URLs that are disallowed by the robots.txt file. Ensuring proper management of blocked URLs is crucial for SEO.
- Importance: Views and audits URLs disallowed by the robots.txt protocol.
- Action: Ensures proper management of URLs according to the robots.txt rules.
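As a companion to this check, here’s a quick standard-library Python sketch that asks a site’s robots.txt whether specific URLs may be fetched (the site and paths are placeholders):

```python
# Check whether specific URLs are allowed by robots.txt, using only the
# standard library. The site and paths are placeholders.
from urllib.robotparser import RobotFileParser

parser = RobotFileParser("https://www.example.com/robots.txt")
parser.read()

for path in ["https://www.example.com/", "https://www.example.com/private/report"]:
    allowed = parser.can_fetch("*", path)
    print(f"{path} -> {'allowed' if allowed else 'blocked by robots.txt'}")
```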
Blocked Resources:
Examining blocked resources in rendering mode is essential for understanding how certain elements on your website may be restricted. It aids in optimizing the rendering process.
- Importance: Views and audits resources blocked in rendering mode.
- Action: Ensures resources are accessible for proper rendering.
URL Issues:
This aspect examines various issues related to URLs, such as the presence of non-ASCII characters, underscores, uppercase characters, parameters, and long URLs. Addressing these issues is vital for optimal website performance.
- Importance: Examines issues like non-ASCII characters, underscores, uppercase characters, parameters, or long URLs.
- Action: Identifies and addresses URL-related issues for optimization.
Duplicate Pages:
The tool helps identify exact and near-duplicate pages on your website, ensuring content uniqueness. This is crucial for SEO and avoiding duplicate content issues.
- Importance: Discovers exact and near-duplicate pages using advanced algorithmic checks.
- Action: Ensures content uniqueness and addresses duplication issues.
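To make the idea concrete, here’s a rough Python sketch that catches exact duplicates by hashing each page’s visible text. It assumes requests and beautifulsoup4 are installed, uses placeholder URLs, and, unlike Screaming Frog, won’t catch near-duplicates.

```python
# Spot exact duplicates by hashing the visible text of each page.
# (Screaming Frog also finds *near* duplicates; this sketch only catches exact ones.)
# Assumes `requests` and `beautifulsoup4` are installed; URLs are placeholders.
import hashlib
from collections import defaultdict

import requests
from bs4 import BeautifulSoup

urls = [
    "https://www.example.com/page-a",
    "https://www.example.com/page-a-copy",
]

pages_by_hash = defaultdict(list)
for url in urls:
    html = requests.get(url, timeout=10).text
    text = BeautifulSoup(html, "html.parser").get_text(" ", strip=True)
    digest = hashlib.sha256(text.encode("utf-8")).hexdigest()
    pages_by_hash[digest].append(url)

for digest, group in pages_by_hash.items():
    if len(group) > 1:
        print("Exact duplicates found:", group)
```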
Page Titles:
The review of page titles is important for search ranking and user engagement. The tool helps identify missing or duplicate page titles that need attention for optimization.
- Importance: Reviews page titles, impacting search ranking and user engagement.
- Action: Identifies missing or duplicate page titles for improvement.
Meta Description:
Examining meta descriptions is critical for influencing search snippets and user clicks. Ensuring well-crafted and unique meta descriptions is part of effective SEO.
- Importance: Examines meta descriptions, influencing search snippets and user clicks.
- Action: Ensures compelling and unique meta descriptions.
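Screaming Frog surfaces titles and descriptions in bulk, but here’s a small Python sketch of the same check for a single page. The length thresholds are rough rules of thumb rather than official limits, and the URL is a placeholder (requests and beautifulsoup4 assumed installed):

```python
# Pull the <title> and meta description from a page and flag common problems
# (missing, empty, or unusually long). Length limits here are rough rules of
# thumb, not official cut-offs. Assumes `requests` and `beautifulsoup4`.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/"   # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

title = soup.title.string.strip() if soup.title and soup.title.string else ""
meta = soup.find("meta", attrs={"name": "description"})
description = (meta.get("content") or "").strip() if meta else ""

if not title:
    print("Missing page title")
elif len(title) > 60:
    print(f"Title may be too long ({len(title)} characters)")

if not description:
    print("Missing meta description")
elif len(description) > 160:
    print(f"Meta description may be too long ({len(description)} characters)")
```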
Meta Keywords:
Although not widely used by major search engines, the tool checks for meta keywords, which might be relevant for regional search engines.
- Importance: Checks mainly for reference or regional search engines.
- Action: Adheres to modern SEO practices related to meta keywords.
File Size:
Assessing the size of URLs and images is crucial for optimizing website performance. The tool helps identify large files that may impact loading times.
- Importance: Assesses the size of URLs and images.
- Action: Optimizes file sizes for improved performance.
Response Time:
This feature allows you to view how long pages take to respond to requests. Optimizing response times is essential for a positive user experience.
- Importance: Views how long pages take to respond to requests.
- Action: Optimizes elements for faster response times.
Last-Modified Header:
Viewing the last modified date in the HTTP header provides insights into when a page was last updated. This information is important for managing and maintaining content.
- Importance: Views the last modified date in the HTTP header.
- Action: Ensures proper handling of last-modified information.
Crawl Depth:
Understanding how deep a URL is within a website’s architecture is crucial for site structure analysis. It helps optimize URL depth for better organization.
- Importance: Views how deep a URL is within a website’s architecture.
- Action: Optimizes URL structure for better accessibility.
Word Count:
Analyzing the number of words on every page provides insights into content richness. Maintaining an appropriate word count is important for SEO.
- Importance: Analyzes the number of words on every page.
- Action: Ensures content richness and relevance.
H1, H2:
Reviewing header tags (H1, H2) is essential for content organization and SEO. The tool helps identify missing, duplicate, or improperly used header tags.
- Importance: Reviews header tags, organizing content and aiding in SEO.
- Action: Ensures proper usage of header tags for clarity and optimization.
Meta Robots:
Examining directives like index, noindex, follow, nofollow, etc., helps control how search engines crawl and index your content.
- Importance: Views directives like index, noindex, follow, nofollow, noarchive, nosnippet, etc.
- Action: Ensures proper use of meta directives for search engine crawling behavior.
Meta Refresh:
Viewing meta refresh directives helps in understanding page redirection. Proper use of meta refresh is important for effective redirection strategies.
- Importance: Views meta refresh directives, including target page and time delay.
- Action: Ensures proper use of meta refresh for effective page redirection.
Canonicals:
Viewing link elements and canonical HTTP headers helps ensure proper indexing of content. Canonical tags are crucial for avoiding duplicate content issues.
- Importance: Views link elements and canonical HTTP headers.
- Action: Ensures correct implementation of canonical tags for proper indexing.
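If you’d like to verify a canonical tag by hand, here’s a rough Python sketch that reads the rel="canonical" link from a page and compares it with the URL you asked for (placeholder URL; requests and beautifulsoup4 assumed):

```python
# Read the rel="canonical" link from a page and compare it with the URL you
# requested. Assumes `requests` and `beautifulsoup4`; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/products/blue-widget?utm_source=newsletter"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

canonical = None
for link in soup.find_all("link"):
    if "canonical" in (link.get("rel") or []):
        canonical = link.get("href")
        break

if canonical is None:
    print("No canonical tag found")
elif canonical != url.split("?")[0]:
    print(f"Canonical points elsewhere: {canonical}")
else:
    print("Canonical matches the page URL")
```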
X-Robots-Tag:
This feature allows you to view directives issued via the HTTP header, providing additional control over how search engines crawl content.
- Importance: Views directives issued via the HTTP header.
- Action: Ensures proper handling of X-Robots-Tag directives.
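Because robots directives can live in two places, here’s a small Python sketch that checks both the meta robots tag (covered a few items above) and the X-Robots-Tag header for a single page (placeholder URL; requests and beautifulsoup4 assumed):

```python
# Check both places a page can carry robots directives: the <meta name="robots">
# tag in the HTML and the X-Robots-Tag HTTP header. Assumes `requests` and
# `beautifulsoup4`; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/thank-you"
response = requests.get(url, timeout=10)

header_directives = response.headers.get("X-Robots-Tag", "")
meta = BeautifulSoup(response.text, "html.parser").find("meta", attrs={"name": "robots"})
meta_directives = meta.get("content", "") if meta else ""

print(f"X-Robots-Tag header: {header_directives or '(none)'}")
print(f"Meta robots tag:     {meta_directives or '(none)'}")

if "noindex" in (header_directives + " " + meta_directives).lower():
    print("Warning: this page tells search engines not to index it")
```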
Pagination:
Examining rel="next" and rel="prev" attributes helps optimize pagination for user-friendly navigation.
- Importance: Views rel="next" and rel="prev" attributes.
- Action: Optimizes pagination for user-friendly navigation.
Follow & Nofollow:
This function allows you to view meta nofollow and nofollow link attributes, providing insights into how search engines treat specific links.
- Importance: Views meta nofollow and nofollow link attributes.
- Action: Ensures proper use of follow and nofollow attributes.
Redirect Chains:
Discovering redirect chains and loops helps ensure efficient handling of redirects, avoiding potential issues.
- Importance: Discovers redirect chains and loops.
- Action: Ensures efficient handling of redirects.
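Here’s a quick Python sketch that prints each hop in a URL’s redirect chain, similar in spirit to what Screaming Frog reports (placeholder URL; requests assumed):

```python
# Follow a URL and print every hop in its redirect chain. A long chain (or a
# loop) is worth fixing. Assumes `requests`; the URL is a placeholder.
import requests

url = "http://example.com/old-page"
response = requests.get(url, timeout=10)

for hop in response.history:                       # each intermediate redirect
    print(f"{hop.status_code}  {hop.url}")
print(f"{response.status_code}  {response.url}  (final destination)")

if len(response.history) > 1:
    print(f"Redirect chain of {len(response.history)} hops - consider linking "
          "straight to the final URL")
```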
hreflang Attributes:
Auditing missing confirmation links and checking language codes is crucial for international SEO and proper targeting of audiences.
- Importance: Audits missing confirmation links, inconsistent & incorrect language codes, non-canonical hreflang, etc.
- Action: Ensures proper implementation of hreflang attributes for international SEO.
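To see what these annotations look like in practice, here’s a small Python sketch that lists a page’s hreflang alternates. A full audit would also fetch each alternate and confirm it links back; this sketch stops at the first step (placeholder URL; requests and beautifulsoup4 assumed):

```python
# List the hreflang annotations on a page. For a full audit you would also fetch
# each alternate URL and confirm it links back (the "confirmation" or return
# link). Assumes `requests` and `beautifulsoup4`; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/"
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

alternates = {}
for link in soup.find_all("link"):
    if "alternate" in (link.get("rel") or []) and link.get("hreflang"):
        alternates[link["hreflang"]] = link.get("href")

if not alternates:
    print("No hreflang annotations found")
for language, href in alternates.items():
    print(f"{language}: {href}")
```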
Inlinks:
Viewing all pages linking to a URL, along with anchor text and link attributes, helps manage inbound links for enhanced connectivity.
- Importance: Views all pages linking to a URL, anchor text, and link attributes.
- Action: Manages inbound links for enhanced connectivity.
Outlinks:
Viewing all pages a URL links out to, as well as resources, aids in managing outbound links for an effective linking strategy.
- Importance: Views all pages a URL links out to, as well as resources.
- Action: Manages outbound links for effective linking strategy.
Anchor Text:
Viewing all link text and alt text from images with links helps optimize anchor text for improved SEO.
- Importance: Views all link text and alt text from images with links.
- Action: Optimizes anchor text for improved SEO.
Rendering:
Crawling JavaScript frameworks is essential for proper rendering of dynamic content on your website.
- Importance: Crawls JavaScript frameworks like AngularJS and React.
- Action: Ensures proper rendering of dynamic content.
AJAX:
AJAX stands for Asynchronous JavaScript and XML. This feature lets the SEO Spider obey Google’s AJAX Crawling Scheme. The scheme is now deprecated, but it was once used to make dynamic, JavaScript-rendered content accessible and indexable by search engine crawlers.
- Importance: Selects to obey Google’s now deprecated AJAX Crawling Scheme.
- Action: Ensures compatibility with current SEO standards.
Images:
This feature allows the SEO Spider to view all URLs with image links on a website. Additionally, it enables the management of images for an optimal user experience. It’s crucial for analyzing the image structure and ensuring that images are appropriately optimized for SEO.
- Importance: Views all URLs with the image link and all images from a given page.
- Action: Manages images for optimal user experience.
User-Agent Switcher:
The User-Agent Switcher feature enables the SEO Spider to crawl a website while posing as different user-agents, including popular search engine bots like Googlebot, Bingbot, Yahoo! Slurp, mobile user-agents, or even a custom user-agent. This is useful for checking how the website appears to different crawlers and devices.
- Importance: Crawls as different user-agents, including Googlebot, Bingbot, Yahoo! Slurp, mobile user-agents, or custom UA.
- Action: Ensures compatibility and visibility across various user agents.
Custom HTTP Headers:
This feature allows the SEO Spider to supply any header value in a request. Headers contain additional information about the request or the server’s response. Customizing these headers can help simulate specific scenarios or conditions during crawling.
- Importance: Supplies any header value in a request, from Accept-Language to cookie.
- Action: Customizes HTTP headers for specific requirements.
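Here’s a rough Python sketch of the same idea outside Screaming Frog: request one page as an ordinary browser and as a Googlebot-style mobile crawler, with a custom Accept-Language header thrown in. The user-agent strings and URL are illustrative only (requests assumed):

```python
# Request the same page as a normal browser and as a Googlebot-style smartphone
# crawler, plus a custom Accept-Language header. Useful for spotting content
# that changes depending on who is asking. Assumes `requests`; the URL and
# user-agent strings are illustrative.
import requests

url = "https://www.example.com/"
user_agents = {
    "browser": "Mozilla/5.0 (Windows NT 10.0; Win64; x64)",
    "googlebot-mobile": ("Mozilla/5.0 (Linux; Android 6.0.1; Nexus 5X Build/MMB29P) "
                         "AppleWebKit/537.36 (KHTML, like Gecko) "
                         "Chrome/120.0.0.0 Mobile Safari/537.36 "
                         "(compatible; Googlebot/2.1; +http://www.google.com/bot.html)"),
}

for name, agent in user_agents.items():
    headers = {"User-Agent": agent, "Accept-Language": "en-GB"}
    response = requests.get(url, headers=headers, timeout=10)
    print(f"{name}: status {response.status_code}, {len(response.text)} bytes of HTML")
```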
Custom Source Code Search:
The Custom Source Code Search feature enables the SEO Spider to find specific elements or code snippets within the source code of a website. It supports searching through the source code using XPath, CSS Path selectors, or regular expressions (regex).
- Importance: Finds anything in the source code of a website using XPath, CSS Path selectors, or regex.
- Action: Extracts specific information from the source code.
Custom Extraction:
Custom Extraction allows the SEO Spider to scrape and extract any data from the HTML of a URL. It supports extracting data using XPath, CSS Path selectors, or regex. This feature is particularly useful for retrieving specific information from web pages.
- Importance: Scrapes any data from the HTML of a URL using XPath, CSS Path selectors, or regex.
- Action: Extracts specific data based on website needs.
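As a tiny taste of custom extraction, here’s a Python sketch that pulls values out of a page’s HTML with a regular expression. The URL and the data-sku attribute are made-up examples; in Screaming Frog you’d express the same thing as an XPath, CSS Path, or regex extractor (requests assumed):

```python
# A small taste of custom extraction: pull every SKU-style code out of a page's
# HTML with a regular expression. The URL and the `data-sku` attribute are
# made-up examples - adapt the pattern to whatever you actually need to scrape.
import re

import requests

url = "https://www.example.com/products"
html = requests.get(url, timeout=10).text

# e.g. <div data-sku="AB-1234"> ... </div>
skus = re.findall(r'data-sku="([^"]+)"', html)
print(f"Found {len(skus)} SKUs:", skus[:10])
```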
Google Analytics Integration:
This feature enables the SEO Spider to connect to the Google Analytics API, allowing the extraction of user and conversion data during a crawl. It enhances data analysis by incorporating insights from Google Analytics.
- Importance: Connects to the Google Analytics API and pulls in user and conversion data during a crawl.
- Action: Enhances data analysis with integrated Google Analytics insights.
Google Search Console Integration:
The Google Search Console Integration feature connects the SEO Spider to the Google Search Analytics and URL Inspection APIs. It allows for the collection of performance and index status data in bulk.
- Importance: Connects to the Google Search Analytics and URL Inspection APIs.
- Action: Collects performance and index status data for comprehensive analysis.
PageSpeed Insights Integration:
This feature connects the SEO Spider to the PageSpeed Insights (PSI) API, providing Lighthouse metrics, speed opportunities, diagnostics, and Chrome User Experience Report (CrUX) data at scale.
- Importance: Connects to the PSI API for Lighthouse metrics, speed opportunities, diagnostics, and Chrome User Experience Report (CrUX) data.
- Action: Enhances analysis with integrated insights into page speed.
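If you’d like to try the PSI API directly, here’s a minimal Python sketch. You need a free Google API key; the key and page URL below are placeholders (requests assumed):

```python
# Ask the PageSpeed Insights API for a page's mobile performance score. You need
# a (free) Google API key; the key and URL below are placeholders. Assumes `requests`.
import requests

API_KEY = "YOUR_API_KEY"
page = "https://www.example.com/"

response = requests.get(
    "https://www.googleapis.com/pagespeedonline/v5/runPagespeed",
    params={"url": page, "strategy": "mobile", "key": API_KEY},
    timeout=60,
)
data = response.json()
score = data["lighthouseResult"]["categories"]["performance"]["score"]
print(f"Mobile performance score for {page}: {score * 100:.0f}/100")
```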
External Link Metrics:
External Link Metrics allows the SEO Spider to pull external link metrics from third-party sources like Majestic, Ahrefs, and Moz APIs. This feature assists in conducting content audits or profiling links for improved linking strategy.
- Importance: Pulls external link metrics from Majestic, Ahrefs, and Moz APIs into a crawl.
- Action: Conducts content audits or profiles links for improved linking strategy.
XML Sitemap Generation:
This feature enables the SEO Spider to create XML Sitemaps and Image XML Sitemaps. It offers advanced configuration options over URLs to include, last modified date, priority, and change frequency.
- Importance: Creates an XML sitemap and an image sitemap using the SEO spider.
- Action: Ensures accurate and up-to-date sitemaps for effective search engine indexing.
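For a feel of what an XML sitemap contains, here’s a standard-library Python sketch that writes a minimal one. The URLs and dates are placeholders; in practice you’d feed in the indexable URLs from your crawl:

```python
# Build a minimal XML sitemap with the standard library. The URLs and dates are
# placeholders - in practice you would feed in the indexable URLs from your crawl.
import xml.etree.ElementTree as ET

pages = [
    ("https://www.example.com/", "2024-01-15"),
    ("https://www.example.com/about", "2024-01-10"),
]

urlset = ET.Element("urlset", xmlns="http://www.sitemaps.org/schemas/sitemap/0.9")
for loc, lastmod in pages:
    url = ET.SubElement(urlset, "url")
    ET.SubElement(url, "loc").text = loc
    ET.SubElement(url, "lastmod").text = lastmod

ET.ElementTree(urlset).write("sitemap.xml", encoding="utf-8", xml_declaration=True)
print("Wrote sitemap.xml")
```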
Custom robots.txt:
The Custom robots.txt feature allows the SEO Spider to download, edit, and test a site’s robots.txt file. It provides customization options to control how search engines crawl and index the website.
- Importance: Downloads, edits, and tests a site’s robots.txt using the new custom robots.txt.
- Action: Customizes robots.txt for specific directives.
Rendered Screen Shots:
This feature fetches and stores the rendered pages crawled by the SEO Spider so you can view and analyze them. It provides visual insight into how pages look after JavaScript execution.
- Importance: Fetches, views, and analyzes rendered pages crawled.
- Action: Enhances visualization and analysis of rendered content.
Store & View HTML & Rendered HTML:
This feature is essential for analyzing the Document Object Model (DOM) of web pages. It stores and allows viewing of both the raw HTML and the rendered HTML of crawled pages.
- Importance: Essential for analyzing the DOM.
- Action: Stores and views HTML and rendered HTML for in-depth analysis.
AMP Crawling & Validation:
The AMP Crawling & Validation feature allows the SEO Spider to crawl Accelerated Mobile Pages (AMP) URLs and validate them using the official integrated AMP Validator.
- Importance: Crawls AMP URLs and validates them using the official integrated AMP Validator.
- Action: Ensures compatibility and adherence to AMP standards.
XML Sitemap Analysis:
This feature allows the SEO Spider to crawl an XML Sitemap independently or as part of a larger crawl. It helps identify missing, non-indexable, and orphan pages listed in the XML Sitemap.
- Importance: Crawls an XML Sitemap independently or part of a crawl.
- Action: Identifies missing, non-indexable, and orphan pages for optimization.
Visualizations:
Visualizations provide graphical representations of internal linking and URL structure using force-directed diagrams and tree graphs. They enhance understanding of the website’s architecture.
- Importance: Analyzes internal linking and URL structure using crawl and directory tree force-directed diagrams and tree graphs.
- Action: Enhances understanding of website structure for optimization.
Structured Data & Validation:
The Structured Data & Validation feature extracts and validates structured data against Schema.org specifications and Google search features.
- Importance: Extracts & validates structured data against Schema.org specifications and Google search features.
- Action: Ensures correct implementation of structured data for enhanced search features.
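Here’s a small Python sketch that finds a page’s JSON-LD blocks and prints their @type. It only confirms the markup is present and parseable; full Schema.org validation is more involved (placeholder URL; requests and beautifulsoup4 assumed):

```python
# Pull any JSON-LD structured data blocks out of a page and show their @type.
# Full validation against Schema.org is more involved - this just confirms the
# markup is present and parseable. Assumes `requests` and `beautifulsoup4`.
import json

import requests
from bs4 import BeautifulSoup

url = "https://www.example.com/recipes/pancakes"   # placeholder
soup = BeautifulSoup(requests.get(url, timeout=10).text, "html.parser")

for script in soup.find_all("script", type="application/ld+json"):
    try:
        data = json.loads(script.string or "")
    except json.JSONDecodeError:
        print("Found a JSON-LD block that does not parse - worth fixing")
        continue
    items = data if isinstance(data, list) else [data]
    for item in items:
        if isinstance(item, dict):
            print("Structured data found:", item.get("@type", "(no @type)"))
```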
Spelling & Grammar:
This feature checks website content for accurate spelling and grammar. It supports over 25 different languages for comprehensive analysis.
- Importance: Spell & grammar check your website in over 25 different languages.
- Action: Enhances content quality with accurate spelling and grammar.
Crawl Comparison:
Crawl Comparison allows the SEO Spider to compare crawl data over time, tracking technical SEO progress and detecting changes in site structure, key elements, and metrics.
- Importance: Compares crawl data to see changes in issues and opportunities.
- Action: Tracks technical SEO progress, compares site structure, and detects changes for ongoing optimization.
By comprehensively reviewing these elements, you’re essentially becoming a digital detective, identifying areas for improvement and ensuring your website operates at its best.