Need to run a quick crawl of a website? Want to visualize the insights fast so you can decide what to do next? In this video, I’ll walk you through how to do a quick, high-level crawl of a site using Screaming Frog SEO Spider. Then we will visualize some key data points with the help of Google Sheets.
Table of Contents
In the latest episode of Hack My Growth, I’m going to walk you through a quick SEO crawl using Screaming Frog SEO Spider, as well as how we can quickly visualize that data using a Google Sheets template. Hey, thanks for checking out this video. If you’ve enjoyed the videos we’ve created on this channel, please hit subscribe, and don’t forget to turn on notifications. We create new content each week to help you get the most out of your digital marketing efforts.
So as I talked about in the opener, we’re going to walk through how to set up a quick SEO crawl using Screaming Frog SEO Spider. This is a really great tool, one we use pretty much every single day here at the agency, and it gives us both a 30,000-foot view of a website and an extremely granular, detailed view. Today we’re talking all about a quick audit. This will let us see a site at a very basic level, but it will help us spot some big or quick wins we could pursue with a client or a prospect.
SEO Crawl with Screaming Frog
All right, let’s get into the content. In this video, I’m going to walk you quickly through an overview of a basic site crawl using Screaming Frog. Screaming Frog SEO Spider is probably one of the most useful tools for any SEO, or really any site owner, who wants to understand more about their website. We’ll do a quick overview of how it works, how you’d set it up, and how you’d run it. Then I’m also going to share a very simple report visualization we can build in Google Sheets.
So the first thing you’ll do in Screaming Frog is add your URL here, and if you just want to go, you can hit Start. But there are a number of ways to run this crawl. You can crawl your whole domain, or if you only want a few pages, you can do that as well. You’ve got different modes here: you can run a list, you can pull data from the search results, and then you can compare. So it’s an extremely useful tool if you want to really understand your website, a specific list of pages, or the search results themselves.
Now in this video, we’re talking specifically about crawling our own website. I like to go in here first and do some configuration. If you’re going to do a deep crawl, you’ll probably want a lot of these options checked.
But if you’re really just looking to get a basic idea of what’s going on with a website, or you’re doing a quick audit, or someone’s asked you, “Hey, do you want to work with our site?”, you don’t have to go super deep on that first go-around to see whether the site is something you want to get involved with.
As you can see, you can crawl all of these different elements. You can crawl different page links, change the crawl behavior, and even crawl the sitemaps if you want to.
There are a number of things you can extract with Screaming Frog as well, which is really cool. You can pull page titles and metas, but you can also pull structured data: JSON-LD, RDFa, and other schema.org markup. So if you want to see whether a site is using structured data, you’ll want these checked. You can also store the HTML.
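If you do store the HTML, you can also check for structured data yourself after the crawl. Here’s a minimal sketch using only Python’s standard library, not any Screaming Frog feature, that pulls JSON-LD blocks out of a saved page; the sample HTML is made up for illustration:

```python
import json
from html.parser import HTMLParser

class JSONLDExtractor(HTMLParser):
    """Collect and parse <script type="application/ld+json"> blocks."""
    def __init__(self):
        super().__init__()
        self._in_jsonld = False
        self._buf = []
        self.blocks = []

    def handle_starttag(self, tag, attrs):
        if tag == "script" and ("type", "application/ld+json") in attrs:
            self._in_jsonld = True

    def handle_data(self, data):
        if self._in_jsonld:
            self._buf.append(data)

    def handle_endtag(self, tag):
        if tag == "script" and self._in_jsonld:
            self._in_jsonld = False
            text = "".join(self._buf).strip()
            self._buf = []
            if text:
                self.blocks.append(json.loads(text))

# Made-up page snippet with one schema.org Article block
page = """<html><head>
<script type="application/ld+json">
{"@context": "https://schema.org", "@type": "Article", "headline": "Quick SEO Crawl"}
</script>
</head><body></body></html>"""

parser = JSONLDExtractor()
parser.feed(page)
types = [block.get("@type") for block in parser.blocks]
```

Running this against stored HTML gives you a quick yes/no on structured data, plus which schema.org types a page declares.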
Here you’d set your limits: how many pages do you want to crawl? Depending on your needs, you could cap a crawl at 50 URLs or let it run much deeper. We’ve done crawls with over 30,000 URLs; it takes a lot of time, but it’s definitely a tool that can handle that kind of work. This is also where you set how many redirects you want to follow, so if a site has a redirect chain going on, you can control how many hops to chase, and how deep into the URL structure you want to go.
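Under the hood, those limit settings describe a bounded breadth-first walk of the internal link graph. Here’s a rough sketch of the idea, using a small made-up in-memory link graph instead of live HTTP requests:

```python
from collections import deque

def crawl(link_graph, start, max_depth, max_urls):
    """Breadth-first traversal with depth and URL-count limits,
    mirroring the kind of crawl limits a tool lets you set."""
    seen = {start: 0}            # url -> click depth at which it was found
    queue = deque([start])
    while queue and len(seen) < max_urls:
        url = queue.popleft()
        depth = seen[url]
        if depth >= max_depth:   # depth limit reached: don't follow links
            continue
        for nxt in link_graph.get(url, []):
            if nxt not in seen and len(seen) < max_urls:
                seen[nxt] = depth + 1
                queue.append(nxt)
    return seen

# Hypothetical internal link structure
site = {
    "/": ["/about", "/blog"],
    "/blog": ["/blog/post-1", "/blog/post-2"],
    "/blog/post-1": ["/contact"],
}

depths = crawl(site, "/", max_depth=2, max_urls=100)
```

With `max_depth=2`, the crawler records the blog posts at depth 2 but never follows their links, so `/contact` is never discovered, exactly how a capped crawl trades completeness for speed.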
Advanced & Preferences
Screaming Frog has a lot of advanced features and preferences as well, and you can set all of those customizations here: all the different rules you want, to really help your crawl do what you want it to do.
Now, I’ve already run a crawl here so we don’t have to wait for one. As you can see up here, we’ve got all the different tabs; there’s a ton of data Screaming Frog pulls. We’ve got the URLs and all the site data, and any time you click one of these links, you get more and more information, as you can see right here, which is really cool. It lets you see pretty much anything about the website.
Over here there’s always a summary, and the summaries always include these nice little visualizations as well. I can look at all the URLs we’ve encountered, at how the crawl data breaks down, and at the security of the site itself. And as you notice over here, the pane also changes: I can look at the different URL structures, the page titles, and all this really cool data I can now use to go optimize.
Let’s say you want to take this data and do some basic analysis from a copy-and-paste standpoint. What you can do is go to this Internal tab and click Export. This builds a CSV file; you can also choose other formats here, Google Sheets or whatnot. Go ahead and save that file, and you’ll be able to copy and paste it into Google Sheets.
And I’m going to share this Google Sheet with you. All you have to do is paste the data here; this is all the internal crawl data. So you’ve got everything: the page type, the status codes of those pages (Are they good? Are they live? Are they being redirected?), the indexability, and the list goes on and on. As you can see here, we’ve got everything from word count to text ratio to crawl depth.
If you want to get this score right here, called Link Score, it’s a Screaming Frog metric for internal links. After you run the crawl, you can go up to Crawl Analysis and hit Start.
What this will do is run another analysis on top of the crawl data, and that’s where you get the Link Score. Link Score is basically an internal PageRank for your pages, calculated by Screaming Frog. So if you want to see how strong a specific page is compared to the rest, like we see here that this page has a 100 while these pages are in the 30s, and as the score drops, those pages carry less weight. Or maybe those are pages that need more internal links, so we can improve their Link Score. It’s an internal link metric, which is pretty cool.
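For intuition, a metric like this sits in the PageRank family, run over the internal link graph only. Screaming Frog doesn’t publish its exact formula, so the sketch below is just an illustrative approximation, scaled so the strongest page scores 100, with a hypothetical three-page site:

```python
def link_score(graph, iterations=50, damping=0.85):
    """Simplified PageRank over an internal link graph -- an
    approximation of what a Link Score-style metric measures,
    NOT Screaming Frog's actual formula."""
    pages = set(graph) | {t for targets in graph.values() for t in targets}
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new = {p: (1 - damping) / n for p in pages}
        for page in pages:
            targets = graph.get(page, [])
            if targets:
                share = damping * rank[page] / len(targets)
                for t in targets:
                    new[t] += share
            else:
                # dangling page: spread its rank across all pages
                for t in pages:
                    new[t] += damping * rank[page] / n
        rank = new
    # scale so the best-linked page scores 100
    top = max(rank.values())
    return {p: round(100 * r / top) for p, r in rank.items()}

# Hypothetical site: both inner pages link back to the homepage
site = {
    "/": ["/about", "/blog"],
    "/about": ["/"],
    "/blog": ["/"],
}
scores = link_score(site)
```

Here the homepage collects links from both other pages, so it scores 100 while the two inner pages score much lower, the same pattern described above, where weakly linked pages are candidates for more internal links.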
The template we built is just a quick overview, so we don’t include Link Score in there. But it’s a nice extra you can add: again, just go to Crawl Analysis and hit Start. As you can see, we’ve got all of this data, and sometimes that’s a lot. When you’re just looking at a website, or somebody comes to you and says, “Hey, can you do a quick audit? I want a high-level view of what’s going on with the website,” you don’t really have time to go through all this data. You just want to see some big things and say, “Okay, there are some opportunities here, or maybe there aren’t.”
So after you paste the data in, you can go over to this visualization tab. What we did here is narrow it down to some of the high-level things we wanted to look at. So we’ve got the URL, the total URLs crawled on this website, the average word count of the pages we crawled, and the indexability status of those pages: are they indexable or not? What are the response codes? Do we have a lot of broken pages? Things like that.
And as you can see, we’ve pulled all the data over here, done entirely with some spreadsheet formulas, and then visualized it. So, okay: we can do some title tag optimization, there’s some meta tag optimization that needs to be done, and we probably need to work on canonicalization. The crawl depth’s pretty good, but there are a lot of pages three and four clicks away. How can we flatten this site out and maybe make it a little better?
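The roll-up those formulas do can be sketched in plain Python as well. The column names and rows below are a made-up stand-in for a Screaming Frog Internal export (real exports have many more columns, and headers may differ), just to show the shape of the calculation:

```python
import csv
import io
from collections import Counter

# Tiny stand-in for an exported "Internal" CSV -- hypothetical data
export = """Address,Status Code,Indexability,Word Count,Crawl Depth
https://example.com/,200,Indexable,540,0
https://example.com/about,200,Indexable,320,1
https://example.com/old-page,301,Non-Indexable,0,1
https://example.com/blog/post,200,Indexable,1200,2
"""

rows = list(csv.DictReader(io.StringIO(export)))

status_counts = Counter(r["Status Code"] for r in rows)
indexability = Counter(r["Indexability"] for r in rows)
indexable = [r for r in rows if r["Indexability"] == "Indexable"]
avg_words = sum(int(r["Word Count"]) for r in indexable) / len(indexable)

summary = {
    "total_urls": len(rows),
    "ok_pages": status_counts["200"],
    "redirects": status_counts["301"],
    "indexable": indexability["Indexable"],
    "avg_word_count": round(avg_words),
}
```

Each entry in `summary` corresponds to one of the high-level cells in the visualization tab: total URLs, response-code counts, indexability, and average word count across indexable pages.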
So this is not your in-depth crawl. It’s not going to tell you everything about the website, because we’re doing this quickly. This is a quick SEO crawl to give us a 1,000-foot view of the website, done really fast with Screaming Frog. Then we built some visualizations to let us say, “Okay, we see some quick opportunities here that we can do something about.” Now you can go back to your crawl, find what those pages are, and really start to put the data back to work for you, so you can implement these changes.
So this was a quick SEO crawl with some visualizations using Google Sheets. I hope you found this helpful. If you’ve got any questions about Google Sheets or maybe Screaming Frog and how you can set that up, please let us know. We’d love to continue that conversation with you. And until next time, happy marketing.