A significant portion of websites run on WordPress. In fact, our own Toronto digital marketing company website, Volt Studios, runs on WordPress. Chances are your site runs on it too, or you'll end up managing SEO for a site that does.
Now, you already know that ranking well can have a serious impact on your bottom line, and that the technical aspects of your site, things like page speed, site markup, URL structure, and image optimization, all contribute significantly to your ability to rank. The challenge is that WordPress offers an endless array of customizations and plugins, so it comes with its own unique set of technical challenges to optimize for SEO.
We've done SEO for some of the highest-traffic Canadian websites on the internet, and many more besides. In this post I'll show you tried and true approaches to increasing your rankings and boosting your traffic, specifically with WordPress. You're going to encounter a different way of managing the technical aspects of SEO, and you'll learn how to reach your audience and grow your traffic by incorporating the best aspects of technical SEO.
Now, technical SEO sounds overwhelming, but don't let the name fool you. You can do this. With what you'll learn in this post, you'll become both a better WordPress developer and a better SEO. So grab your computer, maybe a snack, and get ready to learn. Let's go.
For the purpose of this post, we're going to be using the website bestseoagencyintoronto.ca. This is a fictional website created for instruction, but it replicates a typical recipe website that also offers a book for sale. The site is running WordPress version 5.2.3, hosted on Amazon Lightsail.
I’ve selected Amazon Lightsail for its simplicity of installing WordPress with one click as well as its relative affordability. I also enjoy that Lightsail makes it very easy to quickly deploy load balancers, content delivery networks, as well as SSL certificates. Speaking of SSL, the site is also configured with HTTPS using letsencrypt.org.
Now, for the purpose of this guide, I'm going to expect that you're familiar with working within WordPress. So you've likely built out your site, created a few articles, edited pages, and deployed a few plugins. You should also have a fundamental understanding of SEO and its goals, and you'll need just basic HTML and CSS skills.
Now, the way that I've set up this WordPress site doesn't have to be the way that your site is set up, and you may even be running a different version of WordPress. Most everything we'll cover in this guide translates across versions; it's simply meant to offer an understanding of how these components work with one another, what makes up the technical layer, and the various ways you can personalize your WordPress setup for your SEO goals.
We create custom WordPress sites that are optimized for SEO (Search Engine Optimization) and mobile. Our expert team of web developers in Toronto uses the latest technology to make sure your site looks great on all devices: desktop computers, tablets, and smartphones.
We can also help you with other aspects of your online presence such as social media marketing, email marketing or even creating an app if needed.
All too often when it comes to doing local SEO in Toronto, I see an over-investment in website content. Whether it's blogs, white papers, eBooks, or videos, it's a game that everyone plays, and I think at this stage most everyone doing SEO understands it pretty well.
Find a topic that people are searching for and produce content that's better or has a fresher perspective. But all of that content, well, that's just the bait on the end of your fishing line. You can have the best bait, but if you're fishing with a shoestring and a stick, you're in for a bad day. If you want to compete, you want the best fishing rod, reel, and line, and the knowledge of where and when to fish.
An SEO-friendly WordPress site is in some ways like your fishing rod. It’s the technical foundation for everything you’re going to do in your SEO efforts. When you develop a highly optimized website, you make it easy for search engines to access your content and understand it. An optimized website adheres to both the written and unwritten rules. Everything from the user experience to the organization of the content to the page speed plays a significant role in where you rank.
Build a highly optimized website and you'll gain a real competitive advantage. This optimization, however, is a serious departure from content development. We're going to need a real look at what's going on under the hood, and that includes, among other things, your indexability, root domains, redirects, canonical URLs, page tags, structured data, and even critical rendering paths.
The goal is to find the right balance because truthfully, a website that’s really good for users, well, is really good for SEO. Keep in mind there’s no silver bullet here. You need to embrace as many of these elements as possible. At the very least, improve them beyond what your competition is doing. Nobody maintains top positions forever though, so you’ll have to stay vigilant.
The process of Google and other search engines evaluating your site is done through what’s known as the crawl. A bot will arrive on your site, click on all of your links, and hopefully index your content. But the truth is, we don’t have a lot of insight into what’s going on when Google and others crawl your site.
Fortunately, there's a tool called Screaming Frog SEO Spider that runs on PC, Mac, or even Linux machines and crawls your site just as a search engine would. You'll find it at screamingfrog.co.uk. This is a fantastic tool, used by professional SEOs all over. The ability to automate a crawl and quickly identify problems is invaluable.
Screaming Frog is useful for identifying common SEO issues, evaluating site changes, and you can perform a fairly comprehensive audit of your WordPress site without needing to manually review every single page. The program crawls 500 pages for free, but there is a paid professional edition that has unlimited URL crawls and unlocks all sorts of advanced features. You’re going to see me using this tool quite a bit, and I think it’s worth downloading the free trial.
As you dive deeper into evaluating your WordPress site from a technical perspective, you're likely going to spend quite a bit of time in the Chrome Inspector. I use the Chrome browser for most of my SEO work because its developer tools are incredibly robust, as is its library of extensions. The Inspector is very easy to use.
You can simply right click and choose Inspect. And here we’ll see the Chrome Inspector panel appear on the right-hand side. Now, by default you’re in the elements tab, and this displays the page’s rendered HTML, which is different from the page’s source code.
I want to introduce you to a powerful tool baked directly into Google Chrome that you’re going to use quite a bit as you evaluate the technical makeup of your WordPress website. I’m here on www.example.com. I’m going to right click and choose Inspect.
From the Chrome developer tools, I'm going to choose the double arrows on the right-hand side and select Audits from the dropdown menu (in newer versions of Chrome, this tab is simply called Lighthouse). Here we have the ability to run one of Google's powerful auditing tools, Lighthouse, which powers a lot of Google's testing tools such as PageSpeed Insights. So what we can do really quickly is identify which audit we'd like to run.
We can decide whether we want to simulate any throttling, such as someone on a 4G connection, and then choose to run the audits. At this point, Lighthouse is going to scan our website and present us with an in-depth report of the site's performance metrics.
Google Search Console is a free service offered by Google that helps you monitor and maintain your site's presence in Google search results. It is incredibly powerful and gives you all kinds of useful information, such as which pages of your site have errors, how people are searching for your content, whether your sitemaps are working, and so on.
Google Search Console should be part of every technical SEO toolkit. It gives you such rich insight into what's going on under the hood that it's imperative you have it set up. For the purpose of our time together, I'm not going to go into the specifics of Google Search Console. There are a ton of great resources that can walk you through, in depth, how to use it and how to leverage it for your technical SEO.
SEO plugins for WordPress are both a blessing and a curse. There is an endless amount of plugins available and it’s relatively easy to find an SEO problem that can be solved with a plugin. The problem is that plugins add to the overall bloat of WordPress. And one of the biggest complaints of WordPress is that it can get really bloated, really quickly, which can negatively impact your SEO.
So every time you're faced with an SEO problem, I encourage you to evaluate whether you really need a plugin. Most of what these plugins do can be handled and managed manually.
And it's important to evaluate whether you're simply trying to save time or whether the plugin really does fundamentally shift your SEO in a positive direction. Don't get me wrong, there are plenty of great plugins, and we'll look at some together. But it's important that you don't get into a mindset of overusing plugins. I really encourage you to think lean when it comes to your plugin use, as that will result in an overall lighter and faster WordPress instance.
To dive into technical SEO, we need to start by identifying a do-it-all plugin that we want to use. One of the most popular options is Yoast SEO. It gives you a ton of flexibility over your site's content, how you manage your meta descriptions, and even a way to build out your schema and sitemaps. Yoast is really focused on being friendly to beginners, and most of its advanced functionality is tucked away in a menu. I'm going to install Yoast to show you some of its core features.
So, from our WordPress dashboard, choose Plugins in the left-hand navigation, and then Add New. In the upper right-hand corner, do a search for Yoast, and choose Install Now. We'll go ahead and activate the plugin, and then take a look. Yoast adds a menu option in the bottom left-hand corner, so I'll hover over that and select General. Now, Yoast is free; however, they do have an SEO Premium option.
Here at Volt Studios, we like to start every SEO project by crawling the site, and we use Screaming Frog SEO Spider to do just that. This tool allows me to quickly identify how the site's structured, all of the meta descriptions, the meta keywords, which heading tags are being used, and so on. So I'll start by adding our URL in the top bar here, and then I'll select Start. It's this output that we'll use to begin understanding which areas of our site we'll need to improve.
Let’s take a look at this together. So in the upper left-hand corner, you’ll see that I’m on the internal tab, and this is showing me all of the internal pages to our site. The external tab will show us all of the calls that are being made to pages outside of our domain.
From here we have a preliminary understanding of what's going on under the hood, and here on the right-hand side we can start to understand each of the SEO elements, which Screaming Frog breaks down into areas to help us navigate. We see the internal elements, and we started with our HTML, but we could evaluate each section here. We have the external elements, protocol, and our response codes, and this is where we begin our true technical audit. We can start by evaluating anything that's been blocked by robots.txt.
This is helpful in understanding if our content is not going to be indexed by Google. We can see all of our redirections, we could see if we had any 404 errors that needed to be resolved. We can see if we have issues with our URLs, and then, our page titles. Here we can see where we have duplicate page titles, short page titles, long page titles, and so on. Each of these areas represents something that we’d like to review and evaluate. This tool provides a ton of incredible resources, so I encourage you to evaluate your own site using it, and see what you discover.
Conduct a crawl using Screaming Frog and start by evaluating the page titles. The components within the search results are your title tag, your URL, and then the meta description. Now, Google doesn't always use the meta description that you provide; oftentimes it generates its own using the content on your site. But we can guide it by providing that meta description.
In Screaming Frog SEO Spider, after we've done a crawl, head to the Meta Description tab. It's important that each page has a unique meta description. Google often uses the meta description as the short text visible in the search results, a sample of which we can see towards the bottom of the screen. It's important that you don't just list keywords.
You want to create a meaningful sentence or two that entices the visitor to click. The default meta description that WordPress generates is really not that great: it has stray capitalization, it pulls in the text of buttons included on the page, like Buy Now and Read the Blog, and overall it's just not all that compelling.
Additionally, we can see many instances where meta descriptions are missing, and we can confirm that here along the right-hand side. A fundamental piece of managing your technical SEO is making sure that all of the pages on your website are effectively mapped to your target keywords.
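To sketch what a crawler is reading on each page, here's a small shell snippet that pulls the title and meta description out of a page's HTML. The HTML below is a stand-in; against a live page you would feed the same sed commands the output of `curl -s https://example.com`:

```shell
# Stand-in HTML; a real page would come from curl.
html='<html><head>
<title>Mixed Bean Soup Recipe</title>
<meta name="description" content="A hearty mixed bean soup ready in 30 minutes.">
</head><body></body></html>'

# sed with ':' as the delimiter so the '/' in </title> needs no escaping.
title=$(printf '%s\n' "$html" | sed -n 's:.*<title>\(.*\)</title>.*:\1:p')
desc=$(printf '%s\n' "$html" | sed -n 's:.*name="description" content="\([^"]*\)".*:\1:p')

echo "Title: $title"
echo "Description: $desc"
```

If either variable comes back empty on a real page, that's exactly the kind of missing-tag finding Screaming Frog flags.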
An important aspect of the technical structure of your WordPress website is your use of heading tags. These tags, known as H1, H2, H3, and so on, help Google understand the importance of the content on your page. You always want your heading tags to appear in order on your page, and you should only ever have one H1 tag per page.
And you also want to make sure that you avoid using your heading one tag to say things like about us, or contact us. Instead, you want to focus on the core topic for the page. Essentially, identify that foundational keyword that you’re building your content around.
Run a crawl using Screaming Frog and select the H1 tab towards the top of the screen. This allows us to evaluate the heading one tags being used for each page or article on our WordPress website.
Heading tags are an incredibly important component of our overall technical SEO strategy. Now, with many crawl tools such as Screaming Frog SEO Spider, you have the ability to review your heading tags, but you're only looking at heading one and heading two. To get a better view of our heading tag structure, I'm going to use the Inspector within Google Chrome.
Right click on the page and choose Inspect. Start out in the Elements view, press Command + F (or Control + F on a PC), and begin by typing an opening angle bracket followed by h1, so <h1. The reason we add the opening bracket is that just about every heading one tag starts with <h1; if we simply search for h1, we're likely to find matches within embedded CSS as well.
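The same `<h1` search can be scripted. A minimal sketch with stand-in HTML; on a live page you would pipe `curl -s https://example.com` through the same grep:

```shell
# Stand-in HTML with one h1 and two h2s.
html='<html><body>
<h1 class="entry-title">Mixed Bean Soup</h1>
<h2>Ingredients</h2>
<h2>Directions</h2>
</body></html>'

# grep -c counts matching lines; exactly one line should contain "<h1".
h1_count=$(printf '%s\n' "$html" | grep -c '<h1')
echo "h1 tags: $h1_count"
```

Anything other than 1 here is worth investigating, since we only ever want one H1 per page.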
As you look to explore your WordPress site architecture, you’re going to want to spend some time with your permalink settings. And you’ll get here from within your WordPress dashboard by selecting Settings, and then Permalinks from the left-hand navigation.
Now, by default, WordPress comes configured with the plain permalink settings. And a permalink is really the permanent address of the piece of content that is associated to that URL. So each page or article that you create will get a new permanent link.
So by default, the plain permalink is what’s configured, and this is the least user-friendly. In this case, WordPress simply assigns an ID and calls it good. And this does nothing for your user or for Google, as it provides no context. Now, a very popular option is the Month and name, or the Day and name options.
And these include the dates. And you’ll see this on a lot of older news websites, or older blogs that are updating their content frequently. And this helps provide context to the end-user, as well as to Google.
However, with the addition of structured data that allows us to tell Google the month, day, and year that an article was published within that structured data, we don’t need to use the folder structure to provide that context. And this unnecessarily makes the URL longer, which is actually less enticing to a user.
We also have Numeric, which is a variant of Plain that adds an archives component to the URL. And then we have what is by far the best option for SEO: Post name. This allows you to give each page a clean, simple, plain-English, user-friendly URL. This is called "the slug," and each article that you publish will have a customizable slug that is appended to the URL. Now, there are some instances where you want to deviate from Post name.
And that's if you have a very category-driven website. So, if you run a blog whose categories are incredibly important to the context of the material, or if you run, say, an e-commerce site with products that live within categories, you're going to want to adjust this structure.
And you can do that by selecting Custom Structure and then before Post-name, you’ll select the category tag to insert it. And what this’ll do is it’ll create category and then post-name. Now if you are using WooCommerce or another e-commerce plug-in, you likely will have the capability to adjust your product permalinks.
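As a sketch, the Custom Structure field for that category-plus-slug layout would contain the standard WordPress structure tags:

```
/%category%/%postname%/
```

WordPress substitutes each tag at request time, so a post titled "Mixed Bean Soup" in a Soups category would resolve to /soups/mixed-bean-soup/.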
By default, this is set up with “products” as the category, but you can switch that to “shop,” or “shop with product category.” Now, there’s not really a right or wrong answer within the e-commerce context.
It really depends on the makeup of your site and the way that you categorize your content. But above all, try to identify the simplest and shortest permalink structure to maximize your SEO site architecture.
A fairly common scenario within any SEO effort is changing a URL. Let’s say, for example, our article on bean soup has the URL including bean-soup. And let’s say that we’ve done some SEO research and we’ve decided this article will perform better if we named it mixed bean soup.
Install the Redirection plugin, and set up a very basic manual redirect that takes one permalink structure, 2019/08/chicken-dumpling, and redirects it to our new permalink structure, which is simply chicken-dumpling as the slug.
Now this is fine if you simply want to manage a handful of redirects, but if you're changing the entire permalink structure for your website, you're going to want to automatically generate those 301 redirects for every single page that exists.
And to do that, you’re going to use a regular expression within the source URL, and that regular expression is essentially going to pattern match what we have in our old permalink structure.
A properly configured website is going to have a robots.txt file. And this file is really important, because it creates the set of instructions that crawlers use when they arrive on your website. And these instructions indicate what the crawler should crawl and what they should not crawl.
Essentially, it disallows or allows certain behavior. To show you the robots.txt file for example.com, I'll simply go to example.com/robots.txt. All robots.txt files live at the same destination, and the filename is case sensitive: it's always all lowercase, starting with that lowercase r in robots.
Here's the robots.txt file for example.com, and this is the default set of directives that WordPress provides. It's saying the user agent, that's the piece of software crawling the website, is an asterisk, which means all user agents must follow the directive below. And the directive says you're not allowed to crawl the folder wp-admin. That's the folder we use to log in and administer everything we're doing in WordPress, so it makes sense that we don't want crawlers going through all of those URLs.
It says you’re allowed, however, to visit one particular URL within that folder, admin-ajax. Now, it’s important to know that disallowing pages or subdirectories isn’t a security feature, this isn’t going to prevent people from accessing this content.
This just tells the robots to not waste their time crawling it. And this is really important, because a particular crawler, say Google, has a quota, an allotment of time that it’s going to dedicate to crawling your website. And once that time has elapsed, it’s done and it leaves.
So, if the crawler wastes time visiting content that is never going to be relevant for what people are searching for, it’s content that you’re never going to send traffic to, there’s no point in having Google, or any other crawler, visit that content. And that is why we use a robots.txt file.
We use it to provide directives, the instructions that we want the crawler to follow. Now, you’ll also notice that within the robots.txt file we provide the sitemap, and this sitemap will provide a list of all the pages that the website wants crawled. Let’s take a closer look at some sample robots.txt files and go through them together.
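For reference, the default WordPress directives described above, with a sitemap line appended the way many SEO plugins do, look like this; the sitemap URL is a placeholder for your own domain:

```
User-agent: *
Disallow: /wp-admin/
Allow: /wp-admin/admin-ajax.php

Sitemap: https://example.com/sitemap.xml
```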
For many large sites, they will provide different directives to crawlers that are running crawls for serving ads versus, say Google, and this is where you often would want to differentiate between two user agents. You want Google to access all of the content that’s relevant to them, but your advertisements might serve on every piece of content and you want that crawler to have unlimited access to particular areas of the website.
So, then we have Disallow followed by the path that you do not want crawled. By default, anything that's not in the disallow list is allowed, so we don't have to explicitly call out Allow. Here's an example of what you would see if you were blocking all crawlers from all content, and this problem comes up quite a bit.
If you identify that your site is not being indexed by Google, you’ll want to evaluate whether your robots.txt file has this directive. This is saying you don’t want anything crawled. You likely see this on development websites, so if you have staging.yoursite.com, you would disallow all crawling. And if that robots.txt file accidentally gets replicated to your live production site, well you’ll have a problem.
Another very common way that robots.txt is used out of the box is to simply provide every user agent the ability to crawl the entire site, and disallow is left blank. You’ll provide the location to your sitemap with Sitemap: and then the URL to that sitemap, typically, /sitemap.xml.
Now, it is important that you maintain case sensitivity and the space after the colon. Now, one of the most common mistakes that I encounter when reviewing robots.txt is providing multiple directives to a user agent at the same time. So, let’s say that we had directives for msnbot and Googlebot and they were identical directives, so we simply stack the user agents, User Agent A, User Agent B, Disallow path, Disallow path2.
This may actually create some scenarios by which the crawler gets confused. Perhaps we wanted User Agent B to only disallow path2, and User Agent A to only disallow path. In this case, it could be User Agent A only disallows both or User Agent B disallows both or a number of other ways that it can be incorrectly interpreted.
You see, crawlers aren’t always the smartest, they follow a very rudimentary set of rules. So, a better way to manage this is to always talk to one user agent at a time. So, if we wanted User Agent A to disallow both of these paths, we’d set it up as such, or we would simply add in the other user agent and disallow that path explicitly underneath that user agent.
So, when you set up your robots.txt file, it should always go user agent, path, and you can have as many disallows as you need. So, we could disallow path3, path4, and so on. Your robots.txt file is incredibly important for your SEO efforts, so take time to make sure that you’re using a robots.txt file, and evaluate that you’re using it in a way that is most effective for your SEO goal.
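A sketch of that one-user-agent-at-a-time layout, using the placeholder paths from the example above; if both crawlers should skip both paths, spell it out explicitly under each agent:

```
User-agent: msnbot
Disallow: /path
Disallow: /path2

User-agent: Googlebot
Disallow: /path
Disallow: /path2
```

Each block now addresses exactly one user agent, so there's no ambiguity about which rules apply to whom.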
Let's talk through a scenario where we create a robots.txt file using a plugin for WordPress. Let's say I have a website, example.com, and it has search functionality. I'm going to search for spinach, and you'll notice in the address bar that the search results URL contains ?s=spinach. This ?s parameter denotes a site search. Let's say we do not want Google to index or crawl any of our site search content, so we're going to disallow /?s. A relatively straightforward way to do this is to use what's known as a virtual robots.txt file.
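The resulting file only needs the one directive. Because robots.txt rules are prefix matches against the path plus query string, /?s covers every internal search URL such as /?s=spinach:

```
User-agent: *
Disallow: /?s
```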
A sitemap lays out all of the content on your site as a way to indicate to both search engine spiders and sometimes visitors to your site, where all of the information exists. Now there are traditionally two types of sitemaps. The HTML sitemap and the XML sitemap.
The HTML sitemap is usually for your users and the XML sitemap is used for crawlers. So I’m here on Lynda.com and I want to give you an example of both. If I scroll to the bottom of this site, you’ll see a link to Site Map. And this takes me to a page that allows me to navigate the categories and topics, a variety of sections such as All Courses, All Subjects, Languages, and so on.
This is an HTML sitemap. It's here to help users navigate a large website. The sitemap that is most important when it comes to improving your SEO is your XML sitemap, and traditionally you find that at /sitemap.xml.
So I’d like to create a sitemap for this WordPress website. Now, if I go to /sitemap.xml, I’m going to be met with a 404, and this means I don’t have a sitemap set up or I’m not using the appropriate web address for my sitemap on this site.
Now, in this case, we don't have a sitemap set up, so let's create one together. The easiest way to do that is with a WordPress plugin. So in the WordPress dashboard, start by finding the Rank Math plugin and then choose Dashboard from the menu option on the right-hand side. Rank Math is a do-it-all SEO plugin, and within it is the ability to enable sitemaps.
So I'll scroll down on this page until I find Sitemaps and then toggle them on to enable them. Now we can see that the sitemap is active. If we toggle back to our website and refresh the sitemap URL, we can see it.
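A quick sanity check on a freshly enabled sitemap is to confirm it actually lists URLs. The XML below is a stand-in; on a live site you would run the same grep against `curl -s https://example.com/sitemap.xml`:

```shell
# Stand-in sitemap XML; a real one would come from curl.
sitemap='<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url><loc>https://example.com/</loc></url>
  <url><loc>https://example.com/mixed-bean-soup/</loc></url>
</urlset>'

# Count the <loc> entries to confirm the sitemap lists pages.
loc_count=$(printf '%s\n' "$sitemap" | grep -c '<loc>')
echo "URLs in sitemap: $loc_count"
```

A count of zero usually means the plugin generated an empty index, or you're looking at the wrong sitemap URL.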
It’s really easy to get tunnel vision when we’re doing technical SEO. We spend so much time within the WordPress dashboard configuring every little detail that we often forget to zoom out and consider where the WordPress site is hosted.
You see, a host can make all the difference between a great-performing website and a mediocre one. When it comes to SEO performance, speed is what really matters. According to Google, over 50% of mobile visitors will leave a page that takes longer than three seconds to load.
And if we've done everything we can in our optimization efforts, it comes down to our web host: how quickly can they deliver the content from the server to the user? So much goes into the speed of that process.
There’s the actual server hardware. It could be shared hardware, meaning there’s lots of other websites on that same piece of hardware, therefore everybody is slowing it down…
When it comes to furthering your technical SEO and increasing the speed at which your website operates, you're going to want to set up a Content Delivery Network. A Content Delivery Network, or CDN for short, allows the assets delivered on your website to load much more quickly for the end user. Your HTML, your style sheets, your images, any video files, and so on are all cached and served from a web server that is closer to the end user. So the idea is, your web host might be, say, in Toronto, and your user is in Brampton.
What a content delivery network does, is it takes any static content from that server in Toronto, and stores it in a server in Brampton. So when the user in Brampton goes to visit your website, they will receive the content that is statically served local to them. And the key idea of a content delivery network is that these servers are distributed all over.
An incredibly popular and easy-to-set-up content delivery network is Cloudflare. I'm here on cloudflare.com, and I've selected Pricing to show you the options that are available. You'll see that they have a free version, which means there's no excuse not to set this up.
What I'd like to do is show you how to set up Cloudflare with WordPress, because this is a very common scenario, and it's also one of the easiest to get configured. I'll start by going into our WordPress Dashboard. By far the easiest way to set up Cloudflare is with a plugin, so we'll start by going to Plugins in the bottom left-hand corner and choosing Add New.
In the upper right hand corner, I’ll search for Cloudflare. And here on the right hand side, I’ll choose Install Now, and then Activate. Next, we’ll need to configure the settings for Cloudflare, so I’ll select Settings from the Plugin menu.
I want to briefly discuss the structure of a domain, as well as the use of subdomains, as this is a relatively hot topic in the world of SEO. And given that it's so widely debated, it's probably worth the time to really understand what's going on. So let's talk about what a domain really is. A domain has three parts. There's the Top Level Domain, or TLD, which is the .com, the .net, the .org.
Then you have the domain name, or sometimes referred to as the root domain, which would be example.com. It includes both the domain and the top level domain. The sub domain is the prefix just before the domain name.
So www (dub-dub-dub) is considered a subdomain, just as blog., store., and so on are, and then the HTTP or HTTPS is the protocol. And this is important, because there are two questions that come up quite a bit: should I be using subdomains or subfolders, say, blog.example.com versus example.com/blog? And should my site live at www or non-www?
Once you've decided whether your domain name will use www or not, it's really important that you make sure everything remains consistent. This is because if both URLs resolve and neither one redirects to the other, then as far as Google's concerned you have duplicate content. You have two completely unique websites.
To put this into perspective, let's take a look at example.com. Here I'm at www.example.com, and if I remove the www and go straight to https://example.com, you'll notice that the website does not redirect me anywhere. This means that Google sees two duplicate versions, and that can negatively impact my SEO. Now, to really test this, let me show you how we can see what's going on under the hood. I'm going to open up a new terminal window on the Mac. You can do that with command…
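The under-the-hood check can be sketched in the terminal. In practice you'd capture headers with `curl -sI https://www.example.com`; the header blob below is a stand-in so the snippet is self-contained:

```shell
# Pull the status code (second field of the first line) from raw headers.
status_of() { printf '%s\n' "$1" | awk 'NR==1 {print $2}'; }

# Stand-in for the output of: curl -sI https://www.example.com
www_headers='HTTP/1.1 301 Moved Permanently
Location: https://example.com/'

status_of "$www_headers"
```

A 301 here is what we want to see; if both the www and non-www hosts report 200, Google is looking at two duplicate sites.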
With my website, voltstudios.ca, we've identified that www and non-www are both returning a 200 response. What we want to see is one of these 301 redirecting to the other; in this case, I would like my www to redirect to my non-www.
As a personal preference, I like shorter URLs, and I often choose to remove www on the sites I’m creating from scratch. However, if you have an established website and you’ve been primarily pushing www, but you’ve identified that non-www is also returning a 200, you can simply 301 to the www version. The best way to do this is by adding a rewrite rule to your .htaccess file.
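For reference, a typical rule for the non-www preference looks like this (assuming Apache with mod_rewrite enabled; flip the condition around if you’re standardizing on www instead):

```apache
# .htaccess — 301 any www request to the bare domain
RewriteEngine On
RewriteCond %{HTTP_HOST} ^www\.(.+)$ [NC]
RewriteRule ^ https://%1%{REQUEST_URI} [L,R=301]
```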
Unintentional content duplication due to improper URL handling is one of the more common SEO issues I come across. And it can hinder your SEO efforts dramatically. And it often goes unnoticed because to an outside observer the site works fine.
We often don’t really pay much attention to what happens in the address bar as we click around on links or when visiting a website directly. If we forget a trailing slash, mix in a few stray capital letters as we type in a URL, or even ignore adding in, say, www before that root domain, we often still arrive where we expected to be. But behind the scenes, the server is essentially saying, hey, what you asked for is unique and I found it. And this crops up in a myriad of ways.
When it comes to configuring canonical URLs with WordPress, you don’t have to use a plugin to pull this off. In fact, it’s relatively easy to set up canonical URLs manually, and I’m going to show you how. Let’s say we’ve launched a campaign and we’re driving traffic with tracking parameters attached to our URL. Without canonicals, each parameterized URL is going to be seen as unique, making it a duplicate of the page without the parameters attached. Now, there are a few ways to evaluate if canonicals are in use.
One of the easiest is to arrive at the page, and in Chrome, right-click and choose Inspect. From the Elements pane, I’m going to press Command + F or, on a PC, Control + F, and search for canonical. Here we can see that there are no instances of canonical being used, so we know we don’t have a canonical URL for this page. Another way would be to use a crawl tool such as Screaming Frog, which will also let you know…
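If you prefer the command line, you can pull the canonical tag out of a page’s HTML directly. The HTML string below is a hypothetical page head so the example is self-contained; with a live site you’d feed the helper `curl -s https://example.com/some-post/` instead:

```shell
# extract_canonical prints the href of any rel="canonical" link on stdin
extract_canonical() {
  grep -o '<link[^>]*rel="canonical"[^>]*>' \
    | sed 's/.*href="\([^"]*\)".*/\1/'
}

# Canned page head; real usage: curl -s "$url" | extract_canonical
html='<head><link rel="canonical" href="https://example.com/cornbread-recipe/" /></head>'
printf '%s' "$html" | extract_canonical
# -> https://example.com/cornbread-recipe/
```

No output means no canonical tag, the same conclusion as the Inspect search above.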
If you’re looking to manage your canonicals automatically, then there are many plugins that can handle this. However, if you’re using a do-it-all SEO plugin such as Yoast or RankMath, it’s all handled for you right out of the box. Now, I have RankMath installed on this WordPress website, and if I pull up a post, you can see all of RankMath’s settings appear as I scroll down to the bottom.
If I select Advanced, you’ll see the Canonical URL section. Now, this will always default to whatever the permalink is set to. You can confirm that by comparing it to the URL slug in the permalink block. But you can change it if, for whatever reason, you had this content duplicated elsewhere and you wanted to make sure it was canonicalized.
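If you’d rather not run an SEO plugin at all, the manual setup mentioned earlier comes down to a few lines in your theme’s header.php. This is a sketch: `wp_get_canonical_url()` ships with WordPress 4.6+ and returns `false` outside of singular posts and pages, which is why we guard the output.

```php
<?php // In header.php, inside the <head> element
$canonical = wp_get_canonical_url();
if ( $canonical ) : ?>
  <link rel="canonical" href="<?php echo esc_url( $canonical ); ?>" />
<?php endif; ?>
```

Only add this manually if no SEO plugin is active, since RankMath and Yoast already output the tag and duplicating it is counterproductive.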
When you use structured data to mark up your content, you’re giving Google a clear way to understand that information, as well as its context. This markup helps Google understand whether a piece of content is a breadcrumb, an address for a local business, a map, a review of a product, or even the ingredients within a recipe. When you use structured data, you’re not only creating a highly optimized site, but you’re also making certain pages eligible for inclusion in rich results.
To show you an example, let’s do a Google search for cornbread recipe. What we see right away are some rich results. At the top of the page we have a carousel of recipes, and each recipe features a photo, the name of the recipe, where the recipe is from, a review rating, how long it takes to cook, and a snippet of the ingredients. As we scroll down, we see that play out in the search results as well.
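Those rich results are powered by markup like the following, a trimmed Recipe example in Google’s preferred JSON-LD format. The values here are invented for illustration:

```json
{
  "@context": "https://schema.org",
  "@type": "Recipe",
  "name": "Skillet Cornbread",
  "image": "https://bestseoagencyintoronto.ca/images/cornbread.jpg",
  "author": { "@type": "Person", "name": "Jane Baker" },
  "totalTime": "PT45M",
  "recipeIngredient": ["1 cup cornmeal", "1 cup flour", "2 eggs"],
  "aggregateRating": { "@type": "AggregateRating", "ratingValue": "4.8", "ratingCount": "132" }
}
```

In a template, this sits inside a `<script type="application/ld+json">` tag; plugins like RankMath generate it for you, which is what we’ll set up next.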
This is our area of expertise. Looking for a local SEO company in Toronto? Volt Studios is here to help.
One of the easiest ways to implement structured data in WordPress is through the use of a plugin. Now, my personal favorite for managing structured data comes courtesy of RankMath. In the RankMath dashboard, scroll down to the Rich Snippets feature and toggle it on. From here, choose Settings.
Now, the default rich snippet settings are nested within Titles & Meta; you can tell because Titles & Meta is highlighted as a sub-menu item. Along the right-hand side, there’s the option for Rich Snippet Type.
This allows us to set a default type for our rich snippets. We can change this per post, but you’ll want to set a default to make your life a little easier. From here, I’ll choose Save Changes in the bottom right-hand corner, and then we can work with the specific structured…
One of the reasons WordPress is so popular is because it’s relatively easy to use. And part of that ease of use is how simple it is to add photos to your posts and pages. But when it’s so easy to add a photo, we often don’t think about where we’re getting that image and whether or not we’ve optimized that image.
Since we can simply select the file off of our computer, very little thought goes into compressing images. And when we don’t compress images, we run the risk of having very large files that take a long time to download for our end user.
So I want to show you how we can review image sizes within our WordPress site through a couple of techniques. Let’s start by visiting the website; I’ll right-click within Chrome and choose Inspect.
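You can also check image weight server-side. This sketch lists any JPEG or PNG over 200 KB under a given directory; the path and the 200 KB threshold are just illustrative defaults, so adjust them for your install:

```shell
# list_heavy_images prints every .jpg/.png under $1 larger than 200 KB
list_heavy_images() {
  find "$1" -type f \( -iname '*.jpg' -o -iname '*.png' \) -size +200k
}

# Usage, from the WordPress root:
#   list_heavy_images wp-content/uploads
```

Anything this prints is a candidate for compression with the plugin we install next.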
Another option is to run a crawl using Screaming Frog and select the Images tab, where we can see all of the image sizes showing up within the crawl. Now, let’s see what kind of improvement we can get by using a plugin to handle this optimization for us. I’ll toggle back over to our WordPress dashboard, select Plugins from the left-hand menu, and then Add New.
One of the most popular tools to compress images within WordPress is Smush. So I’ll do a search for Smush, and here we can see Smush Image Optimization. With over a million active installations, this is an incredibly popular tool that’s been well tested. Choose Install Now in the upper right-hand corner, and then Activate. Scroll down the plugin list and select Settings below Smush. So already, it’s…
As part of your evaluation of the technical makeup of your WordPress SEO, you’ll want to be sure that all of your images are using alt attributes. These are incredibly important for accessibility, and they also give search engines the ability to understand the content of an image, since there’s no text for them to read. The quickest way to identify your alt tags is to run a crawl, and I do that using Screaming Frog.
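A quick command-line spot check works too. This helper flags any `<img>` tag that lacks an alt attribute; the HTML here is a canned sample, but you can pipe `curl -s` output from a real page into it instead:

```shell
# missing_alt prints any <img> tag on stdin that has no alt attribute
missing_alt() { grep -o '<img[^>]*>' | grep -v 'alt='; }

# Canned sample; real usage: curl -s "$url" | missing_alt
sample='<img src="pie.jpg" alt="Pecan pie"><p>Hello</p><img src="logo.png">'
printf '%s' "$sample" | missing_alt
# -> <img src="logo.png">
```

Each tag it prints is an image that needs alt text written for it.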
We’re now in a mobile-first SEO era, and that means it’s all about your mobile site, its performance, and how it’s navigated by your user. This makes it imperative that you evaluate whether your webpage is mobile friendly per Google’s guidelines. It’s relatively straightforward to do. I’ve done a Google search for Google Mobile Friendly Test, which brought me to the tool, and we simply need to enter our URL. What Google has done is conduct a very simple test to evaluate whether the page is mobile friendly. On the right-hand side, it shows a rendering of what the page looks like on a mobile device, which helps immediately identify any problems. And what’s nice is that we can see we’ve passed.
When I load the example website you’ll see that there are some animations that bring content onto the page. Now personally, I think animations like this are unnecessary, and overall, they can cause the perception that a site is loading slowly. Additionally, most of them aren’t mobile friendly, and in some situations they actually do cause the site to load slower. I encourage you to evaluate the animation use on your site, and determine whether or not it can be removed. A great resource to evaluate animations is within Chrome Inspector.
As you begin to dig deeper into your technical SEO, it’s important to start measuring overall site performance. Now, there are a variety of tools you can use to do this, but my favorite is baked right into Google Chrome. Be sure to use mobile emulation during your audits; since Google is mobile first, that’s imperative.
If you want to get serious about your technical performance and further increase your page speed, you’re going to want to utilize a cache. A page cache essentially serves a static asset that has already been built, instead of a page that’s dynamically generated on every request.
Here’s what this means: WordPress is built on top of a database, so when we request a page, WordPress has to fill in all the elements of that page. Where does the title come from? Where does the content come from? Each of those is a call to the database, which pulls in each of those assets. If we instead serve up a static snapshot of the page, we don’t need to dynamically generate any of that content.
We build all of that content one time, then we take that static asset and serve it from the cache. The first user who arrives does all the dirty work of having the database queried and the page assembled, and then every user thereafter gets the cached…
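You can verify that a cache is actually serving hits by looking at response headers. The helper below scans for a few common cache signatures (x-cache, cf-cache-status, age); the header text is canned so the example is self-contained, but `curl -sI https://yoursite.com | cache_headers` is the real-world use:

```shell
# cache_headers prints any header lines that indicate a cache is in play
cache_headers() {
  tr '[:upper:]' '[:lower:]' | grep -E '^(x-cache|cf-cache-status|age):'
}

# Simulated response headers from a cached page:
printf 'HTTP/1.1 200 OK\nX-Cache: HIT\nAge: 142\n' | cache_headers
# -> x-cache: hit
# -> age: 142
```

An x-cache of HIT, or a nonzero Age, tells you the response came from the cache rather than a fresh database-driven render. Caching plugins vary in the exact headers they emit, so check your own plugin’s documentation for its signature.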
I know we’ve covered a lot, but I encourage you to continue exploring the SEO landscape; you can still dive deeper. Your next step might be to read our advanced SEO articles, coming soon. I have an advanced SEO post on search factors, and one on building an SEO-friendly website. There you can go deeper into how people search and what goes on behind the scenes with a search engine.
You’ll learn the key building blocks you want to have in place, and how machine learning makes it all possible. Plus, I’ll go deeper into how to capture Google featured snippets, and how to mine results from Google to unlock new ideas and opportunities. And be sure to revisit this post from time to time to brush up on your skills as you continue to explore the world of Toronto SEO.