You make a website, you search for it, you don’t find it. It’s annoying, isn’t it? Find out why this can happen, what causes the problem, and how to get your website found in Google search results: specific problems, their effects, and how to fix them.
Google search basics
Google needs to know about your site. It needs to have read your website’s pages. Until it’s done this, your site won’t be found in Google.
Your website hasn’t been indexed by Google.
If you haven’t requested that your website be indexed by Google, your site isn’t likely to be listed in Google’s database, and therefore won’t be found in search engine results. A lot of people think this happens automatically, and in some cases it will, but in other cases (especially for new domains) it can take a long time for Google to organically crawl your website.
You can test if Google has indexed your site by typing site:yourwebsite.com into Google (obviously you have to replace the yourwebsite.com part with your website’s address). If I were checking our website, I’d type site:netnerd.com into Google search.
If you don’t see any results, Google hasn’t indexed your site. If you do see results, Google has indexed your site and you have some other problem that’s stopping your site being found in search engine results.
How to fix a website that hasn’t been indexed by Google.
Register for a Google Search Console (formerly Google Webmaster Tools) account, then use it to request that Google index your website. You can find more detailed instructions covering how to do this in this blog post about how to add your website to Google search.
On page factors that affect rankings
Factors that are specific to your website are called “on page factors”. These are aspects that you control within the website itself: things that you or your developer do on, or to, your website.
Website content
The words on your website are really what’s used to tell Google what topic your site is about. Using these words clearly, effectively and in a manner that’s useful to Google can influence how your website appears in Google search results.
Google can’t work out what your website is about
The other way of saying this would be something like “your website lacks topical clarity”.
Computers aren’t really that intuitive, so you often have to spell things out for them. This includes Google. Yes, you have to literally spell it out, using words. These words need to be placed in the page content that humans see (in paragraphs and headings), and also in page content that humans don’t see (such as alt tags on images, and meta titles).
For example’s sake, let’s say you make a website about hamsters. To brighten things up you use phrases like “fluffy bundles of fun”, “cute little critters” or “Syrian rodent joy”, but you don’t use the word hamsters so much. The effect that this has is that your topic (hamsters) is effectively diluted. If you want your website to be found for the word “hamsters”, then you need to use the word hamsters to a sufficient degree on your website.
You can find out about the various places that you need to add these topical words to in this section of our introduction to search engine optimisation guide.
You can check the density of specific words on your website’s pages using keyword density checker tools, and SEO checker tools can be used to check that topical keywords are being put in the right places.
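If you’d rather script this kind of check yourself, here’s a minimal sketch in Python (assuming the requests and beautifulsoup4 packages are installed; the URL and keyword are placeholders). A dedicated SEO tool will do a far more thorough job, but this shows the idea:

```python
# Rough keyword density and placement check for a single page.
# Assumes: pip install requests beautifulsoup4
# The URL and keyword below are placeholders for your own page and topic word.
import re
import requests
from bs4 import BeautifulSoup

URL = "https://example.com/"
KEYWORD = "hamsters"

html = requests.get(URL, timeout=10).text
soup = BeautifulSoup(html, "html.parser")

# Density within the visible text
words = re.findall(r"[a-z']+", soup.get_text(" ").lower())
count = words.count(KEYWORD.lower())
density = (count / len(words) * 100) if words else 0
print(f"'{KEYWORD}' appears {count} times ({density:.1f}% of {len(words)} words)")

# Placement checks: title, headings and image alt text
title = soup.title.string if soup.title and soup.title.string else ""
print("In <title>:", KEYWORD.lower() in title.lower())
print("In headings:", any(KEYWORD.lower() in h.get_text().lower()
                          for h in soup.find_all(["h1", "h2", "h3"])))
print("In image alt text:", any(KEYWORD.lower() in (img.get("alt") or "").lower()
                                for img in soup.find_all("img")))
```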
Your website doesn’t demonstrate topical authority
This is related to the section above. Topical authority is a bit like your website’s topic, but taken a step further. Often this is referred to as EEAT, which stands for Experience, Expertise, Authoritativeness, and Trustworthiness. It’s a concept from Google’s Search Quality Rater Guidelines which are used to assess the credibility and quality of online content.
At this point, it helps to start thinking a bit like Google. Google’s product is “relevant search results”. You search for something, Google lists the results, and the stuff near the top (excluding sponsored links) leads to somewhere you can find what you’ve searched for. Google being good at providing links to “somewhere you can find what you’ve searched for” is what keeps people using Google!
For your website to be found, you need to demonstrate to Google that your website contains information that will be helpful, relevant, correct and trustworthy in relation to what was searched for.
At a fairly basic level, if someone types this into Google:
Plumber basingstoke
A website for a plumber based in Basingstoke is relevant. Mind you, a website for a plumber based in Basingstoke that has numerous five-star reviews is going to be relevant and trustworthy. Then again, a website for a plumber based in Basingstoke that has numerous five-star reviews, and that has a blog containing guides covering plumbing you might do at home (such as changing a washer), is relevant, trustworthy and demonstrates knowledge, or topical authority in the context of EEAT.
I’m sure you can see where I’m going with this. Let’s just cover what each aspect of EEAT is:
E — Experience
Google looks for first-hand or life experience with the topic. For example, a product review written by someone who has actually used the product shows “experience”. A review is a bit of a narrow example, but your website does need to demonstrate that you’re experienced in the subject matter that it covers.
E — Expertise
Your website’s content should demonstrate that you know what you’re talking about. A medical article written (or reviewed) by a qualified doctor shows expertise. OK, so we’re not all doctors, but you can use examples of past work, or state how you’ve improved something to show expertise.
A — Authoritativeness
The creator, website, or brand should be recognised as an authority in the topic area. A blog run by a website hosting company that covers topics related to websites, and dealing with related problems or improvements would be an example of authoritativeness. I wonder where we can find one of those?
T — Trustworthiness
This is an important factor, as it covers accuracy, transparency, and honesty. Citing sources (just as I’ve linked to Google above), having clear contact information, and avoiding misleading claims all help to build trust.
These EEAT factors, put together and applied to your site, help to demonstrate topical authority and reliability of information. Used well, they allow you to provide content that’s going to provide value to people using Google search, and make Google more likely to list your website in search engine results.
Technical SEO
A website can have grade A content, but be horrible to use.
Slow loading times and intermittent errors can be factors that make your site unappealing to visitors.
Google doesn’t want to direct people to websites that aren’t nice to use, simply because they’d prefer that people keep using Google. The chances of people continuing to use Google to search decline if Google start listing sites that provide a poor visitor experience.
Consequently your site needs to function well, load quickly and provide a good visitor experience for it to be found in Google search results.
Website Errors
Your website can produce errors without the associated issue being obvious. An example of this would be aggressive caching stopping the mobile menu from functioning. It’s a good idea to check that your website functions as it should, simply so that people can use it.
Issues such as broken links can cause problems with use and navigation. More importantly, as the Google bot follows links to discover your site’s pages, if a link can’t be followed the Google bot might not find the respective page, and it wouldn’t be indexed.
Tools such as Screaming Frog can be used to check your website’s pages for errors. The console built in to the developer tools available in Google Chrome can also be used to check for things like JavaScript errors.
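As an illustration of the kind of checking these tools do, here’s a rough Python sketch (using requests and beautifulsoup4, with a placeholder URL) that tests the links on a single page and reports anything that doesn’t come back with a 200 status:

```python
# A very small broken-link check for a single page, along the lines of what
# crawlers like Screaming Frog do at scale.
# Assumes: pip install requests beautifulsoup4; the URL is a placeholder.
from urllib.parse import urljoin
import requests
from bs4 import BeautifulSoup

PAGE = "https://example.com/"

html = requests.get(PAGE, timeout=10).text
soup = BeautifulSoup(html, "html.parser")
links = {urljoin(PAGE, a["href"]) for a in soup.find_all("a", href=True)
         if a["href"].startswith(("http", "/"))}

for link in sorted(links):
    try:
        # Some servers reject HEAD requests; a GET fallback would be more thorough.
        status = requests.head(link, allow_redirects=True, timeout=10).status_code
    except requests.RequestException:
        status = "request failed"
    if status != 200:
        print(status, link)  # 404s, 500s and unreachable links show up here
```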
Different errors can have different effects:
| Error Type | Direct Effect on Ranking | Indirect Effect on Ranking |
| --- | --- | --- |
| 404 / broken pages | Moderate – lost link equity | Reduced user experience, high bounce rate |
| 500 / server errors | Strong – may prevent indexing | Reduced crawling efficiency |
| Slow page speed | Moderate | High bounce rate, poor Core Web Vitals |
| Mobile usability issues | Moderate | Poor engagement metrics |
| Security issues (HTTPS, malware) | Strong | Trust, click-through penalties |
| Schema / structured data errors | Minor | Lost rich results → lower CTR |
| Robots.txt / noindex mistakes | Strong | Pages won’t rank at all |
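The last row of the table is worth a quick self-check, because a stray robots.txt rule or noindex tag will stop a page ranking no matter how good it is. Here’s a small Python sketch along those lines (requests and beautifulsoup4 assumed; the site and page URLs are placeholders):

```python
# Checks two of the "strong" issues from the table above: a robots.txt rule
# blocking crawling, and a meta robots noindex tag on a page.
# Assumes: pip install requests beautifulsoup4; the URLs are placeholders.
from urllib.parse import urljoin
import urllib.robotparser
import requests
from bs4 import BeautifulSoup

SITE = "https://example.com/"
PAGE = urljoin(SITE, "/some-page/")  # hypothetical page path

# Can Googlebot crawl the page at all?
rp = urllib.robotparser.RobotFileParser(urljoin(SITE, "/robots.txt"))
rp.read()
print("Googlebot allowed to crawl:", rp.can_fetch("Googlebot", PAGE))

# Is the page telling search engines not to index it?
soup = BeautifulSoup(requests.get(PAGE, timeout=10).text, "html.parser")
robots_meta = soup.find("meta", attrs={"name": "robots"})
content = (robots_meta.get("content") or "").lower() if robots_meta else ""
print("Meta robots tag:", content or "none")
print("Page is noindexed:", "noindex" in content)
```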
Your website is slow to load or fails Core Web Vitals
Core Web Vitals are a set of website metrics that Google uses to measure the quality of user experience on a webpage. They are part of Google’s Page Experience signals, which influence how websites rank in Google search results.
Some Core Web Vitals (such as Largest Contentful Paint, or LCP) are time-based metrics, which is why I’m talking about slow loading and Core Web Vitals as one and the same.
PageSpeed Insights is an online tool that you can use to check your website’s Core Web Vitals. What’s displayed in this tool’s results is effectively a combination of a test run right now and collected data about your site (CrUX data) from other visitors. Due to this, the tool isn’t “real time”. There is an equivalent tool called Lighthouse that’s built into Chrome that does measure website performance in real time. This is covered in a lot more detail in our blog post about WordPress and PageSpeed Insights.
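If you want to pull these numbers programmatically, PageSpeed Insights also has a public API. Below is a minimal Python sketch; it assumes the v5 endpoint and the current response field names, and the URL being tested is a placeholder:

```python
# Minimal sketch: fetch lab metrics from the PageSpeed Insights API (v5).
# No API key is needed for occasional use; the URL below is a placeholder,
# and the response field names assume the current v5 response shape.
import requests

PSI = "https://www.googleapis.com/pagespeedonline/v5/runPagespeed"
params = {"url": "https://example.com/", "strategy": "mobile"}
result = requests.get(PSI, params=params, timeout=60).json()

lab = result["lighthouseResult"]
print("Performance score:", round(lab["categories"]["performance"]["score"] * 100))
for audit in ("largest-contentful-paint", "cumulative-layout-shift", "total-blocking-time"):
    print(audit, "->", lab["audits"][audit]["displayValue"])
```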
There are a lot of factors that can affect Core Web Vitals, especially if you’re using WordPress. In many cases improving your website’s Core Web Vitals can be a collection of activities, each of which addresses a specific issue that negatively affects them.
You can find out how to use Chrome’s Lighthouse tool to identify issues and validate their fixes in our blog post about WordPress and PageSpeed Insights.
Some examples of how to deal with specific issues can be found on our blog, such as:
- Optimising Google Fonts
- Optimising images
- Improving LCP image delivery
- Minifying and combining JavaScript and CSS
- Eliminating render blocking resources (a basic check for these is sketched below)
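As a taster of the kind of check involved, here’s a rough Python sketch (requests and beautifulsoup4, placeholder URL) that lists scripts and stylesheets in the page head that are likely to be render blocking:

```python
# Rough render-blocking resource check: scripts in the <head> without
# defer/async, and stylesheets that aren't print-only.
# Assumes: pip install requests beautifulsoup4; the URL is a placeholder.
import requests
from bs4 import BeautifulSoup

soup = BeautifulSoup(requests.get("https://example.com/", timeout=10).text, "html.parser")
head = soup.head or soup

for script in head.find_all("script", src=True):
    if not (script.has_attr("defer") or script.has_attr("async")):
        print("Render-blocking script:", script["src"])

for link in head.find_all("link", rel="stylesheet"):
    if link.get("media") != "print":
        print("Render-blocking stylesheet:", link.get("href"))
```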
We know that this isn’t a lot of fun for people, and is in some cases difficult to achieve, which is why we offer a WordPress optimisation service. The purpose of this service is, in part, to help your website pass Core Web Vitals and gain improved rankings.
Off page factors that affect rankings
Off page factors are external to your website. You can’t improve these by doing things to your website, or changing things on your website. Off page factors might be considered an indication of what other people think of your website.
Social Signals
Although Google has stated social signals aren’t a direct ranking factor, they can indirectly influence SEO. For example, shares, likes, and mentions on social media drive traffic and visibility, high engagement may lead to natural backlinks (more on these later), and brand awareness and reputation improve overall authority.
Reviews & Ratings
Reviews on Google Business Profile, Trustpilot, and niche directories can affect your website’s trustworthiness. Positive reviews help local SEO and are likely to increase click throughs from search results.
Local Citations
Listings on local directories, maps, and industry websites, along with accurate and consistent NAP (name, address, phone number) information, help Google understand your location and relevance for local search. This is especially important for service based businesses that want to appeal to a local audience.
The two most important off page factors that affect rankings
So far, most of what I’ve talked about are factors that you have a reasonable degree of control over. These next two factors you don’t have so much control over, but they’re also the two most important factors when it comes to ranking in search engine results.
Competition
Competition is a massive factor when it comes to ranking. If you have no competition, it’s going to be VERY easy for you to rank well in search engine results. If you have some competition, it becomes a bit harder to rank well in search engine results, and if you have a lot of competition, ranking well is even harder.
In all cases, if you want to rank above your competitors in search engine results, you’re going to have to outcompete them.
As you can tell from all that I’ve mentioned above, there are a lot of different aspects that can be competed for.
If you consider that Google takes each of the factors above, and assigns a different weighting to each, it’s possible that you don’t have to outcompete your competitors on every factor. You might well be able to gain rankings better than your competitors by outcompeting them on factors with greater weighting. The main problem is that nobody really knows how much weighting each factor has, other than Google themselves.
To put this into context; if you provide a specialist service in a local market, having a well structured, informative website that loads quickly and is listed in relevant business directories could be enough to get you on page one of Google.
However, this same strategy might not work in all industries. For example, a local plumber faces more competition. Even with a high quality website and directory listings, their site might not reach page one if competitors are stronger in the ranking factors that Google values most.
So what has the greatest weight when it comes to ranking factors?
Backlinks
Backlinks influence rankings. Quality backlinks are probably the biggest ranking factor a website can have.
Unfortunately it’s not just as simple as “more backlinks = better rankings”, so please don’t go and buy a huge amount of backlinks from someone on a freelancer website. At best this will have little effect, at worst Google could de-index your site completely.
For backlinks to make sense we have to travel back in time to when the internet was invented.
When Tim Berners-Lee invented HTML and HTTP, links were really the fundamental factor. Tim Berners-Lee was a research scientist at CERN when he developed HTML, and the thought behind linking was (to paraphrase) orientated to being able to link a citation in one scientific paper to the paper from which the citation was taken.
Links have been an underpinning concept of the internet since day one.
When the internet became a bit more adopted, and people started to run websites of their own, they’d often have a links page (to other related websites) and maybe a guest book in which people could leave messages, and references (links) to their website if they had one.
In early internet days (pre Google) there was an organic network of interlinked sites. You could read one website, follow links to another, read the guestbook on that website, find more links, follow those, read more websites, check their links page, follow some of those links, and so on, and so on. A lot of people working tech support jobs on night shifts did exactly this. In a lot of cases (or over enough night shifts) you’d find that if you were reading about a certain subject matter (probably something to do with Nostradamus, circa 1997) you’d keep coming across the same website. There’s a reason for this: because it’s good, other sites link to it.
The people working tech support jobs overnight were effectively doing what Google would go on to do.
Before Google there were search engines. They weren’t quite as good as Google, as you had to read a lot of results to find what you were looking for. Google had established that this was the problem with existing search engines, and worked out that if they could provide much more relevant results, people would use them more than other search engines.
The endorsement provided by links was used as part of the Google algorithm to establish if sites were “good”. This part of their algorithm was called PageRank. The Googlebot would follow links to discover more websites, it would then read those, then follow the links on each site to find other websites. By doing this enough, Google gained an idea of which sites were linked to a lot. These sites would often appear near the top of page one, due to their “link endorsement”, as this was an indication of their relevance for a given subject matter.
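To make the idea concrete, here’s a toy Python version of the PageRank calculation, run over a made-up four-page link graph (the page names and numbers are purely illustrative, not how Google calculates things today):

```python
# Toy PageRank: pages that are linked to by important pages become important.
# The link graph below is made up purely for illustration.
links = {
    "hamster-guide": ["hamster-forum"],
    "hamster-forum": ["hamster-guide", "pet-shop"],
    "pet-shop": ["hamster-guide"],
    "my-new-site": ["hamster-guide"],  # links out, but nobody links to it yet
}

damping = 0.85
rank = {page: 1 / len(links) for page in links}

for _ in range(50):  # iterate until the scores settle
    new_rank = {}
    for page in links:
        incoming = sum(rank[other] / len(out)
                       for other, out in links.items() if page in out)
        new_rank[page] = (1 - damping) / len(links) + damping * incoming
    rank = new_rank

for page, score in sorted(rank.items(), key=lambda kv: -kv[1]):
    print(f"{page}: {score:.3f}")
```

The site that everything links to ends up with the highest score, and the new site that nothing links to ends up at the bottom, which is essentially the “link endorsement” idea described above.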
For a short time all was well in the World Wide Web.
As Google’s dominance in the search engine market grew, more and more companies wanted to be on page one of Google, and more and more marketing departments tried to make that happen. In short…
The people in marketing broke the internet.
Because Google had been quite open about how backlinks affected rankings, marketing departments started to use backlinks in a nefarious manner to influence search engine rankings. Those people working night shifts following links started to find that if they kept following links, sooner or later they’d end up on a page that was JUST links. Not an actual site, just links. These were link farms, and they were being used to make it look like sites had lots of links, and therefore lots of endorsement. Even when they didn’t, and it was just made up by marketing people.
The Googlebot was also following links and ending up on link farms. As this influenced search engine results, which site appeared at the top was based more on who had the best link farm than on who had the most genuine endorsements. Google couldn’t really let this continue, as their product (relevant search results) was being invalidated.
Google’s PageRank calculations became a secret, and Google started to penalise sites that used nefarious linking to artificially improve search engine rankings.
Although Moz’s “domain authority” score is considered the modern-day equivalent of PageRank, it is really an educated guess at what Google’s PageRank actually is.
Part of the domain authority is passed from one site to another in the form of a backlink. This is what people call “link juice”. More link juice is passed on from a site with high domain authority than from a site with lower authority. So a backlink to your site from a site with high domain authority has a greater positive influence than a lot of links to your site from sites with low domain authority. Low domain authority, low link juice. High domain authority, high link juice.
With regard to the penalisation side of things, although ranking is still heavily influenced by backlinks, Google have had to establish ways of telling if a site should be penalised or not. This is also a secret. It’s not knowing this secret that prompts people to say things like “don’t buy a lot of backlinks off some dude on fiverr” because this is a fairly risky practice.
As well as being penalised for bad backlinking practices, and being promoted for good backlinking, there’s also a “doesn’t really do anything” type of backlinking that sits somewhere in between the two. If a pet shop website links to a website about sports cars, the sports cars site gains no benefit, because the backlink isn’t from a relevant source. This type of backlinking doesn’t help improve rankings in search engine results.
If Google detects something unnatural taking place with your site’s backlinks, such as your site gaining 500 backlinks in a week, this is going to look suspiciously like you’re trying to influence rankings. If these backlinks don’t stand up under scrutiny (they’re all from hacked websites, or they’re all from US websites and your site is UK centric, for example), you could well be penalised, which will negatively affect your website’s rankings.
At this point, I wouldn’t blame you for wondering why I’m telling you all this. Well, the conclusion that I would hope you’re coming to is that the situation with backlinks isn’t as simple as “get more backlinks to rank”. The whole backlinking situation is a bit detailed, and doing it wrong can have an effect that’s a lot worse than not doing it at all.
For backlinks to improve your rankings they should:
- Be from a website that’s relevant to your website’s subject matter.
- Be from a trustworthy, reputable source.
- Be from a website with a good domain authority.
- Be a dofollow link, to pass on link juice/equity (nofollow may influence traffic and visibility indirectly).
- Have descriptive, relevant anchor text.
Well, I guess that narrows things down a bit.
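If you want to sanity-check an existing backlink against the last two points in that list, here’s a small Python sketch (requests and beautifulsoup4, with placeholder URLs) that fetches the linking page and reports the link’s anchor text and whether it’s dofollow:

```python
# Inspect a backlink on someone else's page: find the link to your site and
# report its anchor text and rel attribute.
# Assumes: pip install requests beautifulsoup4; both URLs are placeholders.
import requests
from bs4 import BeautifulSoup

LINKING_PAGE = "https://example-blog.com/some-post/"
YOUR_SITE = "yourwebsite.com"

soup = BeautifulSoup(requests.get(LINKING_PAGE, timeout=10).text, "html.parser")

for a in soup.find_all("a", href=True):
    if YOUR_SITE in a["href"]:
        rel = a.get("rel") or []
        print("Link found:", a["href"])
        print("Anchor text:", a.get_text(strip=True) or "(image or empty anchor)")
        # rel values like nofollow, sponsored or ugc don't pass equity in the normal way
        print("Dofollow:", "nofollow" not in rel)
```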
How do you get backlinks?
Unfortunately nobody really uses guestbooks and puts links pages on their websites any more. The days of backlinks organically growing on their own are, sadly, over.
If you read the list above and thought “how am I going to get these types of backlinks!?”, you’re not alone.
There is some low-hanging fruit when it comes to backlinks, such as business directories and review sites. Then again, if they’re low-hanging fruit for you, they’re also low-hanging fruit for your competition. Due to this, these types of backlinks should be considered must-haves.
Beyond this, backlinking usually involves backlink outreach, which means talking to people to obtain backlinks by:
Guest Posting
Write articles for other websites or blogs in your industry, and include a link back to your site in the author bio or content (if allowed). Choose reputable sites with relevant audiences when you do this.
Build Relationships & Networking
Connect with other businesses, bloggers, or influencers in your industry. Share each other’s content, collaborate on projects, or get mentioned in roundups. Participate in forums or communities relevant to your field.
Broken Link Building
Find broken links on relevant websites in your niche. Contact the site owner suggesting they replace the broken link with a link to your relevant content.
Press & PR
Get your business mentioned in news articles, trade publications, or local media. Press coverage often comes with backlinks to your website.
Skyscraper Technique
Identify popular content in your niche that has lots of backlinks. Create even better, more comprehensive content and reach out to sites linking to the original content.
Create High Quality Content
Content that naturally attracts links is the most sustainable way to get backlinks:
- In depth guides or tutorials (e.g., “Complete Guide to…”).
- Infographics or visual resources.
- Case studies, before-and-after project showcases.
- Tools or calculators relevant to your niche.
Obviously this is a lot of work, and it’s going to take a lot of time and effort to do all of it. If you have the time and you’re willing to get started, great, but if you don’t, what then?
Well, there is another way.
Use an SEO agency to help build your backlink profile.
Because a lot of SEO agencies spend a lot of their time focussing on building backlinks for their customers, they often have a network of contacts and website owners that they can use to help you gain relevant, useful backlinks. By using an agency like this, you’re effectively gaining access to their knowledge and contacts.
Not all SEO agencies are of the same ilk, and there are a few shifty ones out there, but if you ask an SEO agency how they’ll improve your search engine rankings and they say something like “a bit of on page topical work, and an organic relevant growth of your site’s backlink portfolio” this is a good sign. Not only are they operating with transparency, but they’re also going to do something that’s going to make a positive difference to your website’s rankings.
As long as the SEO agency you’re using provides a natural organic backlinking service (in the same manner that you would for your own site), you’re not going to be penalised by Google. All that’s really taking place is someone with experience, expertise and industry knowledge will be carrying out natural backlinking for you.
We know that not everyone has the time or human resources to carry out what I’ve outlined above, which is why we offer website optimisation services, website creation services and, more recently, SEO services that provide a bit of on page topical work and an organic, relevant growth of your site’s backlink portfolio.
Website Not Found in Google Search: FAQ
Why can’t Google find my new website?
Google needs to crawl and index your website before it can appear in search results. For new domains, this process may not happen automatically or quickly. You can check if your site is indexed by searching Google for site:yourwebsite.com. If no results appear, Google hasn’t indexed your site yet.
How can I make sure Google indexes my website?
The most effective way is to use Google Search Console (formerly Google Webmaster Tools). Register your website there and submit a sitemap to request that Google crawl and index your pages. This speeds up the discovery process significantly.
What are “on-page factors” and how do they affect my ranking?
On-page factors are elements within your control on your website, like your content, technical setup, and site structure. They influence rankings by telling Google what your site is about and how good the user experience is. Key on-page factors include:
- Topical Clarity: Using relevant keywords in your text, headings, and meta tags.
- EEAT (Experience, Expertise, Authoritativeness, Trustworthiness): Demonstrating credibility through high-quality, reliable, and well-supported content.
- Technical SEO: Ensuring your site is fast, mobile-friendly, and free of errors (like 404s or 500s).
What is topical authority (EEAT) and why is it important?
Topical authority refers to your website’s demonstrated credibility and depth of knowledge on a specific subject, often summarized by EEAT (Experience, Expertise, Authoritativeness, and Trustworthiness). Google uses these factors to assess content quality. Websites that demonstrate strong EEAT are seen as more relevant and reliable, making them more likely to rank highly.
How do technical issues like slow speed or errors prevent my site from ranking?
Google doesn’t want to send users to websites that provide a poor experience. Technical issues like slow loading times, failing Core Web Vitals, server errors (500s), and broken links (404s) can hurt your rankings. Errors can also prevent the Google bot from crawling your pages, and a slow, error-prone site leads to poor user metrics (like high bounce rates), which Google interprets as low quality.
What are “off-page factors” and which ones matter most?
Off-page factors are external signals that Google uses to gauge your website’s authority and popularity, such as mentions on social media, online reviews, and local citations.
The two most important off-page factors are:
Competition: Your ranking is relative to your competitors. To rank well, you must be better than the sites currently on page one.
Backlinks: Quality backlinks (links from other relevant, high-authority, and trustworthy websites) are one of the most significant ranking factors. They act as a vote of confidence for your content.
Is it a good idea to buy backlinks to rank faster?
No, you should not buy large amounts of low-quality or irrelevant backlinks from dubious sources. Google’s algorithm, known as PageRank, was built on the idea that links are endorsements. Buying low-quality links can look unnatural and lead to a Google penalty, which can severely or completely remove your website from search results. Focus only on obtaining natural, relevant, and high-quality backlinks.
How can I get high-quality backlinks organically?
High-quality backlinks are earned, not bought. Strategies to gain them include:
- Create Exceptional Content: Content (like in-depth guides, case studies, or original research) that people naturally want to link to.
- Guest Posting: Writing articles for reputable, relevant websites with a link back to your site.
- Build Relationships: Networking with industry leaders and bloggers to get mentions or collaborations.
- Broken Link Building: Finding broken links on other sites and suggesting your relevant content as a replacement.