Google isn’t an omnipotent being, though it can certainly seem omnipresent. That’s thanks to its fleet of bots and spiders: pieces of software running 24/7 with no purpose other than to visit web pages, follow links, and crawl content. They find new sites, check for changes on old sites, and add a constant flow of data to the Index, Google’s sum of all knowledge.
If you’re not in Google’s index, you’re not in the search results. It’s as simple as that. You could have the best on-site SEO in the known universe, but if you’re not in the index, you won’t even have a search ranking. You’ll have nothing. So how do you get Google’s attention?
Google’s end goal is to have websites included in the index, and to rank them according to a list of several hundred factors of varying importance. You can see a reverse-engineered version of the list here, though beware: it’s mostly guesswork, pieced together from advice Google has given, information cribbed from a leaked rater guidelines document, and years of studies suggesting varying levels of correlation between ranking and various factors.
To get websites into the index and ranked, Google needs to use sophisticated software called spiders to find the sites and pull the relevant information necessary for ranking. Given the size of the document above, that’s just about everything. When this software lands on your homepage or a subpage, it will download the page, follow its links, download those pages, and so on. If you have a sitemap, it will also use that to find pages it might otherwise have missed.
You can use search engine directives to guide the search spiders. There are two main ones: nofollow and noindex. Nofollow, applied to a link, tells Google not to follow that link to its destination page. The link still exists, Google can still see it, and it can still hurt your ranking if it’s a spam link, but the spiders won’t crawl through it. Noindex is a page-level directive, set in the meta header, that tells Google to leave the page out of the index entirely; as far as search results are concerned, that page might as well not exist.
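As a quick sketch of what those two directives look like in practice (the link URL here is a placeholder), noindex goes in the page’s head, while nofollow is applied to an individual link via the rel attribute:

```html
<!-- In the page's <head>: keep this entire page out of the index -->
<meta name="robots" content="noindex">

<!-- On an individual link: tell spiders not to follow it to its destination -->
<a href="https://example.com/some-page" rel="nofollow">A link spiders won't follow</a>
```

The two can also be combined at the page level with `content="noindex, nofollow"` if you want a page ignored and none of its links crawled.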
This brings you to two scenarios. In the first, you have an established website that has pages indexed. In the second, you are building a website and it has not yet been indexed.
In the first scenario, it’s simple to get your content indexed. You can wait for Google to find it, which they will, because new content tends to be published on the homepage or in recent posts lists. You can add it to your sitemap so the next time Google checks, it will see the new content. You can also link to it, both internally and externally, to further attract the attention of the spiders.
That simplicity is why I’m focusing on the second scenario: a site that doesn’t have existing links or indexed pages to make the process easier. You have to attract the attention of the search spiders, and simply waiting isn’t a great option. It works eventually, particularly if someone else finds your site and links to it, but there are much better, faster ways to be found by the search engines.
I’ve compiled several methods you can use to get indexed that much more quickly. You don’t need to do all of them to be found, of course; any one of them will work. The trick is that many of them also carry some additional search value, so it’s worth doing several at once. So, what are the strategies?
A sitemap is a simple concept: it’s a single page with no content except links to all of the pages on your site, plus some information about those pages. It’s like a phone book or directory listing for your site. You include every page you want shown and exclude the pages you want to remain hidden. Hidden pages might include contentless category pages or administrative pages that aren’t meant for public use.
There are two types of sitemap: HTML and XML. Of the two, XML is faster, smaller, and easier for Google to read. The only reason to use an HTML sitemap is if you want to turn it into a legitimate directory for human users as well. That’s unnecessary in 99% of cases, so just go with an XML sitemap.
XML sitemaps only need to list two pieces of information about each page. The first is the URL of the page. The second is the date and time it was most recently updated. This saves Google a lot of time, because they can compare your sitemap against the last version they fetched. Any page that hasn’t been updated since the last check doesn’t need to be crawled, so they can focus on just the pages that have changed.
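Here’s a minimal sketch of what that looks like, using the standard sitemap protocol (the URLs and dates below are placeholders):

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>https://example.com/</loc>
    <lastmod>2016-05-01</lastmod>
  </url>
  <url>
    <loc>https://example.com/blog/new-post/</loc>
    <lastmod>2016-04-20</lastmod>
  </url>
</urlset>
```

The protocol also allows optional tags like changefreq and priority, but loc and lastmod do the heavy lifting.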
Now, it may sound like making and maintaining one of these sitemaps is a big chore. If you were doing it by hand, then yes, it certainly would be. Fortunately, there are apps and plugins built specifically for this purpose. One such example is XML Sitemaps. To use it, you just put in the web address of your site, how often you want your sitemap to check for changes, and a couple of other options. It only works for sites with under 500 pages, though, which makes it limited for growing sites.
Another option, if you’re using WordPress, is to use the Google XML Sitemaps plugin. You install it and it will create a sitemap and submit it.
Speaking of submission, if you aren’t using a system that automatically submits the sitemap to the search engines, you’ll have to do it yourself. Thankfully, you really only need to submit to Google and Bing; Yahoo uses Bing’s index, and no other search engine is anywhere near valuable enough to warrant the time.
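One shortcut worth knowing: both Google and Bing will also discover a sitemap that’s referenced from your robots.txt file, so a single line there (the URL below is a placeholder) covers you even before any manual submission:

```
# robots.txt, served from the root of your domain
Sitemap: https://example.com/sitemap.xml
```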
ShoutMeLoud has created two in-depth guides about submitting sitemaps to both Google and Bing.
Once your sitemap has been submitted, just wait. You should find that your site and its subpages get indexed quickly, often within a day or so. This is by far the best option for submitting your site to the index, plus it establishes a system that can be easily maintained to keep Google up to date with changes to your site. Still, there are other options, so I’ll explore them as well.
Believe it or not, Google monitors Google properties. That means when a site comes up in one of their properties that they aren’t aware of, they will take a look and see if it’s something they need to index. The easiest way to use this for indexing your site is to use Google Analytics, which is good, because you should probably be using Google Analytics anyway.
There are several ways to set up Google Webmaster Tools (now known as Google Search Console). The easiest, though, is to just link it manually. You do need a Google account, which I’m assuming you already have; even if you don’t, you shouldn’t need me to walk you through setting one up.
Once you have a Google account, you need to go to the Google Webmaster Tools site and sign in. You will click the “Add a site” button, and follow one of the processes they offer to verify you own a site.
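One of the verification methods on offer is a meta tag you paste into your homepage’s head. It looks like this, with the content value being a unique token Google generates for your account (the token below is obviously a placeholder):

```html
<meta name="google-site-verification" content="your-unique-token-here">
```

Other options include uploading an HTML file to your server or verifying through your existing Analytics tracking code; any of them proves the site is yours.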
By linking your site to Webmaster Tools, or by installing Google Analytics, you are putting your site into the awareness of Google. They can then browse your site, crawl it for details, and index it.
Google, as well as Bing, has a system you can use to manually submit a URL to their awareness.
This system, called pinging Google, is very limited, and it’s not a very good one. It’s what Google used to rely on, but it quickly got out of hand: people would submit the URL of every page on their site, even though Google can crawl from one link to the next. It became too much data to handle, so Google started looking for alternatives. Nevertheless, the ping system still exists, and you can still use it.
I generally don’t recommend doing this more than once. Ping Google with your homepage URL to speed up indexation, but ignore it beyond that. You don’t need to ping every subpage. It won’t hurt your search ranking or anything, but it doesn’t help you at all.
You can manually ping Google by visiting this link and filling out the form. By “form” of course I just mean one box with the URL and one captcha. The Bing version works the same way, but specifies that you should only submit the URL of your homepage. You can find that submission form here.
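Under the hood, the ping is just an HTTP GET request with your sitemap or page URL passed as a query parameter. A minimal Python sketch of how that ping URL is assembled (example.com is a placeholder, and the /ping endpoint is the one Google and Bing have historically exposed):

```python
from urllib.parse import urlencode

def sitemap_ping_url(engine_base, sitemap_url):
    """Build the GET URL that asks a search engine to re-fetch a sitemap."""
    # urlencode percent-encodes the sitemap URL so it survives as a query value
    return engine_base + "?" + urlencode({"sitemap": sitemap_url})

google_ping = sitemap_ping_url("http://www.google.com/ping",
                               "https://example.com/sitemap.xml")
print(google_ping)
# http://www.google.com/ping?sitemap=https%3A%2F%2Fexample.com%2Fsitemap.xml
```

Fetching that URL (with curl, a cron job, or your CMS) is all a “ping” is; the same pattern works for Bing by swapping the base URL.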
If you have several sites you want to submit, or you want to target more than just the main two search engines, you can use a ping service to ping numerous search engines quickly. One such service is PingOMatic, which you can find here. Not only does it send your content to the main search engines, but it also sends it to specialized searches like Skygrid, FeedBurner, Spinn3r, and BlogShares.
This can help facilitate some basic link building as well, so experiment with it and see if it gives you the kick start you want. I do recommend investigating a service before you use it to ping, just to make sure it won’t put you in black hat circles.
I mentioned before that one of the ways Google will find a site is by following a link from a different site. Therefore, one way to help get yourself indexed is to get links from large, frequently crawled sites. The biggest of these are the social networks and the social bookmarkers.
There are all sorts of other social networks you can use, so use whatever feels comfortable. Just avoid the networks or bookmark sites that make your posts limited in visibility. If a guest can’t see them, Google can’t see them, so they won’t be indexed and your effort goes to waste.
Blog networks are sites that specifically exist to circulate links to each other’s blogs. This is not to be confused with the more black hat link exchanges, though. You’re not shoehorning in unrelated links to generate credits to get your link out. Rather, you’re submitting your blog to industry aggregators and blog feeds.
Again, ShoutMeLoud comes with a good list of places you can submit your blog. Some of these are redundant with the previous section of this post, of course. Facebook, Google+, Twitter, and so forth. Some of the more unique entries include Alexa, Ahrefs, BlogCatalog, and Alltop.
One note about Alexa: no one cares about your Alexa ranking. It’s a skewed and flawed metric based on only a small subset of traffic, most of it demographically biased. The only people who seem to pay attention to it are spam bloggers or people looking to buy and sell sites, who are simply lacking any other valid metric for measuring the quality of a site. Don’t bother trying to shoehorn the Alexa toolbar into your marketing or anything like that. Submitting your site is enough.
This is a trick you can use in two different situations. One is simple; if you’re a local business, you want to show people where your business is located. Add a Google Maps widget to your About page, and there you go.
The other is a little trickier. With Google Maps, you can make anything interactive and explorable, including large drawings and pictures. This site posts large pieces of art and embeds them in Google Maps to allow users to explore the details of the large pieces. It also helps cut down on art theft.
In order to use this, you need to apply for a Google Maps API key, which you can do from the Google Developers system. Using that key, you can add a Maps widget to your page, with whatever use you want. By implementing a Google system, they find your site, and indexation begins.
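A minimal embed looks something like this, assuming you’ve already obtained an API key (YOUR_API_KEY and the coordinates below are placeholders you’d swap for your own key and location):

```html
<div id="map" style="height: 400px;"></div>
<script>
  // Callback invoked by the Maps script once it has loaded
  function initMap() {
    new google.maps.Map(document.getElementById("map"), {
      center: { lat: 40.7128, lng: -74.0060 },  // your business location
      zoom: 15
    });
  }
</script>
<script async defer
  src="https://maps.googleapis.com/maps/api/js?key=YOUR_API_KEY&callback=initMap">
</script>
```

The async/defer attributes keep the map from blocking the rest of your page while the API loads.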
So, as you can see, there are a lot of options available for when you want to index your site. Most of them have additional fringe benefits as well. Implement what you wish, and wait for Google’s attention; it’s all you can really do.