It’s less common than it used to be, but now and then when you’re shopping around for an SEO service to manage your website, you might see something along the lines of “submits you to 100s of search engines!” as a selling point. You might think this is a great deal. After all, everyone only ever talks about Google, but if there are hundreds of search engines out there, you can bring in a lot of traffic from places other people don’t know about.
There are a few problems with this plan.
The first problem: nobody actually uses them, “them” being the search engines in question. If you were to stop 100 people on the street and ask them what search engine they use, 67 of them would answer “Google.” That leaves 33 people, and of those, 19 would answer Bing. Another 10 would answer Yahoo. 2 more would say Ask, because they’re hipsters. One of the last two would say AOL, because they’re still going through all those free weeks they got on CDs in the mail. The last one has no idea what a search engine is and asks you if the puppy in their mind can lead you to buried treasure. If you conduct this poll, don’t get into their van. Just a recommendation.
So when a company tells you they’re going to submit to 100s or 1,000s of search engines on your behalf, you have a question to ask: who is using those search engines? Sure, you might be ranked #1 on WeirdTertiarySearch.com, but if your search volume through that engine is one person per year, what good is it doing you?
When was the last time you heard someone talking about submitting their site to Google for review? Okay, bad question. If you’re thinking seriously about search engine submissions, you’ve probably been convinced they matter. They don’t.
Google is the standard by which all other search engines are judged. Google has the most sophisticated algorithm, the most up-to-date search results and the best index. They maintain this through a vast network of bots, web spiders, crawlers and other synonymous pieces of technology. These bots are hard at work, 24/7/365, crawling the web and indexing everything. They monitor sites for changes and index those changes. They monitor sites for new links and crawl those links to discover new sites, and they index those sites. In fact…
There are three phases to the search operation. There’s crawling, there’s indexing, and there’s serving.
Crawling is the most basic step. Web crawlers trawl the Internet, looking for one thing and one thing only: websites that aren’t already in Google’s index. Whenever Google’s bots look at a website, they look specifically for links, and they follow those links to find pages they haven’t seen before.
Essentially, Google has a legion of robots searching the Internet for pages it hasn’t seen before. When one is encountered, it is added to the Index, the vast sum of all Internet pages discovered by the ubermind that is Google.
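That crawl-and-discover loop can be sketched as a simple breadth-first traversal over links. This is a toy model, not Google’s actual crawler; the tiny in-memory “web” below is a stand-in for real HTTP fetching and link extraction.

```python
from collections import deque

# A toy "web": each URL maps to the list of URLs it links to.
# In reality this would be an HTTP fetch plus HTML link extraction.
WEB = {
    "a.com": ["b.com", "c.com"],
    "b.com": ["c.com", "d.com"],
    "c.com": [],
    "d.com": ["a.com"],
}

def crawl(seed):
    """Breadth-first crawl: follow links, recording each page once."""
    index = set()            # pages already discovered ("the Index")
    frontier = deque([seed]) # pages queued for crawling
    while frontier:
        url = frontier.popleft()
        if url in index:
            continue         # already crawled this page
        index.add(url)
        for link in WEB.get(url, []):
            if link not in index:
                frontier.append(link)
    return index

print(sorted(crawl("a.com")))  # → ['a.com', 'b.com', 'c.com', 'd.com']
```

Note how d.com gets discovered even though nothing links to it directly from the seed except via b.com; that chain of links is exactly how a new site gets found without any manual submission.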
Indexing is the process of Google recording the contents of the site, assigning it a rank and placing it in the search results. Depending on the quality, age and a hundred other factors, this initial ranking can be low or high.
Serving is the process whereby Google displays results when you make a search. Whether a page is served for a given query depends on how relevant the site is to that query. This judgment is made through a combination of algorithmic qualitative decisions and outsourced human assessment.
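At a very crude level, indexing and serving can be modeled as building an inverted index and then ranking pages by how well they match a query. This is purely an illustrative sketch with made-up pages; real ranking weighs hundreds of signals beyond word matching.

```python
from collections import defaultdict

# Toy corpus: page URL -> page text (stand-in for crawled content).
PAGES = {
    "a.com": "cheap flights to paris",
    "b.com": "paris travel guide",
    "c.com": "guide to indoor plants",
}

def build_index(pages):
    """Indexing: record which pages contain which words."""
    index = defaultdict(set)
    for url, text in pages.items():
        for word in text.split():
            index[word].add(url)
    return index

def serve(index, query):
    """Serving: rank pages by how many query words they contain."""
    scores = defaultdict(int)
    for word in query.split():
        for url in index.get(word, ()):
            scores[url] += 1
    return sorted(scores, key=lambda url: -scores[url])

index = build_index(PAGES)
print(serve(index, "paris guide"))  # b.com ranks first: it matches both words
```

The split between `build_index` and `serve` mirrors the article’s phases: indexing happens once per crawl, while serving runs on every query against the prebuilt index.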
You’ll note that nowhere in this process does Google require you to manually submit your site. So where did this idea come from?
In the 1990s, stretching maybe into the early 2000s, Google was nowhere near as sophisticated as it is now. Back then, it indexed pages in the hundreds of thousands, not the billions it does today. Back then, it didn’t have programs that could crawl and index the web. Instead, it relied on people to submit their sites. That’s right; 25 years ago, search engine submission may have been a viable strategy.
Since then, the bots have appeared, making the idea of search submission obsolete. By the time you’ve submitted your site for consideration, Google has crawled it and indexed it.
Well, that’s not strictly true. See, in order to have your site indexed, Google needs to know about it. However, there’s rarely any reason to submit it directly. Instead, you have two options.
Option 1: earning links to your site. Web crawlers follow links, so if you want to be found, you need people to link to you. This means any indexed site that links to yours is effectively submitting you to the search index.
Option 2: XML sitemap submission. Google absolutely LOVES the XML sitemap. See, a sitemap is a single file that includes links to every page on your site. You can generate one automatically and keep it up to date whenever you post a new blog entry or update a page. Google can look at this file, instantly see any new pages or changes you’ve made, and index them.
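Generating a sitemap is straightforward; the snippet below builds a minimal one with Python’s standard library, following the sitemaps.org protocol. The URLs and dates are placeholders for a hypothetical site, and `lastmod` is optional under the protocol.

```python
import xml.etree.ElementTree as ET

def build_sitemap(urls):
    """Build a minimal XML sitemap per the sitemaps.org protocol.

    `urls` is a list of (location, lastmod-or-None) pairs.
    """
    ns = "http://www.sitemaps.org/schemas/sitemap/0.9"
    urlset = ET.Element("urlset", xmlns=ns)
    for loc, lastmod in urls:
        url = ET.SubElement(urlset, "url")
        ET.SubElement(url, "loc").text = loc
        if lastmod:
            ET.SubElement(url, "lastmod").text = lastmod
    return ET.tostring(urlset, encoding="unicode")

# Hypothetical pages on an example site.
sitemap = build_sitemap([
    ("https://example.com/", "2024-01-15"),
    ("https://example.com/blog/new-post", "2024-02-01"),
])
print(sitemap)
```

Once the file is written out (typically as `/sitemap.xml`), you can point crawlers at it with a `Sitemap:` line in robots.txt or submit it through Google Search Console.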
So why, if there’s no need for it, do Google, Bing and Yahoo all have page submission links? The answer is convenience and reassurance. If, for some reason, you publish a page and cannot publish a sitemap, and you can’t get links from other sites, a search submission can get the attention of the engines.
Whenever an SEO company tells you they’ll submit you to hundreds or thousands of search engines, however, it’s definitely a scam. The only ones that matter are the ones mentioned in this piece, and even then, Google and Bing are really the only engines with any volume worth mentioning.