Over the last few days, I've spent a lot of time trying to optimize my websites for the Googlebot.

We all know that Google (an empire worth billions and billions of dollars) is the leading site for searching and finding information throughout the world wide web. That's fine by me. As a user, I really like Google; most of the time, I find whatever information I'm looking for. So it has earned its status.

As a maintainer of various websites, though, I find that Google really stresses me out. It's quite easy to announce a website. It's also no problem to set up a Google account, use the Webmaster Tools, submit a sitemap (or several) and play with the settings.

I truly like the algorithm behind how Google works. PageRank is no easy thing, and Google does a great job. For me, as the maintainer of very small websites, it's unfortunately very hard to get a high ranking. I'm quite often on the very first page of search results, as long as the queries are specific enough. To rank higher for less specific searches, I would need lots of websites linking back to mine. Okay, that's the way Google works: a link to website X on website Y is treated as a vote of website Y for website X. Manipulating that is a long road.
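
Just to illustrate the principle (a toy sketch, not Google's actual implementation): every page distributes its own rank across its outgoing links, so a vote from a highly ranked page is worth more.

```python
# Toy PageRank by power iteration: a link from Y to X is a vote
# of Y for X, weighted by Y's own rank. Purely illustrative.
def pagerank(links, damping=0.85, iterations=50):
    """links maps each page to the list of pages it links to."""
    pages = list(links)
    rank = {page: 1.0 / len(pages) for page in pages}
    for _ in range(iterations):
        new_rank = {page: (1.0 - damping) / len(pages) for page in pages}
        for page, targets in links.items():
            for target in targets:
                new_rank[target] += damping * rank[page] / len(targets)
        rank = new_rank
    return rank

# X collects the votes of Y and Z, so it ends up ranked highest.
print(pagerank({"X": ["Y"], "Y": ["X"], "Z": ["X"]}))
```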

While checking the ranking and indexed pages of my sites nearly every day, I discovered a few things that really stress me out:

1. Googlebot-Images

Two images from one of my websites are in the Google Images index. They are very old and were probably indexed back in November 2007. Over the last few days, I studied lots and lots of websites and blogs and discovered that you have almost no way to influence Googlebot-Images. Lucky you, if it comes around. People on several sites stated that it may take up to 24 months for your images to be indexed.

What a pity.

2. Dynamically generated websites

Once you have a website that serves its content dynamically, you either need Google to behave just like a real user, clicking from one page to the next until it has indexed all your pages, or you need to submit sitemaps that are generated dynamically as well.
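
For reference, a sitemap in the sense of the sitemaps.org protocol (which the Google Webmaster Tools accept) is just an XML file with one <url> entry per page; the URL below is a made-up example:

```xml
<?xml version="1.0" encoding="UTF-8"?>
<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">
  <url>
    <loc>http://example.com/archive/2008/01/some-post/</loc>
    <lastmod>2008-01-15</lastmod>
    <changefreq>monthly</changefreq>
  </url>
  <!-- ... one <url> entry per page ... -->
</urlset>
```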

And that's what I did over the last few days. I wrote quite simple scripts which (depending on the type of website) fetch the URLs from the database and generate a listing of links. Two such types are the WordPress blog and the PixelPost photoblog.
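
For the WordPress case, such a script can be as simple as the following sketch, which emits the XML format shown above (Google also accepts a plain text list of URLs, one per line). The base URL, the credentials and the slug-based permalink pattern are placeholders; PixelPost works the same way, just with its own tables.

```python
# Minimal sketch of a sitemap generator for a WordPress blog.
# ASSUMPTIONS: default wp_posts table, slug-based permalinks,
# placeholder credentials and base URL -- adapt to your setup.
import MySQLdb  # any MySQL client library works the same way

BASE_URL = "http://example.com"  # hypothetical blog address

db = MySQLdb.connect(host="localhost", user="wp",
                     passwd="secret", db="wordpress")
cursor = db.cursor()
cursor.execute(
    "SELECT post_name, post_modified FROM wp_posts "
    "WHERE post_status = 'publish' AND post_type = 'post'"
)

print('<?xml version="1.0" encoding="UTF-8"?>')
print('<urlset xmlns="http://www.sitemaps.org/schemas/sitemap/0.9">')
for slug, modified in cursor.fetchall():
    print('  <url>')
    print('    <loc>%s/%s/</loc>' % (BASE_URL, slug))
    print('    <lastmod>%s</lastmod>' % modified.strftime('%Y-%m-%d'))
    print('  </url>')
print('</urlset>')
```

Run from cron into a static file, or served directly through a rewrite rule, this keeps the sitemap in sync with the database.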

Another example: Gallery2. Unfortunately, I found no easy way to extract the links from the database, but Gallery2 comes with a built-in sitemap.

But here comes the next thing that stresses me out: about four days ago, I submitted a Gallery2 sitemap with more than 900 URLs. As of today, only 220 of them have been indexed.

As fast as Google's search engine is, its indexing is just as slow.

3. Sitelinks

Sitelinks are a very nice idea and look quite impressive. But how do you get sitelinks for your websites? Google says there is an algorithm that calculates whether or not sitelinks would help the user, and it automatically decides whether sitelinks are generated. But to this day, I have found no information about how this algorithm works.

Damn, I’d love to have sitelinks.

Conclusion: the best and easiest way to get your websites to the top of the search results is to have them indexed (and to be patient about it), to have specific content on your site, and to rely on interested users entering good search queries.

Good night so far…
