seo tricks : old wine in new bottles…

Get some pagerank : this trick used to require tedious, boring link checking, but since SeoLinx (an extension of SeoQuake) came along, that has become a lot easier. SeoLinx shows the stats of a link's target URL, so you don't have to go to every page to retrieve the stats. Cool plugin. Let's put it to some practical use.

the trick : comment on old forum threads

Once you have SeoLinx installed, find an ‘old’ forum, register if you haven't already, and make sure you get a signature link. Sometimes you first have to be a member for a week or write ten posts, but once you have a sig-link you get backlinks off the forum.

Then go comment on really old forum threads.

With SeoLinx you can easily spot the juicy old threads. Old threads on, for instance, DigitalPoint or Webmasterworld are sometimes pagerank 3. In the case of the DP post: PR2 with 8 posts at the time of writing.

Pick a forum and browse to the last page of the thread listing. Hover over the thread anchor and SeoLinx shows you the pagerank of the thread page. As long as the number of posts is below the posts-per-page limit (10 or 16, depending on the forum settings), your comment will land on the first page of the thread, the page that carries that nice pagerank and juice.
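
To make the cutoff concrete, here is the pagination arithmetic as a minimal PHP sketch (the function name and example values are mine, not anything a forum exposes):

<?php
// Which page will a new reply land on?
// $postsPerPage is whatever the forum is configured to show (often 10 or 16).
function replyLandsOnPage($existingPosts, $postsPerPage)
{
    // The new reply becomes post number $existingPosts + 1,
    // and forum pagination fills page 1 first.
    return (int) ceil(($existingPosts + 1) / $postsPerPage);
}

echo replyLandsOnPage(8, 10);  // 1 - the PR2 thread with 8 posts: reply lands on page 1
echo replyLandsOnPage(16, 10); // 2 - too late, the reply ends up on page 2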

Old wine in new bottles can be a sweet thing.

the benefit

A pagerank 3 ‘targeted’ anchor is worth about $9 a month, roughly $100 a year. It can take an hour to find a juicy one, but hey, $100 of value for an hour's work is well worth the trouble.


I might make this a blog feature: seo tips and tricks of the month.

spidering

Someone asked about the ‘pagerank spider’. I put the code online as-is; it isn't finished, and if I wanted to finish it I would make a few changes.

the main remaining issues are:

1. memory usage
2. how to handle the www-prefix
3. indexed pages at google
4. http codes

1. A big class uses a lot of memory; a MySQL-backed version has an extra dependency, takes longer to develop, and is slower. I needed a fast spider for quick feedback on a small site.
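
For illustration, a minimal sketch of that in-memory approach (not the code I put online; the start URL and page cap are placeholders, and it only follows absolute same-host links to keep it short):

<?php
// Minimal in-memory spider: everything lives in PHP arrays, which is fast
// but is exactly where the memory goes on larger sites.
function crawl($startUrl, $maxPages = 50)
{
    $host    = parse_url($startUrl, PHP_URL_HOST);
    $visited = [];
    $queue   = [$startUrl];

    while ($queue && count($visited) < $maxPages) {
        $url = array_shift($queue);
        if (isset($visited[$url])) {
            continue;
        }
        $visited[$url] = true; // mark before fetching so failures aren't retried
        $html = @file_get_contents($url);
        if ($html === false) {
            continue;
        }
        $dom = new DOMDocument();
        @$dom->loadHTML($html); // suppress warnings on sloppy real-world markup
        foreach ($dom->getElementsByTagName('a') as $a) {
            $href = $a->getAttribute('href');
            // Keep it simple: only follow absolute links on the same host.
            if (parse_url($href, PHP_URL_HOST) === $host && !isset($visited[$href])) {
                $queue[] = $href;
            }
        }
    }
    return array_keys($visited);
}

print_r(crawl('http://www.juust.org/', 20));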

Check out phpDig: they have a mature open-source(?) spider with a MySQL backend, plus a user group and forum.

2. google webmaster has a section where you can choose to have all your indexed domain pages represented as either juust.org or www.juust.org. It hints at that choice having an influence on page ranking, but there is no actual straightforward ‘rule’; I have no idea what the actual impact is.
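
On the spider side, one pragmatic fix is to pick a canonical form and stick to it; a minimal sketch (stripping the www. is an arbitrary choice, the point is consistency):

<?php
// Treat www.juust.org and juust.org as the same site:
// always strip the "www." before storing or comparing hosts.
function canonicalHost($url)
{
    $host = parse_url($url, PHP_URL_HOST);
    if (!$host) {
        return null;
    }
    return preg_replace('/^www\./i', '', strtolower($host));
}

var_dump(canonicalHost('http://www.juust.org/x') === canonicalHost('http://juust.org/x')); // bool(true)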

3. google indexes and caches pages while spidering other sites that link to yours. If the linked page was valid at the time, it gets indexed and cached. Especially with files you have since dumped, or query-result and search pages: you cannot remove the cached page, but it is still counted as part of your site.

Putting search pages on ‘noindex’ is smart, especially if you use one of those funky search-box gadgets in your template that can list any result. If someone queries your site for (nasty+term) and puts the query as a link to your search page, then once that link is followed, a page from your site loaded with (nasty+term) gets indexed, and you cannot erase it from the cache; then you have a problem. Give the file a robots ‘noindex’ meta tag, try to confine the search to your own domain, or use a profanity filter.
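
For the record, a minimal sketch of what ‘noindex’ looks like on a PHP search template; both mechanisms are standard, use whichever fits:

<?php
// On the search-results template: tell spiders not to index the page.
// Option 1: an HTTP header (must be sent before any output).
header('X-Robots-Tag: noindex');
// Option 2: the meta tag, placed inside <head>.
echo '<meta name="robots" content="noindex">';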

4. http codes: I checked them out for a link-validator routine two weeks ago. I might add that MySQL backend after all and build a sturdier version, but not in the next few weeks.
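
For reference, a sketch of the status check such a routine needs, using PHP's curl extension (the URL is just an example):

<?php
// The status-code check a link validator needs: a HEAD request via curl.
function httpStatus($url)
{
    $ch = curl_init($url);
    curl_setopt($ch, CURLOPT_NOBODY, true);          // HEAD: headers only, no body
    curl_setopt($ch, CURLOPT_RETURNTRANSFER, true);  // don't echo the response
    curl_setopt($ch, CURLOPT_FOLLOWLOCATION, false); // report 301/302 as such
    curl_exec($ch);
    $code = curl_getinfo($ch, CURLINFO_HTTP_CODE);
    curl_close($ch);
    return $code; // 200 ok, 301/302 moved, 404 dead, 0 if the request failed
}

echo httpStatus('http://www.juust.org/');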

———
Some background info:
searchtools.com/robots/robot-checklist

phpDig

social bookmarking to get your site indexed

Yesterday I put one link through twitter on twemes.com and four links on del.icio.us to the links.trismegistos.net php Link Directory.

Today I googled ‘trismegistos links’ to see what the effect (if any) would be, and the link I put on twemes actually shows up first in google (top-10 front page, spot 6 of 40,000 results).

I also submitted a 700-URL sitemap to google webmaster first, and only added the bookmarks after the sitemap was downloaded.

I was just curious which method would yield the best result, and twitter/twemes is the winner.

why bother ?

Because a test I did shows that on most directory sites the subcategory pages (where most links are) have no assigned pagerank, and if I want to run an effective directory I have to get a grip on that problem and find a way to fix it.

I did two tests on directory sites, where I downloaded the indexed URLs from Yahoo SiteExplorer and retrieved the pagerank per URL.

sites | pages/site | total pages | ranked | percentage
16    | 1,000      | 16,000      | 120    | 0.7%
150   | 50         | 7,500       | 200    | 2.5%

Roughly interpreted, per site most pages are indexed, but only about 2 per 100 pages have a pagerank value assigned.

The others have no value assigned and don't pass any value on the links they carry. In all cases (except dmoz, which ranks on most branches) it was the index page and the main category pages that were ranked; the pages that actually hold the links were all N/A.

So testing the effect of social bookmarking on pages that would hold links is interesting.

A submission now costs 3 to 15 cents, which shows the value of links in a directory is low; only featured links (which usually appear on the index and main category pages, which do rank) sell for $3/year up to $40 for a permanent listing.

An estimate for a year-long link on a PR3 directory page is $13/year.

If I can get 700 pages to rank PR1 and sell links for $3/year, then 700 pages with 20 links each at $3 makes $42,000 a year. Compare that to $10/year for 20 links on a PR3 category page: 10 such pages is $2,000/year.
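
Spelled out, with nothing in it but the per-link estimates above:

<?php
// Back-of-the-envelope comparison of the two directory models.
echo 700 * 20 * 3;  // 42000 - 700 PR1 pages x 20 links x $3/year
echo "\n";
echo 10 * 20 * 10;  // 2000  - 10 PR3 category pages x 20 links x $10/year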

And if 700 social bookmarks can make sure that after a year my whole directory is ranked and indexed, brings in $42,000, and delivers the goods (a ranking link for entrants at $3/year), then a month of linkspamming is well worth the trouble.

Another option is reciprocal links on the category page itself (from an indexed page; some link pages are conveniently not indexed ;)