Open Pagerank 4

Remember the days when everyone and their grandma had one of these little tickers on their site, proudly displaying their Pagerank?

I was on some domain-metrics site and noticed they listed a PageRank. That surprised me: PageRank is not updated and published by Google anymore (such a shame). But apparently we now have Open Pagerank. They got an enormous amount of data from Common Crawl and Common Search, and based on that data they calculate a new Pagerank. They claim they update it every three months, but the last update is from May 2018. Be industrious. For now they only calculate PR for domains; they plan to also provide PR for URLs.

They also have a very simple API. They provide a PHP example, but the example didn't work for me, so I added

curl_setopt($ch, CURLOPT_SSL_VERIFYPEER, false);

…and then it worked like a charm. Probably some minor issue with the SSL certificate or something. (Note that CURLOPT_SSL_VERIFYPEER disables certificate verification, so this is a workaround rather than a real fix.)
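For the record, roughly the same call sketched in Python. This is hypothetical: the endpoint and the API-OPR header are what their docs showed at the time, so double-check both against the current documentation before relying on it.

```python
# Hypothetical sketch of an Open Pagerank API call; verify the endpoint
# and the API-OPR header name against their current docs.
import json
import urllib.request
from urllib.parse import urlencode

OPR_ENDPOINT = "https://openpagerank.com/api/v1.0/getPageRank"

def build_query(domains):
    # The API takes repeated domains[] parameters, one per domain.
    return urlencode([("domains[]", d) for d in domains])

def get_pagerank(domains, api_key):
    # The key is passed in an API-OPR request header.
    req = urllib.request.Request(
        OPR_ENDPOINT + "?" + build_query(domains),
        headers={"API-OPR": api_key},
    )
    with urllib.request.urlopen(req) as resp:
        return json.load(resp)
```

Called as `get_pagerank(["example.com"], "your-key")`, it should return the JSON structure with a rank per domain.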

This site gets pagerank 4 (3.93). Thank you, you are very kind. I rank at about 6.5 million; some sites do a lot better, PR 6 and ranking around 19,500.

We’ll see.





seo tricks: old wine in new bags…

Get some pagerank: this trick used to require tedious, boring link checking, but since SeoLinx (an extension of SeoQuake) it has become a lot easier. SeoLinx shows the stats of a link's target URL, so you don't have to visit every page to retrieve them. Cool plugin. Let's put it to some practical use.

the trick: comment on old forum threads

Once you have SeoLinx installed, find an 'old' forum, register if you haven't already, and make sure you get a signature link. Sometimes you first have to be a member for a week or write ten posts, but once you have a sig-link, you get backlinks off the forum.

Then go comment on really old forum threads.

With SeoLinx you can easily spot the juicy old threads. Old threads on, for instance, DigitalPoint or Webmasterworld are sometimes pagerank 3; in the case of the DP post, PR 2 with 8 posts at the time of writing.

Pick a forum and browse to the last page of its threads. Hover over a thread anchor and SeoLinx shows you the pagerank of the thread page. As long as the number of posts is below the per-page limit (10 or 16, depending on the forum settings), your comment will land on the first page of the thread, the page with that nice pagerank and juice.

Old wine in new bags can be a sweet thing.

the benefit

A pagerank 3 'targeted' anchor is worth about $9 a month, roughly $100 per year. It can take an hour to find a juicy one, but hey, $100 of value for an hour's work is well worth the trouble.

I might make this a blog feature, seo tips and tricks of the month.


Someone asked about the 'pagerank spider'. I put the code online as is; it isn't finished, and if I wanted to finish it I would make a few changes.

the main remaining issues are:

  1. memory usage
  2. how to handle the www.-prefix
  3. indexed pages at Google
  4. http codes

1. A big class uses a lot of memory, while a MySQL-backed version has an extra dependency, takes longer to develop, and is slower. I needed a fast spider for quick feedback on a small site.
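The spider code itself isn't reproduced here, but the in-memory approach described above can be sketched like this; all names are mine, not the original code's, and the RAM-resident seen-set and queue are exactly the memory issue in question.

```python
# Minimal in-memory spider sketch: fine for a small site, but the
# seen-set and queue grow with every discovered URL.
from collections import deque
from html.parser import HTMLParser
from urllib.parse import urljoin, urlparse
from urllib.request import urlopen

class LinkParser(HTMLParser):
    """Collects the href of every <a> tag it sees."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)

def crawl(start, limit=50):
    domain = urlparse(start).netloc
    seen, queue = {start}, deque([start])
    while queue and len(seen) < limit:
        url = queue.popleft()
        try:
            html = urlopen(url, timeout=5).read().decode("utf-8", "replace")
        except Exception:
            continue  # skip pages that fail to load or decode
        parser = LinkParser()
        parser.feed(html)
        for href in parser.links:
            absolute = urljoin(url, href)
            # stay on the same host, and never enqueue a URL twice
            if urlparse(absolute).netloc == domain and absolute not in seen:
                seen.add(absolute)
                queue.append(absolute)
    return seen
```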

Check out phpDig: they have a mature open-source(?) spider with a MySQL backend, plus a user group and forum.

2. Google's webmaster tools have a section where you can choose whether all your domain's pages are indexed under the www. host name or the bare domain. The documentation hints that this influences page ranking, but gives no actual straightforward rule. I have no idea what the actual impact is.
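Whatever Google does with it, a spider still has to pick a side. One common normalisation (my own sketch, not any published rule) is to fold the www. prefix away when deduplicating hosts:

```python
# Treat "www.example.com" and "example.com" as the same host when
# deduplicating crawled URLs. A sketch of one possible policy.
from urllib.parse import urlparse

def canonical_host(url):
    host = urlparse(url).netloc.lower()
    return host[4:] if host.startswith("www.") else host
```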

3. Google indexes and caches pages while spidering other sites that link to yours. If the linked page was valid at the time, it gets indexed and cached. Especially with files you dumped, query-result pages, or search pages, you cannot remove the cached copy, yet it is counted toward your site.

Putting search pages on 'noindex' is smart, especially if you use one of those funky search-box gadgets in your template that can list any result. If someone queries your site for (nasty+term) and posts the query as a link to your search page, then once that link is followed, a page from your site loaded with (nasty+term) gets indexed, and you cannot erase it from the cache. Then you have a problem. Give the search page a robots meta tag set to 'noindex', try to confine the search to your own domain, or use a profanity filter.

4. HTTP codes: I checked them out for a link-validator routine two weeks ago. I might add that MySQL backend after all and make a sturdier version, but not in the next few weeks.
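For what it's worth, the status-code side of such a link-validator routine can be sketched like this; the function names and the report buckets are mine, not the routine's.

```python
# Fetch a link's HTTP status and sort it into rough buckets for a report.
import urllib.error
import urllib.request

def link_status(url, timeout=5):
    # HEAD keeps the check light; some servers reject HEAD, in which
    # case you would fall back to GET.
    req = urllib.request.Request(url, method="HEAD")
    try:
        with urllib.request.urlopen(req, timeout=timeout) as resp:
            return resp.status
    except urllib.error.HTTPError as e:
        return e.code

def classify(status):
    # Rough buckets; tweak to taste.
    if 200 <= status < 300:
        return "ok"
    if 300 <= status < 400:
        return "redirect"
    if 400 <= status < 500:
        return "broken"
    return "other"
```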

Some background info: /robots, /robot-checklist