Poll-n-Ping, coz u r busy blogging

I would like to introduce a brand new service: an automated blog search directory pinging service named Poll-n-Ping. It differs from Ping-o-matic and similar services in that Poll-n-Ping monitors the blog (actually the feed) for changes, and when it detects a change it automatically pings the blog search directories.

You can check out the service at http://www.mohanjith.net/pnp. All of this comes free of charge, but donations are always welcome. Right now there is no limit on the number of blogs a single user can monitor. If you want your blog submitted to all the blog search directories we add support for from time to time, you will have to visit Poll-n-Ping regularly.

Soon I plan to add an alert service to Poll-n-Ping, with which subscribed users can receive notification e-mails or IMs when content changes, the blog goes offline, and/or the blog comes back online. However, this will be a paid service unless I receive enough donations to cover the hosting.

Poll-n-Ping has TurboGears under the hood :-).

Hope you will find the Poll-n-Ping service useful.

Hacking TurboGears: Automatically logging in users

I love the way Drupal handles account activation and password reset. The user just has to click a link that they receive via e-mail, and they are automatically logged in.

I wanted to do something similar in one of the applications I’m developing right now using TurboGears. I thought I would write a new identity provider, but instead went about hacking TurboGears itself. I noticed that the default TurboGears soaprovider could be improved to separate user authentication from marking a user as authenticated, making that second step reusable.

In my application’s controller I use this newly introduced method to mark the user as authenticated. I thought someone else might hit the same problem, so I decided to blog about it.
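To give a rough idea of how the controller side can look, here is a minimal sketch. The provider method name, the way the provider and visit key are reached, and the ActivationToken model are all placeholders and assumptions on my part, not the names from the patch; the point is only that the controller verifies the e-mailed token itself and then asks the provider to mark the current visit as authenticated, without any password check.

[sourcecode language='py']# Rough sketch only: mark_user_authenticated and ActivationToken are
# placeholders, not the names used in the actual patch.
import turbogears
from turbogears import controllers, expose, identity, visit

class Root(controllers.RootController):

    @expose()
    def activate(self, token):
        # Look up the pending activation using the token from the e-mailed
        # link (ActivationToken is a hypothetical model class).
        activation = ActivationToken.by_token(token)
        user = activation.user
        user.activated = True

        # Because the patched provider separates the password check from
        # marking a user as authenticated, we can skip the password check
        # and simply associate the current visit with this user. How you
        # reach the provider and the visit key may differ in your setup.
        identity.current_provider.mark_user_authenticated(
            user, visit.current().key)

        raise turbogears.redirect('/')[/sourcecode]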

You can download the patch from http://www.mohanjith.net/downloads/scripts/python/TurboGears/1.0.4.3/soaprovider.diff; it was created against TurboGears 1.0.4.3.

Automagically ping blog search engines

I wanted to automatically ping Technorati, Icerocket, and Google Blog Search; that is, the blog search engines should be pinged with no manual intervention. I was fine with a delay of up to 15 minutes.

So I went about using the XML-RPC ping services provided by the blog search engines and came up with the Python script below. I set up a cron job to invoke the script every 15 minutes. See below for the source.
[sourcecode language='py']#!/usr/bin/python
# Ping blog search engines whenever the blog feed changes.

import xmlrpclib
import urllib2
import os

from hashlib import md5

feed_url = '[Your feed url]'
blog_url = '[Your blog url]'
blog_name = '[Your blog name]'
hash_file_path = os.path.expanduser("~/.blogger/")

def main():
    # Fetch the current feed content.
    req = urllib2.Request(feed_url)
    response = urllib2.urlopen(req)
    feed = response.read()

    # The stored digest lives in a file named after the blog URL's hash.
    if not os.path.isdir(hash_file_path):
        os.makedirs(hash_file_path)
    hash_file_name = hash_file_path + md5(blog_url).hexdigest()

    if os.path.exists(hash_file_name):
        hash_file = open(hash_file_name, "r+")
        last_digest = hash_file.read(os.path.getsize(hash_file_name))
    else:
        hash_file = open(hash_file_name, "w")
        last_digest = ''

    curr_digest = md5(feed).hexdigest()

    # Ping only if the feed has changed since the last run,
    # then overwrite the stored digest with the current one.
    if curr_digest != last_digest:
        ping = Ping(blog_name, blog_url)
        responses = ping.ping_all(['icerocket', 'technorati', 'google'])
        hash_file.seek(0)
        hash_file.write(curr_digest)

    hash_file.close()

class Ping:
    def __init__(self, blog_name, blog_url):
        self.blog_name = blog_name
        self.blog_url = blog_url

    def ping_all(self, down_stream_services):
        responses = []

        for down_stream_service in down_stream_services:
            # Look up the matching private method, e.g. self._technorati.
            method = getattr(self, '_' + down_stream_service)
            responses.append(method())

        return responses

    def _icerocket(self):
        server = xmlrpclib.ServerProxy('http://rpc.icerocket.com:10080')
        response = server.ping(self.blog_name, self.blog_url)
        # print "Icerocket response : " + str(response)
        return response

    def _technorati(self):
        server = xmlrpclib.ServerProxy('http://rpc.technorati.com/rpc/ping')
        response = server.weblogUpdates.ping(self.blog_name, self.blog_url)
        # print "Technorati response : " + str(response)
        return response

    def _google(self):
        server = xmlrpclib.ServerProxy('http://blogsearch.google.com/ping/RPC2')
        response = server.weblogUpdates.ping(self.blog_name, self.blog_url)
        # print "Google blog search response : " + str(response)
        return response

main()[/sourcecode]
Whenever the script is invoked it fetches the post feed content, creates an MD5 hash of it, and compares that hash against the last known hash; if they differ, it pings the given list of services and stores the new hash.

This is very convenient if you have somewhere to run the cron job. Even your own machine is sufficient if you can keep it on for at least 15 minutes after the blog post is made.
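For reference, a crontab entry along these lines (the path to the script is just a placeholder) will invoke it every 15 minutes:

 */15 * * * * /usr/bin/python /home/[you]/bin/blog-ping.py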

To run the script you need Python 2.4 or later and the hashlib package (it ships with the standard library from Python 2.5 onwards). Hope you will find this useful.

GNOME Web browser Creative Commons extension

I’m happy to announce the release of cc-license-viewer 1.1.0 for Epiphany, the GNOME Web browser. It is capable of detecting Creative Commons licensed web pages, either via RDF metadata or via the license badge from creativecommons.org, and displaying an icon in the status bar.

This is a modified version of the cc-license-viewer released by Jaime Frutos Morales. His extension is not capable of detecting web pages with the Creative Commons license badge.
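To give an idea of what the badge detection involves, here is a standalone sketch of the idea in plain Python. This is not code from the extension, and the regular expression is purely illustrative: a page is treated as CC licensed if it links to a creativecommons.org license URL.

[sourcecode language='py']# Standalone sketch of the detection idea -- not the extension's code.
import re
import urllib2

# A page is treated as CC licensed if it links to a creativecommons.org
# license URL; this covers both rel="license" markup and the badge that
# links to the license deed. The regex is illustrative, not exhaustive.
CC_LICENSE_RE = re.compile(
    r'href="(http://creativecommons\.org/licenses/[^"]+)"', re.IGNORECASE)

def find_cc_license(html):
    match = CC_LICENSE_RE.search(html)
    if match:
        return match.group(1)
    return None

if __name__ == '__main__':
    page = urllib2.urlopen('http://creativecommons.org').read()
    print find_cc_license(page)[/sourcecode]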

You can download the extension cc-license-viewer-1.1.0.tar.gz

Follow the steps below to install the extension. I’m assuming you have already installed epiphany and epiphany-extensions.

Step 1 – Download the extension archive

 $ wget http://www.mohanjith.net/downloads/gnome/epiphany/extensions/cc-license-viewer/cc-license-viewer-1.1.0.tar.gz

Step 2 – Extract the extension archive to epiphany extensions directory

 $ cd /usr/lib/epiphany/2.20/extensions/
 $ tar -xzvf [Location_to_archive]

Step 3 – Restart epiphany and enable CC extension

Go to Tools -> Extensions and then select the check box next to Creative Commons license viewer.

Step 4 – Go to a Creative Commons licensed page

Go to a CC licensed page, e.g. http://creativecommons.org

My next plan is to extend the functionality of this extension so that a more informative icon is shown in the status bar. For the time being, have fun with this extension.