Tuesday, November 18, 2008

Flash on AMD64 Linux

Adobe has answered the cries of Linux users by releasing a 64-bit Flash 10 plugin for Linux. As of this posting the plugin is an alpha release, but I've installed it and have it working in both Firefox 3 and Opera 9.62 on Ubuntu 8.10 Linux.

After closing Firefox and Opera, using Synaptic to remove flashplugin-nonfree from my system, and Cleaning up My Ubuntu Box, I followed the installation instructions given by Adobe.

Firefox wanted to update itself as soon as I removed flashplugin-nonfree, but I held off until the new plugin was installed. After extracting libflashplayer.so to ~/.mozilla/plugins, I let Firefox update via Ubuntu's automatic updates, then started it back up, and the Flash Player worked like a charm.

Opera didn't work right away, however. For some reason Opera didn't recognize my ~/.mozilla/plugins directory as a plugin directory, so I had to add that directory to Opera's plugin path.

Adding ~/.mozilla/plugins to the plugin path was simple enough:
Tools > Preferences > Advanced > Content > Plugins Options > Change Path


After adding ~/.mozilla/plugins to the list of folders and restarting Opera, the Flash Player worked properly.

Sunday, November 16, 2008

PHP Robot Check

Over the years I've seen quite a few PHP methods for determining whether a request was made by a robot or search engine spider, and they always have something to do with checking the User Agent. That works, but you have to keep up with User Agent strings, which gets annoying.
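
For contrast, here's roughly what the traditional User Agent check looks like. This is only a sketch; the pattern list is a small illustrative sample, not anything exhaustive.

<?php
// Illustrative only: the traditional User Agent approach.
// This pattern list is a tiny sample and goes stale quickly.
$agent = isset($_SERVER['HTTP_USER_AGENT']) ? $_SERVER['HTTP_USER_AGENT'] : '';
$is_robot = (bool) preg_match('/bot|crawl|spider|slurp/i', $agent);
echo $is_robot ? 'ROBOT !!' : 'Not a robot';
?>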

Today I had an idea for checking for robots using PHP and htaccess that doesn't involve the User Agent string at all. Instead it uses sessions and exploits the fact that all well-behaved robots request robots.txt before they request anything else.

I start by making sure my usual robots.txt file is in place. Then I upload a robots.php file to the same location as robots.txt with a tiny bit of code in it.

<?php
// Start a session and flag this visitor as a robot,
// then hand back the real robots.txt contents as plain text.
session_start();
$_SESSION['robot'] = 1;
header('Content-Type: text/plain');
echo file_get_contents('robots.txt');
exit;
?>


All that does is start a session, set a robot $_SESSION variable that I can check in subsequent scripts, and return the contents of robots.txt as plain text.

In htaccess I have the following rule, which transparently rewrites requests for robots.txt made by visitors/spiders to robots.php, which in turn returns the contents of robots.txt.

RewriteEngine on
RewriteRule ^robots\.txt$ robots.php [L]


Now in my applications I can easily check for robots and drop things like advertisement banners to speed up page loads, since spiders don't look at advertisements anyway.

<?php
session_start();
// The robot flag was set when this client fetched robots.txt.
echo isset($_SESSION['robot']) ? 'ROBOT !!' : 'Not a robot';
?>
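
To make the banner idea concrete, here's a minimal sketch; ads.php is a hypothetical include standing in for whatever actually renders the banners.

<?php
session_start();

// Skip the advertisement include entirely for robots,
// since spiders don't look at advertisements anyway.
if (!isset($_SESSION['robot'])) {
    include 'ads.php'; // hypothetical banner markup
}
?>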


Here are some of the benefits of doing it this way.

  1. I can continue to modify robots.txt as I normally would
  2. I don't need to keep up with changing User Agent strings
  3. Checking for the existence of a session variable is quicker than pattern matching against the User Agent string