04/04/2012

PHP Malware C99 Shell

Intro

This post is about identifying web back doors. I recently did some research on the C99 PHP shell malware, and it seems to be very popular among hacking groups and script kiddies.

C99 PHP Shell

C99Shell is a very well designed shell that lets you do practically anything on the server, provided you have the proper access rights. Here is a list with more web back doors; the link given is actually a Google Code project, so it will probably not be accessible through corporate web gateways (those with malware filtering, URL filtering or content filtering).

Google Dorks

Nowadays someone does not even have to hack a web server; all they have to do is google for already compromised servers using Google Dorks and boom, they are inside a compromised machine. The machines found this way are usually not very interesting, because anything valuable tends to be better protected (well, not always!) and the Google crawlers only spot a back door after a relatively long time. In other words, by the time you google a web back door and find one, it has probably been found many times before you.

To be more specific, a "crawler" is a generic term for any program (such as a robot or spider) used to automatically discover and scan websites by following links from one web page to another. Google's main crawler is called Googlebot. Google publishes a list of the common crawlers you may see in your referrer logs, together with how each one should be specified in robots.txt, in the robots meta tags, and in the X-Robots-Tag HTTP directives (see the first reference below).

But if you want more fine-grained control, you can get more specific. For example, you might want all your pages to appear in Google Search, but you do not want images in your personal directory, or hidden links such as a web back door, to be found and crawled. In this case, you can use robots.txt to disallow the user-agent Googlebot-Image from crawling the files in your /personal directory (while allowing Googlebot to crawl all other files), like this:


User-agent: Googlebot
Disallow:

User-agent: Googlebot-Image
Disallow: /personal

You can also control how individual crawlers treat a page by adding per-crawler meta directives to its HTML, like this:

<meta name="robots" content="nofollow">
<meta name="googlebot" content="noindex">
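
For non-HTML resources such as images or PDFs, which cannot carry a meta tag, the same directives can be delivered as an HTTP response header instead:

X-Robots-Tag: noindex, nofollow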

The truth is that most of the time the web site is going to be crawled and easily googled no matter what you do, and an adversary will often be able to reach even unlinked pages.

Web Back-door Google-Dorks using Google Alerts

Gaining access to web back doors on already compromised machines is easier than you might think. By simply using Google Alerts you can search for web back doors all over the Internet and be notified through your Google mail inbox. The best way to do it is with the intitle:, intext: and inurl: search operators. For example, in order to google !C99madShell you simply type one of the following in the search box:

  1. intitle:!C99madShell
  2. intext: !C99madShell
  3. inurl:backdoor_name.php
Note: If you want to limit the search to your own web site you can obviously use the site: operator. For example, if you type intitle:!C99madShell site:www.maiavictim.com, boom, you search only your own web infrastructure. The following screenshots show how easy it is to automate searching for web back doors on a daily basis:
 


The best thing to do, in order to protect yourself from being hacked without finding out about it, is to regularly check your web infrastructure using Google Alerts. This is also a very good way to start a penetration test: check for already compromised web infrastructure before you begin (I know, I am brilliant)!


Expand and automate the search using basic scripting


A good way to protect yourself from script kiddies is to similarly identify all the web back doors listed in the link mentioned above (the Google Code project). A very good way to automate the whole process is with scripting!!

So first you go to Google and submit the intitle:!C99madShell dork; the Google search will return something like this:

  
If you copy the requested URL you will see that it looks exactly like this:

https://www.google.co.uk/#hl=en&sugexp=frgbld&gs_nf=1&cp=20&gs_id=4&xhr=t&q=intitle%3A!C99madShell&pf=p&output=search&sclient=psy-ab&oq=intitle:!C99madShell&aq=f&aqi=&aql=&gs_l=&pbx=1&bav=on.2,or.r_gc.r_pw.r_qf.,cf.osb&fp=f711fce0343c3599&biw=580&bih=425

Now you can use curl to search with Google dorks and save the results to your hard disk, or simply use Firefox and save the results page with Save As. You can do this with curl at your command prompt by typing:

curl -A Mozilla "http://www.google.com/search?q=C99madShell" | html2text -width 10000000 | grep "Cached - Similar" | grep "www.*\.php"
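
A small caveat before automating this: the full intitle: dork contains a colon, which must be URL-encoded as %3A, and an exclamation mark, which interactive bash will try to expand as history unless the URL is single-quoted. A minimal sketch:

curl -A Mozilla 'http://www.google.com/search?q=intitle%3A!C99madShell' | html2text -width 10000000 | grep "Cached - Similar"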

The following screenshot shows the command in action (notice the html2text Linux utility I used):


The outcome of this command will be exactly the one shown below (after all the necessary grepping is done, of course):


As you can see if you enlarge the picture, the search and filtering performed with curl is redirected into a file (after being properly grepped to keep only the desirable URLs). The output text file contains the potentially compromised web sites. Of course, some manual filtering will have to be done to remove URLs that are not really compromised.
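
To scale this up from one shell to the whole list, you can wrap the same pipeline in a small bash loop. The sketch below is only an outline: shells.txt (one shell name per line) and results.txt are hypothetical file names, and the sleep is there because Google will quickly block an IP that fires off queries in rapid succession.

#!/bin/bash
# Read each web shell name from shells.txt and run the same
# curl | html2text | grep pipeline used above, appending any
# URLs that look like hits to results.txt.
while read -r shell; do
    curl -s -A Mozilla "http://www.google.com/search?q=intitle%3A${shell}" \
        | html2text -width 10000000 \
        | grep "Cached - Similar" \
        | grep -o "www[^ ]*\.php" >> results.txt
    sleep 10   # pause between queries so Google does not block us
done < shells.txt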

Crontabbing Google Searches

The next best thing to do in order to completely automate the process is to use crontab; a good crontab tutorial can be found at clickmojo. As you will have understood after reading this post, the Internet has become quite toxic.

Here is how to run a google dork search at 6PM every night:

MAILTO=cron@yourusername.yourmailprovider.com
00 18 * * * /usr/bin/curl <google-dork to search> > logSearch.txt


Note: You can grep or sed the obtained data to analyze the results and verify that you logged only interesting URLs.
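
For example, a minimal clean-up pass (logSearch.txt is the file written by the crontab entry above; filtered.txt is just an illustrative output name) might deduplicate the log and keep only the lines that look like PHP URLs:

# Deduplicate the raw log and keep only the PHP-looking URLs
sort -u logSearch.txt | grep -o "www[^ ]*\.php" > filtered.txt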

Epilogue

Over the last two years the Internet has become more and more toxic. Even users with no significant information to expose, and small online businesses, are starting to have a hard time maintaining their blogs or web sites without taking security seriously. Please feel free to post comments and give me some feedback on how useful you find my posts...

Reference:
  1. http://support.google.com/webmasters/bin/answer.py?hl=en&answer=1061943
  2. http://www.google.com/alerts 
  3. http://clickmojo.com/code/cron-tutorial.html