directly visiting a website's /process.php url

DaveGears86

Member
I often get empty emails from my PHP form, even though I have validation in place on the fields.

I've realised this is happening because some sort of crawler/bot is visiting the URL directly (www.----.com/process.php), which is the script that processes the data entered into the fields and emails it to me.

Is there any way to prevent the script from executing when it's visited/crawled directly? The blank, empty forms I get as emails are a pain!

Thanks
 
Thanks, are you aware of any other precautions that can be taken?

Hopefully the robots.txt "Disallow:" is enough to hold them off, though!
 
Yeah, I'd have your processing script in a separate folder and disallow the whole folder for bots. You may get a rogue hacker bot or two but it should prevent the lion's share of blanks.
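For example, if the processing script lived in a folder called /scripts/ (just an example name), the robots.txt entry would look like:

Code:
User-agent: *
Disallow: /scripts/

Well-behaved crawlers (Googlebot and friends) will then skip everything in that folder, though as mentioned, rogue bots ignore robots.txt entirely.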
 
Cheers, I've been doing that with my image folders lately (not that it makes the /images/ directory strictly secure, but it's good practice and stops the directory listing being displayed, etc.)
 
Actually, there is something else you can do (thinking about it). I assume your validation on the form is JavaScript? That's fine for human users with JS enabled, but bots don't care. Try putting some basic validation in the PHP script itself. For example, if you put the POST value from the submit button into a variable, the script can check that the button was actually pressed (i.e. the page wasn't accessed directly) before anything executes:



Code:
$formsub = isset($_POST['Submit']) ? $_POST['Submit'] : false;

if ($formsub) {
    // execute your script here
} else {
    // do something else
}

That's a bit basic but should give you the idea.
 
ahhh, now we're talking! Some in-script validation - good stuff. I'll have a play about with this and see if I can get a bulletproof result - thanks again :icon_cheers:
 
I did some searching and indirectly stumbled across this -

if($_SERVER['REQUEST_METHOD'] == "POST")

It seems to work fine and responds as I need. Are there any potential issues or problems with it that I'm unaware of?
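For reference, this is roughly how I've wired it in (just a sketch; the function names and the 'name'/'email' fields are my own labels from my form, nothing official):

```php
<?php
// True only when the page was reached via a form POST, not a
// direct GET visit from a crawler.
function is_form_post() {
    return isset($_SERVER['REQUEST_METHOD'])
        && $_SERVER['REQUEST_METHOD'] == "POST";
}

// A required field counts as filled only when it is present
// and not just whitespace.
function field_filled($key) {
    return isset($_POST[$key]) && trim($_POST[$key]) !== '';
}

// Only send the email when both checks pass, so a direct visit
// or an empty submission never triggers the mailer.
if (is_form_post() && field_filled('name') && field_filled('email')) {
    // ... build and send the email here ...
}
```

That way the REQUEST_METHOD check and the empty-field check back each other up.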

Thanks
 
From my basic knowledge of security, I'd say process.php is still insecure. A hacker could spoof a form submission from another domain and hijack the form to send out spam via Bcc. At the very least, process.php should check that the form was submitted from the same domain/server. Secondly, input should be validated to check that header fields do not contain any injected code like 'Cc:' or 'Bcc:'.
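To show what I mean by checking for injected headers, something along these lines would do it (a rough sketch, not a complete solution; the function name is just illustrative):

```php
<?php
// Reject any value containing newlines or injected mail headers -
// the trick spammers use to smuggle their own Cc:/Bcc: recipients
// into the message.
function safe_email($value) {
    if (preg_match('/(\r|\n|%0a|%0d|cc:|bcc:|content-type:)/i', $value)) {
        return false;
    }
    return filter_var($value, FILTER_VALIDATE_EMAIL) !== false;
}

// Example usage before the value goes anywhere near mail() headers:
// if (!safe_email($_POST['email'])) { die('Invalid input.'); }
```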

From painful experience, once spam does get sent out, your server IP very quickly gets blacklisted and emails sent from it start bouncing back. It takes a good week or two to get off the main blacklists and about a month to get off the Google Mail blacklist.
 