
Directly visiting a website's /process.php URL

Discussion in 'Website Coding & Programming Forum:' started by DaveGears86, Jun 24, 2012.

  1. DaveGears86

    DaveGears86 Member

    I often get emails from my php form which are empty (I have validation in place on the fields).

    I've realised that the reason this is happening is that some sort of crawler/bot is visiting the URL directly (which is the process script that fetches the data entered into the fields and emails it to me, etc.).

    Is there any way to prevent the script from executing when visited/crawled directly? The blank empty forms that I get as emails are a pain!

  2. Corrosive

    Corrosive Moderator Staff Member

    Just set the script page as disallowed in your robots.txt. That will stop the genuine bots.
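
    For reference, the robots.txt entry would look something like this (assuming process.php sits in the site root; adjust the path to match):

    ```
    User-agent: *
    Disallow: /process.php
    ```

    Note that robots.txt is purely advisory; well-behaved crawlers honour it, but nothing forces a bot to.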
  3. DaveGears86

    DaveGears86 Member

    Thanks, are you aware if there are any other precautions that can be taken?

    Hopefully the robots.txt "Disallow:" is enough to hold it off though!
  4. Corrosive

    Corrosive Moderator Staff Member

    Yeah, I'd have your processing script in a separate folder and disallow the whole folder for bots. You may get a rogue hacker bot or two but it should prevent the lion's share of blanks.
  5. DaveGears86

    DaveGears86 Member

    Good stuff, thanks Corrosive - appreciate it :thumb:
  6. Corrosive

    Corrosive Moderator Staff Member

    No problem, and don't forget what Paul M told you to do the other day: put an index page in your new folder :icon_wink:
  7. DaveGears86

    DaveGears86 Member

    Cheers, I've been doing that with my image folders lately (not that it makes the /images/ directory strictly secure, but it's good practice and stops the directory listing being displayed, etc.)
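
    As an aside, on an Apache host that allows .htaccess overrides, directory listings can also be switched off outright with a one-line entry (an assumption about the hosting setup, not something from the thread):

    ```
    # .htaccess - stop Apache generating directory listings for this folder
    Options -Indexes
    ```

    The empty index page then just acts as a belt-and-braces fallback.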
  8. Corrosive

    Corrosive Moderator Staff Member

    Actually, there is something else you can do (thinking about it). I assume your validation on the form is JavaScript? That's fine for human users with JS enabled, but bots don't care. Try putting some basic form validation in the PHP script itself. For example, if you put the POST value from the submit button into a variable, you can have the script check that the button was actually pressed (i.e. the page was not accessed directly) before it executes:

    $formsub = isset($_POST['Submit']) ? $_POST['Submit'] : null;
    if ($formsub) { /* execute your script here */ }
    else { /* do something else */ }
    That's a bit basic but should give you the idea.
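
    A slightly more robust variant of the same idea (my own sketch, not from the thread: the function name is made up, and it checks the request method rather than the submit button, which also catches bots that fetch the URL with a plain GET) would be:

    ```php
    <?php
    // Returns true only if the request looks like an actual form POST.
    // Pass in $_SERVER; taking it as a parameter keeps the helper testable.
    function is_form_post(array $server): bool
    {
        return isset($server['REQUEST_METHOD'])
            && $server['REQUEST_METHOD'] === 'POST';
    }

    // Usage at the top of process.php would be roughly:
    //   if (!is_form_post($_SERVER)) { exit('No direct access.'); }
    ```

    A direct visit or crawl arrives as a GET request, so the guard fails before any email is sent.
    
    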
  9. DaveGears86

    DaveGears86 Member

    Ahhh, now we're talking! Some in-script validation - good stuff. I'll have a play about with this and see if I can get a bulletproof result - thanks again :icon_cheers:
  10. DaveGears86

    DaveGears86 Member

    I did some searching and indirectly stumbled across this -

    It seems to work fine, and responds as I need. I don't see any potential issues or problems with it, unless I'm unaware of something?

  11. Edge

    Edge Active Member

    From my basic knowledge of security, I'd say that process.php is still insecure. A hacker could spoof a form submission from a form on another domain and hijack the script to send out spam by BCC. At the very least, process.php should check that the form was submitted from the same domain/server. Secondly, input should be validated to check that header fields do not contain any 'injected' code like 'CC:' or 'BCC:'.
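
    The header-injection check described here can be sketched roughly like this (the function name and the exact pattern list are my own illustration, not from the thread):

    ```php
    <?php
    // Returns true if a value destined for an email header looks like an
    // injection attempt: raw or URL-encoded newlines, or extra header names.
    function looks_injected(string $value): bool
    {
        return (bool) preg_match('/(\r|\n|%0a|%0d|cc:|bcc:|content-type:)/i', $value);
    }

    // In process.php, reject the submission if any field fails the check:
    //   if (looks_injected($_POST['email'])) { exit('Invalid input.'); }
    ```

    A same-domain check on $_SERVER['HTTP_REFERER'] can be layered on top, though the Referer header is easily spoofed and sometimes absent, so it's a weak signal at best.
    
    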

    From painful experience: once spam does get sent out, your server IP very quickly gets blacklisted and emails sent from it start bouncing back. It takes a good week or two to get off the main blacklists and about a month to get off the Google Mail blacklist.
