
301 redirect

Discussion in 'Website Design Forum:' started by rsdesigner, Jan 25, 2010.

  1. rsdesigner

    rsdesigner Junior Member

    Hi all,
    I have recently updated my website (well, I'm in the process of doing so) and have removed some of the old pages. The site has changed from a predominantly HTML site to an ASP site. Google is still looking for the old URLs, so I have a few 'not found' crawl errors. The site is hosted on a Windows server using the Helm control panel. I would like to set up some 301 permanent redirects from the old URLs, if it's not too late, but I can't seem to find a way to do this from an .html page. Anyone had any experience here?
  2. tbwcf

    tbwcf Active Member

    .htaccess is a nice easy way of doing this, but it doesn't work on Windows servers. You'd need to configure IIS on the server to send redirect headers. Not sure how you'd do it through the control panel, I'm afraid...
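    For what it's worth, on IIS 7 or later a permanent redirect can also be declared per-file in web.config - a sketch, assuming a hypothetical old-page.html moving to a new-page.asp (an older IIS 6 box would need this set through the IIS manager or the hosting control panel instead):

```xml
<configuration>
  <!-- 301 one old static page to its new ASP equivalent -->
  <location path="old-page.html">
    <system.webServer>
      <httpRedirect enabled="true"
                    destination="/new-page.asp"
                    httpResponseCode="Permanent" />
    </system.webServer>
  </location>
</configuration>
```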

    The moral of the story... don't use windows servers..

    You could redirect from a PHP page:
    header("Location: /new-page.asp"); // Redirect browser (placeholder URL)
    exit; // Make sure that code below does not get executed when we redirect.

    Not the best solution, but a solution... Alternatively, use JavaScript:
    <body onload="window.location.href='newpage.html'">
  3. Levi

    Levi Moderator Staff Member

    Crawl errors - i.e. from spiders - couldn't you use a robots.txt to stop them from looking for those pages?
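    A sketch of what that robots.txt entry could look like, with a hypothetical removed page as the example - it sits in the site root, and it only stops compliant crawlers; it doesn't help human visitors following old links:

```
User-agent: *
Disallow: /old-page.html
```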
  4. Jazajay

    Jazajay Active Member

    A robots.txt entry would solve it, but any users following links or old SERPs to those pages would then be lost, and any equity from pages linking to them would be lost too.

    You could reverse engineer Tbwcf's PHP code into ASP instead, so something like ~
    Response.Status = "301 Moved Permanently"
    Response.AddHeader "Location", "/new-page.asp" ' placeholder URL

    Now I'm pretty sure that would have to go above any output, as in PHP.

    Just for clarity, if you were to redirect via PHP instead, change Tbwcf's code to ~
    header("HTTP/1.1 301 Moved Permanently");
    header("Location: /new-page.asp"); // placeholder URL

    That way it will be sent as a 301 redirect, which is more beneficial for the search engines. :)
    We could also do it via HTML - it wouldn't be a 301, though - with something like ~
    <meta http-equiv="refresh" content="2;url=/new-page.asp" />

    Which would go in the head of the document.

    The choice is really yours TBH. :)

    Hope it helps.

  5. rsdesigner

    rsdesigner Junior Member

    Thanks for the responses. Doesn't look like I have any options: the servers are Windows, the page was a .html page so I can't add any server-side code, and JavaScript redirects apparently set off search engine alarm bells. Arse! But thanks for trying.
  6. Levi

    Levi Moderator Staff Member

    Jaz, I interpreted the problem as being Google search results, not internal links.

    rsdesigner - assuming my interpretation is right - have you got a sitemap file (in the root folder) or anything for Google to 'map' the site with?

    If it's site based, maybe make a custom error page (explaining the issue) and then have a 'directory' type listing (if possible) to direct the user to the new page.
  7. Harry

    Harry Senior Member

    Jaz is right with the 301. It'll move any PageRank etc. with the file, so any links pointing to the old URI will now be forwarded to the new one, along with any link juice.
  8. Jazajay

    Jazajay Active Member

    rsdesigner, don't give up fella - I'm sure we can get creative.
    We just need to think about the problem in an unconventional way, and I tend to find the answer becomes obvious then. :)

    The first thing you could do is change the file to either a .php or .asp extension - that way the redirect code will work, though all internal links would have to be changed. Couldn't you just do that?

    Secondly, I'm not sure about IIS, but couldn't we just tell the server to parse all documents as ASP or PHP pages anyway? That way the extension can be whatever you want, but the server still looks for server-side code and parses it when it finds it.

    I'll have a look around when I have some time at how to pull that off, or whether it's even possible - as TBH IIS servers are really not my bag.

    Alternatively, noindex the page and then do a meta refresh or a JavaScript redirect - the meta will be better for all users.
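    If the JavaScript route were ever taken, one way to keep it manageable is a single shared script with a lookup table of old paths to new ones - a minimal sketch, where the paths and the newLocation helper are hypothetical examples rather than anything from rsdesigner's actual site:

```javascript
// Hypothetical map of removed .html pages to their new .asp equivalents.
var redirectMap = {
  "/index.html": "/default.asp",
  "/about.html": "/about.asp"
};

// Return the new path for an old one, or null when there is no mapping.
function newLocation(path) {
  return redirectMap.hasOwnProperty(path) ? redirectMap[path] : null;
}

// Each old page would include this in its <head>, e.g.:
// var target = newLocation(window.location.pathname);
// if (target) { window.location.replace(target); }
```

    Bear in mind rsdesigner's point that search engines treat JavaScript redirects with suspicion, so this is very much a fallback.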

  9. rsdesigner

    rsdesigner Junior Member

    The problem is that Google is looking for index.html, which is now default.asp (both did and do exist in the root folder). If I recreate the original index page and change it to an ASP page, visitors linking to my site could end up going to the index page as opposed to the default page... hope that makes sense. An XML sitemap does exist for search engine purposes and has been re-submitted to Google; I hoped that would avoid crawl errors. My page rank for some reason was very poor, so I'm not worried about carrying over any rank - in fact, I was hoping the new site would be re-evaluated and gain a better rank.
    I'm reluctant to add JavaScript or meta redirects/refreshes, as I believe these things make Google's arse twitch, and I want to avoid any possible bad page ratings.
    I'll look into custom error pages.
    I appreciate the persistent efforts to help here, many thanks.
  10. Harry

    Harry Senior Member

    That's why you should point to / and not /index.html etc. Omit file names and extensions; link to directories' roots instead.
  11. Jazajay

    Jazajay Active Member

    If you don't care about equity, whack in the meta noindex and then whack in the meta refresh. The search engines won't index it then, and you won't get into any trouble, as the page is telling them to ignore it anyway, so that's what they will do.
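    Putting those two metas together, the head of each old page would carry something like this - the destination URL here is just a placeholder:

```html
<meta name="robots" content="noindex" />
<meta http-equiv="refresh" content="0;url=/default.asp" />
```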

    Also go to Webmaster Tools and submit a page removal request for the index file that's currently indexed, if you have a Google Webmaster account. :)
