
IT Query - is it possible?

Discussion in 'The Pub' at netrider.net.au started by krh998, Mar 14, 2007.

  1. Hi all

    This is a work question. From posts I've read, it looks as though there are several IT gurus here on Netrider, so I was wondering if I could pick your brains on something. :grin:

    I've been asked to set up an html email newsletter that links to a web server (in-house, IIS6) for the full article, images, etc. This part is all fine.

    The problem is that my managers want to make sure the website is not reachable via search engines or by going to the URL directly. In other words, the only way to get to the site should be via the email newsletter (without going down the login / personal-credentials path).



    One thing I was originally thinking of is to check the Referer header and only allow access from one particular referrer. But is it possible to set up something in the HTML email to achieve this?

    Are there any other options available?

    I've explained to my managers that anything of this nature is really just security by obscurity, and they're happy with that :roll:

    I'd appreciate any pointers ...

    Thanks,
    K
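    For what it's worth, the Referer idea can be sketched server-side like this (a generic Python sketch, not IIS6-specific; the newsletter URL is made up for illustration). One caveat worth knowing: the Referer header is supplied by the client, so it is trivial to spoof, and many mail clients send no Referer at all when a link is clicked.

```python
# Sketch only: a generic server-side Referer check, not IIS6-specific.
# "newsletter.example.com" is a made-up allowed referrer for illustration.
ALLOWED_REFERRER = "http://newsletter.example.com/"

def referer_allowed(headers: dict) -> bool:
    """Allow the request only if it appears to come from the newsletter page.

    Note: Referer is client-supplied, so this is security by obscurity -
    it's trivially spoofed, and many mail clients omit the header entirely.
    """
    referer = headers.get("Referer", "")
    return referer.startswith(ALLOWED_REFERRER)

# A click-through from the newsletter passes; a direct visit doesn't.
print(referer_allowed({"Referer": "http://newsletter.example.com/mar07"}))  # True
print(referer_allowed({}))  # False
```

    In practice the check would sit in whatever the server runs (an ASP page or an ISAPI filter on IIS6), but the logic is the same few lines.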
     
  2. Hmmm, I'm sure it's possible, but I have no idea how. Search engines (in general) work by sending "spiders" out to crawl the web and index the text and markup of each page they find. The fewer pages linking to your site, the less likely a search engine is to find it.

    Do you guys host your own webserver onsite? If so, why not make it an intranet site, so you have to be logged into the internal network to view it, where Google can't find it...
     
  3. A robots.txt file will stop (well-behaved) search engines from crawling and indexing your pages ...

    From .. http://www.outfront.net/tutorials_02/adv_tech/robots.htm

    What on Earth is a robots.txt File?

    A robots.txt is a file placed on your server to tell the various search-engine spiders not to crawl or index certain sections or pages of your site. You can use it to prevent indexing totally, to prevent certain areas of your site from being indexed, or to issue individual indexing instructions to specific search engines.
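    As a concrete example, a minimal robots.txt that asks every spider to stay out of the entire site (placed at the web root, e.g. /robots.txt) would be:

```
User-agent: *
Disallow: /
```

    A spider that honours the file won't crawl anything, though as others point out, not every spider honours it.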
     
  4. I work in education (read: looowwww budget) and not all schools are on our WAN yet. That means we have to rely on the internet when an intranet would be much easier for a lot of things...
     
  5. Hahahaha, don't worry. I work in education too; I'm used to the "IT expenditure".

    tim07 - I honestly haven't heard of those... guess I should start getting back up to speed on web development!
     
  6. +1 here, though after five years as the sole IT support for my department at [name removed] uni, I'm ready to move on to other things.

    The number of nights I've spent stressing about not being able to get the equipment we need because of our budget... Ultimately it's not worth it.
     
  7. The only really workable solution to your issue is to make the website password-protected through challenge/response user authentication, within IIS6's configuration settings.

    You can then 'embed' the username and password into the URL link for the images and full article etc. Eg. http://username:password@www.mysite.com.au/image.jpg

    The use of robots.txt is not reliable and is even ignored by some spiders, especially if the syntax or format isn't correct.
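    To show why embedded credentials are only light protection: the username and password sit in the URL in plain text, and any standard URL parser will pull them straight back out. A quick sketch using Python's standard library (the site and credentials are the made-up ones from the example above):

```python
# Sketch: credentials embedded in a URL are plainly visible to anything
# that parses the link. The URL below is made up for illustration.
import base64
from urllib.parse import urlsplit

url = "http://username:password@www.mysite.com.au/image.jpg"
parts = urlsplit(url)

print(parts.username)  # username
print(parts.password)  # password
print(parts.hostname)  # www.mysite.com.au

# HTTP Basic auth just base64-encodes "user:pass" - encoding, not encryption,
# so anyone who sees the link (or the header) has the credentials.
token = base64.b64encode(f"{parts.username}:{parts.password}".encode()).decode()
print(f"Authorization: Basic {token}")
```

    So it keeps casual visitors out, but anyone who forwards the email forwards the credentials with it.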
     
  8. For internal work access, limit access to the company's gateway address; for external access, maybe certificate-based?

    Trying to 'secure' something when people are unwilling to enter a password means something passive is needed.
    Embedding the username/password in the link isn't really a great idea, as it really isn't secure. Then again, how secure do you need to be?
    Being in education, if you have student photos on the intranet, you really need to watch who accesses them.

    This is all moot if you only want the newsletter accessible internally.

    I had all access to the intranet require a username/password, and logged it. If the teachers were on their laptops it'd auto-authenticate; anywhere else they needed the username/password.
    It comes down to accountability. I have had Police requesting information due to an investigation. Students logged on with their own login information, and parents were offered access to staff/teacher liaising areas with a username/password provided to the parent in person. Over the top? Maybe :D
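    The gateway-address idea boils down to a source-IP allowlist check, which can be sketched in a few lines (generic Python; the 203.0.113.0/24 range below is a made-up documentation range standing in for the company's gateway or internal addresses):

```python
# Sketch: allow a request only if its source IP is inside the company's
# address range. The range below is a made-up example network.
from ipaddress import ip_address, ip_network

INTERNAL_NET = ip_network("203.0.113.0/24")  # stand-in for the office range

def from_internal_network(remote_addr: str) -> bool:
    """True if the request's source IP falls inside the allowed range."""
    return ip_address(remote_addr) in INTERNAL_NET

print(from_internal_network("203.0.113.42"))  # True  (inside the range)
print(from_internal_network("198.51.100.7"))  # False (outside the range)
```

    On the server this is usually configured rather than coded (IIS6 has IP address restrictions in its security settings), but the decision being made is exactly this membership test.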
     