
Hilarious customer service thread's complaints & answers

posted on Jan, 21 2010 @ 02:23 PM
Some time ago, I stumbled upon this thread about a chat product that I was asked to embed in a website.
I'll replace the product name with an "X":

Thread title:
X Chat is killing my robots.txt file
Customer (OP):

Every time I FTP my robots.txt file (which contains information that prevents crawling robots from crawling certain folders), this file gets loaded into the live chat as a new visitor that wants to chat.
When I quit this robots.txt chat, the X chat also removes the file from the server.
Sounds like another bug to me.
I seem to be stumbling over a lot of these bugs and mistakes.
I don't want to place the live chat files into a sub folder so that it can't reach the robots.txt, which obviously resides in the root.
Any other solution out there?


First reply:

X chat shouldn't do this.
Are you sure robots.txt is configured properly? Last time I heard, more than 80% of these files were incorrect.
It's also a good idea to isolate all scripts in separate directories for troubleshooting and debugging.


To clarify (for those who don't know what "robots.txt" is), here is what this file is, what it does and how it's made:
it is a .txt file that prevents some spiders (of your choosing) from crawling certain directories. Basically, instead of setting up a line like < META NAME = "ROBOTS" CONTENT = "NOINDEX, NOFOLLOW" > on each document that shouldn't be crawled, you set up a single txt file in which you declare what you don't want crawled. In the following example, we are disallowing the crawling of the folders called test1, test2 and test3 (it's just an example):
User-agent: *
Disallow: /test1/
Disallow: /test2/
Disallow: /test3/

User-agent: * (meaning ALL spiders)
Disallow: /directory name/
You spare much time this way.
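If you want to check how spiders would actually interpret those rules, Python's standard library ships a robots.txt parser. A minimal sketch (the rules text and paths are just the example above, not anything from the real thread):

```python
from urllib.robotparser import RobotFileParser

# The same example rules from above, as a spider would download them.
rules = """\
User-agent: *
Disallow: /test1/
Disallow: /test2/
Disallow: /test3/
"""

parser = RobotFileParser()
parser.parse(rules.splitlines())

# A well-behaved spider asks before fetching each path.
print(parser.can_fetch("*", "/test1/page.html"))   # disallowed -> False
print(parser.can_fetch("*", "/public/page.html"))  # not listed -> True
```

Note that robots.txt is purely advisory: it only stops crawlers that choose to obey it.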

Customer:

Hello,
I know and agree that it shouldn't do this:-) but it keeps defying me on this point:-D.
My robots.txt is set up fine no mistakes there. This is what it looks like:
User-agent: *
This first line dictates that the rules are set up for all spiders
Disallow: /test1/
Disallow: /test2/
Disallow: /test3/
These three lines list the folders spiders are not allowed to crawl.
I don't want this chat to run anywhere else but the root, so I am not going to stick it in a folder. I just want it a bit more bug free than it currently is. Not too much to ask for a program that's being sold.


Here, it switches from funny to hilarious:

Customer:


Hmm I just realized that the chat is killing every txt file I put on the server on the same location as the chat is (which is in the root).

I just tried to add a search engine which used one txt file, and as soon as I uploaded the text file to the server it logged into the chat. I quit that chat (because it wasn't a real chat, just a txt file logging in) and voila: gone was my txt file from the server.
What's up with that?


Lol, I've never seen such a cannibal app before: basically it makes every .txt file in its immediate vicinity vanish.

