#apache - Fri 23 Mar 2007 between 03:19 and 03:58

<noodl> kanonFive[dot]bl: what's this about you needing to be banned?
<megaspaz> jpeg: try to join a-h now
<noodl> it's a frickin' bot
<megaspaz> too narrow
* fajita stop cheering, megaspaz
* megaspaz pushes fajita down the stairs
<fajita> Hey! *thud* son of a ... *crash* what the *thump* ouch *crunch* ow
* scx cheers fajita
<scx> mmm fajitas...
<scx> I'm hungry now...
<scx> well, don't see how in wget
<scx> wget --save-headers -r http://www.example.com
<scx> but it downloads the indexes
<scx> I just want the list of files and folders
<noodl> scx: you're trying to get a list of files and directories linked from a web page without saving them?
<scx> I want to know if some file is on the site
<scx> it'll do it
<noodl> um, i don't know, sorry
<noodl> i expect it wouldn't be hard to write a script to do it though
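Such a script is indeed short with Python's standard library. The sketch below only parses anchor tags out of an already-fetched Apache index page; the sample HTML is a made-up placeholder, and fetching the real page (e.g. with urllib) is left out:

```python
from html.parser import HTMLParser


class LinkLister(HTMLParser):
    """Collect href targets from <a> tags -- the entries an Apache
    'Indexes' listing presents as files and subdirectories."""

    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        # attrs is a list of (name, value) pairs for the tag.
        if tag == "a":
            for name, value in attrs:
                if name == "href" and value:
                    self.links.append(value)


def list_links(html):
    """Return every link target found in an HTML page, in document order."""
    parser = LinkLister()
    parser.feed(html)
    return parser.links


# Hypothetical index page; a real one would come from urllib.request.urlopen().
page = '<html><body><a href="docs/">docs/</a> <a href="readme.txt">readme.txt</a></body></html>'
print(list_links(page))  # ['docs/', 'readme.txt']
```

This only works if, as jpeg notes below, the remote site actually serves directory listings.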
<scx> I guess, but what command could I use to access an HTTP server and ls it?
<jpeg> you have to rely on the remote site having Indexes turned on.
<jpeg> if you're looking for a specific file, just request it.
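Requesting it doesn't have to mean downloading it: a HEAD request returns only the status line and headers. A minimal sketch with Python's urllib, where the URL is a made-up placeholder and the actual send is commented out because it needs network access:

```python
from urllib.request import Request, urlopen

# Hypothetical URL -- substitute the file you are probing for.
url = "http://www.example.com/docs/manual.pdf"

# HEAD asks the server for headers only, so the file body is never transferred.
req = Request(url, method="HEAD")

# Sending it would look like this:
# with urlopen(req) as resp:
#     print(resp.status)   # 200 if the file exists, 404 if not

print(req.get_method())  # HEAD
```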
<scx> well, it is a site that claims that 'cause I am using Linux I can not access a part of the site
<scx> but it is nonsense 'cause the service does not even exist
<megaspaz> change your UA
<scx> and I want to show 'em in their ugly faces
<scx> the link is basically dead
<scx> but I want to see if the docs are on the site somewhere
<scx> dunno the exact name of the files anyway
<scx> megaspaz: UA ?
<megaspaz> user agent... but if the link is dead, then it's dead... i thought they just had some ua checking that said you no accessy this site
<scx> megaspaz: oh... 1) the link is broken, but as an excuse they say it is my system. 2) it is just a pretext 'cause I am almost sure the service doesn't even exist
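Changing the UA, as megaspaz suggests, just means sending a different User-Agent header so the server's browser check passes. A sketch with Python's urllib; the URL and browser string are placeholders, and the equivalent one-liner with the tool already in use would be `wget --user-agent="..." URL`:

```python
from urllib.request import Request

# Both values are placeholders: the site being tested and a desktop-browser UA.
url = "http://www.example.com/members/"
browser_ua = "Mozilla/5.0 (Windows NT 5.1; rv:2.0) Gecko/20100101 Firefox/4.0"

# Override the default "Python-urllib/x.y" identifier with the browser string.
req = Request(url, headers={"User-Agent": browser_ua})

# urllib stores header names in capitalized form, hence "User-agent" here.
print(req.get_header("User-agent"))
```

If the page still 404s with a browser UA, the check was never the real problem, which is exactly the point scx wants to demonstrate.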
