originally posted by: VivreLibre
Having same issues. CIA is extracting user data from the last month so it takes up a bit of bandwidth. I'm sure they'll be finished soon.
originally posted by: StargateSG7
originally posted by: VivreLibre
Having same issues. CIA is extracting user data from the last month so it takes up a bit of bandwidth. I'm sure they'll be finished soon.
---
Actually, I think it's PARTIALLY because of this posting:
www.abovetopsecret.com...
The NSA, DIA, CIA, CSIS, CSE, MI5, MI6, GCHQ,
The Illuminati, the Bilderbergers, the Freemasons,
The Trilateral Commission, The Federal Reserve,
the Nephilim, the Annunaki, the Reptoids,
the Greys, the Blues and the Flying Elvii
are ALL in an uproar right now and ATS
has a heads up on all the intelligence booty!
The THREE GIGABYTE Wikileaks Insurance File
is ABOUT TO BE BROKEN and its password
posted here... if it's all true of course!
The Breaking-AES-256-bit-encryption-algorithm that is!
originally posted by: Martin75
a reply to: Night Star
Please save ATS...You created a bunch of addicts! Lol
originally posted by: ratfintc
a reply to: Night Star
I have written the code for bots, spiders, and other indexing scripts and automatic crawlers of all sorts. They can take up a lot of site resources on large websites if no one bothers to write rules for their behavior. The crawlers and spiders that ignore these rules have to be IP-banned if they do not behave. I run my spiders and bots through rotating proxies to index sites that deploy basic countermeasures. Some are able to defeat my spiders and bot traffic. Most do not spend their resources effectively in order to block me. My clients pay me to get info for them from these connected sources. I do what I must to fulfill the contract. I mean no offense to the sites I target, but I will fulfill my contract. Let's make a deal is my motto.
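The behavior rules ratfintc alludes to (crawlers that ignore them "have to be IP banned") can be approximated with a sliding-window rate limiter that bans an IP once it exceeds a request budget. A minimal Python sketch; the class name, limits, and ban policy here are illustrative assumptions, not anything the poster actually described:

```python
import time
from collections import defaultdict, deque

class CrawlerRateLimiter:
    """Bans IPs that exceed a request budget inside a sliding time window,
    roughly the kind of rule a misbehaving crawler would run into."""

    def __init__(self, max_requests=10, window_seconds=1.0):
        self.max_requests = max_requests
        self.window = window_seconds
        self.hits = defaultdict(deque)   # ip -> timestamps of recent requests
        self.banned = set()

    def allow(self, ip, now=None):
        """Return True if this request should be served, False otherwise."""
        if now is None:
            now = time.monotonic()
        if ip in self.banned:
            return False
        q = self.hits[ip]
        # Drop timestamps that have fallen outside the sliding window.
        while q and now - q[0] > self.window:
            q.popleft()
        q.append(now)
        if len(q) > self.max_requests:
            # The crawler ignored the limit: ban the IP outright.
            self.banned.add(ip)
            return False
        return True
```

A polite crawler that paces its requests never trips the ban; one that bursts past the budget is cut off on the spot, which is the trade the post describes.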
originally posted by: Arbitrageur
How much video RAM do you figure a site like ATS would need to do that?
originally posted by: StargateSG7
Also you COULD, if there is the
technical inclination, run all index queries on a GPU platform with pre-indexed pages
stored in the VIDEO RAM of the GPU, which can be IMMEDIATELY presented to the bots
and crawlers.
originally posted by: pl3bscheese
a reply to: StargateSG7
Why on earth would you imagine ATS would have a need for 10gbps+ uplink? This isn't a streaming video site.
originally posted by: StargateSG7
originally posted by: ratfintc
a reply to: Night Star
I have written the code for bots, spiders, and other indexing scripts and automatic crawlers of all sorts. They can take up a lot of site resources on large websites if no one bothers to write rules for their behavior. The crawlers and spiders that ignore these rules have to be IP-banned if they do not behave. I run my spiders and bots through rotating proxies to index sites that deploy basic countermeasures. Some are able to defeat my spiders and bot traffic. Most do not spend their resources effectively in order to block me. My clients pay me to get info for them from these connected sources. I do what I must to fulfill the contract. I mean no offense to the sites I target, but I will fulfill my contract. Let's make a deal is my motto.
----
What we do on our sites is run/redirect all bot/crawler inquiries onto a separate
thread on other CPU cores that have access to a SECOND daily/weekly imaged copy
of the web pages database and NOT the main server copies. The bots get a version
that is at most a few hours old of the master web page indexes BUT they don't clog
up our main incoming user ports because those bot/crawler connections are forwarded
to the second server on secondary IP ports. This frees up bandwidth for more incoming
users. If you can live with your web pages NOT being indexed every few minutes then
create a second copy of your website on a 2nd server. Also you COULD, if there is the
technical inclination, run all index queries on a GPU platform with pre-indexed pages
stored in the VIDEO RAM of the GPU, which can be IMMEDIATELY presented to the bots
and crawlers. This can cut your main CPU usage by as much as 50x because the
REALLY FAST GPU can use its stream processors to index
individual characters in multiple strings of text SIMULTANEOUSLY!
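The core of the scheme above (humans hit the live database, bots get a replica that is at most a few hours stale) can be sketched as a user-agent routing rule. This is a hypothetical illustration only: the backend hostnames and the `BOT_MARKERS` heuristic are made-up placeholders, and a real deployment would typically do this split at the load balancer or reverse proxy rather than in application code:

```python
# Heuristic substrings that flag a crawler (placeholder assumption;
# real setups match known crawler user-agent strings or verified IP ranges).
BOT_MARKERS = ("bot", "crawler", "spider")

MAIN_BACKEND = "main-db.example.internal:5432"        # live database (placeholder name)
REPLICA_BACKEND = "replica-db.example.internal:5432"  # hours-old imaged copy (placeholder name)

def pick_backend(user_agent: str) -> str:
    """Send crawler traffic to the imaged replica so the main
    database stays free for incoming human visitors."""
    ua = user_agent.lower()
    if any(marker in ua for marker in BOT_MARKERS):
        return REPLICA_BACKEND
    return MAIN_BACKEND
```

The design choice matches the post's trade-off: crawlers see slightly stale pages, and in exchange their connections never compete with users for the main server's ports and bandwidth.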
originally posted by: pl3bscheese
I honestly don't see how 15 reqs/s of database queries is significantly holding up the system, or why it would require exotic tech like SG7 is mentioning. Admin made it clear it was about ad-blockers and DDoS. The ad-blocking notice is likely a reaction to the consequence of being black-listed by Google, which led to the intrusive ads meant to generate the equivalent of the revenue lost when the site was dropped.