Originally posted by nuclearnuttery
Thank you for the welcome.
I'll try to be less of an A-hole here.
Here's a pic if you want!
img846.imageshack.us...
it's a sketch with some very bad math.
someone needs to go back to geometry class before tackling time travel.. heh!
edit: hi there above! didn't see your post! yes I seem to agree with you almost completely, something is rotten in the state of PP, lol.
by CBS News investigative correspondent Sharyl Attkisson.

For all those who've declared the autism-vaccine debate over - a new scientific review begs to differ. It considers a host of peer-reviewed, published theories that show possible connections between vaccines and autism.

The article in the Journal of Immunotoxicology is entitled "Theoretical aspects of autism: Causes--A review." The author is Helen Ratajczak, surprisingly herself a former senior scientist at a pharmaceutical firm. Ratajczak did what nobody else apparently has bothered to do: she reviewed the body of published science since autism was first described in 1943. Not just one theory suggested by research, such as the role of MMR shots or the mercury preservative thimerosal, but all of them.

Ratajczak's article states, in part, that "Documented causes of autism include genetic mutations and/or deletions, viral infections, and encephalitis [brain damage] following vaccination [emphasis added]. Therefore, autism is the result of genetic defects and/or inflammation of the brain."

The article goes on to discuss many potential vaccine-related culprits, including the increasing number of vaccines given in a short period of time. "What I have published is highly concentrated on hypersensitivity," Ratajczak told us in an interview, "the body's immune system being thrown out of balance."
Originally posted by the2ofusr1
I think info wars might just be a starting point for a few ... at least he gets a person thinking about subject matter
Probability theory is the branch of mathematics concerned with analysis of random phenomena.[1] The central objects of probability theory are random variables, stochastic processes, and events: mathematical abstractions of non-deterministic events or measured quantities that may either be single occurrences or evolve over time in an apparently random fashion. If an individual coin toss or the roll of a die is considered to be a random event, then if repeated many times the sequence of random events will exhibit certain patterns, which can be studied and predicted. Two representative mathematical results describing such patterns are the law of large numbers and the central limit theorem.
As a mathematical foundation for statistics, probability theory is essential to many human activities that involve quantitative analysis of large sets of data. Methods of probability theory also apply to descriptions of complex systems given only partial knowledge of their state, as in statistical mechanics. A great discovery of twentieth century physics was the probabilistic nature of physical phenomena at atomic scales, described in quantum mechanics...
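The law of large numbers and the central limit theorem mentioned above can both be seen in a few lines of simulation. The following is a minimal sketch of my own (not from the quoted article), using repeated fair coin tosses: the running fraction of heads settles near 1/2, and the spread of batch averages shrinks roughly like 1/sqrt(n).

```python
import random
import statistics

random.seed(0)

def fraction_heads(n):
    """Fraction of heads in n simulated fair coin tosses."""
    return sum(random.random() < 0.5 for _ in range(n)) / n

# Law of large numbers: the estimate tightens as the number of tosses grows.
for n in (10, 1_000, 100_000):
    print(f"n = {n:>7}: fraction of heads ~ {fraction_heads(n):.4f}")

# Central limit theorem: averages over 1,000-toss batches cluster around 1/2
# with a standard deviation close to sqrt(0.25 / 1000) ~ 0.0158.
batches = [fraction_heads(1_000) for _ in range(2_000)]
print("std dev of batch averages ~", round(statistics.stdev(batches), 4))
```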
...The mathematical theory of probability has its roots in attempts to analyze games of chance by Gerolamo Cardano in the sixteenth century, and by Pierre de Fermat and Blaise Pascal in the seventeenth century (for example the "problem of points"). Christiaan Huygens published a book on the subject in 1657.[2]
Initially, probability theory mainly considered discrete events, and its methods were mainly combinatorial. Eventually, analytical considerations compelled the incorporation of continuous variables into the theory.
This culminated in modern probability theory, on foundations laid by Andrey Nikolaevich Kolmogorov. Kolmogorov combined the notion of sample space, introduced by Richard von Mises, with measure theory, and presented his axiom system for probability theory in 1933. This quickly became the largely undisputed axiomatic basis for modern probability theory, although alternatives exist, in particular Bruno de Finetti's adoption of finite rather than countable additivity.[3]
The Bertrand paradox is a problem within the classical interpretation of probability theory. Joseph Bertrand introduced it in his work Calcul des probabilités (1888) as an example to show that probabilities may not be well defined if the mechanism or method that produces the random variable is not clearly defined.
In his 1973 paper The Well-Posed Problem[1], Edwin Jaynes proposed a solution to Bertrand's paradox, based on the principle of "maximum ignorance"—that we should not use any information that is not given in the statement of the problem. Jaynes pointed out that Bertrand's problem does not specify the position or size of the circle, and argued that therefore any definite and objective solution must be "indifferent" to size and position. In other words: the solution must be both scale invariant and translation invariant.
To illustrate: assume that chords are laid at random onto a circle with a diameter of 2, for example by throwing straws onto it from far away. Now another circle with a smaller diameter (e.g., 1.1) is laid inside the larger circle. Then the distribution of the chords on that smaller circle needs to be the same as on the larger circle. If the smaller circle is moved around within the larger circle, the probability must not change either. It is easy to see that method 3 fails this test: the chord distribution on the smaller circle looks qualitatively different from the distribution on the larger circle.
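A quick simulation makes the paradox, and Jaynes' resolution, concrete. The sketch below is my own illustration (not from the excerpt): it estimates, for each of Bertrand's three chord constructions, the probability that a random chord is longer than the side of the equilateral triangle inscribed in a unit circle (sqrt(3)). The three methods give roughly 1/3, 1/2 and 1/4; the "random radial point" construction, which matches throwing straws from far away, is the one singled out by the scale- and translation-invariance argument, while method 3 is the one that fails it.

```python
import math
import random

R = 1.0
SIDE = math.sqrt(3) * R  # side of the inscribed equilateral triangle

def chord_endpoints():
    # Method 1: two points chosen uniformly on the circumference.
    a, b = random.uniform(0, 2 * math.pi), random.uniform(0, 2 * math.pi)
    return 2 * R * abs(math.sin((a - b) / 2))

def chord_radial():
    # Method 2: midpoint chosen uniformly along a random radius
    # (equivalent to the "straws thrown from far away" picture).
    d = random.uniform(0, R)
    return 2 * math.sqrt(R * R - d * d)

def chord_midpoint():
    # Method 3: midpoint chosen uniformly inside the disc (rejection sampling).
    while True:
        x, y = random.uniform(-R, R), random.uniform(-R, R)
        if x * x + y * y <= R * R:
            return 2 * math.sqrt(R * R - (x * x + y * y))

N = 100_000
for name, draw in [("method 1 (endpoints)", chord_endpoints),
                   ("method 2 (radial)   ", chord_radial),
                   ("method 3 (midpoint) ", chord_midpoint)]:
    p = sum(draw() > SIDE for _ in range(N)) / N
    print(f"{name}: P(chord > sqrt(3)) ~ {p:.3f}")
# Expected: roughly 0.333, 0.500 and 0.250; the answer depends on the construction.
```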
The concept of the probability distribution and of the random variables it describes underlies the mathematical discipline of probability theory and the science of statistics. There is spread or variability in almost any value that can be measured in a population (e.g. height of people, durability of a metal, sales growth, traffic flow, etc.); almost all measurements are made with some intrinsic error; in physics many processes are described probabilistically, from the kinetic properties of gases to the quantum mechanical description of fundamental particles. For these and many other reasons, simple numbers are often inadequate for describing a quantity, while probability distributions are often more appropriate.
In probability theory, to postselect is to condition a probability space upon the occurrence of a given event. In symbols, once we postselect for an event E, the probability of some other event F changes from Pr[F] to the conditional probability Pr[F|E].
For a discrete probability space, Pr[F|E] = Pr[F and E]/Pr[E], and thus we require that Pr[E] be strictly positive in order for the postselection to be well-defined.
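As a toy illustration of that definition (my own example, not from the source), the snippet below postselects on an event E by simple rejection sampling over two dice and compares the result with the exact conditional probability Pr[F|E] = Pr[F and E]/Pr[E].

```python
import random

def postselect(sample, E, trials=100_000):
    """Draw `trials` outcomes and keep only those where the event E occurred."""
    return [w for w in (sample() for _ in range(trials)) if E(w)]

# Hypothetical discrete example: two fair dice.
sample = lambda: (random.randint(1, 6), random.randint(1, 6))
E = lambda w: w[0] + w[1] >= 10   # postselected event (Pr[E] = 6/36 > 0)
F = lambda w: w[0] == 6           # event of interest

kept = postselect(sample, E)
print("Pr[F|E] ~", sum(F(w) for w in kept) / len(kept))  # exact value: 3/6 = 0.5
```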
A quantum Turing machine (QTM), also called a universal quantum computer, is an abstract machine used to model the effects of a quantum computer. It provides a very simple model which captures all of the power of quantum computation. Any quantum algorithm can be expressed formally as a particular quantum Turing machine. Such Turing machines were first proposed in a 1985 paper by Oxford University physicist David Deutsch, suggesting that quantum gates could function in a fashion similar to traditional digital computing binary logic gates.[1]
Quantum Turing machines are not always used for analyzing quantum computation; the quantum circuit is a more common model; these models are computationally equivalent.[2]
Quantum Turing machines can be related to classical and probabilistic Turing machines in a framework based on transition matrices, shown by Lance Fortnow.[3]
Iriyama, Ohya, and Volovich have developed a model of a Linear Quantum Turing Machine (LQTM). This is a generalization of a classical QTM that has mixed states and that allows irreversible transition functions. These allow the representation of quantum measurements without classical outcomes.[4]
A quantum Turing machine with postselection was defined by Scott Aaronson, who showed that the class of problems solvable in polynomial time on such a machine (PostBQP) is equal to the classical complexity class PP.[5]
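For a flavour of what postselection means for a quantum state, here is a minimal numpy sketch of my own (not Aaronson's construction): it prepares a Bell pair, projects onto the outcome "second qubit reads 0", and renormalises. The renormalisation step is exactly the conditioning Pr[F|E] described earlier, and again requires Pr[E] to be strictly positive.

```python
import numpy as np

H = np.array([[1, 1], [1, -1]]) / np.sqrt(2)      # Hadamard gate
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]])                    # control = first qubit

# Prepare the Bell state (|00> + |11>)/sqrt(2).
psi = CNOT @ np.kron(H @ np.array([1.0, 0.0]), np.array([1.0, 0.0]))

# Postselect on "second qubit = 0": project, record Pr[E], renormalise.
P0 = np.kron(np.eye(2), np.diag([1.0, 0.0]))
post = P0 @ psi
prob_E = float(np.vdot(post, post))               # must be strictly positive
post = post / np.sqrt(prob_E)

print("Pr[E] =", prob_E)                          # 0.5
print("postselected state:", post)                # the basis state |00>
```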
If large-scale quantum computers can be built, they will be able to solve certain problems much faster than any classical computer using the best currently known algorithms (for example, integer factorization via Shor's algorithm, or the simulation of quantum many-body systems). However, there is so far no mathematical proof that equally good classical algorithms cannot be found (see quantum complexity theory). If one disregards the question of efficiency (i.e., given enough time and resources), all problems solvable with a quantum computer can also be solved using a traditional computer.[3]
posted 14 October 2000
at www.anomalies.net...
TimeTravel_0 : As it turns out...
TimeTravel_0 : If you encounter a black hole that is spinning and has an electrified field, you will not be killed passing through its massive gravitational fields.
TimeTravel_0 : regretting asking yet?
Yareisa : no
TimeTravel_0 : Ok
wyrmkin_37 : no
Yareisa : I'm hooked
TimeTravel_0 : In about a year...
G° : is this the omeg point theory?
G° : omega?
TimeTravel_0 : CERN will discover some very odd things as a result of their high energy experiments.
TimeTravel_0 : in about a year.
TimeTravel_0 : from your point of view.
wyrmkin_37 : cern?
TimeTravel_0 : in Geneva.
Yareisa : particle accelerator
wyrmkin_37 : oh
TimeTravel_0 : They will accidentally create microsingularities.
G° : makes things go round and hit each other...
TimeTravel_0 : Which will evaporate very quickly.
wyrmkin_37 : one in texas?
TimeTravel_0 : and create a massive amount of X-rays and gamma rays.
TimeTravel_0 : It will puzzle them for a while.
TimeTravel_0 : Until they figure out how to add an electrical charge and capture these strange, odd and massive particles in a magnetic field.
wyrmkin_37 : they shoot electrons at the speed of light.......see what they bust up into
TimeTravel_0 : Yes.
G° : still with you...
wyrmkin_37 : quarks
TimeTravel_0 : If you bombard a singularity with electrons...
TimeTravel_0 : you can alter the size of its event horizon.
TimeTravel_0 : and thus its gravitational field.
TimeTravel_0 : By overlapping these fields from two singularities...
TimeTravel_0 : you can travel forward and backward through time.
TimeTravel_0 : It's actually quite simple.
wyrmkin_37 : i follow now
TimeTravel_0 : That's not the hard part.
G° : didn't tipler say there was no event horizon?
TimeTravel_0 : No.. he said it was possible to approach a massive gravitational field from certain angles and not get squished.
G° : oh, sorry
wyrmkin_37 : lol
Captain Nemo, also known as Prince Dakkar, is a fictional character featured in Jules Verne's novels Twenty Thousand Leagues Under the Sea (1870) and The Mysterious Island (1874).
Nemo, one of the most famous antiheroes in fiction, is a mysterious figure. The son of an Indian Raja, he is a scientific genius who roams the depths of the sea in his submarine, the Nautilus, which was built on a deserted island. Nemo tries to project a stern, controlled confidence, but he is driven by a thirst for vengeance and a hatred of imperialism (particularly the British Empire) and wracked by remorse over the deaths of his crew members and even by the deaths of enemy sailors.