NIST Admits Total Collapse Of Twin Towers Unexplainable

page: 9

posted on Oct, 19 2007 @ 06:24 AM

Originally posted by bsbray11

Originally posted by seanm
Funny how 9/11 Truthers love to engage in logical fallacies. Did you never take a course in logical and critical thinking?


This is pretty ridiculous coming from someone who keeps asserting that we don't need physical models of the collapses because they're proof of themselves.


Only in your wild imagination.


Why aren't you talking about the physics of the collapses anymore? Why no more assuming two, already-disconnected, free-falling bodies, or letting on as if the actual floors held the loads of the floors above them? Any other insights you have for us?


You have a wilder imagination than I thought. You need to read more carefully.



posted on Oct, 19 2007 @ 06:44 AM

Originally posted by billybob

Originally posted by seanm

Originally posted by Griff

Originally posted by sp00n1
Yea, a supercomputer used to model planetary collisions doesn't have the power to model a collapse... what?!?!


My thoughts exactly. We can put a man on the moon but we can't figure out how WTC 7 fell?


Funny how 9/11 Truthers love to engage in logical fallacies. Did you never take a course in logical and critical thinking?




hmmm.

planetary collision modelling is pure newtonian mechanics on a grand scale, with an added factor of complex gravity interactions. ditto that for landing on the moon, except minus the weaknesses of pure theory (if you're a "we landed" believer).

a building falling on earth, subject to the exact same mechanics minus the dynamic, convoluted gravity effects, is (SOMEHOW!?) much more complex?


The claim is that modeling the actual building collapse is somehow very easy, that modeling all the complex interactions in the 12 to 14 seconds it took for the buildings to fall is magically a piece of cake, as simple as putting a man on the moon:

Designed by M.I.T. in 1964
World's first microchip computer
Prototype computer for Apollo moon landing
Memory: 12K fixed (ROM), 1K erasable (RAM)
Clock: 1.024 MHz
Computing: 11 instructions, 16 bit word
Logic: ~5000 ICs (3-input NOR gates, RTL logic)

Now, I'm sure you would want to claim that our computers of today, 43 years later, must be able to model the collapses, right?

Prove it.



posted on Oct, 19 2007 @ 06:46 AM

Originally posted by bsbray11
And that's pretty much all their report was, since they never really tried to explain the actual collapses themselves.


And you want to continue to claim that it is necessary to model the collapses to ascertain why they collapsed?



[edit on 19-10-2007 by seanm]



posted on Oct, 19 2007 @ 07:35 AM

Originally posted by seanm

And where did I say "free fall"?

The same way physicists do. The same way countless people since 9/11 have. Here's some help, for starters:

911myths.com...
911myths.com...


I was answering the bolded question with my question.

The kinetic energy can only come from the potential energy: mgh. Do you know what that h stands for? That would be the height of the fall. Hence, when you tell me to calculate the kinetic energy, you are implying that there was a free fall.
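The point can be put in numbers with a minimal sketch (illustrative mass and height, not measurements of the towers):

```python
# PE = m*g*h: the h is a fall height, so deriving KE from PE assumes a fall
# through that height. Illustrative values only.

G = 9.8  # gravitational acceleration, m/s^2

def potential_energy(mass_kg, height_m):
    """PE = m * g * h, in joules."""
    return mass_kg * G * height_m

def kinetic_energy_from_fall(mass_kg, height_m):
    """KE gained by a FREE fall through height_m: all PE becomes KE."""
    return potential_energy(mass_kg, height_m)

# 1000 kg falling freely through one 3.7 m storey:
print(kinetic_energy_from_fall(1000, 3.7))  # ~36260 J, and only valid
                                            # if nothing below resists the fall
```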



posted on Oct, 19 2007 @ 07:37 AM

Originally posted by seanm
It would help you to think before you write something silly.


Well, what is it? Rotational kinetic energy or kinetic energy from falling 6 feet? You might do well to heed your own advice.



posted on Oct, 19 2007 @ 07:39 AM
link   

Originally posted by seanm

Originally posted by Griff

Originally posted by seanm
Yawn.... The actual towers collapsed. Get over it.


Brick walls never seem to amaze me. That's almost as bad as using the Bible to prove the Bible.


The towers weren't made of bricks. And they still fell regardless of whether models were made that didn't collapse.

DO catch up with what's written before replying next time.


I meant talking to brick walls. Jeez. Again, you might want to heed your own advice.



posted on Oct, 19 2007 @ 07:41 AM

Originally posted by seanm
Funny how 9/11 Truthers love to engage in logical fallacies. Did you never take a course in logical and critical thinking?


I've taken many courses, most involving physics, half structural. What degree do you possess again? I can deduce it's nothing in the engineering or scientific fields.



posted on Oct, 19 2007 @ 07:44 AM

Originally posted by seanm
Well, isn't that interesting? To think that we allowed all that time, effort, manpower, and MONEY to be wasted on the most massive investigations ever when one could have simply looked up the answer on Wikipedia!


Wikipedia is still good for most things.

And where do you get the idea that it was the "most massive investigation ever"? There was more money spent on Clinton's blow job. There was more money, time and investigation into the space shuttles.

What a fantasy world we must live in.



posted on Oct, 19 2007 @ 07:47 AM

Originally posted by snoopy
I think the problem is your understanding of what facts are.


I think your problem is the brick-wall syndrome that sean seems to have. See below.


And again, since it is so easy to use a computer to do the calculations, then what's the hold up? Why aren't the geniuses over at ae911.org doing it?


How many times do I need to tell you that we need the construction documents? Find me those and I'll get you your answer. Until then, please give this argument up. It's not working.


So many scholars and experts, as well as the 70% of the population whom the truthers claim all believe in these conspiracies, and no one wants to lift a finger...


I'll lift a finger to you.



posted on Oct, 19 2007 @ 07:56 AM

Originally posted by seanm
You have a wilder imagination than I thought. You need to read more carefully.


Hmm...what seems more logical?

A. We all have a reading comprehension problem.

B. You are not making sense and have no clue about physics other than what 911myths tells you.



posted on Oct, 19 2007 @ 07:59 AM

Originally posted by seanm
And you want to continue to claim that it is necessary to model the collapses to ascertain why they collapsed?


How else would you go about it? Guessing? Seems like that would be sufficient for you. Not me, though, since I like to know how things work, not just that they work.



posted on Oct, 19 2007 @ 09:18 AM

Originally posted by snoopy
reply to post by Leo Strauss
 


Once again, his disagreements are a completely different case. What is happening here is that someone has some legitimate issues, which are entirely a matter of opinion, and people here are trying to use them as evidence of a cover-up. It is completely dishonest to try to distort the doctor's views. He has a good point, but it has nothing to do with conspiracy theories. It's simply impossible to know the extent of the fireproofing that was removed. There is absolutely no way to measure this. However, through the testing done by NIST, it became clear that this is what happened, because this is the scenario that was able to cause a collapse. Not bombs or explosives or any of that nonsense.

And again, if the actual process of the collapse itself (the cause of which was already proven) is so crucial, why isn't ae911.org doing the testing and computer simulations? The truth movement claims that these guys are experts. But then why is it that these so-called experts do nothing but poke holes (the kind that can be found in any research done by anyone) instead of actually doing their own legitimate research?

Why is it that they just complain about it not being done while doing absolutely nothing on their own, other than claiming they can?



Snoop
I am sorry but again he said:

“In my opinion, the WTC investigation by NIST falls short of expectations by not definitively finding cause, by not sufficiently linking recommendations of specificity to cause, by not fully invoking all of their authority to seek facts in the investigation, and by the guidance of government lawyers to deter rather than develop fact finding."

Dr. Quintiere is not quibbling over some details. He is indicting the entire investigation from top to bottom. I think you are twisting his words. Please read the link.



posted on Oct, 19 2007 @ 09:24 AM

Originally posted by seanm
You have a wilder imagination than I thought. You need to read more carefully.


I'm reading more carefully than you understand. You don't know what you're talking about. Griff knows the same thing and there are other people here with technical backgrounds that you're not fooling, either.

I didn't think Griff would have to point out for a second time that KE is determined from PE, and PE is equal to mass, times the pure acceleration of gravity (9.8 m/s^2, at which ONLY objects in a vacuum will fall), times height. There is NO variable in that equation to express the electromagnetic resistance preventing two solid objects from falling through each other. In other words, there is nothing equivalent to a "drag coefficient", which has to be taken into account even for the gases in the air, as they produce friction and drag when you move against them. By referencing the KE you implicitly assume a free-fall somewhere in there. I'm telling you, the subject is deformations of a single rigid body with many components, not collisions between two big, simple objects after one falls through the air.



posted on Oct, 19 2007 @ 09:29 AM
reply to post by bsbray11
 


Hi Bsbray,

Not trying to derail, but I was hoping you would have some input for my thread from the NASA scientist.
Griff has asked some great questions and hopefully will get some informative responses.

Thanks


[edit on 19-10-2007 by CaptainObvious]



posted on Oct, 19 2007 @ 09:31 AM
reply to post by Leo Strauss
 


Leo,

He also states that he does not believe it was a controlled demolition.



posted on Oct, 19 2007 @ 09:47 AM
reply to post by CaptainObvious
 


CO
I know that and I mentioned that in my original post.

Here are some more questions Dr. Quintiere raised in his objection to the NIST report.

"2. Why were not alternative collapse hypotheses investigated and discussed as NIST had stated repeatedly that they would do? ...

3. Spoliation of a fire scene is a basis for destroying a legal case in an investigation. Most of the steel was discarded, although the key elements of the core steel were demographically labeled. A careful reading of the NIST report shows that they have no evidence that the temperatures they predict as necessary for failure are corroborated by findings of the little steel debris they have. Why hasn't NIST declared that this spoliation of the steel was a gross error?

4. NIST used computer models that they said have never been used in such an application before and are the state of the art. For this they should be commended for their skill. But the validation of these modeling results is in question. Others have computed aspects with different conclusions on the cause mechanism of the collapse. Moreover, it is common in fire investigation to compute a time-line and compare it to known events. NIST has not done that.

5. Testing by NIST has been inconclusive. Although they have done fire tests of the scale of several work stations, a replicate test of at least & [sic] of a WTC floor would have been of considerable value. Why was this not done? ...

6. The critical collapse of WTC 7 is relegated to a secondary role, as its findings will not be complete for yet another year. It was clear at the last NIST Advisory Panel meeting in September [2005] that this date may not be realistic, as NIST has not demonstrated progress here. Why has NIST dragged on this important investigation?"



posted on Oct, 19 2007 @ 01:20 PM

Originally posted by seanm

Now, I'm sure you would want to claim that our computers of today, 43 years later, must be able to model the collapses, right?




Sure it's possible, but I think Purdue University was being rather lazy with its models and analysis. They didn't even bother to entertain the explosives scenario. I am still waiting to see their collapse models; where are they?

Here is an unofficial 3D draft simulation of the 'collapse' (annihilation). It uses simulated explosives as the primary 'collapse' initiation mechanism. As you can see, it looks rather reminiscent of the towers' destruction.




posted on Oct, 19 2007 @ 01:41 PM
I think the only way to tell for sure is to build full-scale models. Buy a couple of remote-control airplanes and recreate the collapse.

But really, there are a lot of unknown variables in creating the computer models. The structural engineer who designed the building should be commended, IMHO, that the buildings stayed up as long as they did with columns severed and others with unbraced lengths far beyond their capacities for support. Oh, not to mention the fire that weakened the steel beyond measure. I guess most people do not know that if you want to bend massive amounts of steel, you pre-heat it.



posted on Oct, 19 2007 @ 03:00 PM

Originally posted by Truthforall
The structural engineer who designed the building should be commended, IMHO, that the buildings stayed up as long as they did with columns severed and others with unbraced lengths far beyond their capacities for support.


That's actually required by law, especially for skyscrapers. It's called a "safety factor," and it basically amounts to how many more columns they erected than they actually needed to carry all the loads. This is supposed to be a large figure for skyscrapers, and less important for more mundane structures like houses, for which less is likely to go wrong and it's less catastrophic when something does. So skyscrapers are "over-engineered": built to carry their loads and then some, in case of disasters like this. It's legal code.
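The idea can be illustrated with a toy load-sharing check (all numbers hypothetical, not from the WTC design):

```python
# Toy safety-factor check: N identical columns share a total load; sever some
# and see whether the rest stay under capacity. Hypothetical numbers only.

def column_demand(total_load_kN, n_columns):
    """Load carried by each of n equally loaded columns."""
    return total_load_kN / n_columns

def holds(total_load_kN, n_columns, capacity_kN):
    """True while every column stays at or under its capacity."""
    return column_demand(total_load_kN, n_columns) <= capacity_kN

TOTAL = 100_000.0   # kN of gravity load (made up)
CAP = 1_000.0       # kN capacity per column (made up)

print(holds(TOTAL, 200, CAP))  # True: 500 kN each, a safety factor of 2
print(holds(TOTAL, 120, CAP))  # True: ~833 kN each, margin shrinking
print(holds(TOTAL, 90, CAP))   # False: ~1111 kN each exceeds capacity
```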


Oh, not to mention the fire that weakened the steel beyond measure.


Beyond measure, huh?


I guess most people do not know that if you want to bend massive amounts of steel, you pre-heat it.


I guess you don't know that it takes a lot of energy to uniformly heat that much steel to 400 C, let alone 500 or 600 C, which is about the absolute maximum you can achieve in a fire like that from what experimental data shows (and for much smaller pieces of steel!). The fire may reach higher temperatures, but there's not enough heat energy to heat so much steel so fast.

It's important to understand the difference between temperature and heat, too. An extreme example of the difference is a massive iceberg versus a candle flame. The iceberg will have more total heat energy, but the candle will have a much higher temperature.

Steel requires both a very high temperature and a lot of heat energy to heat uniformly. It's dense and it's a good conductor. It acts as a heat sink, so where you apply a constant flame to a column, the heat will tend to spread out through the column and lessen the temperature at any given point. You'd have to have the flame right up against the column for much transfer to occur in the first place, and it would have to stay there for a good length of time, too (depending on how powerful the flame is, i.e. the wattage it puts out).

Throw any steel object into a fire and see for yourself how long it takes to heat up even enough to glow a dull red, around 600 C. Steel only loses half of its yield strength at 600 C. The yield strength is the point at which permanent deformations start to occur in the steel, and this strength is higher than the design loads by the safety factor.

And this would be going on only on a column-by-column basis, IF the right conditions were met, which is impossible judging from all the data I've compared. The amount of power required is equivalent to scores of wood stoves per cubicle to uniformly heat all the steel in the given amount of time.
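The heat-versus-temperature point can be put in rough numbers; the specific heat below is a textbook value for steel, while the mass and heating power are purely illustrative:

```python
# Energy needed to heat a mass of steel uniformly, and how long a fixed
# heating power takes to deliver it. c ~ 490 J/(kg*K) is a textbook value;
# the mass and power are illustrative, not WTC figures.

C_STEEL = 490.0  # J/(kg*K), approximate specific heat of steel

def heat_required_J(mass_kg, t_start_C, t_end_C):
    """Energy to raise mass_kg of steel from t_start_C to t_end_C uniformly."""
    return mass_kg * C_STEEL * (t_end_C - t_start_C)

q = heat_required_J(1000.0, 20.0, 600.0)  # a 1000 kg column segment to 600 C
print(q / 1e6)                            # ~284 MJ

hours = q / 10_000.0 / 3600.0             # at a steady 10 kW, roughly a stove
print(round(hours, 1))                    # ~7.9 hours of continuous heating
```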



posted on Oct, 19 2007 @ 03:01 PM
A Question I presented to NASA Scientist Ryan Mackey


I thought this question would be appropriate to share on this thread.


I have one question for you, if you don't mind. I often hear this
argument:


"The computer models stopped at collapse initiation because of the
ridiculous variables that they had to put into the computer to start it,
and because the collapse looked absolutely nothing like what we saw on
9/11, in that it was not symmetric and they could not get it to
progress!!!"


Mr. Mackey's Response:

That argument is nonsense. The computer models stopped at collapse
initiation (and sometimes before!) because of what's called a "convergence
problem." It has nothing to do with a need for unrealistic initial
conditions or because it would give a politically incorrect answer.
NIST ran two different major structural models. These models do different
things. The one for the structure, contained in NCSTAR1-6D but also
baselined in NCSTAR1-2A is run by a program called SAP2000. The other one,
considering the dynamics of the aircraft impacts, was run in LS-DYNA.

SAP2000 is a structural model that essentially solves the static load
problem. It cannot represent moving objects (although there may be some
workarounds, but nothing accurate on this scale). Basically the way it
works is by solving the stress-strain relationship for each element, which
is a simple equation, using a look-up table for the material properties as
a function of strain.
The reason it's difficult is because that simple stress-strain
relationship becomes an ENORMOUS matrix problem. Each piece of the
structure has several variables, representing position and strain (think
"stretch") in several dimensions. The load is a mixture of fixed boundary
conditions, like area loads on floors, and "self-weights" of the components
themselves. The elements are coupled to other elements to varying degrees.
A simple way to think about it is as follows: Start with the structure as
originally built. Then apply the load. The load creates stresses in each
component, and all of these have to balance. Once you have this, the
stresses lead to strains, and the structure sags a little bit as a result.
That changes the stress distribution, so you solve again. That changes the
strain. And so on.
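The solve-then-re-solve loop just described can be shrunk to a single softening spring (a toy sketch of the iteration with invented numbers, not NIST's SAP2000 model):

```python
# One nonlinear spring: apply a load, solve for deflection, update the
# (softening) stiffness, repeat until the answer stops changing. When the
# update runs away instead of settling, that is the "convergence problem."

def stiffness(x_m):
    """Toy material law: the spring softens as it deflects."""
    return 1000.0 / (1.0 + x_m)  # N/m

def solve_static(load_N, tol=1e-9, max_iter=1000):
    x = 0.0
    for _ in range(max_iter):
        x_new = load_N / stiffness(x)  # re-solve with the updated stiffness
        if abs(x_new - x) < tol:
            return x_new               # converged
        x = x_new
    raise RuntimeError("no convergence")

print(round(solve_static(500.0), 6))   # 1.0 -- a stable equilibrium exists

try:                                   # double the load: the update overshoots
    solve_static(1000.0)
except RuntimeError as err:
    print(err)                         # no convergence
```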
At every step, you perform a calculation that is essentially a matrix
inversion. The matrices here, by the way, are incredibly large -- for the
WTC cases, they are literally bigger than a million times a million.
Matrices cannot always be inverted. If, for instance, there is a row of
all zeroes, a matrix is said to be "degenerate," and it cannot be inverted.
Inverting such a matrix is logically equivalent to dividing by zero. This
is, for instance, what would happen if you tried to include a completely
detached piece in the SAP2000 model.
In NIST's calculations, the matrices don't ever actually reach degeneracy, but
they get awfully close -- an element on the diagonal that is very, very
small (i.e. close to zero) results in an "ill-posed" matrix. Inverting
this matrix is like dividing by a very small number, i.e. multiplying by a
very large number, and thus the outcome is not very stable. A small error
in this number -- even a roundoff error -- can lead to large changes in the
final result.
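The effect is easy to reproduce with a near-degenerate two-by-two system (a toy example, not one of NIST's million-by-million matrices):

```python
# Cramer's rule on a 2x2 system whose rows are almost identical: the
# determinant is tiny, and dividing by it amplifies any small change in
# the inputs -- the "ill-posed" behavior described above.

def solve2(a11, a12, a21, a22, b1, b2):
    """Solve [[a11, a12], [a21, a22]] x = [b1, b2] by Cramer's rule."""
    det = a11 * a22 - a12 * a21  # near zero for a near-degenerate matrix
    return ((b1 * a22 - b2 * a12) / det,
            (a11 * b2 - a21 * b1) / det)

eps = 1e-10  # the two rows differ only by this much
x1 = solve2(1.0, 1.0, 1.0, 1.0 + eps, 2.0, 2.0)
x2 = solve2(1.0, 1.0, 1.0, 1.0 + eps, 2.0, 2.0 + 1e-8)  # nudge b2 by 1e-8

print(x1)  # ~(2.0, 0.0)
print(x2)  # ~(-98.0, 100.0): a 1e-8 nudge moved the answer by about 100
```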
The more stable a structure is, and the less it deflects under load, the
easier it is to solve. As the WTC models got closer and closer to
instability, the harder it was to solve. Eventually the simulation simply
cannot proceed, due to the "convergence problem" I mentioned above. Either
the matrix inversion step gives unrealistic answers, or it results in such
a large change compared to the last step that it overshoots each time we
try to refine our result, and thus we get no single-valued answer.

This happens in real life, too. Think of a single column, near to its
buckling load, supporting a structure above. Which way will it bend?
Either way is equally energetic. As it gets closer to failure, the error
in our calculation becomes more and more significant.

Now in terms of actually modeling the collapse itself, this is much, much
worse. The situation above is still static, i.e. not moving, at least not
very fast, and we are still going to hit convergence limits. But now we
want to go even beyond that and consider a dynamic situation.
SAP2000 cannot do this. Instead, we could use a tool like LS-DYNA, which
doesn't just handle the stress-strain relationships, but also considers
kinetics -- motion, impulse, and much more focus on timestepping. Very,
very small timesteps.
We could, in theory, model the collapse in LS-DYNA. But the modeling
problem is vastly more complicated than it was before. First, we have to
decide what the actual state of the components is at the instant of
collapse, and even small uncertainties here will result in large
uncertainties in the final results. Second, we have to go through the same
process above, but now we have to do it at every timestep, so perhaps a
million times as many calculations as before. Third, we have far more
variables than before -- instead of just XYZ and the strain values for each
member, we now also have speed, adding six more degrees of freedom (three
translational and three rotational). Fourth, every time two objects
contact each other, exactly how force is transmitted is extremely sensitive
to the exact geometry. Think of all the various ways a bowling pin can
fall, and that's contact between loose, rounded objects. The variety of
objects in the WTC collapse -- shape, strength, etc. -- will be vastly
greater.
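The dynamic-timestepping idea can be sketched in miniature: one falling mass meeting one crushable support, integrated with a tiny timestep (all numbers invented; a sketch of explicit integration, not LS-DYNA):

```python
# Explicit timestepping of one mass falling onto a resisting support.
# Tiny dt, as dynamics codes require; toy numbers throughout.

G = 9.8      # m/s^2
DT = 1e-4    # s, integration timestep

def drop(mass_kg, gap_m, resist_N, t_max=5.0):
    """Integrate until the support arrests the motion (or t_max elapses)."""
    x, v, t = 0.0, 0.0, 0.0          # position (down), velocity, time
    while t < t_max:
        force = mass_kg * G - (resist_N if x >= gap_m else 0.0)
        v += (force / mass_kg) * DT  # semi-implicit Euler step
        x += v * DT
        if v <= 0.0 and x >= gap_m:
            return x, t              # motion arrested by the support
        t += DT
    return x, t                      # still moving at t_max

pos, t_stop = drop(mass_kg=1000.0, gap_m=3.7, resist_N=50_000.0)
print(round(pos, 2))  # ~4.6 m: the mass travels ~0.9 m past first contact
```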
The complexity of the aircraft impact models was limited by these
performance constraints. That's why the aircraft was simplified and why
the results are open to some interpretation. Modeling the structure
collapse in similar fashion would be hundreds of times worse, or
(alternately) hundreds of times more coarse.

There's no point to doing this. What we really need is a gross-order
understanding of behavior. This is provided by models such as the one in
Bazant, Le, Benson, and Greening. Similarly, the NIST impact models don't
really care about the exact disposition of every fragment of aircraft, but
only things like the total momentum transmitted to the structure, the rough
order distribution of fuel, and the expected loads on the large core
columns. This is about the limit of detail that we can reasonably
calculate.

I've made the argument many times that dynamic models are just not that
precise. If we could accurately model the entire WTC collapses, then I
should be able to take that same model to Las Vegas, go to the craps
tables, and make a billion dollars before the bar closes. Obviously, I
can't. Even the most sophisticated models cannot accurately predict what
side of a six-sided die will come up when thrown. There is no reason to
expect billions of times higher precision from NIST.

Hope that answers your question.

Thanks,
Ryan Mackey


