Use of Scale Models or Full-Sized Models for Investigating 9/11 Collapses

And that's the problem. No-one has been able to experimentally demonstrate -- even with a simple model -- what NIST failed to explain. Everything you call "self-evident" beyond the initiation of collapse is totally without experimental support, although there have been failed attempts to reproduce what you claim is obvious.

Not if all the variables as well as the model itself were made available for independent peer review, no.

That's what makes the NIST WTC7 model so unacceptable (besides its general inaccuracy, of course) and that's why I said at the start of the thread that femr2's approach was far more scientifically valid.
Sorry, Cube...I'm not buying it. They're all in on their story. They have taken--and spent--tons of cash, based on that story!
As far as admitting that they're wrong, simply because the evidence is right there? That boat sailed long ago.
It's a lovely thought...that they'd respect the science and the evidence...
but I think one would have to be pretty naive to believe it, given their history.
 
You are switching between quoting a number for "each tower" and referring to a number which includes both towers.
However, I also noted that calculating for a 4-inch slab over the entire floor, for 110 floors, gives a result much less than that total.
77,000 tons of concrete is vastly different from 300,000 tons, which leads me to suggest that the greater proportion of the concrete was in the foundation and consisted of heavier-grade concrete.
OTOH I note that 700 tons of concrete per floor fits well with NIST saying a typical floor assembly weighed 750 tons.
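For anyone who wants to check that slab arithmetic, a minimal sketch follows. The 207 ft footprint and the ~110 lb/ft³ density for lightweight concrete are my own assumptions (swap in whatever figures you prefer); the 4-inch slab and 110 floors are the numbers used above.

```python
# Rough check of the floor-slab concrete figures discussed above.
# The footprint and density are assumptions, not sourced data.

side_ft = 207.0                 # tower footprint per side (assumption)
slab_thickness_ft = 4.0 / 12.0  # 4-inch slab, as used in the post
floors = 110                    # floors per tower, as used in the post
density_lb_ft3 = 110.0          # lightweight concrete (assumption)

volume_ft3 = side_ft * side_ft * slab_thickness_ft            # per floor
cubic_yards_per_tower = volume_ft3 * floors / 27.0            # 27 ft^3 per cubic yard
tons_per_floor = volume_ft3 * density_lb_ft3 / 2000.0         # short tons

print(f"concrete per floor:       {tons_per_floor:,.0f} tons")        # roughly 700-800 tons
print(f"cubic yards per tower:    {cubic_yards_per_tower:,.0f}")
print(f"cubic yards, both towers: {2 * cubic_yards_per_tower:,.0f}")  # well short of 425,000
```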

I did say more than 300,000 tons per building, and that number is based on the lightweight type. My point is we have so much claimed certainty based on inadequate data. It is so curious that in 13 years no one has built a self-supporting model that can completely collapse, and no engineering school has even said it would try.

The 425,000 cubic yards is for both towers. I can't help that. But each collapse was a separate phenomenon so the amount of concrete in each tower is important to analysing the supposed "collapses".

But the NIST never specified either the total or the amount in each building. So it is ridiculous to believe anything on the basis of incomplete and contradictory data. The floors don't come anywhere near accounting for 425,000 cubic yards, not even if the basements were solid blocks of concrete, which we know they weren't.

But the NIST said in three places that they needed to know the distribution of weight to analyse the motion of the towers due to the aircraft impacts. That can even be demonstrated experimentally.



psik
 
Well, which claim would you prefer to support? The claim that NIST modelled the collapse initiation and didn't bother to model the event further because progressive and global collapse all the way to the ground was now "obviously" going to happen, or the claim that NIST didn't bother to model the rest of the collapse after initiation because it was "too complex" to do so?
You have changed horses in midstream. Those were not the two options which you claimed were paradoxical. And the discussion is about the use of models, not NIST actions. However - addressing these new options.
1) The first of your new options - 'NIST didn't bother to model because progressive collapse...was now "obviously" going to happen' - is true. Even if the scientific method were strictly applicable to this forensic engineering investigation, why is modelling of obviously true aspects necessary? The purpose of reproducibility is to demonstrate that a hypothesised explanation is valid - not to prove the bleeding obvious.
2) The second of your new options - 'NIST didn't bother to model the rest of the collapse after initiation because it was "too complex"' - is a strawman of your own fabrication. The claim of "too complex" was taken from my explanation, where I referred to the 'initiation stage' - not 'progression'. It was neither NIST's claim nor mine in the context of the 'progression' stage.

We've seen both claims on this thread, but neither provide evidence of anything except the deep reluctance of supporters of the official collapse theory to have their ideas examined through experimental research, or address the methodology by which that might be achieved.
Whether 'we' have or have not seen them in the thread, I did not make them, and you were purporting to respond to what I said.

I suggest you do not understand what reproducibility means in terms of the scientific method. Simply: if a phenomenon cannot be reproduced under experimental conditions -- repeatedly -- then it is not scientifically explained.
That is hogwash. Specifically, it is a claim which is false because it presses a false global assertion: the implied assertion that the scientific method, applied to a one-off forensic analysis, requires that every detail which is bleedingly obviously true has to be modelled, including being modelled in situations which cannot be modelled and where more than adequate real-event data is available.

As I've already mentioned, an exact reproduction of the collapse of the Towers is not what would be expected:
Then explain what your "near enough approximation" could be, because I have now said three times that no model will be as good as what we know from the real event.
however a demonstration that a similar high-rise tower structure can indeed crush itself to the ground under the dynamic momentum of a falling upper 20% of itself on any day other than 9/11 has not yet been achieved.
That is not the issue. Let me pre-empt any return to conflating ambiguity - by 'dynamic - falling on itself' you are clearly referring to the 'progression stage'. That stage has been extensively explained at three levels of detail:
a) By the NIST assertion 'global collapse was inevitable';
b) By the Bazant 'available energy' assertions (B&Z 2001-2) and multiple reassessments in the same vein, all of them abstract approximation models; AND
c) By explanations of the actual collapse mechanism of the real event - take the ROOSD version or my earlier and more comprehensive "Three Mechanisms" explanation.

The purpose of modelling is to provide details of aspects which are not known or to 'prove' aspects which are in doubt. Neither criterion applies.

...Does NIST? If so, why didn't they produce a model, however fundamentally simplified, do you think? Because it was "obvious" what was going to happen after they modelled initiation, or because it was "too complex"?
You revert to the false conflation. My answer has already been given - progression was NOT 'too complex'; I made that comment about initiation. I'll leave it for K Beachy to make his own response, since that comment was directed at him.
 
Yes, I mean Lon. But the sizing of the core bracing is pretty trivial because the floor loads were pretty uniform inside the core. This is hardly significant except to you, and you have not explained why. The floors inside the core were, I believe, designed for a 75# live load... I don't recall exactly, but it's mentioned in the appendix of the NIST reports in the docs from LERA.
There were no horizontal beams *in the perimeter*... the OOS floor was supported by trusses, except for the mech floors, which had WF beams and, yes... thicker slabs. The facade had 52" high spandrels... probably 1/2" thick in the upper section, maybe 3/4" in the middle and 1" in the lower.... not more.

Empty claims of things being trivial based on no data.

47 columns of 12-foot sections is 564 feet of vertical steel per storey.

The core was 85 by 135 feet. So the length of horizontal steel would be:

6 × 135 + 8 × 85 = 1,490 feet
1,490 ÷ 564 ≈ 2.64

So the length of horizontal steel was more than 2.5 times the amount of vertical steel, but it is trivial because you say so, even though no one computes how much there had to be. Did it get thicker down the building, since it had to brace thicker columns? Oops, another triviality that does not get mentioned.
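For what it's worth, the arithmetic above is easy to reproduce; a minimal sketch using only the numbers quoted in this post (the 6 and 8 lines of bracing are the post's assumption, not drawing data):

```python
# Horizontal vs. vertical core steel length per storey, using the post's numbers.
columns = 47              # core columns
storey_height_ft = 12     # one 12-foot column section per storey
core_width_ft = 85
core_length_ft = 135
lines_long_dir = 6        # assumed lines of bracing spanning the 135 ft direction
lines_short_dir = 8       # assumed lines of bracing spanning the 85 ft direction

vertical_ft = columns * storey_height_ft                                            # 564 ft
horizontal_ft = lines_long_dir * core_length_ft + lines_short_dir * core_width_ft   # 1,490 ft

print(f"vertical steel per storey:   {vertical_ft} ft")
print(f"horizontal steel per storey: {horizontal_ft} ft")
print(f"horizontal / vertical ratio: {horizontal_ft / vertical_ft:.2f}")   # about 2.64
```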

psik
 
It's a lovely thought...that they'd respect the science and the evidence...
but I think one would have to be pretty naive to believe it, given their history.
As I said to Mick: I'm not sure what AE911 has got to do with the question at hand. Nevertheless the science and evidence must be of reasonable quality, and NIST failed to meet a reasonable standard of methodology and evidence in many areas of its investigations. Its refusal to allow independent peer review of all the data used to create its animation of WTC7 is just one example.

Leaving that to one side and returning to the topic of the Towers: I would be interested in gathering a consensus of what data is available and necessary -- and how it could be acceptably simplified -- from all corners of the debate. From that consensus a model of the global collapse of the Towers might be worth undertaking; without it there will inevitably be complaints about input data or methodology, depending on the results.
 
It is so curious that in 13 years no one has built a self-supporting model that can completely collapse, and no engineering school has even said it would try.

Well, I just built one out of Jenga blocks. But I suspect you have more specific criteria? What's the simplest model you would accept?
 
As I said to Mick: I'm not sure what AE911 has got to do with the question at hand.

The type of model that needs to be focussed on depends on whether you agree with AE911's assertions or not.

If you are asking for a fully detailed model, then that implies that you disagree with AE911 about the arresting, and simply think the building fell a bit fast. So a detailed model would clear that up - but there's still no good math that justifies the need for such a model.

But if you agree with AE911, then there is no need to spend millions on a full model; instead you can demonstrate the general principle with a low-resolution ballpark model.
 
This could be true, but it is not a proof, because it assumes the collapse was natural, with the first confirming the second and the second confirming the first. There are many reasons to reject CD, but this is not one of them.
Did you find evidence for CD? I was not talking about CD; I was saying the collapse progression was confirmed, twice, and math can be used to check it.
Natural? Fire was not natural on 911; it was on purpose, a crime. It does not matter what started the collapse, "natural" or CD limited to the impact areas: the collapse will not arrest, it will do what was seen in two full-scale models. The collapse progression is no surprise based on the structure of the WTC, something Robertson knew immediately; for most people it would take some study of the WTC structure. I had no problem on 911 understanding it was fire, based on observation of events (including the planes being commercial), and gained a better understanding when I studied how the WTC was built. I have no doubt fire can start a collapse, steel fails quickly in fire, and CD proves only a tiny amount of energy is required to start a global collapse, with gravity doing the work of demolition.

How do we model the fires or explosives? Fires are a most inefficient way to destroy buildings; it took office fires with the heat energy of 2,700 tons of TNT to start the collapse in the towers. The gravity collapse which destroyed the WTC complex released only about 137 tons of TNT of potential energy per tower. CD is not done with fire; it is inefficient. The WTC towers could be destroyed with small amounts of explosives, from similar locations, for a top-down failure, using the mass/gravity to do the rest.
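A back-of-envelope way to see where a figure like 137 tons of TNT per tower can come from is to convert the tower's gravitational potential energy into TNT equivalent. The ~290,000-tonne mass and the centre of mass at half the 417 m height used below are round-number assumptions for illustration, not NIST figures:

```python
# Order-of-magnitude check: gravitational potential energy of one tower as TNT equivalent.
# Mass and centre-of-mass height are illustrative assumptions.

g = 9.81                            # m/s^2
tower_height_m = 417.0              # approximate tower height
mass_kg = 2.9e8                     # assumed ~290,000 tonnes above grade
com_height_m = tower_height_m / 2   # assume mass roughly uniform with height
joules_per_ton_tnt = 4.184e9        # standard TNT equivalence

potential_energy_j = mass_kg * g * com_height_m
print(f"potential energy: {potential_energy_j:.2e} J")
print(f"TNT equivalent:   {potential_energy_j / joules_per_ton_tnt:.0f} tons")  # ~140 tons
```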

Combined with the fact there were no explosives, thermite or other devices used on 911, and the complete lack of evidence for the same, it is proof the collapse continues, as seen. What we saw is proof of the collapse not stopping after starting.

Full scale models are better than scale models to demonstrate collapse progression.

For 911 truth, no scale model would be enough to keep them from promoting the claim that the "official narrative" is like a fairy tale, not a factual account, or spreading the claim that the "official" investigation was effectively a disgraceful cover-up, with little revealed and much concealed. For these tag lines, which might sound great to a 911 truth follower, models or investigations may not be enough to break their illusion of an inside job or CD. I suspect most believers of 911 truth claims quickly figure out the claims from 911 truth are baseless, and they rejoin reality.

It is tough to model the collapse progression at a level that satisfies doubters. If they are unable to grasp that a floor fails at a certain mass, and have no knowledge that the WTC was a system depending on floor-to-core-to-shell integrity, how do you do it?
Scaling gravity, connection strength, mass, etc. makes it hard to build a model that models the collapse; it is much easier to make a model of the structure's appearance for a wind tunnel test...
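The core of that scaling problem is the square-cube law: member and connection strength scales roughly with cross-sectional area (length squared) while weight scales with volume (length cubed), so a small geometric replica is proportionally far stronger than the real building. A minimal sketch of that ratio (the example scale factors are arbitrary):

```python
# Square-cube illustration of why small physical models resist collapse.
# Strength ~ L^2, weight ~ L^3, so strength-to-weight improves as 1/scale.

def relative_strength_to_weight(scale: float) -> float:
    """Strength-to-weight at 'scale', relative to the full-size structure (scale = 1)."""
    strength = scale ** 2   # member/connection capacity ~ cross-sectional area
    weight = scale ** 3     # gravity load ~ volume
    return strength / weight

for scale in (1.0, 1 / 10, 1 / 100, 1 / 500):
    print(f"scale 1:{1 / scale:<5.0f} relative strength-to-weight = {relative_strength_to_weight(scale):,.0f}x")
```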

I like all the models I have seen: the really bad wire models, or my favorite, the in-box plastic tray models. Could you imagine 10-foot-thick plastic ceilings and walls? Now that is a model.

I like the washer model; it models a floor that can hold a set mass, all the way down; the collapse progresses, accelerating, and the speed reflects the momentum transfer of the collisions. This can be done mathematically for the WTC, and it matches the collapse speed profile.

How can a collapse with more mass than a floor can hold be arrested by floors which can't hold the mass? Even if we magically stop the collapse at each floor and then hit the "play" button, the floor fails.
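A minimal numeric sketch of that washer-model bookkeeping is below. It treats each floor as a lumped mass with a fixed capacity, accretes mass through perfectly inelastic collisions, and ignores column resistance, debris shedding and everything else; the storey mass, capacity and heights are placeholders rather than WTC data:

```python
# One-dimensional "washer model": a falling block accretes floors one at a time.
# Collisions are perfectly inelastic; a floor is consumed whenever the static mass
# bearing on it exceeds its capacity. All numeric values are placeholders.

G = 9.81                   # m/s^2
STOREY_H = 3.7             # m per storey
FLOOR_MASS = 3.0e6         # kg per floor (placeholder)
FLOOR_CAPACITY_KG = 9.0e6  # mass one floor can statically hold (placeholder)
N_FLOORS = 90              # intact floors below the falling block
START_MASS = 12 * FLOOR_MASS   # assume a 12-storey upper block starts the fall

mass, velocity, elapsed = START_MASS, 0.0, 0.0
for floor in range(1, N_FLOORS + 1):
    # free fall through one storey: v^2 = u^2 + 2*g*h, then t = (v - u)/g
    u = velocity
    velocity = (u * u + 2 * G * STOREY_H) ** 0.5
    elapsed += (velocity - u) / G
    if mass <= FLOOR_CAPACITY_KG:
        print(f"floor {floor} can hold the falling mass -- collapse arrests")
        break
    # floor fails: momentum-conserving pickup of the floor mass
    velocity = mass * velocity / (mass + FLOOR_MASS)
    mass += FLOOR_MASS
else:
    total_h = N_FLOORS * STOREY_H
    print(f"collapse front reaches the ground after {elapsed:.1f} s over {total_h:.0f} m "
          f"(free fall over the same height: {(2 * total_h / G) ** 0.5:.1f} s)")
```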
Who can't reject Judy Wood's pool ball model at face value?

NIST did full-scale models of WTC floor sections to test, and full-scale sections of offices. These tests confirmed the WTC was up to specifications.
 
Well, I just built one out of Jenga blocks. But I suspect you have more specific criteria? What's the simplest model you would accept?

Was that "model" leaning on a slanted surface or not?

Supposedly you were modelling the "floors" on opposite sides of the core, but in the real building the floor outside the core was a single continuous piece, whereas you had two separate pieces that were not connected in any way.

Did your components sustain any "damage" in the collapse?

What is a model supposed to demonstrate in relation to the real thing?

=============================================

Obviously my "model" is not to scale and I never said it was. It is not a tube-in-tube structure. I think your claiming that your model had a "core" is complete nonsense. Your Jenga block core had nothing comparable to beams, but you make a big deal of trusses.

"Scale" models cannot be made without complete data on the buildings so why everyone is not demanding that data is a mystery to me.

psik
 
 

Attachments

  • column 704.JPG
This one's for Pskey.... does this make sense to you or is it wildly off the mark fantasy?
The weights align with the published total column weights.
The size of the bracing is not easy to find out. I suspect it was not large sections.... Wide flanges supported core floor loads, elevator rails... and held the columns in alignment... I don't think the brace/floor beams low down were substantially larger than up top. They hardly would have prevented ROOSD or similar inside the core... and would add more rugged pieces of steel to bash up the slabs in the core in the collapse.

Go for it!
 

Attachments

  • FOS Study 2013.pdf
  • Flooor 80.pdf
"Scale" models cannot be made without complete data on the buildings so why everyone is not demanding that data is a mystery to me.

Because a need for an exact scale model (which, if physical, would have to be very large, at least 1/10th scale, and either way very expensive) has not been demonstrated.
 
I don't think the brace/floor beams low down were substantially larger than up top.
They were the same, as each only carried the weight of that one floor. Two floors carrying heavy machinery were a different construction. The columns got thicker towards the bottom; to keep things modular, they were the same external measurement all the way, but with thick steel at the bottom, getting thinner towards the top.

http://911research.wtc7.net/wtc/arch/floors.html
 
They were the same, as each only carried the weight of that one floor. Two floors carrying heavy machinery were a different construction. The columns got thicker towards the bottom; to keep things modular, they were the same external measurement all the way, but with thick steel at the bottom, getting thinner towards the top.

http://911research.wtc7.net/wtc/arch/floors.html

The core columns were not the same external dimension... the facade columns were, except at the mech floors, where they were actually wider than those above and below.

The floor braces inside the core indeed were designed for the same floor loads and would be uniform on similar floors... except that where there were no elevator shaft penetrations, additional beams were used to support the flooring inside the core. The mech floors were obviously stronger, as the superimposed dead loads on them were much greater. Strong as they were... they were no match for thousands of tons of material crashing down on them.
 
This can be done mathematically for the WTC, and it matches the collapse speed profile.
Then really a lot of the hard work with respect to creating a model of the Towers collapsing has been done, would you agree? All that would be required is for these mathematical models to be applied to a 3D environment and animated...?
 
Because a need for an exact scale model (which, if physical, would have to be very large, at least 1/10th scale, and etither way very expensive) has not been demonstrated.
In what way would it need to be demonstrated to you that there may be a need for the most catastrophic and politically significant structural failures in history to be modelled? As we have agreed, Moore's Law suggests it will not always be so expensive to gain access to computing power.
 
What you call a salient point was such a transparent appeal to authority I didn't bother to repeat it. Even if everyone in the world with a structural engineering degree thought the mechanism of progressive collapse leading to global failure had been understood, it wouldn't bring a model that illustrates this understanding any closer to becoming a reality.

Yes, it's an appeal to authority, which, if that is a true authority, is perfectly alright.

The CTBUH, ASCE and other relevant organizations have endorsed the idea that collapse was inevitable. Who are you or I to contradict the experts?
 
Yes, it's an appeal to authority, which, if that is a true authority, is perfectly alright.

The CTBUH, ASCE and other relevant organizations have endorsed the idea that collapse was inevitable. Who are you or I to contradict the experts?
No, it's not "alright". It's easy to think of other experts that disagree with your authorities but that's beside the point. Unless the CTBUH, ASCE etc have built models of the collapse progression or commented on their possibility, your appeal to them is an irrelevance.
 
In what way would it need to be demonstrated to you that there may be a need for the most catastrophic and politically significant structural failures in history to be modelled? As we have agreed, Moore's Law suggests it will not always be so expensive to gain access to computing power.

But only the Truthers see the need. For the majority of people who don't see any indication of bombs, there's no evidence to justify wasting money on such a thing.

Really, the onus is on the Truther community to make the case. They could do that with a simplified open-source computer model that represents a plausible collapse scenario. You think that's too much to do over the next five years? I could do it in a month, part time.

I might even give it a go some day, I'm just looking for a nice Javascript physics engine with destructible connections.
 
But only the Truthers see the need. For the majority of people who don't see any indication of bombs, there's no evidence to justify wasting money on such a thing.

Really, the onus is on the Truther community to make the case. They could do that with a simplified open-source computer model that represents a plausible collapse scenario. You think that's too much to do over the next five years? I could do it in a month, part time.

I might even give it a go some day, I'm just looking for a nice Javascript physics engine with destructible connections.
This is precisely the point... the engineering and scientific community, having reasonable knowledge about the structural design and the impact and fires, see no reason to go to the expense of explaining what makes sense to them.

[...]
 
No, it's not "alright". It's easy to think of other experts that disagree with your authorities but that's beside the point. Unless the CTBUH, ASCE etc have built models of the collapse progression or commented on their possibility, your appeal to them is an irrelevance.
No! Both institutions have endorsed the NIST reports as far as the general collapse propagation mechanism goes. It is not required that they redo the modeling in order to agree with it.

http://www.princeton.edu/~achaney/tmve/wiki100k/docs/Appeal_to_authority.html

...arguments from authority are an important part of informal logic. Since we cannot have expert knowledge of many subjects, we often rely on the judgments of those who do. There is no fallacy involved in simply arguing that the assertion made by an authority is true. The fallacy only arises when it is claimed or implied that the authority is infallible in principle and can hence be exempted from criticism.

Here we have an authoritative organization, NIST, and at least two more which agree, for the most part, with the report by NIST. How many more authoritative endorsements are required in order to overcome a simple appeal to authority?
Then there is Bazant, whose approximation showed 30 times the force on the floor pan that was required to fail it. For those questioning column failure, Bazant also showed that, had all the force been directed at the columns, it was sufficient to fail them.
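For readers wondering where factors like "30 times" come from, the generic energy-balance estimate below shows how a one-storey drop produces an impact force many times the static weight. The 3.7 m drop and the assumed deflection at failure are placeholders, and this is a textbook impact-factor calculation, not Bazant's actual analysis:

```python
# Textbook impact-factor estimate: a mass falling a height h onto a support that
# deflects a distance d before failing sees an average force of roughly
# m*g*(h + d)/d, i.e. (h + d)/d times its static weight. Values are placeholders.

drop_height_m = 3.7    # roughly one storey of free fall (assumption)
deflection_m = 0.10    # deflection absorbed before failure (assumption)

amplification = (drop_height_m + deflection_m) / deflection_m
print(f"dynamic load is roughly {amplification:.0f}x the static weight")  # ~38x here
```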

However, yes, some factions of society still want a completely open-data computer model of the towers. One such group touts a cadre of 2,000 engineers and architects. Presumably this would supply enough experts to do what NIST did and generate the data inputs necessary to run a computer model of the towers. THAT is precisely what psikeyhackr is asking for.
Their entire raison d'etre reads as an appeal to their authority, yet they have done little more than advertise that supposed authority.

Should it be incumbent upon non-experts, and those who do in fact accept the NIST conclusion that progression to global collapse was inevitable, to come up with data inputs that satisfy those who do not accept it?
 
No - if "Engineers for 9/11 truth" or anyone else thinks their claims to expertise are sufficient for them to be taken seriously, then they must also think they have sufficient expertise to [...] well, do it themselves!! :rolleyes: (I redacted myself :))
 
Really, the onus is on the Truther community to make the case. They could do that with a simplified open-source computer model that represents a plausible collapse scenario. You think that's too much to do over the next five years? I could do it in a month, part time.

I might even give it a go some day, I'm just looking for a nice Javascript physics engine with destructible connections.

Exactly. Of course if you do, and it too demonstrates global collapse, it will immediately be attacked for your choice of inputs and your lack of structural engineering authority. It MAY induce AE911T to finally do it themselves, but I give that at best a 50-50 chance.
 
Now I'm thinking I'll look into using Blender to build a model. While it's not engineering-grade physics, it should suffice to demonstrate the principle of collapse, and how it varies with scale. It has scriptable constraints. But most of the existing models seem to be solid-block-type things.
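In case it helps anyone who tries the Blender route, here is a rough sketch of the kind of script involved: a small stack of "floor" slabs joined by breakable FIXED rigid-body constraints, with a heavier block dropped on top. The bpy API names are from memory, so treat them as assumptions to check against your Blender version, and every numeric value is an arbitrary placeholder:

```python
# Rough Blender (bpy) sketch: a stack of slabs joined by breakable constraints.
# Run inside Blender's scripting workspace. API names and all numbers are
# assumptions/placeholders -- verify against the Blender version you use.
import bpy

N_FLOORS = 10
FLOOR_GAP = 1.0          # vertical spacing between slab centres
BREAK_THRESHOLD = 50.0   # constraint breaking threshold (placeholder)

slabs = []
for i in range(N_FLOORS):
    bpy.ops.mesh.primitive_cube_add(size=1.0, location=(0, 0, i * FLOOR_GAP))
    slab = bpy.context.object
    slab.scale = (4.0, 4.0, 0.1)                 # flatten the cube into a "floor"
    bpy.ops.rigidbody.object_add()
    slab.rigid_body.type = 'PASSIVE' if i == 0 else 'ACTIVE'
    slab.rigid_body.mass = 10.0                  # placeholder mass
    slabs.append(slab)

# Join neighbouring slabs with breakable FIXED constraints hosted on empties.
for lower, upper in zip(slabs, slabs[1:]):
    bpy.ops.object.empty_add(location=(0, 0, (lower.location.z + upper.location.z) / 2))
    empty = bpy.context.object
    bpy.ops.rigidbody.constraint_add()
    con = empty.rigid_body_constraint
    con.type = 'FIXED'
    con.object1, con.object2 = lower, upper
    con.use_breaking = True
    con.breaking_threshold = BREAK_THRESHOLD

# Drop a heavier "upper block" onto the stack to see whether the joints cascade.
bpy.ops.mesh.primitive_cube_add(size=1.0, location=(0, 0, N_FLOORS * FLOOR_GAP + 5))
block = bpy.context.object
block.scale = (4.0, 4.0, 1.0)
bpy.ops.rigidbody.object_add()
block.rigid_body.mass = 100.0                    # placeholder: heavy enough to break joints
```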

 
When I was with AE911T I suggested that they produce a building performance study... even engineer the towers, since so many of the design criteria were known. Use their members to do it. They had no interest in this. Imagine if their guys, when actually looking at the towers, concluded that they could collapse without devices! Yikes, they would be out of business, shooting themselves in the foot. They simply can't risk being wrong, and so they prattle on about the investigation that no one else thinks is relevant. Do we know everything? No! I don't even think NIST got it right... but that's just a detail... not that the collapse would progress until nothing was left.
 
But only the Truthers see the need. For the majority of people who don't see any indication of bombs, there's no evidence to justify wasting money on such a thing.

Really, the onus is on the Truther community to make the case.
Fully agree. The big issue is the governance policy issue - how far should a community go to satisfy the demands of a fringe minority.

And the first step for any group - fringe, minority or even majority - is to make a prima facie case that the demanded activity is justified. The truth movement has not produced the goods at either of those levels.

It's the same deal if we go to the technical level of "CD" or MHI claims. Until a prima facie case is presented there is nothing to respond to other than a little bit of noise in a corner of the Internet, marginal publications and AE911 banners in NYC etc. No significant substance to address.

They could do that with a simplified open-source computer model that represents a plausible collapse scenario. You think that's too much to do over the next five years? I could do it in a month, part time...
Here I partly disagree and I am clearly the lone voice in the wilderness on this forum as on others. So be it.

I think the aspect that has been overlooked so far is "who is the target audience" - so it is a marketing issue. I identify two distinct targets for now - there will be more but two will suffice. They are:
1) the professional academic and industry practitioners who need quantified sound physics for their purposes. AND
2) The audience of lay people who could benefit from a "working model" which they can see, to assist understanding in qualitative terms - not quantitative - of how the collapses progressed.

I think that we've been implicitly focussed on "1)" - the needs of "2)" are different.

As a means of influencing professional level understanding and opinion I stay with my posted claims viz:
a) A model of the collapse progression stages for WTC1 and WTC2 cannot prove anything at the level of overall macro mechanism that cannot be proved at least as well by the available evidence of the real event. By that I mean the manner in which "Three Mechanisms" explains that stage:
(i) Open Office Space floor strip-down by falling weight,
(ii) peel-off and fall-away of the perimeter, AND
(iii) destruction of the core, dominated by strip-down of beams analogous to "(i)".

And for each of those three sub-mechanisms the numeric values of relevance are all in the "overwhelming" range. Orders of magnitude more force/energy available than is needed.

So neither the understanding of mechanism nor the quantification necessary at that level for professional use could benefit from modelling.

It becomes a different question if the aim of the model is to quantify some lower level detail e.g. shear failure of joist to column connectors. But no one seems to mean modelling of details.

b) As for modelling the cascade failure of the initiation stages - I still hold that it couldn't be done to any useful degree of accuracy for reasons I've posted. So I won't bore members by repeating the arguments unless anyone wants them.

So - yes it is possible to make some physical and some computer models. But I don't see any clear reason for doing it to satisfy legitimate needs of the professional community.

The situation for the lay community could well be different.

There could well be a valid need for building models which demonstrate the collapse mechanisms for lay persons who cannot visualise what happened. (And for the large proportion of engineers who have limited visualising skills - I won't derail into career-experience war story examples unless it becomes necessary.)

I might even give it a go some day, I'm just looking for a nice Javascript physics engine with destructible connections.
Our colleague OneWhiteEye delights in doing the basic physics modelling. We routinely agree to differ when my focus is on the pragmatics of explaining the real events. The combination of basic physics research and pragmatic analyses of the real events can often be complementary. Go for it if it is your interest area.
 
It seems that the NIST models were not useful and didn't demonstrate that the truss sag was the culprit. The WTC7 GIF doesn't resemble the global collapse very closely at all either. I think the expectations for a model are realistically unachievable... it will never look exactly like what happened.
 
My primary interest in modeling is in explaining things. I don't think I'd ever get anything that's a sufficiently high fidelity model of the WTC that would satisfy anyone, let alone look like the actual collapse. But I can demonstrate two things:

1) The mode of collapse - how a structure can progressively collapse without crushing the supports
2) How things vary with scale
I think the aspect that has been overlooked so far is "who is the target audience" - so it is a marketing issue. I identify two distinct targets for now - there will be more but two will suffice. They are:
1) the professional academic and industry practitioners who need quantified sound physics for their purposes. AND
2) The audience of lay people who could benefit from a "working model" which they can see, to assist understanding in qualitative terms - not quantitative - of how the collapses progressed.

I think that we've been implicitly focussed on "1)" - the needs of "2)" are different.

So I'm really only interested in 2) there.
 
No! Both institutions have endorsed the NIST reports... [etc] Should it be incumbent upon non-experts, and those who do in fact accept the NIST conclusion that progression to global collapse was inevitable, to come up with data inputs that satisfy those who do not accept it?
None of this has got anything to do with building models that haven't been built yet. Why wouldn't you welcome the effort to model Bazant's theories in a virtual (or for that matter physical) environment?
 
How much would you charge?

I wouldn't charge, because that's not the sort of thing I'd do for hire. And once you start paying for things, the expectations are different. I'd do it out of personal interest, and as a teaching tool.
 
My primary interest in modeling is in explaining things. I don't think I'd ever get anything that's a sufficiently high fidelity model of the WTC that would satisfy anyone, let alone look like the actual collapse. But I can demonstrate two things:

1) The mode of collapse - how a structure can progressively collapse without crushing the supports
2) How things vary with scale


So I'm really only interested in 2) there.
Understood and respected.

Your blocks model is a good example of a physical model of that genre. And it could be rebuilt with realistically shaped blocks on, say, a 5- or 10-storey "sample" model.

If the perimeter columns were connected in various groupings it could show realistic variances in the size of the "peel-off sheets" - in that setting "cheating" - by fixing a few joints - would be legitimate to meet the objective of visual demonstration of mechanism.

And for a doubling or more of complexity it could also be extended to include the core strip-down - thereby completing the triumvirate of the "Three Mechanisms" of the actual progression-stage collapse mechanism which occurred on 9/11. And no column-crushing confusion involved.

And best of luck to anyone who thinks that the cascade of the initiation stage can be modelled visually for lay people. I'll stick with cascades of dominoes and hope that the said lay person can translate across a couple of orders of analogy. I've got away with that one many times.

After explaining to the lay person - or otherwise disinterested engineer - that there were two stages, one easy to explain and the other not so, I then explain the progression in about 40 seconds using the "Three Mechanisms". They usually accept that the cascade initiation stage was "like dominoes toppling one after the other - only a bit more complicated". The engineers are usually comfortable once you say "it was a complicated cascade".
 
None of this has got anything to do with building models that haven't been built yet. Why wouldn't you welcome the effort to model Bazant's theories in a virtual (or for that matter physical) environment?
Bazant's theories have been partly modelled by psikeyhackr and those engaged in discussion with him. psikey's paper loops model is a Bazantian model - begging all the additional constraints that several members have identified. On the computer side - whilst it is not a model per se - the recent paper by Szuladzinski, Szamboti and Johns is a mathematical critique of Bazant's model. It suggests that Bazant got the numbers wrong and collapse à la Bazant should have halted.

The big issue remains that Bazant's theories as published are not valid for the WTC 9/11 collapse... but that is a different issue. And that comment would be considered lèse-majesté by many. :rolleyes:

Bazant's theories are IMO inherently easier to model than the real 9/11 events.
 
None of this has got anything to do with building models that haven't been built yet. Why wouldn't you welcome the effort to model Bazant's theories in a virtual (or for that matter physical) environment?
I would welcome the new models. Not that I think it's necessary, but it would be interesting. I question why it hasn't been done by those most involved in demanding it, and I have pointed out that those who already accept the so-called ROOSD-driven collapses have the least impetus to redo the modeling.
 
.....

The big issue remains that Bazant's theories as published are not valid for the WTC 9/11 collapse... but that is a different issue. And that comment would be considered lèse-majesté by many. :rolleyes:

Bazant's theories are IMO inherently easier to model than the real 9/11 events.
The reason they are easier to model though, is precisely what makes them so much less than what-really-happened.
 
The reason they are easier to model though, is precisely what makes them so much less than what-really-happened.
And ain’t that the truth. Probably the biggest cause of confusion in these sorts of discussions AND in the formal cloisters of academic/professional publishing.

PS Hey - I've been promoted. "Member".

Does the pay increase?
 