NIST's Rationale For Not Releasing Simulation Data

Hitstirrer

Active Member
What is the reason given for not making critical information public, and who could possibly find selective release of this data acceptable?

If I am under investigation can I be "quite transparent" with some facts and "not transparent" with other facts? I don't think so.

The closer one looks at any part of this the more questions arise. And the more it stinks of coverup and deception.
(I don't mean to be "off topic" - how else to ask the question?)

The reason constantly given by NIST's lawyers when appeals for FOIA info are heard is that release of the data would 'jeopardise public safety'. When pushed, they go on to suggest that the data could be used by potential terrorists to program their own computer and use that to discover how to bring down other steel-frame high-rise buildings.

Many people, including judges hearing the appeals, take that at face value, and against the background of a general fear of being accused of aiding terrorists they nod and agree with that contention.

However, if critical thinking skills are applied for a few moments it becomes clear that data on one building that no longer exists would be useless for modelling a totally different building. And if a future terrorist really planned to destroy a 47-story building elsewhere, would they seriously use that data to discover how to place incendiaries to cause fires that would eventually cause global collapse?

Isn't it more likely that they would just arrange for a pretty large explosion either inside or close to that building, which wouldn't require them to first study data unrelated to their planned attack site? And then have a much more certain result than hoping that, for the second time in history, a fire could be persuaded to cause global collapse.

The reason given for the non-release of the data is patent nonsense when examined for a few moments.
 
What is missing are the connection models. Now while the structural geometry of the building is unique, connections are reasonably generic. So it's conceivable they could be used - for example, to decide which column to take out with a single bomb. Seems very unlikely to me though.
 
The reason constantly given by NIST's lawyers when appeals for FOIA info are heard is that release of the data would 'jeopardise public safety'. When pushed, they go on to suggest that the data could be used by potential terrorists to program their own computer and use that to discover how to bring down other steel-frame high-rise buildings.

Many people, including judges hearing the appeals, take that at face value, and against the background of a general fear of being accused of aiding terrorists they nod and agree with that contention.

However, if critical thinking skills are applied for a few moments it becomes clear that data on one building that no longer exists would be useless for modelling a totally different building. And if a future terrorist really planned to destroy a 47-story building elsewhere, would they seriously use that data to discover how to place incendiaries to cause fires that would eventually cause global collapse?

Isn't it more likely that they would just arrange for a pretty large explosion either inside or close to that building, which wouldn't require them to first study data unrelated to their planned attack site?
Yes.
What is missing are the connection models. Now while the structural geometry of the building is unique, connections are reasonably generic. So it's conceivable they could be used - for example, to decide which column to take out with a single bomb. Seems very unlikely to me though.

I'm reminded of Luke Skywalker learning the weak point and key to the destruction of the Death Star.

Yes, their reasoning does sound a little questionable, there. If the official interpretation of events is correct then the terrorists presumably had no intention whatsoever to bring down any building at all, let alone exploit a profound weakness within one.

The towers were considered indestructible, and what was sought was a barbaric but painfully simple means of achieving a high-impact terror attack at the heart of the US, with mass death, and not to reduce the buildings to dust. It then follows that they were inordinately lucky to pull one building down, let alone all three, and must've been totally shocked by the magnitude of what subsequently occurred, as they watched the buildings collapse on satellite television from afar.

So withholding data on the grounds that it may aid terrorists in some aim to pull buildings down seems a little insincere. Never heard of a terrorist that would actually 'need' to do that.
 
The towers were considered indestructible, and what was sought was a barbaric but painfully simple means of achieving a high-impact terror attack at the heart of the US, with mass death, and not to reduce the buildings to dust. It then follows that they were inordinately lucky to pull one building down, let alone all three, and must've been totally shocked by the magnitude of what subsequently occurred, as they watched the buildings collapse on satellite television from afar.

So withholding data on the grounds that it may aid terrorists in some aim to pull buildings down seems a little insincere. Never heard of a terrorist that would actually 'need' to do that.

I'm sure you've never heard of a terrorist who would need to work out all the technicalities of building a nuclear bomb either; that does not mean we should provide things that might be helpful.

I don't think it's incredibly likely that terrorists will run a simulation of the Sears Tower to see if they could bring it down with a truck bomb. But I can certainly understand the "why take a chance" mentality - as well as the "I'm not taking responsibility for that" mentality.
 
The reason constantly given by NIST's lawyers when appeals for FOIA info are heard is that release of the data would 'jeopardise public safety'. When pushed, they go on to suggest that the data could be used by potential terrorists to program their own computer and use that to discover how to bring down other steel-frame high-rise buildings.

To the people responsible for determining whether something is a security issue, this type of information could be extrapolated to other situations, and that makes it a potential risk. It may not make sense when viewed from the outside, but from their point of view classifying the information is a precaution they can take to prevent misuse. They are being overly cautious, but they would be castigated if they declassified the information and someone did find a way to do something nefarious with the data.

Another way to look at it is they are covering their butts.
 
The fact is that there is enough data already released to produce a duplicate simulation as accurate as the NIST simulation probably was. There is NO real shortfall of data.
I'm neutral here, just wondering. @Mick West just uploaded a scan indicating 74,777 or '80 percent of all responsive' records are withheld. So the fundamental data is within this 20% that is within the public domain? 20% just seems a little short, to me, to confidently assert there is "no real shortfall in data", or that one could produce "a duplicate simulation as accurate as the NIST simulation probably was", that's all.
 
To the people responsible for determining whether something is a security issue, this type of information could be extrapolated to other situations and that makes it a be a potential risk. It may not make sense when viewed from the outside but from their point of view classifying the information is a precaution they can take to prevent misuse of the information. They are being overly cautious but they would be castigated if they declassified the information and someone did find a way to do something nefarious with the data.
This is an example of the argumentum ad metum fallacy. You may as well classify the information that paracetamol is good for headaches in small doses.
 
I'm neutral here, just wondering. @Mick West just uploaded a scan indicating 74,777 or '80 percent of all responsive records' are withheld. So the fundamental data is within this 20 percent that is within the public domain? 20% just seems a little short, to me, to confidently assert there is "no real shortfall in data", or that one could produce "a duplicate simulation as accurate as the NIST simulation probably was", that's all.

Since that was released, the blueprints have been released. In theory you could create all the connections from that. Just lots more work.

But then you'd want to verify them against the blueprints anyway.
 
This is an example of the argumentum ad metum fallacy. You may as well classify the information that paracetamol is good for headaches in small doses.

It's a balance. The possible benefit vs. the possible harm. There's very little (if any) possible benefit to officials in releasing the data. There's plenty of benefit in selling Paracetamol (Tylenol/Acetaminophen).
 
I'm neutral here, just wondering. @Mick West just uploaded a scan indicating 74,777 or '80 percent of all responsive' records are withheld. So the fundamental data is within this 20% that is within the public domain? 20% just seems a little short, to me, to confidently assert there is "no real shortfall in data", or that one could produce "a duplicate simulation as accurate as the NIST simulation probably was", that's all.
If 1% of it was missing, you could spend millions redoing it and the criticism would be... 'That's no good, you don't have all the data so it won't be accurate and will obviously not be the same as the NIST simulation'.
 
Whose data is it and who was NIST acting on behalf of when it was generated?

By that argument we should have open access to the CIA's databases of covert operations.

It's legal for them to withhold it. "But the public paid for it via taxes" is not an argument.
 
If 1% of it was missing, you could spend millions redoing it and the criticism would be... 'That's no good, you don't have all the data so it won't be accurate and will obviously not be the same as the NIST simulation'.

Sure, it would be best if they released it, from a verification point of view. The point is that it's possible.

AE911 should be able to do it with far less money using volunteers from their membership.
 
Sure, it would be best if they released it, from a verification point of view.
Surely if the NIST report into the collapse of WTC7 was a full and fair analysis of how the building came to be destroyed, one could argue that it is in essence a set of instructions detailing exactly how to bring down a high rise building.
AE911 should be able to do it with far less money using volunteers from their membership.
The question surely is what NIST used as input data. The specific details that they claim they were unable to model (connection details at columns 79 and 81) are there in the drawings for anybody to see. I agree that it would be relatively easy and cheap for a small team of engineers to model these details correctly. The question is therefore: why were NIST, with a huge team and access to substantial funds, unwilling or unable to do so?
 
By that argument we should have open access to the CIA's databases of covert operations.

It's legal for them to withhold it. "But the public paid for it via taxes" is not an argument.
False comparison. It's a model of a building, not a list of secret missions and "enhanced" interrogation techniques.
 
Surely if the NIST report into the collapse of WTC7 was a full and fair analysis of how the building came to be destroyed, one could argue that it is in essence a set of instructions detailing exactly how to bring down a high rise building.

Somewhat, which is why I said "from a verification point of view". From a "don't help building bombers" point of view, it's a bad idea to release the data. Trade off.
 
Somewhat, which is why I said "from a verification point of view". From a "don't help building bombers" point of view, it's a bad idea to release the data. Trade off.
By that standard, NIST should never have released the structural drawings for the building.
As an interesting aside, wouldn't it just be easier for a terrorist organisation to recruit a structural engineer?
 
This is an example of the argumentum ad metum fallacy. You may as well classify the information that paracetamol is good for headaches in small doses.
It is an example of the way things really work in the arena of classified information. I've seen quite a bit of technical information from different programs that, from an engineer's viewpoint, there was no need to classify. The people responsible for security had a different opinion based on the guidelines they were instructed to use. The paranoia surrounding the transfer of information went into overdrive after 9/11, and engineers who were able to deliver lectures overseas in 2000 found that same information being classified in 2002 and their lectures being cancelled. Given the high-profile security leaks of the last few years, it's not surprising that the level of concern has increased among security professionals.

As far as obtaining classified information goes, an officer at the documents office put it to me this way: there's a difference between "need to know" and "want to know", and unless I can present an acceptable argument demonstrating my need to know classified data, what I want is irrelevant. I was given the option of hanging around for the next fifty years until the information was declassified.
 
Again, this is just a model of a building. The question is why it would be classified: what is the rationale? It seems to be an argument from fear that has no purchase on reason.
 
Again, this is just a model of a building. The question is why it would be classified: what is the rationale? It seems to be an argument from fear that has no purchase on reason.
We don't actually know what's in the files, so we can't judge if their concerns are valid. I can't see how they would be, and you seem to have made the assumption they are not, but since we haven't seen the data we can't know for sure. What I do know is that if there is something exploitable about the data, once it's released it's out there and can't be retrieved.

The argument from fear fallacy could be applied to any classified data. Data is classified because of the fear or concern that someone may obtain it and use it against your interest.
 
We don't actually know what's in the files, so we can't judge if their concerns are valid. I can't see how they would be, and you seem to have made the assumption they are not, but since we haven't seen the data we can't know for sure. What I do know is that if there is something exploitable about the data, once it's released it's out there and can't be retrieved.

The argument from fear fallacy could be applied to any classified data. Data is classified because of the fear or concern that someone may obtain it and use it against your interest.
The thing is, we know EXACTLY what SHOULD be in the files, because we have the structural drawings and we have NIST's temperature estimates for the building. So if they have stated the facts in writing, why would they not release the data that would confirm that the inputs matched?
The fact of the matter is that NIST did not model connection details correctly in their models of WTC7.
 
The point that is being skated around is that we want their input data to see what parameters they set in there to enable their sim to globally collapse.

We already know that they set the outer columns at infinite stiffness. Infinite.

And that they told their computer that the five beams on floor 13 went from ambient temperature straight to 100C in 1.5 seconds.

Do you know of any physical way that burning office furniture, within a fire moving along in an organic way, can raise the temperature of five beams, covering a very large area, from ambient to 100C in 1.5 seconds?

They then told their computer that the concrete remained at ambient temperature during those 1.5 seconds.

Believe it or not, but their computer computed, according to its preset program, and decided that the shear studs would break free from the concrete due to 'differential thermal expansion'. NSS.
A 'New Phenomenon' was thus born, and announced to the world, from such unrealistic input data.

It's minor details like this that cause engineers to request to be told what other unrealistic parameters were set in NIST's cartoon simulation in order to achieve their desired result.

NIST have thus established a track record of crazy input data to break the studs and allow the beams to expand. And as that was at the very place that they said was the initiation of global collapse, is it really unreasonable to ask to look at the rest of their input parameters, to see what others were introduced in order to persuade their simulation program to comply with what was seen in the videos of the real event?

So, you see, asking others like AE911 to produce their own model from drawings and fire evidence isn't the point at issue.

If/when NIST's data is released, what if the two known unscientific input parameters mentioned above are just the tip of a huge false-input iceberg? And could that really be the reason that they hide behind the patently ridiculous excuse that it would jeopardise public safety?
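
To put a rough number on why that 1.5 second ramp could never be a real fire history, here's a back-of-envelope check (a minimal Python sketch; the beam section, exposed area and fire-flux figure are textbook-style assumptions for illustration, not values from NIST's withheld files):

[code]
# Rough lumped-capacitance check (illustrative assumptions, not NIST inputs):
# what surface heat flux would a bare steel beam need to absorb to go from
# 20C to 100C in 1.5 seconds?
c_steel         = 490.0   # specific heat of steel, J/(kg*K), textbook value
mass_per_metre  = 82.0    # kg/m, assumed W24x55-class floor beam
exposed_surface = 1.5     # m^2 of heated surface per metre of beam, assumed

dT   = 100.0 - 20.0       # temperature rise, K
time = 1.5                # ramp duration, s

energy_per_metre = mass_per_metre * c_steel * dT                # J per metre
required_flux    = energy_per_metre / (exposed_surface * time)  # W/m^2

print(f"Energy needed: {energy_per_metre / 1e3:.0f} kJ per metre of beam")
print(f"Required flux: {required_flux / 1e6:.1f} MW/m^2")
# Prints roughly 1.4 MW/m^2; post-flashover office fires deliver on the order
# of 0.1-0.2 MW/m^2, so a 1.5 s ramp is a numerical shortcut, not a real
# fire history.
[/code]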
 
The thing is, we know EXACTLY what SHOULD be in the files, because we have the structural drawings and we have NIST's temperature estimates for the building. So if they have stated the facts in writing, why would they not release the data that would confirm that the inputs matched?
The fact of the matter is that NIST did not model connection details correctly in their models of WTC7.

I'd frankly be very surprised if they were 100% accurate in transferring dimensions and configurations from the plans (some of which seemed to be in conflict) to the full model. It's very hard to have a massive data entry scheme run perfectly.

That would not change the criticality of any given error, of course. Just that inaccuracies are bound to occur. And simplifications are unavoidable.
 
And that they told their computer that the five beams on floor 13 went from ambient temperature straight to 100C in 1.5 seconds.

You know why they did that - to allow them to run an accurate simulation given the computing resources and time available. It's a straw man.

But if you want to debunk aspects of their methodology, then start a new focussed thread. This is about their stated rationale for not releasing the connection data (keeping terrorists from using it).
 
You know why they did that - to allow them to run an accurate simulation given the computing resources and time available. It's a straw man.

But if you want to debunk aspects of their methodology, then start a new focussed thread. This is about their stated rationale for not releasing the connection data (keeping terrorists from using it).

But my entry was specifically giving a reason for NIST not releasing data. It is exactly on topic.

Surely you have to accept that if somehow five huge steel beams can be made to rise in temperature from ambient to 100C in 1.5 seconds, whilst the concrete and shear studs are left at ambient, then shearing will inevitably occur. Of course they would shear. But only if that ridiculous 1.5 second rise from 20C to 100C could be made to happen. Otherwise composite floors worldwide would have been failing at such a rate that codes would have been amended 50 years ago.

As the basis of that is false, ergo the result is false.

That is data within the simulation that they are not releasing. It's just that this particular snippet happens to have been found inside the voluminous report and not in the data field itself.

No straw man. A key parameter that was used to obtain a result that in the real world wouldn't happen.

And, on topic, that could be one reason why NIST are not releasing simulation data. It could prove embarrassing if many such seemingly illogical parameters were set in order to achieve global collapse.
 
Shouldn't the government also do everything in its power to make all available videos of the 9/11 attacks classified? You know, if terrorists were to see videos of planes crashing into buildings, they could get ideas. Much more so than from getting some very specific data on the actual buildings hit.
It seems like a cop out to me.
 
But my entry was specifically giving a reason for NIST not releasing data. It is exactly on topic.

Surely you have to accept that if somehow five huge steel beams can be made to rise in temperature from ambient to 100C in 1.5 seconds, whilst the concrete and shear studs are left at ambient, then shearing will inevitably occur. Of course they would shear. But only if that ridiculous 1.5 second rise from 20C to 100C could be made to happen. Otherwise composite floors worldwide would have been failing at such a rate that codes would have been amended 50 years ago.

As the basis of that is false, ergo the result is false.

That is data within the simulation that they are not releasing. It's just that this particular snippet happens to have been found inside the voluminous report and not in the data field itself.

No straw man. A key parameter that was used to obtain a result that in the real world wouldn't happen.

And, on topic, that could be one reason why NIST are not releasing simulation data. It could prove embarrassing if many such seemingly illogical parameters were set in order to achieve global collapse.

No. The fact of 1.5 seconds, and not heating the concrete in one of the simulations, is already public knowledge. So why bring it up?
 
Shouldn't the government also do everything in its power to make all available videos of the 9/11 attacks classified? You know, if terrorists were to see videos of planes crashing into buildings, they could get ideas. Much more so than from getting some very specific data on the actual buildings hit.
It seems like a cop out to me.

It's general data, that arguably could be used on other buildings to determine explosives placements. It's not about getting ideas.
 
No. The fact of 1.5 seconds, and not heating the concrete in one of the simulations, is already public knowledge. So why bring it up?

Because it's relevant. And pertinent to the debate. And given as evidence that they use such input to obtain false results. And that is then evidence that they MAY have done it elsewhere also. It's a logic chain to suggest a reason why they won't release all simulation data.
 
It's a straw man. But if you want to debunk aspects of their methodology, then start a new focussed thread.

if somehow five huge steel beams can be made to rise in temperature from ambient to 100C in 1.5 seconds, whilst the concrete and shear studs are left at ambient, then shearing will inevitably occur. Of course they would shear. But only if that ridiculous 1.5 second rise from 20C to 100C could be made to happen. Otherwise composite floors worldwide would have been failing at such a rate that codes would have been amended 50 years ago. As the basis of that is false, ergo the result is false.
All simulations are broken up into individual moments in time, which are then represented by figures. There is no 'real' time in any simulation. All simulation time is, in a 'real' sense, false. You are calling a false thing false. I can't really disagree with you about that. Apart from that, your argument is wrong. To speed up time in a simulation will not increase thermal shock loading, because that loading is a time-dependent variable, and you're speeding up time, remember?
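
To make that concrete, here's a toy sketch (Python, assuming a simple quasi-static model and a textbook expansion coefficient; none of these numbers come from NIST's input files):

[code]
# Toy illustration: in a quasi-static model the thermal strain a member ends
# up with depends only on the temperature reached, not on the ramp time.
ALPHA_STEEL = 12e-6  # coefficient of thermal expansion, 1/K, textbook value

def final_thermal_strain(t_start, t_end):
    """Free thermal strain for a uniform temperature change (rate-independent)."""
    return ALPHA_STEEL * (t_end - t_start)

# The same 20C -> 100C rise, run over wildly different simulated durations:
for ramp_seconds in (1.5, 90.0, 1200.0):
    strain = final_thermal_strain(20.0, 100.0)
    print(f"ramp over {ramp_seconds:6.1f} s -> final strain {strain:.2e}")
# All three lines print 9.60e-04. Compressing the ramp changes nothing unless
# rate-dependent physics (conduction into the slab, creep, inertia) is in the
# model - which is the real point of contention here.
[/code]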

[...]
 
All simulations are broken up into individual moments in time, which are then represented by figures. There is no 'real' time in any simulation. All simulation time is, in a 'real' sense, false. You are calling a false thing false. I can't really disagree with you about that. Apart from that, your argument is wrong. To speed up time in a simulation will not increase thermal shock loading, because that loading is a time-dependent variable, and you're speeding up time, remember?


No, an understanding of science is what they aren't releasing.
It's general data, that arguably could be used on other buildings to determine explosives placements. It's not about getting ideas.

'Arguably' is very clearly the key word here. Maybe this point should be the topic of another thread: if NIST releases their data, could it aid terrorists in destroying more steel buildings through the use of airplanes and fire-induced collapse?
We've gone over the uniqueness of these buildings...if they're so unique then I don't see what the problem is. It's not like they could apply that same data to other buildings. If they're NOT unique, and this data could be used for the purpose of destroying other buildings, then it seems these terrorists would be able to have this type of information anyway, correct?
It all seems like a big game to me. A game of cover up the mistakes.
 
if they're so unique then I don't see what the problem is. It's not like they could apply that same data to other buildings. If they're NOT unique, and this data could be used for the purpose of destroying other buildings, then it seems these terrorists would be able to have this type of information anyway, correct?
This has been covered previously.

It all seems like a big game to me. A game of cover up the mistakes.
Twas ever thus.

Expanding the deleted topic in my last post, one could argue that NIST are promoting science in their report, in fact, by setting all the obtainable evidence out in an orderly manner, carrying out a timeline with meaningful tests of sub-elements, tying in video and stills, constructing reasonably-matching simulations and making references wherever possible. That's good science, and a good example, though of course it should be compared with others, because it's a human endeavor, and therefore fallible.
 
All simulations are broken up into individual moments in time, which are then represented by figures. There is no 'real' time in any simulation. All simulation time is, in a 'real' sense, false. You are calling a false thing false. I can't really disagree with you about that. Apart from that, your argument is wrong. To speed up time in a simulation will not increase thermal shock loading, because that loading is a time-dependent variable, and you're speeding up time, remember?

[...]

To a point I agree. But when one element is allowed to move from 20C to 100C and the other composite part isn't, then the simulated timescale is a red herring.

As you correctly say, the aim was to speed up a 'real' time of, say, 20 minutes into a 1.5 second event. And then the program was scaled accordingly from 20C to 100C. That is understandable by incremental expansion rates. I 'get' that.

However, to make that credible you would have to also scale up the concrete expansion expected over that same 20 minutes (speeded up to 1.5 seconds) rather than leave it at 20C throughout that entire 20 minutes (speeded up to 1.5 seconds).

We know that concrete's coefficient of expansion is similar to steel's, and that its thermal transfer rate is much different. But such factors must be fed into a sim in order to get a half-accurate result. By leaving the concrete at 20C throughout that entire 20 minutes (speeded up to 1.5 seconds) it would certainly cause a large thermal differential stress on the studs.

The point is: would that same stress force have been seen if the concrete had been allowed to incrementally expand within that same 20 minutes (speeded up to 1.5 seconds), with its own thermal transfer and coefficient of expansion parameters set rather than left at ambient throughout? And would that have shown the level of shear reported by not doing that?

NIST, as experts in this field, appear to have failed to program this pretty important part of the initiation theory correctly, and then relied on the result seen to announce a new phenomenon.
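
Here's that comparison in toy form (a minimal Python sketch using textbook expansion coefficients; the slab temperatures are invented for illustration and are not NIST's inputs):

[code]
# Free-expansion mismatch at the stud interface, slab heated vs slab pinned
# at ambient (illustrative coefficients and temperatures, not NIST's inputs).
ALPHA_STEEL    = 12e-6   # 1/K, typical structural steel
ALPHA_CONCRETE = 10e-6   # 1/K, typical normal-weight concrete

def mismatch_strain(dT_beam, dT_slab):
    """Difference in free thermal expansion between beam and slab."""
    return ALPHA_STEEL * dT_beam - ALPHA_CONCRETE * dT_slab

dT_beam = 80.0  # the 20C -> 100C rise applied to the beams

print(f"slab held at 20C:   {mismatch_strain(dT_beam, 0.0):.2e}")   # 9.60e-04
print(f"slab lagging (40K): {mismatch_strain(dT_beam, 40.0):.2e}")  # 5.60e-04
print(f"slab heated too:    {mismatch_strain(dT_beam, 80.0):.2e}")  # 1.60e-04
# Pinning the slab at ambient maximises the differential the studs must carry;
# letting the slab heat alongside the beam cuts it by roughly a factor of six.
[/code]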
 
To a point I agree. But when one element is allowed to move from 20C to 100C and the other composite part isn't, then the simulated timescale is a red herring.

As you correctly say, the aim was to speed up a 'real' time of, say, 20 minutes into a 1.5 second event. And then the program was scaled accordingly from 20C to 100C. That is understandable by incremental expansion rates. I 'get' that.

However, to make that credible you would have to also scale up the concrete expansion expected over that same 20 minutes (speeded up to 1.5 seconds) rather than leave it at 20C throughout that entire 20 minutes (speeded up to 1.5 seconds).

We know that concrete's coefficient of expansion is similar to steel's, and that its thermal transfer rate is much different. But such factors must be fed into a sim in order to get a half-accurate result. By leaving the concrete at 20C throughout that entire 20 minutes (speeded up to 1.5 seconds) it would certainly cause a large thermal differential stress on the studs.

The point is: would that same stress force have been seen if the concrete had been allowed to incrementally expand within that same 20 minutes (speeded up to 1.5 seconds), with its own thermal transfer and coefficient of expansion parameters set rather than left at ambient throughout? And would that have shown the level of shear reported by not doing that?

NIST, as experts in this field, appear to have failed to program this pretty important part of the initiation theory correctly, and then relied on the result seen to announce a new phenomenon.

You are mixing your models. The 1.5 seconds and unheated slabs were in the initial exploratory LSDYNA model. The actual 16-story ANSYS model used a full time scale, and heated everything, and the damage determined from that was taken forward to the full-sized LSDYNA model.


The concrete slab had temperatures applied to the shell element nodes at fire locations evenly spaced through the slab thickness and included temperature gradients through the slab. Each beam and column was assigned a single temperature (i.e., no thermal gradients across the section). Component studies had shown that columns were not heated sufficiently to warrant inclusion of a gradient, and that steel beams in the tenant floors heated at a nearly uniform rate, with the exception of the top flange which was adjacent to the floor slab. A uniform temperature gradient was determined to be a reasonable approximation for the temperature profile in the beam and girder sections (Section 10.3.2).
Content from External Source
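
In toy form, the scheme that passage describes looks something like this (a minimal Python sketch; the temperatures are invented, since the actual nodal values are in the withheld files):

[code]
# The two temperature-assignment schemes the quoted passage describes
# (invented temperatures; the real nodal values are in the withheld files).

def slab_layer_temps(t_top, t_bottom, n_layers=5):
    """Linear gradient through the slab thickness, one value per shell layer."""
    step = (t_bottom - t_top) / (n_layers - 1)
    return [round(t_top + i * step, 1) for i in range(n_layers)]

# Slab: a through-thickness gradient (hot underside facing the fire).
print("slab layers:", slab_layer_temps(t_top=80.0, t_bottom=400.0))
# -> [80.0, 160.0, 240.0, 320.0, 400.0]

# Beams and columns: one uniform temperature per member, no cross-section
# gradient, per the component studies the passage cites.
beam_temp = 500.0
print("beam (uniform):", beam_temp)
[/code]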
 
We've gone over the uniqueness of these buildings...if they're so unique then I don't see what the problem is.

Of course you don't. That is, in fact, the entire point of classification of information.

Someone has deemed that information to be sensitive in nature and, agree or not, that's the way it is. The very nature of information like that means that you cannot logically argue against the information's need to be secret.
 
This has been covered previously.


Twas ever thus.

Expanding the deleted topic in my last post, one could argue that NIST are promoting science in their report, in fact, by setting all the obtainable evidence out in an orderly manner, carrying out a timeline with meaningful tests of sub-elements, tying in video and stills, constructing reasonably-matching simulations and making references wherever possible. That's good science, and a good example, though of course it should be compared with others, because it's a human endeavor, and therefore fallible.

Sounds strange. That's like suggesting we should teach physics by dangling a physics book in front of someone and saying 'here's a small portion of what's in it, the rest is in our hands. Work it out yourself.'

It seems more counter productive than just revealing the information flat out.
 
This entry was posted on a different thread relating to shear studs in WTC7, as begun by mynym.

The contents are relevant to this thread also, as they suggest a rationale for NIST not releasing simulation data.

The post went as follows:


Thank you mynym.

As you say, NIST needed shear studs to be absent on that girder to even come close to making their theory work. That could be another reason why they now refuse to release the full set of drawings. Fortunately, Salverinas, who worked for Frankel, had a set of drawings that he used in presentations on tours around the country.

The PowerPoint slides of those drawings, which were found online, show studs on all elements on all floors - which is what all building professionals tell us would have to be the case.

That guy would not have risked ridicule from his audience of professionals on those tours if he had left studs off a transitional girder where differently orientated floor pans met. There would have been questions from the floor at first sight of that slide.

Good spot.
 
I'd frankly be very surprised if they were 100% accurate in transferring dimensions and configurations from the plans (some of which seemed to be in conflict) to the full model. It's very hard to have a massive data entry scheme run perfectly.

That would not change the criticality of any given error, of course. Just that inaccuracies are bound to occur. And simplifications are unavoidable.
TOTAL COP OUT. It is easier to get it RIGHT than it is to get it WRONG. You are struggling here, Mick. Ridiculous.
 