Szydagis' point 3: Interstellar travel is too hard

Ann K

Senior Member.
Another point from Szydagis' lengthy discussion supporting the possibility of alien visitors:

Consider a craft with 1g of acceleration (9.8 m/s²) as well achievable with our current tech, just not sustainable long term due to fuel requirements. Putting the fuel issue aside, we could do this today without killing the occupant. This 1g acceleration is sufficient to achieve near-light speed in 1 year
Content from External Source
https://uapx-media.medium.com/addre...-criticisms-against-studying-uap-5663335fe8c8

The phrase "Putting the fuel issue aside" struck me as hand-waving away a problem that is generally considered insurmountable. But I admit my expertise in physics is insufficient to articulate it clearly, so I'd like to open the question to more knowledgeable members. Is it possible to ignore fuel as a consideration for his postulated near-light-speed travel? He blithely assumes engineering can do it.
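As a sanity check on the kinematics only (leaving the fuel question alone): under special relativity, constant proper acceleration g sustained for shipboard (proper) time τ yields v = c·tanh(gτ/c). A minimal sketch in Python, with g and a year of shipboard time as the only inputs:

```python
import math

c = 299_792_458.0            # speed of light, m/s
g = 9.8                      # proper acceleration, m/s^2
year = 365.25 * 24 * 3600.0  # one year of shipboard (proper) time, s

# Constant proper acceleration: rapidity phi = g*tau/c, velocity v = c*tanh(phi)
phi = g * year / c
v_over_c = math.tanh(phi)
print(f"after 1 year at 1g: v = {v_over_c:.3f} c")
```

This lands around 0.77c, so the kinematic half of the quoted claim is roughly right; the question raised here is whether the energy to sustain it is even remotely plausible.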

The next naysayer argument is about how hard this would be and the fuel it would take. I am sorry, but that is a question of very clever engineering, not of new physics, and engineers are renowned for finding clever loopholes within the “known laws” of physics.
Content from External Source
I question his sunny optimism.
 
Distance is a problem for communication though too, isn't it?

Would it be fair to say that if we have been visited by aliens who live millions of light years away, then the aliens back on their home planet don't know about it yet? (Staying within known physics, as the author seems to want to.)
 
The obvious answer is you can't ignore vehicle fuel and its consumption as a consideration for postulated near-light-speed travel, or travel at any speed to anyplace for that matter. The point of this part of the paper is human physiological limitations/tolerance, however. He's saying if we can figure out how to deal with the fuel issue, the concept he espoused would be safe for humans. And yes, that's a huge IF.

During my career, we faced similar scenarios on finding design solutions to engineering problems. You look at proposed solutions as a function of what technology is required, and whether that technology currently exists (and can be modified if necessary) to produce the proposed designs. If the technology doesn't exist, you then look at what it would take to develop it. Many ideas are dropped at this point. This is all part of the systems engineering process.

In the real world, the biggest impediments to designing/testing/producing/fielding new/modified technology are schedule and funding. So what you are trying to find is a low risk technical solution that isn't prohibitively expensive, and can be successfully integrated and sustained within an overall system and concept of operations.

So could the concept Dr. Szydagis postulated be an insurmountable engineering challenge at any single point in time due to technology/cost limitations? Sure. But it doesn't change his point that the concept is physiologically safe if/when the challenges are overcome.
 
The main issue is that in order to sustain 1g until you reach near lightspeed with any human-scale object, the energy requirements are... significant.

Think on the order of the current entire energy output of Earth for a year for many years.
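That order of magnitude is easy to check. Assuming (illustratively) a 1,000 kg craft and roughly 6 × 10²⁰ J for annual world primary energy consumption, the relativistic kinetic energy K = (γ − 1)mc² works out to:

```python
import math

c = 299_792_458.0          # speed of light, m/s
m = 1_000.0                # kg: a small, crewless craft (assumed for scale)
world_year_J = 6e20        # J: rough annual world primary energy use (assumed)

for beta in (0.5, 0.9, 0.99):
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    ke = (gamma - 1.0) * m * c**2    # relativistic kinetic energy
    print(f"{beta:.2f}c: {ke:.2e} J = {ke / world_year_J:.3f} world-years of energy")
```

So even a one-tonne payload at 0.99c carries roughly a year of current global energy output as kinetic energy alone, before any propulsion inefficiency or the penalty of carrying its own fuel.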

Producing this energy with any method we can think of as practical creates a compounding problem: the fuel for that method adds weight, which increases the energy needed to accelerate, which requires more fuel, and so on.

Not to mention propellant, if required.

You can use matter-antimatter reactions, as that's the lightest way we can think of for storing energy, but we're an incredibly long way from being able to make and store antimatter in anywhere near sufficient quantities, and we would still need at least as much energy to create it as we produce globally for decades, maybe more depending on how efficient the process is.
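For scale, the ideal antimatter ("photon") rocket obeys a relativistic rocket equation with mass ratio m₀/m₁ = e^φ, where φ = atanh(v/c) is the rapidity. A sketch under the hopelessly optimistic assumptions of perfect annihilation and perfectly collimated exhaust:

```python
import math

def photon_rocket_mass_ratio(beta):
    """Ideal photon rocket: initial/final mass to reach speed beta*c from rest."""
    return math.exp(math.atanh(beta))  # e^rapidity = sqrt((1+beta)/(1-beta))

for beta in (0.1, 0.5, 0.99):
    r = photon_rocket_mass_ratio(beta)
    print(f"{beta}c: carry {r - 1:.2f} kg of fuel per kg of payload (half antimatter)")
```

Even in this best case, 0.99c means about 13 kg of matter/antimatter fuel per kilogram delivered, and total antimatter production to date is many orders of magnitude short of even a gram.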

You can use a solar-powered laser to push it, but then you need a space-based solar array that's planet-sized, which is a currently impossible engineering task.
 
So could the concept Dr. Szydagis postulated be an insurmountable engineering challenge at any single point in time due to technology/cost limitations? Sure. But it doesn't change his point that the concept is physiologically safe if/when the challenges are overcome.
Sure, but that statement is just as true if you change "engineering challenge" to "magical wishing challenge." If the challenge being overcome is not a reasonable possibility, it doesn't mean much to say "but if it could be, the ride would be quite comfy."
 
Sure, but that statement is just as true if you change "engineering challenge" to "magical wishing challenge." If the challenge being overcome is not a reasonable possibility, it doesn't mean much to say "but if it could be, the ride would be quite comfy."
Big difference between safe and comfortable.

I guess I have more confidence in man's ability to achieve than you do. In 1900, how many would have considered putting men on the moon nothing more than a "magical wishing challenge," only to see it accomplished in less than 70 years?
 
This is well debunked.

When travelling at anywhere near relativistic speeds, collision with a single atom can be catastrophic.

Here is a geeky article discussing the dangers of travelling at 20%c.

Just how dangerous is it to travel at 20% the speed of light?

The goal of Breakthrough Starshot is to accelerate its craft to about 20 percent the speed of light. At that speed, even individual atoms can damage the vehicle, and a collision with a bit of dust could be catastrophic. So the team set out to quantify just how risky these collisions could be.

This is 20%c!

I have seen debunkers do the calculations at 99%c. It is not pretty; I am searching for links.
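The scale of the hazard is simple to estimate: for an (assumed) 1-microgram dust grain, the kinetic energy in the craft's frame is K = (γ − 1)mc²:

```python
import math

c = 299_792_458.0         # speed of light, m/s
m_grain = 1e-9            # kg: an assumed ~1 microgram dust grain
TNT_J_PER_KG = 4.184e6    # J released per kg of TNT

for beta in (0.2, 0.99):
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    ke = (gamma - 1.0) * m_grain * c**2   # relativistic kinetic energy
    print(f"{beta}c: {ke:.2e} J  (~{ke / TNT_J_PER_KG:.1f} kg of TNT)")
```

At 0.2c a single grain already hits like roughly half a kilogram of TNT; at 0.99c it's over a hundred kilograms, per microgram of dust.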

https://www.youtube.com/shorts/lf9OMavhQ0A


Source: https://www.youtube.com/watch?v=OxHWImyWBKM
 
He blithely assumes engineering can do it.
There's a German word called "Technologiegläubigkeit", which denotes an almost religious faith in the potential of technology, and Szydagis exhibits it here. (As did everyone who refused to think about long-term storage of nuclear waste.)

He also demonstrates his unfamiliarity with relativity and physics in general.

Kinetic energy is described by K = ½·m·v², with m = mass and v = speed.

If you have a constant energy output, the fact that v is squared means that the acceleration must diminish. (Newton)
If you have constant energy output, the fact that m increases as v approaches c means that the acceleration must diminish. (Einstein)
The fact that c is the universal speed limit means a constant acceleration can't be upheld indefinitely. (Einstein for dummies)

I don't know any technology or physical principle that could be employed to steadily increase the output of an engine for a year.
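The diminishing-acceleration point can be made concrete. If a fixed power P all goes into kinetic energy E = (γ − 1)mc², then dE/dt = P gives a = dv/dt = P/(γ³mv). A toy model ignoring propellant, with illustrative (assumed) numbers of m = 1,000 kg and P = 1 TW:

```python
import math

c = 299_792_458.0   # speed of light, m/s
m = 1_000.0         # kg (assumed payload)
P = 1e12            # W, all converted to kinetic energy (assumed, very generous)

# From dE/dt = P with E = (gamma - 1)*m*c^2:  a = P / (gamma^3 * m * v)
for beta in (0.1, 0.5, 0.9, 0.99):
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    a = P / (gamma**3 * m * beta * c)
    print(f"{beta:.2f}c: a = {a:.2e} m/s^2 ({a / 9.8:.2e} g)")
```

Even at a fantastical constant terawatt, the achievable acceleration collapses by orders of magnitude as v approaches c; holding a steady 1g would require the power itself to grow without bound.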
 
There's a German word called "Technologiegläubigkeit", which denotes an almost religious faith in the potential of technology, and Szydagis exhibits it here. (As did everyone who refused to think about long-term storage of nuclear waste.)
Any belief in "potential" is in deep trouble when it runs up against the stone wall of "actual". As an example, when I was in school the mantra was that the world was not overpopulated because there was enough protein in the seas to feed many more people. Half a century later, with global warming, a number of major oil spills, and sea life population collapses due to overfishing (plus the realization that supply chain issues are going to limit the whole process) that "potential" is looking smaller and smaller.
 
I guess I have more confidence in man's ability to achieve than you do.
"The difficult we accomplish at once, the impossible takes a little longer." ^_^

And with a nod to Clarke's Law...

I suspect we have a similar optimistic view of human (or, possibly, alien) ability to achieve things that are possible. But we may not see eye to eye in what we think is possible. Which is fine.
 
I suspect we have a similar optimistic view of human (or, possibly, alien) ability to achieve things that are possible. But we may not see eye to eye in what we think is possible.
When Jules Verne wrote his journey to the moon, the French thought: Il n'est pas impossible que le voyage soit possible, mais il est possible qu'il soit impossible. It's not impossible that it's possible, but it's possible that it's impossible.

That's the kind of guarded optimism I believe we need.
 
Szydagis is sloppy and vague but so are we on this thread.

He fails to demonstrate how an acceleration of a craft at 1g for a year is within the realm of physical possibility given sufficient resources, superb intelligence and time at the disposal of an alien civilization. He merely claims it is.

However, the contributions to this thread thus far fail to demonstrate such a feat to be physics-defying even in theory. They merely claim it is.

Thus far it's a draw. Everyone's indulging in pure speculation.
 
When it comes to hypothetical alien civilizations, both the Drake Equation, and the Fermi Paradox that critiques it, are highly speculative. A 'believer' shouldn't be overly enamoured of the former, just as a 'skeptic' shouldn't read too much into the latter.

Given the vast number of known variables as well as an unknown number of unknown variables to consider for such a calculation, there is no scientifically viable, let alone rigorous, mathematical model to produce a reliable probability value for alien communication or visitation.

If all sides to the argument wish to be pedantically scientific and unbiased, and as much as I am personally emotionally averse to admitting it, I must also accept, on the basis of the foregoing paragraph, that there is currently no sound mathematical model that can reliably demonstrate that undetected alien visitations are unlikely.

However, the available consistently sketchy UAP evidence which fails to prove anything extraordinary casts serious doubt on any hypothesis whereby such visitations, if they have occurred, have ever been detected.
 
However, the contributions to this thread thus far fail to demonstrate such a feat to be physics-defying even in theory. They merely claim it is.
Are you not concerned about collisions with stuff as we approach near c inside the galaxy?

[... or should I say, do you need us to do the calculations? ...]
 
Thus far it's a draw. Everyone's indulging in pure speculation.
Except me. ;)

I haven't said what Dr Szydagis proposed is possible or impossible because I don't know. I'm not a physicist or, God forbid, a propulsion engineer.

My original response to @Ann K was not intended to defend Dr Szydagis or his "postulated" concept. Rather I explained the real world systems engineering process and how it sometimes proceeds even given technology availability/maturation issues for some aspect of the design. This is colloquially referred to as "betting on the come," an expression we borrowed from the gambling community. Sometimes it pays off, sometimes it doesn't.

It's usually more of a question of whether someone is willing to put up funding to take the chance the needed technology develops/matures. I don't think Dr Szydagis will have to worry about finding a funding source anytime soon.
 
What prevents me from accelerating an object to near light speed in space?

You can accelerate near to light speed, but the nearer you get the more difficult it will be. The kinetic energy is K = (γ − 1)mc², with γ = 1/√(1 − v²/c²); clearly this expression approaches infinity as the speed approaches light speed, and you cannot supply an infinite amount of energy.
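Tabulating the divergence per kilogram, K = (γ − 1)c², makes the wall obvious:

```python
import math

c = 299_792_458.0  # speed of light, m/s

# Relativistic kinetic energy per kilogram diverges as v -> c
for beta in (0.9, 0.99, 0.999, 0.9999):
    gamma = 1.0 / math.sqrt(1.0 - beta**2)
    print(f"{beta}c: {(gamma - 1.0) * c**2:.2e} J/kg")
```

Each extra "9" multiplies γ by roughly three, with no upper bound: no finite energy budget reaches c.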


However, the contributions to this thread thus far fail to demonstrate such a feat to be physics-defying even in theory. They merely claim it is.

Thus far it's a draw. Everyone's indulging in pure speculation.
[... @LilWabbit In fact, Mendel already effectively made this argument ...]
Kinetic energy is described by K = ½·m·v², with m = mass and v = speed.

If you have a constant energy output, the fact that v is squared means that the acceleration must diminish. (Newton)
If you have constant energy output, the fact that m increases as v approaches c means that the acceleration must diminish. (Einstein)
The fact that c is the universal speed limit means a constant acceleration can't be upheld indefinitely. (Einstein for dummies)
[... and it makes me so sad, as does the collision problem at even lower %c ...]
 
The problem of relativistic collisions with small objects is theoretically solved. The navigational deflectors in Star Trek are one of their less fantastical aspects: an electron beam directed ahead of a craft can charge and repel particles and small objects out of the way. When I say "theoretically" solved, the energy requirements are way beyond what we can reasonably put in space, but the needed technology is a lot more feasible than continuous acceleration up to relativistic speeds.

Alcubierre drive always comes up in this kind of discussion. It gets around all the relativistic issues by bringing the local frame of reference with the vessel in transit and being a causality horizon - nothing outside can interact with anything inside and vice versa. It also requires tachyonic exotic matter and an impossibly powerful electric field. Exotic matter is consistent with the standard model but has never been observed and if it can exist can't be created by any interaction with mundane particles. So the old saying about dragon soup - we've got everything but the dragon, but that doesn't mean we're 90% of the way there. As usual, when breaking the universe with a hypothetical, you've always got to skip an impossible step somewhere along the line.
 
The problem of relativistic collisions with small objects is theoretically solved. The navigational deflectors in Star Trek are one of their less fantastical aspects: an electron beam directed ahead of a craft can charge and repel particles and small objects out of the way. When I say "theoretically" solved, the energy requirements are way beyond what we can reasonably put in space, but the needed technology is a lot more feasible than continuous acceleration up to relativistic speeds.
I think you are talking sci-fi geek bunk.

How far ahead does this "electron beam" get of a ship travelling at 99.9%c?

[... and where does the e=mc^2 go as the electrons defeat the incoming stuff ? ...]
Alcubierre drive always comes up in this kind of discussion. It gets around all the relativistic issues by bringing the local frame of reference with the vessel in transit and being a causality horizon - nothing outside can interact with anything inside and vice versa. It also requires tachyonic exotic matter and an impossibly powerful electric field. Exotic matter is consistent with the standard model but has never been observed and if it can exist can't be created by any interaction with mundane particles. So the old saying about dragon soup - we've got everything but the dragon, but that doesn't mean we're 90% of the way there. As usual, when breaking the universe with a hypothetical, you've always got to skip an impossible step somewhere along the line.
This is outside the scope of the OP, since the OP specifically concerns accelerating an object to near c.
 
Szydagis is sloppy and vague but so are we on this thread.

What is everyone supposed to talk about? It's a response to Szydagis' claim that Interstellar Travel is not that hard, but then he says (bold by me):

Consider a craft with 1g of acceleration (9.8 m/s²) as well achievable with our current tech, just not sustainable long term due to fuel requirements. Putting the fuel issue aside, we could do this today without killing the occupant.

The next naysayer argument is about how hard this would be and the fuel it would take. I am sorry, but that is a question of very clever engineering, not of new physics, and engineers are renowned for finding clever loopholes within the “known laws” of physics.
Content from External Source
So, we can do this, but we can't take enough fuel so we can't do this, but never mind that, let's just pretend we can.

If he's going to pretend that the fuel issue is just a solvable engineering challenge, then why not pretend a Star Trek warp drive is just a solvable engineering challenge.

And if Mendel is right above that the whole idea of accelerating for a year up to .99c isn't possible, then it's just more pretend.

It's a poorly written article full of several Straw Men, wild speculation and an assumption that lots of old, often explained, UFO cases add up to something much bigger than they do.
 
I think you are talking sci-fi geek bunk.

How far ahead does this "electron beam" get ahead of the ship travelling at at 99.9%c?
It's not geek bunk. A system based on this principle was tested on the Space Shuttle near the end of its life and achieved measurable deflection of cosmic rays and charged particles passing through the South Atlantic Anomaly. It was a forward-looking project on both the shuttle and the ISS that tested possible Mars tech; the hope was to reduce the need for both micrometeoroid and radiation shielding on long-duration deep-space missions, but it's unlikely it could meet the weight requirements vs. traditional shielding.

As for distance, at relativistic speeds it has to go very far (and the math on how far gets weirder the faster you're going because linear distance is a complex question when you need to measure across multiple frames of reference), however in a vacuum that isn't particularly difficult. Electrons and protons don't lose energy over distance unless they interact with something (which in this case is the point).
 
Space travel is prohibitive even without the light speed limit.

Say we just want to send a probe to Proxima Centauri (the nearest star, let's say 4.5 light years away) at 0.1c (one tenth of the light speed, where relativistic effects are weak, a correction of 1% in respect to Newtonian physics). The one-way trip will take about 45 years, but let's say we are willing to wait (the probe signals will take 4.5 more years to get back to Earth).

We have also developed the ability to create and store antimatter in bulk, and we have built a perfect anti-matter drive which converts all the matter - anti-matter annihilation energy into propulsive power.

This is the energy we need:

[attached image: energy calculation]
http://gregsspacecalculations.blogspot.com/p/blog-page.html

To send one kilogram to a one way trip to Proxima Centauri, even taking 45 years, even using the most energy-dense fuel which can be theoretically conceived (just 200g of fuel for each kilogram of payload! compare that to the current state of our technology...), even using the fuel with 100% efficiency, we need 4.77 megatons of energy!

One megaton is 1.162 TWh (terawatt-hours, i.e. billions of kWh).

So we need 1.16222 * 4.7735 = about 5.55 TWh for each kilogram of our probe. An 800kg probe would need 4440 TWh, or 4440 billions of kWh, slightly more than the annual electricity generation of the US in 2021:

[attached image: U.S. annual electricity generation, EIA]
https://www.eia.gov/tools/faqs/faq.php?id=427&t=3


And after all this, having already made humongously giant leaps beyond what our current technology can do, all we will get will just be 800kg of hardware orbiting Proxima Centauri, and able to send us back images and other data (at quite a slow speed). Compare this with what 'UAP's are supposed to be able to do...
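The arithmetic in this post can be cross-checked (the attached screenshots didn't survive, so the ~0.22 kg of annihilated fuel per kg of payload is inferred here from the stated 4.7735 Mt via E = mc²):

```python
MT_TNT_J = 4.184e15   # joules per megaton of TNT
TWH_J = 3.6e15        # joules per terawatt-hour
c = 299_792_458.0     # speed of light, m/s

mt_per_kg = 4.7735                   # stated energy cost, Mt per kg of payload
e_per_kg = mt_per_kg * MT_TNT_J      # joules per kg of payload

print(f"annihilated fuel: {e_per_kg / c**2 * 1000:.0f} g per kg of payload")
print(f"energy: {e_per_kg / TWH_J:.2f} TWh per kg of payload")
print(f"800 kg probe: {800 * e_per_kg / TWH_J:.0f} TWh total")
```

This reproduces the ~5.55 TWh/kg and ~4,440 TWh figures in the text, and the implied fuel load is consistent with the quoted "just 200g of fuel for each kilogram of payload."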


Note: parameters used for the calculations
[attached images: calculation parameters]

Edit: do you want to reach a speed of 0.9c and dramatically shorten the one-way trip to a little more than 5 years? Be prepared to use

[attached image: energy requirement at 0.9c]
!
 

It's not geek bunk. A system based on this principle was tested on the Space Shuttle near the end of its life and achieved measurable deflection of cosmic rays and charged particles passing through the South Atlantic Anomaly. It was a forward-looking project on both the shuttle and the ISS that tested possible Mars tech; the hope was to reduce the need for both micrometeoroid and radiation shielding on long-duration deep-space missions, but it's unlikely it could meet the weight requirements vs. traditional shielding.
At near c? Reference needed.
As for distance, at relativistic speeds it has to go very far (and the math on how far gets weirder the faster you're going because linear distance is a complex question when you need to measure across multiple frames of reference), however in a vacuum that isn't particularly difficult. Electrons and protons don't lose energy over distance unless they interact with something (which in this case is the point).
That makes no sense to me.

The claim that we can accelerate to near c is debunked. If you disagree, why not make a StackOverflow account and post your answer there :rolleyes: ?
 
It's often more of a question of whether someone is willing to put up funding to take the chance the needed technology develops/matures. I don't think Dr Szydagis will have to worry about finding a funding source anytime soon.

Sorry folks, but can I make a side note on here?

Very good point, often overlooked when it comes to funding highly risky scientific projects like the search for probes/drones/whatever from other civilizations in outer space. Yet this brings to mind the surprisingly seldom-criticised billions of dollars invested in the search for Dark Matter over the past several decades. The most recent attempt used the Large Hadron Collider, built at a cost of 10 billion dollars, to find the lightest supersymmetric particle as a candidate for dark matter, and people said, "you know, it will be found... it's around the corner." Then the LHC didn't find it! Okay, so there is no supersymmetry at the natural set of parameters, and no Dark Matter particle has been discovered by the scientific community so far. What is worse, even if they one day find that dark matter is supersymmetric particles, it will have ZERO impact on our daily lives.

At least this unending quest for what is now deemed the "elephant in the universe" (dark matter/energy) has in fact made scientists realise (or at least strongly suspect) there might be something definitely wrong with the current theories about gravity or with the current paradigms of cosmology.

Therefore, this fuss here about all the current scientific hypotheses that thus far only "scratch the surface" of the actual possibilities for "super fast" interstellar travel should be approached much more humbly by everyone, IMO.
 
Are you not concerned about collisions with stuff as we approach near c inside the galaxy?

[... or should I say, do you need us to do the calculations? ...]

Demonstrating (1) a thing being technologically very difficult to achieve given our understanding of physics (which some have attempted on this thread) is not equivalent to a demonstration of (2) a thing being physically impossible. In our discussion these two lines of analysis are being epistemologically confused where some debunkers discuss physical possibility and others physical difficulty/technological feasibility. They're related but different types of analysis and both are part and parcel of science proper. Neither is bunk per se.

For something to be physically impossible it must violate or circumvent known laws of physics, facing obstacles that are insurmountable even in theory. Such an impossibility has not been proven, not by you nor by others, not with regard to collisions nor with the energy requirements for the acceleration of a craft at 1g for a year. It's merely been pointed out that such a feat would be extraordinarily difficult to achieve in practice. Most sensible interlocutors have no qualms with the latter statement.

The main theoretical flaw with many alien hypotheses is not the difficulty of a particular speculative technological feat, but rather its physical impossibility -- such as crafts being so advanced that they can manipulate space-time, travel through 'dimensions' and veritably negate gravity. This has to do with the fact that while our understanding of the universe remains imperfect and even prone to error, we nonetheless already know a great deal about spacetime, gravity and certain basic laws and constants with a high degree of confidence owing to amazingly accurate and repeatable predictions and measurement outcomes.

The alien hypotheses often selectively ignore highly predictive and validated physical laws for convenience, while accepting others, as there is no choice but to acknowledge at least the minimal physical properties featured in the footage they claim as evidence (radar returns, heat signatures, light properties, physical motion, etc). Such argumentation is not scientific, nor is it intellectually honest.

A fair-minded person quickly sees that such argumentation seeks to modify reality to fit a fancy. To his credit, Dr. Szydagis sidesteps this landmine by restricting his argument to stating that interstellar travel is physically possible. However, he seems over-confident that it is technologically feasible and plays down the difficulties of making such travel a reality. Shrugging off the fuel dilemma is a case in point.
 
The main theoretical flaw with many alien hypotheses is not the difficulty of a particular speculative technological feat, but rather its physical impossibility -- such as crafts being so advanced that they can manipulate space-time, travel through 'dimensions' and veritably negate gravity. This has to do with the fact that while our understanding of the universe remains imperfect and even prone to error, we nonetheless already know a great deal about spacetime, gravity and certain basic laws and constants with a high degree of confidence owing to amazingly accurate and repeatable predictions and measurement outcomes.
Even with "certain basic laws and constants with a high degree of confidence owing to amazingly accurate and repeatable predictions and measurement outcomes" it is hard to predict what can and cannot exist in the future.

Newton's laws of gravity were amazingly accurate yet unable to predict things like black holes and other quirky behavior of spacetime that we now routinely account for in our satnav systems.
Maxwell's laws were highly accurate as well, but they were unable to predict the possibility of quantum computers.

It's virtually impossible to look thousands of years ahead and predict what our understanding of, for instance, dark energy will bring. Who knows, we might yet discover something similar to negative mass which enables us to build something akin to the Alcubierre Drive. Maybe we could push those colliding particles away with a repulsive field based on the same stuff that lies at the root of dark energy.

I don't think we can put meaningful limits to speculation about what lies thousands of years ahead, not even with our current understanding of physics no matter how accurate our present day predictions are.
 
Even with "certain basic laws and constants with a high degree of confidence owing to amazingly accurate and repeatable predictions and measurement outcomes" it is hard to predict what can and cannot exist in the future.

We see eye to eye on your overall philosophical point of intellectual humility towards the unknown being essential in all scientific pursuit.

Newton's laws of gravity were amazingly accurate yet unable to predict things like black holes and other quirky behavior of spacetime that we now routinely account for in our satnav systems.

And yet the Einsteinian paradigm and redefinition of mass has never rendered Newtonian predictions invalid at lower velocities.

It's virtually impossible to look thousands of years ahead and predict what our understanding of, for instance, dark energy will bring. Who knows, we might yet discover something similar to negative mass which enables us to build something akin to the Alcubierre Drive. Maybe we could push those colliding particles away with a repulsive field based on the same stuff that lies at the root of dark energy.

Indeed, science is always open to testable speculations and allowing such speculation a lot of imaginative latitude. Pseudo-science, however, is often flippantly ready to compromise established science in order to promote a particular and often untestable speculation. And trying to appear scientific while doing so.

Some of the current dark energy speculations tweak already observed facts (about the expansion of the universe) and adjust highly successful theories (in their ability to predict the behaviour, including the expansion, of the known universe), such as Einstein's relativity, to fit the idea of alternate gravity. What makes many a UFO enthusiast drawn to these speculations is the opportunity they see in them to prove anti-gravity propulsion systems.

Farnes' theory (he is the author of a study often cited by 'anti-gravity propulsionists') has been criticized by peers not because it's imaginative and fascinating (which are both more than welcome characteristics of scientific hypothesization), but because it essentially postulates something extremely speculative: a negative-mass dark fluid that self-creates in a universe that expands at different rates in different directions. The property of self-creation already enters the arena of philosophical metaphysics. It's unfalsifiable (untestable) but, by the same token, also unable to predict any testable measurement outcomes, since every possible measurement outcome in every possible universe can be claimed to fit such broad philosophical strokes.

It is not science to selectively tweak established science and engage in philosophical speculation in the name of science. It is not science to alter highly successful theories to the convenient extent they do not contradict one's preferred science fiction theory whilst rendering these successful theories, in the process, less successful in their predictive power.

Scientific exploration should of course be totally open to exploring the idea of anti-gravity, to questioning Einstein's relativity, and to tweaking established calculations on the expansion of the universe as long as the challenger of existing successful paradigms undertakes the rigorous scientific task of producing a testable rival theory with greater predictive power whilst able to account for all the measurement outcomes of earlier successful theories (i.e. the principle of empirical adequacy). That's not what's happening with dark energy speculations as yet. Could it happen in the future? Sure, why not. We don't really know.

I don't think we can put meaningful limits to speculation about what lies thousands of years ahead, not even with our current understanding of physics no matter how accurate our present day predictions are.

If we wish to stay within the domain of science, we can put a meaningful limit to speculation which is that of testability and provision of better predictions of measurement outcomes than the existing powerfully predictive hypotheses. Just like what Einstein did to Newton. But when we begin to tweak observed facts to fit our theory, we're transgressing the reasonable bounds of scientific speculation into the realm of unreasonable.
 
I don't think we can put meaningful limits to speculation about what lies thousands of years ahead, not even with our current understanding of physics no matter how accurate our present day predictions are.

But we have already learned from history that the more science progresses the more limits to what can be done are found, not the contrary. Science expands the technological possibilities, but in doing so it also puts fundamental constraints on what can possibly be done, which were previously unknown:

- In the 1600s-1700s Newton's laws showed that an object cannot simply be moved magically: one needs to apply a force. He also showed that if applying an external force is impossible, an object can still move, provided it can expel a part of its mass. Newton's laws brought on great advancements in mechanics (and, much later, the jet engine), but they also gave us fundamental limits (which are already very much pertinent to the impossibility of space travel).
- In the 1700s-1800s thermodynamics first showed 'free energy' is impossible, and then it showed 'unlimited energy' is also impossible. We gained great technological progress, 'engines' and the industrial revolution, but it also gave us two very, very nasty fundamental limits.
- In 1903 Tsiolkovsky found the rocket equation. This allowed us, in the end, to develop rockets and go to the Moon, but unfortunately it also showed the energy requirements for achieving a speed high enough to go to even the nearest star are monstrously high, even with Newtonian physics (see post #22). [notice: this gets rid of one of the two methods which Newton's laws allow, reaction propulsion. The other method, applying a force, is what solar sails or spacecraft 'pushed' by lasers would use. There are fundamental limits here too, unfortunately. For completeness, fusion ramjets are an interesting twist, but no more viable than the other alternatives].
- At the beginning of the 1900s quantum physics showed us how to get orders of magnitude more energy than we thought possible before, great! The results, after 100 years, are admittedly mixed: we gained the capability of destroying ourselves through nuclear war, nuclear power plants, and who knows, maybe, in the future, even fusion power plants. We also learned that, whatever we do, we shall never be able to extract more than m*c^2 of energy from anything. We even discovered how to do that, with anti-matter (the fuel used in post #22). Yet even if we had the capability to produce and store anti-matter with high efficiency (which is not easy, to say the least), added some further orders of magnitude of technological improvement in different fields, and were willing to invest some years of the energy output of the whole planet, we could reach the nearest star in about 9 years with a spacecraft resembling something like this (yeah, it's about 600 kilometers long... hard to hide in the LIZ...):

[attached image: concept drawing of the ~600 km anti-matter spacecraft from the paper linked below]

https://web.archive.org/web/2015050...gov/dspace/bitstream/2014/38278/1/03-1942.pdf
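The scale of the problem the rocket equation imposes can be sketched in a few lines of Python. This is only an illustration; the exhaust velocities below are round numbers of my own choosing, not figures from the linked paper:

```python
import math

C = 2.998e8          # speed of light, m/s
DELTA_V = 0.1 * C    # a modest interstellar cruise speed: 10% of c

# Tsiolkovsky: m0/m1 = exp(delta_v / v_e).  Computed as log10 to
# avoid overflow, since the chemical-rocket ratio is astronomical.
for label, v_e in [("chemical rocket (~4.5 km/s exhaust)", 4.5e3),
                   ("ion thruster    (~50 km/s exhaust)", 5.0e4),
                   ("0.1c exhaust    (hypothetical)", 3.0e7)]:
    log10_ratio = DELTA_V / v_e / math.log(10)
    print(f"{label}: initial/final mass ratio ~ 10^{log10_ratio:.0f}")
```

A chemical rocket would need a mass ratio of roughly 10^2900 just to reach 0.1c, with nothing left over to slow down again; even a hypothetical engine exhausting at 0.1c still needs a mass ratio of about e per 0.1c of delta-v.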


What lies thousands of years ahead? The Alcubierre drive? Oh my, the light-speed limit is there to stay... don't worry about that. Breaking it, in whichever way, is the same as going back in time, or seeing an effect before its cause. As the old joke goes: "Splash! A tachyon enters a coffee". Good luck.

New physics? Yeah, but, any new exciting physics will be (as happened with Einstein vs. Galileo and Newton, for instance) a correction in some extreme conditions. It might open up great technological avenues, and it will be quite satisfying to understand dark matter and dark energy, but this will not remove the limits we are already aware of (it will add more if anything, as history shows).
 
Often these types of future speculation run up against causality, where the nature of cause and effect breaks down, and that has implications that seem irreconcilable with reality.

It is listed as one of the issues with the speculative Alcubierre drive

https://en.wikipedia.org/wiki/Alcubierre_drive#Causality_violation_and_semiclassical_instability

Calculations by physicist Allen Everett show that warp bubbles could be used to create closed timelike curves in general relativity, meaning that the theory predicts that they could be used for backwards time travel.[43] While it is possible that the fundamental laws of physics might allow closed timelike curves, the chronology protection conjecture hypothesizes that in all cases where the classical theory of general relativity allows them, quantum effects would intervene to eliminate the possibility, making these spacetimes impossible to realize. A possible type of effect that would accomplish this is a buildup of vacuum fluctuations on the border of the region of spacetime where time travel would first become possible, causing the energy density to become high enough to destroy the system that would otherwise become a time machine. Some results in semiclassical gravity appear to support the conjecture, including a calculation dealing specifically with quantum effects in warp-drive spacetimes that suggested that warp bubbles would be semiclassically unstable,[4][44] but ultimately the conjecture can only be decided by a full theory of quantum gravity.[45]

Alcubierre briefly discusses some of these issues in a series of lecture slides posted online,[46] where he writes: "beware: in relativity, any method to travel faster than light can in principle be used to travel back in time (a time machine)". In the next slide, he brings up the chronology protection conjecture and writes: "The conjecture has not been proven (it wouldn't be a conjecture if it had), but there are good arguments in its favor based on quantum field theory. The conjecture does not prohibit faster-than-light travel. It just states that if a method to travel faster than light exists, and one tries to use it to build a time machine, something will go wrong: the energy accumulated will explode, or it will create a black hole."
 
The property of self-creation already enters into the arena of philosophical metaphysics.

I don't know which hypothesis you're addressing, nor precisely what the property of self-creation you referred to is, nor how it's supposed to function. But speaking strictly in scientific terms, evidence has already been found for the Schwinger effect in graphene (in a nutshell, creating matter out of 'nothing' with strong electric fields).
Other scientific evidence of something out of nothing? Quantum fluctuations in the vacuum -- in a nutshell, according to quantum mechanics, a vacuum isn't empty at all. It's actually filled with quantum energy and particles that blink in and out of existence for a fleeting moment - strange signals known as quantum fluctuations.
 
I don't know which hypothesis you're addressing, nor precisely what the property of self-creation you referred to is, nor how it's supposed to function. But speaking strictly in scientific terms, evidence has already been found for the Schwinger effect in graphene (in a nutshell, creating matter out of 'nothing' with strong electric fields). Other scientific evidence of something out of nothing? Quantum fluctuations in the vacuum -- in a nutshell, according to quantum mechanics, a vacuum isn't empty at all. It's actually filled with quantum energy and particles that blink in and out of existence for a fleeting moment - strange signals known as quantum fluctuations.

Vacuum is not nothing. It's not the same as non-existence in philosophical ontology. This is a common misconception.
 
the fact that m increases as v approaches c
The idea that mass varies with speed is the concept of "relativistic mass". Scientists who deal with relativistic speeds generally shun it, as it adds more confusion than explanatory aid, and prefer to keep the Lorentz factor gamma=(1-(v/c)^2)^(-1/2) explicit in all the equations.

? gamma=subst((1-x^2)^(-1/2),x, (v/c))
...
? gamma*m*c^2
m*c^2 + 1/2*m*v^2 + 3*m/(8*c^2)*v^4 + 5*m/(16*c^4)*v^6 + O(v^8)

gamma*m*c^2 = the total mass-energy
m*c^2 = the famous and equally misunderstood mass-energy equivalence (rest energy) term
1/2*m*v^2 = the Newtonian kinetic-energy term
remaining terms in v = the error in that Newtonian KE term
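The same expansion can be checked numerically without a computer-algebra system; a quick Python sketch (my own, using an arbitrary 1 kg mass) shows that at 1% of c the terms beyond v^4 are already negligible:

```python
m = 1.0        # kg, arbitrary test mass
c = 2.998e8    # speed of light, m/s

v = 0.01 * c   # 1% of light speed
gamma = (1.0 - (v / c) ** 2) ** -0.5

exact = gamma * m * c**2                                        # full mass-energy
approx = m * c**2 + 0.5 * m * v**2 + (3 / 8) * m * v**4 / c**2  # series through v^4

# the truncated series already agrees to better than one part in 10^10
print(f"relative error of truncated series: {abs(exact - approx) / exact:.1e}")
```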

This seems to have been put behind a paywall, but I'm pretty sure I found it a fun read: https://aapt.scitation.org/doi/10.1119/1.3204111
Einstein Never Approved of Relativistic Mass

The Physics Teacher 47, 336 (2009); https://doi.org/10.1119/1.3204111
Eugene Hecht

ABSTRACT
During much of the 20th century it was widely believed that one of the significant insights of special relativity was “relativistic mass.” Today there are two schools on that issue: the traditional view that embraces speed-dependent “relativistic mass,” and the more modern position that rejects it, maintaining that there is only one mass and it's speed-independent. This paper explores the history of “relativistic mass,” emphasizing Einstein's public role and private thoughts. We show how the concept of speed-dependent mass mistakenly evolved out of a tangle of ideas despite Einstein's prescient reluctance. Along the way there will be previously unrevealed surprises (e.g., Einstein never derived the expression for “relativistic mass,” and privately disapproved of it).
Content from External Source
Also check his annotation of the references; he's clearly showboating.

But the easier take without all the history is, as so often, Don Lincoln:
Source: https://www.youtube.com/watch?v=LTJauaefTZM


Further reading: https://profmattstrassler.com/artic...o-definitions-of-mass-and-why-i-use-only-one/
 
Big difference between safe and comfortable.

I guess I have more confidence in man's ability to achieve than you do. In 1900, how many would have considered putting men on the moon nothing more than a "magical wishing challenge," only to see it accomplished in less than 70 years?

There's a difference between an exponential curve and a logistic curve even though at small values they are indistinguishable.

A logistic model of scientific advance is scientifically supportable, even if somewhat pessimistic, but an exponential model isn't - the need for new inputs will eventually be greater than what the universe can provide (cf. Malthus).

What is the thing that we've done in ~2020 that would be as astounding to someone in 1970 as the moon landings would have been to someone in 1920? It's not a trick question; there's enough pretending to get into other people's heads that the set of answers is inherently going to be fuzzy-edged. I don't think many would object to "AIs beating humans at Go" as an example (I'm pretty sure I saw that predicted as happening "never" more than once - oh, how smug we humans can be!), but most of the technological advances there have simply been in lithography, and the 1970s-2000s were the sweet spot of that technology's advance; there's no reason to think it can continue at the same rate for much longer at all - we're hitting several very real limits of quantum mechanics (both in manufacture and operation).
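The early indistinguishability of the two curves, and their later divergence, is easy to demonstrate. A small sketch (my own illustrative parameters, assuming a carrying capacity K of 1000):

```python
import math

def exponential(t, y0=1.0, r=1.0):
    """Unbounded exponential growth."""
    return y0 * math.exp(r * t)

def logistic(t, y0=1.0, r=1.0, K=1000.0):
    """Solution of dy/dt = r*y*(1 - y/K) with y(0) = y0."""
    e = math.exp(r * t)
    return K * y0 * e / (K + y0 * (e - 1.0))

print(exponential(1.0), logistic(1.0))    # nearly identical early on
print(exponential(20.0), logistic(20.0))  # later: ~4.9e8 vs. saturated near K
```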
 
New physics? Yeah, but, any new exciting physics will be (as happened with Einstein vs. Galileo and Newton, for instance) a correction in some extreme conditions. It might open up great technological avenues, and it will be quite satisfying to understand dark matter and dark energy, but this will not remove the limits we are already aware of (it will add more if anything, as history shows).
Not "new" physics, but adapted theory, or in better words, corrected theory.
 
There's a difference between an exponential curve and a logistic curve even though at small values they are indistinguishable.

A logistic model of scientific advance is scientifically supportable, even if somewhat pessimistic, but an exponential model isn't - the need for new inputs will eventually be greater than what the universe can provide (cf. Malthus).

What is the thing that we've done in ~2020 that would be as astounding to someone in 1970 as the moon landings would have been to someone in 1920? It's not a trick question; there's enough pretending to get into other people's heads that the set of answers is inherently going to be fuzzy-edged. I don't think many would object to "AIs beating humans at Go" as an example (I'm pretty sure I saw that predicted as happening "never" more than once - oh, how smug we humans can be!), but most of the technological advances there have simply been in lithography, and the 1970s-2000s were the sweet spot of that technology's advance; there's no reason to think it can continue at the same rate for much longer at all - we're hitting several very real limits of quantum mechanics (both in manufacture and operation).
I don't know....the internet and the various means of accessing it? As a teenage "someone" in 1970, I would have found it "astounding" to know there would be technology in my life time that allowed me to do what I'm doing right now.

Any projections you or I care to make about the state of technology in 70 years are pretty meaningless. I know I won't be around to see if you're right or wrong.
 
an electron beam directed ahead of a craft can charge and repel particles and small objects out of the way

If one has a source of infinite spare electrons. Solving that is a larger problem than the original one. I would hope you'd be squirting out protons or something positively charged in the opposite direction too; otherwise you'll build up a terrible case of static.

It may sound mundane and not-very-sci-fi, but all you really need is enough repairable energy-dissipating material on the outside of your craft. Something as simple as honking great tanks of a liquid seems to be a workable solution, given what we currently know about the mass/speed/quantity distribution of stuff in interstellar space. They'll take damage, but the energy will dissipate through the liquid, which itself takes no damage, so the damage that will need to be repaired will be localised to the tank wall itself, which could even possibly be repaired in-situ for most hits. (Remember the suspicious drill-hole in the ISS - you could almost have patched that up with duct tape or a hot-glue gun - really not so sci-fi at all - and this is just a scaled-up version of that.)
 
I don't know....the internet and the various means of accessing it? As a teenage "someone" in 1970, I would have found it "astounding" to know there would be technology in my life time that allowed me to do what I'm doing right now.

If I'm not mistaken, @FatPhil's point is the seeming plateauing of basic research in physics (and mathematics?) in the last 50+ years. In other words, what astounds the end-user today (internet, smartphones, increasingly sophisticated AI in the form of automations, data-analytics and algorithms) is actually based on decades-old fundamental theories and the mere refinement of their early elementary technological applications. It's not fundamentally new science. It's applied research that's thriving today and what the philosopher of science Thomas Kuhn would have labelled as "normal science" -- operating within a particular paradigm that emerged through a "scientific revolution" quite a while back. And which will continue to thrive as applied research until the next paradigm shift takes place through some revolutionary basic research.

However, I also like what you wrote in response:

Any projections you or I care to make about the state of technology in 70 years are pretty meaningless. I know I won't be around to see if you're right or wrong.

Physicists such as Nima Arkani-Hamed (Quantum Field Theory) and philosophers of physics such as David Albert have been talking about a new buzz in theoretical physics that's been picking up in the last few decades, especially at the intersection of QFT and mathematics (geometry). Also the many-worlds interpretation of quantum mechanics has enjoyed some recent traction and inspired arts and entertainment including the Marvel Cinematic Universe and other films.

Nima and his cohorts have been particularly impressed by the applicability to physics of the geometric structures of polyhedra and the more advanced structures of algebraic geometry and number theory as a promising novelty. It has produced applications into twistor string theory that are not merely speculative anymore. Such as the amplituhedron (below) which is an abstract geometrical model of calculating certain particle interactions (namely N = 4 supersymmetric Yang–Mills theory) in a simpler way than using thousands of Feynman diagrams.

[attached image: the amplituhedron]

He sees that these avenues of explanation at the said intersection, if pursued further and seriously, may unravel a more fundamental understanding of the universe from which both spacetime and quantum mechanics could emerge. We obviously don't know if they will or won't. But the buzz seems real enough.

He says in one of his interviews:

"But there's completely clearly now these fantastic new structures which are not speculative, just talking about standard physics, but in a very different way. I'm hopeful that it will go somewhere sort of more generally beyond the very special and simple theories that we're starting to understand this way, to reveal something deeper about the way the nature actually works."

"... would be thrilled if we would find some way of talking about all of standard physics, not just the toy models that we're looking at, but all the standard physics in a way that doesn't put spacetime in and doesn't put quantum mechanics in, and gets the answers out. And ultimately then, if we really understand it, then we'll have the beginning of an understanding, a starting point, from which we can see the emergence of spacetime and quantum mechanics."


Be that as it may, one thing is clear. Rigorous science, whether basic research or applied, is plenty fascinating and mind-boggling as it is. Some of it puts even the most fantastic science fiction to shame in terms of sheer awe and wonder. But it does take more effort and intellectual gymnastics to grasp.
 
If I'm not mistaken, @FatPhil's point is the seeming plateauing of basic research in physics and mathematics. In other words, what astounds the end-user today (internet, smartphones, increasingly sophisticated AI in the form of automations, data-analytics and algorithms) is actually based on decades-old fundamental theories and the mere refinement of their early elementary technological applications. It's not fundamentally new science. It's applied research that's thriving and what the philosopher of science Thomas Kuhn would label as "normal science" within a new paradigm that emerged through a "scientific revolution" quite a while back.

However, I also like what you wrote in response:



Physicists such as Nima Arkani-Hamed (Quantum Field Theory) and philosophers of physics such as David Albert have been talking about a new buzz in theoretical physics that's been picking up in the last few decades, especially at the intersection of QFT and mathematics (geometry). Also the many-worlds interpretation of quantum mechanics has enjoyed some recent traction and inspired arts and entertainment including the Marvel Cinematic Universe and other films.

Nima and his cohorts have been particularly impressed by the applicability to physics of the geometric structures of polyhedra and the more advanced structures of algebraic geometry and number theory as a promising novelty. It has produced applications into twistor string theory that are not merely speculative anymore. Such as the amplituhedron (below) which is an abstract geometrical model of calculating certain particle interactions (namely N = 4 supersymmetric Yang–Mills theory) in a simpler way than using thousands of Feynman diagrams.

[attached image: the amplituhedron]

He sees that these developments, if pursued further and seriously, may unravel a more fundamental understanding of the universe from which both spacetime and quantum mechanics could emerge. We obviously don't know if they will or won't. But the buzz seems real enough.

He says in one of his interviews:

"But there's completely clearly now these fantastic new structures which are not speculative, just talking about standard physics, but in a very different way. I'm hopeful that it will go somewhere sort of more generally beyond the very special and simple theories that we're starting to understand this way, to reveal something deeper about the way the nature actually works."

"... would be thrilled if we would find some way of talking about all of standard physics, not just the toy models that we're looking at, but all the standard physics in a way that doesn't put spacetime in and doesn't put quantum mechanics in, and gets the answers out. And ultimately then, if we really understand it, then we'll have the beginning of an understanding, a starting point, from which we can see the emergence of spacetime and quantum mechanics."


Be that as it may, one thing is clear. Rigorous science, whether basic research or applied, is plenty fascinating and mind-boggling as it is. Some of it puts even the most fantastic science fiction to shame in terms of sheer awe and wonder. But it does take more effort and intellectual gymnastics to grasp.
Yes, I understood the point @FatPhil was making. I was answering his direct, non-trick question about the perceptions of "someone" relative to technological advances of the two time spans he specified. I figured I was as good a someone as the next guy. ;)
 
"Can we accelerate to near c" is debunked. If you disagree, why not make a StackOverflow account and post your answer there :rolleyes: ?

Nothing prevents us from building a particle accelerator in space and accelerating massive objects (such as atomic nuclei) very close to the speed of light.
I'm not sure I understand your question: your Stack Overflow link (which violates the no-click policy) shows a mistaken point of view being corrected, but what you appear to have asked is countered by my response. Any CRT monitor on any space mission was a primitive particle accelerator that would almost certainly have beamed particles at relativistic speed (it depends on the voltage, obviously, but 0.01c is trivial, and 0.1c is possible).
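That claim is easy to check with the relativistic energy relation, taking the electron rest energy as 511 keV. The voltages below are my own illustrative picks: roughly a bench-supply potential, a kilovolt-scale supply, and a typical CRT anode voltage:

```python
import math

M_E_C2_EV = 510_998.95  # electron rest energy in eV

def beta(volts):
    """Speed, as a fraction of c, of an electron accelerated
    from rest through the given potential difference."""
    gamma = 1.0 + volts / M_E_C2_EV   # kinetic energy in eV -> Lorentz factor
    return math.sqrt(1.0 - 1.0 / gamma**2)

for v in (26, 2_600, 25_000):
    print(f"{v:>6} V  ->  {beta(v):.3f} c")
```

About 26 V already gives 0.01c, ~2.6 kV gives 0.1c, and a 25 kV CRT anode pushes electrons to roughly 0.3c.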

Whether a body can accelerate *itself* (in contrast to "an object" above) to relativistic speeds is probably the question you were looking for an answer to, and that's been answered here and on SE.
 
Newton's laws of gravity were amazingly accurate yet unable to predict things like black holes

Au contraire:
If the semi-diameter of a sphere of the same density as the Sun were to exceed that of the Sun in the proportion of 500 to 1, a body falling from an infinite height towards it would have acquired at its surface greater velocity than that of light, and consequently, supposing light to be attracted by the same force in proportion to its vis inertiae, with other bodies, all light emitted from such a body would be made to return towards it by its own proper gravity. This hypothesis assumes that gravity influences light in the same way as massive objects.
Content from External Source
-- J. Michell, Philos. Trans. R. Soc. 74, 35 (1784).
https://royalsocietypublishing.org/doi/10.1098/rstl.1784.0008

Yes, you read that date right.
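Michell's 500:1 figure is easy to reproduce: for a sphere of fixed (solar) density, M grows as R^3, so the Newtonian escape velocity sqrt(2GM/R) grows linearly with R, and you can solve for the radius at which it reaches c. A sketch using modern constants (Michell worked from 18th-century values):

```python
import math

G = 6.674e-11     # gravitational constant, m^3 kg^-1 s^-2
C = 2.998e8       # speed of light, m/s
M_SUN = 1.989e30  # solar mass, kg
R_SUN = 6.957e8   # solar radius, m

v_esc_sun = math.sqrt(2.0 * G * M_SUN / R_SUN)  # ~618 km/s

# At fixed density, escape velocity scales linearly with radius,
# so it reaches c at this multiple of the solar radius:
ratio = C / v_esc_sun
print(f"escape velocity at the Sun's surface: {v_esc_sun / 1e3:.0f} km/s")
print(f"radius ratio for a Newtonian 'dark star': {ratio:.0f} : 1")
```

Modern constants give about 485:1; Michell's 500:1 is the same result to within the precision of his day.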

I'd say that were you to propose to Reverend Michell that these things which let no light escape, were they to exist, should be called "black holes", he'd agree; so I think we've got something that could be called a black hole and is (almost) indistinguishable from a GR black hole. It's "like a black hole", surely?

I say "almost", as SR gave us the property that time dilates as we view an object approaching it, but I'm pretty sure that you can derive SR just from what was known in Newton's time, and therefore the external viewer seeing a slowed imprint of a falling object onto the surface would also be predicted. Spaghettification also follows just from Newton's laws, so I reckon Newton's laws, plus what else was known at the time, predict black holes that are remarkably similar to the real ones.

OK, GR gave us the singularities we know to be inside, and the timeiness/spaceiness reversal behind event horizons, but is not necessary for most of the properties we can see (it's also needed for frame dragging, which also happens outside).
 