“Do I believe, for example, that by using magic I could fly? No. How would you get around gravity? Impossible. Do I believe that I might be able to project my consciousness into a very, very vivid simulation of flying? Yeah. Yes, I’ve done that. Yes, that works.” -Alan Moore

If you possessed a computer with enough power, you could conceivably simulate the entire Universe. From the inception of the Big Bang, you could compute the positions and momenta of every particle and every interaction over time, across all 13.8 billion years. If your simulation was good enough, you could even account for quantum processes and uncertainty, and you’d wind up with planets, life, and even human brains at the end.

But if this were representative of our reality, would there be any way to tell? Maybe computational short-cuts would show up as some sort of fundamental blurriness at small enough scale. And what would that tell us about our quest to understand the fundamental constants, particles and interactions that define our Universe? Would it all be futile? Perhaps there would still be something important to learn about our existence by asking the right fundamental questions through experiments.

I am interested in the justification for this statement about the fundamental quantum limit of 10^-35 meters. This distance is known as the Planck length, and is obtained from the gravitational constant, Planck’s constant, and the speed of light. But the idea that one cannot even in principle explore distances below this length is speculation that has no empirical justification. Even theoretical justification would require a quantum theory of gravity, which we do not have. Saying this is a fundamental distance scale below which we cannot probe is not even remotely as justifiable as saying, for example, that the speed of light is a speed beyond which nothing can go. The latter is on very firm experimental and theoretical ground. The former is pure speculation, as far as I can tell. Ethan, feel free to correct me by sharing your evidence if I am wrong.

If our universe is a simulation, then the computer that runs the simulation is outside of our universe, and the coders who run the computer are as well.

That is the definition of supernatural: above or outside of nature.

“Simulationism” is nothing more or less than the belief in a new type of deity: Coder Gods. This is so transparent and so obvious, that it’s shocking to see the idea taken seriously by people with otherwise-impeccable scientific credentials.

The 1960s and 1970s saw the rise of a number of religious cults that attracted prominent supporters: the Unification Church (Moonies) and Scientology being the most well-known. To this day, numerous high-profile figures in Hollywood are committed members of Scientology. But rational people laugh at the idea that Sun Myung Moon was the reincarnation of Jesus, or that their souls are infested with Thetans that have to be purged through Auditing and similar rituals.

The 1990s through the present era have seen the rise of a new crop of cults, all of them based on technology. These include the Extropians, the Transhumanists, adherents of The Singularity, and now adherents of The Simulation. As with Scientology, they include prominent figures in high places, giving them a kind of respectability that is denied to the hoi-polloi of technology cultists, such as those who pray for ETs to swoop down and rescue humanity from its foibles (that, specifically, is a type of “cargo cult”).

Today’s technology cults have their deities and their hereafters: supernatural coder-gods, and AI gods in silicon boxes; Alcor freezers and nanotech immortality, “upload” of the soul to eternal life in the computer, and so on.

But these are half-assed religions in that they utterly lack moral frameworks to limit the excesses of their true believers. If anything, there is a close resemblance to the “prosperity theology” or “prosperity gospel” of certain parts of the Religious Right, with an attendant social Darwinism that self-justifies anything the true believers do.

The technology cults are engaged in the fundamental dishonesty of offering religion in the guise of science. When Deepak Chopra plays this game we laugh; but when Ray Kurzweil and Nick Bostrom do likewise, what then? Take them seriously, because they sell salvation in a computer rather than in a “healing-energy crystal”?

If Kurzweil, Bostrom, Drexler, et al., were honest, they would call their endeavors religion and open up new churches. We should have no problem with that, as it would be an honest exercise in freedom of religion. But we should object strenuously when they try to sell it as science, and we should not help them do that by treating it as such.

Yes, the universe could be a simulation. That said, this speculation offers no further insight whatsoever into figuring out how nature works, and is really just a long-winded way of throwing in the towel on the concept of science altogether.

Newton believed the universe was created, and yet he still wished to understand how it worked through the relationships of what could be observed. Follow his lead and try not to wig out. If you wish to read a book about what happens when you take the whole ‘universe as a simulation’ speculation seriously, check out “Permutation City”. It’s an exercise in infinite regression into the ludicrous.

I think that the reason our universe's fundamental laws are what we have measured with today's technology (for example the speed of light, which is less than 300,000 kilometers per second, and all the other physical laws) is the present fundamental laws themselves. In other words, if we had different fundamental laws from the ones we have now, we would have another form of universe, so that, for example, the speed of light or the speed of inflation might be more or less than what our universe has at present. Another example: if we want to have two cities with only one entrance way and 1000 departure ways

“I am interested in the justification for this statement about the fundamental quantum limit of 10^-35 meters.”

You can’t go faster than light, and you can’t tell time to an arbitrarily fine limit, and from those two basic axioms, plus Heisenberg, you get the Planck length.

We have empirical evidence of the irreducible accuracy of time, mass, energy and location, and the consequence of those evidenced facts is the Planck length.

It really is as simple as that.

“But these are half-assed religions in that they utterly lack moral frameworks to limit the excesses of their true believers.”

Be fair: none of the “real” religions have moral frameworks to limit the excesses of their true believers either.

They may CLAIM to have them, but

a) they’ve never limited the excesses of “true believers”

b) they’ve been arbitrarily changed, obeyed, re-interpreted or removed by society as it changes over time or from place to place

ketchup,

Just to expand a bit on Wow’s response to you. Quantum mechanics tells us that there are pairs of observables that cannot be measured in any system to infinitely good precision simultaneously. These are called complementary observables. The most well known example is probably position and momentum. You cannot, as an inherent property of the universe, measure the position of a particle and its momentum to arbitrary precision. There is always an uncertainty involved with each measurement, and the product of the two uncertainties must be at least half the reduced Planck constant (ħ/2). This is a VERY small value, so we don’t notice this uncertainty in everyday life, but it becomes important at small scales.
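To see just how small this value is, here is a short Python sketch. The atomic-scale position uncertainty is an illustrative assumption, not anything from the thread; the constants are the standard CODATA values:

```python
# Heisenberg uncertainty relation: dx * dp >= hbar / 2
hbar = 1.054571817e-34   # reduced Planck constant, J*s (CODATA)
m_e = 9.1093837015e-31   # electron mass, kg (CODATA)

dx = 1e-10               # position uncertainty ~ one atomic radius, m (illustrative)
dp_min = hbar / (2 * dx) # smallest allowed momentum uncertainty
dv_min = dp_min / m_e    # corresponding velocity uncertainty for an electron

print(f"dp >= {dp_min:.2e} kg*m/s")  # ~5.3e-25 kg*m/s
print(f"dv >= {dv_min:.2e} m/s")     # ~5.8e5 m/s
```

Confining an electron to a region the size of an atom already forces a velocity uncertainty of hundreds of kilometers per second, which is why the effect only shows up at small scales.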

Now, suppose we want to determine the shortest possible time we can measure. We need a very accurate clock. Obviously, we want the uncertainty of our time measurement to not exceed the time we wish to measure. Quantum mechanics tells us that time and energy are another pair of complementary observables. This places a limit on how accurately we can measure times. We can reduce this limit, though, by increasing the uncertainty in the energy of our clock, and therefore the total energy of it. This, however, is where general relativity comes in. To read our clock, our clock must not be so high in energy that it becomes a black hole. This places an upper limit on the energy of our clock, thereby placing a limit on the uncertainty of our time measurement. We therefore have a minimum time that we can measure. This is because it would make no sense to say that the duration we measure is 0.01 +/- 1 Planck times. It would only make sense to say that 1 Planck time is the smallest time we can measure.

Now, the speed of light, as you accepted above, is the fastest speed we can measure. If we want to measure a distance, the way to do so is to measure the distance travelled by a light beam in a certain time. Remember, though, that there is a shortest time we can measure. This implies that there is also a shortest distance we can measure. We can only measure how far our light beam travels in 1 Planck time. We cannot measure how far it travels in less time than this since we cannot measure times less than the Planck time. Therefore, we have c * T(Planck) = d(Planck), or the Planck distance, which is the shortest measurable distance.
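The chain of reasoning above can be checked numerically. A minimal Python sketch, using the standard definitions with the reduced Planck constant ħ (a derivation written with h instead, as below, differs only by a factor of √(2π)):

```python
import math

G = 6.67430e-11        # gravitational constant, m^3 kg^-1 s^-2
c = 299792458          # speed of light, m/s
hbar = 1.054571817e-34 # reduced Planck constant, J*s

t_planck = math.sqrt(hbar * G / c**5)  # shortest measurable time
l_planck = c * t_planck                # distance light covers in one Planck time

print(f"Planck time:   {t_planck:.3e} s")  # ~5.39e-44 s
print(f"Planck length: {l_planck:.3e} m")  # ~1.62e-35 m
```

This reproduces the 10^-35 m figure from the article using only the three constants of nature mentioned above.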

Experimental justification is simple. These conclusions result directly from general relativity and quantum mechanics. Insofar as those theories are justified experimentally, so is the conclusion that there is a shortest measurable distance.

Sean T,

Thank you for the explanation. But I do not know enough about General Relativity to fully interpret the phrase “To read our clock, our clock must not be so high in energy that it becomes a black hole.” I know that the energy at which an object becomes a black hole depends on the volume of space into which that matter is squeezed. But what about a photon? Say I want to use a photon as my clock. Are you saying that general relativity places an upper limit on the energy of a photon? What is that upper limit? I have never heard of an upper limit to the energy of a photon before, other than speculation that the wavelength of the photon cannot be smaller than the Planck length, but that would be begging the question.

@8: how are you using a photon as a clock without measuring time or distance? Because if you clock something by (for example) ‘how far the photon traveled during the time they did it’, you’re back at a minimum increment.

ketchup

You are rapidly getting above my pay grade with that question. However, if you are using that photon somehow as a clock (ignoring eric’s objection to that idea for now), then the photon must interact with something else. If not, how do you read your clock? In so doing, a photon with energy greater than twice the rest-mass energy of some elementary particle can cause a pair production event, producing a particle and its antiparticle. Localized to a small region of space this pair production would, given a sufficiently energetic photon, produce enough mass to cause your clock to collapse into a black hole, thereby rendering it useless as a clock since there would be no way to read your clock.
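For scale, the pair-production threshold for the lightest charged particle, the electron, is easy to compute. A sketch (constants are CODATA values; this is just the kinematic threshold, ignoring the nearby mass needed to conserve momentum):

```python
m_e = 9.1093837015e-31  # electron mass, kg
c = 299792458           # speed of light, m/s
eV = 1.602176634e-19    # joules per electron-volt

# Minimum photon energy to produce an electron-positron pair
E_threshold = 2 * m_e * c**2
print(f"{E_threshold / eV / 1e6:.3f} MeV")  # ~1.022 MeV
```

Photons above roughly a MeV are therefore already in pair-production territory; a Planck-scale photon exceeds this by about 22 orders of magnitude.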

I had tried to avoid math before since I don’t know your level of mathematical education, but it’s actually (at least by QM and GR standards) pretty simple math. We require a few things from our clock, namely that its time uncertainty must be no greater than the time it measures, that it be distinguishable from a “clock-anticlock pair”, and that it be readable. The first of these I discussed in my last post. The second is equivalent to the requirement that the uncertainty in the energy of the clock not be greater than the total energy of it. The last encompasses the fact that the clock cannot be a black hole and that the size of the clock cannot be larger than the speed of light times the time it measures.

Now for the math: dE * dt > h, where dt and dE are the uncertainties in the time and energy, and h is Planck’s constant. This is the uncertainty principle from QM and is well established experimentally. Now t >= dt, so t > h/dE. E is not smaller than dE, as discussed above; substituting E for dE in the last equation therefore gives a value for the right side that is not larger than the original, so we have t > h/E. From relativity this can be expressed as t > h/mc^2. Now GR tells us that m < rc^2/G (otherwise the clock is inside its own Schwarzschild radius), so t > hG/rc^4. Finally, we had the requirement that r < ct, so t > hG/tc^5. Algebraically this is equivalent to t^2 > hG/c^5, or t > sqrt(hG/c^5).

This is the expression for the Planck time. You notice it depends only on constants of nature, not on any physical characteristics of our clock (even though the energy and size factored into the derivation). This indicates that this minimum time measurement is indeed a fundamental property of all clocks, no matter how you choose to build them. The basis for this derivation was results from quantum mechanics and general relativity, namely the uncertainty principle, the equivalence of mass and energy, and the formula for the Schwarzschild radius (describing the mass/radius relationship for black holes). Since both of these theories are on sound experimental footing, so too is this derivation of the minimum time.

@10: technically the high energy photon clock could interact with another photon without inducing pair production, and then the user reads the signal from the lower energy photon. However I’m still stuck on how ketchup means to use a photon as a clock in a way that doesn’t involve measuring one of the quantities you already mentioned.

A photon might not be a good example for the talk about clocks and energy. But just as a side note… while in a theoretical sense you can ponder a single photon of arbitrarily high energy in an empty universe, the reality is that you can’t have an arbitrarily high energy photon without it becoming a particle/antiparticle pair. We can’t talk about a photon and a black hole because a photon doesn’t have a radius. But whatever made that photon (an emitter of some kind) probably does, and it has an upper limit before it collapses into a black hole.

On a side note… we don’t know if a single photon has an upper limit theoretically, because we don’t know if its wavelength can go below the Planck length.

Sean T, I am not taking issue with your math. I am taking issue with your assumptions. The math depends on the behavior of quantum mechanics and relativity at extremely high energies and extremely small distance scales. That is exactly the regime in which we know for sure we need a quantum theory of gravity in order to get the right answers. We do know that whatever the correct theory of quantum gravity is at those energies, it will simplify to ‘regular’ quantum mechanics and general relativity at energies we have already probed. But you cannot assume that theories that give the right answer at presently-accessible energies will also give the right answers at energies several orders of magnitude higher. Newton’s laws of motion, which make excellent predictions at speeds much less than the speed of light, are utterly incorrect at speeds close to c. Even if the correct theory of quantum gravity gives qualitatively similar answers to our current theories at the Planck distance and energy scale, that regime is so many orders of magnitude beyond what we have probed experimentally that we do not even know if the ‘constants’ of nature such as G, c, and h even have the same values at those energies. So how anyone can claim to know that the smallest possible distance scale we can probe is 10^-35 meters, as if this is established fact, is beyond me. It seems to me that the best one could say is ‘if we extrapolate current theory several orders of magnitude beyond where it has been tested, our best guess is that the smallest possible distance scale one could probe is about 10^-35 meters’.

“Sean T, I am not taking issue with your math. I am taking issue with your assumptions.”

But earlier you claimed we had no evidence for the Planck length. Sean T and I showed you were wrong.

If you’re now claiming a DIFFERENT “problem” with this, then please prove you’re genuine rather than just JAQing off on the internet.

Sean, this is why I didn’t bother putting more in, I didn’t think ketchup would be willing to listen if the short message didn’t start them thinking.

“reality is that you can’t have an arbitrary high energy photon without it becoming particle/antiparticle pair”

Reality means that unless you’re going to break conservation of either momentum or energy, you need a mass nearby to change a photon to a particle/anti-particle pair.

Which is going to be the default case in any otherwise empty universe.

“We can’t talk about photon and black hole because photon doesn’t have a radius.”

We CAN talk about the Schwarzschild radius of the energy tensor of a photon, however, and to create a photon able to discern 10^-35 m, you need one with such a large amount of energy that it has an event horizon of, surprise, surprise, 10^-35 m, making it rather difficult to get any smaller, since any photon able to pick out that size would pick out something smaller than the black hole surrounding that point.
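This point can be checked with a back-of-the-envelope Python sketch: take a photon whose wavelength is one Planck length, convert its energy to an equivalent mass, and compute the Schwarzschild radius of that mass. This is only an order-of-magnitude argument (it ignores numerical factors of order 2π, and a free photon has no rest frame, per the caveats earlier in the thread):

```python
G = 6.67430e-11    # gravitational constant, m^3 kg^-1 s^-2
c = 299792458      # speed of light, m/s
h = 6.62607015e-34 # Planck constant, J*s

lam = 1.616e-35               # photon wavelength ~ one Planck length, m
E = h * c / lam               # photon energy, J
m_equiv = E / c**2            # mass equivalent of that energy, kg
r_s = 2 * G * m_equiv / c**2  # Schwarzschild radius of that mass

# r_s comes out within roughly an order of magnitude of the wavelength itself
print(f"r_s = {r_s:.2e} m, r_s / wavelength = {r_s / lam:.1f}")
```

The event horizon of the probe is comparable to the very distance it is trying to resolve, which is the self-defeating situation Wow describes.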

Wow,

I really don’t understand of what wrong you are accusing me. Perhaps I did not do a good job of explaining my question in my original post. Yes, I know what the Planck time is, as a combination of the constants of nature that can be derived using general relativity and quantum mechanics. I do not dispute that. Nor do I dispute that IF these constants are truly constant even at energies that have never been probed, and IF quantum mechanics and general relativity as currently formulated turn out to give correct answers at energy scales that have never been probed, THEN 10^-35 m is approximately the smallest length scale that could ever be meaningful. Maybe I didn’t make that clear in my original question, but I do not view my two posts above as inconsistent with one another. What I am trying to understand, and still do not have a clear answer to, is how do we know 10^-35 m is really the correct ultimate limit to what we can explore? No one has done experiments anywhere near this small a distance scale. How do you know that some new behavior doesn’t emerge at, say, 10^-30 m that modifies (not eliminates) the uncertainty principle, in a way analogous to how special relativity modifies Newtonian dynamics? How do we know the values of c, h, and G, on which the Planck length depends, are truly constant at distance scales orders of magnitude different from those at which they have been measured? Did you or Sean T answer either of these questions? If so then I am overlooking it.

It is of course impossible to prove whether one is ‘genuine’ over the internet, but I would hope that the questions would stand on their own. Insisting that one justify assumptions is an important part of science, and I usually become skeptical when someone gets angry when asked to do so.

@ ketchup #17

” how do we know 10^-35 m is really the correct ultimate limit to what we can explore? ”

the honest, true answer is that we don’t know. And in no serious discussion will you ever hear someone claiming that we absolutely, positively know that the Planck length is the smallest reality in our Universe. However, we do assume it to be so (at the moment), because our theories (which successfully predict a whole host of other things) show that the energies needed to probe smaller scales result in the formation of black holes.

Same as we don’t know if BHs are truly point objects at the center, or if matter has an upper limit to being compressed. We haven’t probed those energies, same as with the Planck scale. So we don’t know those things experimentally, and yes, there might be some new behaviour, but for now there is no evidence of it. Likewise, there is no real competing theory which predicts things below the Planck scale and yet offers some things which we can probe with current tech.

Sinisa,

Thank you for the response. This was my impression, too. But in Ethan’s piece, he listed 10^-35 m as the ultimate length limit as if it were established fact. I was wondering if he meant it as fact, or the way you describe.

@ketchup

well, within the established theories, it is a fact. The math leads you to that limit.

“Yes, I know what the Planck time is, as a combination of the constants of nature that can be derived using general relativity and quantum mechanics. I do not dispute that. ”

You did when you claimed we had no evidence the Planck length was a minimum size. Because that fact is the RESULT of those features. Dispute the result and you dispute the features.

SL, it’s that physics will break down, because there’s no difference between a singularity and something able to resolve 10^-35 m; both result in extrapolations of physics to realms we have no axiomatic reason to trust. So it’s a lot more certain than you make out, though special pleading could give a loophole.

Is it impossible for humans to develop the technology to make artificial particles smaller than photons and electrons, once we reach the science of how quantum particles act, in order to reach a speed greater than light in the future?

All conclusions in science are provisional, tentative, and subject to revision should new evidence arise. All of them. However, most of the time scientists don’t include that caveat when they are talking about high confidence conclusions. So you might often hear scientists talking about gravity as if it’s an “established fact.” Or the laws of thermodynamics as if they are “established facts”. Ethan talking about the Planck length as ‘established’ fact is his way of saying he’s as confident in QM as he is in any other theory you care to name. He’s as confident in it as you are that if you walk out a second story window, you’ll fall.

This quibble comes up all the time with philosophers and creationists, or basically anyone with some pet peeve about some specific bit of science they don’t understand; they want scientists to include all the caveats when talking about their pet issue. It’s a form of subtle, often unconscious exceptionalism. But really, there is no need to treat any of these very high confidence conclusions as if they are low confidence. If you are comfortable talking about stepping out a window and falling without the “of course, that’s my theory and it’s provisional and subject to revision should new evidence arise” caveat, then you should feel comfortable talking about evolution without that caveat, talking about QM without that caveat, and so on.

“But in Ethan’s piece, he listed 10^-35 m as the ultimate length limit as if it were established fact”

It is, in the same way that 1/2 being 0.5 is an established fact. When converting rational numbers into decimals, the number is the result of a calculation, and the result of that calculation is an established fact.

If you wish to contest this limit, please provide the physics that leads to this new figure.

TIA, the internet.

Thank you, eric, for your response

If we look at the sky at night we can see, naturally, only a part of the Milky Way with our eyes, but with an artificial instrument, a telescope, all of whose components come from nature, we can see to a much greater distance. So I think that when we get to a high enough technology to make artificial particles smaller than the natural photons and electrons, we will be able to reach, for example, a higher speed than the particles nature made can.

Well, that made no sense. Why did you bother?

I like cosmological physics, so I usually read the comments of physicists about the universe to increase my knowledge of it, and I would be glad to read scientific answers to my thoughts. It is no bother; thank you for your response, Wow.