
Inspired by this question regarding reality as simulation and this question about a continuous time line, I began to wonder: if our time were indeed like a high-frame-rate simulation, how could we detect it, if at all?

So the assumptions are: yes, time is discrete, and the "frame rate" is high enough not to contradict what we already know from physics. What evidence could we find that time is discrete?

jdunlop
Alma Do
  • None. If true, the frame rate of reality is too high for elements in that reality to detect. Nyquist-Shannon sampling theorem. – nzaman Oct 15 '18 at 15:08
  • Obligatory xkcd: https://xkcd.com/505/ – Oleg Lobachev Oct 15 '18 at 17:58
  • A related question is whether time is quantized. https://www.scientificamerican.com/article/is-time-quantized-in-othe/ – Walter Mitty Oct 16 '18 at 01:41
  • I highly recommend reading Permutation City by Greg Egan (or his short story Dust) if you haven't -- it touches on this question and is mind-expanding in itself. – ekl Oct 16 '18 at 18:54
  • Whose reality? ;) – Will Ness Oct 17 '18 at 08:11
  • Don't we already live in a world that is frame-rate based? The fundamental unit being Planck time, and the solution to the ultraviolet catastrophe basically proving discrete time? – Chuu Oct 17 '18 at 17:32
  • Problem is, maybe even the universe simulating ours is frame-rate based. Maybe reality is not continuous, whether simulated or not. You might also be interested in Is the world C∞? – Zommuter Oct 18 '18 at 08:21
  • @Zommuter I have to admit, I failed to comprehend what's going on in that question after the first five words of the paragraph :) But - thanks! – Alma Do Oct 18 '18 at 08:26
  • Sorry, I may have Sheldoned out a bit there. It boils down to asking whether the universe in general is described by continuous physics or not, i.e. not only time but space and everything else as well. – Zommuter Oct 18 '18 at 08:31
  • @Chuu is correct, we already live in this universe. And it's not just frame rate/Planck time - there is also the Planck length giving the "resolution" of our physical universe. – Grimm The Opiner Oct 18 '18 at 11:20
  • @Chuu Time is not made out of individual Planck-time intervals. It's more about what's measurably meaningful than about the underlying structure of the universe - if the universe has a "frame rate", the frames could be either shorter or longer than the Planck time. The solution to the ultraviolet catastrophe only requires a kind of quantization, but that doesn't necessarily mean a "fixed time step" (consider that the quantization levels aren't integer multiples of the lowest possible time/energy). – Luaan Oct 18 '18 at 12:01
  • I just wanted to add, this is an amazing question. More such, please! – Sentinel Oct 18 '18 at 23:01
  • @Luaan So naturally the Planck time/length aren't the lower bounds on the granularity? (But are just the scales below which it is impossible to distinguish things.) – Alma Do Oct 19 '18 at 09:58
  • @AlmaDo They're really just the result of dimensional analysis - Planck took a few universal constants and combined them in a way that produces a value in seconds, meters, etc. That's all the significance there is; the only goal was to avoid anthropocentric units. Many physicists do consider the Planck time to be the shortest possible time interval, but there really isn't any reason for that to be true - the shortest possible time interval could be either shorter or longer than the Planck time. We'll probably need a working theory of quantum gravity to know for sure. – Luaan Oct 19 '18 at 14:35

10 Answers


Collision penetration by velocity. As a starting note, we cannot talk about a graphics FPS, only a physics FPS. A graphics FPS exists only for an outside observer; we can only experience our universe through its physics.

This is a classic problem in video games. If physics is checked frame by frame, and a collision counts only when objects overlap, then something travelling fast enough can be in front of an object in one frame and past it in the next. The collision never triggers, and it goes flying through. Too bad the simulators knew about this, so we got that pesky speed of light to deal with. Instead, we just need to make our objects small enough.

The cool part is that physics almost supports this: there is only a probability that two objects will collide. By using the width of objects and how often they collide, we could calculate the frame rate of the universe - at least the frame rate of the physics loop.
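The game-engine bug this answer leans on is easy to demonstrate with a toy fixed-timestep simulation. This is purely an illustration of the mechanism (all sizes, speeds, and the frame duration are made up, not claims about real physics):

```python
# A 1-D "physics loop" where collisions are only checked at frame
# boundaries. A fast particle can jump clear over a thin wall between two
# frames, while swept (continuous) detection never misses the crossing.

WALL_START, WALL_END = 10.0, 10.1   # a wall 0.1 units thick
DT = 1.0                            # the universe's "frame" duration

def discrete_hit(x0, v, steps=300):
    """Check overlap only at frame boundaries, like a naive game engine."""
    x = x0
    for _ in range(steps):
        x += v * DT
        if WALL_START <= x <= WALL_END:
            return True
    return False

def swept_hit(x0, v, steps=300):
    """Check the whole segment travelled during each frame instead."""
    x = x0
    for _ in range(steps):
        nx = x + v * DT
        if x < WALL_END and nx > WALL_START:  # segment crosses the wall
            return True
        x = nx
    return False

slow, fast = 0.05, 3.0  # units per frame
print(discrete_hit(0.0, slow), swept_hit(0.0, slow))  # True True
print(discrete_hit(0.0, fast), swept_hit(0.0, fast))  # False True: tunnelled!
```

The second line is the answer's whole point: the fast particle "penetrates" the wall without ever registering a collision, and the speed at which that starts happening reveals the frame duration.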

Andrey
  • Hmmmm... so hypervelocity particles acting like weird radiation then? – Joe Bloggs Oct 15 '18 at 15:36
  • This has some interesting implications on LHC experiments :D – quetzalcoatl Oct 15 '18 at 16:15
  • Unless it uses swept collision detection. That can be calculated exactly and still be frame based. – ratchet freak Oct 15 '18 at 16:42
  • Interesting perspective. I wonder if this could be the mechanism behind quantum tunneling? – Skek Tek Oct 15 '18 at 18:03
  • @quetzalcoatl Unfortunately for the OP, our reality is not frame based, and time is an infinitely subdividable dimension. Kind of like 4D vector graphics, where arcs can always be calculated further as you zoom in. So some fudging will have to be done. – Andrey Oct 15 '18 at 18:26
  • I don't think this works. The frame step pretty much has to be Planck time or something else fails, and c is just too low to set this up to clip through in one Planck time. – Joshua Oct 16 '18 at 03:29
  • What @ratchetfreak said. This answer is a silly implementation flaw common in games due to propagation of bad engines and/or programming idioms. It's easily avoided by doing the calculations the right way. – R.. GitHub STOP HELPING ICE Oct 16 '18 at 04:52
  • Quantum tunnelling might be interpreted as dodging a collision check :) https://en.wikipedia.org/wiki/Quantum_tunnelling – Guillermo Mestre Oct 16 '18 at 07:50
  • You can avoid that bug by disallowing any object from moving more than one pixel per frame. This would imply a maximum speed which no object could surpass. If you make it an actual barrier for all sorts of information flow, and make the universe a finite age, this also naturally limits the size of the universe you need to simulate, as everything too far away will be of no observable consequence anyway. In order to prevent unlimited growth of your simulation, you might also want to add an accelerating expansion to your universe model, so things outside the horizon stay outside the horizon. – celtschk Oct 16 '18 at 08:41
  • (1) But you can't always conclusively (dis)prove the frames. You can only conclusively prove the frames if no proper collision-detection handling was implemented. Any proof that frames don't exist could just as well be proof that they implemented proper collision handling, or that there are more FPS than our test was accounting for. (2) "Instead we just need to make our objects small enough." Neutrinos already pass through normal matter. So are you saying we've just proven that real life is frame-based? – Flater Oct 16 '18 at 11:38
  • Hmm, what is electron tunneling... :) – mathreadler Oct 16 '18 at 22:49
  • @R.. Have you ever built an engine that can run a AAA game? No? Of course you haven't; even if you tried, you'd just assume that any optimization was "bad code" and not do it, and you'd fail. Avoiding the much more expensive collision checks isn't just "haha game designers r dum", it's an active decision to save on a few CPU cycles that can be better spent doing other stuff. –  Oct 16 '18 at 23:28
  • @Flater Hence the rotating version below. In order for all 'inter-frame' interactions to be handled correctly, the 'universal game engine' would have to be generating interactions between everything all the time, and the universe would have no frame rate as such. – Sentinel Oct 17 '18 at 10:11
  • @Sentinel: The ability to predict what is expected to happen between this frame and the next does not require the predictor to be frameless. – Flater Oct 17 '18 at 10:41
  • @Flater Correct, but the predictor would in effect be the universe, as all elements would carry out a continuous existence in the simulated universe. Then it would become irrelevant whether, in the meta-universe, the frames were milliseconds or millennia apart. Subjective in-the-moment consciousness would sit above the simulated substrate in the predictor. – Sentinel Oct 17 '18 at 11:22
  • @Sentinel: Relevance isn't particularly part of the question. If everything was bound to the same frame progression, with no possibility of going beyond these frames, then it's irrelevant to know/prove that reality is frame-based to begin with. But the OP isn't asking if it's relevant, he's asking how to prove/detect it. – Flater Oct 17 '18 at 11:23
  • @Flater I just don't see the meaning of 'frame' in that context. If the simulator is a thing that is "outputting" at a certain rate, where and to whom is it outputting if not the things it is actually simulating? – Sentinel Oct 17 '18 at 11:29
  • @Sentinel It is not up to the application to consider who booted the computer and why. Nonetheless, applications can still benefit from acknowledging their frame-based nature (e.g. counting CPU ticks, which is effectively the same principle at play). Time is meaningless to a computer, and yet we are perfectly capable of having a computer track the time even though it doesn't understand the purpose of doing so. – Flater Oct 17 '18 at 11:32
  • @Flater That is not what I am asking. I am asking what, metaphysically, is the difference between the simulated entities (in the predictor) and the output of the simulation. – Sentinel Oct 17 '18 at 11:39
  • @ratchetfreak Swept collision detection only works for analytically solvable motion. In the general case, the movement of composite soft bodies can only be numerically approximated, so something like penetration collision detection will happen unless the time delta is made infinitesimal, which would require infinite energy to calculate (or hit the resolution limit of the universe). Reductio ad absurdum: it will always be possible to construct motion which breaks the solver. – spraff Oct 17 '18 at 14:19
  • @spraff Thank you, I have been meaning to respond to this whole discussion, but I think you said it best. – Andrey Oct 17 '18 at 14:24
  • @spraff Very good point. That, or the physics literature would never refer to a 'solver' but to some other explanation, such as quantum tunneling. – Sentinel Oct 17 '18 at 17:46
  • @Andrey - Congratulations, you discovered why the speed of light is an absolute limit. The frame rate happens to be such that if anything were faster, we could detect penetration. But at the speed of light or below, the frame rate is sufficiently high for penetration to be less than quantum effects allow. – Tom Oct 17 '18 at 17:49
  • However, pixelization of space is not consistent with Lorentz invariance, which has been verified to very high accuracy. – The_Sympathizer Oct 19 '18 at 05:35
  • @celtschk Limiting to one pixel per frame is exactly what the universe does. The only difference is that the "pixel" is the Planck length, the "frame" is the Planck time, and "one pixel per frame" is the speed of light. – forest Oct 19 '18 at 09:41
  • @forest Unfortunately, the answer where I explained in comments how the Planck length is not a pixel was deleted. Look it up; it's a common misconception. It has to do with energy. – Andrey Oct 19 '18 at 13:36
  • @Andrey I understand that it's not a discrete, cubic box. However, it can be thought of as a pixel in that it is the smallest distance that a boson can move in a single Planck time, even if smaller measurements can be made. – forest Oct 20 '18 at 03:10
  • Collision penetration at hypervelocity is in any case non-classical. The effect has nothing to do with quantum tunneling; it is simply because the cross-sectional area of the wave-form gets smaller with increasing velocity. You can see the effect in particle accelerators and in thermal (slow) nuclear reactors. – Aron Oct 21 '18 at 12:44

Actually, the world as described by the standard model of particle physics cannot account for time intervals shorter than the Planck time, which is approximately $5.4\times 10^{-44}\,s$. But the smallest time-interval uncertainty achieved in direct measurements is currently about $1\times 10^{-20}\,s$. Literally any experiment (such as those described in other answers) would need to be more than $20$ orders of magnitude more precise in time measurement than the most precise experiment currently known.

I have no idea what a world whose physical constants differ from ours by more than $20$ orders of magnitude would realistically look like.

Source: https://en.wikipedia.org/wiki/Planck_time
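As a back-of-the-envelope check, the gap this answer describes follows directly from the two figures quoted above:

```python
# Orders of magnitude between the best direct time measurement and the
# Planck time, using the numbers quoted in the answer.
import math

planck_time = 5.4e-44    # seconds
best_measured = 1e-20    # seconds, smallest direct time-interval uncertainty

gap_orders = math.log10(best_measured / planck_time)
print(f"gap: about {gap_orders:.0f} orders of magnitude")  # about 23
```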

G. Fougeron
  • +1. This is the correct answer, and I'm not sure why so many other (wrong) suggestions have gotten upvoted so highly while this one has been ignored. – Mason Wheeler Oct 17 '18 at 14:20
  • @MasonWheeler Two reasons that I know of: 1) Answers posted first always get more attention initially and get to the top where they are easy to see, thus getting even more attention. 2) Worldbuilding is more about the quippy, funny and common-sense-logical than the scientifically accurate. – M i ech Oct 18 '18 at 10:13
  • I think this is in fact the correct answer, because the question includes "does not contradict what we already know". But stepping into the world of fantasy a bit, you could devise experiments to test that you aren't just the figment of someone else's acid trip. – Sentinel Oct 18 '18 at 10:27
  • +1. Yet another question where the answer is "err, it's already like that here!" :-) – Grimm The Opiner Oct 18 '18 at 11:23
  • @GrimmTheOpiner Well, not quite. It's not that quantum physics implies the universe is frame-rate based; it just puts a hard limit on what is physically possible to observe (for any meaningful value of "observe"). – Cubic Oct 18 '18 at 14:51
  • Still, though, let us see what happens when we put a microphone next to a spinning, collapsing black hole. – Sentinel Oct 18 '18 at 23:05
  • There are also many attempts to form a framework with time quantised at a "server tick size" of the Planck time. – Aron Oct 21 '18 at 12:46
  • @Cubic I would go so far as to say that's the whole of QM in a nutshell. It's a bunch of things we observed... but then again, that is physics in a nutshell. – Aron Oct 21 '18 at 12:49
  • @MasonWheeler From the source linked in the answer: "Because the Planck time comes from dimensional analysis, which ignores constant factors, there is no reason to believe that exactly one unit of Planck time has any special physical significance." – Andrey Oct 22 '18 at 15:37
  • "Actually, the world as described by the standard model of particle physics cannot account for time intervals lower than the Planck time." I was under the impression that that's not true. – HDE 226868 Aug 02 '19 at 00:59
  • I might have been a victim of the myth myself. Disclaimer: I am not a physicist, and not really capable of having a critical opinion on these subjects. – G. Fougeron Aug 02 '19 at 06:12

High spin rates would prevent certain orientations. For example, an object spinning at 1/100th of the universal frame rate would never achieve an orientation in between 3.6-degree steps. Spinning at half the frame rate, it would only ever be seen at the two ends of a line.

This would invoke the collision detection problem mentioned here. The resulting interactions, such as audio or electromagnetic fields, could be Fourier analysed, and the spectrum would show the universal frame rate. https://math.stackexchange.com/questions/1002/fourier-transform-for-dummies

The object could be a rod a thousand kilometers long rotating in space at one millionth the frequency of the frame rate. Noise artifacts in its interaction with a magnetic field would still be observed.
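The orientation-quantization claim can be sketched numerically. The frame rate and rotation rates below are invented purely for illustration:

```python
# If the universe updates at a fixed rate F, an object rotating at F/n can
# only ever be observed in n distinct orientations - sampling in action.

F = 1000.0  # hypothetical universal frame rate, Hz (made up)

def orientations(rot_hz, frames=10_000):
    """Distinct angles (degrees, rounded) seen at the frame instants."""
    seen = set()
    for k in range(frames):
        angle = (360.0 * rot_hz * k / F) % 360.0
        seen.add(round(angle, 6) % 360.0)  # fold a rounded 360.0 onto 0.0
    return sorted(seen)

print(len(orientations(F / 2)))    # 2   -> only the two ends of a line
print(len(orientations(F / 100)))  # 100 -> 3.6-degree steps, nothing between
```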

Sentinel
  • Assuming the time step is the Planck time, the object would have to be immeasurably small not to violate the speed of light. – Zizy Archer Oct 16 '18 at 06:04
  • @ZizyArcher Are you sure? The object could be ten thousand kilometres long in space and rotate at one millionth the speed of the frame rate, and still produce noise artifacts in measured signals. – Sentinel Oct 16 '18 at 11:20
  • "An object spinning at 100th the universal frame rate would never achieve an orientation in between 3.6-degree steps." Who says the frame rate is fixed? Secondly, as long as the "light-frame" (= the distance that light travels between two frames) is smaller than your smallest measurable unit, the frames are still undetectable. – Flater Oct 17 '18 at 11:25
  • @Flater That doesn't matter, you would still get noise in the signal. – Sentinel Oct 17 '18 at 11:37
  • "This would invoke the collision detection problem above." I am assuming you are referring to the currently highest-voted answer? I would suggest you link to that answer instead of calling it out by position, which depends on votes and on how an end user chooses to sort answers. – Matt Oct 17 '18 at 18:10
  • Of course, we know that this happens - that's what quantization is all about. You cannot have arbitrary values of anything. Your rotating object (which would need to be tiny) is perfectly well described by our existing physics. You could argue that quantization is exactly what you're looking for, but keep in mind that it doesn't tell us about the quantization of time specifically (if time is even a meaningful separate concept). E.g. you can't tell the difference between a quantization of time, energy, momentum, relative position... – Luaan Oct 18 '18 at 12:09
  • @Luaan I am still confused why it would need to be tiny. – Sentinel Oct 18 '18 at 13:48

Yes, possibly.

Relativistic time dilation may help us detect time quantization. If the universe is a simulation running at a uniform speed, then time dilation effects must themselves be simulated.

In the world of video production, there is a longstanding problem of converting the frame rate when a video is moved from one medium to another. Classic film runs at 24 fps, PAL video at 25 fps, NTSC at 30 fps. Individual frames are too short for humans to notice, but when we have to convert frame by frame, the resulting artifacts become visible to an untrained eye.

Similarly, if we have two very precise clocks moving with respect to each other, or one in a strong gravitational field and one away from it, time will run at different speeds for them. If time is continuous, the "slow" clock will measure time exactly as Einstein's theory predicts. But if time is discrete, and the "slow" clock actually has to run in a "fast"-timescale world, we would see some weird effects: some seconds would be shorter, and some longer, than others.

The "slow" clock and its attendants would not be able to notice that without referring to the "fast" clock, and vice versa.
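The frame-conversion analogy can be sketched in a few lines: if a dilated clock's seconds must land on an integer grid of universal ticks, the number of ticks per second cannot stay constant, just like 24-to-25 fps pulldown. The tick rate and dilation factor below are invented:

```python
# "Seconds" of a time-dilated clock rendered on an integer tick grid.
TICKS_PER_SEC = 1000  # hypothetical universal "frame rate" (made up)
gamma = 1.0007        # made-up time-dilation factor

ticks_per_dilated_second = []
prev = 0
for n in range(1, 11):
    # the n-th dilated second ends at universal time n * gamma
    boundary = int(n * gamma * TICKS_PER_SEC)  # snapped to the tick grid
    ticks_per_dilated_second.append(boundary - prev)
    prev = boundary

# A mix of 1000- and 1001-tick "seconds": the telltale pulldown pattern,
# i.e. some seconds are measurably longer than others.
print(ticks_per_dilated_second)
```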

Alexander
  • You don't need large differences in local gravity. I actually asked about this over on [space.se] as "Have we attempted to experimentally confirm gravitational time dilation?" You might be surprised at the top-voted answer. – user Oct 15 '18 at 17:46
  • @MichaelKjörling Yes... but in this case we need to detect "framing" artifacts, which are supposed to be harder. An estimate of how precise the clock has to be to prove or disprove a "Planck time quantum" would be interesting to see, but I cannot make this calculation myself. – Alexander Oct 15 '18 at 17:51
  • How do you propose to measure these differences in the length of seconds? If one second lasts a million ticks and the other lasts a million plus one, do you think we could ever detect this tiny difference? We would have to get to the point where one second lasts, say, 50 ticks and the other lasts 51... I would rather use special relativity here. It is easier to make things' time very slow (relative to ours) by accelerating them near the speed of light. But even the LHC doesn't show even hints of this weirdness. – Zizy Archer Oct 16 '18 at 06:11
  • @ZizyArcher I mentioned seconds just as a high-level example. A real experiment, I suppose, would use some particle/subatomic processes. – Alexander Oct 16 '18 at 07:01
  • @Alexander Yes, I know; I meant "seconds" the way you did - in terms of ticks of something observable. But I can't think of anything that would not just show the 1M vs. 1M+1 problem, and I am asking how you resolve this and what process you suggest. – Zizy Archer Oct 16 '18 at 08:18
  • One theoretically possible way is high-resolution analysis of redshifted spectra. We might be able to see 1M vs. 1M+1 splits in wavelengths. Practically, though, visible or ultraviolet light's wavelength must be too long for us to see anything spooky. – Alexander Oct 16 '18 at 16:41
  • @Alexander Of course, that's going to be complicated by the fact that spectra are blurry. Can you tell the difference between the expected blurriness and anti-aliasing? – Luaan Oct 18 '18 at 12:10

I do simulations by trade, so forgive me if this is more technical than you intended. I will have to massage some of the details to get closer to what actually happens in simulations to cause artifacts like the ones you seek.

First off, we have to start with an assumption: the universe is supposed to be modeled as a set of Ordinary Differential Equations (ODE). The current models of the universe are all based on ODEs. We have to assume that velocity is a measure of the rate of change in position. So we're detecting differences between what the universe actually is and what could be represented as ODEs. If the universe is supposed to be something other than ODEs, then the actual answer of what we detect is completely dependent on what the laws of the universe actually are. If my universe consists of "Use the known laws of physics until Jan 1, 2020, then break all the laws and summon a dragon into the middle of Washington DC," then that's probably the artifact we'd notice!

Next, we have to point out that being "frame-rate based" is only part of the problem. One of the nice things about ODEs is that you can solve them perfectly given one frame: the initial state. This means that our frame rate could be as low as one frame per 113 billion years ($\approx 3\times 10^{-19}\,Hz$) without generating any artifacts. That number, of course, is the current predicted lifespan of the universe, from big bang to heat death. If you want a different estimated lifespan, you get a different minimum frame rate, but the numbers are equally extreme. It's not just the frame rate that matters.
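The point that an exactly-solved ODE needs no intermediate frames can be shown with a trivial example (the system here is chosen arbitrarily: a harmonic oscillator with known closed-form solution):

```python
# When an ODE has a closed-form solution, a single "frame" (the initial
# state) determines the state at any later time exactly; no stepping means
# no step-size artifacts. Example: x'' = -x with x(0)=x0, x'(0)=v0.
import math

def state_at(t, x0=1.0, v0=0.0):
    """Exact solution of x'' = -x, evaluated directly at time t."""
    return x0 * math.cos(t) + v0 * math.sin(t)

# One jump of a million time units is just as exact as a million small ones:
print(state_at(1_000_000.0))  # equals cos(1e6), with no accumulated error
```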

To see the artifacts, we need to start taking shortcuts. The first shortcut we take is to say that we don't process everything. When we observe a convenient symmetry, we take advantage of it. If I'm simulating a ping-pong ball flying through the air, I typically don't model every atom and the inter-molecular forces that hold its plastic shell together. I model it as a "rigid body," with a position, a velocity, and a rigid shape.

It's when we take these shortcuts that we run into frame-rate issues. If my rigid-body assumption about the ping-pong ball is not reasonable, then I generate a poor model. For example, during the impact between a ping-pong ball and a paddle, the ball deforms quite a bit. I need to remember to model this period differently, with more expensive physics.

The most common frame-rate issues are the ones mentioned in other answers: interactions between two solid bodies that don't follow the laws of physics. This happens because the simulation starts at a frame where one of these simplifications is valid, but during the integrated path of the universe, those assumptions break down. A ping-pong ball can be modeled as a rigid body, until you have to model the electrostatic interactions between it and the paddle which deform it during a hit. If you use these assumptions when you shouldn't, you get artifacts: fast-moving objects that pass through each other, solid objects that stick together, stuff like that.

This issue doesn't happen in conservative simulations. Conservative simulations do some sort of look-ahead to see whether they can prove the simplifications will still be valid. If so, they do it the easy way. If they can't prove it, they do it the hard way, even if it turns out that the simplifications would have worked for the actual path (it's hard to determine with 100% certainty whether the simplifications hold, but finding the 99% case and being conservative the other 1% of the time is easy). For example, your simulation might virtually "expand" every object in all directions by its velocity + max-acceleration * frame-period, and then look for collisions. If there's a collision in that expanded world, then it's possible that one will happen in the real world. If there's no collision, then it's easy to prove that no collision would occur in the real world either. This approach is very fast, computationally, and lets you avoid these frame-rate artifacts. We wouldn't observe anything as being wrong.
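The "expand by worst-case travel" test described above can be sketched in one dimension. Everything here (frame period, sizes, the acceleration bound) is an invented illustration, not a real engine's API:

```python
# Conservative broad-phase test: expand each object's extent by the
# farthest it could possibly travel in one frame (speed plus the maximum
# acceleration over the frame), then check the expanded extents for
# overlap. Only when they overlap do we fall back to expensive physics.
DT = 0.1  # frame period

def could_collide(a_pos, a_size, a_speed, b_pos, b_size, b_speed,
                  max_accel=10.0):
    """Can these two 1-D objects possibly touch within one frame?"""
    reach = lambda speed: (speed + max_accel * DT) * DT  # worst-case travel
    a_lo, a_hi = a_pos - a_size - reach(a_speed), a_pos + a_size + reach(a_speed)
    b_lo, b_hi = b_pos - b_size - reach(b_speed), b_pos + b_size + reach(b_speed)
    return a_lo <= b_hi and b_lo <= a_hi

# Far apart and slow: provably no collision this frame, take the cheap path.
print(could_collide(0.0, 1.0, 1.0, 100.0, 1.0, 1.0))    # False
# Fast enough to close the gap: run the expensive, exact physics instead.
print(could_collide(0.0, 1.0, 1000.0, 100.0, 1.0, 0.0)) # True
```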

Now, we typically don't solve these ODEs directly. We do what is called "numeric integration," which is an approximation tool. Some numeric integration approaches have artifacts that we can detect. For example, if you have a bias in your equations, you may develop what is called energy drift. When this happens, your equations predict that energy will be conserved, but the approximations don't perfectly conserve it. The error can accumulate over time and be detected.

Of course, we have solutions for that as well. Energy drift is easily accounted for using Hamiltonian mechanics, which admits a particular class of numeric integrators that are symplectic. These integrators do not have energy drift, because they only integrate along paths that conserve energy. If the universe used one of them, we'd never see the drift.
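The contrast is easy to demonstrate on a toy system. Below, explicit Euler (biased) is compared with the symplectic leapfrog scheme on a unit harmonic oscillator; the step size and step count are arbitrary, chosen only to make the drift visible:

```python
# Energy-drift artifact: explicit Euler vs. symplectic leapfrog on x'' = -x.

def energy(x, v):
    return 0.5 * v * v + 0.5 * x * x

def euler(x, v, dt, steps):
    """Explicit Euler: both updates use the state from the previous frame."""
    for _ in range(steps):
        x, v = x + v * dt, v - x * dt
    return energy(x, v)

def leapfrog(x, v, dt, steps):
    """Symplectic kick-drift-kick (velocity Verlet) scheme."""
    for _ in range(steps):
        v -= 0.5 * dt * x   # half kick
        x += dt * v         # drift
        v -= 0.5 * dt * x   # half kick
    return energy(x, v)

e0 = energy(1.0, 0.0)                     # true energy: 0.5, forever
print(euler(1.0, 0.0, 0.01, 100_000))     # grows far past 0.5: energy drift
print(leapfrog(1.0, 0.0, 0.01, 100_000))  # stays ~0.5: no secular drift
```

An observer inside the Euler universe could detect the integrator by watching total energy creep upward; inside the leapfrog universe there is no such signal.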

Now, with all of this, we leave open the question: why? Why would the universe exist with a frame rate? If it exists because an intelligent species created a simulation, we have to ask whether they designed their simulation to fool those within it. If so, it's much harder to notice artifacts, because someone is actively trying to prevent you from noticing. If I'm simulating a machine-vision system for a robot, there's a whole host of artifacts that I don't actually care about, because I know the algorithms the development team are putting together happen not to see those artifacts. If they add an algorithm that does see them, I'll have to change my simulation to model the physics more accurately.

For example, if there is some fixed number of frames per second, you'll see harmonics form at small multiples of the frame rate. You'll see things like aliasing if the algorithm is simple. However, modern simulations have something called adaptive frame rates available: if the situation gets complicated, you just decrease your frame period to improve the resolution of your answer. You can always do this unless the simulation has to run in real time (in some true-time sense). In that case, you'd look out for the out-of-simulation entities which don't seem to obey the laws of physics.

So in the end, there's lots of open questions, but the real answer is that you might be able to see the frame-rate in some trivial way, or it may be intentionally obscured from you so it can never be seen. The story is yours, design it as you see fit.

Cort Ammon
  • Very informative. The TL;DR version of this I got is "We can detect bugs the creators forgot to account for; otherwise we can't". Although this answer seems to assume the frame rate is imposed on the model (simulation) and not the machine (real-time render). So out of curiosity, if the simulation was forced to produce x frames per real-world second, is there a good way to force the simulation to use more hard calculations, miss frames, and produce artifacts? – Tezra Oct 17 '18 at 21:11
  • @Tezra I need to add a section on adaptive frame rates, but the answer is "yes, there's a good way to force it, and there are methods that can be used to prevent us from forcing it." However, doing it from inside the sim is particularly difficult because, by definition, the sim contains the answer you were looking for (it's the thing that's in your mind). – Cort Ammon Oct 18 '18 at 00:24
  • As for simulation vs. machine, the only real difference between them is what options are available, and determining what options are available for some hyperintelligent creator's machine is an exercise in folly unless they are defined in the question. To me, the more important thing is that, simulation or machine, in both cases there is a creator, and they are trying to make it do something. That really sets the stage for the rest of the discussion. – Cort Ammon Oct 18 '18 at 00:26
  • The other problem is that in this context we are not observers. We are the simulated. – Sentinel Oct 19 '18 at 20:48
  • @Sentinel Yep. An excellent piece of prior art on the effect you mention is Permutation City, by Greg Egan. That book explores what might happen when you try to show a Garden of Eden pattern, which would prove you are in a simulation, to the simulated. – Cort Ammon Oct 19 '18 at 22:48

The Arrow of Time, or macro-system temporal asymmetry, is the observable irreversibility of chemical and mechanical reactions; it forms the basis of entropy. It is also a demonstration of time's passage, in that to reverse reactions effected by it one would have to turn back time.

If we establish that there is a minimum time frame over which such asymmetry can be observed to occur, this would demonstrate a minimum duration for the definite forward movement of time: proof that time only moves in one direction, and in discrete parcels as well. Actually getting experimental proof concerning events that occur so quickly is, however, impossible by definition if time is discrete, since we couldn't measure the time it takes for time to happen in increments smaller than itself.

Ash

Rounding errors in hight speed cameras

In "frame-based" universe, there is fixed smaler amount of time, it is like looking on kino/tv one monent we see static reality, then we do not see anything (and everything is recomputed to new positition) then we see static reality ... and so on. As nothing can be done faster than the "reality-frame" allows, for normal people would suffice like 120fps to have illusion of smooth move. (the reality frame must be much faster, as people are terribly slow anyway).

But no camera in our universe could run as fast as reality frame-rate. So regardless how far we push, we would not be able "photo" either movement during visible reality-frame, nor dark during space between reality-frames. But it is not needed to detect those frames. If we can run camera "near" reality-rame speed (like 1000 times slower or so), and THEN rewind it much slower, if those two are not direct multiply of each other, we would find, that there is small noise between frames of our camera - if we film something really fast moving, we may found, that we constatntly get more (or less) better frames/worse frames/the item position varry slightly over the frame in regular pattern.

We see (say 50 fps) TV as smoothly moving, but if we film it on camera with different rate (say 60 fps), there are moving stripes in the film we could see even in bare eyes, as those are running at the difference speed (10 fps) which is simply visible as flickering. But if we would film a fast TV (say 5001 fps) with slow camera (say 50fps), we would see nothing, but one of 100 frames would be duplicate or missing or distorted. If we could get near the reality-fps, (even on many orders slower), we would not notice it by bare eyes, but statistically comparing changes in the frames, we would notise, that with regurarity one of many of them is distorted somehow, even if in avarage the movement would be smooth. and the systematic error on magnitude of 10, 100, 10.000.000 frames would signal us, that there is "frame-based" reality and we could compute its frame rate, even if it may be impossible for us to go near such speed.

It is the same way you can work out how fast your camera runs: film a fast-moving car, and at certain speeds the wheels seem to rotate slower, then stop, then rotate backwards, while the car still moves ahead.


That said, it is still possible that we live in a "frame-based" reality with a frame rate so insanely high that we could not measure this effect even with the best possible equipment. Then we would not know.


As for the computational speed of that simulation, we cannot say anything: in a "frame-based" reality, time is effectively stopped between frames, and for all we know there may be "millions of years" of host time between the computation of one sub-femtosecond frame and the next.

gilhad
  • 2,192
  • 1
  • 10
  • 12
3

If the frame rate is constant (rather than variable), then for complex waves we might be able to detect intermodulation and aliasing of their harmonics (which in theory should go on to infinity).
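As a sketch of what aliasing means here (plain Python, hypothetical numbers): if reality sampled everything at a fixed rate fs, any harmonic above the Nyquist limit fs/2 would fold back down and reappear at fs − f, so a spectrum that should run to infinity would instead show spurious low-frequency energy.

```python
import math

fs = 1000.0            # hypothetical universal sampling rate (Hz)
f_high = 800.0         # a harmonic above the Nyquist limit fs/2
f_alias = fs - f_high  # where it folds down to: 200 Hz

high  = [math.sin(2 * math.pi * f_high  * n / fs) for n in range(64)]
alias = [math.sin(2 * math.pi * f_alias * n / fs) for n in range(64)]

# Sample for sample, the 800 Hz tone is indistinguishable from an
# inverted 200 Hz tone: sin(2*pi*0.8*n) == -sin(2*pi*0.2*n) for integer n.
assert all(abs(h + a) < 1e-9 for h, a in zip(high, alias))
print("800 Hz aliases onto 200 Hz")
```

Finding such fold-down products where a continuous theory predicts none would hint at a fixed universal sample rate.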

dtldarek
  • 1,344
  • 8
  • 15
2

High-speed cameras would not only allow us to detect that reality is frame-based, they would also let us record what the frame rate is. Between frames any number of things can happen (e.g., you can move 5 ft or 1000 ft), but the results of what happened in one frame are only visible on the next frame.

Answers to this Physics SE question suggest that there is no upper-limit to the maximum framerate of a camera, and some cameras available today are already capable of 200 million FPS. That's about one picture every 5ns!

So, if reality were frame-based, then as high-speed cameras achieved higher and higher frame rates, we'd eventually begin to see the discreteness of the universe, as the pictures look more and more like a slowed-down stop-motion film. Eventually, we'd be unable to take unique pictures between two very close moments of time, as there is nothing to view between one frame and the next.
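A toy model of that crossover (plain Python, integer time-ticks to avoid rounding; the 5 ns universe frame corresponds to the 200 million FPS figure above): once the camera's shutter period drops below one universe frame, consecutive pictures start coming back identical, and the duplicate fraction reveals the ratio of the two rates.

```python
# One hypothetical universe frame lasts FRAME picoseconds (5 ns here).
FRAME = 5_000

def universe_state(t_ps):
    # The universe's state only advances on its own frame boundaries.
    return t_ps // FRAME

def duplicate_fraction(period_ps, shots=10_000):
    # Fraction of consecutive camera shots that come back identical.
    frames = [universe_state(k * period_ps) for k in range(shots)]
    dupes = sum(a == b for a, b in zip(frames, frames[1:]))
    return dupes / (shots - 1)

print(duplicate_fraction(20_000))  # slow camera (20 ns period): 0.0
print(duplicate_fraction(1_250))   # 4x faster than the universe: ~0.75
```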

Giter
  • 17,277
  • 6
  • 44
  • 48
  • 21
    The cameras themselves are governed by the theoretical frame rate of the universe. Does this mechanism allow you to detect whether you have reached a limit on the camera's time resolution versus the universe's time resolution? – pojo-guy Oct 15 '18 at 15:33
  • @pojo-guy: Maybe you notice that things moved between frames and when you double the framerate they move only half as much? – AlexP Oct 15 '18 at 15:56
  • 1
@pojo-guy: Since this theoretical universe is based on concrete frames rather than anything more abstract like distance or time, any number of things can happen in between the frames, since the 'CPU' of the universe must allow multiple things to happen per frame. The camera would be able to take as many pictures as it wants per frame, and every picture taken during one frame would show up on the next frame. – Giter Oct 15 '18 at 16:01
  • 2
    ...and now I'm wondering about the exponential increase in storage and processing power required for that – nzaman Oct 15 '18 at 16:34
  • @nzaman: There are about 10^86 particles in the visible universe, so just storing a single 8-bit pixel in some database for each one would require 100 trillion-trillion-trillion-trillion-trillion-trillion 1TB harddrives! – Giter Oct 15 '18 at 17:01
  • @Giter ...and how many particles would be needed to construct said harddrives? Even, how many particles would be needed to record the magnetic flux changes to encode the information in question on an arbitrary, non-existent, hypothetical harddrive? – user Oct 15 '18 at 17:41
  • 5
    If the universe is frame based, how did you develop a camera that records faster than the framerate of the universe? Obviously the camera would be limited by the same framerate of.. well, everything, so if the universe was let's say 15000fps, no camera would be able to record faster than that, and thus this effect would not be observable. – Mermaker Oct 15 '18 at 18:51
  • @MichaelKjörling: ...I've never actually thought about that part of things. I guess if our universe was a simulation in the 'real universe', then the real universe would need to be, at minimum, the size of our universe multiplied by the number of particles needed to fully simulate a single particle. – Giter Oct 15 '18 at 18:54
  • 2
    @T.Sar: The camera itself is not limited by the framerate, just the framerate of the pictures it takes is limited. Just like you can take 10, 100, or 1000 steps in a single frame of a movie, the camera can take 10, 100, or 1000 pictures in a single frame of time. All of those pictures will simply look the same in a frame-based universe, but will be at least slightly different in a non-frame-based universe. – Giter Oct 15 '18 at 18:56
  • Physical size 3.5 hdd is 8.27 cubic inches. SSD 7mmx2.5x4=2.76 cubic inches. Also 0.4 cubic inches in M.2 nvme storage. The SSD storage can be had in 1,2,4tb size, m.2 currently 2tb. So M.2 chips the size of a 3.5" inch hdd (8.27/0.4)=20 *2 tb= 40tb more storage than any hard drive so it is the density winner. @MichaelKjörling – cybernard Oct 15 '18 at 19:04
  • Keep on mind that there are also trillion fps cameras. They can only do one vertical pixel slice at a time, but they can record at super high frame rates by pulling film through the aperture quickly (racing photo finish operates on the same principle). So if you have a reliably repeatable event, you can capture it at speeds slow enough to put light to shame. – Draco18s no longer trusts SE Oct 15 '18 at 19:21
  • 1
    The Physics SE question pushes some false information. The super fast cameras are not high framerate—exactly the opposite. They suspend a light-wave and then expose the camera. They then move the suspended wave forward and expose again. – Nathan Goings Oct 15 '18 at 19:24
  • @NathanGoings Yes, in electronics that technique is called "equivalent time sampling" and has been used for many decades. – user71659 Oct 16 '18 at 00:48
  • 2
    @Giter I'm not sure if you are being deliberately obtuse, or if I'm missing something. If the universe has a framerate nothing can happen "between" frames. Certainly not a complicated macroscopic process like taking a picture. The framerate of the universe creates an upper limit on the frame rate of the camera, but for any real camera it's unlikely to get anywhere near that level. – John K Oct 16 '18 at 18:25
  • @JohnK: Think of it like this: in a movie, from one frame to the next somebody could turn their head, or turn their body, or walk, or run, or get in a car, or eat a meal, or anything else. A frame is simply a snapshot of a moment of time: if the universe is playing at 30 frames per second, that just means there are 30 discrete states of the universe per second, but like a movie any number of things can happen between those states. – Giter Oct 16 '18 at 18:48
  • The builders closed that loophole - the storage needed for this method of detection is so big that will collapse to a black hole and hide the framerate of the universe because the detector broke down as a black hole. – Geronimo Oct 18 '18 at 00:08
  • I think the linked-to question fails to account for uncertainty: Δt*ΔE≥ℏ. The lower the framerate, the higher the uncertainty of the measured photon energy. At some point the uncertainty will exceed the whole universe's energy so you won't know what you actually observe... – Zommuter Oct 18 '18 at 08:25
1

Numerical limits

If we are living in a high frame-rate simulation, it means that whatever is running this simulation is subject to limits. If that were not the case, why would it choose discretized time?

So, if this thing is not perfect, there must be numerical limits inherent to the "language" it uses to simulate us. For instance, the maximum value of a C++ double is about 1.79769e+308. So let's just wait for one of the billions and billions of variables needed to run this incredible simulation to hit such a limit. Might it be the "universe's volume", as it expands in every direction at the speed of light? I don't know, but if these superior beings have opted for discretized time, one may think this simulation cannot fully handle the concept of infinity either.

Then, see what happens... A big crash? Some totally unexpected bugs? At least I guess we would see that something is wrong; if we are not just totally erased...
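For what hitting that limit looks like, here is a sketch in Python (whose floats are the same IEEE-754 doubles the answer cites): past the maximum, the value does not crash but silently degrades to infinity, and all arithmetic downstream goes wrong with it.

```python
import math
import sys

# Python floats are IEEE-754 doubles -- the same 1.79769e+308 limit.
print(sys.float_info.max)               # 1.7976931348623157e+308

overflowed = sys.float_info.max * 2
print(overflowed)                       # inf
print(overflowed - sys.float_info.max)  # still inf: arithmetic is broken
assert math.isinf(overflowed)
```

Whether a simulated universe would crash outright, saturate to infinity like this, or silently wrap around would itself be a detectable signature.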

Freedomjail
  • 1,608
  • 6
  • 12
  • If it was not the case, why would it choose discrited time Well.. because for example, there is no other way to do it? (as a possible reason in this setting) – Alma Do Oct 18 '18 at 15:08
@AlmaDo Even if we can't conceptualize it, a perfect simulation should not be based on a timestep. A timestep implies that any phenomenon faster than the timestep cannot be simulated correctly. If what is simulating us chose that solution, it means that, as you said, there was no other way. "No other way" means this thing is not omnipotent. If it's not omnipotent, then it has limits... – Freedomjail Oct 18 '18 at 20:19