Is there any scientific proof that a computer cannot simulate a more powerful computer than itself?
2 is greater than 1
Yes, it's called "not being retarded".
a chain is only as strong as its weakest link
there are some extremely unintuitive things out there user
en.wikipedia.org
"In terms ofcomputational complexity, a multi-tape universal Turing machine need only be slower bylogarithmicfactor compared to the machines it simulates.[2]"
Yes, but in the case of something that was made by man, nothing unintuitive should be present.
This has a trivial mathematical proof: the machine doing the simulation would just be slow as fuck, which defeats the purpose when you could build the faster machine instead
You can simulate a faster machine, just not in real time. There's an ARM emulator for the ATmega8 that runs Linux.
So say I have a computer with speed X: I can simulate a 2X machine, but running at X speed, and see what results I'd get by halving the simulated time?
Depends what you mean by "more powerful". Computational classes are a thing and can be mathematically defined, as can the corresponding properties of the machines needed to solve corresponding problems. If you're just talking about processor speed, then there's actually nothing stopping you, provided you have the necessary memory and time: with enough paper and ink you could theoretically do by hand anything any modern supercomputer can do, or you could run it on a Raspberry Pi with enough memory connected somehow, etc.
If your computer can do 50 calculations per second and you want to simulate a computer that does 100 calculations per second, then from the point of view of the simulation every real second will be worth 0.5 simulated seconds (this is completely ignoring the overhead of the simulation), and the simulation will see 1 second pass for every 2 real seconds. So basically, yes.
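Here's a minimal sketch of that arithmetic in Python (the numbers are hypothetical, and it ignores emulation overhead entirely):
[code]
# Sketch: a 50 op/s host simulating a machine rated at 100 op/s.
# Numbers are hypothetical; a real emulator also pays interpretation overhead.
HOST_OPS_PER_SEC = 50
SIM_OPS_PER_SEC = 100

def simulated_seconds(wall_seconds: float) -> float:
    """Simulated time that elapses during a given stretch of wall-clock time."""
    ops_executed = wall_seconds * HOST_OPS_PER_SEC
    return ops_executed / SIM_OPS_PER_SEC

print(simulated_seconds(1.0))  # 0.5 -> one real second is half a simulated second
print(simulated_seconds(2.0))  # 1.0 -> the simulation sees 1 second per 2 real seconds
[/code]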
Except that the scaling isn't linear. Two GPUs in SLI/CrossFire aren't twice as fast as one card.
But it can, only not in real time
GPUs are designed this way.
speed of a computer has little to do with its capabilities. those are mostly bottlenecked by RAM and disk space. those are the issues with supercomputing which cannot be overcome.
The real question is, if a slower machine is simulating a more complex machine at a slower cycle rate, does the complex simulation know it's going slower?
of course not, if it's a Turing machine
Only if it has a reference to real time (or, more generally, some source of timekeeping independent from its own clock rate).
>This is a trivial mathematical proof
Well...?
There is one case:
>a computer sims a computer weaker than itself in one random area
>with neither computer knowing how they differ, they fight it out somehow in some game, over and over until a "distribution" emerges
the researchers observing this distribution now have power over both, and get some kind of RNG out of it
Pic related was generally what I was getting at
if by "more powerful" you mean simply "can do X work in less real time", that just doesn't make sense
even with perfect efficiency, the simulated computer can only be at most as fast as the simulating machine
it's like arguments regarding free energy, you can only get out what you put in
A finite automaton cannot simulate a pushdown automaton, and a pushdown automaton cannot simulate a Turing machine.
Computational power isn't equal to muh GHz.
- T. Vague memories of computer science
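To make that concrete (a toy sketch, not the formal pumping-lemma proof): the language a^n b^n needs exactly one stack, which is the thing a finite automaton doesn't have.
[code]
# Sketch: recognizing a^n b^n with a pushdown automaton's stack.
# A finite automaton has only finitely many states, so for large enough n
# it must confuse two different counts of 'a' -- hence it can't do this.
def accepts_anbn(s: str) -> bool:
    stack = []
    i = 0
    # Push one marker per leading 'a'.
    while i < len(s) and s[i] == "a":
        stack.append("A")
        i += 1
    # Pop one marker per following 'b'.
    while i < len(s) and s[i] == "b":
        if not stack:
            return False
        stack.pop()
        i += 1
    # Accept iff the whole input was consumed and the counts matched.
    return i == len(s) and not stack

print(accepts_anbn("aaabbb"))  # True
print(accepts_anbn("aabbb"))   # False
[/code]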
Maybe if we figure out a universal theory of physics and end up with a formula that can calculate a universe from an input state + time offset, then we could
it's hard to tell what you mean by 'know it's going slower'
computer programs don't inherently do things based on real time, so unless the software is written to reference an external real-time clock source, it cannot know whether it's 'going slow' or not
and of course, in a simulated environment, one could just as well simulate a slow external clock
ever messed with video game hacks that slow down or speed up the gameplay? it's a similar thing: the game is programmed to do things with real time taken into account, and that time source can be intercepted and modified
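A toy version of that interception (hypothetical names, Python standing in for the real game code): route all of the program's timekeeping through a clock you control, and its world slows down without the program noticing.
[code]
import time

# Sketch: a program that only perceives time through a clock we provide.
# scale < 1.0 makes its world run slower than real time; from the inside
# nothing looks wrong, because every timestamp it sees is consistent.
class ScaledClock:
    def __init__(self, scale: float):
        self.scale = scale
        self.start = time.monotonic()

    def now(self) -> float:
        # Report scaled elapsed time instead of real elapsed time.
        return (time.monotonic() - self.start) * self.scale

def game_time(clock: ScaledClock) -> float:
    # The "game" bases all of its logic on clock.now(), never on real time.
    return clock.now()

clock = ScaledClock(scale=0.5)  # half-speed world
time.sleep(1.0)
print(game_time(clock))  # ~0.5 "in-game" seconds after 1 real second
[/code]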
Go ahead and try it. It's not the computer elves living in the CPU cores that are holding you back, you fucking idiot.
what the fuck do you even mean
see
No. Actually, our reality is being simulated by a PlayStation 2.
>if a slower machine is simulating a more complex machine at a slower cycle rate
this doesn't make sense, you seem to be implying that slower = less complex, which is not true
a machine need only be Turing complete to simulate any other machine, making "Turing complete" the gold standard for how complex a machine can get
outside of that, the only other limitations are memory and time
given enough memory and time, you can complete any operation (you must have enough memory for the operation in question; how much time it takes depends on how fast the machine is)
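As a toy illustration (a sketch; the transition-table format here is made up for the example): a few lines on the host are enough to run any Turing machine you can write down, just slower than native.
[code]
# Sketch: a host simulating an arbitrary Turing machine from its transition
# table. The "flip every bit" machine below is a stand-in; any table works.
def run_tm(table, tape, state="start", blank="_", max_steps=10_000):
    cells = dict(enumerate(tape))  # sparse tape: bounded only by host memory
    head = 0
    for _ in range(max_steps):
        if state == "halt":
            break
        symbol = cells.get(head, blank)
        write, move, state = table[(state, symbol)]
        cells[head] = write
        head += 1 if move == "R" else -1
    return "".join(cells[i] for i in sorted(cells)), state

# Toy machine: flip 0 <-> 1 left to right, halt on the first blank.
flip = {
    ("start", "0"): ("1", "R", "start"),
    ("start", "1"): ("0", "R", "start"),
    ("start", "_"): ("_", "R", "halt"),
}
print(run_tm(flip, "0110"))  # ('1001_', 'halt')
[/code]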
>is there any scientific proof that a computer can not simulate a more powerful computer than itself?
If you define speed as "the time it takes to execute some number of instructions", then probably no, but your statement isn't a mathematical statement, so there can be no proof of it.
I fucking knew it. Sony, you cunts!
>scientific
>more powerful
There are two concepts of power in computing:
speed and size.
You can simulate a faster computer in a slower one... It will just be... slower.
You can simulate a bigger size, sometimes, but not always.
That's about it.
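A sketch of the "size" half, assuming it's the guest's address space (not its live data) that outstrips the host: store only the cells the guest actually touches, and a small host can present a huge, mostly empty memory. A guest that genuinely fills that memory can't be simulated this way, hence the "sometimes, but not always".
[code]
# Sketch: presenting a guest with a 2^64-cell memory on a modest host
# by storing only the addresses that have actually been written.
class SparseMemory:
    def __init__(self, size=2**64, fill=0):
        self.size = size
        self.fill = fill
        self.cells = {}  # only touched addresses cost host memory

    def read(self, addr):
        assert 0 <= addr < self.size
        return self.cells.get(addr, self.fill)

    def write(self, addr, value):
        assert 0 <= addr < self.size
        self.cells[addr] = value

mem = SparseMemory()
mem.write(2**63, 42)    # an address far beyond any real host's RAM
print(mem.read(2**63))  # 42
print(len(mem.cells))   # 1 -> only one cell actually stored
[/code]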
You can, just not in real-time.
goo.gl/QCDsTS
just b urself
>the chad simulation vs the virgin emulator