What would it take to be able to convert any O(n!) algorithm to O(1)

Non-brainlet responses only, plz

You can change physics or whatever but only ever by 1 variable

Attached: 1_HwLR-DKk0lYNEMpkH475kg.png (1682x1106, 90K)

Other urls found in this thread:

pages.cs.wisc.edu/~vernon/cs367/notes/3.COMPLEXITY.html

Did you know..

... that doing your own homework is an O(n) task?

You want non-brainlet responses to your ultra turbo brainlet question?
Not gonna work

Brainlet responses

>What would it take to be able to convert any O(n!) algorithm to O(1)
Pre-calculate and store the results.
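A minimal sketch of that trick (the brute_force function and the input set are made up for illustration): pay the O(n!) cost once per input you care about, then answer every later query with an O(1) average-case dict lookup.

```python
from itertools import permutations

def brute_force(items):
    # O(n!) reference: enumerate every permutation, keep the smallest.
    return min(permutations(items))

# Pay the factorial cost once, up front, for each input you expect...
inputs = [(3, 1, 2), (2, 1), (1, 3, 2, 4)]
table = {x: brute_force(x) for x in inputs}

# ...then every later query is an O(1) average-case dict lookup.
assert table[(3, 1, 2)] == (1, 2, 3)
```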

So you get n! data, gl

>So you get n! data
Not necessarily, that would depend on what the algorithm does.

>You can change physics or whatever but only ever by 1 variable
okay, introduce the rabbit and ask him to do it. You also get to become a magical girl while doing so, so that's a win-win in my book.

Attached: 1550974233814.png (500x500, 77K)

A rabbit is like 300 trillion variables

You failed

Infinite memory is needed.

For a better approximation, have a look at dynamic programming. It tends towards O(1) amortized when left running for a while.

Infinite CPU power would do it too but that's the easy way out

>You can change physics or whatever but only ever by 1 variable
P=NP
There, now your storage is O(1)

Zeno's not being a paradox: infinite steps in finite time.

increasing cpu speed to infinity definitely does not make an O(n!) algorithm run in O(1), filtered brainlet.

Yes it does you absolute dumbass

Infinite CPU speed = instant result on any input

Attached: X7fQkrX.jpg (600x315, 38K)

Oh my god imagine being this retarded

Big O measures the upper bound of computational complexity. It has nothing to do with actual execution. Now back to your cagie dumb wagie.

Attached: 8EF94067-5455-44E8-AA10-87C90FD48B58.png (1457x1080, 813K)

Nice retort faggot

Having an infinitely fast computer is equivalent to saying every algorithm you can run on it is O(1)

Now go take your trolling and shove it in your ass

Attached: 1554517804011.jpg (1000x945, 195K)

the absolute state of this board
kill yourself

Attached: 1549216591988.jpg (480x451, 26K)

>Big O measures the upper bound of computational complexity. It has nothing to do with actual execution
Imagine being this fucking retarded.

If you have a theoretical computer that processes everything at infinite speed, then guess what happens, shit for brains. You process at infinite speed, therefore whatever you process completes instantaneously. Your algorithm which is O(n!) on a real computer runs at O(1) on the theoretical infinitely fast computer.

Do you guys even fucking CS

What a bunch of fucking dumbasses.

Some algorithms can't be improved past a minimum complexity limit.
It's nonsensical to believe that optimal O(n!) algorithms can somehow be tweaked into O(1)

The medium you're running your algorithm on doesn't change its complexity you turbo brainlet.

gb2 complexity 101, because reading your uneducated posts really hurts.

Nice retort again. What a faggy way to argue. Get off your boring ad hominem and provide an actual argument, moron

To clarify, so I can fill the void in your brain: if you have infinite CPU speed, that means you can complete 1 clock cycle in near-zero time, but that doesn't reduce the number of clock cycles you need to complete a computation. And it doesn't mean your algorithm can complete 1 step in 0 time, just that the time tends towards 0. If your input size tends towards infinity as your CPU speed does, you're still fucked, because your computation still requires O(n!) steps.

That wasn't even me, other people know you're retarded as well lmao.

this is me

>The medium you're running your algorithm on doesn't change its complexity you turbo brainlet.
IF I HAVE A COMPUTER THAT IS CAPABLE OF PROCESSING AT INFINITE SPEED, THEN O(N!) AND O(1) ARE FUCKING EQUIVALENT YOU TURBO FUCKING IDIOT

>infinite
>near zero time
You are a fucking moron

just stop posting, for real, this is embarrassing

computational complexity in big o merely states the relationship between the size of input and the number of "steps" taken to compute the result. a step can be defined as some elementary arithmetic operation, or a memory access, or just a general single function evaluation, depending on the context
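Since this definition is the actual crux of the thread, a tiny sketch may help (comparison count is the illustrative choice of "step"; the function is made up): the count depends only on the input and the algorithm, and would come out identical on any machine, however fast.

```python
def steps_linear_search(xs, target):
    # "Step" here means one comparison; the count is a property of
    # the input and the algorithm, not of the machine running it.
    steps = 0
    for x in xs:
        steps += 1
        if x == target:
            break
    return steps

assert steps_linear_search([5, 7, 9], 7) == 2
assert steps_linear_search(list(range(100)), 99) == 100
```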

THATS NOT THE POINT YOU ABSOLUTE CRETIN

That's not how it works retard.
Even if you could process your algorithm instantly that doesn't change the fact that your algorithm is O(n!) not O(1).
Saying they are equivalent is wrong because in the O(n!) case you have to complete n! steps whereas the O(1) one has to complete a constant amount of steps.

You need to go back to complexity 101 you massive empty nut

Meant to respond to

elaborate lmao. Are you confusing the fact that infinity is a concept and not a number?

The goal of computational complexity is to classify algorithms according to their performance. If you have an idealized computer then all of that becomes garbage and none of it is true anymore. What exactly is hard to understand about that?

When you're changing fundamental physics to answer a question, you can dismiss an entire field. If I turned off the strong force right now, guess what happens to particle physics.

Also we're talking runtime complexity here.

It changes the fact that O(n!) and O(1) can now be considered to be equivalent and in the same class. I don't know what you guys are smoking but this is how this shit works.

No it does not, as I just said imagine you have an infinite input size too. Now your argument is invalid.

In reality you would store every result for lookup later on. For example for a password cracker, to crack every alphanumeric password of length 1-N it would calculate all the results and store them in a fast lookup table.

The size would be huge though, terabytes. And the lookup time wouldn't be O(1), more like O(log N).

But that is still absolutely possible, hard drives are cheap.
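A toy sketch of the table described above, with a deliberately tiny alphabet so it actually finishes (a real table over all alphanumerics of length 1-N would be the terabytes mentioned). The sorted-list-plus-binary-search layout is what gives the O(log N) lookup; a hash map would give expected O(1) instead.

```python
import hashlib
from bisect import bisect_left
from itertools import product

ALPHABET = "abc123"  # toy alphabet; a real table would use all alphanumerics
MAX_LEN = 3

def h(pw):
    return hashlib.md5(pw.encode()).hexdigest()

# Precompute (hash, password) pairs for every candidate, sorted by hash.
table = sorted(
    (h("".join(chars)), "".join(chars))
    for n in range(1, MAX_LEN + 1)
    for chars in product(ALPHABET, repeat=n)
)
hashes = [row[0] for row in table]

def crack(digest):
    # O(log N) binary search over the sorted table.
    i = bisect_left(hashes, digest)
    if i < len(hashes) and hashes[i] == digest:
        return table[i][1]
    return None

assert crack(h("abc")) == "abc"
```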

ok brainlet, let me condense it into just one sentence:
computational complexity doesnt describe time but number of steps, therefore it is completely independent of how fast your machine is and how much time (even if none) a step takes.
got it?

even OP's picture says "number of operations", so there is no misunderstanding about what kind of complexity we are talking about

>It changes the fact that O(n!) and O(1) can now be considered to be equivalent
but it literally doesnt, the number of steps is the same
you still dont get it? wtf

Honestly processors are getting fast enough now that autistically obsessing over big o is becoming something you don't even need to do anymore.

Please look up the definition of runtime complexity. Pro-tip: it is not defined as wall-clock time, as your posts seem to hint you think it is. Even if it were, your argument would still be wrong.

Thats not how it works

Since ideal, infinitely fast processors don't exist, you are right. But if they did exist by some chance, then you would be absolutely wrong. Because the number of steps in an infinitely fast processor is always 1.

what if OP meant
>what it takes for O(n!) algorithm to run at O(1) speed?
I know, it's a retarded question, but this is OP we're talking about.

The OP is clearly retarded so there's no point responding to it.

>But if they would exist by some chance, then you would be absolutely wrong.
but thats not true, thats what ive been telling you the last 5 posts - even if CPUs with infinite processing power existed, all we wrote to you would still be right and you'd still be wrong
jesus fucking christ, just read the posts!

>Because the number of steps in an infinitely fast processor is always 1.
no, the number of steps would be the same, they would just take (near) 0 time.
if it did only one step, only one elementary operation (say addition) would be performed and nothing more

Currently, processors are about the fastest they are going to be in terms of CPU clock rate, because we have reached physical limits for what is possible at room temperature. Companies can make processors faster than what is in consumer devices, but those produce so much heat that you cannot cool them at the room temperatures where consumer devices have to function. With specialized cooling you can run a faster processor, but barring some huge breakthrough in heat dissipation technology, which is highly unlikely, we are at the limit. The problem is not that it's hard to make a faster processor; it's that we cannot cool a faster one at room temperature. So no, you're absolutely wrong, and you probably haven't ever dealt with any problem that wasn't a toy homework problem if you think computational complexity doesn't matter. There's a reason any programming interview you go to will test you on algorithms and data structures; it's literally the most important thing.

>infinite processor takes near 0 and not 0 time
LOL the mental gymnastics you must go through

I'm right, you're wrong. I don't care how many faggots here jump onto your bandwagon. You're all fucking wrong. An infinite processor processes anything at O(1), takes exactly 1 step to accomplish that, and runs in 0 (not near 0) time. Feel free to argue but you're only further demonstrating your absolute ignorance.

>doesn't know that there are multiple types of infinity
brainlet confirmed

Lol notice how theres like 4 people arguing against you and no one arguing with you?

>>infinite processor takes near 0 and not 0 time
>LOL the mental gymnastics you must go through

not only do you not understand complexity, you also don't understand the concept of infinity. it must truly be suffering to be you :^)

precalculate results and do table lookup

I completely understand why the FANGs do algorithm interviews now. Holy fuck the actual state of computer science today.

>You're all fucking wrong.
Sure buddy. Get back to computer science 1, maybe you'll learn the basics.
>An infinite processor processes anything at O(1)
That sentence makes no sense.
> takes exactly 1 step to accomplish that
It would then perform only 1 step. Algorithms usually consist of more than 1 step, even those that are O(1).
>Feel free to argue but you're only further demonstrating your absolute ignorance.
Cope. Your lack of fundamental knowledge got called out by pretty much everyone in this thread and instead of learning, you just start kicking around. Sad.

Attached: 1552236653762.jpg (312x359, 25K)

I honestly don't give a shit about O(n!) problems
I am however deadly scared of anything that has O(n^2), because holy shit, that blows up fast even on modest inputs

lmao
please record and upload any interview you do for a job in tech
That would really lighten up our days

isn't this the P vs NP problem

Low IQ fags

Assume that the algorithm takes total time t(n) seconds. Without loss of generality, also assume that we can calculate t(n) in O(1) time.
>You can change physics
Accelerate the time by the ratio of t(n). Now literally every algorithm takes 1 second (plus the time of computing t(n), which is also constant).
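Spelled out (T_wall is my notation for the observed wall-clock time under the rescaled physics, not the poster's):

```latex
% Change one variable: rescale the flow of time by the factor t(n).
% The observed wall-clock time becomes constant in n,
% while the number of executed steps is unchanged, still \Theta(n!).
T_{\mathrm{wall}}(n) \;=\; \frac{t(n)}{t(n)} \;=\; 1\ \text{second},
\qquad \text{steps}(n) = \Theta(n!)
```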

to be fair, there is at least one retard in every thread on Jow Forums

Attached: 1550466692663.png (710x577, 30K)

pick up a book and stop embarrassing yourself lmao

Attached: 1552389812318.jpg (493x449, 43K)

The only reason big O exists is that programmers were dealing with processors operating in the megahertz range and only around 1 MB of RAM, so they needed to squeeze as much performance out of their code as possible. Nowadays we have processors operating in the gigahertz range and several gigabytes of RAM. For your everyday ordinary program, you do not need to manically obsess over how long a particular part of the program runs when, to the end user, it's near instantaneous anyway. Big O is a relic of the early microprocessor age the same way punch cards were a relic of that same era.

Yeah, but that's hardly a challenge to prove
Prove that there's more than one retard in every thread on Jow Forums

infinite processor:
>O(1) and O(n!) algorithms finish in the same time (zero time)
>the number of steps taken in each of the algorithms as a function of input are still bounded by 1 and n!
is there really an argument beyond this

n=1

>For your every day ordinary program, you do not need to be manically obsessing over how long a particular part of the program is running
That's because someone actually smart already obsessed over it when designing compilers and libs, so a monkey like you can make 100MB helloworlds in Electron.

so what you are saying is you would not mind if I replaced all your sorting algorithms in the libraries of the programs you use with insertion sort?

Attached: boocat2.jpg (947x768, 144K)
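To put numbers on why nobody would accept that swap — a rough sketch, counting element moves as the "step" (the ~n^2/4 figure is the expected inversion count for random input, not a measured benchmark):

```python
import random

def insertion_sort(xs):
    xs = list(xs)
    steps = 0  # count element swaps as the "step"
    for i in range(1, len(xs)):
        j = i
        while j > 0 and xs[j - 1] > xs[j]:
            xs[j - 1], xs[j] = xs[j], xs[j - 1]
            j -= 1
            steps += 1
    return xs, steps

random.seed(0)
data = [random.random() for _ in range(500)]
ordered, steps = insertion_sort(data)
assert ordered == sorted(data)
# Expect roughly n^2/4 (about 62,500) swaps for n = 500, versus the
# ~n log2 n (about 4,500) comparisons a decent library sort gets by with.
```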

You're delusional.
The O complexity notation has nothing to do with how fast your computer is.
And it's still relevant nowadays in high-performance software engineering.

imagine being this retarded

the dumbo thinks that an infinitely fast processor will execute only 1 step for any algorithm, therefore transforming everything into O(1). i think he knows how retarded that is and is just grasping at straws

I could say the same to you, since you seem to be claiming that my algorithm will take a finite amount of time on a machine that processes anything instantaneously.

I realize you're all faggots affected by Dunning Kruger but whatever. You don't even seem to be able to think for yourself.

>hurr O(n!) is set in stone
O(n!) is just a complexity class. It's fundamentally no different than any other complexity class. It's a fucking curve on a graph, it represents how your algorithm scales with input size. When I say that O(n!) can be ran at O(1) on an ideal infinitely fast computer, what I'm saying is that with such a computer, O(n!) becomes meaningless because I've just figured a better way to do processing because I can process every O(n!) algorithm exactly as effectively as I would an O(1) algorithm. They become logically equivalent. So the distinction is no longer relevant. Now obviously since we're living in the real world, this doesn't hold.

You guys are horrible at thought experiments.

CPU speeds are improving just a few % per year at best, so that's completely irrelevant - while O(?) makes the difference between your algorithm taking 10 milliseconds and 2 minutes.

Attached: .png (586x557, 66K)

Is that an O(1) problem?

It is still important. Sure, your local Java Pajeet making CRUDs and frappuccino-drinking front-end dev don't need it, but when you have to deal with either real time constraints for high-performance computing or to work with big amount of data in high-load domains they are actually real. No one's obssessing over them, but you need to understand at least on the intuitive level how your solution scales. All modern database and filesystem implementations and OS schedulers for example, use complex data structures and algorithms to get better results. And that's just what you use daily.

this thread has proven that:
1) you dont know that O() describes the number of steps needed for a computation in relation to input size
2) that an infinitely fast CPU would still compute the exact same number of steps as a normal CPU, just infinitely faster

as i said - pick up a book and stop embarrassing yourself

In the real world, people who didn't fail high school calc understand that approaching infinity does not mean le ideal comic book magic supahcomputah. Point your input size at infinity and realize how infinitely retarded you are.

>Prove that there's more than one retard in every thread on Jow Forums
I can't. The opposite holds, actually. There exists a thread with just 1 retard.
1. There exists a thread without replies (see: at 14:44 PST).
2. By principle of W.T. Snacks, OP is a fag. It's trivial to see that he's also a retard.
3. It is a thread with only 1 retard.
4. Ergo, there exists a thread with only 1 retard.

>What would it take to be able to convert any O(n!) algorithm to O(1)
The algorithm to do it is probably O(n!)

turning ! into 1 is easy
just don't press the shift key

Number of steps on an ideal infinitely fast computer is always 1. It can't be more than 1 because that would imply it's not infinitely fast. You give it a task, it takes exactly 1 step to solve it and you get the output.

This effectively flattens all upper-bound complexity classes to O(1).

If you can't understand that, may I suggest you are a brainlet?

Do you understand that we're talking about a hypothetical and ideal computer here? It can't exist. Infinitely fast CPU doesn't approach shit, it's infinitely fast.

Nice.
>Number of steps on an ideal infinitely fast computer is always 1
That's where you are wrong. The number of steps is the same, or you wouldn't get the result of the algorithm. How could you compute a sorting algorithm with only a single "if" instruction being executed?

Attached: 1552826864237.gif (334x251, 976K)

second part after "nice" meant for

>i think he knows how retarded that is and is just grasping at straws
I don't think he realizes how retarded he is.

Define a step on an ideal computer that runs at an infinite speed.

Let me see you wiggle your way out of this one.

the same as a step on any other computer except it happens in no time?

>Do you understand that we're talking about a hypothetical and ideal computer here?
Yes. Do you understand the difference between mathematically correct abstract models untranslatable into the real world and free-floating fantasy completely decoupled from logic? Because you keep arguing about characteristics of the former while presuming features of the latter.

Attached: B19D8D85-801E-4B49-9E84-99A8A4DE5230.jpg (400x323, 51K)

Congratulations. You're on the first peak of the Dunning-Kruger graph.
Enjoy your stay!

>no time?

kek

OH WOW
L-fucking-MAO

And what do you call a step that runs in 0 time, you absolute comedy material in the shape of a person?

As I've been saying, you completely do not even understand what infinity is. eat shit.

the "step" is defined by you when you compute the O(), as i have already explained here you have just confirmed that you have never in your entire life ever made a computational analysis of an algorithm LMAO

you would select it the same way on a finite and an infinite speed CPU, because it is NOT dependent on CPU speed. Define it as a single elementary logical or arithmetical operation (+, -, if, and, ...) or a memory access (read, write) for all i care; it doesn't matter - the result will be the same in all cases - O() will be the same on a finite and an infinite CPU with the same algorithm and the same step "type".

The infinitely fast CPU still needs to do all the operations/steps, it just does them infinitely faster. God, you are stupid.

>the "step" is defined by you when you compute the O(),
you literally cannot make up a person more retarded than this. You can't just make definitions however you want when they are already established in a field.

Oh dear lord

It's effectively 1 step, ok? It can be an infinite amount of actual steps; it doesn't matter, because on an infinitely fast computer this effectively means 1 step. It doesn't fucking matter what your model of a step is with regard to classical computers; we're dealing with a hypothetical model here, so things like steps become a useless abstraction. Is that hard to understand for you or what? I don't fucking get it.

what is the definition of the step in computational analysis, you retarded clown? it depends on the problem and what you are interested in. jesus fucking christ, is everyone here a child?

read this to get some basics, son:
pages.cs.wisc.edu/~vernon/cs367/notes/3.COMPLEXITY.html

>It's effectively 1 step, ok?
but it isnt :-) thats the point

the number of operations performed by your infinitely fast CPU is still the same, the O() is still the same

"Effective something" and "something" are two different measures. What you're doing is amortizing cost, which is valid, but different from what people here are talking about.

Sorry, you are wrong
>You can't just make definitions however you want when they are already established in a field.
It's not a definition, it's a parameter of the analysis. Just like the algorithm itself.

Computational analysis is basically a process where the input is a tuple of "algorithm, elementary operation (what user calls step)" and output is a "O(something)"
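That tuple view can be made concrete: same algorithm, same input, but the reported count depends entirely on which elementary operation you pick as the step (a sketch; bubble sort chosen just because both counts are easy to read off):

```python
def bubble_sort_counts(xs):
    xs = list(xs)
    comparisons = swaps = 0
    for i in range(len(xs)):
        for j in range(len(xs) - 1 - i):
            comparisons += 1
            if xs[j] > xs[j + 1]:
                xs[j], xs[j + 1] = xs[j + 1], xs[j]
                swaps += 1
    return comparisons, swaps

# Same algorithm, same input, two different choices of "step":
c, s = bubble_sort_counts([1, 2, 3, 4])  # already sorted
assert (c, s) == (6, 0)  # comparisons are Theta(n^2), swaps are 0
```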

Divide by n!

What your problem is, is that you think theory is set in stone. Like I said in an earlier post, if I suddenly erased the strong force from existence, particle physics would fall apart immediately. It wouldn't make the slightest bit of sense, because everything depends on the strong force. Similarly, if I take complexity theory and introduce a computer that can do something clearly physically impossible - namely process at infinite speed - then the entire complexity theory becomes useless. You give me a hard problem that would take millennia on a classical computer, I put it into my magical device - poof - it behaves like O(1), gives me an instant result. You give me more problems like this; again, I get instantaneous results. This means that with regard to my magic machine, your complexity theory has completely fallen apart. Because if all I give you is my magic machine and a bunch of black-box algorithms, you won't be able to discern between O(log n), O(n!) and O(1) algorithms. They will act the same way. So all of your theory is suddenly invalidated.

Point is that we'll never have such a machine, but if we did, it would take exactly 1 step to solve anything, because the step is then defined by the physical act of inputting information and collecting it on the other end. It's no longer bound by any property of the algorithm.

t. Angular dev

No, I'm basically describing how an infinite computer invalidates the model of a classical step. You can't even argue against this because it's meaningless semantics at that point.

Please don't procreate.

>ad hominem instead of arguing your points, whatever they may be
Typical noob behavior