Linked list

Can someone tell me what the point of linked lists are?

I've been writing software professionally for almost seven years now and I've never written a single program that needed anything close to that data structure.

So is it just pointless bullshit to sell to comp sci students? Also, same question for binary trees.


B L O C K C H A I N

It was taught back in the C days before standardized dynamic containers; it's still taught more as a "here's how that shit works under the hood" thing

you only use the exact amount of space that you need
easy to change size
it's just convenient if you don't have a "list" type like in Python doing all the dynamic array magic under the hood.

do research; they're basically the heap version of a mutable array

>same question for binary trees.

you're a webdev aren't you?

Okay so as a person who mostly sticks to Python and Javascript that holds my hand for everything / has garbage collection I can ignore this?

I have an interview soon and I'll be pissed if they ask me about linked lists. The last time I read about them was probably in a C++ book

linked lists do not take up a contiguous part of memory, meaning that restructuring a linked list is a considerably less expensive operation, whereas with an array that's a really expensive operation since you gotta reallocate all that shit and copy it and ghghughhghhhhhhhhh

it might have mattered more back in the day, but the differences between the two are rather minuscule unless you are working with just huge bad boys of memory. linked lists are taught more to ease students into thinking about pointers and big O and stuff like that

if u dont know what the point of binary trees are then u r lost...

there is slightly more overhead to using a linked list since u gotta store a pointer along with the data as opposed to how in an array its just the data


So you're saying that I would use something resembling a linked list if I wanted to be able to support delete / new / insert because a regular array is contiguously allocated?

Man it's been too long since I thought about this shit

just retire grandpa, you're not cut out for this anymore

I guess I'll have to read up on algorithms and data structures more. I hope I'm not interviewed by someone who mumbles everything like last time...

see my post here if you dont know what this means this field isnt meant for you

b-but im only 27.

>tfw grads could code circles around me

Literally any data structure is just an extension of a linked list. Trees are linked lists. Heaps are linked lists. Queues and stacks can be made with static arrays but can also be built on top of linked lists. It is the precursor data structure.

I get what you mean. It makes sense if you have to manage your own memory but I don't see what the point of it is as an abstract data structure in something like JS with mutable arrays.

It's useful to know how it works behind the scenes, i.e. static allocation on the static, dynamic allocation, how pointers work etc. So thanks Jow Forums I was confused

Edit: stack ;__;

They may ask you just to check your overall knowledge in programming, since it's one of the basics, but I doubt it, since most interviewers assume everyone and their grandma knows how to implement one

How can you write software professionally and still not understand a fundamental data structure? plz tell me what you do in your job and how much you earn.

Are B-trees useful?

They are used in Unreal Tournament to represent a player's inventory. But this is because the UT devs are lazy fucks who reuse most of the code from the previous iterations instead of migrating to modern data structures.

What a shitty way of thought.

>Literally any data structure is just an extension of a linked list.
I wanted to object but then I realized LISP is linked lists and every language is a LISP in disguise.

Linked lists have a lot of use in operating systems because of O(1) insert/delete guarantee (unlike array-based counterparts which have O(1) amortized time for insert/delete on ends).

Yes.

worst case search time is O(log(n)), faster than hash table's worst case of O(n)

Arrays are the basic structure from which the universe is built.

>you only use the exact amount of space that you need
lolwut? do you even know how linked lists work?

Lisp also has vectors, not everything is a cons.

linked lists and their accessor functions don't use arrays though.
heaps can be implemented in an array without any linked list

>dev for 7 years
>Asks what's the purpose of binary trees

Lmao dude writing HTML doesn't make you a dev

No, you only need it for implementing Dijkstra and A* search but who cares about that shit

They are used heavily by database software

cons creates a pair and "lists" are just binary trees.

Sure, but why wouldn't you memorize it? I mean, it's one of the most basic data structures there is: each node has a value and a pointer to the next node. I don't understand how you can not memorize this and I have a shitty memory myself.

>t. beta incel

Memorizing the concept isn't the same as memorizing the implementation. It's like saying you know Calculus because you understand that it deals with curves, infinities, and infinitesimals.

Linked lists are useless, there is no real world situation where you would want to use them over an auto-resizing vector.

Basic containers have very different properties.
How long does it take to insert an element at the middle of an n long array? About n/2 since you have to shift the rest. How long does it take for a list? 1. Insertion anywhere takes the same, small time.

For binary trees, finding by key in a key-value system is the most obvious thing.
Consider an old paper dictionary. Which is faster, reading every word from the beginning, or opening at a random page and deciding if it's before or after?

Are you a webdev or something? They are the ones usually asking stupid shit like this, since shitty languages like js and php use hashmaps for arrays and variables

>O(1) insert/delete guarantee
This is the thing linked lists are useful for.
You can delete or insert immediately at any place in the list provided you have a reference to the appropriate node.

The confusing thing is that you need to find the node first, which can take time. So java's linked list for example is completely useless, since it hides the nodes from you. But if you can keep node pointers around in a different data structure then a linked list can be very useful.

wrong.
consider a queue of tasks where a task can be canceled at any time (and should be removed)
can't do it properly with a vector

Simply wrong. You can amortize the deletion and compaction to get the same run speed (which most implementations already do under the hood).

>consider a queue of tasks where a task can be canceled at any time (and should be removed)
>can't do it properly with a vector
I don't follow. Of course you can. Can you show C++ code of what you mean?

Consider one of the most well known usages: render lists.
You don't know how many things you will render, you need to insert elements really fast, you only ever need to iterate them, and the elements usually take up a lot of space.
A vector would be inefficient because of the high cost of copying all previous entries, which may or may not happen when adding a new element.

>never written a single program that needed a linked list
Probably because you've only programmed in brainlet languages like Python that implement all the linked list functionality for you. But things like arrays and whatnot in high level languages are implemented as linked lists under the hood, so you actually are using them even if you're not aware of it.

Jesus Christ, and you're a "professional software developer" too. If you're the caliber of programmer that can be found in professional software development, then I can definitely get a job like that, because I am way smarter than you.

You can amortize growing or shrinking, yes. But you can't cancel (eg. remove) an arbitrary element in O(1).

>Task addLast();
>cancel(Task task);
>Task pollFirst();

You can make a queue just fine with an array, but the cancelling causes a problem. At best you can add a "cancelled" member to the task, but then it still has to flow through the queue until it reaches the front. With a linked list it can be removed immediately

Linked lists don't need to be mutated to prepend to them.

>1 MB list
>make a copy, add A to one copy, B to another
>linked list, still 1 MB, just make two new pairs with the same tail.
>array, need to make a whole new array and copy it, takes 2 MB and extra milliseconds

linked lists are purely functional data structures so they're common in functional languages like Haskell.
>Real World
Nevermind

maybe you're retarded or you are a webdev that doesn't know what optimizing is? look at doom, they use linked lists everywhere, which is one reason it ran so well on a 386. for example, each sector has a linked list of all enemies inside it for fast calculations, and since they move around a lot it takes no time to move them in and out of other sectors' linked lists. retarded developers such as yourself are why we have a 5ghz 8 core cpu with 32gb of ram and the mouse cursor still lags.

> Anyone with actual technical skills must be an incel.
How's it feel knowing that you have no talents other than doing that which even a dumb animal is capable of?

You do realize that implementations of a lot of languages (php and js for example) use hashmaps, with linked lists for the elements in the same bucket, for variable names and function names, the very fundamentals of the language, right?

Under the hood, list implementations in high level languages use a large array. If this array gets full, the data gets copied to an exponentially larger array so it can store more. This removes the performance loss of inserting a new entry, while keeping all other benefits of an array. Because of this, linked lists will be worse in pretty much any use case.

Your example doesn't even make sense. You would need another data structure just to keep track of all the Task objects, or else why the fuck would you suddenly want to remove an arbitrary one? In such a case you can do away without the linked list completely.

No, the lists are arrays. You can look up the source code.

this


LL's let you insert and delete arbitrary nodes at any point in the list in constant time. This in turn means you can make things like LRU caches (with the aid of a hash table) with constant time operations (assuming no collisions, which can be guaranteed with eg universal hash functions). Granted, operations on arrays are generally faster for all but large lists due to caches exploiting locality, but if you're in eg a hard real-time environment you need those constant operations guarantees.

BSTs let you store elements in-order with all operations taking O(log n) time. This makes them convenient for tracking things like order statistics for lots of elements.

Your applications are probably plebby enough that it doesn't matter though.

I'm and what the fuck? I know for a fact that the original php implementation did use a hashmap to store the function names.
Also, storing a list in an array doesn't make the benefits go away, like O(1) removal and insertion. Actually, memory itself is kinda like a fucking huge array.

The other overhead of linked lists is pointer chasing, that’s expensive on its own and even more so if the destination node isn’t in cache.

It doesn't matter how the task pointers are kept. They could be held in the fields of other objects. Surely you can understand that an application might have more than one data structure in it.

He's also correct that it is common to combine linked lists and hash tables, especially when you need a hash table which maintains order (which is basically every scripting language)
An array won't cut it because you can't remove arbitrary keys in O(1) for the same damn reason.

I don't really see why it's so hard. Admittedly most people don't care about this shit because you can just brute force everything unless you're doing something performance critical

>tfw you can do x[-1] to access the last element in Python
thank you based Guido

That usually indicates a wrong usage case for them.
Random access performance in them is shit by design.
Lists are most useful in places where you know you will only ever need to iterate elements, but can benefit from O(1) insertion and removal.

Linked lists SUCK anyway because they are not cache friendly. realloc() is cheaper than constant pointer dereferencing.

The same runtime guarantees can be done with a self compacting and expanding vector. Linked lists are just simply not used in the real world.

It can be more cache friendly if you store the list in an array (with offsets instead of pointers), and you can benefit from its pointing property if you, for example, have variable-length data.
Say, for example, you have the following task:
Have a collection of variable-length strings.
Implement a way to add a new string.
Implement a way to, when receiving a string, remove all strings that contain that string.

The fastest and most memory efficient solution would be a linked list in an array

i never actually understood O(log(n))
i know what log does
enlighten me

If you know what log does what do you not understand? Do you need an example?

Memory allocators usually work with linked lists fyi

it would help

Are you sure you understand big O notation?

What I usually do is visualize the function, in this case log(n), as a line on a graph. After a certain size, log(n) is going to always be lower than the function n. Hence, at that size, an algorithm that is O(log(n)) is going to be faster than an algorithm with O(n) runtime.


n is a representation of the size of the input, the result being complexity (e.g. time or memory usage). O(n) grows linearly, i.e. the complexity to process an input is proportional to the size of the input. O(log(n)) grows in a logarithmic manner, so the complexity grows more slowly than a linear function as the input grows, since log(n) < n for all positive n. O(n!) means the function grows ridiculously complex as the size increases, even faster than e^n. Calculating the O of a function basically boils down to determining how fast the number of branches, recursions, and iterations increases as n increases.

i get that len(n) is less complex than n, but what kind of program would cause an O(log(n)) complexity? i get how looping over all n once is O(n) etc, but what would cause O(log(n))?

shit, not len
*log

finding someone in a phone book by looking through the entire phone book from start to finish is O(n). starting half way, determining if their last name is before or after that, picking the middle point of that, determining if their name is before or after that, and then repeating until you find their name is O(log(n))

not to be Cnile but legit user learn some basic C to start off with, it'll give you a wayyyy better understanding of things and there's no harm in learning it. not saying you have to use it daily or anything, but it's kinda the lingua franca of programming and if you can actually understand how the tools work at a lower level that's pretty valuable knowledge imo. of course if you're just in it for the money your employer probably won't give a shit but it'll make your life easier in the long run

linked lists aren't taught for practical use. although they may be used in real software, the implementation of data structures like linked lists have generally already been written. you just learn it to understand how it works, similar to how you're taught to do math without a calculator before actually doing it with a calculator

if you understand the concept and know a programming language you don't need to memorize the implementation

Why would you avoid learning about it? Sure it's not super useful but it's still something to keep in mind. And if you're a beginner programmer, it's still an easy data structure to comprehend, which makes a decent introduction to data structures in general.

also holy fucking shit fuck this captcha why do you keep giving me the ones that load slowly

Xor linked lists are useful in some cases. You can reverse and move parts of the list in O(1) time. I wrote a small implementation of xor linked lists because no one actually implemented the above mentioned features despite them being possible.

>The other overhead of linked lists is pointer chasing
Allocate a contiguous block of memory and manage a free list (which incidentally is itself a linked list).

cos your language has all this shit built in


Understanding linked lists is fundamental to computer science. Fully understanding and implementing linked lists is absolutely required to proceed to higher level courses. It's like learning a+bx+cx^2=y manipulation. You'll probably never use it in the real world, but if you can't figure out what it means or how it works then you're not ready for Calculus and linear algebra.

they're ideal when you're going to hammer the shit out of random inserts/removals and don't care as much about sequential access

only time they're really valuable is in cases where a large-ish array/list would be getting reallocated/reordered all the time - you can easily insert things at the correct point instead of having to re-sort an array constantly and removing them is just as cheap

not as big a deal as it used to be in the slower single-core days (when caches were smaller too) but it was a handy thing for 3d graphics and such where you might have a large list of objects to render that was constantly being modified between render passes

trees and graphs grew out of it too by adding more pointers than the traditional "next item"

first time i remember seeing them in 3d code was in the fucking quake map editor or something

sorting and reallocating vectors isn't as expensive as it used to be, though a linked list will still shit on them when performance matters (almost never, considering the absolute state of "development" these days)

Probably you think that linked lists are useless because you have never felt the need for a list structure that could grow dynamically. Binary trees, if balanced, are useful if you want to look for a number in log(n).

This is not true; it only holds if they are balanced, like red/black trees

Basically you assume that the log is in base 2 (as everything with fucking computers) hence you are dividing the problems recursively into two subproblems

It helps you to know what's really going on under the hood. It's like driving: you can be a driver who only knows about sticking the key in and putting it into D, but a great driver knows what's happening in the engine and the rest of the drivetrain.

Actually no, even with python: if you have an array of 8gb and you have 14gb of system memory, appending an element to that array could create another array of 8gb, whereas a linked list would allow you to increase its size without having your process terminated by the kernel. That makes a big difference imho.

Can be implemented in creative ways to save space on embedded systems. E.g. if you need a list that will cap at 256 elements or less, then you can implement a linked list using an array, with bytes storing indexes to the previous/next node. This means each node takes 2 bytes + sizeof(data stored) instead of 2*sizeof(ptr) + sizeof(data). Same idea for other int types that are smaller than the size of a pointer on the machine.

unroll your linked list so each node is an array the size of a cache line

I need to import a 3d mesh into my program.
The 3d mesh is a combination of vertex (3 floats) data, texel co-ordinate (2 floats) data and indices, where indices repeat the ID of each non_unique vertex/texel pair.
I need to store ONLY the vertices/texel pairs that are unique and reference them with the indices buffer (e.g. if during importing, 7 is identical to 2, then the indices buffer records position 2 and the vertices buffer skips recording the vertex since it knows an identical match exists in position 2).
I need to generate a hash for each vertex/texel pair so I can uniquely identify them BUT I can't create an array of size that matches every single hash since that could potentially be 500,000*500,000< despite only needing to a maximum of

dive and conquer algorithms, e.g. binary search
it's log base 2 btw

*divide

heap allocation is basically just linked lists

user, logarithms are like exponentiation but with division. It is repeated division just like exponents are repeated multiplication. So to cause a logarithmic runtime you incorporate a means of repeatedly breaking up the problem.

are you retarded or just pretending?
if you need to store 5000 elements, you only use 5000 elements' worth of space
if you then need to add one more element, you only add one more element's worth of space
kill yourself

Linked lists work best when you don't know how many items you'll need to store, or if the items need to be reordered quickly.

'Arrays' in high level languages are implemented using linked lists or other pointer based structures, so you're probably using them already and don't know it.

You don't need to know how to implement them day-to-day but you do occasionally and you'll be limiting your programming capabilities if you don't understand them.

Resizing real arrays with something like realloc will either mean more memory is used (if you make the array too big) or that *very* expensive copying of the array contents will need to be carried out.

Also, linked lists don't necessarily cause a problem for cache coherence, as either way it's the algorithm used that is either cache coherent or not. Plus the copying of data for resizing will cancel out the advantages of static arrays.

It's handy for huge datasets that can't be in RAM (embedded device or normal PC) so you load the "meta" data in your overview and the changes (move, delete, add etc.) are easily written to disk.

Is varchar a linked list?


You have no idea how linked lists work.
They need POINTERS as well. You know, to keep track of the previous and next element.
Go fuck yourself you stupid kid.

Java hashmap internally uses linked lists and trees.

He has a point. If you don't improve your skillz you'll be discarded eventually.

It's to weed out brainlet compsci majors

> (binary) trees are pointless bullshit
Webshit detected. Yes, they will be pointless for you. Just remember that relational dbs are faster if you use a magical thing called indexes where you need them.

As for a linked list, they are useless for you as well on any modern architecture, as pointers will lead to cache misses, and cache misses will lead to the dark side. Just remember that linux systems (among others) scale well thanks to their magical thing called kernel.

Historically, yes, but since modern CPUs have predictive fetching they're incredibly fast for doing linear traversals of arrays, whereas the access pattern of linked lists is typically unpredictable.

Frequently, it's actually faster to just move stuff to a larger array on modern CPUs than it is to bother with a linked list. It's a bit of an anachronism.

You can improve traversal speed significantly by unrolling your lists and prefetching where possible.