He starts his array counter at 0

>he starts his array counter at 0

Yeah, it makes sense

C was a fucking mistake

Attached: Screenshot_2018-08-02 C's Biggest Mistake.png (1294x3510, 866K)

kys Lua brainlet

It makes sense in languages like C, where you directly manipulate memory. In scripting languages it makes a lot more sense for arrays to start at 1: the FIRST element is array[1], and the last element is array[array.len], which is a hell of a lot more intuitive.
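
To make the C side of that concrete, a minimal sketch (toy array, nothing from the thread): arr[i] literally means *(arr + i), so the index is an offset from the base address, the first element sits at offset 0, and the last one at len - 1.

#include <stdio.h>

int main(void) {
    int arr[3] = {10, 20, 30};
    /* in C, arr[i] is defined as *(arr + i): the index is an offset in elements */
    printf("%d %d\n", arr[0], *(arr + 0));  /* 10 10: first element, offset 0 from the base */
    printf("%d %d\n", arr[2], *(arr + 2));  /* 30 30: last element, offset len - 1 */
    return 0;
}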

t. retard who doesn't understand data validation

You are in over your head here, user; go shitpost somewhere else.

Zero indexing makes sense; think of it as math, f(0)

>tfw Matlab user

>2018
>still uses array
oh god, what shit-tier programming language is that?

>Adalets start their arrays at 1 since they think it's more readable

this. fucking brainlet normies are infesting Jow Forums.

>he's a programmer
Virgin faggot

this

>think of it as math, f(0)
All matrices are indexed starting at 1, so this is a non-argument; I would start with f(1).

So you are the kind of person who must spend 3 days to take 3 pills (one per day), rather than 2.

The first pill on the first (1) day, the second on the second (2) day, ...

t. retard who doesn't understand what C was made for

I'm sure interfacing with hardware would be really great with all these mandatory artificial wrappers around arrays and strings
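
And the bare-pointer model is exactly what keeps that painless; a minimal sketch (the buffer and all names here are hypothetical, with a plain array standing in for a memory-mapped one):

#include <stdio.h>

/* stand-in for a device receive buffer; on real hardware this would be a fixed,
   memory-mapped address whose size comes from the datasheet, not from the type */
static unsigned char rx_buf[8] = {0x41, 0x42, 0x43};

int main(void) {
    volatile unsigned char *rx = rx_buf; /* bare pointer: no mandatory length wrapper */
    size_t n = 3;                        /* the length lives wherever the hardware/protocol puts it */
    for (size_t i = 0; i < n; i++)
        printf("%02x ", (unsigned)rx[i]);
    printf("\n");
    return 0;
}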

>The first pill on the first (1) day
I am pretty sure you are fucking with me now, but for safety reasons:
How about taking the first now?

>How about taking the first now?
Then this is the FIRST day, so 1.
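
For what it's worth, you two are circling the same fencepost: pills taken now, tomorrow, and the day after are pills number 1, 2, 3 (a count, which naturally starts at 1), but they land 0, 1, and 2 days from now (an offset, which naturally starts at 0). Three pills, two days elapsed. Indices are offsets; sizes are counts.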

It made (and still makes) sense for a simple language with performance concerns like C to have its primitive structures be as simple as possible.
They really didn't have much reason to specify a fat pointer (as he puts it) back then, because the specific application's needs determine the size of the size field and where it lives.

They just opted for a simpler model which allows you to specify your array operations yourself.

Now you'd almost certainly want a size somewhere. But the number of bits for the size, or even whether it should be there at all, was not an obvious choice. So it certainly wasn't a mistake.

What was a mistake was C++ deciding that it was still appropriate when that language was standardized, which is just one more of its ridiculous mistakes. But note how Walter blames C primarily and not C++, as if C++ inheriting this were somehow C's fault. I guess he can't blame his friends.

He doesn't even understand (or, more likely, intentionally misrepresents) the attitude towards standard libraries back then.
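
For reference, the "fat pointer" he's talking about is nothing more than a pointer bundled with a length; a minimal sketch (struct and field names are mine, not from his article):

#include <stddef.h>

/* hypothetical fat pointer / slice: a pointer plus the number of elements it covers */
struct slice {
    char  *ptr;
    size_t len;
};

/* bounds can be checked because the length travels with the pointer */
static int slice_get(struct slice s, size_t i, char *out) {
    if (i >= s.len)
        return -1;  /* out of range */
    *out = s.ptr[i];
    return 0;
}

C instead leaves the length wherever you want it (a separate variable, a terminator, a field in your own struct), which is exactly the "specify your array operations yourself" model described above.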

C wasn't made for what you think it was

#define ] +1]

>#define ] +1]
#define ] -1]
motard
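
Neither of those is legal C, by the way: a macro name has to be an identifier, so you can't #define ]. If you actually want the joke to compile, something like this cursed sketch (not a recommendation) gives you 1-based access:

#define AT(a, i) ((a)[(i) - 1])  /* 1-based: AT(arr, 1) is arr[0], AT(arr, n) is the last element */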

>Using Matlab
I am so sorry.