Why is there no easy way to get a string input from a user in C?

Is C not as based as Jow Forums told me? Am I missing something obvious?

Attached: c.png (1200x1276, 74K)

getchar()

fgets and scanf are easy, I thought

Reading user input is bloat. Just let the user change some variable and compile the program anew :^)

#include <stdio.h>
#include <stdlib.h>

int main(void)
{
    char *line = NULL;
    size_t len = 0;
    ssize_t nread;

    while ((nread = getline(&line, &len, stdin)) > 0) {
        // Do thing
    }

    free(line);
}
What's so difficult about that?

C is a low level language. Just read chars into a dynamically allocated array and resize accordingly.

This... Apple is successful because they made choices for the user. Gives a smooth experience.

>scanf
Guaranteed buffer overflow.

>fgets
Count 15 lines if you want to do it right.
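For reference, roughly what "doing it right" with fgets involves: check the return value, strip the newline, and drain over-long lines so the leftovers don't bleed into the next read (a sketch; read_line is a made-up name):

```c
#include <stdio.h>
#include <string.h>

/* Read one line into buf; returns 0 on success, -1 on EOF/error.
 * Drains the rest of an over-long line so it can't pollute the
 * next read. */
int read_line(char *buf, size_t size, FILE *stream)
{
    if (!fgets(buf, (int)size, stream))
        return -1;                      /* EOF or read error */

    char *nl = strchr(buf, '\n');
    if (nl) {
        *nl = '\0';                     /* strip the newline */
    } else {
        int c;                          /* line was longer than buf: */
        while ((c = fgetc(stream)) != '\n' && c != EOF)
            ;                           /* discard the excess */
    }
    return 0;
}
```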

C is obsolete. Rust welcomes 3-digit-IQ people with open arms.

>Guaranteed buffer overflow.

It seems you're the one with less than 3-digit-IQ.

scanf being a buffer overflow magnet is an extremely well-known fact. Just because you only test your programs with one input of a length-2 string doesn't mean it's safe.

>what are length specifiers
retard

>leaving the buffer nonempty
Have fun fucking up subsequent user inputs.

What kind of mental gymnastics did you just perform? Do you even know C?

>scanf
>Guaranteed buffer overflow.

>what is %123s

inb4 "but it has to be hardcoded", just make the format string using sprintf with %%%d
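That trick, spelled out (a sketch; make_scanf_fmt is a made-up helper):

```c
#include <stdio.h>

/* Build a bounded conversion like "%10s" at runtime:
 * in the format "%%%ds", "%%" emits a literal '%' and
 * "%d" emits the width, so snprintf produces "%10s". */
char *make_scanf_fmt(char *fmt, size_t cap, int width)
{
    snprintf(fmt, cap, "%%%ds", width);
    return fmt;
}

/* usage:
 *   char fmt[16], buf[11];
 *   scanf(make_scanf_fmt(fmt, sizeof fmt, 10), buf);
 */
```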

I don't know shit about C but aren't there libraries that implement safe alternatives to things like scanf? Why not use that? They should even take programming errors out of the equation by taking care of shit like

He's wrong about scanf being inherently unsafe, but it really is a pretty fucking shit function most of the time.
Using getline (which unfortunately is a POSIX extension) + sscanf or some better string parser is a much better idea.

For scanf it's a width, %123s (%.123s with a precision is printf syntax). Or (again with the POSIX extension) you'd use the much more sane
char *s = NULL;
if (scanf("%ms", &s) == 1) {
    /* use s, then */
    free(s);
}
Which malloc's the buffer for you, and has no hardcoded length limit.

>Do you even know C?
More than you it seems.
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    char name[4];
    int age;
    printf("Name: ");
    scanf("%3s", name);
    printf("Age: ");
    scanf("%d", &age);
    printf("\n%s, %d\n", name, age);
    return EXIT_SUCCESS;
}

$ ./a.out
Name: Timmy
Age:
Tim, 2082543632

That's your fault for not checking errors, you moron.

It's one of the minor issues of the language.

I would like to switch to D for that reason, but D is not available for microcontrollers, at least not in a non-hacky way.

scanf isn't gonna return any error. Now I know who knows C better here.

what are you smoking OP
just fgets() that shit into a buffer

fread or fgets

Yes it does, fuckface. It returns the number of format specifiers it successfully matched.
If it returns fewer than you were expecting, there was an error or only a partial match.

It's entirely your fault for not checking that it actually returns 1.

you are 3 digits: 70.1 IQ

>If it returns less than you were expecting
Good luck measuring the size of a user input before storing it.

>it actually returns 1
It doesn't.

That's 70.1 more than you sweetie.

scanf("%10s%*s", buf);
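Spelled out, that one-liner behaves like this (a sketch; take10 is a made-up wrapper, and note that %10s needs an 11-byte buffer):

```c
#include <stdio.h>

/* Take at most 10 chars of the first token in src into dst
 * (which must hold 11 bytes). The assignment-suppressed "%*s"
 * then reads and discards the next whitespace-delimited token:
 * the tail of an over-long word, or the user's next word.
 * Returns 1 if a token was read. */
int take10(const char *src, char dst[11])
{
    return sscanf(src, "%10s%*s", dst);
}
```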

Yes, you are.
#include <stdio.h>

int main(void) {
    char buf[8];
    if (fgets(buf, sizeof buf, stdin)) {
        puts("Successfully read");
        return 0;
    }
    perror("fgets() failed");
    return 1;
}

It's the format specifiers, not the number of characters read or length of string, you fucking moron.
#include <stdio.h>

int main(void)
{
    int a, b;
    if (scanf("%d %d", &a, &b) != 2)
        printf("Invalid input, fuckface\n");
    else
        printf("You entered %d, %d\n", a, b);
}
$ ./a.out
1 2
You entered 1, 2
$ ./a.out
a 1
Invalid input, fuckface
$ ./a.out
1
Invalid input, fuckface

>re-prompting just to discard input if the user types something smaller than 10 characters

So you're admitting there wasn't any error?

#include <stdio.h>

void clear_stdin(void) {
    int c;
    do {
        c = getchar();
    } while (c != '\n' && c != EOF);
}

int main(void) {
    char name[4];
    unsigned age;

    printf("Name: ");
    scanf("%3s", name);
    clear_stdin();

    printf("Age: ");
    scanf("%u", &age);
    clear_stdin();

    printf("%s, %u\n", name, age);
}

Good question.
Lets say I make an array of chars
char string[5] = {'H', 'e', 'l', 'l', 'o', '!'};

If I want to print this to stdout I am supposed to use printf
//printf(const char *format, ..)
printf(string);

Notice that it takes a pointer to chars. In other words, all information about the size of the string is lost. How does printf know when to stop outputting chars? Supposedly C advocates zero terminated strings. In other words, if I modify string to
char string[5] = {'H', 'e', 'l', 'l', '\0', '!'};

it should only print "Hell". Indeed, on my computer it does.

But also on my computer it doesn't cause a segmentation fault when I print the original string. It just prints "Hello!". Does that mean all char arrays are laid out in memory with a padded zero byte at the end?

Tbh I've found myself using a shit ton of ways to get a string in C
But the most not-retarded way is to read from stdin or use getline, which does a realloc inside itself if you exceed the allocated memory
I heard it's unsafe though, but can't confirm
I've never found any memory leak so far

>Does that mean all char arrays are laid out in memory with a padded zero byte at the end?
The compiler will, because of the architecture you're running on, align all data segments to a multiple of 4 or 8 bytes. So the remaining two bytes happen to be initialized to 0 by your compiler and you got lucky that it didn't do something different.
I'm pretty sure what you're doing is undefined behaviour.

>implying this shouldn't be a standard function
Cniles will never cease to amaze me.

#include <stdio.h>
#include <stdlib.h>

#define LINE_INC_SIZE 16

int
cgetline(char **restrict lineptr, size_t *restrict linecap, FILE *restrict stream)
{
    size_t bytes_read = 0;
    int next_char;

    if (*lineptr == NULL) {
        if ((*lineptr = malloc(LINE_INC_SIZE)) == NULL)
            return -1;
        *linecap = LINE_INC_SIZE;
    }

    while ((next_char = fgetc(stream)) != EOF && next_char != '\n') {
        if (bytes_read + 1 >= *linecap) {
            char *tmp = realloc(*lineptr, *linecap + LINE_INC_SIZE);
            if (tmp == NULL)
                return -1;
            *lineptr = tmp;
            *linecap += LINE_INC_SIZE;
        }

        (*lineptr)[bytes_read++] = next_char;
    }

    (*lineptr)[bytes_read] = 0;
    return bytes_read == 0 && next_char == EOF ? -1 : 0;
}

int
main(void)
{
    char *line = NULL;
    size_t size = 0;

    while (cgetline(&line, &size, stdin) != -1)
        printf("%s\n", line);

    free(line);
}


perfectly easy.

>I heard it's unsafe tho,but can't confirm
Nope, just not portable. It's safe.

There is no sane and safe way to handle strings in C. If you really must use them, just use C++ for string operations, or better just pick any modern language.

>Supposedly C advocates zero terminated strings.
It MANDATES them. A string only ends when \0 is encountered, omitting the null terminator is a quick trip to UB.
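Concretely (a sketch; the array names are made up):

```c
#include <string.h>

/* 5 chars + '\0': a real string */
const char ok[6] = "Hello";

/* no '\0': NOT a string — passing it to strlen or printf("%s")
 * would read past the end of the array: undefined behaviour */
const char not_a_string[5] = {'H', 'e', 'l', 'l', 'o'};

/* strlen stops at the terminator, so this is well-defined */
size_t ok_len(void) { return strlen(ok); }

/* indexing within bounds is fine; it's only string functions
 * that need the terminator */
char last_char(void) { return not_a_string[4]; }
```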

getc()

or fscanf/scanf for a string

safe alternative, could be safer

Features make your language slow. If they added sane string handling then the language would become slow like C++.

>Features make your language slow
I don't think that's always true, consider `restrict`. That's a feature.

Having better unicode support (than what C11 provides) and facilities for common string operations in a standard library don't make a language slow. In fact, the more operations with standardized semantics, the more optimizations compilers can make.

I don’t think you understand. C is perfect and any changes would be a regression in performance and make the language more like those trannies who worry about immaterial things and make baseless claims driven by political motives instead of pure reason (like myself.)

dude scanf lmao

It's part of any basic C library.

true, but if you know the string length in advance, you can use a char array without a null byte, and use the bounded n function variants (strncpy, etc.)
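One caveat with that approach: strncpy writes no terminator when the source has as many chars as the destination holds, so you still have to terminate by hand if you ever want a string back. A sketch (copy_str is a made-up helper):

```c
#include <string.h>

/* Copy src into dst[cap] the careful way: strncpy writes NO
 * terminator when src has >= cap chars, so terminate manually,
 * truncating to cap - 1 chars. */
char *copy_str(char *dst, size_t cap, const char *src)
{
    strncpy(dst, src, cap);
    dst[cap - 1] = '\0';
    return dst;
}
```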

>slow like C++
c++ is faster than c you tard

>char string[5] = {'H', 'e', 'l', 'l', 'o', '!'};
The initialization list provides more values than there are objects, so this is illegal, even if your compiler tolerates it.

All C strings are ended in a null byte, so calling printf like this results in undefined behavior.

>Does that mean all char arrays are laid out in memory with a padded zero byte at the end?
No. All string literals are transformed into an array with static storage duration and a null byte is added. When a string literal initializes a char array, the null byte is added only if there is space left for it in that array. But that only covers literals; the compiler doesn't need to create extra space for you.
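That rule in miniature (the sizes are what the standard guarantees):

```c
/* size 3: the literal's '\0' fits, so the compiler adds it */
char a[] = "Hi";

/* size 3: "Hi!" exactly fills the array, so NO '\0' is added —
 * legal C (illegal in C++), but the result is not a string */
char b[3] = "Hi!";

/* C11 compile-time checks of the sizes */
_Static_assert(sizeof a == 3, "literal plus terminator");
_Static_assert(sizeof b == 3, "no terminator added");
```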

True, but then you're not working with a string any longer. In C, every string ends with a null character.

Interesting, do you have a benchmark demonstrating this?

Anything that uses static data.

Algorithms are bloat, only slow garbage like electron and cpp needs algorithms to make up for their slowness.

lol