suckless st shills btfo
You DO use a GPU accelerated terminal, right?
>using a display server
Disgusting
>when your program is slowed down by conhost.exe ramming an entire cpu core
Is the visual performance of your terminal so important that it needs to be GPU accelerated? It is a fucking terminal.
>the application you use requires GPU to display white text on a black screen
you are truly a retard
>.exe
Don’t embarrass yourself, Windows user
alacritty > kitty
you can't change my mind
>i don't do anything important with my terminal so i don't notice how big text outputs can make the cpu usage skyrocket
kys
The windows console is more likely to be faster (it doesn't have transparency afaik), but it's impossible to tell for sure without benchmarking every single linux terminal emulator and every version of windows.
>using pajeetware
If you're outputting large enough chunks of text to your terminal that your CPU usage for rendering all of it becomes a problem, then the issue is that you're using the terminal in a stupid way. You don't solve that by moving the rendering to the GPU, as there's no real-world reason to make the terminal render megabytes of text per second.
Why in fuck would I need this? Is the CPU no longer powerful enough to render a few thousand characters of text with all the font rendering frameworks that people like to stack up these days?
Welcome to 2019. The next step is to rewrite the shell in JavaScript or some stupid shit like that.
OP just won the most retarded thread of the year award!
congratulate him!
>muh terminal speed nigga
Go back to your video games
based
Dude, you can talk to me when you're getting 120 FPS in netstat.
>not writing unity in vim
never gonna make it
> only 120fps on a 144hz screen
hows the screen tearing treating ya?
my threadripper renders 144fps on the console so i have no screen tearing without wasting a single cycle on vsync. go back to 60hz, cpulet
Why not utilize the hardware that's designed for rendering?
Meanwhile I'm here using urxvt. It's tiny, opens instantly even on an old machine, and is very fast.
Because it isn't a fucking 3D scene.
>the only graphics are 3D
user...
Kek, love the flex
Simple text has traditionally been rendered by the CPU.
Because the added complexity to use that hardware in a terminal isn't a good trade-off. These things don't come for free. There's a cost associated with creating the software for rendering things through the GPU and then a recurring cost for maintaining that.
1) it's not normally done that way
and
2) it's more complicated
are non-answers.
How about "it adds complexity, is more bug-prone, and has zero benefits"?
The answer is that the added complexity isn't a good trade-off when a terminal has good enough performance with a CPU. Added complexity is always measured by what advantage it brings, and in the case of GPU-accelerated terminals that advantage is almost null.
How is gpu acceleration going to help make terminal drawing faster when terminal drawing is done over a pseudo terminal/stdout one control character at a time? Along with whatever other information is being shot out.
I don't even think you can apply basic vectorization to it because it's so serially limited to the primitive I/O of what is basically a 38400-baud serial port simulation.
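For what it's worth, the pty itself isn't the slow part, and you can check that with script(1), which runs a command inside a freshly allocated pty (the util-linux flags: -q quiet, -e propagate exit status, -f flush, -c command; this is a sketch, not a rigorous benchmark):

```shell
# Push the same million lines once through a plain redirect and once
# through a pty; the difference is the pty layer's overhead, with no
# rendering involved in either case.
time seq 1 1000000 > /dev/null                            # no pty involved
time script -qefc 'seq 1 1000000' /dev/null > /dev/null   # same bytes via a pty
```

The rendering cost lives in the emulator on the other end of the pty, which is the only place GPU acceleration could even apply.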
None of that is innate and depends heavily on things like the language and libraries used to write it.
If you have to use the words "good enough" you have your answer to "why bother".
I unironically use one because I like how it looks.
Why would you want to code in an ugly dialogue window?
This. Also
>python in a "performant" piece of software
>alacritty < st
The human eye can't see more than 24 megabytes per second so a gpu accelerated terminal is not necessary
>st
too lazy to configure
>everything else
almost everything is based on vte or other shit, and for some reason, the borders become transparent when using compton
alacritty is a comfy middle ground imo
i use termite on wayland (︶^︶)
what the fuck even is a gpu accelerated terminal?
You pretty much need a meme card to make this even viable, right?
I fucking hate the dev behind kitty, literally indian poettering
It's just that kitty is the ONLY wayland compatible terminal that supports image previews, if anyone can find something else that can do this i'll switch asap
Did you just put random words into a sentence?
underrated
Powershell is the best terminal in existence.
not to be confused with KiTTY, the fork of PuTTY
No, that's dumb. Go rewrite a unix command in node.js like the rest of your kind.
tried that out in another shill thread, and i got;
- 3x ram usage over lxterminal (and that's multiple lxterminal windows vs. one kitty window)
- more cpu usage than lxterminal (idling with ~1.2% cpu usage, as opposed to 0.0%)
yes, it renders faster, much faster, doing "ls -R /" in 700ms instead of close to 5s
but i'm not on a 286, and i can't read a million words a second, regular terminals are more than fast enough, and you're not even saving any resources by offloading to the gpu with this thing
literally pointless
only if it comes with some crazy visual effects
No, the OS I use has had a working GUI for decades.
>being a game developer
Enjoy your crunch time.
>powershell
>terminal
what did he mean by this?
He's asking if you need a non-generic graphics card for it to be worthwhile.
You're one of those people who only reads one word out of five and derives the rest from your own assumptions, aren't you?
It's like the next step up from illiterates who guess the meaning of writing from context, except lazier.
Tiny
guis are not a replacement for clis
What does Jow Forums think of Terminology?
>GPU accelerated terminal
what are you smoking
Genuinely curious. What is a use case where you output such large amounts of text to stdout/stderr that it makes your CPU usage spike?
>yes, it renders faster, much faster, doing "ls -R /" in 700ms instead of close to 5s
That's actually a pretty good result for Kitty. I get at best twice as fast execution.
kitty: 4.5s
st: 8.9s
urxvt: 9.0s
mlterm: 9.4s
xterm: 9.4s
gnome-terminal: 10.9s
And that's for ~1.7 million lines.
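For anyone who wants to reproduce numbers like these, a minimal version of the benchmark needs nothing but the shell; the line count below matches the post above, while the temp path is an arbitrary choice:

```shell
# Generate ~1.7 million lines once, so every emulator renders
# identical input, then time how long printing them takes.
seq 1 1700000 > /tmp/termbench.txt
# Baseline with no rendering at all:
time cat /tmp/termbench.txt > /dev/null
# Then, inside each emulator under test, run:
#   time cat /tmp/termbench.txt
# and compare the wall-clock times.
```

"ls -R /" works as a source too, but a pre-generated file keeps the input identical across runs and machines.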
>i don't do anything important with my terminal
printing megabytes of text to your terminal at full speed is not using your terminal properly, there's no use-case for it
>so i don't notice how big text outputs can make the cpu usage skyrocket
printing shit at full speed on my terminal uses 10% cpu tops, there is no use where the speed of the terminal is in any way affecting what you're doing
hmm, guess lxterminal-gtk2 is just really slow
~380,000 lines
lxterminal-gtk2: 13.2s
xterm: 3.4s
kitty: 2.0s
looks bad, but that's still ~29,000 lines a second, which is more than enough for normal usage, it absolutely doesn't feel slow, and this isn't something i'm going to switch terminal over
>using terminal for visualisations instead of readable output
-- also, if i do it in tmux, which i do anything important in anyway, i get;
lxterminal+tmux: 1.7s
xterm+tmux: 1.7s
kitty+tmux: 1.9s
(ran them a few times each, not sure why kitty is slightly slower)
so uh.. yea.. completely pointless
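The tmux numbers make sense: the multiplexer drains the inner pty at full speed and only repaints the visible pane, so the outer emulator's renderer mostly drops out of the measurement. A sketch of that run (requires tmux; the session name and temp path are arbitrary choices):

```shell
# Run the same cat benchmark inside a detached tmux session and
# wait for it to finish before tearing the session down.
seq 1 380000 > /tmp/termbench-tmux.txt
tmux new-session -d -s bench
tmux send-keys -t bench 'time cat /tmp/termbench-tmux.txt; tmux wait-for -S done' Enter
tmux wait-for done        # blocks until the inner command signals
tmux kill-session -t bench
```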
And yet, they did - totally, to within 0.1%.
for the layman who uses computers like appliances, sure
there's more computers set up without guis than there are ones with
Only for when making a roguelike look like its being played in a terminal
thanks for showing me this, OP. I like the performance in cava (visualizer for alsa/pulse).
>normies about to see me using my pc
>i cant let them know im beating off to trap hentai
>quickly open new gpu accelerated terminal and run tcpdump
>"wow user that looks impressive"
This is never the case on gnu/linux. What are you smoking?
>wild assumptions based on nothing
The issue centres around the use of "meme card"; that's literally meaningless in this context. It's as valid as saying
>you pretty much need a card of my own arbitrary but non-stated qualifier to make this fit another personalized qualifier/quantifier
It's chaff.
>VMS bad
>POSIX good
Retad
i thought the enlightenment terminal did too. i don't use it tho so im not sure