Modern webpage size

Why are webpages so big? An article I looked at was 200 KB in total, but when I turned off uMatrix and uBlock it came to 17 megabytes.
What the fuck is going on?

Attached: 1525747735270.jpg (770x770, 63K)

>hyperbole
you'll be taken more seriously if you actually link to the article, user

techtimes.com/articles/60815/20150616/google-search-now-tags-slow-to-load-websites.htm

>1.8MB Transferred
the actual fuck are you talking about
but even for that, I'll bite. let's see how it changes when you visit a second page.
>280kb Transferred
so essentially, as with most websites you'll see these days, they assume you'll stick around for more than one page, so they load their whole 'application' onto you on the initial visit
as for the 17MB you're describing, the only thing I could find on that is that you're a fucking idiot and don't have autoplay disabled in your browser
sure, that's cancer, but easy enough cancer to avoid
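That "application on first load" pattern can be sketched as a toy model (all sizes here are invented for illustration; real numbers vary per site):

```javascript
// Toy model: the first visit downloads the whole app shell (bundled JS/CSS)
// plus the article data; later visits only fetch the data, because the
// shell is already cached by the browser.
const APP_SHELL_KB = 1500; // hypothetical bundle size, transferred once
const ARTICLE_KB = 280;    // hypothetical per-page data

function transferredKb(visitNumber) {
  return visitNumber === 1 ? APP_SHELL_KB + ARTICLE_KB : ARTICLE_KB;
}
```

This is why the second page view above came in at ~280 KB while the first was an order of magnitude bigger.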

it was the videos and ads that played that made it that big

>one website sucked
>why do all of them suck?!?!

t. brainlet turboretard

so my point stands - why the fuck are you surprised? disable autoplay, get a pihole or the like, and move on. people have to pay the bills for hosting and for paying writers, and the vast majority of users don't follow those two steps, which lets sites make money and keep providing content. this isn't fucking rocket science

Dude, are you retarded? Just because there are ads doesn't mean websites aren't way bigger than they should be. Then again, you're an ad-apologist, from which we can deduce you must indeed be retarded.
T. otherfag.

another idiot that cannot read
don't like it, don't deal with it
that's it. it makes them money. it works. that's how capitalism works. don't like that, go suck Putin's cock or something

Most of that size comes from loading as many trackers and bullshit as possible to squeeze as much money as they can out of you.
Web designers make a nice page, then the client requests the ad bullshit and all the work is ruined because now your pretty 400kb news website has 7-9MB of trackers loading.
>t. webdev
Go to any website that serves articles and the same will happen; look at any outsourced page.
>If you don't like it shut up and avoid it
I'm glad to see you want this to keep happening.

>I'm glad to see you want this to keep happening.
how the fuck else do you expect to change it aside from avoiding it yourself
whining on an anonymous board for teenagers pretending to be adults isn't particularly effective

Ok, I went to nytimes.com and it was 2.7 MB for the entire page, now what?

I was just working on The Memory Hole (not that one), and so far I've gotten it down to just under 1 MB without any compression. Working on adding gzip to my build process, should cut it down a lot further.
These days, a lot of things happen on the client, especially with frameworks like React and Vue.js. Webpack and Babel have really changed the web dev scene; whether it's for the better is still up for debate, but in my opinion it's fantastic.
As the world's average network transfer speed increases, pure HTML sites are slowly being replaced by rich clients with low-level access (see: browsers soon being able to execute motherfucking compiled code. *nix kernels running in the browser when?)

>reload page
>219kb transferred

Wow it's almost like they are updating their page constantly through the day and use an API call to get the latest data when you refresh the page, crazy how a NEWS WEBSITE would update when you refresh the page haha lol

>API call to get the latest data when you refresh the page
lmao

You know we don't just talk about shit here; I actually am a web dev, so I can make changes to the way I work and give feedback to clients rather than just ignoring it.
Still too big.
You can still make quite a rich webpage that looks great and is tiny. My resume website is only 200kb as a whole, and not every page is loaded at once.
It already cached the data from the first time you loaded it.

Attached: 1526277139486.png (475x417, 513K)
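The "it already cached it" point can be sketched as a tiny loader (a toy stand-in for the browser's HTTP cache; names and URLs are made up):

```javascript
// Toy stand-in for the browser cache: any given URL hits the network at
// most once; repeat loads are served locally and cost zero transferred bytes.
function makeCachedLoader(fetchFromNetwork) {
  const cache = new Map();
  return function load(url) {
    if (!cache.has(url)) {
      cache.set(url, fetchFromNetwork(url)); // cache miss: real transfer
    }
    return cache.get(url); // cache hit: free
  };
}
```

This is the whole story behind a 2.7 MB first load turning into a ~220 KB reload: the static assets are cache hits and only the fresh data moves over the wire.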

>You can still make quite a rich webpage that looks great and is tiny
I don't disagree. It mostly depends on the functionality your service needs. For a resume, at most you'll have a way to show off some skill and maybe a contact form that'll get bombarded with spam. But for example, I'm working on a webapp for a financial firm out of state, and so far, uncompressed, the Javascript file is roughly 8 MB. That size includes the stylesheets too, but it's still fucking disgusting.

>the Javascript file is roughly 8 MB

aren't you splitting it over lots of small async-loadable files? That's how you solve that issue. I got worried when my project hit 3MB in JS and split it up so it only needs to download ~100kb at a time

well apart from vendor.bundle.js which is 3MB but that's cached for like ever
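For reference, a vendor bundle split out like that usually comes from something along the lines of webpack's splitChunks (a sketch under assumed defaults, not a drop-in config; the names are illustrative):

```javascript
// webpack.config.js (sketch): pull everything from node_modules into a
// separate "vendor" chunk. [contenthash] means the file name only changes
// when its contents do, so browsers keep it cached across app releases —
// which is why a 3 MB vendor bundle is "cached for like ever".
module.exports = {
  output: { filename: "[name].[contenthash].js" },
  optimization: {
    splitChunks: {
      cacheGroups: {
        vendor: {
          test: /[\\/]node_modules[\\/]/,
          name: "vendor",
          chunks: "all",
        },
      },
    },
  },
};
```

App code churns every release; dependencies rarely do, so splitting them apart keeps repeat downloads to just the small app chunk.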

Depends on how big the place is; this is a smaller place in Aus, so workers have a lot more influence over this shit.

It's the dev configuration; I haven't set up splitting or anything for it, so it's a monolith. In prod it's configured correctly and isn't bad.

ah good - it is disgusting, isn't it? Seeing 100.js in my dist directory makes me cry.

bandwidth is cheap, devices have lots of spare resources
laziness from devs

We use Microsoft Dynamics CRM at work and it's 30-odd MB before the page is done. It's all random JS frameworks and logic that you'll never use. The vast majority of it is cacheable, but it's also mostly fetched via AJAX, so it doesn't all load at once, and the page isn't functional until everything is loaded. The sales people use this POS every day.
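A rough model of that boot pattern (all numbers invented): even when the chunks are fetched in parallel, first-visit transfer is the sum of their sizes, and time-to-interactive tracks the slowest fetch, because nothing works until everything has arrived.

```javascript
// Toy model of an app that needs every chunk before it becomes functional.
function firstVisitCost(chunks) {
  const totalKb = chunks.reduce((sum, c) => sum + c.kb, 0);  // every byte counts
  const readyMs = Math.max(...chunks.map((c) => c.fetchMs)); // parallel fetches
  return { totalKb, readyMs };
}

const cost = firstVisitCost([
  { name: "framework-a.js", kb: 9000, fetchMs: 1200 },
  { name: "framework-b.js", kb: 12000, fetchMs: 2500 },
  { name: "app-logic.js", kb: 9000, fetchMs: 900 },
]);
```

Caching helps repeat visits, but the first load still eats the whole 30 MB, and one slow chunk holds the entire page hostage.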

Yeah, feels bad man. Waiting for WebAssembly to become top shit so I can run a browser in a browser semi-natively. Then I won't feel bad about making giant fucking applications to run in the browser, because it'd be as if I was making giant fucking applications to run on the desktop.
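WebAssembly already runs in today's engines, no waiting required. Here's a complete hand-assembled module exporting one `add` function — these bytes are the standard minimal example, spelled out so nothing is hidden:

```javascript
// A minimal .wasm binary: (func (export "add") (param i32 i32) (result i32))
const wasmBytes = new Uint8Array([
  0x00, 0x61, 0x73, 0x6d,                               // magic: "\0asm"
  0x01, 0x00, 0x00, 0x00,                               // version 1
  0x01, 0x07, 0x01, 0x60, 0x02, 0x7f, 0x7f, 0x01, 0x7f, // type: (i32,i32)->i32
  0x03, 0x02, 0x01, 0x00,                               // one func of type 0
  0x07, 0x07, 0x01, 0x03, 0x61, 0x64, 0x64, 0x00, 0x00, // export "add" = func 0
  0x0a, 0x09, 0x01, 0x07, 0x00,                         // code: one 7-byte body
  0x20, 0x00, 0x20, 0x01, 0x6a, 0x0b,                   // local.get 0/1; i32.add; end
]);

// Synchronous instantiation is fine for a module this small; big modules
// should use WebAssembly.instantiateStreaming in the browser instead.
const { add } = new WebAssembly.Instance(new WebAssembly.Module(wasmBytes)).exports;
```

In practice you'd compile C/C++/Rust to this format rather than write bytes by hand; the point is that "compiled code in the browser" is shipping now.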

How do you think the data magically gets to your screen, dumbass?

It's not hard coded you fucking mongo

why did you post that gremlin-looking creature, though?

Wasm

Just post more Rose.

blame zandoo.cz

>only 17MB
Holy fuck! Why aren't webdevs improving their tech???
You think I bought 16 GB of RAM for nothing!?
Fucking hell... I wish websites were sized in gigabytes and entering one felt like setting foot on an alien planet

Attached: Lennart_poettering.jpg (1024x678, 123K)