Listen up,

I got a plan to fix the internet.

Make a browser that

1. Ignores any JavaScript or CSS sent (or refuses to render the page at all if the response includes any JS or CSS)
2. Can only make HTTP GET requests
3. Only accepts Content-Type: text/markdown; charset=utf-8 (see tools.ietf.org/html/rfc7763).

The browser would be trivial to make.
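For what it's worth, a browser obeying the three rules above really is nearly trivial. A stdlib-only Python sketch (the function names are mine, not part of any spec):

```python
# Minimal sketch of the proposed browser: one GET, refuse anything that is
# not text/markdown, dump the raw markdown to the terminal. No JS, no CSS,
# no POST - there is simply no code path for any of them.
import urllib.request

ACCEPTED = "text/markdown"

def acceptable(content_type):
    # Per RFC 7763 the media type may carry parameters, e.g. "; charset=utf-8".
    return content_type.split(";")[0].strip().lower() == ACCEPTED

def browse(url):
    # Only an HTTP GET is ever issued; there is no way to submit anything.
    with urllib.request.urlopen(url) as resp:
        if not acceptable(resp.headers.get("Content-Type", "")):
            return "(refused: server did not send text/markdown)"
        return resp.read().decode("utf-8")
```

The whole "rendering engine" is the content-type check plus a decode; everything else is the terminal's problem.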

> But it makes no money though, no corporation would use it

And that's the point, it would fundamentally separate web apps from readable content.

Attached: fix_internet.png (500x500, 7K)

bank.com/login.php?username=user&password=ihavetinypeepee

But then nobody would use it, and you would fix nothing.

I'd rather one where you can freeze code execution and edit/inject JavaScript. Something with more privileges than add-ons like *monkey.

The idea is correct. Here are some recommendations to improve it further:
- Very little, or no, formatting information in the data. The browser reads the data and does what it wants with it (such as displaying a navigation pane built from all the links in a sitemap.txt file provided by the site).
- Well-defined top-level text files (unformatted? JSON? not sure; the more human-readable the better, but mind transfer costs) should describe the site so the browser can display things nicely.
- As you mention, absolutely NO site-provided logic of any kind.
- Possibly using something very different from GET (though for legacy sites, DO also support at least POST). GET requests url-encode inputs. Imagine the security risk of having a url-encoded password right in your address bar.
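The GET concern in that last point takes two lines to demonstrate with the stdlib; `bank.example` and the credentials here are placeholders, not anything from a real site:

```python
# Demonstrating the concern above: with GET, form inputs end up url-encoded
# straight into the address bar (and into server logs and browser history).
from urllib.parse import urlencode

def get_url(base, fields):
    # This is exactly what a GET form submission produces.
    return base + "?" + urlencode(fields)

url = get_url("https://bank.example/login",
              {"username": "user", "password": "hunter2"})
# The password is now plainly visible in the URL itself.
```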

Generally, all display and logic should depend on the browser. For example, if you have a form in your site, you should tell the browser what you expect, but not how to display the form or how to collect it from the user. The browser (and thus the user) should be in charge of that.
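As a sketch of that idea: suppose the site sends a machine-readable description like the JSON below (the `fields` schema here is entirely invented for illustration), and the browser owns all collection, display, and validation:

```python
# Hedged sketch: the site declares WHAT it wants; the browser decides HOW
# to ask the user and validates before anything leaves the machine.
import json

form_description = json.loads("""
{
  "action": "/comment",
  "fields": [
    {"name": "author", "kind": "text",      "required": true},
    {"name": "body",   "kind": "multiline", "required": true}
  ]
}
""")

def collect(description, answers):
    # Browser-side validation: no site-provided logic ever runs.
    missing = [f["name"] for f in description["fields"]
               if f["required"] and not answers.get(f["name"])]
    if missing:
        return {"ok": False, "missing": missing}
    return {"ok": True, "data": {f["name"]: answers[f["name"]]
                                 for f in description["fields"]}}
```

How the form looks, how errors are shown, whether it's a TUI prompt or a GUI dialog: all of that stays with the browser and the user.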

Personally, I believe in full decentralization, but one step at a time.

I don't fully agree. At the beginning, Jow Forums had no users. Look at it now - a miserable pile of rubble dying underneath normalfag foot. Regardless, the point is that step by step, a community, and a better web, can be created.

Like most things, it would need some shilling to get it going. That's true of anything new.

do it, i'll get started on the logo

>no post requests

Attached: what.jpg (600x512, 26K)

hey are you retarded by chance?

>markdown

Attached: d90.png (644x800, 15K)

> add-ons like *monkey.

I wish i even knew enough to use that

>a plan to fix the internet
>Make a browser

Attached: o.jpg (960x813, 94K)

The reasoning is sound. All current browsers are either horribly slow and bloated to the point where some sites will lag an i7, or generally unusable like Midori.

>bank.com/login.php?username=user&password=ihavetinypeepee

you're talking to an incel 25yo boomer. give up.

why not just use gopher?

Agree with most of it, except handling anything legacy. It should be different enough to not import any existing problems.

I am also not sure about supporting POST requests for forms. I am thinking of this as mostly a way to share readable sites.

Right now when I look up some tutorial, a developer's site comes up with 5 MB of JavaScript and CSS and displays nothing if CSS or JS is disabled. Many of these people are already using a static site generator with markdown. I want them to have a new and hip alternative: serve it directly as markdown.

I want internet articles to be readable without CSS or JS and that's about it. Having a complete and better alternative to most of what modern browser currently provides would be a much larger project.

A very trivial browser.

i would use it. i already have noscript on one of my browsers. if a site won't load because of it i just skip it; it wasn't worth visiting anyway.

Forms are for web apps

for encryption use ROT13
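Playing along with the joke, since ROT13 takes one stdlib call; note it is its own inverse, which is exactly why it is not encryption:

```python
# ROT13 "encryption": applying it twice gives back the plaintext,
# so anyone can "decrypt" it by just running it again.
import codecs

def rot13(text):
    return codecs.encode(text, "rot_13")
```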

Do you have a better markup format to offer, one that has everything a document needs and isn't HTML?

I like the idea. I don't know much about this topic in general, but I have a simple suggestion: Firefox, for example, has that reader mode option that turns any bloated page into a nice readable page with a white background and just text. Why not make that the default? Like, always show pages like that?

heh, I have your password now. going to hack ur account

With regard to legacy, I think it's just a useful way to get the ball rolling. All it needs to do is fetch html, and render the body as unformatted plaintext without the tags, using the browser's usual rules on that plaintext.
If just sharing static sites is enough, there are already solutions that work for that so the project is a lot less interesting. Interacting with forms and being able to 'write' to the site without invasive obfuscated code distribution, however, would be a game changer in my opinion.
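That legacy fallback is small too. A sketch using only the stdlib parser (a real build would likely lean on an existing library; the class and function names here are mine):

```python
# Legacy fallback sketch: fetch html elsewhere, then throw away the tags
# and keep only the text, so the browser can apply its usual plaintext
# rules. Script and style bodies are dropped entirely - no site logic.
from html.parser import HTMLParser

class TagStripper(HTMLParser):
    SKIP = {"script", "style"}  # never render site-provided logic/styling

    def __init__(self):
        super().__init__()
        self.parts = []
        self.skipping = 0

    def handle_starttag(self, tag, attrs):
        if tag in self.SKIP:
            self.skipping += 1

    def handle_endtag(self, tag):
        if tag in self.SKIP and self.skipping:
            self.skipping -= 1

    def handle_data(self, data):
        if not self.skipping:
            self.parts.append(data)

def strip_tags(html):
    p = TagStripper()
    p.feed(html)
    return "".join(p.parts)
```

That's the whole "renderer" for legacy pages: tags gone, scripts gone, text kept.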

Waste of resources. JavaScript shenanigans are always evolving, so that technique may or may not keep working in the future, and it gives sites no motivation to make things better since they're sending html/js/css anyway.

honestly I'd just settle for a browser that allowed me to natively edit the HTML/CSS of pages I visit, much like what we can do with the Inspect Element feature in Firecucks, except it'd change the edited page permanently

>25yo boomer

Are you mentally retarded or just a fucking idiot?

So it's the website that provides that view? I got the impression it was on the client side, since all the websites look the same in that view.

lurk moar

I see what you meant by legacy. That should be easy enough; a bunch of libraries already exist that do this well enough.

Forms get one closer to the mess that javascript/css rendering is. What sorts of inputs should be supported, what kind of validation, hover text, and UI to support, who decides, and how often these change: all the shenanigans that make it difficult to write a normal html/css/js browser. Maybe some strict, single type of POST input could keep this simple enough to not be a big headache.

what about guestbooks tho

Why is this not a Firefox addon already?

Form fields are already widely standardized; I don't think the problem lies there. With a proper standard it would not drift toward the hell of css/javascript, because the browser stays entirely responsible for logic and rendering. The request type and interchange format are what need care. Hover text, input format support, and client-side validation would all be the browser's job, which incidentally enables lightweight "testing" browsers that help site development by sending arbitrary data through a form to see how the server side handles validation. More importantly, this keeps every action fully under the user's control.
The problem with html/css/js is that CSS + JS are Turing-complete and html is convoluted as fuck due to handling fine-grained rendering (poorly, too).

You download the css/js/html, then the browser cleans it up.

/thread

ELinks, w3m.