Why aren't there any solutions for serving files using a rest api? How the heck...

Hunter Sullivan

Why aren't there any solutions for serving files using a REST API? How the heck do people serve so many files on websites like Jow Forums, yet there are no open-source solutions for serving files? I just want to be able to store files somewhere and then retrieve them through simple commands. Can I do this on Linux? Do I really have to write up my own shit? Are there any Apache projects that handle this?

I just want to upload files and get them back. Do I really have to use scp to send files around?

Attached: 1564359638016.jpg (763 KB, 960x720)


Jace Long

serving files using a rest api
user what are you smoking

Anthony Sanders

how else are you supposed to serve them?

Cooper Reyes

please answer me, how the heck am I supposed to serve files?

I can use Ubuntu and scp them to a folder and use something like express to make it public, but that's it. Is that what everyone does? Do people just make public folders?

Noah Parker

It's called HTTP. Look it up.

Connor Smith

god please don't leave me!!! I NEED TO KNOW please please please please please please please please please please please please how do I serve files?

I want a scalable solution since I plan on having multiple web servers serving content, but everyone on the internet says to keep all your files on the file system and manage the URLs in a database. So I plan on having my "image" server separate from my web servers. But how do I get my files onto this server?

I found a program called Thumbor, and you can upload PNGs, JPEGs, and GIFs to it using a POST request. But that's it. I want to be able to upload mp4 files as well! So I can use Thumbor for its supported file types, but how do I store my mp4s?

I can use Ubuntu, but there isn't any POST request for me to upload files to it. I would have to scp the files to a folder and then manage their URLs. Are there no solutions? please please please please please please please please please help me you have the answer please share it

Henry Jackson

Are you retarded? apt install nginx
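Not OP, but to make the nginx suggestion concrete: nginx just serves whatever lands in a root folder over HTTP. Here's a sketch of the same one-folder idea using Python's built-in static server as a stand-in (the temp directory and filename are made up; in production you'd point nginx's `root` at the folder instead):

```python
# Stand-in for "nginx serving a folder": Python's built-in static file
# server. The temp directory plays the role of nginx's web root.
import http.server
import pathlib
import tempfile
import threading
import urllib.request
from functools import partial

root = pathlib.Path(tempfile.mkdtemp())
(root / "hello.txt").write_text("hi from the file server")

handler = partial(http.server.SimpleHTTPRequestHandler, directory=str(root))
server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), handler)
threading.Thread(target=server.serve_forever, daemon=True).start()

url = f"http://127.0.0.1:{server.server_address[1]}/hello.txt"
body = urllib.request.urlopen(url).read().decode()
print(body)  # hi from the file server
server.shutdown()
```

Getting files *into* that folder is the part nginx doesn't solve, which is what the rest of the thread is about.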

Dylan Evans

okay, then what? I understand you can serve all content from a couple of folders, but how do I get that content up there if it's dynamically created? For example, if you post an image here, it goes to a server and can then be picked up via its URL. But what if I have two servers? One server will create the post, and the other will serve the files. How do I get the image from the first server to the second without scp?

I would have to scp that file into the public folder in nginx, right? Are there any other ways so I can create, read, update, and delete a specific file? WHY CAN'T YOU GUYS TYPE MORE THAN ONE LINE?

Thomas Roberts

not him, but are you asking how to get one server to send an image to another as part of a user's http request?

if so, one common way is to have the user agent link directly to the image server. The server parses the user request, figures out the image name the user wants, and finds and returns it from its database
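A minimal sketch of that lookup flow, with a dict standing in for the image server's database (all names and the storage layout here are made up for illustration):

```python
# Sketch: the image server maps a requested name to a stored file via a
# "database" (a dict here), returning the bytes or a 404-style miss.
import pathlib
import tempfile

storage = pathlib.Path(tempfile.mkdtemp())
(storage / "a1b2c3.jpg").write_bytes(b"\xff\xd8fake-jpeg-bytes")

# name the user asked for -> file on disk (what a real DB row would hold)
db = {"cat.jpg": "a1b2c3.jpg"}

def serve(name: str):
    stored = db.get(name)
    if stored is None or not (storage / stored).is_file():
        return 404, b""
    return 200, (storage / stored).read_bytes()

status, body = serve("cat.jpg")
missing, _ = serve("dog.jpg")
print(status, missing)  # 200 404
```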

Liam Torres

That's exactly what I want to do, but I'm not going to store files in the database. Instead, I want the files to sit on a separate server, like 4cdn does, and have URLs that refer to the files. The reason I'm not storing the files as blobs is the reasons here: softwareengineering.stackexchange.com/questions/150669/is-it-a-bad-practice-to-store-large-files-10-mb-in-a-database

For example, right now when I post, I post to the Jow Forums.org website, but then Jow Forums will communicate to 4cdn and push my original file there after some checks and maybe compression, and then I can get my original file by talking to 4cdn

Do I use a linux distro to hold the files? Because there doesn't seem to be any solutions. And then the original question is how do I talk to this linux distro since there isn't a rest api I can use to do that

Juan Davis

Unironically use web.py, it's great.

Thomas Young

do you really want a REST api? i think not

if you want to serve dynamically generated content i would recommend using an http server library for a language of your choice, not nginx, and then load-balancing between nodes with haproxy

if you want to have a proxy at the back end of your web application, you should probably just have the data on the same node as the http server. if you don't, you'll introduce a bunch of latency issues you'll need to optimize around.
i recommend you make a head node and a bunch of tail nodes, replicate the data to all of the nodes with rsync or something, run a copy of your application on each node, and balance between the nodes with a haproxy instance on the head node
if traffic is too heavy on the head node, have it just run haproxy
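A toy, single-machine version of that head/tail layout: local directories stand in for the nodes, and shutil.copy2 stands in for rsync-over-ssh, just to show the replicate-everywhere shape:

```python
# Sketch of head/tail replication: the head node receives a file and
# copies it to every tail node, so any node can serve it locally.
# Directories stand in for separate machines; a real setup would use
# rsync over ssh instead of shutil.copy2.
import pathlib
import shutil
import tempfile

base = pathlib.Path(tempfile.mkdtemp())
head = base / "head"
tails = [base / "tail1", base / "tail2"]
for node in [head, *tails]:
    node.mkdir()

def replicate(filename: str, data: bytes):
    (head / filename).write_bytes(data)    # upload lands on the head
    for tail in tails:                     # fan out to the tails
        shutil.copy2(head / filename, tail / filename)

replicate("pic.jpg", b"image-bytes")
print(all((t / "pic.jpg").read_bytes() == b"image-bytes" for t in tails))  # True
```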

Nathaniel Barnes

Corps develop their own in-house shit to do this, or they pay a CDN. If you're making your own CDN you'll have to make your own API.
I recommended web.py earlier mainly because you can do anything with it reasonably easily, and it's not as heavy as something like Spring.

James Cox

I just want to upload files and get them back.
Back in my day we used something called the File Transfer Protocol. You literally used commands to transfer and retrieve files. Shit was fucking cash.

Attached: 1557959640645.png (292 KB, 496x736)

Matthew Scott

You might be onto something. Maybe I can use ftps to send my file from server 1 to server 2 and then use nginx to serve all my files

Is there a downside with this approach?

Anthony Perez

An FTPS transaction is not low-latency; your clients will be waiting over a second to get the content

Jacob Kelly

just do a multipart HTTP request and response containing the file
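For the curious, this is what a multipart/form-data body carrying a file actually looks like on the wire (what `curl -F file=@clip.mp4 …` or a browser upload form sends). The sketch builds one by hand and parses it back out server-side with the stdlib; the field name, filename, and boundary are arbitrary examples:

```python
# Sketch of a multipart/form-data request body carrying a file, built
# by hand and then parsed back into its parts with the email module.
from email.parser import BytesParser
from email.policy import default

boundary = "----boundary1234"
file_bytes = b"fake mp4 bytes"
body = (
    f"--{boundary}\r\n"
    'Content-Disposition: form-data; name="file"; filename="clip.mp4"\r\n'
    "Content-Type: video/mp4\r\n\r\n"
).encode() + file_bytes + f"\r\n--{boundary}--\r\n".encode()

# Server side: prepend the request's Content-Type header and parse.
msg = BytesParser(policy=default).parsebytes(
    f"Content-Type: multipart/form-data; boundary={boundary}\r\n\r\n".encode()
    + body
)
part = next(iter(msg.iter_parts()))
print(part.get_filename(), part.get_payload(decode=True) == file_bytes)
```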

Lucas Richardson

Yeah, that's what the client does. Then server 1 has to store it somewhere safe because I want this thing to be scalable and have multiple "server 1"s and only 1 server 2.

My issue is: what software do I run on server 2, and how do I communicate from server 1 to server 2? Sadly FTP is out due to the latency issue mentioned above.

Henry Johnson

what is a CDN

Thomas Flores

Why should I upload to a cdn when I can just upload to server 2 and then my cdn can cache my files?

Christian White

yeah why not?

Nathan Rogers

guys I'm lost it seems like every application in the world can serve dynamically created images but I can't... Why are there no resources for something so trivial?

Attached: 1564116611066s.jpg (6 KB, 250x230)

Josiah Jackson

guys why is it so hard to build a scalable web service without going directly to some third party like amazon or imgix?

I'm trying to find resources, but it seems like "image server" isn't even a term people use... I can't find resources... and my head is going to explode... Websites like Myspace and Jow Forums solved this decades ago, why is this so esoteric when literally every social media site does this?

Attached: 1564547015233.png (136 KB, 363x296)

Parker Rodriguez

and now I'm alone, lost, and confused... I feel so empty now

I just want to upload files and get them back... I just want to do this through a rest api... Do I really have to write this stuff myself?

Attached: images.png (7 KB, 227x222)

Lucas Hill

Install Gentoo then use startpage
//boards.Jow Forums.org/trash/

Brandon Hernandez

._. I feel so desperate now

John Brooks

Shell will set you free. Forget Apache projects and whatnot. Wget and curl are your friends who will never let you down; however, if you wish to keep the original filenames, not just settle for unix timestamps, you have to get busy with sed, awk and grep. And while you are at it, why not set up your own Gitea server, or set up a project in GitLab.

Sure, false prophets offer you deformed constructs like jq; steer far away from those.

Ian Allen

Here's my site: temp.sh
I haven't released the source, but there's a similar site that has ( transfer.sh )
You won't be able to achieve this with PHP btw cos PHP doesn't support reading the filename header from a PUT request
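Here's a sketch of that transfer.sh-style PUT flow, where the filename rides in the URL path and the request body is the file. This is not temp.sh's actual code (the source isn't released); it's just the idea in stdlib Python, with uploads kept in a dict instead of on disk:

```python
# Sketch of a transfer.sh-style PUT endpoint: the filename comes from
# the request path, the request body is the file itself.
import http.server
import threading
import urllib.request

stored = {}  # path -> bytes, standing in for files on disk

class PutHandler(http.server.BaseHTTPRequestHandler):
    def do_PUT(self):
        name = self.path.lstrip("/")               # filename from the URL
        length = int(self.headers["Content-Length"])
        stored[name] = self.rfile.read(length)
        self.send_response(200)
        self.end_headers()
        self.wfile.write(f"/{name}\n".encode())    # link to the upload

    def log_message(self, *args):                  # keep the demo quiet
        pass

server = http.server.ThreadingHTTPServer(("127.0.0.1", 0), PutHandler)
threading.Thread(target=server.serve_forever, daemon=True).start()

req = urllib.request.Request(
    f"http://127.0.0.1:{server.server_address[1]}/kek.txt",
    data=b"hello", method="PUT")
resp = urllib.request.urlopen(req).read().decode().strip()
print(resp)  # /kek.txt
server.shutdown()
```

The equivalent client command would be `curl -T kek.txt http://host/kek.txt`.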

Easton Nguyen

Be a chad and release the source.

Robert White

Do you keep your database and your backend on the same server? If not, then you upload the user's content to the database and store it as blobs, right?

And if you're not storing it as blobs, then you're sending to the second server with wget and curl? That's amazing.

Brayden Barnes

smb/ftp + openvpn or any secure vpn (not pptp, etc.)

vpn is to account for security vulnerabilities so don't skimp on that

Caleb Sanders

No u

I'm not storing the files in the database, just the filepaths and filenames.
It's all on the one server. There's limited space, but if it fills up it fixes itself, cos it deletes files once they are 3 days old with a cron job that runs every hour.
The files go to a folder in /tmp with a randomised filename. So a link would be like temp.sh/ABCDE/kek.txt : check the db to see if the file ABCDE has the filename kek.txt, return /tmp/tempsh/ABCDE, else return 404.
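A sketch of that scheme as described: random ID, a db row mapping the ID to the original filename, and the expiry job. A dict stands in for the real database and a temp directory for /tmp/tempsh; the helper names are made up:

```python
# Sketch of the temp.sh scheme described above: random ID, db maps
# id -> original filename, and old files get purged (the hourly cron).
import pathlib
import secrets
import tempfile
import time

root = pathlib.Path(tempfile.mkdtemp())  # stands in for /tmp/tempsh
db = {}                                  # id -> original filename

def upload(filename: str, data: bytes) -> str:
    file_id = secrets.token_urlsafe(4)
    (root / file_id).write_bytes(data)
    db[file_id] = filename
    return f"/{file_id}/{filename}"      # the shareable link

def lookup(file_id: str, filename: str):
    if db.get(file_id) != filename:
        return 404
    return (root / file_id).read_bytes()

def purge(max_age_days: int = 3):        # what the hourly cron job does
    cutoff = time.time() - max_age_days * 86400
    for f in list(root.iterdir()):
        if f.stat().st_mtime < cutoff:
            f.unlink()
            db.pop(f.name, None)

link = upload("kek.txt", b"top kek")
fid = link.split("/")[1]
print(lookup(fid, "kek.txt"), lookup(fid, "wrong.txt"))  # b'top kek' 404
```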

Aaron Hill

Using smb and ftp in 1998+21
Just use scp or rsync with an sshkey and quit ur whining.

Jack Wright

different tools for different jobs

Oliver Martin

It solved what he wants to do, how are these the wrong tools?

Kayden Diaz

ah correct, i thought i read he wanted to browse too
inb4 rsync autocomplete

Jonathan Torres

To add to this if I wanted to use multiple servers for storage here's what I'd do.

User uploads file to temp.sh server
File is served from there for now
Every 5 mins a cron job copies the new files to the other servers, updates the MySQL record, and removes the first copy.
The updated MySQL record makes temp.sh/ABC/file.txt redirect to server2/ABC/file.txt, so all the links remain intact throughout the process.
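The migration steps above, sketched with directories standing in for the two servers and a dict standing in for the MySQL table (the ID and filenames are made up):

```python
# Sketch of the migration: a file starts on server1, a periodic job
# copies it to server2, updates the record, and the old link answers
# with a redirect so nothing breaks.
import pathlib
import shutil
import tempfile

base = pathlib.Path(tempfile.mkdtemp())
server1, server2 = base / "server1", base / "server2"
server1.mkdir()
server2.mkdir()

records = {}  # file id -> which server currently holds it

def upload(file_id: str, data: bytes):
    (server1 / file_id).write_bytes(data)
    records[file_id] = "server1"

def migrate():                        # the 5-minute cron job
    for f in list(server1.iterdir()):
        shutil.copy2(f, server2 / f.name)
        records[f.name] = "server2"   # update the db row
        f.unlink()                    # remove the first copy

def fetch(file_id: str):
    host = records[file_id]
    if host != "server1":             # old link -> redirect to new home
        return ("302", f"{host}/{file_id}")
    return ("200", (server1 / file_id).read_bytes())

upload("ABC", b"file.txt contents")
migrate()
print(fetch("ABC"))  # ('302', 'server2/ABC')
```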

Evan Turner

It's all on the one server
It seems like a really nice hobby project, but if it were to scale I bet you would need more servers, and if you need more servers then you would need multiple web servers and a really beefy database or sharding, which is what I kind of want to do. I want to prepare for that by implementing a second server managing just the files and communicating with it through the webserver and allowing the client to make get requests to it.

It's a nice project user, I really like it! :)

Yeah, it seems that scp is the best way to talk to server 2... I was hoping for something more abstract, but if it works it works. I'll probably just use nginx to serve a folder and plop files via scp into that folder. I'll manage file data entries via a table, like temp.sh does.

Nathaniel Hughes

I just thought of something that could be nicer.
The upload endpoint redirects to an upload script on one of the storage servers (and changes between them depending on which one has more space, or just randomly). That way you don't have to transfer the file between the servers.
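A sketch of that redirect-on-upload idea: the front server picks the storage node with the most free space and answers the upload with a redirect (a 307 keeps the method and body), so the file never transits the front server. The hostnames and free-space numbers are invented:

```python
# Sketch of redirect-on-upload: instead of proxying the file, the front
# server answers with a 307 pointing at whichever storage server has
# the most free space.
free_space = {                 # bytes free per storage server (made up)
    "store1.example": 50_000_000,
    "store2.example": 120_000_000,
}

def pick_upload_target() -> str:
    host = max(free_space, key=free_space.get)
    return f"https://{host}/upload"

# The front server would respond: 307 Temporary Redirect, with
# Location: pick_upload_target(). The client then re-sends the file
# straight to the storage node.
print(pick_upload_target())  # https://store2.example/upload
```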

Ryan Walker

Thank you for the recommendation, but my intent is having all of my data on one server and having multiple "slave" servers serving my web page and handling other api requests.

Thank you everyone~
I am happy now ^_^

Attached: tuks6.jpg (32 KB, 640x360)
