Is there a specific reason you can't use a Python script to start your bash scripts? I guess it's for a company?
Jace Roberts
> I guess it's for a company?
No, it's personal.
> Is there a specific reason
Be me. Use youtube-dl to download videos online. Some video services decide to throttle downloads to some thin-ass speed limit. I simply have a pre-made file full of lines like:
youtube-dl -ci -o "%(title)s-%(id)s.%(ext)s" --no-check-certificate
youtube-dl -ci -o "%(title)s-%(id)s.%(ext)s" --no-check-certificate
youtube-dl -ci -o "%(title)s-%(id)s.%(ext)s" --no-check-certificate
Copy a URL onto the end of each line and you get a list of bash commands super easily. After finishing, simply hit CTRL+SHIFT+END after the last command to select and delete the leftover empty commands.
Use another script file to execute every command at the same time:
parallel --no-notice -a videos.sh
Turns out it's limited by the number of cores. Internet connection is not even at 30%. All cores are used. All cores are maxing out. I-i-i-i-it werks, in a way...
I don't see why the execution script needs to be bash if all it's doing is launching another script. Maybe you could try backgrounding the scripts in a loop without waiting if they don't need to start at exactly the same time. Simplify the command script so it only runs one instance and provide it the URL as an argument, or something like that. Can't help further since I don't know bash that well.
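Roughly like this, maybe (untested sketch, and like I said bash isn't my strong suit; urls.txt and dl-one.sh are names I made up):
#!/bin/bash
# dl-one.sh is your simplified command script: it downloads the one URL passed as $1.
# This launcher backgrounds one instance per URL and doesn't wait for any of them.
while IFS= read -r url; do
    ./dl-one.sh "$url" &
done < urls.txt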
Evan Richardson
>Python script to start your bash scripts
Can you make it super easy, like what I'm doing now? (I don't want to create 1 file per video.) Can you tell me exactly how to do this?
Also, I don't like piling shitloads of programming languages on top of one another. Can bash seriously not execute all the scripts at the same time? I don't even give a shit about collisions, because they will never collide and it's irrelevant (1 script only works on 1 separate file).
Henry Richardson
something like this? for i in $(seq NUM_PROCESSES); do ./your-script-here & disown; done
Nicholas Anderson
>I don't see why the
The most important part is my convenience in copy-pasting the videos. I like copy-pasting them into the videos.sh script.
Thomas Miller
Something like that. You can pass it the title and so on from the command line so that it can be reusable for any video.
It's more that I don't want to turn this into an FPS game where I need to precisely click in a specific place to CTRL+V the URL.
Hitting the end of a script is not that hard.
James Ortiz
>Each call gets executed in its own subshell and executed in the background (that's what the & does)
The console emulator is displaying the stuff youtube-dl is saying, though. (This is a good thing and I like it.)
>Optionally you could redirect output into a log file, or everything will be interleaved, but it's not required.
>At the end you wait until all processes have finished.
Nah, it's cool. I like it working in one window, even if it's multiple processes talking over one another. That shows me it's working.
Tyler Perry
How can I make this:
for i in $(seq NUM_PROCESSES); do ./your-script-here & disown; done
work?
Jason Evans
I just meant it as a template for you to fill out based on your needs
"NUM_PROCESSES" is a positive integer for however many instances of the script you want to run, and "./your-script-here" is the path to the script you want to run
Isaac Hall
so for example, if you want to run 9 instances of "/opt/bin/turds.sh", you could say:
for i in $(seq 9); do /opt/bin/turds.sh & disown; done
Kayden Clark
>NUM_PROCESSES" is a positive integer for however many instances of the script you want to run Ow shit. This is a problem. because I have no idea how many scripts will be in the file. Sometime I take 20 videos and other times 40~50.
Its not like i want to count how many URLs I copy pasted into the videos.sh. script.
Logan Jones
So that would work like this:
> #!/bin/bash
> for command in "$@"
> do
>     ${command} &
> done
> wait
You can test this with:
> time yourscript.sh 'sleep 5' 'sleep 8' 'sleep 1'
The time taken will be (very close to) 8 sec.
Keep in mind that you should almost always quote the params.
Samuel Mitchell
$(seq $(wc -l file-with-one-url-per-line))
Thomas Baker
Thanks for this. However, this is failing to execute for me:
for i in $(seq $(wc -l x.sh)); do ./x.sh & disown; done
Looks like wc will also print the file name if you don't pipe into stdin.
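Redirecting instead of naming the file seems to fix that, since wc then never sees a file name and prints only the number:
for i in $(seq $(wc -l < x.sh)); do ./x.sh & disown; done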
Hunter Torres
Sure, because ./x.sh isn't what you want to run there; you have to get the i-th line instead.
Elijah Rivera
Hmm, dude? ./x.sh is one of the text files I created and am working with as we speak. This is actual production code.
Jacob Davis
see
Robert Howard
? In the script you are executing the entire x.sh once for each line of x.sh; that's not what you want, right? You want to execute each line of x.sh separately, with the &.
Isaac Brooks
>whoops sorry i meant this:
No problem.
Also, this code:
for i in $(seq $(cat x.sh | wc -l)); do ./x.sh & disown; done
1) Maxes out all my cores, even if I close the GUI console window.
2) Maxes out the internet connection.
3) Looks painfully linear when executed, meaning I don't see multiple files being created in the folder, while the other version starts shitting multiple files into the folder right after starting, meaning it's downloading multiple files at the same time.
Are you certain there are no other problems with the code? Like it's getting something ridiculous like for i in $(seq 2354567567
Luis Adams
OP, your problem is I/O bound. Bash is not the right tool for the job. Use something else.
in the op you said you wanted to launch multiple bash scripts from a single bash script, and that's what the line i gave you does
are you looking to run a different script on each iteration, or the same one? if it's the former, get rid of the "seq" and loop over each line of the file with the links, and pass "$i" as an argument to your youtube-dl command.
Jacob Kelly
THE FUCK! Why is Jow Forums adding stuff into my code?! pastebin.com/yV3fhQm8 — here is 100% of x.sh.
Colton Anderson
Exactly. Try to figure this out: for each line of this file, you execute this entire file. If you have n lines, you execute n^2 commands.
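If the goal is one run per line instead, something like this should do it (untested sketch):
while IFS= read -r line; do
    bash -c "$line" &   # run each line of x.sh as its own background command
done < x.sh
wait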
Brody Phillips
Me again. I fiddled with termux (phoneposting) and found a better solution. Here is a working single line:
> awk '{system($0 "&")}' x.sh
^this uses awk, the most underrated terminal program
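In case that looks like magic: awk reads x.sh line by line, $0 is the current line, and system($0 "&") hands the line plus an ampersand to a shell, so every command gets backgrounded. Spelled out, same effect:
awk '{ system($0 " &") }' x.sh   # for each line of x.sh, run: <line> &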
Brody Gutierrez
>in the op you said you wanted to launch multiple bash scripts from a single bash script
Yes, I still want this.
>are you looking
It only creates 1 file in the folder while downloading. Contrast this with running this script: pastebin.com/piWtS8ra
which will create multiple files within seconds. You can see this in the file browser.
Like, run the scripts yourself.
Ian Long
>executing these 2 files in 1 folder
I'm only doing ./a.sh, BTW.
Carter Smith
WOW, it looks like it's working. Thanks. I can see the results myself after the test finishes.
>^this uses awk, the most underrated terminal program
Thanks, if only I understood what it's actually doing.
Benjamin Watson
Ah I see. x.sh+a.sh do a different thing from the third pastebin you linked.
I assumed you wanted to launch the same script multiple times, and what is happening is that all the commands in x.sh are getting run over and over again, multiple times each.
The other pastebin runs each command in x.sh once.
But yeah, if you wanted to do that in the first place, then this is the best approach:
for url in `cat a.txt`; do youtube-dl -ci -o "%(title)s-%(id)s.%(ext)s" --no-check-certificate $url & done
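One thing to watch with the backtick version: it word-splits on any whitespace, which is fine for URLs but would choke on lines containing spaces. A slightly sturdier variant that does the same thing:
while IFS= read -r url; do
    youtube-dl -ci -o "%(title)s-%(id)s.%(ext)s" --no-check-certificate "$url" &
done < a.txt
wait   # optional: block until every download finishes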
Matthew King
I wanted to download multiple videos using youtube-dl, with minimal difficulty for me in copy-pasting the URLs.
Austin Murphy
No problem, I'm happy that it works. From what I've read, you are competent enough to use Linux and copy-paste snippets from the web, but not to create your own. If you have time, I suggest you learn some basic programming, maybe in Python; it's fun and useful.
Leo Watson
Sweet, thanks! A lot.
>I suggest you learn some basic programming
I'm no master programmer, but I do know C and other languages.
However, C is not something for system automation, you know, the nice stuff like using youtube-dl. And bash is mostly about chaining together other programs.
I wanted to know how to start multiple things at the same time, googled it, and the results were all for GNU parallel; look at the OP picture to see my frustration.
That is the best solution, I think. After learning that & exists I was thinking about a similar solution, but I wanted to ask before I started trying to write my own abomination.
What is GNU parallel actually used for? I'm interested.
I would be surprised if youtube didn't throttle you to oblivion no matter how parallel your downloads are. Don't expect to get any gain from this; I would imagine that if youtube recognizes a bot, it throws captchas at it, and that would block youtube-dl.
Colton Bailey
Fun fact.
>youtube
Gives you max download speed all the time.
>Don't expect to get any gain from this
Other services that throttle are tricked by this, and that is a fact!
>recognizes a bot it throws captchas at it
Nope, they are not detecting this.
Feel free to test it yourself and enjoy the best solution given above. You are all welcome.
Also, if you're still here, here's an extract from the parallel man page:
--jobs N
-j N
--max-procs N
-P N
    Number of jobslots. Run up to N jobs in parallel. 0 means as many as possible. Default is 100%, which will run one job per CPU core.
    If --semaphore is set, default is 1, thus making a mutex.
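So, going by that, you could have told it to run more jobs than cores, e.g. (job counts made up):
parallel -j 20 --no-notice -a videos.sh     # 20 jobs at once, regardless of cores
parallel -j 200% --no-notice -a videos.sh   # two jobs per CPU core
parallel -j 0 --no-notice -a videos.sh      # as many as possible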
Camden Evans
Yeah, thanks, but the problem was that my CPU cores were maxed out all the time.
Thanks for this. Also, what exactly is GNU parallel used for?
Sebastian Sanders
Same as what you were using it for. Running tasks in parallel.
James Phillips
Why did my cores max out when I ran it?
>Same as what you were using it for. Running tasks in parallel.
Bullshit like youtube-dl? Or some protein-folding crap?
Isaiah Taylor
>The number of scripts I can start is limited by the number of PROCESSOR threads I have on my computer.
Meanwhile, AmigaOS was fully multitasking on a single core in the 80s.
Michael Nguyen
I doubt biology scientists would use bash for their computing needs. My bet is its target audience is people like you, automating tasks to make their workflow easier.
>Why did my cores max out if I run it?
I don't know. I haven't used parallel a lot (I generally use perl, not bash), but when I did, I didn't have any issues. Are your cores maxing out with bash's &?
Joshua Thomas
I know. Every OS was multitasking. I was only curious how to make it work in bash, because this is what I got from GNU parallel.
Do you think the picture is accurate?
Ryder Collins
>Are your cores maxing out with bash's &?
There is a spike and a hill across all cores when I start with &, but after that it's normal.
See picture. I'm running this in the terminal emulator and the PC is under some load now. However, even when they spike, I don't see any system lag.
GNU parallel was maxing them out all the time.
>automating tasks to make their workflow easier.
Then why the fuck is it limited to the number of my processor threads by default? Whose brilliant idea was this?
I'd understand if it gave complete priority to every script, etc. But this...
I think you've misunderstood what parallel is for. If you just want to open a bunch of shit, do:
thing & otherthing & anotherthing & etc &
Also, you can tell parallel how many jobs to run at once, obviously, since job management is its entire purpose.
Connor Nelson
What am I looking at? Relate this to the given examples:
Dylan Allen
>Then why the fuck is it limited to the number of my processor threads?
Cause 8 cars usually don't fit down a 4-lane highway.
Leo Butler
>i think you've misunderstood what parallel is for
I was thinking the same. Is it for some protein-folding crap?
Ryder Nguyen
echo -e {1..100}'\n' | xargs -n1 -P5 -I{} echo {}
will run 5 instances of whatever command you want in parallel, sequentially iterating over the input range. Killing the xargs will kill the subprocesses, so it's a bit better than & imo.
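Applied to your case, assuming the URLs sit one per line in links.txt, it would look something like:
xargs -n1 -P5 -I{} youtube-dl -ci -o "%(title)s-%(id)s.%(ext)s" --no-check-certificate {} < links.txt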
Grayson Cruz
Also, thanks a lot for the good examples, but others have given the same examples before you (not that I expect everyone to read every post). I'm only curious about what exactly GNU parallel is for.
Jack Allen
Its purpose is to take a large list of commands and run them in parallel, typically one per core, since that makes sense in most situations, which is why it defaults to that. It can also handle jobs on remote machines and other fancy shit.
As an example, let's say you want to resize a folder of 1000 pictures. It may not be possible to do them all at once due to memory constraints, and if you have only 4 cores (and not 1000), it won't be any faster than doing 4 at a time. Which is where parallel comes in: it will go through all 1000, but only 4 at any one time; when one finishes, it moves on to the next.
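For instance, with ImageMagick's mogrify (the 50% and the file pattern are just made-up examples):
parallel -j 4 mogrify -resize 50% {} ::: *.jpg   # resize every jpeg in place, 4 jobs at a time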
Lucas Ortiz
-- and a working example (transcoding a bunch of audio files):
$ time for i in *.flac; do ffmpeg -i "$i" -c:a libopus -b:a 112k "${i%.flac}".opus; done
...
real 0m41.579s
user 0m49.216s
sys 0m0.600s
$ time parallel ffmpeg -i "{}" -c:a libopus -b:a 112k "{.}".opus ::: *.flac
...
real 0m12.806s
user 0m51.855s
sys 0m0.848s
Andrew Hughes
Hmm. I don't think I quite understand. Why not do these things sequentially?
For example, GIMP has a batch function to do massive photo edits in one go on multiple files.
The stuff I wanted it for is bullshit to circumvent the limitations of some streaming services. I can use the bash & to string together other bullshit, like updating youtube-dl, other system cleanups, etc.
>only 4 cores
Why is it limited per core, and not some custom number you give it?
I could understand it if I had thousands of commands, but the per-core limit and the cores maxing out suggested some dedication of one core per job.
Zachary Nguyen
>Why is it limited per core? And not some custom number you give it?
It isn't; it /defaults/ to one job per core, for the reason I explained. You can specify an arbitrary number of jobs with "-j". In most cases this is used to run jobs which each max out one core, though you could, for example, tell it to run half as many 2-threaded jobs, or run more jobs than CPUs if the jobs are limited in some other way, such as by the network.
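Concretely (numbers invented for illustration):
parallel -j 50% ffmpeg -threads 2 -i {} {.}.mp4 ::: *.mkv   # half as many jobs, each using 2 threads
parallel -j 32 -a links.txt youtube-dl {}                   # network-bound, so way more jobs than cores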
James Evans
>this is used to run jobs which max out one core
I'm interested in examples of these jobs. For example, you did talk about audio.
Ryan Myers
Most things are serial in nature. Most of the time, jobs that use multiple CPUs are just a set of serial jobs running in parallel, at least to a degree. The reason is simple: splitting a job into multiple pieces requires that the resulting pieces don't depend on each other. For example, in video encoding you can't encode multiple consecutive frames in parallel (except in the case of I-frame-only video), as most frames depend on the output of the previous frame(s). Think of it like a production line: each step depends on the last step being completed, so you can't skip anything.
Brandon Hughes
Speaking of audio, I *could* split one audio track into multiple jobs, to encode a single audio file in parallel. Most (if not all) audio codecs work like video formats, where the state of the decoder resets periodically to make seeking easier (if it didn't, you'd have to decode an hour of audio to seek to the 1-hour mark, for example). Each time this happens, the need to know any previous state is removed, making it possible to process each of these chunks in parallel.
Juan Morgan
>I'm interested in examples of these jobs.
I'm lazy as fuck, so I just write single-threaded things and parallel them.
Jacob Kelly
OP here. I was trying to rewrite this script so that it executes everything line by line.
a.sh:
#!/bin/sh
3) parallel -a links.txt youtube-dl -ci -o "%(title)s-%(id)s.%(ext)s" --no-check-certificate {}
Lincoln Stewart
(Yes, I realise 2 isn't "all at once", but it's really not a good idea to truly have no limit. Imagine spinning up 2000 instances of youtube-dl at once: not only will it choke your machine, it'll use tons of RAM, kill network response times, and generally just be completely useless.)
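If you want a middle ground, capping it explicitly works, e.g. (the 8 is arbitrary):
parallel -j 8 -a links.txt youtube-dl -ci -o "%(title)s-%(id)s.%(ext)s" --no-check-certificate {}   # at most 8 downloads at once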
Henry Lopez
This is why I'm using 2 files: one with the commands (x.sh) and another to launch the commands (a.sh).
Sometimes it's a good idea to run things sequentially, because some video is fucked up (unavailable in your country, removed for offensive content) and you want to see the output.
Carter Ramirez
Thanks a lot, the best solution! You all rock.
Let no one say that Jow Forums is useless.
Aiden Hill
Basically in my setup I can change the "Parallelization engine" (I made this word up) to whatever I want.
And I'm not locked into one solution.
Software modularity is a good thing.
Tyler Sanchez
>Internet connection is not even at 30%. >All cores are used. >All cores are maxing out.
>implying you could max out your internet connection
You're getting jewed by the servers; even if you didn't have a Fisher-Price CPU, you'd still be far from filling that pipe.
Thomas Davis
bump
Aaron Howard
parallel is easy to use:
parallel youtube-dl ::: url1 url2 url3 url4 etc
# Or
parallel youtube-dl < file_with_urls_on_each_line