Re: a command "spool", i.e. queueing commands?

See the batch command.
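
For instance (assuming a stock at/atd setup; note that batch gates jobs on
the load average rather than strictly running them one after another, so it
only roughly matches what you describe):

    echo "mv /path/hugefiles /other/path/hugefiles" | batch
    echo "updatedb" | batch
    atq        # list the pending jobs
    atrm 2     # drop job number 2 from the queue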

On Tue, 2 Aug 2005, urgrue wrote:

> > Could you please give us a real-life example?
> 
> Well, for example, imagine you're doing some maintenance on a server.
> You want to move some big files, run updatedb by hand, run a script
> that parses a bunch of files, and finally merge some video files.
> 
> Because these are very disk-heavy operations, it's much better and
> quicker if they are run sequentially rather than all at once.
> 
> So normally you'd probably use "at" to run one now, another in an hour,
> etc. But this is dumb because you simply have to guess whether the
> previous job is done or not. You could also run them in sequence by
> putting them all in the same "at" job, but it's not possible to add or
> remove commands from an "at" job once it's created (see the example
> below). Also, "at" is "user-aware", i.e. if two users set up two at
> jobs they will happily run at the same time. What I propose would add
> all commands to the same queue, even if a different user is the one
> launching the command (but of course the commands should be run AS the
> user who started it).
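> 
> (To illustrate that single-"at"-job workaround, roughly, with the same
> paths as above:
> 
>     echo 'mv /path/hugefiles /other/path/hugefiles; updatedb' | at now
> 
> ...which runs them in order, but can't be edited once it's submitted.)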
> 
> So what I'm thinking is like this (let's assume the command is called  
> "cs" for "command spooler"):
> me:~> cs
> Usage: cs [add|remove|show] <command>
> me:~> cs show
> no entries
> me:~> cs add mv /path/hugefiles /other/path/hugefiles
> command added to queue
> me:~> cs show
> 1 [running] : mv /path/hugefiles /other/path/hugefiles
> me:~> cs add updatedb
> command added to queue
> me:~> cs add /path/my_file_parsing_script /path/lots/of/files
> command added to queue
> me:~> cs add avimerge -i clip1.avi clip2.avi clip3.avi -o allclips.avi
> command added to queue
> me:~> cs show
> 1 [running] : mv /path/hugefiles /other/path/hugefiles
> 2 [queued]  : updatedb
> 2 [queued]  : updatedb
> 4 [queued]  : avimerge -i clip1.avi clip2.avi clip3.avi -o allclips.avi
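> 
> A very rough sketch of how the runner side could work (everything here
> is hypothetical: the spool path, the locking, the lack of per-user
> handling):
> 
>     #!/bin/sh
>     # cs-run: pop commands off a shared queue file, one at a time, and
>     # run them in order. Sketch only.
>     QUEUE=/var/spool/cs/queue
>     LOCK=/var/spool/cs/lock
> 
>     while :; do
>         # Atomically take the first line off the queue under an
>         # exclusive lock (flock(1) from util-linux).
>         cmd=$(flock "$LOCK" sh -c \
>             "head -n 1 '$QUEUE' 2>/dev/null; sed -i 1d '$QUEUE' 2>/dev/null")
>         [ -z "$cmd" ] && break    # queue empty, stop
>         sh -c "$cmd"              # run the next job in the foreground
>     done
> 
> "cs add" would then just append a line to the queue file under the same
> lock, and "cs show" would print it.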
> 
> 
> As I do lots of disk-heavy operations, I would find this INCREDIBLY  
> useful.
> 
> urgrue
> 
> 
> > --Adrian.
> > 
> > On 8/2/05, urgrue <urgrue@xxxxxxxxx> wrote:
> > > I realized it would be useful to be able to add commands into a
> > > command queue, from where they would get executed in sequence. For
> > > example, numerous large hard disk-intensive operations would be
> > > better executed in sequence rather than all at once.
> > > In other words, exactly like a printer spool, but for commands. You
> > > could add commands in, list the queue, and remove from the queue.
> > >
> > > Does anyone know of something like this?
> > >
