Hi, can anyone who has received their AWAIR Element / is in the mid-October batch please share their order number, just to get an estimate of how many orders they have to process and what the queue looks like? Thanks!
πŸ‘︎ 8
πŸ’¬︎
πŸ“…︎ Oct 01 2021
🚨︎ report
Hi! How do I export only the first 5 seconds of all videos in the Adobe Media Encoder queue? I need to do this as a batch.
πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/ofmarconi
πŸ“…︎ Dec 15 2021
🚨︎ report
Please bring back the old Batch Queue.

Also it’s pretty annoying that although a recipe is checked, you still have to click on it. Why would that matter?

But for real, I miss the queue. It helps me see what's left, lets me stop, pause, drag important images to the front, etc.

πŸ‘︎ 16
πŸ’¬︎
πŸ‘€︎ u/crutonic
πŸ“…︎ Sep 20 2021
🚨︎ report
Did the ability to start and stop the batch queue go away in 21?

Or did it get put somewhere I can't find it?

πŸ‘︎ 8
πŸ’¬︎
πŸ‘€︎ u/evSftw
πŸ“…︎ Jul 27 2021
🚨︎ report
ANC bigwigs jump the Queue: President Cyril Ramaphosa to personally receive first batch of Covid-19 vaccine doses as they are delivered news24.com/news24/southaf…
πŸ‘︎ 5
πŸ’¬︎
πŸ‘€︎ u/Euro_African
πŸ“…︎ Feb 01 2021
🚨︎ report
Next batch? Queue system? Is your index next in line to be shipped? Let's figure it out!

Hi everyone,

Despite clicking within the first minute, I haven't made it into the first batch, and with the release of HL: Alyx in just 6 days, I doubt I have any chance of getting the headset in time. For me it now says:

Ships in 3 to 5 weeks.

Now, I know that Valve is not giving us queue numbers, but I think we can figure it out ourselves, or at least get a better estimate, by comparing the order status with the exact time of the confirmation email. If you are using Gmail you get the exact time (to the second) by checking the email headers: open the confirmation email, click the 3 dots in the upper right corner and choose "Show original". There is more information on how to do that on Google support.

Received time:

There in the headers, you can find the exact received time and in my case, it's Mon, 9 March 2020 10:00:27 -0700 (PDT).
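
If you'd rather script the extraction, here is a rough Python sketch (stdlib only) that pulls the timestamp out of the raw message saved via "Show original" -> "Download original" (the filename is just an example):

    # sketch: read the topmost Received timestamp from a downloaded raw email
    import email
    from email.utils import parsedate_to_datetime

    with open("confirmation.eml") as f:              # placeholder filename
        msg = email.message_from_file(f)

    received = msg.get_all("Received")[0]            # first Received header = final delivery hop
    timestamp = parsedate_to_datetime(received.rsplit(";", 1)[1].strip())
    print(timestamp)                                 # e.g. 2020-03-09 10:00:27-07:00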

From this info alone, we can infer that everyone who received their confirmation before second 27 will have their Index shipped sooner than 3 to 5 weeks. The more information we have, the better we can piece together what the expected shipping times will be.

I, for one, wonder at what second the confirmation emails of those who have already received their headsets arrived. That would give us a better clue as to which second the second batch begins.

I will keep updating this post as we gather more info on the matter. If you want to participate in this survey, all you have to do is post your order status together with the exact time you received your confirmation email. I think we can figure this out together, because this waiting game and the uncertainty are hard.

Rough estimate for EU / US / ASIA queue times:

  1. **First batch - Shipping.** (9 data points)
    From: Mon, 9 March 2020 09:59:50 -0700 (PDT)
    To: Mon, 9 March 2020 10:00:27 -0700 (PDT)
  2. **Second batch - Ships in 3 to 5 weeks.** (33 data points)
    From: Mon, 9 March 2020 10:00:03 -0700 (PDT)
    To: Mon, 9 March 2020 10:32:37 -0700 (PDT)
  3. **Third batch - Ships in 4 to 6 weeks.** (0 data points)
    Not enough info
  4. **Fourth batch - Ships in 5 to 8 weeks.** (0 data points)
    Not enough info
  5. **Fifth ba
... keep reading on reddit ➑

πŸ‘︎ 95
πŸ’¬︎
πŸ‘€︎ u/moredhell
πŸ“…︎ Mar 16 2020
🚨︎ report
Bash - simplest dirtiest queue besides atd/batch

Looking for the simplest dirtiest queue I can use to run a lot of shell scripts one at a time.

Edit / clarification:

I have a bunch of cron jobs running. I need to serialize a portion of them to avoid race conditions. Right now I am doing the following in crontab (abbreviated syntax, which won't work as-is; it's actually nastier, but I'm on mobile right now):

32 * * * * cat <<< 'export x=foo ; script1.sh' | batch

27 * * * * cat <<< 'export y=bar ; script2.sh' | batch

... Lots more, plus some can be triggered outside of cron but still need to be throttled to only one running at a time. Ultimately I'm calling a Java app with a task I can't guarantee will be safe with concurrent runs.

End edit.

It needs to take in a combo of script name + some shell environment variables to set, and run each script with them - name + vars should be passable as args.

Currently using batch (from at/atd). It works reliably eventually, but the limit of one job per minute slows things down, and batch also pauses when the average CPU load gets too high - which is undesirable here.

The environment is pretty locked down, with restrictions on software - or I would just spin up a Jenkins. For this environment that would mean months of compliance checkboxes and involving another team, so I'm trying to skip around it.

Software from the standard RHEL repos works. Or the Python stdlib. Suggestions for binaries or pip packages outside of that are still welcome, but major bonus points for extra-standard stuff.
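
For illustration, roughly the kind of thing I'm imagining - a Python-stdlib wrapper that takes an exclusive file lock so only one script runs at a time (the lock path and the name=value arg handling are just a sketch):

    #!/usr/bin/env python3
    # runone.py - sketch: serialize script runs with an exclusive file lock (stdlib only)
    # usage: runone.py script1.sh x=foo y=bar
    import fcntl
    import os
    import subprocess
    import sys

    LOCK_PATH = "/tmp/runone.lock"   # illustrative; any path all the jobs can write to works

    def main():
        script, *assignments = sys.argv[1:]
        env = dict(os.environ)
        env.update(kv.split("=", 1) for kv in assignments)   # name=value pairs become env vars

        with open(LOCK_PATH, "w") as lock:
            fcntl.flock(lock, fcntl.LOCK_EX)    # blocks until no other job holds the lock
            subprocess.run([script], env=env, check=False)
            # lock is released when the file is closed

    if __name__ == "__main__":
        main()

Each cron line would then become something like `32 * * * * /path/to/runone.py script1.sh x=foo`, and anything triggered outside cron calls the same wrapper.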

Thanks!

πŸ‘︎ 35
πŸ’¬︎
πŸ‘€︎ u/happyfridays_
πŸ“…︎ Oct 03 2020
🚨︎ report
Service bus queue message batch sending as single message through rest api

Hey everyone,

I'm trying to send a batch of messages to a Service Bus queue via the Azure REST API. I'm able to send single messages just fine, but I'm having some trouble getting batches to show up correctly in the queue. Each batch is appearing in the queue as a single message, whereas I believe they should be separated out into individual messages.

Following this doc: https://docs.microsoft.com/en-us/rest/api/servicebus/send-message-batch#request

I can't quite figure out what differentiates a single message from a batch, aside from just passing an array rather than an object to the queue.

Anyone familiar with service bus queues have any pointers?

EDIT:
Okay, so it seems like the REST API may not be able to send batches to Service Bus queues. I'm seeing some conflicting information, tbh - from Azure docs explaining how to send batches via the REST API, to Azure .NET docs explicitly stating that sending batches via the REST API is not supported.
Honestly I'm not sure anymore, but I decided to just use the Node SDK to get around this and am now able to send batches of messages to Service Bus queues.
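
For comparison, the same thing in the Python SDK looks roughly like this (just a sketch for reference - I actually used the Node SDK; the connection string and queue name are placeholders, and it assumes azure-servicebus v7):

    # sketch: send a batch of messages so each item lands as its own message on the queue
    from azure.servicebus import ServiceBusClient, ServiceBusMessage

    conn_str = "Endpoint=sb://<namespace>.servicebus.windows.net/;..."   # placeholder
    queue_name = "my-queue"                                              # placeholder

    with ServiceBusClient.from_connection_string(conn_str) as client:
        with client.get_queue_sender(queue_name) as sender:
            messages = [ServiceBusMessage(f"payload {i}") for i in range(10)]
            sender.send_messages(messages)   # arrives as 10 separate messages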

πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/Notaters
πŸ“…︎ Apr 13 2021
🚨︎ report
When you are gutted because the person in front of you in the chippy queue gets fish and chips x 10. As you are paying for your single portion of chips they announce, "Sorry, it will just be a minute for a fresh batch of chips." You are fuming because you got the scrap ends of the last batch.

Torn between being grateful they had enough left for you - and wishing you had walked in 1 minute later

πŸ‘︎ 80
πŸ’¬︎
πŸ‘€︎ u/Diplodocus114
πŸ“…︎ Jul 13 2020
🚨︎ report
Queue for art requests but its a batch script
πŸ‘︎ 9
πŸ’¬︎
πŸ‘€︎ u/Im_an_OwO
πŸ“…︎ Aug 21 2020
🚨︎ report
One tab says in queue, the other tab is showing out of stock... must be in the queue for the next batch of PS5s I won't get. reddit.com/gallery/jtq61m
πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/1309Log
πŸ“…︎ Nov 13 2020
🚨︎ report
Is it possible to Chain a Batch of Jobs within Laravel Queues?

I have a case where I need to run a few tasks in sequence. One of these tasks requires performing the same number crunching on a large set of data and would be suited to running in parallel, as running it sequentially would be prohibitively slow. Finally, there are some other cleanup jobs that need to run after all the parallel jobs have completed.

The Laravel docs mention about running Chains inside Batches, but not Batches inside Chains. Is such a thing possible?

Here's an example of what I'm trying to achieve:

Bus::chain([
  new SerialJobSetup,
  new SerialJob2,
  Bus::batch([
    new ParallelJob(1), // Runs in Parallel on different queue workers
    new ParallelJob(2), // Runs in Parallel on different queue workers
    new ParallelJob(3), // Runs in Parallel on different queue workers
    new ParallelJob(4), // Runs in Parallel on different queue workers
    ...
  ]),
  new SerialJob3, // Continues to run the chain once all parallel jobs above have completed
  new SerialJobCleanup,
])->dispatch();

I'm pretty sure I've seen a https://twitter.com/themsaid presentation (perhaps it was for Laracon Online?) where he does something similar, but I can't for the life of me find it. I'm wondering if I dreamt it.

EDIT: The solution is not to use Bus::chain() at all!

SerialJobSetup::dispatchSync();
SerialJob2::dispatchSync();

Bus::batch([
    new ParallelJob(1), // Runs in Parallel on different queue workers
    new ParallelJob(2), // Runs in Parallel on different queue workers
    new ParallelJob(3), // Runs in Parallel on different queue workers
    new ParallelJob(4), // Runs in Parallel on different queue workers
    ...
])
->finally(function (Batch $batch) {
    SerialJob3::dispatchSync();
    SerialJobCleanup::dispatchSync();
})
->dispatch();
πŸ‘︎ 4
πŸ’¬︎
πŸ‘€︎ u/Muffins86
πŸ“…︎ Dec 10 2020
🚨︎ report
This batch has been in my queue for an unprecedented 2 minutes. No one’s picked it up because of the 4 cases of water. FFS.
πŸ‘︎ 5
πŸ’¬︎
πŸ‘€︎ u/jimmerzbuck
πŸ“…︎ Aug 01 2020
🚨︎ report
digiKam 7.1.0 is released. Includes better Canon CR3 metadata support, new Batch Queue Manager plugins to fix Hot Pixels automatically and to apply texture over images, and better metadata management (improved IPTC compatibility with UTF-8 characters encoding). digikam.org/news/2020-09-…
πŸ‘︎ 59
πŸ’¬︎
πŸ‘€︎ u/anaxarchos
πŸ“…︎ Sep 08 2020
🚨︎ report
AMD Queues Its First Batch Of AMDGPU Changes For Linux 5.9: Sienna Cichlid + More phoronix.com/scan.php?pag…
πŸ‘︎ 42
πŸ’¬︎
πŸ‘€︎ u/allenout
πŸ“…︎ Jun 27 2020
🚨︎ report
When the fish and chip shop puts salt and vinegar on your chips without asking, and you have to tell them they got it wrong in front of a big queue of people, which requires them to give you a whole fresh batch of chips and sets your order back at least another minute, to the clear ire of the queue

Bonus problem: when I got home, it turns out they gave me a pot of gravy instead of curry sauce - it was an absolute disaster

πŸ‘︎ 53
πŸ’¬︎
πŸ‘€︎ u/JackHadders
πŸ“…︎ Aug 16 2019
🚨︎ report
In the next digiKam 7.1.0, planned for the end of August, Apply Texture will be available as a plugin in the Batch Queue Manager.
πŸ‘︎ 9
πŸ’¬︎
πŸ‘€︎ u/human_dynamo
πŸ“…︎ Aug 13 2020
🚨︎ report
In the next digiKam 7.1.0, planned for the end of August, the Hot Pixels Fixer will be available as a plugin in the Batch Queue Manager.

https://i.imgur.com/XH9IS53.png

πŸ‘︎ 8
πŸ’¬︎
πŸ‘€︎ u/human_dynamo
πŸ“…︎ Aug 08 2020
🚨︎ report
So frustrated! WHILE I'm watching the queue of available batches doing on-demand, a notification for a high-paying batch will pop up. But that batch never shows up in my queue, so I never get an opportunity to select it!! Does this happen to anyone else?

Low-paying batches will pop up in the queue fine after the notification flashes at the top of my screen. This "ghosting" only seems to happen on the higher-paying batches ($25+), and it happens while I am looking at my available batches screen. Anyone else experiencing this?

πŸ‘︎ 8
πŸ’¬︎
πŸ‘€︎ u/K125031
πŸ“…︎ Dec 14 2019
🚨︎ report
AWS Batch - why Job Queue can have multiple Compute Environments?

I am looking for concrete examples of why a Job Queue might have more than one Compute Environment specified.

At first I thought "maybe I have multiple compute environments with different instance types and I let the queue know which order is preferred". However, one Compute Environment can itself have multiple instance types specified and, with a proper allocationStrategy on a MANAGED cluster, will automatically provision a suitable instance (in theory at least). That fact ruled out my initial idea.

So far I have thought about the following use cases:

  • run tasks on SPOT instances; if none are available, fall back to a compute environment with on-demand ones (sketched below)
  • use my own ECS cluster, but if no compute resources are available, fall back to a Batch-managed cluster
  • use an environment with compute-optimised instances, but if none are available, fall back to some generic, cheaper ones defined in another compute environment
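
For example, the first case above would look roughly like this with boto3 (a sketch only; the names and ARNs are placeholders):

    # sketch: a job queue that prefers a Spot compute environment and falls back
    # to an on-demand one (environments are tried in 'order')
    import boto3

    batch = boto3.client("batch")
    batch.create_job_queue(
        jobQueueName="spot-first-queue",   # placeholder name
        state="ENABLED",
        priority=1,
        computeEnvironmentOrder=[
            {"order": 1, "computeEnvironment": "arn:aws:batch:...:compute-environment/spot-ce"},
            {"order": 2, "computeEnvironment": "arn:aws:batch:...:compute-environment/ondemand-ce"},
        ],
    )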

Why else could that feature be useful?

πŸ‘︎ 6
πŸ’¬︎
πŸ‘€︎ u/realfeeder
πŸ“…︎ Jul 22 2020
🚨︎ report
Queue / Batch scheduler

I have a computer that interacts with an external device, basically over serial. I have 10-15 people who log into this computer and run Python scripts that control this external device. Each script takes about 2 minutes to run, but some are much longer. Right now everyone just runs their script, and only one script can access the device at a time; however, I would prefer a more efficient scheduling workflow. I envision something similar to how a supercomputer's resources are shared, except this server is limited not by computational power but by access to this external device. Is there any program that can handle this? The platform would be Windows.
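
In case it helps to see the shape of what I mean, the bare-bones DIY version would be something like a shared lock around device access - a sketch only (the lock path, poll interval, and run_exclusive name are just illustrative):

    # sketch: crude cross-platform "one at a time" gate around the serial device;
    # every user's script would call run_exclusive(...) instead of opening the device directly
    import os
    import time

    LOCK_PATH = r"C:\device\device.lock"   # illustrative shared path

    def run_exclusive(work, poll_seconds=5):
        while True:
            try:
                fd = os.open(LOCK_PATH, os.O_CREAT | os.O_EXCL | os.O_WRONLY)
                break                        # we created the lock file, so we own the device
            except FileExistsError:
                time.sleep(poll_seconds)     # someone else has the device; wait our turn
        try:
            return work()                    # caller's device interaction goes here
        finally:
            os.close(fd)
            os.remove(LOCK_PATH)             # release so the next script can start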

πŸ‘︎ 4
πŸ’¬︎
πŸ‘€︎ u/awkwardWoodshop
πŸ“…︎ Mar 03 2020
🚨︎ report
Single Piece Flow vs Batch and Queue to decrease the leadtime of your sanitation process

Dude. I don't know jack about 3D printing, but I do know jill about industrial engineering. In the context of your sanitation "assembly" line: you can increase your throughput if you follow single-piece flow. This works. A 5-minute video explains it well: https://www.youtube.com/watch?v=ciJckWCMvpA

Lead time = WIP / exit rate; that is, lead time = count of pieces between the garbage bags on the dock and the hands of the hospital staff / count of pieces exiting per day. (For example, 40 pieces in process at an exit rate of 20 pieces per day means a 2-day lead time.)

Single-piece flow lowers your WIP, which will decrease your lead time.

πŸ‘︎ 8
πŸ’¬︎
πŸ‘€︎ u/TruthisaPerson
πŸ“…︎ Apr 07 2020
🚨︎ report
How to do a queue of batch files in Youtube-DL?

I have 100 txt files with lots of links in each. I want to start one txt file after another automatically. I need the archive file and output directory to be named dynamically based on the txt file name (I'm already doing this).
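
For reference, the sort of thing I mean - a rough Python sketch that runs youtube-dl once per txt file, deriving the archive file and output directory from the list's name (the paths are just examples):

    # sketch: queue up youtube-dl runs, one per links file
    import glob
    import subprocess
    from pathlib import Path

    for batch_file in sorted(glob.glob("lists/*.txt")):
        name = Path(batch_file).stem
        subprocess.run([
            "youtube-dl",
            "--batch-file", batch_file,                     # read links from this txt file
            "--download-archive", f"archives/{name}.txt",   # skip links already downloaded
            "-o", f"downloads/{name}/%(title)s.%(ext)s",    # per-list output directory
        ], check=False)                                     # keep going even if one list fails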

Already asked in /r/youtubedl: https://old.reddit.com/r/youtubedl/comments/eqwzxd/how_to_do_a_queue_of_batch_files/

πŸ‘︎ 5
πŸ’¬︎
πŸ‘€︎ u/plays_reddit
πŸ“…︎ Jan 19 2020
🚨︎ report
1/4 Batch | β€œIn queue for DNA extraction”

Who’s in the same batch as me?

When are you getting your results?

πŸ‘︎ 6
πŸ’¬︎
πŸ‘€︎ u/GenotypedBeta
πŸ“…︎ Jan 06 2019
🚨︎ report
Day 27. Pink Kush from some BC Flowr seeds that I found in a batch a short while ago. First time grow. Thanks for everyone’s help so far. Love this hobby! Already have my next 3 strains lined up in the queue
πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/SeniorFartAnalyst
πŸ“…︎ May 15 2020
🚨︎ report
Is there a batch downloader or way I can download like 100 seeds either at once or in a queue?

Hey! I do have multiworld setup but it's not on the latest version of the alttpr official site.

I won't have net for possibly up to a week or two, and it will just be me and my PC for that time. Although I could generate seeds offline with the Multiworld gui.py, I'm wondering if there is a way to batch generate/download a large number of seeds at once.

If not, all good. I will end up downloading 50 or so manually if I have to! :D

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/MindkontrolTV
πŸ“…︎ Jan 09 2020
🚨︎ report
Is it possible to queue up different clips for AE to track in a batch?

I have hundreds of clips I need tracked and they're taking a while. Is it possible to create a batch queue?

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/sjull
πŸ“…︎ Apr 08 2020
🚨︎ report
Batch 17 Free Closed Beta! Sign up before 3rd of Jan to get in. (you may have signed up before for their alpha. If so, you will be in queue already) maxlouiscreative.us15.lis…
πŸ‘︎ 21
πŸ’¬︎
πŸ‘€︎ u/MaxLouisCreative
πŸ“…︎ Dec 30 2017
🚨︎ report
[Dev] Shuttle/Shuttle+ Music Player - Major update is live. Batch add to playlists, queue and delete, more eye candy, better tablet layout, gmail style placeholders and the light theme for the free version. Also, Shuttle+ is on sale!

Shuttle/Shuttle+ is a lightweight, intuitive music player for Android. It's been around for a while now, and things are really starting to pick up. Shuttle free recently passed the 50k downloads mark, a proud moment for me!

I've been working extremely hard over the past month to bring multi-select functionality to Shuttle (think batch add to playlists), as well as major performance improvements and bug fixes. There's a lot more eye candy now, thanks to the help of /u/jessecoleman and his design input.

I've also had some help from Andrew Neal of Apollo Music Player, to iron out some bugs and improve the performance of the player. He's been gently encouraging me to make more of an effort giving Apollo credit, as recently Shuttle has adopted more code based off of Apollo. That is to say, yes, there are some parts of Shuttle which are based on the source code of Apollo Music Player. Thanks to Andrew for his hard work on Apollo, and for giving up his time to lend a hand.

Unfortunately, folder browsing did not make the cut for this update. The groundwork is there, but it was too unstable, so it's stuck in the alpha testing phase until I kill some more bugs.

Shuttle+ is currently on sale for ~99c. The paid version has tag editing, and folder browsing is just around the corner, which will be a paid feature as well. Shuttle+ also sports an extra theme, and I hope to include more of those in the near future. In the huge overhaul that was this update, I've also removed the ability to navigate directly back to an artist/album from the 'now playing' page. This is only temporary, and I hope to include that functionality back in for the next update.

The free version is completely ad-free, and always will be, so if you can't afford or don't have access to the Play Store, or you simply don't buy apps, grab a copy of that over here.

I'm currently awaiting musiXmatch approval for the use of their plugin, to bring lyrics support to Shuttle.

Shuttle does not support playback of music stored in the cloud (Google all access and the like), due to the lack of an official API.

Also, there are still some known issues on some Galaxy S3's, causing issues with the sound output and equalizer. I'm

... keep reading on reddit ➑

πŸ‘︎ 89
πŸ’¬︎
πŸ‘€︎ u/timusus
πŸ“…︎ Jul 18 2013
🚨︎ report
Batch 0.1 - A distributed task queue library

A couple of days ago, I released v0.1 of batch, a distributed task queue library (think Celery or Resque).

This library allows you to send tasks to a RabbitMQ broker, so that a worker can pull them and execute the associated handler. It leverages the futures and tokio-core crates to provide asynchronous I/O. It tries to be as safe as possible while still feeling productive and expressive.

#[derive(Serialize, Deserialize, Task)]
#[task_routing_key = "hello-world"]
struct SayHello {
    to: String,
}

fn main() {
    let mut core = Core::new().unwrap();
    let client = ClientBuilder::new()
        .connection_url("amqp://localhost/%2f")
        .handle(core.handle())
        .build();
    let send = client.and_then(|client| {
        job(SayHello { to: "Ferris".into() }).send(&client)
    });
    core.run(send).unwrap();
}

See the examples or the user guide for more information.

πŸ‘︎ 14
πŸ’¬︎
πŸ‘€︎ u/kureuil
πŸ“…︎ Feb 24 2018
🚨︎ report
Right on queue, I got the good half of that sprouts batch from last night promptly at 8:00... 😏

I’m glad nobody took it last night, got it back with a nice peak.. :)

πŸ‘︎ 4
πŸ’¬︎
πŸ‘€︎ u/Im_Just_Rob
πŸ“…︎ Jun 03 2019
🚨︎ report
Designing a Queue for Async Spring Batch Job

First, I'll give some overview of the system.

In the system, a user creates a project, and in that project the user can perform various tasks. Those tasks are very long-running and do many things, so we use Spring Batch to run them. When a user submits a task, we insert it into the DB, and there is a scheduler that retrieves those tasks and fires them. Pseudocode is given below:

    @Scheduled(fixedRate = SCHEDULER_REPEAT_TIME, initialDelay = SCHEDULER_INITIAL_DELAY_TIME)
    public void executeLongRunningTask() throws InterruptedException {

        // Retrieve every job, from every project, that has not been run yet.
        List<JobExecution> taskList = SpringJPADao.getTaskExecutionList(TaskExecutionStatusType.QUEUED);

        // Loop through and run
        for (JobExecution job : taskList) {
            jobLauncher.run(job, jobParameters);
        }
    }

Now, there are some jobs that cause problems if they run concurrently. For example, suppose there are four jobs: j1, j2, j3, j4. The requirement is that j1 and j2 should not run concurrently. Even two j1 jobs should not run concurrently. But all this is specific to a particular project. That means a j1 from Project1 and another j1 from Project2 can run concurrently.

Currently I have solved this by keeping a flag for a particular project in the **DB**. Before running a job I check if the flag is on, like:

        // Loop through and run 
        for (JobExecution job : taskList) {
            // check by project id if there is any job running
            if (taskSemaphore.isRunning(job.getProjectId())) continue;
            
            // acquire lock on the project
            taskSemaphore.acquireLock(job.getProjectId());
            jobLauncher.run(job, jobParameters);
        }

And I release the lock in the `afterJob()` callback provided by Spring Batch's `JobExecutionListener` interface.

I have also implemented another scheduler that monitors whether any job has held its lock for 30 minutes; if so, it releases the lock.

We use the DB for job queuing, along with the above solution that I currently have implemented.

This post is intended for discussion. I am actually looking for how others would approach this kind of problem. Any suggestions or opinions would be appreciated. Thanks!

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/muztaba
πŸ“…︎ Jun 14 2019
🚨︎ report
Batch Rendering or Rendering Queue in the Sequencer?

Hi,

I'm using the sequencer for several visualizations and would like to export the animation by shot rather than one whole sequencer timeline.

Currently, I'm doing it manually. Open a shot. Render it. Open a shot. Render it.

Is there a way to batch render? Select all shots and just queue them all together.

There is a wiki page about this, https://wiki.unrealengine.com/Sequencer_Batch_Rendering, but the command line is beyond my skill set.

πŸ‘︎ 2
πŸ’¬︎
πŸ‘€︎ u/bentraje
πŸ“…︎ Dec 03 2018
🚨︎ report
OpenCL batch computing: task-device pool vs load balancing vs multiple queues (pool is winner) youtube.com/watch?v=Ep-36…
πŸ‘︎ 3
πŸ’¬︎
πŸ‘€︎ u/tugrul_ddr
πŸ“…︎ Jun 17 2017
🚨︎ report
