Handling of long operations

gentux
Posts: 11
Joined: Thu Feb 04, 2021 6:31 pm

Handling of long operations

Post by gentux » Fri Apr 23, 2021 1:29 pm

Hi Mike

When I leave a long-running operation unattended, I often return to my computer hours later to find that an error on a single file has paused the whole operation (as per Murphy's law, it was exactly the file that was being copied the moment I left the room). I would rather have it carry on with everything else, so that once I dismiss the error message the operation is actually finished.

It would be quite easy: just put all the selected files into a task queue, and when one fails, take it out of the execution queue and put it at the end.
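
A rough sketch of that queue in Python (my own illustration, not how any existing file manager implements it; the function and variable names are invented):

from collections import deque
import shutil

def run_queue(jobs, max_retries=1):
    # jobs: iterable of (src, dst) path pairs to copy.
    # A failed job goes to the back of the queue instead of
    # blocking everything behind it with a dialog.
    queue = deque((src, dst, 0) for src, dst in jobs)
    failures = []
    while queue:
        src, dst, attempts = queue.popleft()
        try:
            shutil.copy2(src, dst)
        except OSError as err:
            if attempts < max_retries:
                queue.append((src, dst, attempts + 1))  # try again at the end
            else:
                failures.append((src, dst, err))  # collect for one final report
    return failures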

Another benefit of the queue: sometimes I want to copy a file whose destination won't have enough free space until the operation before it has completed, and a queue is the natural way to handle that.

I hope you understand that the current situation is quite annoying.

Best regards
gentux

mike
Posts: 1060
Joined: Thu Jul 16, 2015 5:35 am
Location: Exeter, UK

Re: Handling of long operations

Post by mike » Tue Apr 27, 2021 8:37 pm

Hi,
Could you provide an example of such a situation - what exactly happens that requires human interaction?

gentux
Posts: 11
Joined: Thu Feb 04, 2021 6:31 pm

Re: Handling of long operations

Post by gentux » Tue Apr 27, 2021 9:18 pm

Here's a recent situation as an example. I wanted to free up space on my USB drive quickly, but I couldn't just copy everything off and format it, because I had some portable programs installed there and that would have been a hassle. So I chose to move a bunch of folders of video files to the NAS, some of which were already there and some not.

Granted, it was a bit more than a terabyte, and it was just an error saying that the file already exists. But when you move files and skip one that already exists, the subsequent deletion of the source folder throws an error too, saying the folder is not empty (which is good behaviour), yet it won't delete the other items that actually were moved until you click the error away. So you wait once more. It also seemed to me that it didn't continue moving other files until I chose to skip the first one, but I can't tell for sure whether it did.
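
To illustrate where that "folder not empty" error comes from: a move across drives is effectively copy-then-delete, so any skipped file blocks the final removal of its folder. A minimal Python sketch of that logic (my own illustration, not the application's actual code):

import os
import shutil

def move_tree(src_dir, dst_dir):
    # A cross-drive "move" is really copy-then-delete.
    os.makedirs(dst_dir, exist_ok=True)
    for name in os.listdir(src_dir):
        src = os.path.join(src_dir, name)
        dst = os.path.join(dst_dir, name)
        if os.path.isdir(src):
            move_tree(src, dst)
        elif os.path.exists(dst):
            continue  # "skip": the file stays behind in src_dir
        else:
            shutil.copy2(src, dst)
            os.remove(src)
    os.rmdir(src_dir)  # raises "Directory not empty" if anything was skipped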

That's all fine by me and expected behaviour; my problem was just that I needed to free up space on my external drive quite quickly, so I chose to move everything in one go, which in the end took two days. If I had been able to split up the move (like selecting multiple files in other commanders and pressing F6 followed by F2), only the files that needed a skip/overwrite decision would have been halted, and the ones that moved successfully could have been deleted earlier, freeing space sooner.

Hope you see what I mean. It's surely no dealbreaker, but I'd like that sort of functionality in a commander as an option.

mike
Posts: 1060
Joined: Thu Jul 16, 2015 5:35 am
Location: Exeter, UK

Re: Handling of long operations

Post by mike » Wed Apr 28, 2021 8:33 pm

Ok, got it, thank you.
As a more realistic approach, what if the operation allowed you to set defaults for such behaviour upfront?
i.e. "on 'file exists' - do this", "on 'file locked' - do that", etc.
Essentially what happens under the hood when a dialog is answered with the "Apply to all" checkbox turned on.
In this particular scenario, if the operation knew in advance that it should skip any items that already exist in the target directory - would that change the picture?
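
Something along these lines - a sketch of the concept only, with invented names, not the actual implementation:

SKIP, OVERWRITE, ASK = "skip", "overwrite", "ask"

# Defaults chosen before the operation starts.
policy = {
    "file_exists": SKIP,
    "file_locked": SKIP,
}

def resolve(condition, ask_user):
    # Look up the preset action; only fall back to a dialog
    # when no default was chosen upfront.
    action = policy.get(condition)
    if action is None:
        action, apply_to_all = ask_user(condition)
        if apply_to_all:
            policy[condition] = action  # same effect as the "Apply to all" checkbox
    return action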

gentux
Posts: 11
Joined: Thu Feb 04, 2021 6:31 pm

Re: Handling of long operations

Post by gentux » Sun May 02, 2021 9:17 am

Yeah, that would do it.
I think it would help further if the move command remembered what was skipped and didn't try to delete any of a skipped item's parent folders. Together with a "Skip all" option, that should do away with all the errors I saw.
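
Extending the earlier sketch, remembering the skips could look roughly like this (again just an illustration with invented names):

import os
import shutil

def move_tree(src_dir, dst_dir, skipped=None):
    # Like before, but remember every skipped file so we never
    # attempt to delete a folder that still holds one.
    skipped = set() if skipped is None else skipped
    os.makedirs(dst_dir, exist_ok=True)
    for name in os.listdir(src_dir):
        src = os.path.join(src_dir, name)
        dst = os.path.join(dst_dir, name)
        if os.path.isdir(src):
            move_tree(src, dst, skipped)
        elif os.path.exists(dst):
            skipped.add(src)  # "Skip all": note it and carry on
        else:
            shutil.copy2(src, dst)
            os.remove(src)
    # Delete the source folder only if nothing below it was skipped.
    if not any(p.startswith(src_dir + os.sep) for p in skipped):
        os.rmdir(src_dir)
    return skipped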
