I have a client/server application where the client submits long-running jobs. The way I'm designing this, the client uploads a blob to storage and then queues a message containing the blob identifier. Workers on the server side process the job and report progress via a queue.

What I'm trying to figure out is whether all jobs should report progress to a single shared queue, or whether each job should get its own "response queue". Is it expensive in any way to create and delete queues? What concerns me about the shared queue is that clients would be listening to (or at least getting and filtering) responses from all jobs, not just their own.

So I'm leaning toward a slightly tweaked process: the client uploads a blob, creates a queue with the same name, starts listening on that queue, and then queues a message to the server. The worker processes the blob and sends its responses on the new queue. When the job is complete, the client deletes the queue. Is that a reasonable approach?
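To make the flow concrete, here's a rough sketch of what I have in mind, using in-memory `queue.Queue` objects and a dict as stand-ins for the real storage service (all the names here are made up for illustration):

```python
import queue
import threading
import uuid

# In-memory stand-ins for blob storage and the queue service; a real
# implementation would use the storage SDK's blob and queue clients.
blobs = {}
queues = {"jobs": queue.Queue()}  # shared request queue the workers listen on


def submit_job(payload):
    """Client side: upload the blob, create a per-job response queue,
    then queue a message to the server."""
    job_id = str(uuid.uuid4())
    blobs[job_id] = payload            # 1. upload the blob
    queues[job_id] = queue.Queue()     # 2. create a queue with the same name
    queues["jobs"].put(job_id)         # 3. tell the server about the job
    return job_id


def worker():
    """Server side: take one job and report progress on its own queue."""
    job_id = queues["jobs"].get()
    response_q = queues[job_id]
    data = blobs[job_id]
    response_q.put(("progress", 50))   # interim progress report
    result = data.upper()              # placeholder for the real processing
    response_q.put(("done", result))


def collect(job_id):
    """Client side: drain the job's response queue, then delete it."""
    responses = []
    while True:
        kind, value = queues[job_id].get()
        responses.append((kind, value))
        if kind == "done":
            break
    del queues[job_id]                 # 4. client deletes the queue
    return responses


job = submit_job("hello")
t = threading.Thread(target=worker)
t.start()
t.join()
responses = collect(job)
print(responses)  # [('progress', 50), ('done', 'HELLO')]
```

The part I'm unsure about is step 2 and step 4: in this sketch, creating and deleting a queue is free, but against a real storage service each one is a management operation, which is exactly what prompts my question about cost.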
Any ideas or thoughts appreciated.
Thanks!
Brad.