Multi-User S3 Job Queue Service
 
An internal Python-based service that manages and executes S3 sync, copy, and other AWS S3 commands for multiple users in a queued, fault-tolerant, and portable way.
Tech Stack
- Python
- Boto3
- Cron
- Multi-threading
- JSONL
What I Did
I designed and implemented a Python service that manages and executes AWS S3 jobs (e.g., sync, copy) submitted by multiple users. Jobs are submitted through a lightweight CLI, placed into a job queue, and processed sequentially by a background worker; users can track the status of their jobs through the same CLI. The service runs on both Linux and Windows, with built-in logging, file locking, and per-job status tracking. I used Python's threading and queue modules for job dispatch, and implemented file locking to safely handle concurrent access to shared job files on network shares mounted by mixed Linux and Windows hosts. Since this is an internal tool, the code is not available on my public GitHub.
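The queue-and-worker pattern described above can be sketched roughly as follows. This is a minimal illustration, not the service's actual code: the job fields, the `handle` callback, and the in-memory `statuses` dict are hypothetical stand-ins (the real service persists status to JSONL files and would invoke Boto3 for the actual S3 transfers).

```python
import queue
import threading

def run_worker(job_queue, statuses, handler):
    """Background worker: pull jobs off the queue and run them sequentially."""
    while True:
        job = job_queue.get()
        if job is None:                      # sentinel: shut the worker down
            job_queue.task_done()
            break
        statuses[job["id"]] = "running"
        try:
            handler(job)                     # real service: boto3 sync/copy here
            statuses[job["id"]] = "done"
        except Exception:
            statuses[job["id"]] = "failed"   # keep the worker alive on job errors
        finally:
            job_queue.task_done()

# Usage: submit a few jobs, then wait for the worker to drain the queue.
jobs = queue.Queue()
statuses = {}

def handle(job):
    # Placeholder for the S3 call; reject actions the service doesn't support.
    if job["action"] not in ("sync", "copy"):
        raise ValueError("unsupported action")

worker = threading.Thread(target=run_worker, args=(jobs, statuses, handle), daemon=True)
worker.start()
for i, action in enumerate(["sync", "copy", "delete"]):
    jobs.put({"id": i, "action": action, "src": "s3://bucket/a", "dst": "s3://bucket/b"})
jobs.put(None)   # sentinel
jobs.join()      # blocks until every job (and the sentinel) is marked done
```

Because a single worker drains the queue, jobs from all users execute one at a time, and a failure in one job only marks that job failed rather than killing the worker.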
What I Learned
I learned how to safely write and replace files using atomic operations, especially across NFS and heterogeneous environments. I gained hands-on experience building portable CLI tools with Python's standard library, and I addressed concurrency in shared-file access by using locking to prevent race conditions. I also strengthened my understanding of background job processing with queues and multi-threading.
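The atomic write-and-replace technique mentioned above can be illustrated with a small sketch. `atomic_write_jsonl` is a hypothetical helper, not code from the service; it writes a temp file in the target's directory and swaps it in with `os.replace`, so readers never observe a half-written status file (note that rename atomicity holds on a single local filesystem, while NFS adds caveats of its own).

```python
import json
import os
import tempfile

def atomic_write_jsonl(path, records):
    """Write records as JSONL to a temp file in the same directory,
    then atomically replace the target file."""
    dirname = os.path.dirname(os.path.abspath(path))
    fd, tmp = tempfile.mkstemp(dir=dirname, suffix=".tmp")
    try:
        with os.fdopen(fd, "w") as f:
            for rec in records:
                f.write(json.dumps(rec) + "\n")
            f.flush()
            os.fsync(f.fileno())   # push data to disk before the rename
        os.replace(tmp, path)      # atomic rename; works on POSIX and Windows
    except BaseException:
        if os.path.exists(tmp):    # clean up the temp file on failure
            os.unlink(tmp)
        raise
```

Writing the temp file into the same directory matters: `os.replace` is only atomic when source and destination live on the same filesystem.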