
I've made a Telegram bot: when you send a video to it, it downloads it, edits it, and then sends the edited version back. I've added a queue system that tracks who is currently editing a video and who is waiting, by appending user IDs to a list stored in a JSON file. Once a video has finished editing, the bot pops the first element of that list. It works, but the JSON file doesn't stay up to date. Say someone is editing a video: the first element reads "userid-editing", and once the video is done and sent back, that element gets popped. But if the pop happens at essentially the same time as somebody else queuing, the two writes override each other and the whole file becomes a total mess.
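Stripped down, the read-modify-write cycle I'm doing looks roughly like this (file name and function names are just placeholders, not my real code):

```python
import json

QUEUE_FILE = "queue.json"  # placeholder path

# seed the file, like the bot does on first run
with open(QUEUE_FILE, "w") as f:
    json.dump({"queue": []}, f)

def enqueue(user_id):
    # read-modify-write: nothing stops two coroutines interleaving here
    with open(QUEUE_FILE) as f:
        data = json.load(f)
    data["queue"].append(str(user_id))
    with open(QUEUE_FILE, "w") as f:  # 'w' truncates first, so the file is briefly empty
        json.dump(data, f)

def pop_finished():
    with open(QUEUE_FILE) as f:
        data = json.load(f)
    data["queue"].pop(0)
    with open(QUEUE_FILE, "w") as f:
        json.dump(data, f)
```

If `pop_finished` and `enqueue` run close together, whichever one writes last wins and the other's change is lost.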

Here is the function that queues the user and sends the video out for editing. If anyone can give me suggestions or any tips on the code, it would be highly appreciated!

async def edit_video(user_id, profile, message):
    queue_path = f"{database_folder}/queue.json"
    
    # Create the queue file if it doesn't exist
    if not os.path.isfile(queue_path):
        with open(queue_path, 'w') as queue_file:
            json.dump({"queue": []}, queue_file)
    
    with open(queue_path, 'r') as queue_file:
        queue_data = json.load(queue_file)
    
    queue = queue_data["queue"]

    if str(user_id) in queue:
        # Count how many times it appears in the queue
        count = 1
        for i in queue:
            if i[:10] == str(user_id):
                count += 1
        modified_user_id = f"{user_id}-{count}"
        with open(queue_path, 'r') as queue_file:
            queue_data = json.load(queue_file)
    
        queue = queue_data["queue"]
        queue.append(modified_user_id)
        with open(queue_path, 'w') as queue_file:
            json.dump(queue_data, queue_file)

    else:
        # Append the user_id to the queue
        with open(queue_path, 'r') as queue_file:
            queue_data = json.load(queue_file)
    
        queue = queue_data["queue"]
        queue.append(str(user_id))
        with open(queue_path, 'w') as queue_file:
            json.dump(queue_data, queue_file)

    await message.reply(f"Putting you in queue! ➡️ {queue.index(str(user_id))+1}/{len(queue)} Approx time: ({math.trunc(len(queue)*.7)} mins)")
    while True:
        # Read the queue from the JSON file
        
        # Check if the user_id exists in the queue
        if not queue[0].endswith("-editing"):  # "-editing" is appended as a suffix below
            if queue[0] == str(user_id):
                # You can now proceed with the code
                break
            else:
                if queue[0][:10] == str(user_id):
                    with open(queue_path, 'r') as queue_file:
                        queue_data = json.load(queue_file)
    
                    queue = queue_data["queue"]
                    queue[0] = str(user_id)
                    with open(queue_path, 'w') as queue_file:
                        json.dump(queue_data, queue_file)
        await asyncio.sleep(5)  # Wait for 5 seconds before checking again
    try:
        with open(queue_path, 'r') as queue_file:
            queue_data = json.load(queue_file)
    
        queue = queue_data["queue"]
        first_element = queue[0]
        modified_element = first_element + "-editing"
        with open(queue_path, 'r') as queue_file:
            queue_data = json.load(queue_file)
    
        queue = queue_data["queue"]
        queue[0] = modified_element
        with open(queue_path, 'w') as queue_file:
            json.dump(queue_data, queue_file)
        random_id = id_generator()
        random_id = str(random_id)
        video_file = await message.reply_to_message.video.get_file()
        checkfolders(message.chat.id, profile)
        temp_download_path = f"{download_folder}/downloaded_temp/{random_id}-{str(user_id)}.mp4"
        
        await video_file.download(temp_download_path)
        with open(f"{download_folder}/downloaded_temp/{random_id}-{str(user_id)}.txt", "w") as f:
            f.write(message.text)
        
        await bot.send_message(chat_id=user_id, text=f"Creating video for {profile}...")
        output = await run_external_script(user_id, profile, random_id, temp_download_path)
        if output:
            with open(output[-1], "rb") as video_out:
                await bot.send_video(chat_id=user_id, video=video_out)
        else:
            print("Output is empty")
        
        with open(queue_path, 'r') as queue_file:
            queue_data = json.load(queue_file)
    
        queue = queue_data["queue"]
        queue.pop(0)
        with open(queue_path, 'w') as queue_file:
            json.dump({"queue": queue}, queue_file)
    except Exception as e:
        print(f"Error: {str(e)}")

I've tried re-reading the JSON file right before it dumps, and it still doesn't work. I've been pulling my hair out over this for the whole day, haha.

Pady
  • If you want to make updating a file an atomic operation, create an empty file under a temporary name, write that from scratch, and then rename it over the original. If multiple copies of your script can be running at once, use file locking. When you open the destination file with the `'w'` flag and rewrite, that means that it temporarily is empty and then partially-written; you never want to allow that. – Charles Duffy Aug 29 '23 at 17:27
  • See the `flock` call for locking. Note that when you're using file locking in conjunction with the create-and-rename trick, your lockfile should be a separate file from the data file itself, something like `foo.json` for the data and `foo.json.lck` for the lock. – Charles Duffy Aug 29 '23 at 17:29
  • See [How to do atomic file replacement?](https://stackoverflow.com/questions/7645338/how-to-do-atomic-file-replacement) for the part of the problem about making sure your files go straight from one version to another without ever being empty/corrupt/half-written in between, and [Locking a file in Python](https://stackoverflow.com/questions/489861/locking-a-file-in-python) for ensuring that there's only one process writing a file at a time (and that writes can't happen while a read is in progress). – Charles Duffy Aug 29 '23 at 17:30
  • ..._in general_, though, you'd be better off if you need to do this at-volume using a dedicated storage backend built for the purpose like redis instead of building something yourself direct on top of the filesystem layer. – Charles Duffy Aug 29 '23 at 17:31
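A minimal sketch of the pattern the comments above describe: a separate lock file held with `flock` around the whole read-modify-write, and the new contents written to a temporary file that is renamed over the original so readers never see a half-written JSON document. File names are placeholders; this assumes a POSIX system (`fcntl` is not available on Windows):

```python
import fcntl
import json
import os
import tempfile

QUEUE_PATH = "queue.json"        # data file (placeholder name)
LOCK_PATH = QUEUE_PATH + ".lck"  # separate lock file, as suggested above

def update_queue(mutate):
    """Apply `mutate(queue_list)` under an exclusive lock, then atomically
    replace the data file so it is never seen empty or partially written."""
    with open(LOCK_PATH, "w") as lock_file:
        fcntl.flock(lock_file, fcntl.LOCK_EX)  # blocks until we own the lock
        try:
            if os.path.isfile(QUEUE_PATH):
                with open(QUEUE_PATH) as f:
                    data = json.load(f)
            else:
                data = {"queue": []}
            mutate(data["queue"])
            # Write to a temp file in the same directory, then rename it over
            # the original -- os.replace is atomic on POSIX filesystems.
            fd, tmp_path = tempfile.mkstemp(dir=".", suffix=".tmp")
            with os.fdopen(fd, "w") as tmp:
                json.dump(data, tmp)
            os.replace(tmp_path, QUEUE_PATH)
        finally:
            fcntl.flock(lock_file, fcntl.LOCK_UN)

# usage: every enqueue/dequeue goes through the same guarded helper
update_queue(lambda q: q.append("12345"))
update_queue(lambda q: q.pop(0) if q else None)
```

Because every reader and writer takes the same lock, a pop and an enqueue can no longer interleave and clobber each other's writes.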
