
Given the following data set, loaded into a Pandas DataFrame:

BARCODE  ALTERNATE_BARCODE
123
456      789

Imagine I have the following Pandas Python statement:

users.loc[users["BARCODE"] == "", "BARCODE"] = users["ALTERNATE_BARCODE"]

Is there any way - without rewriting this terse statement too much - to access the number of rows in the DataFrame that were affected?

Edit: I am mainly on the lookout for a library, or something built into Pandas, that has knowledge of the last operation and could provide me with some metadata about it. Computing deltas is a good workaround, but not what I am after, since it would clutter the code.

Fontanka16
  • `users["BARCODE"].eq("").sum()`? (you obviously need to run it before replacing the values) – mozway Mar 30 '22 at 07:12
  • `I am mainly on the lookout for the existence of a library or something build into Pandas that has knowledge of the last operation and could provide me with some metadata about it` - no, it does not exist. – jezrael Mar 30 '22 at 07:23
  • How can one get the "similar question" flag lifted? The related question has very little to do with what I am after. – Fontanka16 Mar 30 '22 at 07:23

1 Answer


Prior to replacing the values, get the length of the `.loc` selection:

len(users.loc[users["BARCODE"] == "", "BARCODE"].index)
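Putting this together, a minimal sketch of the count-then-assign pattern (the sample frame and its values are assumed for illustration; empty strings mark missing barcodes, as in the question):

```python
import pandas as pd

# Assumed sample data: one row with an empty BARCODE to be backfilled.
users = pd.DataFrame({
    "BARCODE": ["", "456"],
    "ALTERNATE_BARCODE": ["123", "789"],
})

# Build the mask once, count the matching rows, then reuse it for the assignment.
mask = users["BARCODE"] == ""
affected = len(users.loc[mask, "BARCODE"].index)
users.loc[mask, "BARCODE"] = users["ALTERNATE_BARCODE"]

print(affected)  # 1
```

Reusing the mask also avoids evaluating the comparison twice, so the original statement changes very little.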
Somebody Out There
  • Thank you, but I am on the lookout for more of something that could provide me with metadata about the operation. – Fontanka16 Mar 30 '22 at 07:25
  • @Fontanka16 The best solution would be to use a [walrus](https://docs.python.org/3/whatsnew/3.8.html#assignment-expressions) to assign a part of a .loc surrounded by brackets to a variable, use assignment operation outside of brackets and then access the length using .shape function on a variable. So something like `(affected := users.loc[users["BARCODE"] == "", "BARCODE"]) = users["ALTERNATE_BARCODE"]` and `affected.shape[0]` – Somebody Out There Mar 31 '22 at 09:38
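Note that the parenthesized form in the comment above would actually raise a SyntaxError, since a named expression cannot be an assignment target. A working variant keeps the walrus operator (Python 3.8+) inside the row selector to capture the mask; a sketch, with the same assumed sample data:

```python
import pandas as pd

# Assumed sample data: one row with an empty BARCODE to be backfilled.
users = pd.DataFrame({
    "BARCODE": ["", "456"],
    "ALTERNATE_BARCODE": ["123", "789"],
})

# Capture the boolean mask inline with the walrus operator, then count
# the affected rows from it after the assignment has run.
users.loc[(mask := users["BARCODE"] == ""), "BARCODE"] = users["ALTERNATE_BARCODE"]
print(mask.sum())  # 1
```

This keeps the original statement nearly intact while still exposing the number of rows that matched.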