
Following is the DataFrame:

from io import StringIO
import pandas as pd

df = pd.read_csv(StringIO("""
Group Date Rank
A 01-01-2023 1
A 01-02-2023 2
A 01-03-2023 3
A 01-04-2023 2
A 01-05-2023 1
A 01-06-2023 1
A 01-07-2023 3
A 01-08-2023 2
B 01-01-2023 2
B 01-02-2023 3
B 01-03-2023 2
B 01-04-2023 1
B 01-05-2023 3
B 01-06-2023 2
B 01-07-2023 1
B 01-08-2023 3"""), sep=r"\s+")

It has two groups, with a date and the rank achieved on each day.

I want the following output:

Group   Count of 1      Count of 2    Count of 3
A       3               3             2
B       2               3             3  

First I counted each Rank using a for loop and an if condition with value_counts() for each Rank, stored the results in a list, and later converted the list into a DataFrame.

I also tried using groupby/agg with value_counts() for each rank and then created a DataFrame from that.

I want to know how to solve this with a pivot, since ranking is done across various categories and I need the per-rank totals for each category.

Mark

2 Answers

import pandas as pd

data = {
    'Group': ['A', 'A', 'A', 'A', 'A', 'A', 'A', 'A', 'B', 'B', 'B', 'B', 'B', 'B', 'B', 'B'],
    'Date': ['01-01-2023', '01-02-2023', '01-03-2023', '01-04-2023', '01-05-2023', '01-06-2023', '01-07-2023', '01-08-2023',
             '01-01-2023', '01-02-2023', '01-03-2023', '01-04-2023', '01-05-2023', '01-06-2023', '01-07-2023', '01-08-2023'],
    'Rank': [1, 2, 3, 2, 1, 1, 3, 2, 2, 3, 2, 1, 3, 2, 1, 3]
}

df = pd.DataFrame(data)

# Count occurrences of each Rank per Group; fill_value=0 covers missing combinations
df_pivot = df.pivot_table(index='Group', columns='Rank', values='Date', aggfunc='count', fill_value=0)

# Rename the 1/2/3 columns to the requested headers
df_pivot.columns = ['Count of 1', 'Count of 2', 'Count of 3']

print(df_pivot)

Result :

       Count of 1  Count of 2  Count of 3
Group
A               3           3           2
B               2           3           3
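As a side note, when all you need is a count of category combinations, `pd.crosstab` does the same tabulation without picking a values column. A minimal sketch on the same data (rebuilt here in compact form so the snippet is self-contained):

```python
import pandas as pd

# Same Group/Rank data as in the question, in compact form
df = pd.DataFrame({
    "Group": list("AAAAAAAA") + list("BBBBBBBB"),
    "Rank": [1, 2, 3, 2, 1, 1, 3, 2, 2, 3, 2, 1, 3, 2, 1, 3],
})

# crosstab counts Group/Rank combinations directly
out = pd.crosstab(df["Group"], df["Rank"])

# Rename columns without hard-coding which ranks appear
out.columns = [f"Count of {c}" for c in out.columns]
print(out)
```

The list comprehension over `out.columns` avoids the assumption that exactly ranks 1, 2, and 3 are present.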
Amira Bedhiafi

You can use pivot in the following way:

df.groupby(["Group", "Rank"]).size().reset_index().pivot(index="Group", columns="Rank", values=0)

Rank   1  2  3
Group         
A      3  3  2
B      2  3  3
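If you also want the `Count of N` headers from the question, one option (a sketch, using the same groupby/pivot approach) is to name the size column explicitly and add the prefix after pivoting:

```python
import pandas as pd

# Same Group/Rank data as in the question, in compact form
df = pd.DataFrame({
    "Group": list("AAAAAAAA") + list("BBBBBBBB"),
    "Rank": [1, 2, 3, 2, 1, 1, 3, 2, 2, 3, 2, 1, 3, 2, 1, 3],
})

counts = (
    df.groupby(["Group", "Rank"])
      .size()
      .reset_index(name="n")          # name the size column instead of 0
      .pivot(index="Group", columns="Rank", values="n")
      .add_prefix("Count of ")        # Rank 1 -> "Count of 1", etc.
)
print(counts)
```

`add_prefix` renames every column regardless of how many distinct ranks exist, so nothing is hard-coded.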

Feel free to ask any questions you might have! :-)

Mark