
I have a pandas dataframe with the following contents

Room         Position  Sensor  Measuring Time  Value1  Value2  Value3
Living Room  A1        111     27-12-2021      1.1     1.2     0.9
Living Room  A2        112     27-12-2021      2.1     2.1     1.9
Living Room  B1        113     27-12-2021      1.5     1.4     1.4
Living Room  B2        114     27-12-2021      1.8     1.7     1.9
Bed Room     A1        211     27-12-2021      4.2     4.5     4.4
Living Room  A1        111     29-12-2021      0.9     1.1     0.8
Living Room  A2        112     29-12-2021      1.9     1.8     1.7
Living Room  B1        115     29-12-2021      1.4     1.3     1.2
Living Room  B2        114     29-12-2021      1.7     1.5     1.7
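
For reference, the sample can be rebuilt for testing roughly like this (a minimal sketch; dates are kept as plain strings rather than parsed into datetimes):

import pandas as pd

# Sample data copied from the table above
df = pd.DataFrame({
    'Room': ['Living Room'] * 4 + ['Bed Room'] + ['Living Room'] * 4,
    'Position': ['A1', 'A2', 'B1', 'B2', 'A1', 'A1', 'A2', 'B1', 'B2'],
    'Sensor': [111, 112, 113, 114, 211, 111, 112, 115, 114],
    'Measuring Time': ['27-12-2021'] * 5 + ['29-12-2021'] * 4,
    'Value1': [1.1, 2.1, 1.5, 1.8, 4.2, 0.9, 1.9, 1.4, 1.7],
    'Value2': [1.2, 2.1, 1.4, 1.7, 4.5, 1.1, 1.8, 1.3, 1.5],
    'Value3': [0.9, 1.9, 1.4, 1.9, 4.4, 0.8, 1.7, 1.2, 1.7],
})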

I want to convert the dataframe to the format below:

Room         Position  Sensor  Value   27-12-2021  29-12-2021
Living Room  A1        111     Value1  1.1         0.9
Living Room  A1        111     Value2  1.2         1.1
Living Room  A1        111     Value3  0.9         0.8
Living Room  A2        112     Value1  2.1         1.9
Living Room  A2        112     Value2  2.1         1.8
Living Room  A2        112     Value3  1.9         1.7
Living Room  B1        113     Value1  1.5         N/A
Living Room  B1        113     Value2  1.4         N/A
Living Room  B1        113     Value3  1.4         N/A
Living Room  B1        115     Value1  N/A         1.4
Living Room  B1        115     Value2  N/A         1.3
Living Room  B1        115     Value3  N/A         1.2
Living Room  B2        114     Value1  1.8         1.7
Living Room  B2        114     Value2  1.7         1.5
Living Room  B2        114     Value3  1.9         1.7
Bed Room     A1        211     Value1  4.2         N/A
Bed Room     A1        211     Value2  4.5         N/A
Bed Room     A1        211     Value3  4.4         N/A

I tried using pivot but was unsuccessful.

aj7amigo

1 Answer


Update

If you have duplicates, use pivot_table and an agg function:

out = df.pivot_table(index=['Room', 'Position', 'Sensor'],
                     columns=['Measuring Time'],
                     values=['Value1', 'Value2', 'Value3'],
                     aggfunc='last') \
        .rename_axis(columns=['Value', None]).stack(level=0).reset_index()
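
Here aggfunc='last' keeps the last row for each duplicated (Room, Position, Sensor, Measuring Time) combination, as requested in the comments below; other aggregations such as 'first', 'mean', 'min' or 'max' should work the same way if a different rule is preferred.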

You can use pivot:

out = df.pivot(index=['Room', 'Position', 'Sensor'],
               columns=['Measuring Time'],
               values=['Value1', 'Value2', 'Value3']) \
        .rename_axis(columns=['Value', None]).stack(level=0).reset_index()
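
Note that pivot raises `ValueError: Index contains duplicate entries, cannot reshape` if the same (Room, Position, Sensor, Measuring Time) combination occurs more than once; in that case, fall back to the pivot_table version above.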

Output:

           Room Position  Sensor   Value  27-12-2021  29-12-2021
0   Living Room       A1     111  Value1         1.1         0.9
1   Living Room       A1     111  Value2         1.2         1.1
2   Living Room       A1     111  Value3         0.9         0.8
3   Living Room       A2     112  Value1         2.1         1.9
4   Living Room       A2     112  Value2         2.1         1.8
5   Living Room       A2     112  Value3         1.9         1.7
6   Living Room       B1     113  Value1         1.5         NaN
7   Living Room       B1     113  Value2         1.4         NaN
8   Living Room       B1     113  Value3         1.4         NaN
9   Living Room       B1     115  Value1         NaN         1.4
10  Living Room       B1     115  Value2         NaN         1.3
11  Living Room       B1     115  Value3         NaN         1.2
12  Living Room       B2     114  Value1         1.8         1.7
13  Living Room       B2     114  Value2         1.7         1.5
14  Living Room       B2     114  Value3         1.9         1.7
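
The NaN cells correspond to the N/A placeholders in the desired output; appending .fillna('N/A') to the chain should produce literal 'N/A' strings if those are required.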
Corralien
  • Beautiful answer +1. – Mayank Porwal Jan 03 '22 at 15:54
  • @Corralien Thanks for the answer. It works for the sample set I provided. However, when I have multiple entries in rooms, for example bed room and bath room, pandas raises an error saying `Index contains duplicate entries`. How can I solve this? – aj7amigo Jan 03 '22 at 16:34
  • If you have duplicate values, you have to make a decision. Suppose that for the same date and the same combination like `Living Room, B2, 114, Value3`, you have two values like `1` and `2`. What do you want to do? Take the mean (1.5), the first (1), the last (2), the min (1), the max (2), ...? – Corralien Jan 03 '22 at 16:43
  • @Corralien I have updated the input dataset with data for additional rooms. Sorry for the incomplete set provided earlier. – aj7amigo Jan 03 '22 at 16:43
  • @aj7amigo. I just need an answer to my question :) – Corralien Jan 03 '22 at 16:44
  • @Corralien If there are duplicates on same date, I want to take the last entry on that date. Thanks – aj7amigo Jan 03 '22 at 16:46
  • @aj7amigo. I updated my answer. Can you check it please? – Corralien Jan 03 '22 at 16:55