
I have smart meter records in a MySQL database.

The records, in timestamp order, generally look as follows:

key timestamp watt now
000001 2022-10-04-01-01-01 10
000002 2022-10-04-01-02-01 10
000003 2022-10-04-01-03-01 101
000004 2022-10-04-01-04-01 101
000005 2022-10-04-01-05-01 102
000006 2022-10-04-01-06-01 101
000007 2022-10-04-01-07-01 102
000008 2022-10-04-01-08-01 10
000009 2022-10-04-01-09-01 10
000010 2022-10-04-01-09-01 10
000011 2022-10-04-01-09-01 107
000012 2022-10-04-01-09-01 101
000013 2022-10-04-01-09-01 109
000014 2022-10-04-01-09-01 10
000015 2022-10-04-01-09-01 10

I want to identify the groups of bigger values (let's say > 100) and give them an increasing id. I also want to get, per group, the first and last key.

Result of query should look like this:

month day numbers of group first id last id average watt
10 04 0 000003 000007 102
10 04 1 000011 000013 105

Any help appreciated.

gregor4711
  • What defines as a group? From your example, it seems that consecutive keys with `>100` watt values will be one group, is that correct? Can you run `SELECT version();` to check your MySQL version and `SHOW CREATE TABLE your_table_name;` for the table structure and post them into your question? – FanoFN Oct 05 '22 at 00:55
  • Good question. A group is a run of values which: a) are > 100, b) can be 1 to x records, c) are surrounded by lower values < 100. I marked the 2 groups in the table above in bold. Hope that helps to get the idea – gregor4711 Oct 05 '22 at 06:06
  • Ok, so how's my request for info on the second part in my comment above? The MySQL version and table structure please. – FanoFN Oct 06 '22 at 01:00
  • FanoFN, the table structure is as mentioned above: – gregor4711 Oct 06 '22 at 05:50
  • CREATE TABLE iot.smartmeter ( `key` INT auto_increment NOT NULL, `timestamp` TIMESTAMP NOT NULL, watt varchar(100) NOT NULL ) ENGINE=InnoDB DEFAULT CHARSET=utf8mb4 COLLATE=utf8mb4_general_ci COMMENT='smart Meter log table'; – gregor4711 Oct 06 '22 at 05:52
  • The result of the query should look like the example above – gregor4711 Oct 06 '22 at 05:53
  • month day numbers of group first id last id average watt 10 04 0 000003 000007 102 10 04 1 000011 0000013 105 – gregor4711 Oct 06 '22 at 05:53

1 Answer


You'll need something to identify the rows as a group. My first thought was to use RANK() or DENSE_RANK(), but after multiple tries I couldn't find a way. Then I thought about using LAG(), but I was still stuck on how to re-identify the rows as a new group. After testing many times, I came up with this suggestion:

WITH cte AS (
  SELECT s1.*,
         @n := COALESCE(IF(s1.skey = 1, 1, s2.skey), @n) AS newGroup
    FROM smartmeter s1
    LEFT JOIN (
         SELECT skey,
                stimestamp,
                watt,
                LENGTH(watt) AS lenwatt,
                LAG(LENGTH(watt)) OVER (ORDER BY skey) AS llwatt
           FROM smartmeter
         ) s2
      ON s1.skey = s2.skey
     AND lenwatt != llwatt
)
SELECT MONTH(stimestamp) AS Month,
       DAY(stimestamp) AS Day,
       ROW_NUMBER() OVER (ORDER BY MIN(skey)) AS 'numbers of group',
       MIN(skey) AS 'first id',
       MAX(skey) AS 'last id',
       AVG(watt) AS 'Average watt',
       CEIL(AVG(watt)) AS 'Average watt rounded',
       newGroup
  FROM cte
 WHERE watt >= 100
 GROUP BY newGroup, MONTH(stimestamp), DAY(stimestamp);

By the way, I've changed some of your column names because key is actually a reserved word. Although you can use it as a column name as long as you wrap it in backticks, I personally find it a hassle to do that every time.

Ok, so my idea was to use LENGTH(watt) with ORDER BY skey in the LAG() function. Then I separate the rows where the length doesn't match the previous row's length and use those as the starting point of each new group. After that, I LEFT JOIN that result with the smartmeter table. The next challenge was to assign a group to each row that doesn't match the previous skey value; then I found this answer and applied it in the cte.

Once those are done, I just wrote another query to fulfil your expected result, although some parts of it are not exactly what you expected.

Here's a demo fiddle
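For readers who want to experiment locally, here is a minimal, runnable sketch of the same "gaps and islands" grouping, demonstrated with Python's built-in sqlite3 (SQLite 3.25+ supports the same window functions as MySQL 8+). Note this is an alternative formulation, not the answer's exact query: instead of the LENGTH()/LAG() trick, it flags each row where a > 100 run starts and takes a running SUM of those flags as the group id. The column names (skey, stimestamp, watt) follow the answer's renaming, and the month/day split is omitted for brevity.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
# Load the sample data from the question (watt stored as INTEGER here
# for simplicity; the real table uses varchar).
conn.executescript("""
CREATE TABLE smartmeter (skey INTEGER PRIMARY KEY, stimestamp TEXT, watt INTEGER);
INSERT INTO smartmeter VALUES
 (1,'2022-10-04 01:01:01',10),(2,'2022-10-04 01:02:01',10),
 (3,'2022-10-04 01:03:01',101),(4,'2022-10-04 01:04:01',101),
 (5,'2022-10-04 01:05:01',102),(6,'2022-10-04 01:06:01',101),
 (7,'2022-10-04 01:07:01',102),(8,'2022-10-04 01:08:01',10),
 (9,'2022-10-04 01:09:01',10),(10,'2022-10-04 01:09:01',10),
 (11,'2022-10-04 01:09:01',107),(12,'2022-10-04 01:09:01',101),
 (13,'2022-10-04 01:09:01',109),(14,'2022-10-04 01:09:01',10),
 (15,'2022-10-04 01:09:01',10);
""")

rows = conn.execute("""
WITH flagged AS (
  -- Mark rows above the threshold, and flag the first row of each run
  -- (watt > 100 while the previous row was <= 100).
  SELECT skey, watt,
         CASE WHEN watt > 100 THEN 1 ELSE 0 END AS is_high,
         CASE WHEN watt > 100
              AND COALESCE(LAG(watt) OVER (ORDER BY skey), 0) <= 100
              THEN 1 ELSE 0 END AS starts_group
    FROM smartmeter
), grouped AS (
  -- Running sum of the start flags yields a stable group id per run.
  SELECT *, SUM(starts_group) OVER (ORDER BY skey) AS grp
    FROM flagged
)
SELECT grp - 1 AS group_no,
       MIN(skey) AS first_id,
       MAX(skey) AS last_id,
       AVG(watt) AS avg_watt
  FROM grouped
 WHERE is_high = 1
 GROUP BY grp
""").fetchall()
print(rows)
```

With the sample data this yields two groups, keys 3-7 and 11-13, with averages 101.4 and roughly 105.67; like the answer's query, the exact averages differ slightly from the rounded figures in the question's expected output.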

FanoFN
  • FanoFN well noted and many thanks. I'll "transform" this to the real table, but from what I understand so far that looks great. Will come back at the end of the weekend. Thanks again – gregor4711 Oct 08 '22 at 07:35