
How do I write a query that performs the following task in a loop (substituting the id of each individual result)?

    SELECT * From Table where id= 24
    Id | next_to_check  |  next_to_check_2
     24|    34,23       | 4

Then we look up each of the values that were returned, i.e. 34, 23 and 4:

    SELECT * From table where id = 34
    SELECT * From table where id = 23
    SELECT * From table where id = 4

Then we substitute the results 34, 23, 4 in turn, and then their results with their own results, going deeper and deeper:

24 -> 34, 23, 4
34 -> whatever its row points to
23 -> whatever its row points to
4 -> whatever its row points to

and then each result -> its own results, and so on ...

When I do it manually, it looks like this:

[screenshot of the manually expanded result chain]

Waldemar
  • Can you please post some more details? It would be useful to know the exact structure of your tables, an example of your data and of the desired result – Aleksej Feb 19 '16 at 14:18
    Your table seems to be very poorly designed. You have a composite data element in `next_to_check` and, apparently, `next_to_check` and `next_to_check2` are different columns that store the same values. It would make much more sense (and make the query much easier to write) if you modeled this as a standard parent-child hierarchy table. – Justin Cave Feb 19 '16 at 14:19
  • The way I see it, you have two issues here: first, you need the multiple values to be split into separate rows; second, you need to traverse a hierarchy. You could use [table(cast(multiset(](http://stackoverflow.com/questions/14328621/splitting-string-into-multiple-rows-in-oracle) to split the comma-separated values into multiple rows, and then use a [recursive CTE](http://stackoverflow.com/questions/4659803/recursion-in-oracle) or a [`connect by prior`](https://docs.oracle.com/cd/B19306_01/server.102/b14200/queries003.htm) to traverse the hierarchy – xQbert Feb 19 '16 at 14:26

1 Answer


The query would be very simple if the table had been properly normalized.
As mentioned in the comments, there are two issues:

  • next_to_check and next_to_check_2 are two columns that store the same values
  • both columns contain lists of values separated by commas, instead of individual values

The table should look like this:

SELECT * From Table where id = 24

    Id | next_to_check
    24 | 34
    24 | 23
    24 | 4

where the type of the next_to_check column should be the same as the type of the id column, to avoid unnecessary casting.
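
As a minimal sketch of that design (the DDL below is my assumption, not part of the question: the real table name and column types are unknown, so I keep the placeholder name "TABLE" used in the queries below and assume NUMBER ids):

    -- hypothetical DDL and sample data for the normalized design
    CREATE TABLE "TABLE" (
        id            NUMBER NOT NULL,
        next_to_check NUMBER
    );

    INSERT INTO "TABLE" VALUES (24, 34);
    INSERT INTO "TABLE" VALUES (24, 23);
    INSERT INTO "TABLE" VALUES (24, 4);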

For the above table the query may be just:

SELECT *
FROM "TABLE"
start with id = 24
connect by id = prior next_to_check;
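
For illustration only: with the sample rows sketched above plus one extra hypothetical row (34, 50), this hierarchical query would walk down from 24 depth-first and return something like:

        ID  NEXT_TO_CHECK
    ------  -------------
        24             34
        34             50
        24             23
        24              4

(the order of siblings is not guaranteed unless you add ORDER SIBLINGS BY).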

If the table cannot be normalized, then you can normalize the data "on the fly" using a query like this:

WITH normalized_data As (
    SELECT id, trim(regexp_substr(next_to_check, '[^,]+', 1, LEVEL)) next_to_check
    FROM "TABLE"
    CONNECT BY LEVEL <= regexp_count(next_to_check, ',')+1  
    UNION ALL
    SELECT id, trim(regexp_substr(next_to_check_2, '[^,]+', 1, LEVEL)) next_to_check
    FROM "TABLE"
    CONNECT BY LEVEL <= regexp_count(next_to_check_2, ',')+1 
)
SELECT * FROM normalized_data
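
For the single example row from the question (24 | 34,23 | 4), normalized_data would return one row per value; note that regexp_substr returns strings, so next_to_check comes back here as VARCHAR2 and is implicitly converted when compared with id (exactly the casting the normalized design avoids):

    ID  NEXT_TO_CHECK
    --  -------------
    24  34
    24  23
    24  4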

and then glue the earlier hierarchical query onto the above query:

WITH normalized_data As (
    SELECT id, trim(regexp_substr(next_to_check, '[^,]+', 1, LEVEL)) next_to_check
    FROM "TABLE"
    CONNECT BY LEVEL <= regexp_count(next_to_check, ',')+1  
    UNION ALL
    SELECT id, trim(regexp_substr(next_to_check_2, '[^,]+', 1, LEVEL)) next_to_check
    FROM "TABLE"
    CONNECT BY LEVEL <= regexp_count(next_to_check_2, ',')+1 
)
SELECT * FROM normalized_data
start with id = 24
connect by id = prior next_to_check;
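
Again for illustration: if the table contained only the example row (24 | 34,23 | 4), start with id = 24 picks up the three normalized rows, and since no rows with id 34, 23 or 4 exist, the traversal stops there:

    ID  NEXT_TO_CHECK
    --  -------------
    24  34
    24  23
    24  4

If rows for 34, 23 or 4 did exist, their split values would be appended below in the same way, level by level.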

but the performance of this "workaround" will be poor; it may work for 100 or 1,000 records, but it could take ages on a bigger table.
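
One reason it degrades so quickly is that CONNECT BY LEVEL without any PRIOR condition connects each level of the split to every row of the table, not just the row being split, which produces duplicate rows. A common refinement (my addition, not part of the original answer, and it still does not make the workaround fast) is to tie each generated row back to its own source row:

    WITH normalized_data AS (
        SELECT id,
               trim(regexp_substr(next_to_check, '[^,]+', 1, LEVEL)) next_to_check
        FROM "TABLE"
        -- PRIOR id = id keeps the split within one source row;
        -- PRIOR sys_guid() IS NOT NULL stops Oracle from reporting a connect-by loop
        CONNECT BY LEVEL <= regexp_count(next_to_check, ',') + 1
               AND PRIOR id = id
               AND PRIOR sys_guid() IS NOT NULL
        UNION ALL
        SELECT id,
               trim(regexp_substr(next_to_check_2, '[^,]+', 1, LEVEL)) next_to_check
        FROM "TABLE"
        CONNECT BY LEVEL <= regexp_count(next_to_check_2, ',') + 1
               AND PRIOR id = id
               AND PRIOR sys_guid() IS NOT NULL
    )
    SELECT * FROM normalized_data
    start with id = 24
    connect by id = prior next_to_check;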

krokodilko
  • Consider using this REGEXP_SUBSTR() call in order to handle NULL list elements should they occur: `regexp_substr(next_to_check, '(.*?)(,|$)', 1, LEVEL, NULL, 1)`. See this for more info as using the regex `'[^,]+'` to parse a list with NULL elements will get you into trouble: https://stackoverflow.com/questions/31464275/split-comma-separated-values-to-columns-in-oracle/31464699#31464699. Granted this may not apply to this exact situation but you better be aware of the issue lest you return incorrect data without knowing it! – Gary_W Feb 19 '16 at 20:05