
I have a column in an Oracle table which has the below value:

select csv_val from my_table where date='09-OCT-18';

output
==================
50,100,25,5000,1000

I want these values in ascending order from a select query; the output would look like:

 output
 ==================
 25,50,100,1000,5000

I tried the approach in this link, but it looks like it has some restriction on the number of digits.

Nirmal
    Fix your data model so you are not storing lists of numbers as a string. – Gordon Linoff Oct 09 '18 at 15:52
  • Unfortunately data already there in legacy tables. – Nirmal Oct 09 '18 at 15:53
    Possible duplicate of [Sort delimited separated values in a string with (Oracle) SQL](https://stackoverflow.com/questions/7808325/sort-delimited-separated-values-in-a-string-with-oracle-sql) – OldProgrammer Oct 09 '18 at 16:27
  • In the linked thread, is the limitation on the number of DIGITS or on the number of TOKENS (NUMBERS)? What is the data type of CSV_VAL - is it VARCHAR2 or is it CLOB? Then: can your input string have null tokens (two consecutive commas, to represent a NULL), and if so, what is the desired handling in the output? –  Oct 09 '18 at 17:03
  • Regarding the "legacy tables" comment - that explains the input, but not the output. Why do you need the OUTPUT as a comma-separated string? Then: if you need the tokens in increasing order, why don't you execute a one-time UPDATE to change the stored strings? (And modify the process that inserts or updates the table so that in the future all CSV_VALUE values are already ordered.) Even if you have to put up with legacy tables, you could create a view (perhaps materialized) where the values are separated on individual rows. You shouldn't have to put up with this in every query you write. –  Oct 09 '18 at 17:05

2 Answers


Here, I made you a modified version of the answer you linked to that can handle an arbitrary (hardcoded) number of commas. It's pretty heavy on CTEs. As with most LISTAGG answers, it has a 4000-character limit on the result. I also changed your regexp to be able to handle null list entries, based on this answer.

WITH 
  T (N) AS --TEST DATA
    (SELECT '50,100,25,5000,1000' FROM DUAL
    UNION
    SELECT '25464,89453,15686' FROM DUAL
    UNION
    SELECT '21561,68547,51612' FROM DUAL
    ),
  nums (x) as -- arbitrary limit of 20, can be changed
    (select level from dual connect by level <= 20),
  splitstr (N, x, substring) as
    (select N, x, regexp_substr(N, '(.*?)(,|$)', 1, x, NULL, 1)
    from T
    inner join nums on x <= 1 + regexp_count(N, ',')
    order by N, x)
select N, listagg(substring, ',') within group (order by to_number(substring)) as sorted_N
from splitstr
group by N
;

Probably it can be improved, but eh...
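For a quick sanity check outside the database, the split/sort/join logic that both answers implement in SQL can be sketched in plain Python (an illustration only, not Oracle code; the `sort_csv` helper name is made up here):

```python
def sort_csv(s):
    """Split a comma-separated string, sort the tokens numerically, rejoin.

    Empty tokens (from consecutive commas, i.e. NULL entries) are kept and
    sorted first; the SQL answers differ here, since LISTAGG drops NULLs.
    """
    tokens = s.split(',')
    # Key sorts empty tokens before numeric ones, then by numeric value.
    tokens.sort(key=lambda t: (t != '', float(t) if t else 0.0))
    return ','.join(tokens)

print(sort_csv('50,100,25,5000,1000'))  # 25,50,100,1000,5000
```

This matches the expected output in the question, but unlike the SQL versions it has no 4000-character limit and keeps empty tokens.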

kfinity
    Did you test your query? The outputs all have one fewer token than the inputs. (The reason probably has to do with `x <= regexp_count`; that should be `x <= 1 + regexp_count`). Then: you mention handling `null`; but `listagg` removes `null`, so that is not a complete handling of nulls. –  Oct 09 '18 at 17:09
  • Thanks, I accidentally deleted the `+1` when editing for readability. – kfinity Oct 09 '18 at 18:33

Based on the sample data you posted, a relatively simple query would work (you need lines 3 - 7). If the data doesn't really look like that, the query might need adjustment.

SQL>  with my_table (csv_val) as
  2    (select '50,100,25,5000,1000' from dual)
  3  select listagg(token, ',') within group (order by to_number(token)) result
  4  from (select regexp_substr(csv_val, '[^,]+', 1, level) token
  5        from my_table
  6        connect by level <= regexp_count(csv_val, ',') + 1
  7       );

RESULT
-------------------------
25,50,100,1000,5000

SQL>
Littlefoot