
I have an issue with the Table API of Flink (1.13+). I have a POJO containing several fields, one of them being:

List<String> my_list; 

I create my table using the following declaration for this field:

"CREATE TABLE my_table (
   ...
   my_list ARRAY<STRING>,
   ...
)"

When I then try to convert my table to a DataStream using the toRetractStream[MY_POJO] method, I get the following error:

Exception in thread "main" org.apache.flink.table.api.ValidationException: Column types of query result and sink for unregistered table do not match.

Cause: Incompatible types for sink column 'my_list' at position 11. Query schema: [..., my_list: ARRAY, ...] Sink schema: [..., my_list: RAW('java.util.List', ?), ...]

I would like to avoid mapping every field by hand and keep the code clean. Is there a solution to handle this kind of data type?

Fray

1 Answer


I would recommend trying out the new API methods from/toDataStream and from/toChangelogStream. Those support all kinds of classes and data types.

toDataStream also supports mapping to POJOs with List members.
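A minimal sketch of what that could look like, assuming Flink 1.13+; the POJO name `MyPojo` and the query are placeholders, not from the original question:

```java
import java.util.List;

import org.apache.flink.streaming.api.datastream.DataStream;
import org.apache.flink.streaming.api.environment.StreamExecutionEnvironment;
import org.apache.flink.table.api.Table;
import org.apache.flink.table.api.bridge.java.StreamTableEnvironment;

public class ListFieldExample {

    // POJO with a List<String> field, like the one in the question
    public static class MyPojo {
        public List<String> my_list;
    }

    public static void main(String[] args) throws Exception {
        StreamExecutionEnvironment env =
                StreamExecutionEnvironment.getExecutionEnvironment();
        StreamTableEnvironment tableEnv = StreamTableEnvironment.create(env);

        // Placeholder query producing an ARRAY<STRING> column
        Table table = tableEnv.sqlQuery("SELECT ARRAY['a', 'b'] AS my_list");

        // The new API derives the target data type from the POJO class,
        // so ARRAY<STRING> can be mapped to List<String> without manual
        // field-by-field conversion
        DataStream<MyPojo> stream = tableEnv.toDataStream(table, MyPojo.class);

        stream.print();
        env.execute();
    }
}
```

Note that toDataStream only works for insert-only tables; for an updating table (the toRetractStream case), toChangelogStream is the replacement.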

twalthr
  • Not working for me, with new API I have the same issue – Niko Aug 08 '22 at 08:22
  • @Niko Maybe you can share an example with us? Or send it to my via Flink's Slack channel. It should work. – twalthr Aug 25 '22 at 12:53
I already forgot what my exact problem was. I have several complex fields, and in the end I used UDFs to figure it out. I asked about one of them [here](https://stackoverflow.com/questions/73321239/unable-to-decode-avro-data-from-kafka-with-avro-schema). How can I find Flink's Slack? – Niko Aug 25 '22 at 14:50
  • 1
    Great that you could find a solution. Slack can be found here: https://flink.apache.org/community.html#slack – twalthr Aug 26 '22 at 07:58