
I've been beating my head against this for a while. I have created a channel, pipeline, datastore and dataset, but the dataset just contains __dt no matter what I do.

I believe the channel, pipeline, and datastore are working, primarily because I see correctly formatted JSON messages in the S3 bucket for the datastore.

My datastore is called "salt_datastore". When I navigate to the relevant S3 bucket, I see a folder called "salt_datastore", and in it, I see a folder with today's date called "__dt=2022-10-09 00:00:00/". Inside that folder, I see a separate .gz file for every message I have sent, with names of the format "1665276480000_1665276510000_435011638936_salt_sensor_0_840.0.salt_sensor_pipeline.json.gz". If I download and open one of these, I see the MQTT messages that were sent to the MQTT topic.

So I think the channel, pipeline, and datastore are working, but if I set up a dataset with the query "select * from salt_datastore", I only get "__dt". I feel like this is the starting text of the folder inside the salt_datastore S3 bucket, but I can't figure out how to construct a valid SQL query that gives me what's inside that folder. Any help?

Chris

1 Answer


I had the same issue.

Edit: I deleted my original, incorrect answer. Previously I incorrectly said to put the name of the datastore in single quotes in the dataset's SQL query.

But it was the name of the topic that I had to put in single quotes, and that goes in the SQL statement of your IoT Core rule, NOT the SQL statement of the dataset.

So the SQL in the IoT Core rule should look like:

SELECT * FROM 'some_topic'

That is, the topic that you are publishing to goes in single quotes.
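To make the distinction concrete, here are the two statements side by side. The topic name below is a placeholder, not something from your setup; substitute whatever topic you actually publish to. The dataset query is the one from your question:

SELECT * FROM 'salt_sensor_topic'    -- IoT Core rule: topic filter, single quotes

SELECT * FROM salt_datastore         -- IoT Analytics dataset: datastore name, no quotes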

Adding query images:

Topic query: [screenshot]

Dataset query: [screenshot]

name-andy
  • When I try that I get a red popup message that says "We could not update your data set. Provided SQL query is malformed." Using double quotes around "salt_datastore" doesn't have the same error, but it also doesn't appear to work any differently, showing just __dt in the dataset. – Chris Dec 12 '22 at 17:34
  • Apologies, I had posted an incorrect solution before. I have updated the answer to reflect what I had actually done to fix the issue for me. – name-andy Dec 12 '22 at 18:52
  • 1
    I had been away from this for a long time, but your response got me looking at it again. I started all over and created a new channel, datastore, and dataset with the Quick Start, and it works fine now - my dataset is filling up. I compared my non-functional instances to the Quick Start ones, and the main difference I see is that the datastore storage is managed by the service. In my previous datastore, the storage was an S3 bucket managed by me. I suspect that maybe there was some disconnect there. It seemed like the data was making it into the bucket, but something wasn't working right. – Chris Dec 13 '22 at 20:12
  • Hey, good going! – name-andy Dec 14 '22 at 20:02