I have a published pipeline in AzureML that preprocesses the data and trains a new model. I am trying to set up an event-based schedule so that whenever a new dataset is registered in the workspace, the whole training pipeline is triggered. I am using the AzureML Python SDK v1.
Using the information from the docs, I tried setting up the schedule as follows:
from azureml.core import Datastore
from azureml.pipeline.core import Schedule

datastore = Datastore(workspace=ws, name="workspaceblobstore")

reactive_schedule = Schedule.create(
    ws,
    name="MyReactiveSchedule",
    description="Based on input file change.",
    pipeline_id=pipeline_id,
    experiment_name=experiment_name,
    datastore=datastore,
    polling_interval=2,  # minutes
)
When I check the status of the schedule, it shows as active. However, when I register a new dataset in the blob storage associated with the workspace, nothing is triggered, even if I wait for more than 5 minutes (well past the 2-minute polling interval).
Can someone help me understand how this works, i.e., what exactly causes the pipeline to be triggered when a new dataset is registered?