I have 11 files already registered as datasets (mltable) in Azure ML Studio. Loading them into dataframes works for all of them except one, and I believe the reason is its size: 1.95 GB. How can I load this dataset into a dataframe? So far I have not managed to load it at all.
Any tips on how to do this efficiently? I tried to do it in parallel with modin but failed. Below you will find the load script, followed by a sketch of the modin attempt.
from azureml.core import Dataset, Workspace

subscription_id = 'xyz'
resource_group = 'rg-personal'
workspace_name = 'test'

# Connect to the workspace and fetch the registered dataset
workspace = Workspace(subscription_id, resource_group, workspace_name)
dataset = Dataset.get_by_name(workspace, name='buses')

# Works fine for the other ten datasets, but never completes for the 1.95 GB one
df = dataset.to_pandas_dataframe()
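For reference, my modin attempt was roughly along these lines (a rough sketch rather than my exact code; it assumes the registered dataset behaves as a TabularDataset, and './buses_csv' is just a placeholder path I used locally):

import modin.pandas as mpd
from azureml.core import Dataset, Workspace

workspace = Workspace(subscription_id, resource_group, workspace_name)
dataset = Dataset.get_by_name(workspace, name='buses')

# Materialize the tabular dataset as CSV files and pull them to local disk
csv_files = dataset.to_csv_files()
local_paths = csv_files.download(target_path='./buses_csv', overwrite=True)

# Let modin read the parts with its parallel backend and stitch them together
df = mpd.concat([mpd.read_csv(p) for p in local_paths], ignore_index=True)

I went through to_csv_files() because to_pandas_dataframe() returns a plain pandas DataFrame, so simply importing modin does not parallelize that call.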