I have a directory tree like so:
- main
  - training
    - run.py
    - utils.py
    - __init__.py
`utils.py` includes the following:
import numpy as np
from ray.rllib.algorithms.callbacks import DefaultCallbacks

# Guide: https://discuss.ray.io/t/log-or-record-custom-env-data-via-rllib/4674/2
class RewardLoggerCallback(DefaultCallbacks):
    def on_episode_start(
        self, *, worker, base_env, policies, episode, env_index, **kwargs
    ):
        # Start each metric as a list so per-step values can be appended
        episode.user_data = {
            'MainRew': []
        }

    def on_episode_step(
        self, *, worker, base_env, episode, env_index, **kwargs
    ):
        # Running metrics -> keep all values
        # Final metrics -> only keep the current value
        info = episode.last_info_for()
        for k in episode.user_data.keys():
            episode.user_data[k].append(info[k])

    def on_episode_end(
        self, *, worker, base_env, policies, episode, env_index, **kwargs
    ):
        # Aggregate the per-step values collected above into custom metrics
        for name, value in episode.user_data.items():
            episode.custom_metrics[name + "_avg"] = np.mean(value)
            episode.custom_metrics[name + "_sum"] = np.sum(value)
            episode.hist_data[name] = value
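
For context, the callback is meant to be registered in run.py along these lines (just a sketch; PPO and CartPole-v1 are stand-ins for my actual setup, the relevant part is passing the class via AlgorithmConfig.callbacks()):

from ray.rllib.algorithms.ppo import PPOConfig
from main.training.utils import RewardLoggerCallback  # the import that fails

config = (
    PPOConfig()
    .environment("CartPole-v1")   # placeholder env, not my real one
    .callbacks(RewardLoggerCallback)  # pass the class itself, not an instance
)
algo = config.build()
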
However, when I try to add `from main.training.utils import RewardLoggerCallback`, I get an error of the form `ImportError: cannot import name 'RewardLoggerCallback' from 'main.training.utils'`. I thought it might be because I didn't add the import in the `__init__.py` file, but even adding `from main.training.utils import RewardLoggerCallback` there doesn't help.
Wondering what I might be missing? I have other files in the same directory that I've imported fine this way before.
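
In case it helps narrow things down, here's a quick sanity check (assuming it's run from the directory that contains main, so the package is importable) to see which file Python actually resolves:

import main.training.utils as utils_mod

# Which utils.py did Python actually load? (guards against a stale or shadowing module)
print(utils_mod.__file__)
# Is the class actually present in the module that got imported?
print('RewardLoggerCallback' in dir(utils_mod))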