14 GB for your data is not abnormal. Let's do the math.
Simply adding up the sizes of your columns gives 68 bytes per row.
2 bigints  @ 8 bytes each   16 bytes
4 integers @ 4 bytes each   16 bytes
4 doubles  @ 8 bytes each   32 bytes
1 date     @ 4 bytes         4 bytes
                            --------
                            68 bytes
149,444,622 rows at 68 bytes each is about 10.2 GB. This is the absolute minimum size of your data if there were no database overhead. But there is overhead. This answer reckons it's about 28 bytes per row (in PostgreSQL, that's roughly a 23-byte tuple header plus a 4-byte item pointer). 68 + 28 is 96 bytes per row. That brings us to... 14.3 GB. Just what you have.
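If you want to check the estimate against what's actually on disk, you can ask the database directly. Here's a minimal sketch, assuming PostgreSQL (the Heroku mention suggests Heroku Postgres), with measurements standing in for your table name:

    -- Heap (row data + TOAST), indexes, and the grand total
    SELECT pg_size_pretty(pg_table_size('measurements'))          AS heap_size,
           pg_size_pretty(pg_indexes_size('measurements'))        AS index_size,
           pg_size_pretty(pg_total_relation_size('measurements')) AS total_size;

    -- Approximate on-disk width of a few sample rows
    SELECT pg_column_size(t.*) AS row_bytes
    FROM measurements t
    LIMIT 5;

If heap_size alone is close to 14 GB, the arithmetic above accounts for your usage; if total_size is much larger, indexes are making up the difference.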
I doubt you can reduce the size without changing your schema, dropping indexes, or deleting data. If you provided more detail about your schema we could offer specific advice, but that's better asked as a new question.
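In the meantime, if you want to see whether indexes are contributing much, something like this (again assuming PostgreSQL, with measurements as a placeholder) lists each index with its size and how often it has been scanned, so large but rarely-used indexes stand out as candidates to drop:

    SELECT indexrelname                                 AS index_name,
           pg_size_pretty(pg_relation_size(indexrelid)) AS index_size,
           idx_scan                                     AS times_scanned
    FROM pg_stat_user_indexes
    WHERE relname = 'measurements'
    ORDER BY pg_relation_size(indexrelid) DESC;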
Finally, consider that 50 GB is a pretty small database. For example, the smallest paid database offered by Heroku is 64 GB at just $50/month. This may be a situation where it's fine to just use a bigger disk.