I am attempting to create dummy variables from a factor variable with more than 200 levels, in a data set of more than 15 million observations. I am using the dummy_cols() function from the "fastDummies" package to convert the factor variable to dummies, dropping the first level.
I have read numerous posts on the issue. Several suggest subsetting the data, which I cannot do. This analysis is for a school assignment that requires that I use all included data.
I am using a MacBook Pro with 16 GB of RAM and the 64-bit version of RStudio. The post below describes how to increase the maximum memory available to R. However, the instructions seem to imply either that I am already at capacity, or that raising R's memory limit may be unsafe for my machine.
R on MacOS Error: vector memory exhausted (limit reached?)
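If I understand the linked post correctly, the fix amounts to something like the following (my sketch, not the post's exact commands; the 32Gb value is my own guess, not a recommendation):

```shell
# Sketch of the workaround from the linked post: raise R's vector memory cap
# on macOS by setting R_MAX_VSIZE in ~/.Renviron.
# 32Gb is an assumed value -- it should be backed by actual RAM plus swap.
echo 'R_MAX_VSIZE=32Gb' >> ~/.Renviron
# Restart R/RStudio afterwards so the new limit takes effect.
```

This is the part I am unsure about, since setting the cap above physical RAM appears to rely on swap.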
I'm not sure how to go about posting 15 million rows of data. The following code shows the unique factor levels for the variable in question:
unique(housing$city)
[1] 40 80 160 200 220 240 280 320 440 450 460 520 560 600 640
[16] 680 720 760 840 860 870 880 920 960 1000 1040 1080 1120 1121 1122
[31] 1150 1160 1240 1280 1320 1360 1400 1440 1520 1560 1600 1602 1620 1640 1680
[46] 1720 1740 1760 1840 1880 1920 1950 1960 2000 2020 2040 2080 2120 2160 2240
[61] 2290 2310 2320 2360 2400 2560 2580 2640 2670 2680 2700 2760 2840 2900 2920
[76] 3000 3060 3080 3120 3160 3180 3200 3240 3280 3290 3360 3480 3520 3560 3590
[91] 3600 3620 3660 3680 3710 3720 3760 3800 3810 3840 3880 3920 3980 4000 4040
[106] 4120 4280 4320 4360 4400 4420 4480 4520 4600 4680 4720 4800 4880 4890 4900
[121] 4920 5000 5080 5120 5160 5170 5200 5240 5280 5360 5400 5560 5600 5601 5602
[136] 5603 5605 5790 5800 5880 5910 5920 5960 6080 6120 6160 6200 6280 6440 6520
[151] 6560 6600 6640 6680 6690 6720 6740 6760 6780 6800 6840 6880 6920 6960 6980
[166] 7040 7080 7120 7160 7240 7320 7360 7362 7400 7470 7480 7500 7510 7520 7560
[181] 7600 7610 7620 7680 7800 7840 7880 7920 8000 8040 8050 8120 8160 8200 8280
[196] 8320 8400 8480 8520 8560 8600 8640 8680 8730 8760 8780 8800 8840 8880 8920
[211] 8940 8960 9040 9080 9140 9160 9200 9240 9260 9280 9320
I used the following commands from the fastDummies package to create the dummy variables:
library(fastDummies)
housing <- dummy_cols(housing, select_columns = "city", remove_first_dummy = TRUE)
I get the following response:
Error: vector memory exhausted (limit reached?)
I am, again, trying to create 220 dummies from the 221 levels shown above (excluding the first to avoid perfect collinearity in later analyses).
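For scale, here is my rough arithmetic on why this exhausts memory, along with a toy sketch of a sparse alternative I have been considering (my own numbers and code, not from any of the posts I read; I do not know if this approach suits the assignment):

```r
# Back-of-the-envelope: 15e6 rows x 220 dummy columns stored as doubles is
# roughly 15e6 * 220 * 8 bytes ~ 26 GB, well past 16 GB of RAM, which would
# explain "vector memory exhausted" even before the copy dummy_cols makes.
# A sparse matrix stores only the nonzero entries instead:
library(Matrix)

toy <- data.frame(city = factor(c(40, 80, 160, 40)))  # toy stand-in for housing
# model.matrix treats the first level (40) as the reference, which matches
# remove_first_dummy = TRUE; dropping column 1 removes the intercept.
X <- sparse.model.matrix(~ city, data = toy)[, -1]
dim(X)  # 4 rows, 2 dummy columns (levels 80 and 160)
```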
Any help is most welcome. My apologies if I have missed something in the suggestions above; none of them addressed this exact issue in the context of creating dummies, and I am not very proficient with the command line on macOS.