
There is a text file ('mt.txt'). When I read it with `read.csv`, the trailing digits of variable V30 are replaced by zeros because the numbers are very long (see the V30 values below). How can I solve this? Thanks.

V30
92748999985195543475049289
92748999985195543475049265
92748999985195543400030542
92748999985195543475049227
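
These IDs have 26 digits, but R's default double type carries only about 15-16 significant digits, so the tail cannot survive a numeric parse. A quick sketch of the loss (the exact rounded digits depend on the platform's double representation):

```r
# Parsing the 26-digit ID as a number rounds it to the nearest
# representable double, so the trailing digits are changed.
x <- as.numeric("92748999985195543475049289")
sprintf("%.0f", x)   # the tail no longer matches the original digits
```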


  • It is unclear what your actual issue is. What R prints there is not what is stored internally. See, e.g., `print(DF$V30, digits = 22)`. However, there is a loss of precision because R imports these as floating-point numbers; see `sprintf("%f", 92748999985195543475049289)`. You would need to use a package that provides big integers or arbitrary-precision numbers if you want to avoid that. But that could limit what you can do with these numbers. – Roland Oct 15 '21 at 08:02
  • Assuming these are some IDs, if you are not going to do any arithmetic operations on these numbers, then read them as character: `read.table(..., colClasses = "character")`. By the way, read.csv is just a wrapper for read.table. As your input is not CSV, maybe use read.table instead. – zx8754 Oct 15 '21 at 08:12
  • @zx8754 Thanks for your reply, that's what I wanted. I updated my code to `read.csv('mt.txt', sep='\t', colClasses = "character")` and the result is great. Thanks. – anderwyang Oct 15 '21 at 08:18
  • Great, I closed this post with links to relevant posts that used the "colClasses" approach as a solution. – zx8754 Oct 15 '21 at 08:21
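
A minimal sketch of the `colClasses = "character"` fix from the comments, using a small temporary file in place of the real 'mt.txt' (the tab separator is taken from the question; the sample values are the V30 rows above):

```r
# Write a tiny tab-separated sample standing in for mt.txt.
tmp <- tempfile(fileext = ".txt")
writeLines(c("V30",
             "92748999985195543475049289",
             "92748999985195543475049265"), tmp)

# Default numeric parsing would round the long IDs to doubles.
# Forcing the column to character keeps every digit exactly as written.
DF <- read.table(tmp, header = TRUE, sep = "\t", colClasses = "character")
DF$V30[1]  # "92748999985195543475049289" -- unchanged
```

Because the values are now plain strings, they compare and merge safely as IDs, but arithmetic on them would first require a big-integer package.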

0 Answers