213

I have a JSON file consisting of about 2000 records. Each record, which will correspond to a document in the mongo database, is formatted as follows:

{jobID:"2597401",
account:"XXXXX",
user:"YYYYY",
pkgT:{"pgi/7.2-5":{libA:["libpgc.so"],flavor:["default"]}},     
startEpoch:"1338497979",
runTime:"1022",
execType:"user:binary",
exec:"/share/home/01482/XXXXX/appker/ranger/NPB3.3.1/NPB3.3-MPI/bin/ft.D.64",
numNodes:"4",
sha1:"5a79879235aa31b6a46e73b43879428e2a175db5",
execEpoch:1336766742,
execModify: new Date("Fri May 11 15:05:42 2012"),
startTime: new Date("Thu May 31 15:59:39 2012"),
numCores:"64",
sizeT:{bss:"1881400168",text:"239574",data:"22504"}},

Each record is on a single line in the JSON file, and the only line breaks are at the end of every record. Therefore, each line in the file starts with "{jobID:"... I am trying to import these into a mongo database using the following command:

mongoimport --db dbName --collection collectionName --file fileName.json

However, I get the following error:

Sat Mar  2 01:26:12 Assertion: 10340:Failure parsing JSON string near: ,execModif
0x10059f12b 0x100562d5c 0x100562e9c 0x10025eb98 0x10000e643 0x100010b60 0x10055c4cc 0x1000014b7    
0x100001454 
 0   mongoimport                         0x000000010059f12b _ZN5mongo15printStackTraceERSo + 43
 1   mongoimport                         0x0000000100562d5c _ZN5mongo11msgassertedEiPKc + 204
 2   mongoimport                         0x0000000100562e9c _ZN5mongo11msgassertedEiRKSs + 12
 3   mongoimport                         0x000000010025eb98 _ZN5mongo8fromjsonEPKcPi + 1576
 4   mongoimport                         0x000000010000e643          
                                         _ZN6Import8parseRowEPSiRN5mongo7BSONObjERi + 2739
 5   mongoimport                         0x0000000100010b60 _ZN6Import3runEv + 7376
 6   mongoimport                         0x000000010055c4cc _ZN5mongo4Tool4mainEiPPc + 5436
 7   mongoimport                         0x00000001000014b7 main + 55
 8   mongoimport                         0x0000000100001454 start + 52
Sat Mar  2 01:26:12 exception:BSON representation of supplied JSON is too large: Failure parsing    
    JSON string near: ,execModif
Sat Mar  2 01:26:12 
Sat Mar  2 01:26:12 imported 0 objects
Sat Mar  2 01:26:12 ERROR: encountered 1941 errors

I do not know what the problem is. Can someone recommend a solution?

Syscall
amber4478

20 Answers

408

I was able to fix the error using the following command:

mongoimport --db dbName --collection collectionName --file fileName.json --jsonArray

Hopefully this is helpful to someone.
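
For context, --jsonArray tells mongoimport to treat the whole file as a single top-level JSON array rather than one document per line. A minimal sketch of the file shape that flag expects (made-up documents, not the question's data):

[
  { "jobID": "2597401", "account": "XXXXX" },
  { "jobID": "2597402", "account": "ZZZZZ" }
]

Without the flag, mongoimport expects newline-delimited JSON, i.e. one complete document per line.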

Bruno Bronosky
amber4478
  • `--jsonArray` being the ticket, yes? – Dudo Jan 09 '14 at 20:01
  • Short form of this `mongoimport -d -c --jsonArray -f .json`. – Navidot Oct 19 '18 at 13:15
  • sometimes user/password are required `mongoimport --db dbName --collection collectionName --file fileName.json --jsonArray -u ser -p password` – Diego Andrés Díaz Espinoza Jul 11 '19 at 16:37
  • Adding to @DiegoAndrésDíazEspinoza comment, that in my case I got an error of "unable to authenticate using mechanism 'SCRAM-SHA-1'". So, after a search, I found that it is missing the keyword `authenticationDatabase` as mentioned in the answer https://stackoverflow.com/a/58067928/6791222. – Feroz Khan Jun 17 '20 at 21:56
  • For future searchers, you might still need to install mongoimport: https://docs.mongodb.com/database-tools/installation/installation/ – Kaleb Coberly Nov 21 '20 at 01:55
  • --file fileName.json is extra and causes: "Failed: cannot decode array into a primitive.D" – vahid sabet Feb 01 '22 at 06:24
  • I'm getting `Uncaught: SyntaxError: Missing semicolon. (1:14)` error on `--db` – Aayush Shah Nov 13 '22 at 04:46
68

try this,

mongoimport --db dbName --collection collectionName <fileName.json

Example,

mongoimport --db foo --collection myCollections < /Users/file.json
connected to: *.*.*.*
Sat Mar  2 15:01:08 imported 11 objects

The issue is because of your date format.

I used the same JSON with the date modified as below, and it worked:

{jobID:"2597401",
account:"XXXXX",
user:"YYYYY",
pkgT:{"pgi/7.2-5":{libA:["libpgc.so"],flavor:["default"]}},     
startEpoch:"1338497979",
runTime:"1022",
execType:"user:binary",
exec:"/share/home/01482/XXXXX/appker/ranger/NPB3.3.1/NPB3.3-MPI/bin/ft.D.64",
numNodes:"4",
sha1:"5a79879235aa31b6a46e73b43879428e2a175db5",
execEpoch:1336766742,
execModify:{"$date" : 1343779200000},
startTime:{"$date" : 1343779200000},
numCores:"64",
sizeT:{bss:"1881400168",text:"239574",data:"22504"}}

hope this helps
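
If you have ~2000 records to fix, you probably don't want to edit the dates by hand. A rough sketch of a conversion script (assumptions: each record sits on one line as described in the question, every date appears as new Date("..."), the timestamps are treated as UTC, and fileName.fixed.json is just a made-up output name):

import re
from datetime import datetime, timezone

# Matches e.g.  execModify: new Date("Fri May 11 15:05:42 2012")
DATE_RE = re.compile(r'new Date\("([^"]+)"\)')

def to_extended_json(match):
    # Parse the ctime-style timestamp; the source strings carry no timezone,
    # so UTC is assumed here purely for illustration.
    dt = datetime.strptime(match.group(1), "%a %b %d %H:%M:%S %Y")
    millis = int(dt.replace(tzinfo=timezone.utc).timestamp() * 1000)
    # Emit the {"$date": <milliseconds>} form shown above, which mongoimport accepts.
    return '{"$date": %d}' % millis

with open("fileName.json") as src, open("fileName.fixed.json", "w") as dst:
    for line in src:
        dst.write(DATE_RE.sub(to_extended_json, line))

Then point mongoimport at the rewritten file instead of the original.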

Srivatsa N
  • I have the same error as in the question... Did you check this import? – Denis Nikanorov Mar 02 '13 at 09:59
  • I adjusted the dates as you suggested and that did get rid of that particular error. However, now I am getting a new one. Here is the new error: – amber4478 Mar 02 '13 at 20:25
  • Can you paste the new JSON and which version of Mongo you are on? – Srivatsa N Mar 03 '13 at 12:09
  • I was able to fix the error by adding --jsonArray to the end of the query. – amber4478 Apr 11 '13 at 05:16
  • Need to use `""` around the `.json`, if it's contain folder name has spaces in it. [Answered by Abhi below](http://stackoverflow.com/a/25875547/452708) For E.g. **This will not work, need to add `""` to the json file location to import it.** `D:\>mongoimport --db testimport --collection small_zip < D:\Dev\test test\small_zips.json The system cannot find the file specified.` **This works** `D:\>mongoimport --db testimport --collection small_zip < "D:\Dev\test test\small_zips.json" 2016-04-17T18:32:34.328+0800 connected to: localhost 2016-04-17T18:32:34.610+0800 imported 200 documents` – Abhijeet Apr 17 '16 at 10:37
29

Using mongoimport you can achieve the same:

mongoimport --db test --collection user --drop --file ~/downloads/user.json

where,

test - Database name
user - collection name
user.json - dataset file

--drop drops the collection if it already exists.

KARTHIKEYAN.A
22

console:

mongoimport -d dbName -c collectionName dataFile.js 
Andrew
8

Your syntax appears completely correct in:

mongoimport --db dbName --collection collectionName --file fileName.json

Make sure you are in the correct folder or provide the full path.

Sebastián Palma
Robert Grutza
7

I have used the below command to export the DB:

mongodump --db database_name --collection collection_name

and the below command worked for me to import the DB:

mongorestore --db database_name path_to_bson_file
Ravi Hirani
5

Import JSON/CSV file in MongoDB

  • First, check for the mongoimport.exe file in your bin folder (C:\Program Files\MongoDB\Server\4.4\bin); if it is not there, download the MongoDB database tools (https://www.mongodb.com/try/download/database-tools)
  • Copy the extracted (unzipped) files (inside the unzipped bin) to the bin folder (C:\Program Files\MongoDB\Server\4.4\bin)
  • Copy your JSON file to the bin folder (C:\Program Files\MongoDB\Server\4.4\bin)
  • Now open your command prompt and change its directory to bin:
cd "C:\Program Files\MongoDB\Server\4.4\bin"
  • Now run this in your command prompt:
mongoimport -d tymongo -c test --type json --file restaurants.json
  • where -d is the database (tymongo is the database name) and -c is the collection (test is the collection name)

FOR CSV FILE

 mongoimport -d tymongo -c test --type csv --file database2.csv --headerline
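
Here --headerline tells mongoimport to use the first line of the CSV as the field names. A minimal sketch of what such a file could contain (the contents of database2.csv here are made up):

name,city,rating
Alpha Cafe,Pune,4
Beta Diner,Mumbai,5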
Amit Kumar
3

Run the import command in another terminal (not inside the mongo shell).

mongoimport --db test --collection user --drop --file ~/downloads/user.json
Nija I Pillai
3

In Windows you can use your Command Prompt (cmd), and in Ubuntu you can use your terminal, by typing the following command:

mongoimport  -d  your_database_name  -c  your_collection_name  /path_to_json_file/json_file_name.json

Then, when you open your mongo shell, you can check for your database name by running this command:

show databases
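
You can also sanity-check the import from the mongo shell; the collection name below is only a placeholder for whatever you passed as your_collection_name:

show collections
db.your_collection_name.findOne()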
Noha Salah
2

This command works where no collection is specified.

mongoimport --db zips "\MongoDB 2.6 Standard\mongodb\zips.json"

Output after executing the command:

connected to: 127.0.0.1
no collection specified!
using filename 'zips' as collection.
2014-09-16T13:56:07.147-0400 check 9 29353
2014-09-16T13:56:07.148-0400 imported 29353 objects
Abhi
2

Solution:-

mongoimport --db databaseName --collection tableName --file filepath.json

Example:-

Place your file in admin folder:-

C:\Users\admin\tourdb\places.json

Run this command on your terminal:-

mongoimport --db tourdb --collection places --file ~/tourdb/places.json

Output:-

admin@admin-PC MINGW64 /
$ mongoimport --db tourdb --collection places --file ~/tourdb/places.json
2019-08-26T14:30:09.350+0530 connected to: localhost
2019-08-26T14:30:09.447+0530 imported 10 documents


Parveen Chauhan
1

I tried something like this and it actually works:

mongoimport --db dbName --file D:\KKK\NNN\100YWeatherSmall.data.json
JoSSte
tyne
1

This worked for me when the db has a username and password:

mongoimport --db YOUR_DB --collection MyCollection --file /your_path/my_json_file.json -u my_user -p my_pass

For a db without a username and password, remove -u my_user -p my_pass.

My sample JSON:

{ 
    "_id" : ObjectId("5d11c815eb946a412ecd677d"), 
    "empid" : NumberInt(1), 
    "name" : "Rahul"
}
{ 
    "_id" : ObjectId("5d11c815eb946a412ecd677e"), 
    "empid" : NumberInt(2), 
    "name" : "Rahul"
}
Rahul Mahadik
1

A bit late for a probable answer, but it might help new people. In case you have multiple instances of the database:

mongoimport --host <host_name>:<host_port> --db <database_name> --collection <collection_name>  --file <path_to_dump_file> -u <my_user> -p <my_pass>

This assumes credentials are needed; otherwise remove those options.

sd1517
1
  1. Just copy the path of the json file, for example "C:\persons.json"
  2. Go to C:\Program Files\MongoDB\Server\4.2\bin
  3. Open cmd in that MongoDB bin folder and run this command:

mongoimport --jsonArray --db dbName --collection collectionName --file filePath

Example: mongoimport --jsonArray --db learnmongo --collection persons --file C:\persons.json

1

A number of answers have been given already; even so, I would like to share the command I use frequently. It may help someone.

mongoimport original.json -d databaseName -c yourcollectionName --jsonArray --drop
Pramod Kharade
1

this will work:

$  mongoimport --db databaseName --collection collectionName --file filePath/jsonFile.json 

2021-01-09T11:13:57.410+0530 connected to: mongodb://localhost/
2021-01-09T11:13:58.176+0530 1 document(s) imported successfully. 0 document(s) failed to import.

Above, I have shared the command along with its response.

1

mongoimport -d <dbname> -c <collectio_name> --file <c:\users\test.json> --jsonArray

hamza
  • As it’s currently written, your answer is unclear. Please [edit] to add additional details that will help others understand how this addresses the question asked. You can find more information on how to write good answers [in the help center](/help/how-to-answer). – Community Jan 17 '22 at 13:36
0

If you try to export this test collection:

> db.test.find()
{ "_id" : ObjectId("5131c2bbfcb94ddb2549d501"), "field" : "Sat Mar 02 2013 13:13:31 GMT+0400"}
{"_id" : ObjectId("5131c2d8fcb94ddb2549d502"), "field" : ISODate("2012-05-31T11:59:39Z")}

with mongoexport (the first date was created with Date(...) and the second one with new Date(...); using ISODate(...) gives the same result as the second line), the mongoexport output will look like this:

{ "_id" : { "$oid" : "5131c2bbfcb94ddb2549d501" }, "field" : "Sat Mar 02 2013 13:13:31 GMT+0400" }
{ "_id" : { "$oid" : "5131c2d8fcb94ddb2549d502" }, "field" : { "$date" : 1338465579000 } }

So you should use the same notation, because strict JSON doesn't have a Date( <date> ) type.

Also, your JSON is not valid: all field names must be enclosed in double quotes, although mongoimport works fine without them.
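
To make that concrete, here is a rough sketch of one of the question's records rewritten in that notation, with field names quoted and the dates in $date form (abbreviated to a few fields; the millisecond values simply reuse the record's own epoch fields multiplied by 1000 and are only illustrative):

{ "jobID" : "2597401", "execEpoch" : 1336766742, "execModify" : { "$date" : 1336766742000 }, "startTime" : { "$date" : 1338497979000 }, "numCores" : "64" }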

You can find additional information in the mongodb documentation.

Denis Nikanorov
  • I adjusted the dates as you suggested and that did get rid of that particular error. However, now I am getting a new one. Here is the new error: ' Sat Mar 2 15:22:07 exception:BSON representation of supplied JSON is too large: Failure parsing JSON string near: data:"1949 Sat Mar 2 15:22:07 Sat Mar 2 15:22:07 imported 0 objects Sat Mar 2 15:22:07 ERROR: encountered 34763 errors' – amber4478 Mar 02 '13 at 20:27
  • I think it's another error related to the field `sizeT:{data: "1949..."}}` – Denis Nikanorov Mar 03 '13 at 20:29
0
  1. Import JSON array data to Atlas from your local laptop: https://www.mongodb.com/docs/atlas/import/mongoimport/
mongoimport --uri "mongodb+srv://<user>:<password>@cluster0.elddaddy.mongodb.net/test?retryWrites=true&w=majority&ssl=true" --collection Providers --drop --file /Users/Documents/data2.json --jsonArray

This command imports the data in the data2.json file to a collection named Providers in the MongoDB database located at the cluster0.dl79aky.mongodb.net URI.

The --drop option is used to drop the existing collection if it exists.

The --jsonArray option specifies that the input file is a JSON array, rather than a single JSON object. This allows us to import an array of documents as a batch.

The --uri option specifies the URI to connect to the database, which includes the user credentials, database name, and connection options.

Here's the breakdown of the URI:

mongodb+srv://: specifies that this is a connection string for a MongoDB Atlas cluster that uses an SRV DNS record

<user>:<password>@: specifies the username and password of the user that is connecting to the database

cluster0.dl79aky.mongodb.net: the name of the MongoDB Atlas cluster that you want to connect to

/test: the name of the database within the cluster that you want to connect to

?retryWrites=true&w=majority: specifies the write concern options for the connection. retryWrites=true specifies that the driver should retry writes if they fail, and w=majority specifies that the write operation should wait for the majority of nodes to acknowledge the write before returning

&ssl=true: specifies that the connection should use SSL/TLS encryption

  2. Check that your data is a JSON array:
[
  {
    "name": "John",
    "age": 30,
    "email": "john@example.com"
  },
  {
    "name": "Jane",
    "age": 25,
    "email": "jane@example.com"
  },
  {
    "name": "Bob",
    "age": 40,
    "email": "bob@example.com"
  }
]