Recommendations can vary based on your implementation. Here are some notes taken directly from the MySQL documentation:
Bulk Data Loading Tips
When importing data into InnoDB, make sure that MySQL does not have
autocommit mode enabled because that requires a log flush to disk for
every insert. To disable autocommit during your import operation,
surround it with SET autocommit and COMMIT statements.
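For example, the import can be wrapped like this (a minimal sketch; your actual import statements go in the middle):

    SET autocommit=0;
    -- your INSERT / LOAD DATA statements go here
    COMMIT;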
Use the multiple-row INSERT syntax to reduce communication overhead
between the client and the server if you need to insert many rows:
INSERT INTO yourtable VALUES (1,2), (5,5), ...;
If you are doing a huge batch insert, try to avoid the "SELECT LAST_INSERT_ID()" that follows each insert, as it seriously slows down the insertions (on the order of turning a 6 minute insert into a 13 hour insert). If you need the number for another insertion (into a subtable, perhaps), assign your own numbers to the IDs. This obviously only works if you are sure nobody else is doing inserts at the same time.
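A minimal sketch of assigning the IDs yourself; the parent/child table and column names here are made up purely for illustration:

    -- Hypothetical tables: parent(id, name) and child(parent_id, val).
    -- Assign parent ids explicitly instead of reading LAST_INSERT_ID() after every row.
    INSERT INTO parent (id, name) VALUES (1001, 'first'), (1002, 'second');
    INSERT INTO child (parent_id, val) VALUES (1001, 'a'), (1001, 'b'), (1002, 'c');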
As mentioned already, you can increase the size of the InnoDB buffer pool (the innodb_buffer_pool_size variable). This is generally a good idea: the default size is quite small, and most systems can afford to lend more memory to the pool. A larger pool speeds up most queries, especially SELECTs, since more pages stay cached in memory between queries. The insert buffer (now called the change buffer) is also a section of the buffer pool; it caches index changes for recently inserted rows, which helps if you are basing future inserts on values from previous inserts. Hope this helps :)
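If you want to check or raise the pool size, something along these lines works (the 2G value is just an example, size it to your server, and note that resizing without a restart is only possible on newer MySQL versions):

    -- Check the current size (in bytes):
    SELECT @@innodb_buffer_pool_size;
    -- Usual approach: set it in my.cnf / my.ini and restart, e.g.
    --   [mysqld]
    --   innodb_buffer_pool_size = 2G
    -- On MySQL 5.7.5+ it can also be resized at runtime:
    SET GLOBAL innodb_buffer_pool_size = 2 * 1024 * 1024 * 1024;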