
I would like to understand: if I am loading a CSV file into a DB table, am I supposed to create the table and schema in the database beforehand? Or can the table and schema be created from Spark through SQLContext? If yes, then how? Is there any sample code I can refer to? From what I have tried, all I could understand is that I need to create the table and schema in the DB before I load files from Spark...

I have Spark 1.6.1, Scala 2.10.5, and SQL Server 2008.

Thank you...!!

user3521180
  • Welcome to Stack Overflow! We are a question-and-answer site, not a coders-for-hire service. Please explain what you have tried so far and why it hasn't worked. See: [Why is "Can someone help me?" not an actual question?](http://meta.stackoverflow.com/q/284236) and [How to ask a good question when I'm not sure what I'm looking for?](https://meta.stackoverflow.com/questions/262527/how-to-ask-a-good-question-when-im-not-sure-what-im-looking-for) – Joe C Mar 09 '17 at 07:18
  • My post relates to an FAQ; I am not here for job assistance. Please understand my question before posting such a comment; I have clearly stated my requirement. – user3521180 Mar 09 '17 at 07:20
  • Please read the links I have provided (particularly the first one) to understand why this question does not fit on Stack Overflow, and if you can edit in such a way that it can fit, please do so. – Joe C Mar 09 '17 at 07:22
  • On the question itself, I agree that the table should be there before you do anything with Spark. – Joe C Mar 09 '17 at 07:23
  • That is all I needed, thanks. We can close the thread. – user3521180 Mar 09 '17 at 07:25

1 Answer


This answer may help you: Saving / exporting transformed DataFrame back to JDBC / MySQL. In summary, yes, you will need to have the table pre-created. But from Spark 2.0 you can create the table while loading from CSV.
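For what it's worth, here is a minimal sketch of the whole round trip on Spark 1.6 with Scala, matching the versions in the question. The file path, JDBC URL, credentials, and table name are all placeholders, and it assumes the Databricks spark-csv package is on the classpath (e.g. `--packages com.databricks:spark-csv_2.10:1.5.0`), since a built-in CSV reader only arrived in Spark 2.0:

```scala
import java.util.Properties

import org.apache.spark.{SparkConf, SparkContext}
import org.apache.spark.sql.{SQLContext, SaveMode}

object CsvToSqlServer {
  def main(args: Array[String]): Unit = {
    val sc = new SparkContext(new SparkConf().setAppName("CsvToSqlServer"))
    val sqlContext = new SQLContext(sc)

    // Read the CSV through the spark-csv package (from Spark 2.0 onward
    // this becomes spark.read.option(...).csv(path) with no extra package).
    val df = sqlContext.read
      .format("com.databricks.spark.csv")
      .option("header", "true")      // first row holds the column names
      .option("inferSchema", "true") // derive column types from the data
      .load("/path/to/input.csv")    // placeholder path

    // Placeholder connection details -- adjust for your SQL Server instance.
    val url = "jdbc:sqlserver://localhost:1433;databaseName=mydb"
    val props = new Properties()
    props.setProperty("user", "spark_user")
    props.setProperty("password", "secret")
    props.setProperty("driver", "com.microsoft.sqlserver.jdbc.SQLServerDriver")

    // Depending on the save mode and Spark version, Spark can issue a
    // CREATE TABLE derived from the DataFrame's schema when the target
    // does not exist; pre-creating dbo.my_table yourself keeps the SQL
    // Server column types under your control, which is the safer route.
    df.write.mode(SaveMode.Append).jdbc(url, "dbo.my_table", props)

    sc.stop()
  }
}
```

The trade-off to keep in mind: letting Spark generate the table is convenient, but the column types it derives from the inferred schema may not be what you want on SQL Server 2008, which is why pre-creating the table is the usual advice.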

Tawkir