I'm working on recovering from a system crash (ouch) in which I lost my Bookpedia data (double ouch).
The good news is that as I regularly use both Goodreads and LibraryThing, the data exists.
The bad news is that I'm running into problems with the .csv import. It seems that both Goodreads and LibraryThing export .csv files wherein many of the fields, while separated by commas, are also enclosed within quotation marks, in order to get around instances where commas appear within entries. For example (from my Goodreads export):
"Spock, Messiah!","Theodore R. Cogswell","Cogswell, Theodore R.","","0553101595","9780553101591",0,3.50,"Bantam Books","Paperback","182",1976,1976,,06/14/09,to-read,to-read,,,,,,
So, rather than importing as:

Title: Spock, Messiah!
Author: Theodore R. Cogswell
(skipped)
(skipped)
ISBN: 0553101595

...and so on, it imports as:

Title: "Spock
Author: Messiah!"
(skipped)
(skipped)
ISBN: Theodore R."

...and so on. Obviously, this results in a lot of useless data...and when I'm importing somewhere above 1,600 entries, that's a lot of useless data.
I'm trying to find a solution, but so far I've had little luck and a lot of frustration. Unfortunately, a straight search-and-replace is difficult: replacing commas with tabs would create the same problem inside quoted entries, and since some fields use the quotes and some don't, I can't do a search-and-replace on the quote-comma-quote combination either.
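(In case it helps anyone stuck in the same spot: a regular CSV parser already understands the quoting rules, so one workaround is to let it re-read the export and write the fields back out tab-separated, which sidesteps the embedded commas entirely. Here's a rough sketch in Python; the file names are just placeholders for your own export.)

```python
import csv

def csv_to_tsv(src_path, dst_path):
    """Re-read a comma-separated export (where quoted fields may
    contain commas) and write it back out tab-separated."""
    with open(src_path, newline="") as src, \
         open(dst_path, "w", newline="") as dst:
        reader = csv.reader(src)                  # handles quotes and embedded commas
        writer = csv.writer(dst, delimiter="\t")  # tab-separated output
        for row in reader:
            writer.writerow(row)

# Example (placeholder file names):
# csv_to_tsv("goodreads_export.csv", "goodreads_export.tsv")
```

Running the export through that and then importing the .tsv should keep "Spock, Messiah!" in one field instead of splitting it in two.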
Any advice on how to work around this -- or, perhaps, a fix in the next update to Bookpedia -- would be greatly appreciated!
Thanks much!