fix for loading a resource with large amount of data in a single cell #112
OriHoch wants to merge 3 commits into frictionlessdata:master
Conversation
I think we can probably use
force-pushed from ccf920c to d24ac21
@akariv what's blocking merging this one in?
@rufuspollock see my comment from Jan 3rd - basically don't add a parameter, but always set it with a default large number (which isn't
I think it's better to leave these system-specific details out of datapackage-pipelines. There will always be some limit on field size (if not from `field_size_limit` then from the memory limit), so it's better to leave it to the calling code to set.
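For context, the limit being discussed is the Python standard library's `csv.field_size_limit`, which calling code can raise before loading a resource. A minimal sketch (the 10 MB value here is illustrative, not the number proposed in this PR):

```python
import csv

# The stdlib csv module rejects any cell longer than the current limit
# with "_csv.Error: field larger than field limit (131072)".
# Called with no arguments, field_size_limit() returns the current limit.
DEFAULT_LIMIT = csv.field_size_limit()

# Calling code can raise the limit before loading a large resource;
# the value below is illustrative only.
csv.field_size_limit(10 * 1024 * 1024)

print(DEFAULT_LIMIT, csv.field_size_limit())
```

Note that the new limit applies process-wide, which is part of why pushing this decision to the calling code (rather than hard-coding it in the pipeline) is being debated here.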
documented this issue in the README - #128
Running `load_resource` for a CSV file which has a lot of data in a single cell raises an exception. Fixed by adding a `large-resource` boolean parameter to `load-resource`, which allows opting in to the fix for this problem. I'm not sure what the implications of always applying this fix would be.
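The failure mode described above can be reproduced directly with the stdlib `csv` module; this sketch artificially lowers the limit so the oversized cell stays small (by default the threshold is 131072 characters):

```python
import csv
import io

# Lower the limit artificially so a short cell exceeds it.
# field_size_limit(new_limit) returns the previous limit.
old_limit = csv.field_size_limit(16)

try:
    # The second cell is 24 characters, over our 16-character limit,
    # so the reader raises csv.Error while parsing.
    list(csv.reader(io.StringIO("a,bbbbbbbbbbbbbbbbbbbbbbbb\n")))
    failed = False
except csv.Error:
    failed = True

# Restore the previous limit, since it is process-wide state.
csv.field_size_limit(old_limit)

print(failed)
```

This is the exception the `large-resource` parameter opts out of, by raising the limit before the resource is read.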