[Scilab-users] Trouble with reading whitespace separated csv

Rafael Guerra jrafaelbguerra at hotmail.com
Sat Nov 18 17:25:27 CET 2017


Hi Samuel,

Not a bug, but maybe it could be enhanced for the conversion='double' case by treating consecutive spaces or tabs as a single separator?
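Until such an enhancement exists, one possible workaround is to collapse runs of spaces or tabs before calling csvRead(). This is only a sketch, assuming the file path is in a variable named log_file; the temporary-file name is arbitrary:

```scilab
// Workaround sketch: squeeze runs of spaces/tabs into a single comma,
// then let csvRead() parse the cleaned copy.
lines = mgetl(log_file);                        // read the raw text lines
lines = stripblanks(lines);                     // drop leading/trailing blanks
lines = strsubst(lines, "/[ \t]+/", ",", "r");  // regexp: collapse whitespace runs
tmp = fullfile(TMPDIR, "cleaned.csv");          // hypothetical temp-file name
mputl(lines, tmp);
[log_data, comments] = csvRead(tmp, ",", ".", "double");
```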

Regards,
Rafael

From: users [mailto:users-bounces at lists.scilab.org] On Behalf Of Samuel Gougeon
Sent: Saturday, November 18, 2017 5:07 PM
To: Users mailing list for Scilab <users at lists.scilab.org>
Subject: Re: [Scilab-users] Trouble with reading whitespace separated csv

Hello Richard,

Le 17/11/2017 à 17:41, Richard llom a écrit :

Hello,

I'm trying to read in a file with whitespace-separated values, with

[log_data, comments] = csvRead(log_file,ascii(32),'.','double');

However, I'm constantly getting this error:

Warning: Inconsistency found in the columns. At line 2, found 11 columns while the previous had 5.

but with changing column numbers...



Any clue what's going on?

Using space as a column separator is a very bad choice. For instance, in the sample that you posted, the following 3 consecutive rows

16800 23200  1  1 10  -3.4 1005 151  4.0 5  2.6  89    1   28 230 -282  2

16800 23200  1  1 11  -3.1 1004 129  3.5 6  2.6  86   32   70 247 -285  2

16800 23200  1  1 12  -1.4 1004 148  5.2 4  2.6  80  146   54 245 -289  2

are then read by csvRead() as follows (shown here with "," as the separator, to make the empty cells visible):
16800,23200,,1,,1,10,,-3.4,1005,151,,4.0 5,,2.6,,89,,,,1,,,28,230,-282,,2
16800,23200,,1,,1,11,,-3.1,1004,129,,3.5 6,,2.6,,86,,,32,,,70,247,-285,,2
16800,23200,,1,,1,12,,-1.4,1004,148,,5.2 4,,2.6,,80,,146,,,54,245,-289,,2

with many empty columns and a column shift, which yields the expected error.
So there is no bug in csvRead() here. As Rafael noted, csvRead() is not suited to the
job here. Since your data are purely numerical, fscanfMat() should work instead, indeed.
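For instance, a minimal sketch (assuming the file path is in a variable named log_file): fscanfMat() reads whitespace-separated numeric tables directly and returns any leading non-numeric header lines as text.

```scilab
// fscanfMat handles runs of spaces/tabs between numbers natively;
// comments receives any non-numeric header lines from the top of the file.
[log_data, comments] = fscanfMat(log_file);
```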

Samuel

