[Scilab-users] Getting netCDF files into Scilab 6.0.1
arctica1963
arctica1963 at gmail.com
Sun Jul 22 11:11:45 CEST 2018
Hello,
I am looking to use netCDF files generated by the GMT software in Scilab
6.0.1, but there is no easy way to do this other than exporting the data
from the binary grids to either XYZ or ESRI ASCII raster. The latter is the
more compact form, with a basic structure of a header describing the data
limits and layout, followed by the Z values:
ncols        1200
nrows        1200
xllcorner    -10               (lower-left X; lon/lat limits -10/10/-10/10)
yllcorner    -10               (lower-left Y)
cellsize     0.0166666666667   (increment = 1 arc minute)
nodata_value -9999
451.343170166 436.005554199 443.061035156 443.665924072 465.607574463
492.191741943 476.50994873 452.265014648 451.439880371 461.659393311 ......
...
... all of the Z-values
...
-4323.2890625 -4315.12451172 -4307.19384766 -4305.27490234 -4311.49902344
-4317.90771484 -4324.23046875 -4320.47070313 -4308.60107422 -4292.62011719
-4280.39697266
This is a moderately large file, ~20 MB or so. Is there a way to reliably
read this structure and vectorise X, Y, and Z? The data I am looking at is a
grid, so I want to work on the whole grid, whether it is DEM data, gravity, etc.
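For reference, this is the kind of reader I have sketched out so far (untested;
the file name "grid.asc", the fixed 6-line header, and the variable names are
my own assumptions):

```scilab
// Untested sketch: read an ESRI ASCII grid and rebuild x, y, Z.
lines = mgetl("grid.asc");

// Header: each line is "name value"; %*s skips the name token.
ncols    = msscanf(lines(1), "%*s %d");
nrows    = msscanf(lines(2), "%*s %d");
xll      = msscanf(lines(3), "%*s %f");
yll      = msscanf(lines(4), "%*s %f");
cellsize = msscanf(lines(5), "%*s %f");
nodata   = msscanf(lines(6), "%*s %f");

// Z values: join the remaining lines, split on whitespace, convert.
z = strtod(tokens(strcat(lines(7:$), " ")));
z(z == nodata) = %nan;                 // mask no-data cells
Z = matrix(z, ncols, nrows)';          // ESRI order: first row = north

// Coordinate vectors from the header (cell centres):
x = xll + cellsize/2 + (0:ncols-1) * cellsize;
y = yll + cellsize/2 + (nrows-1:-1:0) * cellsize;  // matches Z row order
```

This avoids csvRead entirely, since tokens() copes with a variable number of
spaces between values, but I have not tried it on the full-size file.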
As a test, I converted the grid to XYZ and tried csvRead, having first
replaced the spaces (there was not a constant number of spaces between
values) with commas. The file was in excess of 50 MB, so not that huge
really, but csvRead always failed to read the whole file and dropped out at
one specific line (~7608) with a message about inconsistent column numbers,
i.e. it thought there were only 2 columns when in reality the whole file has
3 (checked in the program Surfer). When one looks at the line specified, it
is clearly 3 columns.
Does csvRead have a limit on what it can load, and if so, what is the
workaround? I suppose one could chop the file into chunks and concatenate
the arrays, but that is a lot more work.
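The chunking idea would presumably look something like the sketch below,
using csvRead's range argument ([firstRow firstCol lastRow lastCol]); again
untested, and the file name and chunk size are placeholders:

```scilab
// Untested sketch: read a large 3-column XYZ CSV in row blocks.
chunk = 50000;                  // rows per block (arbitrary)
M = [];
r = 1;
while %t
    rng = [r, 1, r + chunk - 1, 3];
    try
        blk = csvRead("data.csv", ",", ".", "double", [], [], rng);
    catch
        break;                  // assume the error means we ran past the end
    end
    M = [M; blk];
    if size(blk, 1) < chunk then
        break;                  // short block: that was the last chunk
    end
    r = r + chunk;
end
```

If csvRead chokes on the same bad line inside a block, though, this would
still fail, so it only helps if the problem is file size rather than content.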
Any suggestions?
Thanks
Lester