Loading NetCDF datafiles
Underlying assumptions
- There are NetCDF files in a folder (in what follows, argument `foldername`) that contain (time-dependent) scalar/velocity fields on a grid that is regular in space and time and spans the whole globe in longitude/latitude units (in particular, you do not care about the poles).
- Each file corresponds to a single-time snapshot of this field, and the spacing between times is constant.
- The filenames are ordered ascending in time.
- Each file has fields with longitude and latitude coordinates as 1d arrays (uniform across files).
- You have a regular expression (e.g. `r"^nrt_global_allsat_phy_l4_([0-9][0-9][0-9][0-9])([0-9][0-9])([0-9][0-9])_.*.nc$"`) that matches only the files you want (in what follows, argument `schema`).
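As a sketch of how such a schema selects files, consider the example regex above applied to a hypothetical filename (made up for illustration):

```julia
# Illustration only: the filename below is hypothetical.
schema = r"^nrt_global_allsat_phy_l4_([0-9][0-9][0-9][0-9])([0-9][0-9])([0-9][0-9])_.*.nc$"
m = match(schema, "nrt_global_allsat_phy_l4_20170108_20170115.nc")
m.captures  # ("2017", "01", "08"): year, month, day of the snapshot
```

A filename not matching the schema gives `match(schema, fname) === nothing` and is skipped.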
The package assumes that the velocities are given at grid points, as opposed to, for example, at the center of each grid cell. This assumption may not hold exactly for a given velocity field, including those shown in the examples here.
Loading velocity fields
`OceanTools.read_ocean_velocities` — Function

```
read_ocean_velocities(foldername, start_date, end_date, boundary_t, [schema, LL_space=nothing, UR_space=nothing, ...])
```

Reads velocity fields in a space-time window. Uses the whole globe if `LL_space` and `UR_space` are `nothing`. Otherwise, the set of points loaded will spatially include the rectangle bounded by `LL_space` and `UR_space`.
The first timestep loaded is the first one in the folder `foldername` matching the regular expression `schema` whose "time" variable (converted to `Int` via the kwarg `date_to_int`) is not less than `start_date`.

The spacing of "time" is assumed to be 1 day; change `date_to_int` if your data are different. (Note that the velocity fields in the files are assumed to be in m/s and are converted to deg/day; you will have to rescale manually if the spacing is not 1 day.) The range loaded will include `end_date`.
Supports 360-periodic longitude in the sense that `LL_space[1]` can be larger than `UR_space[1]`. However, the window cannot extend over more than one period. If you are very close to one full period, use the whole globe to avoid issues.
Other keyword arguments are:

- `lon_label`, with default `"longitude"`
- `lat_label`, with default `"latitude"`
- `remove_nan`, with default `true`: sets missing values to 0 instead of `NaN`
- `array_ctor`, with default `SharedArray{Float64}`: specifies the underlying storage to use
- `date_to_int`, with default `_daysSince1950`: the method for converting `DateTime` objects to `Int`
- `filename_match_to_date` (if not equal to `nothing`): converts the match of the regex `schema` against the filename into the (integer) date of the contents
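As an illustration of the last keyword, here is a hypothetical `filename_match_to_date` that turns the captures of the example `schema` above (year, month, day) into days since 1950-01-01, mirroring the convention suggested by the name of the default `date_to_int` (`_daysSince1950`); the function name and the exact convention are assumptions:

```julia
using Dates

# Hypothetical converter: regex match of the filename -> integer date of the contents.
function match_to_days_since_1950(m::RegexMatch)
    y, mo, d = parse.(Int, m.captures)              # captures are year, month, day strings
    Dates.value(Date(y, mo, d) - Date(1950, 1, 1))  # whole days since 1950-01-01
end
```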
Returns `(res_full, Ust1, (Lon, Lat, times))`, where `res_full` is the corresponding `ItpMetadata` object and `Ust1` is a slice of the x-component at the first timestep without `NaN`s removed.
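A hedged usage sketch follows. The folder path, dates, and spatial window are made-up values; the `boundary_t` value is a placeholder (consult the package documentation for valid choices), and the exact argument types (e.g. `DateTime` vs. integer dates) are assumptions here:

```julia
using OceanTools, Dates

schema = r"^nrt_global_allsat_phy_l4_([0-9][0-9][0-9][0-9])([0-9][0-9])([0-9][0-9])_.*.nc$"
res_full, Ust1, (Lon, Lat, times) = read_ocean_velocities(
    "/path/to/folder",        # foldername (made up)
    DateTime(2017, 1, 1),     # start_date
    DateTime(2017, 1, 30),    # end_date
    boundary_t,               # placeholder: temporal boundary behavior
    schema;
    LL_space = (100.0, 15.0), # lower-left (lon, lat) corner
    UR_space = (180.0, 48.0), # upper-right (lon, lat) corner
)
```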
Loading scalar fields
`OceanTools.read_ocean_scalars` — Function

```
read_ocean_scalars(args...; scalar_field_name="ssh", kwargs...)
```

Reads in a scalar field; otherwise behaves like `read_ocean_velocities`. The resulting `data` field in the `ItpMetadata` is a 1-tuple with an array (of the type given by `array_ctor`).
In each case, the end result contains an `ItpMetadata` object holding all of the data needed for interpolation.
Pseudocode example

```julia
p, _ = read_ocean_velocities(arguments)
uv_trilinear(u, p, t)
```
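The interpolated velocity can then serve as the right-hand side of an ODE for trajectory integration. The sketch below assumes OrdinaryDiffEq.jl, an out-of-place `uv_trilinear(u, p, t)` signature as in the pseudocode above, and made-up initial position and time window; it is an assumed usage pattern, not documented package API:

```julia
using OrdinaryDiffEq

x0 = [150.0, 30.0]   # initial (lon, lat) position, made up
tspan = (0.0, 10.0)  # time window in the data's (integer-date) units, assumed
# p is the ItpMetadata returned by read_ocean_velocities above.
prob = ODEProblem(uv_trilinear, x0, tspan, p)
sol = solve(prob, Tsit5())
```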