ArgoData.MITprofAnalysis.add_climatology_factors!Method
add_climatology_factors!(df)

Add temporal interpolation factors (rec0,rec1,fac0,fac1) to DataFrame.

df=CSV.read("csv/profile_positions.csv",DataFrame)
MITprofAnalysis.add_climatology_factors!(df)
ArgoData.MITprofAnalysis.add_level!Method
add_level!(df,k)

Read from e.g. csv_levels/k1.csv and add variables to df.

df=CSV.read("csv/profile_positions.csv",DataFrame)
MITprofAnalysis.add_level!(df,5)
ArgoData.MITprofAnalysis.add_tile!Method
add_tile!(df,Γ,n)

Add a tile index (see MeshArrays.Tiles) to df, which can then be used with e.g. groupby.

input_file=joinpath("MITprof_input","csv","profile_positions.csv")
df=CSV.read(input_file,DataFrame)
G=GriddedFields.load()
MITprofAnalysis.add_tile!(df,G.Γ,30)
ArgoData.MITprofAnalysis.cost_functionsFunction
cost_functions(vv="prof_T",JJ=[])

Loop through files and compute the number of profiles, the number of non-blank profiles, the mean number of levels, and the mean cost.

pth="MITprof/"
nt,np,nz,cost=MITprofAnalysis.cost_functions(pth,"prof_S")

using JLD2
jldsave(joinpath("csv","prof_S_stats.jld2"); nt,np,nz,cost)
ArgoData.MITprofAnalysis.csv_of_positionsFunction
csv_of_positions(path)

Create a table (DataFrame) of the positions and dates obtained by looping through the files in path. Additional information is also recorded, such as the float ID, the position on the ECCO grid (pos), and the number of valid data points for T and S (nbT, nbS).

using ArgoData
path="MITprof_Argo_yearly/"
csv_file="csv/profile_positions.csv"

using MeshArrays
γ=GridSpec("LatLonCap",MeshArrays.GRID_LLC90)
Γ=GridLoad(γ)

df=MITprofAnalysis.csv_of_positions(path,Γ)
CSV.write(csv_file, df)
ArgoData.MITprofAnalysis.csv_of_variablesMethod
csv_of_variables(name::String)

Create an Array of all values for one variable, obtained by looping through the files in path.

using Distributed
@everywhere using ArgoData, CSV, DataFrames
@everywhere list_v=("prof_T","prof_Testim","prof_Tweight","prof_S","prof_Sestim","prof_Sweight")
@distributed for v in list_v
    output_file="csv/"*v*".csv"
    tmp=MITprofAnalysis.csv_of_variables(v)
    CSV.write(output_file,DataFrame(tmp,:auto))
end
ArgoData.MITprofAnalysis.prepare_interpolationMethod
prepare_interpolation(Γ,lon,lat)

Alias for InterpolationFactors(Γ,lon,lat).

The loop below creates interpolation coefficients for all data points.

The results are stored in a file called csv/profile_coeffs.jld2 at the end.

using SharedArrays, Distributed

@everywhere begin
    using ArgoData
    G=GriddedFields.load()
    df=MITprofAnalysis.read_pos_level(5)

    np=size(df,1)
    n0=10000
    nq=Int(ceil(np/n0))
end

(f,i,j,w)=( SharedArray{Int64}(np,4), SharedArray{Int64}(np,4),
            SharedArray{Int64}(np,4), SharedArray{Float64}(np,4) )

@sync @distributed for m in 1:nq
    ii=n0*(m-1) .+collect(1:n0)
    ii[end]>np ? ii=n0*(m-1) .+collect(1:n0+np-ii[end]) : nothing
    tmp=MITprofAnalysis.prepare_interpolation(G.Γ,df.lon[ii],df.lat[ii])
    f[ii,:].=tmp[1]
    i[ii,:].=tmp[2]
    j[ii,:].=tmp[3]
    w[ii,:].=tmp[4]
end

using JLD2
fil=joinpath("csv","profile_coeffs.jld2")
co=[(f=f[ii,:],i=i[ii,:],j=j[ii,:],w=w[ii,:]) for ii in 1:np]
save_object(fil,co)
ArgoData.MITprofAnalysis.read_pos_levelFunction
read_pos_level(k=1; input_path="")

Read from csv/profile_positions.csv and e.g. csv_levels/k1.csv, parse pos, call add_level!(df,k), and return the result as a DataFrame.

df=MITprofAnalysis.read_pos_level(5)
ArgoData.MITprofAnalysis.subsetMethod
subset(df;lons=(-180.0,180.0),lats=(-90.0,90.0),dates=())

Return the subset of df that falls within the specified position and date ranges.

using Dates
df=CSV.read("csv/profile_positions.csv",DataFrame)
d0=DateTime("2012-06-11T18:50:04")
d1=DateTime("2012-07-11T18:50:04")
df1=MITprofAnalysis.subset(df,dates=(d0,d1))
df2=MITprofAnalysis.subset(df,lons=(0,10),lats=(-5,5),dates=(d0,d1))
ArgoData.MITprofAnalysis.trimMethod
trim(df)

Filter out data points that lack T, Te, etc.

df=CSV.read("csv/profile_positions.csv",DataFrame)
MITprofAnalysis.add_level!(df,1)
df1=MITprofAnalysis.trim(df)
ArgoData.MITprofStandardType

MITprofStandard

Container for the data of an MITprof format file.

  • filename : file name
  • 1D arrays: lon,lat,date,depth,ID
  • 2D arrays: T,S,Te,Se,Tw,Sw
ArgoData.MITprofStandardMethod

MITprofStandard

Create a MITprofStandard view of an MITprof file.

fil="nc/5903955_MITprof.nc"
mp=MITprof.MITprofStandard(fil)
ArgoData.ProfileNativeType

ProfileNative

Container for a multivariate profile read from a GDAC Argo file.

  • 1D arrays: lon,lat,date,ymd,hms,pnumtxt,direc,DATAMODE,isBAD
  • 2D arrays: T,S,pressure,depth,T_ERR,SERR
ArgoData.ProfileStandardType

ProfileStandard

Container for a multivariate profile in MITprof format.

  • 2D arrays: T,S,Testim,Sestim,Tweight,Sweight,Ttest,Stest,T_ERR,SERR
ArgoData.MITprofStat.stat_dfMethod
stat_df(df::DataFrame,by::Symbol,va::Symbol)

Compute statistics (mean, median, variance) of variable va from DataFrame df grouped by by.
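
A minimal sketch, assuming the MITprof_input setup from the stat_driver example below; the grouping column :pos and variable :Td follow the other examples, and sdf is just an illustrative name.

using ArgoData
df1=MITprofAnalysis.trim( MITprofAnalysis.read_pos_level(1, input_path="MITprof_input") ) # assumes MITprof_input is available locally
sdf=MITprofStat.stat_df(df1,:pos,:Td)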

ArgoData.MITprofStat.stat_driverMethod
stat_driver(;varia=:Td,level=1,years=2004:2022,output_to_file=false,
nmon=1, npoint=1, sta=:median, nobs=1, input_path="", output_path="")

P=( variable=:Td, level=10, years=2004:2007, 
    statistic=:median, npoint=3, nmon=3, 
    input_path="MITprof_input",
    output_path=joinpath(tempdir(),"MITprof_output"),
    output_to_file=false
    )

MITprofStat.stat_driver(input_path=P.input_path,varia=P.variable,level=P.level,years=P.years,
        nmon=P.nmon, npoint=P.npoint, sta=P.statistic, 
        output_path=P.output_path, output_to_file=P.output_to_file)
ArgoData.MITprofStat.stat_grid!Method
stat_grid!(ar::Array,df::DataFrame,va::Symbol,sta::Symbol; func=(x->x))

Compute map ar of statistic sta of variable va from DataFrame df. This assumes that df.pos contains indices into the array ar, which are used to group df (via groupby).
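
A minimal sketch, assuming the MITprof_input setup from the other examples; ar, df1, :Td, and :median are illustrative choices.

using ArgoData
G=GriddedFields.load()
df1=MITprofAnalysis.trim( MITprofAnalysis.read_pos_level(1, input_path="MITprof_input") ) # assumes MITprof_input is available locally
ar=G.array()
MITprofStat.stat_grid!(ar,df1,:Td,:median)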

ArgoData.MITprofStat.stat_monthly!Method
stat_monthly!(arr::Array,df::DataFrame,va::Symbol,sta::Symbol,years,G::NamedTuple;
                func=(x->x), nmon=1, npoint=1, nobs=1)

Compute maps of statistic sta for variable va from DataFrame df for the specified years. This assumes that df.pos contains indices into the array arr, which are used to group df (via groupby). For each year in years, twelve fields are computed – one per month.

using ArgoData
G=GriddedFields.load()
df1=MITprofAnalysis.trim( MITprofAnalysis.read_pos_level(1, input_path="MITprof_input") )

years=2004:2007
arr=G.array(12,length(years))
MITprofStat.stat_monthly!(arr,df1,:Td,:median,years,G,nmon=3);
ArgoData.MITprofStat.stat_monthly!Method
stat_monthly!(ar::Array,df::DataFrame,va::Symbol,sta::Symbol,y::Int,m::Int,G::NamedTuple;
                func=(x->x), nmon=1, npoint=1, nobs=1)

Compute map ar of statistic sta for variable va from DataFrame df for year y and month m. This assumes that df.pos contains indices into the array ar, which are used to group df (via groupby).

using ArgoData
G=GriddedFields.load();

P=( variable=:Td, level=10, year=2010, month=1, input_path="MITprof_input",
    statistic=:median, npoint=9, nmon=3, rng=(-1.0,1.0))

df1=MITprofAnalysis.trim( MITprofAnalysis.read_pos_level(P.level,input_path=P.input_path) )

GriddedFields.update_tile!(G,P.npoint);
ar1=G.array();
MITprofStat.stat_monthly!(ar1,df1,
    P.variable,P.statistic,P.year,P.month,G,nmon=P.nmon,npoint=P.npoint);

MITprofPlots.stat_map(ar1,G,rng=P.rng)
ArgoData.GDAC.download_fileFunction
download_file(file::DataFrameRow,suff="prof",ftp=missing)

Get folder and wmo from the data frame row, then call download_file.

using ArgoData
files_list=GDAC.files_list()
file=GDAC.download_file(files_list[10000,:])
ArgoData.GDAC.download_fileFunction
download_file(folder::String,wmo::Int,suff="prof",ftp=missing)

Download an Argo file from the GDAC server (<ftp://ftp.ifremer.fr/ifremer/argo/dac/> by default) to a temporary folder (joinpath(tempdir(),"Argo_DAC_files")).

The file name is given by folder and wmo. For example, 13857_prof.nc for wmo=13857 is from the aoml folder.

The default for suff is "prof" which means we'll download the file that contains the profile data. Other possible choices are "meta", "Rtraj", "tech".

If the ftp argument is omitted or isa(ftp,String) then Downloads.download is used. If, alternatively, isa(ftp,FTP) then FTPClient.download is used.

Example :

using ArgoData
GDAC.download_file("aoml",13857)

#or:
ftp="ftp://usgodae.org/pub/outgoing/argo/dac/"
GDAC.download_file("aoml",13857,"meta",ftp)
ArgoData.GDAC.files_listMethod
files_list(fil::String)

Get the list of Argo float files from a csv file with two columns: folder and wmo.

using ArgoData
fil="https://raw.githubusercontent.com/euroargodev/ArgoData.jl/gh-pages/dev/Argo_float_files.csv"
files_list=GDAC.files_list(fil)
ArgoData.GriddedFields.interp!Method
interp!(T_in::MeshArray,Γ,mp::MITprofStandard,📚,T_out)
interp!(T_in::MeshArray,Γ,mp::MITprofStandard,T_out)

Interpolate T_in, defined on grid Γ, to the locations specified in mp and store the result in array T_out.

Providing interpolation coefficients 📚 computed beforehand speeds up repeated calls.

Example:

using ArgoData, MeshArrays, Glob
G=GriddedFields.load()

fil=glob("*_MITprof.nc","MITprof")[1000]
mp=MITprofStandard(fil)

(f,i,j,w)=InterpolationFactors(G.Γ,mp.lon[:],mp.lat[:]);
📚=(f=f,i=i,j=j,w=w)

T=[similar(mp.T) for i in 1:12]
[interp!(G.T[i],G.Γ,mp,📚,T[i]) for i in 1:12]

In the above example, [interp!(G.T[i],G.Γ,mp,T[i]) for i in 1:12] would be much slower.

ArgoData.GriddedFields.interpMethod
interp(T_in::MeshArray,Γ,mp::MITprofStandard)

Interpolate T_in, defined on grid Γ, to the locations specified in mp.

For a more efficient, in-place option, see interp!.

fil="MITprof/1901238_MITprof.nc"
mp=MITprofStandard(fil)

interp(G.T[1],G.Γ,mp)
ArgoData.GriddedFields.loadMethod
load()

Load gridded fields from files (and download the files if needed). Originally this function returned Γ,msk,T,S,σT,σS,array.

The embedded array() function returns a 2D array initialized to missing; array(1), array(3,2), etc. add dimensions to the resulting array.

using OceanStateEstimation, MITgcm; OceanStateEstimation.MITPROFclim_download()
using ArgoData; gridded_fields=GriddedFields.load()
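
For example (array sizes are illustrative):

a2=gridded_fields.array()       # 2D array initialized to missing
a3=gridded_fields.array(12)     # adds a third dimension of size 12
a4=gridded_fields.array(12,4)   # adds two dimensions (e.g. 12 months x 4 years)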
ArgoData.MITprof.formatFunction
format(meta,gridded_fields,input_file,output_file="")

From an Argo file name as input: read the input file content, process it into the MITprof format, and write the result to an MITprof file.

MITprof.format(meta,gridded_fields,input_file)
ArgoData.MITprof.formatMethod
format(gridded_fields,input_file)

From an Argo file name as input: read the input file content, process it into the MITprof format, and write the result to an MITprof file.

MITprof.format(gridded_fields,input_file)
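
A hedged sketch of a fuller workflow; it assumes that GDAC.download_file returns the path of the downloaded Argo file, which may not hold in all cases.

using ArgoData
gridded_fields=GriddedFields.load()
input_file=GDAC.download_file("aoml",13857) # assumes the downloaded file path is returned
MITprof.format(gridded_fields,input_file)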
ArgoData.MITprof.format_loopMethod
format_loop(II)

Loop over files and call format.

gridded_fields=GriddedFields.load()
fil=joinpath(tempdir(),"Argo_MITprof_files","input","Argo_float_files.csv")
files_list=GDAC.files_list(fil)
MITprof.format_loop(gridded_fields,files_list,1:10)
ArgoData.MITprof.writeMethod
MITprof.write(meta,profiles,profiles_std;path="")

Create an MITprof file from metadata and profiles during MITprof.format.

MITprof.write(meta,profiles,profiles_std)
ArgoData.MITprof.writeMethod
write(fil::String,mp::MITprofStandard)

Create an MITprof file from an MITprofStandard input.
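
A minimal sketch, reusing the MITprofStandard example file from above; the output file name is hypothetical.

mp=MITprof.MITprofStandard("nc/5903955_MITprof.nc")
MITprof.write(joinpath(tempdir(),"5903955_MITprof_copy.nc"),mp) # hypothetical output file name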

ArgoData.MITprof.writeMethod
write(fil::String,mps::Vector{MITprofStandard})

Create an MITprof file from a vector of MITprofStandard inputs.
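
A minimal sketch; the input file names reuse paths from the other examples and are assumed to exist, while the output file name is hypothetical.

fil_list=["nc/5903955_MITprof.nc","MITprof/1901238_MITprof.nc"] # assumed to exist
mps=[MITprof.MITprofStandard(f) for f in fil_list]
MITprof.write(joinpath(tempdir(),"combined_MITprof.nc"),mps) # hypothetical output file name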

ArgoData.ArgoTools.interp_zMethod
interp_z(x,y,xi; keep_mask=false)

Call Interpolations.linear_interpolation with extrapolation_bc=Flat().

If keep_mask=true then retain NaN values, which are sometimes used to indicate that the sea floor has been reached (e.g. in model output).
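
A minimal sketch with made-up profile values, for illustration only.

using ArgoData
x=collect(0.0:10.0:1000.0)      # depth levels (synthetic)
y=20.0 .- 15.0*x/1000.0         # synthetic temperature profile
xi=collect(5.0:10.0:995.0)      # target depths
yi=ArgoTools.interp_z(x,y,xi)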

ArgoData.ArgoTools.metaMethod
meta(input_file,output_file)

Get parameters to call MITprof.format which will read from input_file to create output_file.
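
A minimal sketch; both file names are hypothetical placeholders.

input_file="13857_prof.nc"       # hypothetical input file name
output_file="13857_MITprof.nc"   # hypothetical output file name
meta=ArgoTools.meta(input_file,output_file)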

ArgoData.ArgoTools.meta_initFunction
meta_init(fil::String)

Get parameters to call MITprof.format from yaml file (fil, e.g. "../examples/ArgoToMITprof.yml").
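
For example, using the yaml file path mentioned above (assuming it exists locally):

meta=ArgoTools.meta_init("../examples/ArgoToMITprof.yml") # adjust the path as needed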

ArgoData.ArgoTools.monthly_climatology_factorsMethod
monthly_climatology_factors(date)

If date is a DateTime, a vector of DateTime, or a date in days since DateTime(0), then compute the corresponding climatological months (1 to 12) and interpolation factors (0.0 to 1.0), and return the result as fac0,fac1,rec0,rec1.

For example :

ff(x)=sin((x-0.5)/12*2pi)
(fac0,fac1,rec0,rec1)=monthly_climatology_factors(ArgoTools.DateTime(2011,1,10))

gg=fac0*ff(rec0)+fac1*ff(rec1)
(ff(rec0),gg,ff(rec1))
ArgoData.ArgoTools.sw_adtgMethod
sw_adtg(S,T,P)

Calculate the adiabatic temperature gradient as per UNESCO 1983 routines from salinity (S; in psu), in situ temperature (T; in °C), and pressure (P; in decibar).

adtg = sw_adtg(S,T,P)
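
For example, with illustrative values (S in psu, T in °C, P in decibar):

adtg = ArgoTools.sw_adtg(35.0,10.0,1000.0) # illustrative values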
ArgoData.ArgoTools.sw_dpthMethod
sw_dpth(P,LAT)

Calculate depth in meters from pressure (P; in decibars) and latitude (LAT; in °N).

d = sw_dpth(100.0,20.0)
ArgoData.ArgoTools.sw_ptmpFunction
sw_ptmp(S,T,P,PR)

Calculate potential temperature as per UNESCO 1983 report from salinity (S; in psu), in situ temperature (T; in °C), and pressure (P; in decibar) relative to PR (in decibar; 0 by default).

ptmp = sw_ptmp(S,T,P,PR=missing)
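
For example, with illustrative values and a reference pressure of 0 decibar:

ptmp = ArgoTools.sw_ptmp(35.0,10.0,1000.0,0.0) # illustrative values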