I have a large dataset spanning 5 years at 5-minute resolution. The idea is to interpolate the data to cartesian grids (the data is originally in azimuth/range coordinates).
At the moment the regridding is done for each time step: the polar coordinates are converted with wrl.georef.spherical_to_xyz() and the data is interpolated to a new cartesian grid using wrl.comp.togrid().
After completing these steps I want to analyse the statistics over every azimuth to look for a clutter filter.
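For reference, the per-timestep regridding currently looks roughly like the sketch below (the sweep geometry, radar site and variable names are placeholders, not my actual setup):

```python
import numpy as np
import wradlib as wrl

# placeholder sweep geometry: 360 azimuths, 1 km gates out to ~128 km
r = np.arange(500.0, 128000.0, 1000.0)
az = np.arange(0.0, 360.0)
elev = 0.5
site = (7.0, 51.0, 100.0)  # lon, lat, alt -- placeholder radar location
data = np.random.uniform(0.0, 60.0, (az.size, r.size))  # one sweep of dBZ

# bin centers in a local azimuthal-equidistant (x, y, z) system
xyz, aeqd = wrl.georef.spherical_to_xyz(r, az, elev, site, squeeze=True)

# target cartesian grid covering the scan area
x = np.linspace(xyz[..., 0].min(), xyz[..., 0].max(), 256)
y = np.linspace(xyz[..., 1].min(), xyz[..., 1].max(), 256)
trg = np.stack(np.meshgrid(x, y), axis=-1).reshape(-1, 2)

gridded = wrl.comp.togrid(
    xyz[..., :2].reshape(-1, 2),  # source bin coordinates
    trg,                          # target grid points
    r.max() + 500.0,              # radius of influence around the radar
    np.array([0.0, 0.0]),         # radar sits at the origin of the aeqd system
    data.ravel(),
    wrl.ipol.Nearest,
)
gridded = np.ma.masked_invalid(gridded).reshape(y.size, x.size)
```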
Some questions are:
- how can the new cartesian grid be converted back to spherical (polar) coordinates?
- is there a more efficient method for regridding and saving the files as netcdf?
Hi @psradar,
welcome to openradar discourse.
If you are looking for a clutter filter, why are you moving to cartesian in the first place? Wouldn't it be better to compute the statistics directly on the polar source data? Anyway, here are some pointers to your questions.
- You might use the cart_to_irregular_spline function (see Interpolation — wradlib). An example is in Beam Blockage Calculation using a DEM — wradlib; it is not quite fitting, but you should get the idea. See the first sketch after this list.
- If you are working with xarray-based data, you might use the power of dask to distribute the whole timeseries of sweeps, or use some dedicated gridding package. See the second sketch after this list.
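On the first point, here is a rough, untested sketch of mapping a gridded field back onto polar (azimuth, range) coordinates with cart_to_irregular_spline; the geometry and variable names are placeholders matching the sketch in your question:

```python
import numpy as np
import wradlib as wrl

# placeholder geometry, matching the sketch in the question
r = np.arange(500.0, 128000.0, 1000.0)
az = np.arange(0.0, 360.0)
xyz, aeqd = wrl.georef.spherical_to_xyz(r, az, 0.5, (7.0, 51.0, 100.0), squeeze=True)
x = np.linspace(xyz[..., 0].min(), xyz[..., 0].max(), 256)
y = np.linspace(xyz[..., 1].min(), xyz[..., 1].max(), 256)
gridded = np.random.uniform(0.0, 60.0, (y.size, x.size))  # stand-in for a regridded sweep

# x/y coordinates of every grid cell, shape (ny, nx, 2)
cartgrid = np.stack(np.meshgrid(x, y), axis=-1)

# polar bin coordinates in the same projection, shape (naz, nr, 2)
polcoords = xyz[..., :2]

# interpolate the cartesian field back onto the polar geometry;
# order=1 amounts to bilinear interpolation of the grid
repolar = wrl.ipol.cart_to_irregular_spline(
    cartgrid, gridded, polcoords, order=1, prefilter=False
)
print(repolar.shape)  # (360, 128) -> one value per azimuth/range bin
```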
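On the second point, a generic pattern could look like the following. This is only a sketch under the assumption that your sweeps live in netcdf files with time/azimuth/range dimensions and a DBZH variable (all placeholder names), and that one dask chunk per time step is reasonable:

```python
import numpy as np
import wradlib as wrl
import xarray as xr

# fixed placeholder geometry and target grid, as in the sketches above
r = np.arange(500.0, 128000.0, 1000.0)
az = np.arange(0.0, 360.0)
xyz, aeqd = wrl.georef.spherical_to_xyz(r, az, 0.5, (7.0, 51.0, 100.0), squeeze=True)
x = np.linspace(xyz[..., 0].min(), xyz[..., 0].max(), 256)
y = np.linspace(xyz[..., 1].min(), xyz[..., 1].max(), 256)
trg = np.stack(np.meshgrid(x, y), axis=-1).reshape(-1, 2)

def grid_one_sweep(sweep):
    """Grid one (azimuth, range) sweep onto the fixed cartesian grid."""
    return wrl.comp.togrid(
        xyz[..., :2].reshape(-1, 2), trg, r.max() + 500.0,
        np.array([0.0, 0.0]), sweep.ravel(), wrl.ipol.Nearest,
    ).reshape(y.size, x.size)

# open the whole timeseries lazily, one dask chunk per time step (placeholder paths)
ds = xr.open_mfdataset("sweeps_*.nc", combine="by_coords", chunks={"time": 1})

# apply the per-sweep gridding lazily over the time dimension
gridded = xr.apply_ufunc(
    grid_one_sweep,
    ds["DBZH"],
    input_core_dims=[["azimuth", "range"]],
    output_core_dims=[["y", "x"]],
    dask_gufunc_kwargs={"output_sizes": {"y": y.size, "x": x.size}},
    vectorize=True,
    dask="parallelized",
    output_dtypes=[float],
)

# the actual computation runs in parallel when writing out
gridded.to_dataset(name="DBZH_gridded").to_netcdf("gridded.nc")
```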
HTH,
Kai
Hi Kai,
You’re actually quite right; I explained the steps in the wrong order. The clutter correction is done before regridding. I still need a solution to Q1 so that I can compare a sort of sectoral statistics, to see which regridding method compares best (e.g. for extreme values of dBZ) to the original data.
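For what it's worth, the kind of sectoral comparison I mean is roughly this (sector width, the statistic and both arrays are placeholders; repolar stands for the regridded field mapped back to polar as in your sketch):

```python
import numpy as np

# placeholders: the original polar sweep and the regridded field mapped
# back onto the same (azimuth, range) geometry, e.g. via cart_to_irregular_spline
data_polar = np.random.uniform(0.0, 60.0, (360, 128))
repolar = data_polar + np.random.normal(0.0, 1.0, (360, 128))

sector_width = 10  # degrees per sector -- placeholder choice
for s in range(0, 360, sector_width):
    orig_max = data_polar[s : s + sector_width].max()
    regr_max = repolar[s : s + sector_width].max()
    print(
        f"sector {s:3d}-{s + sector_width:3d} deg: "
        f"original max {orig_max:5.1f} dBZ, regridded max {regr_max:5.1f} dBZ"
    )
```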