1. Loading data

das_ice.io.dask_Terra15(fname, timezone=datetime.timezone.utc, **kwargs)

Open Terra15 data using xarray with Dask

Parameters:
  • fname – File name to load.

  • timezone (timezone object) – Timezone for datetime conversion (default: UTC).
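
A minimal usage sketch; the file name is hypothetical and any Terra15 file supported by the loader would do.

    import datetime
    import das_ice

    # Hypothetical file name; the data are loaded lazily with Dask.
    da = das_ice.io.dask_Terra15("terra15_record.h5", timezone=datetime.timezone.utc)
    print(da)  # Dask-backed xarray object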

das_ice.io.dask_febus(filenames, timezone=datetime.timezone.utc, concat_dim='time', **kwargs)

Load and combine multiple Febus DAS files using dask_febus_one_file.

Parameters:
  • filenames – List of file paths to process.

  • timezone (timezone object) – Timezone for datetime conversion (default: UTC).

  • concat_dim – Dimension along which to concatenate the data (default: ‘time’).

  • kwargs – Additional arguments to pass to dask_febus_one_file.

Returns:

Combined xarray.DataArray object.
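
A sketch of combining several files; the glob pattern is an assumption standing in for a real directory of Febus files.

    import glob
    import das_ice

    # Hypothetical glob pattern; files are concatenated along 'time' by default.
    files = sorted(glob.glob("febus_data/*.h5"))
    da = das_ice.io.dask_febus(files, concat_dim="time")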

das_ice.io.dask_febus_one_file(fname, timezone=datetime.timezone.utc, **kwargs)

Open and process Febus DAS data using xarray with dask.

Parameters:
  • fname – File name or pattern to load with xarray.

  • timezone (timezone object) – Timezone for datetime conversion (default: UTC).

  • kwargs – Additional arguments for xarray.open_mfdataset.

Returns:

xarray DataArray with time and distance coordinates.
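
A sketch for a single file; the file name and chunk sizes are assumptions, and the chunks keyword is simply forwarded to xarray.open_mfdataset as described above.

    import das_ice

    # Extra keyword arguments (here chunks, with hypothetical sizes) are passed
    # on to xarray.open_mfdataset for Dask chunking.
    da = das_ice.io.dask_febus_one_file("febus_record.h5", chunks={"time": 10_000})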

das_ice.io.read_Terra15(fname, timezone=datetime.timezone.utc)

Open a Terra15 (version 6) file using the xdas loader

Parameters:
  • fname – File name to load.

  • timezone (timezone object) – Timezone for datetime conversion (default: UTC).
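
A minimal sketch with a hypothetical file name.

    import das_ice

    # Eager loading of a single Terra15 v6 file via the xdas loader.
    da = das_ice.io.read_Terra15("terra15_record.h5")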

2. Process DAS Data

das_ice.processes.std_local(da, nt=6, nx=4, ax_t='time', ax_x='distance')

Compute the local standard deviation of a DataArray

Parameters:
  • da (xr.DataArray) – Variable on which to compute the local standard deviation

  • nt (int) – number of elements along the ax_t dimension (default: 6)

  • nx (int) – number of elements along the ax_x dimension (default: 4)

  • ax_t (string) – name of the time axis (default: ‘time’)

  • ax_x (string) – name of the distance axis (default: ‘distance’)
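
A sketch on a synthetic DataArray standing in for loaded DAS data; in practice da would come from the loaders in section 1.

    import numpy as np
    import xarray as xr
    import das_ice

    # Synthetic stand-in for DAS data with the expected dimension names.
    da = xr.DataArray(np.random.randn(1000, 50), dims=("time", "distance"))

    # Local standard deviation over 6 samples in time and 4 channels in distance.
    std = das_ice.processes.std_local(da, nt=6, nx=4)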

das_ice.processes.strain_rate(da, ax_d='distance')

Compute the strain rate from velocity measurements

Parameters:
  • da (xr.DataArray) – Variable to differentiate

  • ax_d (string) – axis along which to differentiate (default: ‘distance’)
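
A sketch differentiating a synthetic velocity record along distance; the channel spacing is an assumption for illustration.

    import numpy as np
    import xarray as xr
    import das_ice

    # Synthetic velocity record; the 4 m channel spacing is hypothetical.
    da = xr.DataArray(
        np.random.randn(1000, 50),
        dims=("time", "distance"),
        coords={"distance": np.arange(50) * 4.0},
    )
    rate = das_ice.processes.strain_rate(da, ax_d="distance")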

3. Plot DAS Data

das_ice.plot.all_trace(da)

Plot all traces using Plotly. The number of traces should not be too high.
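
A sketch that keeps the trace count small before plotting; the synthetic DataArray is a stand-in for real data.

    import numpy as np
    import xarray as xr
    import das_ice

    # Only a handful of channels, so the Plotly figure stays responsive.
    da = xr.DataArray(np.random.randn(500, 5), dims=("time", "distance"))
    das_ice.plot.all_trace(da)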

4. Filter DAS Data

exception das_ice.signal.filter.DecimationWarning

Custom warning for decimation issues.

das_ice.signal.filter.bandpass(da, lowcut, highcut, order=4, axis='time')

Apply a bandpass filter to an xarray.DataArray along a specified axis lazily, compatible with Dask.

Parameters:
  • da (xarray.DataArray) – Input DataArray to be filtered. Can be Dask-backed for lazy evaluation.

  • lowcut (float) – Low cutoff frequency for the bandpass filter.

  • highcut (float) – High cutoff frequency for the bandpass filter.

  • order (int, optional) – Order of the Butterworth filter. Default is 4.

  • axis (str, optional) – The dimension along which to apply the filter. Default is ‘time’.

Returns:

The bandpass-filtered xarray.DataArray, computed lazily if Dask-backed.
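
A sketch on a synthetic, Dask-backed record; the 1 kHz sampling, the datetime time coordinate, and the cutoff frequencies are assumptions for illustration only.

    import numpy as np
    import xarray as xr
    import das_ice

    # Synthetic 1 kHz record with a datetime time coordinate (assumed layout).
    times = np.datetime64("2024-01-01") + np.arange(10_000) * np.timedelta64(1, "ms")
    da = xr.DataArray(
        np.random.randn(10_000, 8),
        dims=("time", "distance"),
        coords={"time": times},
    ).chunk({"distance": 1})

    # 1-50 Hz Butterworth bandpass along time; nothing is computed yet.
    filtered = das_ice.signal.filter.bandpass(da, lowcut=1.0, highcut=50.0, order=4, axis="time")
    result = filtered.compute()  # triggers the lazy Dask computation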

das_ice.signal.filter.decimate(da, q_int, axis='time')

Decimate an xarray.DataArray or Dataset along a specified axis by a given interval.

Parameters:
  • da (xarray.DataArray or xarray.Dataset) – The input data to be decimated.

  • q_int (int) – The decimation interval. Must be a positive integer.

  • axis (str, optional) – The dimension name along which to decimate. Default is ‘time’.

Returns:

The decimated xarray.DataArray or xarray.Dataset.

Raises:

ValueError – If q_int is not a positive integer.
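
A sketch keeping every 10th time sample; the interval and the synthetic data are arbitrary.

    import numpy as np
    import xarray as xr
    import das_ice

    da = xr.DataArray(np.random.randn(10_000, 8), dims=("time", "distance"))

    # Keep one sample in every 10 along the time dimension.
    da_dec = das_ice.signal.filter.decimate(da, q_int=10, axis="time")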

5. Metadata for DAS

das_ice.metadata.optic_length(da, metadata_optic, dist='distance')

Compute optic length from metadata

das_ice.metadata.optic_length_plot(da)

Plot optic length data