
HEC-DSS Rust Library

A pure Rust implementation of the HEC-DSS (Data Storage System) version 7 file format used by the U.S. Army Corps of Engineers for hydrologic and hydraulic data.

What is HEC-DSS?

HEC-DSS is a binary file format for storing time series, paired data, spatial grids, and other hydrologic data. It is used by USACE software including HEC-RAS, HEC-HMS, HEC-ResSim, and HEC-DSSVue.

Why Rust?

The original HEC-DSS library is written in C with Fortran components. This Rust implementation provides:

  • Memory safety - no buffer overflows, null dereferences, or use-after-free
  • Thread safety - each file handle is protected by a mutex
  • Zero dependencies - no C compiler, zlib, or Fortran runtime needed
  • Cross-platform - builds on Windows, Linux, and macOS
  • Small footprint - 200 KB DLL vs 705 KB for the C version
  • Multiple language bindings - Rust, Python (PyO3), C (FFI), Fortran

Features

  • Read and write all DSS7 data types (text, time series, paired data, arrays, location, grids)
  • Multi-block time series with time window filtering
  • Record management (delete, undelete, squeeze, copy)
  • Alias system for record aliasing
  • CRC-based change detection
  • File integrity checking
  • zlib compression for grid data
  • DSS v6 to v7 conversion (via CLI tool)
  • Wildcard catalog filtering
  • 40/40 hecdss.h FFI functions implemented (100% drop-in compatible)

Installation

Rust

Add dss-core to your Cargo.toml:

[dependencies]
dss-core = { git = "https://github.com/hatch-tyler/hec-dss-rs" }

Python

pip install hecdss-rs

From GitHub Release

Download a pre-built wheel from the Releases page and install it:

pip install hecdss_rs-<version>-<platform>.whl

From Source

cd crates/dss-python
pip install maturin
maturin build --release
pip install target/wheels/dss_python-*.whl

Verify

import hecdss_rs

Fortran

Compile the module and link against the Rust DLL:

ifx -c src/hecdss_mod.f90
ifx -c your_program.f90
ifx -o your_program.exe your_program.obj hecdss_mod.obj path/to/dss_ffi.dll.lib

C / .NET

Use the dss_ffi.dll (or libdss_ffi.so on Linux) as a drop-in replacement for hecdss.dll. The function signatures match hecdss.h exactly.

Building from Source

git clone https://github.com/hatch-tyler/hec-dss-rs
cd hec-dss-rs

# Pure Rust (no C dependency)
cargo build -p dss-core -p dss-ffi --release

# Output: target/release/dss_ffi.dll (Windows)
#         target/release/libdss_ffi.so (Linux)

DSS Concepts

Pathnames

Every record in a DSS file is identified by a six-part pathname:

/A-Part/B-Part/C-Part/D-Part/E-Part/F-Part/
| Part | Name | Example | Description |
|------|------|---------|-------------|
| A | Project | SACRAMENTO | Geographic area or project |
| B | Location | FOLSOM DAM | Specific measurement location |
| C | Parameter | FLOW | What is measured (FLOW, STAGE, PRECIP) |
| D | Date | 01JAN2020 | Start date of data block |
| E | Interval | 1HOUR | Time step (1MIN, 1HOUR, 1DAY, IR-MONTH) |
| F | Version | OBS | Data source or version (OBS, SIM, COMPUTED) |

Maximum length: 393 characters.
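As a quick illustration of the six-part structure, a pathname can be split on its slashes. This is a hypothetical helper for illustration only; the library parses pathnames internally:

```python
def split_pathname(pathname: str) -> dict:
    """Split a DSS pathname into its six parts, A through F.

    Illustrative sketch only; the library performs its own parsing.
    """
    if not (pathname.startswith("/") and pathname.endswith("/")):
        raise ValueError("pathname must begin and end with '/'")
    # Drop the outer slashes, then split the six interior parts
    parts = pathname[1:-1].split("/")
    if len(parts) != 6:
        raise ValueError(f"expected 6 parts, got {len(parts)}")
    return dict(zip("ABCDEF", parts))

print(split_pathname("/SACRAMENTO/FOLSOM DAM/FLOW/01JAN2020/1HOUR/OBS/"))
```

Note that empty parts (as in `/A/B/C///F/`) are valid and split to empty strings.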

Record Types

| Code | Name | Description |
|------|------|-------------|
| 100 | RTS | Regular time series (floats) |
| 105 | RTD | Regular time series (doubles) |
| 110 | ITS | Irregular time series (floats) |
| 115 | ITD | Irregular time series (doubles) |
| 200 | PD | Paired data (floats) |
| 205 | PDD | Paired data (doubles) |
| 300 | TXT | Text data |
| 400-431 | Grid | Spatial grids (SHG, HRAP, Albers) |
| 90-93 | Array | Generic arrays (int, float, double) |
| 20 | Location | Coordinate and datum information |

Time Series Blocks

Regular time series are stored in blocks aligned to calendar boundaries:

  • Hourly data: one block per month (744 values for January)
  • Daily data: one block per year (365/366 values)
  • Monthly data: one block per decade (120 values)

The write_ts_multi() method handles block splitting automatically.
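The calendar-aligned splitting for hourly data can be sketched in a few lines. This is an illustrative re-implementation, not the library's actual code:

```python
from datetime import date

def monthly_blocks_for_hourly(start: date, n_values: int) -> list:
    """List the (year, month) blocks an hourly series starting at `start` spans.

    Illustrative sketch of calendar-aligned block splitting; write_ts_multi
    performs the real splitting internally.
    """
    blocks = []
    t = start
    remaining = n_values
    while remaining > 0:
        blocks.append((t.year, t.month))
        # First day of the next month (roll the year over after December)
        nxt = date(t.year + (t.month == 12), t.month % 12 + 1, 1)
        remaining -= (nxt - t).days * 24  # hours remaining in this block
        t = nxt
    return blocks

# One January of hourly data fits in a single block of 744 values
print(monthly_blocks_for_hourly(date(2020, 1, 1), 744))
```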

Missing Values

  • DSS v7: -3.4028235e+38 (float minimum)
  • DSS v6: -901.0

The convert_missing_values() function handles v6→v7 conversion.
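The substitution itself is a simple sentinel swap. The following is a sketch of what such a conversion does, using the sentinel values above; the real `convert_missing_values()` lives in the library:

```python
import numpy as np

V6_MISSING = -901.0
V7_MISSING = -3.4028235e+38  # most negative finite f32, the v7 sentinel

def convert_missing_v6_to_v7(values):
    """Map DSS v6 missing-value sentinels to the v7 sentinel.

    Illustrative sketch only; the library's convert_missing_values()
    performs the real conversion.
    """
    arr = np.asarray(values, dtype=np.float64).copy()
    arr[arr == V6_MISSING] = V7_MISSING
    return arr
```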

File Versions

  • DSS v7 (current): 64-bit addressing, exabyte-scale file sizes, reference implementation in C
  • DSS v6 (legacy): 32-bit addressing, 4 GB file limit, reference implementation in Fortran

Use dss-convert to convert v6 files to v7.

Text Records

Text records store arbitrary string data in a DSS file.

Writing Text

#![allow(unused)]
fn main() {
use dss_core::NativeDssFile;

let mut dss = NativeDssFile::create("notes.dss")?;
dss.write_text("/PROJECT/SITE/NOTE///COMMENT/", "Dam inspection completed 01Jan2020")?;
dss.write_text("/PROJECT/SITE/NOTE///DESCRIPTION/", "Multi-line text\nLine 2\nLine 3")?;
}

Python:

with hecdss_rs.DssFile.create("notes.dss") as dss:
    dss.write_text("/PROJECT/SITE/NOTE///COMMENT/", "Dam inspection completed")

Reading Text

#![allow(unused)]
fn main() {
if let Some(text) = dss.read_text("/PROJECT/SITE/NOTE///COMMENT/")? {
    println!("Note: {text}");
} else {
    println!("Record not found");
}
}

Python:

text = dss.read_text("/PROJECT/SITE/NOTE///COMMENT/")
if text is not None:
    print(f"Note: {text}")

Notes

  • Returns None (Rust Option) / None (Python) if the pathname doesn’t exist
  • Text records use DSS type code 300
  • Maximum text size is limited by available disk space
  • Pathname must follow the /A/B/C/D/E/F/ format

Time Series

Time series are the most common DSS data type. They store sequences of values at regular or irregular time intervals.

Regular Time Series

Writing

#![allow(unused)]
fn main() {
use dss_core::NativeDssFile;

let mut dss = NativeDssFile::create("flow.dss")?;
let values = vec![100.0, 200.0, 300.0, 400.0, 500.0];
dss.write_ts(
    "/BASIN/GAGE1/FLOW/01JAN2020/1HOUR/OBS/",
    &values,
    "CFS",        // units
    "INST-VAL",   // data type (INST-VAL, PER-AVER, PER-CUM)
)?;
}

Python:

import numpy as np
dss.write_ts("/BASIN/GAGE1/FLOW/01JAN2020/1HOUR/OBS/",
             np.array([100.0, 200.0, 300.0, 400.0, 500.0]),
             "CFS", "INST-VAL")

Reading

#![allow(unused)]
fn main() {
if let Some(ts) = dss.read_ts("/BASIN/GAGE1/FLOW/01JAN2020/1HOUR/OBS/")? {
    println!("Values: {:?}", &ts.values[..5]);
    println!("Units: {}", ts.units);
    println!("Type: {}", ts.data_type_str);
    println!("Record type: {}", ts.record_type);  // 105 = RTD (doubles)
}
}

Python:

values = dss.read_ts("/BASIN/GAGE1/FLOW/01JAN2020/1HOUR/OBS/")
# values is a numpy array
print(f"First 5: {values[:5]}")

Multi-Block Time Series

For data spanning multiple months, use write_ts_multi which automatically splits into monthly blocks:

#![allow(unused)]
fn main() {
let year_of_hourly = vec![0.0f64; 8760]; // 365 days * 24 hours
dss.write_ts_multi(
    "/BASIN/GAGE1/FLOW/01JAN2020/1HOUR/SIM/",
    &year_of_hourly,
    "01JAN2020",  // start date
    3600,         // interval in seconds
    "CFS",
    "INST-VAL",
)?;
}

Python:

dss.write_ts_multi("/BASIN/GAGE1/FLOW/01JAN2020/1HOUR/SIM/",
                   np.zeros(8760), "01JAN2020", 3600, "CFS", "INST-VAL")

Time Window Filtering

Read data across multiple blocks within a specific time window:

#![allow(unused)]
fn main() {
if let Some(ts) = dss.read_ts_window(
    "/BASIN/GAGE1/FLOW/01JAN2020/1HOUR/SIM/",
    "15JAN2020",   // start date
    "15FEB2020",   // end date
)? {
    println!("Got {} values in window", ts.values.len());
}
}

Python:

values = dss.read_ts_window("/BASIN/GAGE1/FLOW/01JAN2020/1HOUR/SIM/",
                            "15JAN2020", "15FEB2020")

Irregular Time Series

For data at non-uniform intervals:

#![allow(unused)]
fn main() {
let times = vec![60, 120, 300, 600];  // offsets in seconds from base date
let values = vec![10.0, 20.0, 15.0, 25.0];
dss.write_ts_irregular(
    "/BASIN/GAGE1/STAGE//IR-MONTH/OBS/",
    &times,
    &values,
    60,           // time_granularity_seconds (60 = minutes)
    "FT",
    "INST-VAL",
)?;
}

Python:

dss.write_ts_irregular("/BASIN/GAGE1/STAGE//IR-MONTH/OBS/",
                       np.array([60, 120, 300, 600], dtype=np.int32),
                       np.array([10.0, 20.0, 15.0, 25.0]),
                       60, "FT", "INST-VAL")

Query Without Reading Data

#![allow(unused)]
fn main() {
// Get sizes for pre-allocation
let (num_values, quality_size) = dss.ts_get_sizes("/BASIN/GAGE1/FLOW/01JAN2020/1HOUR/OBS/")?;

// Get units and type without reading values
if let Some((units, dtype)) = dss.ts_retrieve_info("/BASIN/GAGE1/FLOW/01JAN2020/1HOUR/OBS/")? {
    println!("Units: {units}, Type: {dtype}");
}

// Get date range
if let Some((first_j, _, last_j, _)) = dss.ts_get_date_time_range("/BASIN/GAGE1/FLOW/01JAN2020/1HOUR/OBS/")? {
    println!("From Julian {first_j} to {last_j}");
}
}

Paired Data

Paired data stores X-Y relationships such as frequency-flow curves, stage-discharge ratings, and elevation-area-volume tables.

Writing

#![allow(unused)]
fn main() {
// Single curve
dss.write_pd(
    "/BASIN/DAM/FREQ-FLOW///ANALYTICAL/",
    &[1.0, 5.0, 10.0, 50.0, 100.0],       // ordinates (X)
    &[500.0, 1000.0, 2000.0, 5000.0, 10000.0], // values (Y)
    1,           // number of curves
    "PERCENT",   // independent variable units
    "CFS",       // dependent variable units
    None,        // optional curve labels
)?;

// Multiple curves with labels
let ordinates = &[1.0, 10.0, 100.0];
let values = &[
    100.0, 200.0, 300.0,   // curve 1 (3 ordinates)
    150.0, 250.0, 350.0,   // curve 2 (3 ordinates)
];
dss.write_pd(
    "/BASIN/DAM/FREQ-FLOW///MULTI/",
    ordinates, values, 2,
    "PERCENT", "CFS",
    Some(&["Low Estimate", "High Estimate"]),
)?;
}

Python:

dss.write_pd("/BASIN/DAM/FREQ-FLOW///ANALYTICAL/",
             np.array([1.0, 5.0, 10.0, 50.0, 100.0]),
             np.array([500.0, 1000.0, 2000.0, 5000.0, 10000.0]),
             1, "PERCENT", "CFS")

Reading

#![allow(unused)]
fn main() {
if let Some(pd) = dss.read_pd("/BASIN/DAM/FREQ-FLOW///ANALYTICAL/")? {
    println!("Ordinates: {:?}", pd.ordinates);
    println!("Values: {:?}", pd.values);
    println!("{} ordinates, {} curves", pd.number_ordinates, pd.number_curves);
    println!("Units: {} vs {}", pd.units_independent, pd.units_dependent);
}
}

Python:

result = dss.read_pd("/BASIN/DAM/FREQ-FLOW///ANALYTICAL/")
if result is not None:
    ordinates, values = result
    print(f"Ordinates: {ordinates}")
    print(f"Values: {values}")

Query Without Data

#![allow(unused)]
fn main() {
if let Some((n_ord, n_curves, units_i, units_d)) =
    dss.pd_retrieve_info("/BASIN/DAM/FREQ-FLOW///ANALYTICAL/")?
{
    println!("{n_ord} ordinates, {n_curves} curves");
    println!("Units: {units_i} vs {units_d}");
}
}

Array Records

Array records store generic integer, float, and/or double arrays. A single record can contain any combination of the three types.

Writing

#![allow(unused)]
fn main() {
// Integer array
dss.write_array("/PROJECT/DATA/INDICES///V1/", &[1, 2, 3, 4, 5], &[], &[])?;

// Double array
dss.write_array("/PROJECT/DATA/VALUES///V1/", &[], &[], &[1.1, 2.2, 3.3])?;

// Mixed: integers + doubles
dss.write_array("/PROJECT/DATA/MIXED///V1/",
    &[10, 20, 30],           // integers
    &[],                      // floats (empty)
    &[1.5, 2.5, 3.5],        // doubles
)?;
}

Python:

dss.write_array("/PROJECT/DATA/VALUES///V1/", double_values=[1.1, 2.2, 3.3])
dss.write_array("/PROJECT/DATA/MIXED///V1/",
                int_values=[10, 20, 30], double_values=[1.5, 2.5, 3.5])

Reading

#![allow(unused)]
fn main() {
if let Some(arr) = dss.read_array("/PROJECT/DATA/MIXED///V1/")? {
    println!("Ints: {:?}", arr.int_values);
    println!("Floats: {:?}", arr.float_values);
    println!("Doubles: {:?}", arr.double_values);
}
}

Python:

result = dss.read_array("/PROJECT/DATA/MIXED///V1/")
if result is not None:
    print(f"Ints: {result['int_values']}")
    print(f"Doubles: {result['double_values']}")

Location Data

Location records store geographic coordinates, datum information, and supplemental metadata associated with a DSS record.

Writing

#![allow(unused)]
fn main() {
use dss_core::{NativeDssFile, LocationRecord};

let loc = LocationRecord {
    x: -121.5,               // longitude or easting
    y: 38.5,                 // latitude or northing
    z: 100.0,                // elevation
    coordinate_system: 2,     // 2 = lat/lon
    horizontal_datum: 1,      // 1 = NAD83
    vertical_datum: 2,        // 2 = NAVD88
    timezone: "PST".to_string(),
    supplemental: "USGS gage 11446500".to_string(),
    ..Default::default()
};
dss.write_location("/BASIN/GAGE1/LOC///INFO/", &loc)?;
}

Python:

dss.write_location("/BASIN/GAGE1/LOC///INFO/",
                   x=-121.5, y=38.5, z=100.0,
                   coordinate_system=2, horizontal_datum=1,
                   timezone="PST", supplemental="USGS gage 11446500")

Reading

#![allow(unused)]
fn main() {
if let Some(loc) = dss.read_location("/BASIN/GAGE1/LOC///INFO/")? {
    println!("({}, {}) elev={}", loc.x, loc.y, loc.z);
    println!("Datum: H={}, V={}", loc.horizontal_datum, loc.vertical_datum);
    println!("Info: {}", loc.supplemental);
}
}

Python:

loc = dss.read_location("/BASIN/GAGE1/LOC///INFO/")
if loc is not None:
    print(f"({loc['x']}, {loc['y']}) elev={loc['z']}")

Grid / Spatial Data

Grid records store 2D spatial data such as precipitation, temperature, or elevation grids. Data is compressed with zlib for efficient storage.

Writing

#![allow(unused)]
fn main() {
// Create a 100x50 grid
let nx = 100;
let ny = 50;
let data: Vec<f32> = (0..nx*ny).map(|i| (i as f32) * 0.1).collect();

dss.write_grid(
    "/SHG/BASIN/PRECIP/01JAN2020:0600/01JAN2020:1200/NEXRAD/",
    430,          // grid type (430 = SHG)
    nx as i32,
    ny as i32,
    &data,
    "MM",         // data units
    2000.0,       // cell size
)?;
}

Python:

import numpy as np
data = np.arange(5000, dtype=np.float32).reshape(50, 100) * 0.1
dss.write_grid("/SHG/BASIN/PRECIP/01JAN2020:0600/01JAN2020:1200/NEXRAD/",
               430, 100, 50, data.flatten().tolist(), "MM", 2000.0)

Reading

#![allow(unused)]
fn main() {
if let Some(grid) = dss.read_grid("/SHG/BASIN/PRECIP/01JAN2020:0600/01JAN2020:1200/NEXRAD/")? {
    println!("{}x{} grid, cell_size={}", grid.nx, grid.ny, grid.cell_size);
    println!("Units: {}", grid.data_units);
    println!("Data points: {}", grid.data.len());
}
}

Python:

result = dss.read_grid("/SHG/BASIN/PRECIP/01JAN2020:0600/01JAN2020:1200/NEXRAD/")
if result is not None:
    print(f"{result['nx']}x{result['ny']} grid")
    data = np.array(result['data']).reshape(result['ny'], result['nx'])

Grid Types

| Code | Name | Description |
|------|------|-------------|
| 400 | Undefined (time) | Undefined projection with time stamp |
| 410 | HRAP (time) | Hydrologic Rainfall Analysis Project |
| 420 | Albers (time) | Albers Equal Area Conic |
| 430 | SHG (time) | Standard Hydrologic Grid |

Compression

Grid data is automatically compressed with zlib on write and decompressed on read. No configuration needed.
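The effect can be illustrated with a plain zlib round-trip over float32 grid data; the actual on-disk record layout is managed by the library:

```python
import zlib
import numpy as np

# A 100x50 grid of float32 values, as in the write example above
grid = (np.arange(5000, dtype=np.float32) * 0.1).reshape(50, 100)

# Compress the raw bytes, then decompress and rebuild the array
compressed = zlib.compress(grid.tobytes())
restored = np.frombuffer(zlib.decompress(compressed), dtype=np.float32).reshape(50, 100)

assert np.array_equal(restored, grid)  # lossless round-trip
```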

File Management

Delete and Restore Records

#![allow(unused)]
fn main() {
// Delete a record (marks as deleted, space not reclaimed)
dss.delete("/OLD/PATH/DATA///REMOVE/")?;

// Restore a deleted record
dss.undelete("/OLD/PATH/DATA///REMOVE/")?;
}

Python:

dss.delete("/OLD/PATH/DATA///REMOVE/")
dss.undelete("/OLD/PATH/DATA///REMOVE/")

Squeeze (Compact)

Reclaim space from deleted records by copying all live records to a new file:

#![allow(unused)]
fn main() {
dss.squeeze()?;
}

Python:

dss.squeeze()

Copy Records Between Files

#![allow(unused)]
fn main() {
let mut src = NativeDssFile::open("source.dss")?;
let mut dst = NativeDssFile::create("destination.dss")?;

// Copy a single record
src.copy_record("/BASIN/LOC/FLOW/01JAN2020/1HOUR/OBS/", &mut dst)?;

// Copy all records
let count = src.copy_file(&mut dst)?;
println!("Copied {count} records");
}

Python:

src = hecdss_rs.DssFile.open("source.dss")
dst = hecdss_rs.DssFile.create("destination.dss")
src.copy_record("/BASIN/LOC/FLOW/01JAN2020/1HOUR/OBS/", dst)
count = src.copy_file(dst)

File Integrity Check

Validate the DSS file structure:

#![allow(unused)]
fn main() {
let issues = dss.check_file()?;
for issue in &issues {
    println!("{issue}");
}
// Last entry is "File integrity OK" if no problems found
}

Python:

issues = dss.check_file()
print(issues[-1])  # "File integrity OK" or description of problem

Catalog with Wildcard Filtering

#![allow(unused)]
fn main() {
// All records
let all = dss.catalog()?;

// Filter by parameter
let flow_records = dss.catalog_filtered(Some("/*/*/FLOW///*/"))?;

// Filter by location and parameter
let specific = dss.catalog_filtered(Some("/BASIN/GAGE1/FLOW///*/"))?;
}

Python:

all_records = dss.catalog()
flow_records = dss.catalog(filter="/*/*/FLOW///*/")

Wildcard * matches any string in a pathname part. Empty filter parts match empty pathname parts.
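These semantics can be approximated part-by-part with the standard library's fnmatch. This is a hypothetical re-implementation for illustration; `catalog_filtered` performs the real matching:

```python
from fnmatch import fnmatchcase

def pathname_matches(pathname: str, pattern: str) -> bool:
    """Match a six-part pathname against a six-part wildcard filter.

    Illustrative sketch: '*' matches any string within a part, and an
    empty filter part matches only an empty pathname part.
    """
    # Both strings are assumed to start and end with '/'
    p_parts = pathname[1:-1].split("/")
    f_parts = pattern[1:-1].split("/")
    if len(p_parts) != len(f_parts):
        return False
    return all(fnmatchcase(p, f) for p, f in zip(p_parts, f_parts))
```

With the filter `/*/*/FLOW///*/`, records with an empty D-part and E-part match, while records with dates in the D-part do not.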

Query Record Type

#![allow(unused)]
fn main() {
let rtype = dss.record_type("/BASIN/LOC/FLOW/01JAN2020/1HOUR/OBS/")?;
match rtype {
    100..=119 => println!("Time series"),
    200..=209 => println!("Paired data"),
    300 => println!("Text"),
    0 => println!("Not found"),
    _ => println!("Type {rtype}"),
}
}

Aliases

Aliases allow multiple pathnames to reference the same underlying record data. This is useful when a record needs to be accessible under different naming conventions.

Adding an Alias

#![allow(unused)]
fn main() {
// First, write a primary record
dss.write_ts("/BASIN/GAGE1/FLOW/01JAN2020/1HOUR/OBS/",
    &[100.0, 200.0, 300.0], "CFS", "INST-VAL")?;

// Create an alias that points to the same data
dss.alias_add(
    "/BASIN/GAGE1/FLOW/01JAN2020/1HOUR/OBS/",     // primary pathname
    "/ALTERNATE/NAME/FLOW/01JAN2020/1HOUR/ALIAS/",  // alias pathname
)?;
}

Python:

dss.alias_add("/BASIN/GAGE1/FLOW/01JAN2020/1HOUR/OBS/",
              "/ALTERNATE/NAME/FLOW/01JAN2020/1HOUR/ALIAS/")

Both pathnames now appear in the catalog and reference the same data.

Listing Aliases

#![allow(unused)]
fn main() {
let aliases = dss.alias_list()?;
for (alias_path, info_addr) in &aliases {
    println!("Alias: {alias_path} -> info at {info_addr}");
}
}

Removing an Alias

#![allow(unused)]
fn main() {
dss.alias_remove("/ALTERNATE/NAME/FLOW/01JAN2020/1HOUR/ALIAS/")?;
}

Removing an alias does not delete the primary record.

Change Tracking

CRC-based change detection enables tracking which records have been modified, added, or removed between two points in time.

Computing CRC for a Single Record

#![allow(unused)]
fn main() {
let crc = dss.get_data_crc("/BASIN/LOC/FLOW/01JAN2020/1HOUR/OBS/")?;
println!("CRC32: {crc}");
}

Snapshot All Records

Take a snapshot of CRC values for all records:

#![allow(unused)]
fn main() {
let snapshot = dss.snapshot_crcs()?;
for (pathname, crc) in &snapshot {
    println!("{pathname}: {crc}");
}
}

Detecting Changes

Compare two snapshots to find what changed:

#![allow(unused)]
fn main() {
let before = dss.snapshot_crcs()?;

// ... modify records ...
dss.write_text("/A/B/NOTE///NEW/", "added later")?;

let after = dss.snapshot_crcs()?;

let (changed, added, removed) = NativeDssFile::what_changed(&before, &after);

println!("Changed: {changed:?}");
println!("Added: {added:?}");
println!("Removed: {removed:?}");
}

Python:

before = dss.snapshot_crcs()

dss.write_text("/A/B/NOTE///NEW/", "added later")

after = dss.snapshot_crcs()
changed, added, removed = hecdss_rs.DssFile.what_changed(before, after)
print(f"Added: {added}")

Use Cases

  • Data synchronization: Detect which records changed since last sync
  • Audit trail: Track modifications to a DSS file over time
  • Incremental backup: Only copy changed records to backup
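The snapshot comparison itself is simple set logic over (pathname, crc) pairs. A pure-Python sketch mirroring the semantics of `what_changed`:

```python
def what_changed(before, after):
    """Compare two (pathname, crc) snapshots.

    Returns (changed, added, removed) pathname lists. Illustrative
    re-implementation of the library's what_changed, for exposition only.
    """
    b, a = dict(before), dict(after)
    changed = sorted(p for p in b if p in a and b[p] != a[p])
    added = sorted(p for p in a if p not in b)
    removed = sorted(p for p in b if p not in a)
    return changed, added, removed
```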

v6 to v7 Conversion

Using the CLI Tool

dss-convert input_v6.dss output_v7.dss

The tool:

  1. Detects the input file version (v6 or v7)
  2. Reads all records from the source
  3. Writes each record to a new v7 file using pure Rust
  4. Reports conversion statistics
Example output:

Input: DSS version 6 file
Reading from: input_v6.dss
Writing to:   output_v7.dss
Records in source: 3

Conversion complete:
  Copied:  3
  Skipped: 0
  Output:  output_v7.dss
  Output records: 3

File Version Detection

Rust

#![allow(unused)]
fn main() {
use dss_core::format::v6;

let mut file = std::fs::File::open("unknown.dss")?;
let version = v6::detect_version(&mut file)?;
match version {
    6 => println!("DSS version 6"),
    7 => println!("DSS version 7"),
    0 => println!("Not a DSS file"),
    _ => println!("Unknown version"),
}
}

Python

# The DssFile.open() method handles both v6 and v7 automatically
# For explicit version checking, examine the file header

v6 vs v7 Differences

| Feature | DSS v6 | DSS v7 |
|---------|--------|--------|
| Addressing | 32-bit (4GB limit) | 64-bit (unlimited) |
| Missing value | -901.0 | -3.4028235e+38 |
| Bin format | Fortran COMMON blocks | C structs |
| Info flag | None | -97534 |
| Compression | Limited | zlib |
| Max pathname | ~128 chars | 393 chars |

v7-to-v7 Compaction

dss-convert also works as a compactor for v7 files, equivalent to the squeeze operation:

dss-convert bloated_v7.dss clean_v7.dss

This copies all live records to a new file, skipping deleted records and reclaiming wasted space.

Limitations

  • v6 reading requires the C library (with Fortran) as a bridge. The pure Rust v6 reader can detect and parse headers but full catalog scanning requires the legacy Fortran code paths.
  • Some v6 record types (images, binary files) may not convert.
  • v6 files with non-standard extensions may not be detected.

Rust API Reference

NativeDssFile

The main entry point for all DSS operations. Pure Rust, zero C dependencies.

File Operations

| Method | Signature | Description |
|--------|-----------|-------------|
| `create` | `fn create(path: &str) -> io::Result<Self>` | Create a new empty DSS7 file |
| `open` | `fn open(path: &str) -> io::Result<Self>` | Open an existing DSS7 file |
| `record_count` | `fn record_count(&self) -> i64` | Number of records (including aliases) |
| `record_type` | `fn record_type(&mut self, pathname: &str) -> io::Result<i32>` | Get data type code (0 if not found) |
| `catalog` | `fn catalog(&mut self) -> io::Result<Vec<CatalogEntry>>` | List all records |
| `catalog_filtered` | `fn catalog_filtered(&mut self, filter: Option<&str>) -> io::Result<Vec<CatalogEntry>>` | List with wildcard filter |

Text Records

| Method | Signature | Description |
|--------|-----------|-------------|
| `read_text` | `fn read_text(&mut self, pathname: &str) -> io::Result<Option<String>>` | Read text (None if not found) |
| `write_text` | `fn write_text(&mut self, pathname: &str, text: &str) -> io::Result<()>` | Write text record |

Time Series

| Method | Signature | Description |
|--------|-----------|-------------|
| `read_ts` | `fn read_ts(&mut self, pathname: &str) -> io::Result<Option<TimeSeriesRecord>>` | Read single-block TS |
| `write_ts` | `fn write_ts(&mut self, pathname: &str, values: &[f64], units: &str, data_type: &str) -> io::Result<()>` | Write regular TS (doubles) |
| `write_ts_irregular` | `fn write_ts_irregular(&mut self, pathname: &str, times: &[i32], values: &[f64], granularity: i32, units: &str, data_type: &str) -> io::Result<()>` | Write irregular TS |
| `read_ts_window` | `fn read_ts_window(&mut self, pathname: &str, start_date: &str, end_date: &str) -> io::Result<Option<TimeSeriesRecord>>` | Read across blocks with date filter |
| `write_ts_multi` | `fn write_ts_multi(&mut self, pathname: &str, values: &[f64], start_date: &str, interval_sec: i32, units: &str, data_type: &str) -> io::Result<()>` | Write multi-block TS |
| `ts_get_sizes` | `fn ts_get_sizes(&mut self, pathname: &str) -> io::Result<(i32, i32)>` | Get (num_values, quality_size) |
| `ts_retrieve_info` | `fn ts_retrieve_info(&mut self, pathname: &str) -> io::Result<Option<(String, String)>>` | Get (units, type) without data |
| `ts_get_date_time_range` | `fn ts_get_date_time_range(&mut self, pathname: &str) -> io::Result<Option<(i32,i32,i32,i32)>>` | Get first/last Julian dates |

Paired Data

| Method | Signature | Description |
|--------|-----------|-------------|
| `read_pd` | `fn read_pd(&mut self, pathname: &str) -> io::Result<Option<PairedDataRecord>>` | Read paired data |
| `write_pd` | `fn write_pd(&mut self, ..., n_curves: usize, ...) -> io::Result<()>` | Write paired data (doubles) |
| `pd_retrieve_info` | `fn pd_retrieve_info(&mut self, pathname: &str) -> io::Result<Option<(i32,i32,String,String)>>` | Get (n_ord, n_curves, units_i, units_d) |

Array

| Method | Signature | Description |
|--------|-----------|-------------|
| `read_array` | `fn read_array(&mut self, pathname: &str) -> io::Result<Option<ArrayRecord>>` | Read int/float/double arrays |
| `write_array` | `fn write_array(&mut self, pathname: &str, ints: &[i32], floats: &[f32], doubles: &[f64]) -> io::Result<()>` | Write array record |

Location

| Method | Signature | Description |
|--------|-----------|-------------|
| `read_location` | `fn read_location(&mut self, pathname: &str) -> io::Result<Option<LocationRecord>>` | Read coordinates |
| `write_location` | `fn write_location(&mut self, pathname: &str, loc: &LocationRecord) -> io::Result<()>` | Write coordinates |

Grid / Spatial

| Method | Signature | Description |
|--------|-----------|-------------|
| `read_grid` | `fn read_grid(&mut self, pathname: &str) -> io::Result<Option<GridRecord>>` | Read grid (auto-decompresses) |
| `write_grid` | `fn write_grid(&mut self, ..., nx: i32, ny: i32, data: &[f32], ...) -> io::Result<()>` | Write grid (auto-compresses) |

Record Management

| Method | Signature | Description |
|--------|-----------|-------------|
| `delete` | `fn delete(&mut self, pathname: &str) -> io::Result<()>` | Mark record as deleted |
| `undelete` | `fn undelete(&mut self, pathname: &str) -> io::Result<()>` | Restore deleted record |
| `squeeze` | `fn squeeze(&mut self) -> io::Result<()>` | Compact file (reclaim space) |
| `copy_record` | `fn copy_record(&mut self, pathname: &str, dest: &mut NativeDssFile) -> io::Result<bool>` | Copy one record to another file |
| `copy_file` | `fn copy_file(&mut self, dest: &mut NativeDssFile) -> io::Result<usize>` | Copy all records |
| `check_file` | `fn check_file(&mut self) -> io::Result<Vec<String>>` | Validate file integrity |

Aliases

| Method | Signature | Description |
|--------|-----------|-------------|
| `alias_add` | `fn alias_add(&mut self, primary: &str, alias: &str) -> io::Result<()>` | Create alias to primary |
| `alias_remove` | `fn alias_remove(&mut self, alias: &str) -> io::Result<()>` | Remove alias |
| `alias_list` | `fn alias_list(&mut self) -> io::Result<Vec<(String, i64)>>` | List all aliases |

CRC / Change Tracking

| Method | Signature | Description |
|--------|-----------|-------------|
| `get_data_crc` | `fn get_data_crc(&mut self, pathname: &str) -> io::Result<u32>` | CRC32 of record data |
| `snapshot_crcs` | `fn snapshot_crcs(&mut self) -> io::Result<Vec<(String, u32)>>` | Snapshot all CRCs |
| `what_changed` | `fn what_changed(before: &[(String,u32)], after: &[(String,u32)]) -> (Vec<String>, Vec<String>, Vec<String>)` | Compare snapshots (static) |

Data Types

TimeSeriesRecord

#![allow(unused)]
fn main() {
pub struct TimeSeriesRecord {
    pub pathname: String,
    pub values: Vec<f64>,
    pub times: Vec<i32>,
    pub quality: Option<Vec<i32>>,
    pub units: String,
    pub data_type_str: String,
    pub record_type: i32,
    pub time_granularity: i32,
    pub precision: i32,
    pub block_start: i32,
    pub block_end: i32,
    pub number_values: usize,
}
}

PairedDataRecord

#![allow(unused)]
fn main() {
pub struct PairedDataRecord {
    pub pathname: String,
    pub ordinates: Vec<f64>,
    pub values: Vec<f64>,
    pub number_ordinates: usize,
    pub number_curves: usize,
    pub units_independent: String,
    pub units_dependent: String,
    pub labels: Vec<String>,
    pub record_type: i32,
}
}

ArrayRecord, LocationRecord, GridRecord

See source code for complete field listings.

Error Handling

All methods return io::Result<T>. Errors include:

  • File I/O errors (not found, permission denied)
  • Invalid pathname format
  • Empty data arrays
  • Corrupt file structure
  • Record not found (for delete/undelete)

Python API Reference

Module: hecdss_rs

Install: pip install target/wheels/dss_python-*.whl

DssFile Class

File Operations

# Create / Open
dss = hecdss_rs.DssFile.create("new.dss")
dss = hecdss_rs.DssFile.open("existing.dss")

# Context manager (auto-closes)
with hecdss_rs.DssFile.create("example.dss") as dss:
    ...

dss.close()                      # Explicit close (safe to call multiple times)
dss.record_count()               # -> int
dss.record_type(pathname)        # -> int (0 if not found)
dss.catalog()                    # -> list of (pathname, record_type) tuples
dss.catalog(filter="/*/*/FLOW///*/")  # With wildcard

Text Records

dss.write_text(pathname, text)   # text: str
dss.read_text(pathname)          # -> str or None

Time Series

# Regular TS
dss.write_ts(pathname, values, units, data_type)  # values: numpy array
dss.read_ts(pathname)            # -> numpy array or None

# Irregular TS
dss.write_ts_irregular(pathname, times, values, granularity, units, data_type)
# times: numpy int32 array, values: numpy float64 array

# Multi-block
dss.write_ts_multi(pathname, values, start_date, interval_seconds, units, data_type)
dss.read_ts_window(pathname, start_date, end_date)  # -> numpy array or None

# Info queries
dss.ts_get_sizes(pathname)                # -> (num_values, quality_size)
dss.ts_retrieve_info(pathname)            # -> (units, type) or None
dss.ts_get_date_time_range(pathname)      # -> (fj, fs, lj, ls) or None

Paired Data

dss.write_pd(pathname, ordinates, values, n_curves, units_indep, units_dep)
# ordinates, values: numpy float64 arrays

dss.read_pd(pathname)            # -> (ordinates, values) numpy arrays or None
dss.pd_retrieve_info(pathname)   # -> (n_ord, n_curves, units_i, units_d) or None

Array Records

dss.write_array(pathname, int_values=[], float_values=[], double_values=[])
dss.read_array(pathname)         # -> dict with 'int_values', 'float_values', 'double_values'

Location

dss.write_location(pathname, x=0.0, y=0.0, z=0.0,
                   coordinate_system=0, horizontal_datum=0, vertical_datum=0,
                   timezone="", supplemental="")
dss.read_location(pathname)      # -> dict or None

Grid / Spatial

dss.write_grid(pathname, grid_type, nx, ny, data, data_units="", cell_size=0.0)
# data: list of float (flat, ny*nx)
dss.read_grid(pathname)          # -> dict with nx, ny, data, etc. or None

Record Management

dss.delete(pathname)
dss.undelete(pathname)
dss.squeeze()
dss.copy_record(pathname, dest_dss)   # dest_dss: another DssFile
dss.copy_file(dest_dss)               # -> int (count copied)
dss.check_file()                      # -> list of str

Aliases

dss.alias_add(primary_pathname, alias_pathname)
dss.alias_remove(alias_pathname)
dss.alias_list()                 # -> list of (pathname, info_address)

CRC / Change Tracking

dss.get_data_crc(pathname)       # -> int (CRC32)
dss.snapshot_crcs()              # -> list of (pathname, crc)

# Static method:
changed, added, removed = hecdss_rs.DssFile.what_changed(before, after)

Date Utilities (Static Methods)

hecdss_rs.DssFile.date_to_julian("15MAR2020")     # -> int
hecdss_rs.DssFile.julian_to_ymd(43905)             # -> (2020, 3, 15)
hecdss_rs.DssFile.parse_date("2020-03-15")         # -> (2020, 3, 15) or None
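The DDMMMYYYY style accepted by these utilities can be parsed with a few lines of standard-library Python. This is an illustrative sketch, not the library's parser:

```python
import re
from datetime import date

_MONTHS = {m: i + 1 for i, m in enumerate(
    ["JAN", "FEB", "MAR", "APR", "MAY", "JUN",
     "JUL", "AUG", "SEP", "OCT", "NOV", "DEC"])}

def parse_hec_date(text: str):
    """Parse a HEC-style date like '15MAR2020' into a datetime.date, or None.

    Illustrative only; the library's date_to_julian/parse_date do the real work.
    """
    m = re.fullmatch(r"(\d{1,2})([A-Za-z]{3})(\d{4})", text.strip())
    if not m:
        return None
    month = _MONTHS.get(m.group(2).upper())
    if month is None:
        return None
    return date(int(m.group(3)), month, int(m.group(1)))
```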

NumPy Integration

Time series values are passed as numpy.ndarray (float64). Arrays are returned as numpy arrays.

import numpy as np

# Write
values = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
dss.write_ts("/A/B/FLOW/01JAN2020/1HOUR/SIM/", values, "CFS", "INST-VAL")

# Read
result = dss.read_ts("/A/B/FLOW/01JAN2020/1HOUR/SIM/")
# result is a numpy array
print(result.mean(), result.max())

C FFI API Reference

The dss-ffi crate produces a shared library (dss_ffi.dll / libdss_ffi.so) that is a drop-in replacement for the C hecdss library. All functions match the hecdss.h signatures.

Thread Safety

Each dss_file* handle is internally protected by a mutex. Multiple threads can share a handle safely.

Return Values

  • 0 = success
  • -1 = error (null pointer, file not found, write failure)
  • Positive = context-dependent (catalog returns count)

Functions (40 total)

File Management

int hec_dss_open(const char* filename, dss_file** dss);
int hec_dss_close(dss_file* dss);
int hec_dss_getVersion(dss_file* dss);              // Returns 7
int hec_dss_getFileVersion(const char* filename);    // 7, 6, 0, -1
const char* hec_dss_api_version();                   // "0.3.0-rust"
int hec_dss_CONSTANT_MAX_PATH_SIZE();                // 394
int hec_dss_set_value(const char* name, int value);  // Stub
int hec_dss_set_string(const char* name, const char* value); // Stub

Catalog

int hec_dss_record_count(dss_file* dss);
int hec_dss_catalog(dss_file* dss, char* pathBuffer, int* recordTypes,
    const char* pathFilter, int count, int pathBufferItemSize);
int hec_dss_dataType(dss_file* dss, const char* pathname);
int hec_dss_recordType(dss_file* dss, const char* pathname);

Time Series

int hec_dss_tsStoreRegular(dss_file* dss, const char* pathname,
    const char* startDate, const char* startTime,
    double* valueArray, int valueArraySize,
    int* qualityArray, int qualityArraySize,
    int saveAsFloat, const char* units, const char* type,
    const char* timeZoneName, int storageFlag);

int hec_dss_tsStoreIregular(dss_file* dss, const char* pathname,
    const char* startDateBase, int* times, int timeGranularitySeconds,
    double* valueArray, int valueArraySize,
    int* qualityArray, int qualityArraySize,
    int saveAsFloat, const char* units, const char* type,
    const char* timeZoneName, int storageFlag);

int hec_dss_tsRetrieve(dss_file* dss, const char* pathname,
    const char* startDate, const char* startTime,
    const char* endDate, const char* endTime,
    int* timeArray, double* valueArray, int arraySize,
    int* numberValuesRead, int* quality, int qualityWidth,
    int* julianBaseDate, int* timeGranularitySeconds,
    char* units, int unitsLength,
    char* type, int typeLength,
    char* timeZoneName, int timeZoneNameLength);

int hec_dss_tsRetrieveInfo(dss_file* dss, const char* pathname,
    char* units, int unitsLength, char* type, int typeLength);

int hec_dss_tsGetSizes(dss_file* dss, const char* pathname,
    const char* startDate, const char* startTime,
    const char* endDate, const char* endTime,
    int* numberValues, int* qualityElementSize);

int hec_dss_tsGetDateTimeRange(dss_file* dss, const char* pathname,
    int boolFullSet, int* firstJulian, int* firstSeconds,
    int* lastJulian, int* lastSeconds);

int hec_dss_numberPeriods(int intervalSeconds,
    int julianStart, int startSeconds,
    int julianEnd, int endSeconds);

Paired Data, Text, Array, Location, Grid, Delete, Squeeze, Date Utilities

See hecdss.h for complete signatures. All 40 functions are implemented.

Fortran API Reference

Module: hecdss (in hecdss_mod.f90)

Requires ISO_C_BINDING (Fortran 2003+). Compatible with gfortran, ifx, flang.

Usage

program example
    use hecdss
    use iso_c_binding
    implicit none

    type(c_ptr) :: dss
    integer(c_int) :: status

    status = hec_dss_open("example.dss"//c_null_char, dss)
    ! ... operations ...
    status = hec_dss_close(dss)
end program

Important: All Fortran strings must be null-terminated with //c_null_char.

Available Functions

All hec_dss_* functions from hecdss.h have Fortran interfaces. See C FFI Reference for parameter details.

Building

# Compile module
ifx -c src/hecdss_mod.f90

# Compile and link your program
ifx -c your_program.f90
ifx -o your_program.exe your_program.obj hecdss_mod.obj path/to/dss_ffi.dll.lib

String Handling

Fortran strings are fixed-length. When passing to DSS functions, always append c_null_char:

character(len=256) :: pathname
pathname = "/A/B/FLOW/01JAN2020/1HOUR/SIM/"
status = hec_dss_textStore(dss, trim(pathname)//c_null_char, &
    "Hello"//c_null_char, 5)

Date/Time Utilities

Julian Date System

DSS uses a modified Julian date where:

  • Julian 0 = 31 December 1899
  • Julian 1 = 1 January 1900
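Under this convention the conversions reduce to day arithmetic from a fixed epoch. A minimal Python sketch for illustration only; the library implements its own calendar math, and `DSS_EPOCH` here is simply the epoch implied by the bullet points above:

```python
from datetime import date, timedelta

DSS_EPOCH = date(1899, 12, 31)  # Julian day 0 in this convention

def ymd_to_julian(y: int, m: int, d: int) -> int:
    # Days elapsed since the epoch
    return (date(y, m, d) - DSS_EPOCH).days

def julian_to_ymd(j: int) -> tuple:
    # Epoch plus j days
    dt = DSS_EPOCH + timedelta(days=j)
    return (dt.year, dt.month, dt.day)

print(ymd_to_julian(1900, 1, 1))  # -> 1
```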

Rust API

use dss_core::datetime;

// String to Julian
let j = datetime::date_to_julian("15MAR2020");

// Julian to Y/M/D
let (y, m, d) = datetime::julian_to_year_month_day(j);

// Parse various date formats
let ymd = datetime::parse_date("01JAN2020");      // DDMonYYYY
let ymd = datetime::parse_date("2020-03-15");      // ISO
let ymd = datetime::parse_date("3/15/2020");       // US

// Y/M/D to Julian
let j = datetime::year_month_day_to_julian(2020, 3, 15);

// Interval parsing
let (interval_sec, block_months) = datetime::parse_interval("1HOUR").unwrap();
// (3600, 1) = 3600 seconds per value, monthly blocks

// Block operations
let blocks = datetime::generate_block_starts(start_j, end_j, 1); // monthly
let n = datetime::values_in_block(jan1_julian, 3600, 1);         // 744 for January hourly

Python API

j = hecdss_rs.DssFile.date_to_julian("15MAR2020")
y, m, d = hecdss_rs.DssFile.julian_to_ymd(j)
result = hecdss_rs.DssFile.parse_date("2020-03-15")  # (2020, 3, 15) or None

Supported Date Formats

| Format    | Example                | Notes                |
|-----------|------------------------|----------------------|
| DDMonYYYY | 01JAN2020, 15MAR2020   | Standard DSS format  |
| MonYYYY   | JAN2020                | Day defaults to 1    |
| ISO 8601  | 2020-01-15             | YYYY-MM-DD           |
| US        | 1/15/2020, 3-15-2020   | M/D/YYYY or M-D-YYYY |
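These formats map naturally onto strptime patterns. A pure-Python approximation of parse_date (not the library's actual implementation; the function name and format list are assumptions based on the table above):

```python
from datetime import datetime

def parse_dss_date(s: str):
    """Try each supported format in turn; return (year, month, day) or None."""
    for fmt in ("%d%b%Y", "%b%Y", "%Y-%m-%d", "%m/%d/%Y", "%m-%d-%Y"):
        try:
            dt = datetime.strptime(s, fmt)
            return (dt.year, dt.month, dt.day)
        except ValueError:
            continue  # not this format, try the next
    return None
```

Note that strptime matches month names case-insensitively, so `15MAR2020` parses with `%d%b%Y`, and `%b%Y` leaves the day at its default of 1.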

Supported Intervals

| E-Part | Seconds  | Block Size |
|--------|----------|------------|
| 1MIN   | 60       | Monthly    |
| 5MIN   | 300      | Monthly    |
| 15MIN  | 900      | Monthly    |
| 30MIN  | 1800     | Monthly    |
| 1HOUR  | 3600     | Monthly    |
| 6HOUR  | 21600    | Monthly    |
| 1DAY   | 86400    | Yearly     |
| 1WEEK  | 604800   | Yearly     |
| 1MON   | Variable | Decade     |
| 1YEAR  | Variable | Century    |
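As a sanity check on the block sizes above, the number of values in a block is just the block length divided by the interval: a January block of hourly data holds 31 × 24 = 744 values. A throwaway sketch of that arithmetic (not the library's values_in_block, which takes a Julian block start instead of a day count):

```python
def values_per_block(days_in_block: int, interval_seconds: int) -> int:
    # Interval steps that fit in a block of whole days (86400 s per day)
    return days_in_block * 86400 // interval_seconds

print(values_per_block(31, 3600))   # January at 1HOUR -> 744
print(values_per_block(365, 86400)) # non-leap year at 1DAY -> 365
```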