Parquet Ingestion Fixture Matrix

Choose Parquet fixtures for columnar ingestion, warehouse imports, nested-column handling, and batch-load validation.

How to Use This Matrix

  • Covers flat all-types data, binary-heavy rows, and list-column layouts from a real Parquet corpus.
  • Useful for warehouse loaders, parquet readers, and schema-shape validation in ETL jobs.
  • Anchored to ETL and warehouse packs for one-click ingestion test setup.
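Before wiring a fixture into a loader, it can help to gate ingestion on a cheap structural check. The sketch below is a stdlib-only probe (no Parquet library assumed) that relies on the Parquet format's 4-byte `PAR1` magic at the head and tail of every valid file; the fixture path in the usage comment is illustrative.

```python
from pathlib import Path

# A valid Parquet file starts and ends with the 4-byte magic b"PAR1";
# the smallest possible file is 12 bytes (magic + footer length + magic).
PARQUET_MAGIC = b"PAR1"

def looks_like_parquet(path: Path) -> bool:
    """Cheap pre-ingestion gate: check Parquet header and footer magic bytes."""
    data = path.read_bytes()
    return (
        len(data) >= 12
        and data[:4] == PARQUET_MAGIC
        and data[-4:] == PARQUET_MAGIC
    )

# Illustrative usage with a fixture from this matrix:
# looks_like_parquet(Path("parquet_alltypes_plain_sample.parquet"))
```

This catches truncated downloads and mislabeled files before a full columnar read is attempted; it does not validate the footer metadata itself.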


This matrix is anchored to the PARQUET library page and its manifest.

Fixture Rows

All Types Parquet
Good default Parquet fixture for reader smoke tests and columnar import validation.
Profile: Columnar baseline · Test focus: Primitive type handling · File: parquet_alltypes_plain_sample.parquet · Size: 1.8 KB

Binary Records Parquet
Useful when warehouse and ETL readers need to preserve binary or blob-like columns.
Profile: Binary-value dataset · Test focus: Binary column decoding · File: parquet_binary_records_sample.parquet · Size: 478 B

List Columns Parquet
Targets nested-column readers, schema inspection, and downstream flattening logic.
Profile: Nested column fixture · Test focus: Repeated/list column handling · File: parquet_list_columns_sample.parquet · Size: 2.5 KB

Related Packs

ETL Validation Fixture Pack

Bundle of real Parquet, Avro, SQLite, NDJSON, and CSV fixtures for ETL staging, warehouse loads, and ingestion-pipeline validation.

etl_validation_fixture_pack.zip · 4.6 KB

Warehouse Import Fixture Pack

Bundle of real Parquet, Avro, SQLite, CSV, and JSON fixtures for warehouse import, schema mapping, and analytic table-load workflows.

warehouse_import_fixture_pack.zip · 3.7 KB

Related Workflows

Batch Ingestion Fixtures

Parquet, Avro, SQLite, NDJSON, and CSV fixtures for ETL staging, warehouse loads, and bulk-ingestion validation.


Schema Evolution Fixtures

Avro, SQLite, Parquet, and JSON fixtures for producer/consumer drift, nullable fields, and migration-aware schema validation.
