Parquet Ingestion Fixture Matrix

Choose Parquet fixtures for columnar ingestion, warehouse imports, nested-column handling, and batch-load validation.

How to use this matrix

  • Covers flat all-types data, binary-heavy rows, and list-column layouts from a real parquet corpus.
  • Useful for warehouse loaders, parquet readers, and schema-shape validation in ETL jobs.
  • Anchored to ETL and warehouse packs for one-click ingestion test setup.

Open the main library

This matrix is anchored to the PARQUET library page and its manifest.

Fixture rows

All Types Parquet
  Profile: Columnar baseline · Test focus: Primitive type handling
  Good default parquet fixture for reader smoke tests and columnar import validation.
  parquet_alltypes_plain_sample.parquet · 1.8 KB

Binary Records Parquet
  Profile: Binary-value dataset · Test focus: Binary column decoding
  Useful when warehouse and ETL readers need to preserve binary or blob-like columns.
  parquet_binary_records_sample.parquet · 478 B

List Columns Parquet
  Profile: Nested column fixture · Test focus: Repeated/list column handling
  Targets nested-column readers, schema inspection, and downstream flattening logic.
  parquet_list_columns_sample.parquet · 2.5 KB

Related packs

ETL Validation Fixture Pack

Bundle of real Parquet, Avro, SQLite, NDJSON, and CSV fixtures for ETL staging, warehouse loads, and ingestion-pipeline validation.

etl_validation_fixture_pack.zip · 4.6 KB

Warehouse Import Fixture Pack

Bundle of real Parquet, Avro, SQLite, CSV, and JSON fixtures for warehouse import, schema mapping, and analytic table-load workflows.

warehouse_import_fixture_pack.zip · 3.7 KB

Related flows

Batch Ingestion Fixtures

Parquet, Avro, SQLite, NDJSON, and CSV fixtures for ETL staging, warehouse loads, and bulk-ingestion validation.

Open flow

Schema Evolution Fixtures

Avro, SQLite, Parquet, and JSON fixtures for producer/consumer drift, nullable fields, and migration-aware schema validation.

Open flow