All notable changes to this project will be documented in this file.
The format is based on Keep a Changelog, and this project adheres to Semantic Versioning.

- Support for `keys` and `values` fields.
- Fixed: `$ref` are not always resolved.
- Support for `string` and format `uuid`.
- Support for `map` and `enum` type for Avro schema import (#311) (example below)
- `datacontract import --format spark`: Import from Spark tables (#326)
- Fixed: `glue_table` as parameter did not filter the tables and instead returned all tables from the source database (#333)
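
A minimal sketch of the Avro import mentioned above; the schema file name is illustrative, and the `--source` option for the file location is assumed from the import command's usual form:

```bash
# import an Avro schema that uses map and enum types into a data contract
datacontract import --format avro --source ./orders.avsc
```
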
- `--schema` option for the `catalog` and `export` command to provide the schema also locally
- Fixed `datacontract import --format jsonschema` when description is missing (#300)
- Fixed `datacontract test` with case-sensitive Postgres table names (#310)
- `datacontract serve`: start a local web server to provide a REST-API for the commands (example below)
- Support `GOOGLE_APPLICATION_CREDENTIALS` as variable for connecting to BigQuery in `datacontract test`
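
The new server mode needs no further arguments to get started; host, port, and the exact REST endpoints are whatever the command reports on startup (none are assumed here):

```bash
# expose the CLI commands through a local REST-API
datacontract serve
```
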
- Fixed `type` attribute handling: don't assume all imported models are tables
- `datacontract export --format avro`: fixed array structure (#243)
- Support for `sqlserver` (#196)
- `datacontract export --format dbml`: Export to Database Markup Language (DBML) (#135) (example after this list)
- `datacontract export --format avro`: Now supports config map on field level for logicalTypes and default values (see Custom Avro Properties)
- `datacontract import --format avro`: Now supports importing logicalType and default definition on avro files (see Custom Avro Properties)
- Support `config.bigqueryType` for testing BigQuery types
- Select specific tables on `import` through the `glue-table` parameter (#122)
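
For illustration, the DBML exporter announced above is invoked like any other export target; `datacontract.yaml` stands in for your contract file:

```bash
# export the data contract to Database Markup Language (DBML)
datacontract export --format dbml datacontract.yaml
```
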
- `datacontract catalog`: Show search bar also on mobile
- `datacontract catalog`: Search
- `datacontract publish`: Publish the data contract to the Data Mesh Manager
- `datacontract import --format bigquery`: Import from BigQuery format (#110)
- `datacontract export --format bigquery`: Export to BigQuery format (#111)
- `datacontract export --format avro`: Now supports Avro logical types to better model date types. `date`, `timestamp`/`timestamp-tz`, and `timestamp-ntz` are now mapped to the appropriate logical types. (#141)
- `datacontract import --format jsonschema`: Import from JSON schema (#91)
- `datacontract export --format jsonschema`: Improved export by including additional information
- `datacontract export --format html`: Added support for Service Levels, Definitions, Examples, and nested Fields
- `datacontract export --format go`: Export to Go types format
- Support for `azure` (#146)
- Support for `delta` tables on S3 (#24)
- New command `datacontract catalog` that generates a data contract catalog with an `index.html` file.
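
A sketch of the catalog generation described in the last bullet, assuming the command is run in a directory containing data contract YAML files:

```bash
# generate a browsable data contract catalog with an index.html entry page
datacontract catalog
```
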
- Fixed `datacontract export --format sql` for Databricks dialects
- `datacontract export --format great-expectations`
- `datacontract export --format sql`: uses `sql_type_converter` to build checks
- `datacontract test --publish-to-opentelemetry`
- `datacontract export --format protobuf`
- `datacontract export --format terraform` (limitation: only works for AWS S3 right now)
- `datacontract export --format sql`
- `datacontract export --format sql-query`
- `datacontract export --format avro-idl`: Generates an Avro IDL file containing records for each model.
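
As an example of the Avro IDL exporter above, with the same conventional file argument as the other export formats:

```bash
# generate an Avro IDL file containing one record per model
datacontract export --format avro-idl datacontract.yaml
```
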
- `datacontract changelog datacontract1.yaml datacontract2.yaml` will now generate a changelog based on the changes in the data contract. This will be useful for keeping track of changes in the data contract over time.
- `datacontract lint` will now check for a variety of possible errors in the data contract, such as missing descriptions, incorrect references to models or fields, nonsensical constraints, and more.
- `datacontract import --format avro` will now import Avro schemas into a data contract.
- `datacontract export --format avro`
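
To make the `datacontract changelog` and `datacontract lint` entries above concrete (the lint file name is illustrative):

```bash
# generate a changelog from the differences between two contract versions
datacontract changelog datacontract1.yaml datacontract2.yaml

# check a contract for missing descriptions, broken references, and other issues
datacontract lint datacontract.yaml
```
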
This is a huge step forward: we now support testing Kafka messages. We start with JSON messages and Avro; Protobuf will follow.

- `datacontract import --format sql` (#51)
- `datacontract export --format dbt-sources`
- `datacontract export --format dbt-staging-sql`
- `datacontract export --format rdf` (#52)
- `datacontract breaking` to detect breaking changes between two data contracts.

This is a breaking change (we are still on a 0.x.x version). The project migrated from Golang to Python. The Golang version can be found at `cli-go`.

- `test`: Support to directly run tests and connect to data sources defined in the servers section.
- `test`: generates schema tests from the model definition.
- `test --publish URL`: Publish test results to a server URL.
- `export`: now exports the data contract to the formats jsonschema and sodacl.
- `--file` option removed in favor of a direct argument: use `datacontract test datacontract.yaml` instead of `datacontract test --file datacontract.yaml` (see the sketch below).
- `model` is now part of `export`
- `quality` is now part of `export`
- `diff` needs to be migrated to Python.
- `breaking` needs to be migrated to Python.
- `inline` needs to be migrated to Python.
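
The migration for existing scripts follows directly from the `--file` change above; both command forms are quoted from that entry:

```bash
# before (Golang CLI): the file was passed as an option
datacontract test --file datacontract.yaml

# after (Python CLI): the file is a direct argument
datacontract test datacontract.yaml
```
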
- `model` command parameter: `type` -> `format`.
- Removed the `schema` command.
- Support for `models` section for `diff`/`breaking`.
- Added the `model` command.
- `inline` prints to STDOUT instead of overwriting the datacontract file.
- `quality` reads input from STDIN if present.
- Added the `test` command for Soda Core.
- Changes to `diff`/`breaking`.
- `schema` command: prints your schema.
- `quality` command: prints your quality definitions.
- `inline` command: resolves all references using the “$ref: …” notation and writes them to your data contract.
- Command options (`--file`, `--with`).
- `diff` command for dbt schema specification.
- `breaking` command for dbt schema specification.
- Fixed `init` when the file already exists.
- Renamed the `validate` command to `lint`.
- Removed the `check-compatibility` command.