dbxio is a high-level client for Databricks that simplifies working with tables and volumes. It provides a concise interface for reading and writing data, creating and deleting objects, and running SQL queries and fetching their results.
- dbxio combines the power of Databricks SQL with Python for local data manipulation.
- dbxio provides a simple, intuitive interface for working with Databricks Tables and Volumes: reading and writing data takes just a few lines of code.
- For large amounts of data, dbxio stages records in an intermediate object storage of your choice and then bulk-loads them with Databricks' `COPY INTO` command. You can upload any amount of data, and dbxio takes care of synchronizing it with the target table in Databricks (see the sketch below).
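To make that flow concrete, here is a simplified sketch of the two phases dbxio automates. It is illustrative only, not dbxio's actual internals: the storage account, container, file names, and credentials are placeholders, and it calls the `azure-storage-blob` and `databricks-sql-connector` packages directly.

```python
# Illustrative sketch of the two-phase bulk upload that dbxio automates.
# This is NOT dbxio's internal code; all names and paths are placeholders.
from azure.storage.blob import BlobServiceClient
from databricks import sql

# Phase 1: stage the data as files in intermediate object storage.
blob_service = BlobServiceClient.from_connection_string('<AZURE_STORAGE_CONNECTION_STRING>')
container = blob_service.get_container_client('container_name')
with open('batch.parquet', 'rb') as f:
    container.upload_blob('staging/batch.parquet', f, overwrite=True)

# Phase 2: ingest the staged files into the target table with COPY INTO.
with sql.connect(
    server_hostname='<YOUR_SERVER_HOSTNAME>',
    http_path='<YOUR_HTTP_PATH>',
    access_token='<YOUR_ACCESS_TOKEN>',
) as connection:
    with connection.cursor() as cursor:
        cursor.execute("""
            COPY INTO catalog.schema.table
            FROM 'abfss://container_name@blob_storage_name.dfs.core.windows.net/staging/'
            FILEFORMAT = PARQUET
        """)
```

In practice, dbxio performs both phases for you inside `bulk_write_table`, as shown in the usage example below.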
Currently, we are not aware of any alternatives that offer the same functionality as dbxio. If you come across any, we would be interested to learn about them. Please let us know by opening an issue in our GitHub repository.
dbxio requires Python 3.9 or later. You can install dbxio using pip:

```bash
pip install dbxio
```
```python
import dbxio

client = dbxio.DbxIOClient.from_cluster_settings(
    cluster_type=dbxio.ClusterType.SQL_WAREHOUSE,
    http_path='<YOUR_HTTP_PATH>',
    server_hostname='<YOUR_SERVER_HOSTNAME>',
    settings=dbxio.Settings(cloud_provider=dbxio.CloudProvider.AZURE),
)

# read a table
table = list(dbxio.read_table('catalog.schema.table', client=client))

# write a table
data = [
    {'col1': 1, 'col2': 'a', 'col3': [1, 2, 3]},
    {'col1': 2, 'col2': 'b', 'col3': [4, 5, 6]},
]
schema = dbxio.TableSchema.from_obj(
    {
        'col1': dbxio.types.IntType(),
        'col2': dbxio.types.StringType(),
        'col3': dbxio.types.ArrayType(dbxio.types.IntType()),
    }
)
dbxio.bulk_write_table(
    dbxio.Table('catalog.schema.table', schema=schema),
    data,
    client=client,
    abs_name='blob_storage_name',         # intermediate Azure Blob Storage account
    abs_container_name='container_name',  # container used for staging
    append=True,
)
```
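Since `dbxio.read_table` is materialized with `list` in the example above, its output appears to be a plain Python iterable of rows, which drops straight into local tooling such as pandas. The snippet below is a hedged sketch rather than documented dbxio behavior: it assumes each row comes back as a dict (mirroring the record format used on the write side) and that pandas is installed separately.

```python
# Sketch: local analysis of a table read with dbxio.
# Assumption: read_table yields one dict per row, mirroring the
# list-of-dicts format accepted by bulk_write_table above.
import pandas as pd

# `client` is the DbxIOClient configured in the example above.
rows = list(dbxio.read_table('catalog.schema.table', client=client))
df = pd.DataFrame(rows)
print(df.head())
```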
dbxio supports the following cloud providers:
- Azure
- AWS (planned)
- GCP (planned)