NOTE: This is a fork of ethereum-etl that handles Optimism data. Replace your ethereum-etl package with the ethereum-etl package from this repo.
Fields Added
- l1_fee: The total amount of ETH paid in a user's transaction for the L1 data fee
- l1_gas_used: The amount of gas that the calldata (input data) of a transaction used.
- l1_gas_used_paid: The total amount of gas that the user pays for on L1: Fee Scalar * (Calldata gas + Overhead gas). This is derived via l1_fee / l1_gas_price (see the sketch below).
- l1_gas_price: The price for L1 gas charged for a transaction.
- l1_fee_scalar: The fee scalar applied to the total amount of L1 gas a transaction uses.
For more information see: Optimism Docs - Transaction Fees on L2
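As a worked illustration of how these fields fit together, here is a minimal Python sketch based only on the description above; the function names and the overhead_gas parameter are assumptions for illustration and are not part of the exported schema or this repo's code:

```python
# Illustrative sketch (not part of ethereum-etl): relating the added L1 fee fields.
# Per the description above:
#   l1_fee = l1_gas_price * l1_fee_scalar * (calldata_gas + overhead_gas)
#   l1_gas_used_paid = l1_fee / l1_gas_price

def estimate_l1_fee(l1_gas_used: int, overhead_gas: int,
                    l1_gas_price: int, l1_fee_scalar: float) -> int:
    """L1 data fee in wei: fee scalar applied to calldata gas plus overhead gas."""
    return int(l1_fee_scalar * (l1_gas_used + overhead_gas) * l1_gas_price)

def derive_l1_gas_used_paid(l1_fee: int, l1_gas_price: int) -> float:
    """Total L1 gas the user pays for, derived from l1_fee and l1_gas_price."""
    return l1_fee / l1_gas_price
```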
Ethereum ETL lets you convert blockchain data into convenient formats like CSVs and relational databases.
Do you just want to query Ethereum data right away? Use the public dataset in BigQuery.
Full documentation available here.
Install Ethereum ETL:
pip3 install ethereum-etl
Export blocks and transactions (Schema, Reference):
> ethereumetl export_blocks_and_transactions --start-block 0 --end-block 500000 \
--blocks-output blocks.csv --transactions-output transactions.csv \
--provider-uri https://mainnet.infura.io/v3/7aef3f0cd1f64408b163814b22cc643c
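Once the export finishes, the CSVs can be inspected with any tool you like; below is a minimal sketch using pandas, assuming pandas is installed and the default column names (e.g. number in blocks.csv):

```python
import pandas as pd

# Load the files produced by export_blocks_and_transactions.
blocks = pd.read_csv("blocks.csv")
transactions = pd.read_csv("transactions.csv")

# Quick sanity checks on the exported range.
print(blocks["number"].min(), blocks["number"].max())  # should span 0..500000
print(len(transactions), "transactions exported")
```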
Export ERC20 and ERC721 transfers (Schema, Reference):
> ethereumetl export_token_transfers --start-block 0 --end-block 500000 \
--provider-uri file://$HOME/Library/Ethereum/geth.ipc --output token_transfers.csv
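A quick aggregation over the exported transfers, assuming the standard token_transfers schema with token_address and value columns (values are uint256 strings, so they are converted before summing):

```python
import pandas as pd

# Sum transferred value per token contract from the exported CSV.
transfers = pd.read_csv("token_transfers.csv", dtype={"value": str})
transfers["value"] = transfers["value"].apply(int)  # uint256 values exceed int64
print(transfers.groupby("token_address")["value"].sum()
      .sort_values(ascending=False).head())
```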
Export traces (Schema, Reference):
> ethereumetl export_traces --start-block 0 --end-block 500000 \
--provider-uri file://$HOME/Library/Ethereum/parity.ipc --output traces.csv
Stream blocks, transactions, logs, token_transfers continually to console (Reference):
> pip3 install ethereum-etl[streaming]
> ethereumetl stream --start-block 500000 -e block,transaction,log,token_transfer --log-file log.txt \
--provider-uri https://mainnet.infura.io/v3/7aef3f0cd1f64408b163814b22cc643c
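If you want to consume the streamed entities from another process, a rough sketch is below; it assumes each entity is printed to stdout as one JSON object per line with a type field, which you should verify against your version's console output:

```python
import json
import subprocess

# Launch the streamer and read entities from its stdout (assumed JSON lines).
proc = subprocess.Popen(
    ["ethereumetl", "stream", "--start-block", "500000",
     "-e", "block,transaction,log,token_transfer",
     "--provider-uri", "https://mainnet.infura.io"],
    stdout=subprocess.PIPE, text=True,
)
for line in proc.stdout:
    line = line.strip()
    if not line.startswith("{"):
        continue  # skip any non-entity output
    entity = json.loads(line)
    if entity.get("type") == "block":
        print("block", entity.get("number"))
```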
Find other commands here.
For the latest version, check out the repo and call
> pip3 install -e .
> python3 ethereumetl.py
- Schema
- Command Reference
- Documentation
- Public Datasets in BigQuery
- Exporting the Blockchain
- Querying in Amazon Athena
- Querying in Google BigQuery
- Querying in Kaggle
- Airflow DAGs
- Postgres ETL
- Ethereum 2.0 ETL
Running tests:
> pip3 install -e .[dev,streaming]
> export ETHEREUM_ETL_RUN_SLOW_TESTS=True
> export PROVIDER_URL=<your_provider_uri>
> pytest -vv

Running tox tests:
> pip3 install tox
> tox
Running in Docker:
- Install Docker: https://docs.docker.com/get-docker/
- Build a docker image
  > docker build -t ethereum-etl:latest .
  > docker image ls
- Run a container out of the image
  > docker run -v $HOME/output:/ethereum-etl/output ethereum-etl:latest export_all -s 0 -e 5499999 -b 100000 -p https://mainnet.infura.io
  > docker run -v $HOME/output:/ethereum-etl/output ethereum-etl:latest export_all -s 2018-01-01 -e 2018-01-01 -p https://mainnet.infura.io
- Run streaming to console or Pub/Sub
  > docker build -t ethereum-etl:latest .
  > echo "Stream to console"
  > docker run ethereum-etl:latest stream --start-block 500000 --log-file log.txt
  > echo "Stream to Pub/Sub"
  > docker run -v /path_to_credentials_file/:/ethereum-etl/ --env GOOGLE_APPLICATION_CREDENTIALS=/ethereum-etl/credentials_file.json ethereum-etl:latest stream --start-block 500000 --output projects/<your-project>/topics/crypto_ethereum
If running on an Apple M1 chip, add the --platform linux/x86_64 option to the build and run commands, e.g.:
> docker build --platform linux/x86_64 -t ethereum-etl:latest .
> docker run --platform linux/x86_64 ethereum-etl:latest stream --start-block 500000