
Teams want onchain data to show up where they already work. With Dune data now available inside Databricks, you can bring trusted blockchain intelligence straight into your existing warehouse setup without changing your workflow.
This builds on our existing Datashare availability across Snowflake and BigQuery, giving you a consistent way to use Dune data across all major warehouse platforms.
Datashare places Dune data directly into your own compute environment. You can query onchain information alongside your internal datasets with zero ETL overhead.
Key capabilities include:
- Cross-region replication for dependable low-latency performance
- Full historical coverage of major chains, including Solana
- Continuous syncs that keep data current for production-scale use cases
- Native integration with governance controls for secure enterprise rollout
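To make the zero-ETL idea concrete, here is a minimal sketch of the join pattern: onchain data shared into your warehouse, queried in one SQL statement alongside an internal table. SQLite stands in for the Databricks SQL engine, and all table names, column names, and values (`dune_transfers`, `internal_customers`, the wallet addresses) are hypothetical, not actual Dune schema.

```python
import sqlite3

# Illustrative only: in Databricks you would run the same style of SQL
# directly against shared Dune tables. SQLite is a stand-in engine here,
# and every table/column name below is hypothetical.
conn = sqlite3.connect(":memory:")

# Stand-in for a shared Dune table of token transfers.
conn.execute("CREATE TABLE dune_transfers (wallet TEXT, amount REAL)")
conn.executemany(
    "INSERT INTO dune_transfers VALUES (?, ?)",
    [("0xabc", 150.0), ("0xdef", 75.5), ("0xabc", 20.0)],
)

# Stand-in for an internal customer table already in your warehouse.
conn.execute("CREATE TABLE internal_customers (wallet TEXT, customer TEXT)")
conn.executemany(
    "INSERT INTO internal_customers VALUES (?, ?)",
    [("0xabc", "Acme Corp"), ("0xdef", "Globex")],
)

# The zero-ETL pattern: join shared onchain data with internal data
# in a single query, with no pipeline in between.
rows = conn.execute("""
    SELECT c.customer, SUM(t.amount) AS total_transferred
    FROM dune_transfers t
    JOIN internal_customers c ON c.wallet = t.wallet
    GROUP BY c.customer
    ORDER BY total_transferred DESC
""").fetchall()

for customer, total in rows:
    print(customer, total)
```

The point is the shape of the query, not the engine: because Datashare places the data in your own compute environment, the onchain table behaves like any other table you already govern and query.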
Already Powering Real Workloads
Companies such as RWA.xyz are already putting Dune’s data into action via Databricks. RWA.xyz provides institutional-grade analytics on tokenised real-world assets (RWAs): everything from tokenised treasuries and corporate credit to asset-backed projects on public blockchains. Their team uses Dune data inside their analytics pipelines to process large-scale onchain datasets, model asset flows across chains, and monitor issuance and redemption behaviours.
Other leading organisations, including BitGo and Artemis Analytics, rely on Dune’s Datashare offering as a primary source of blockchain data. BitGo is a major digital-asset infrastructure provider offering regulated custody, staking, trading, and wallet services for institutional clients around the globe. Artemis Analytics is focused on standardising onchain metrics for stablecoins and digital assets. By extending the reach of Dune data into the Databricks ecosystem, we are giving more teams like these another streamlined, warehouse-native path to bring blockchain data into their existing tool stack.
Get started with Dune Datashare on Databricks today.


