Connect to Databricks with AWS S3
Last updated Thu Jun 30 2022
Begin your journey integrating AWS S3 with Databricks, which includes:
- Quick steps to configure the Amazon S3 REST Connector
- Quick steps to configure the HTTP Client connector to connect to AWS
- How to execute a SQL command within Databricks with the Program Command Shape
Amazon Simple Storage Service (Amazon S3) is an industry-leading service from Amazon Web Services (AWS) for storing and retrieving large amounts of data with easy management. Databricks combines data warehouses and data lakes into a unified, open platform for all of your data. Together, the two systems let you run analytics across all of that data.
Accelerate data-driven innovation with AWS S3 and Databricks from anywhere. Scale data storage cost-effectively while meeting your business's data durability needs, and manage data more easily for robust, flexible operations.
The example process demonstrates how AWS S3 and Databricks work together:
- First, a Message Shape loads test data and sends it to the Amazon S3 REST Connector, which is configured with the appropriate credentials to write the file to AWS S3.
- Next, the HTTP Client connector is configured to obtain and set the AWS Session Token used to authenticate the connection to AWS S3.
- Then, the Program Command Shape executes a SQL command that pulls the data from AWS S3 into a table of records.
- Finally, once the data in AWS S3 has been consumed, the newly created file is deleted.
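The recipe itself does not show the SQL that the Program Command Shape runs. A minimal sketch of what such a command might look like in Databricks is a `COPY INTO` statement; the table, catalog, bucket, and file names below are hypothetical placeholders, not values from the recipe:

```sql
-- Hypothetical names: replace the table, bucket, and file path with your own.
-- Load the staged CSV file from AWS S3 into a Databricks Delta table.
COPY INTO my_catalog.my_schema.customer_records
FROM 's3://my-example-bucket/staging/customers.csv'
FILEFORMAT = CSV
FORMAT_OPTIONS ('header' = 'true', 'inferSchema' = 'true')
COPY_OPTIONS ('mergeSchema' = 'true');
```

`COPY INTO` is idempotent, so re-running the process after a failure will not load the same file twice, which fits a flow that deletes the source file only after the data has been consumed.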
Data of any size is securely stored and protected for the majority of use cases, so you can efficiently meet your business requirements for scalability, data availability, security, and performance.
The recipe qualifies for Boomi’s Recipe Program
This recipe is eligible for Boomi’s Recipe Program, which bundles free expertise from the Boomi PSO team into the price of this prebuilt offering. Boomi experts will help you get set up faster than ever, so you can get the most out of the Boomi AtomSphere Platform.