Create and attach environments
Data Engineering · General availability · Shipped
Description
To customize your Spark experience at a more granular level, you can create environments and attach them to your notebooks and Spark jobs. In an environment, you can install libraries, configure a new pool, set Spark properties, and upload scripts to a file system. This gives you more flexibility and control over your Spark workloads without affecting the default settings of the workspace. As part of GA, we're making various improvements to environments, including API support and CI/CD integration.
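As an illustration of the API support mentioned above, the sketch below builds the request for creating an environment in a workspace. The endpoint path and payload fields are assumptions modeled on general Fabric REST API conventions, not a confirmed contract; check the published API reference before relying on them.

```python
import json

# Assumed base URL for the Fabric REST API (verify against official docs).
FABRIC_API = "https://api.fabric.microsoft.com/v1"


def build_create_environment_request(workspace_id: str, name: str, description: str = ""):
    """Return the (url, payload) pair for a hypothetical Create Environment call.

    The path and field names here are illustrative assumptions.
    """
    url = f"{FABRIC_API}/workspaces/{workspace_id}/environments"
    payload = {"displayName": name, "description": description}
    return url, payload


if __name__ == "__main__":
    url, payload = build_create_environment_request("my-workspace-id", "spark-env")
    # An actual call would send this with an authenticated client, e.g.:
    # requests.post(url, headers={"Authorization": f"Bearer {token}"}, json=payload)
    print(url)
    print(json.dumps(payload))
```

Once created, the environment would be attached to a notebook or Spark job definition so those items pick up its libraries and Spark properties instead of the workspace defaults.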
Change History
- 2024-05-21 · Roadmap Item Added · Workload: Data Engineering