{"active":true,"blog_title":null,"blog_url":null,"feature_description":"To customize your Spark experience at a more granular level, you can create and attach environments to your notebooks and Spark jobs. In an environment, you can install libraries, configure a new pool, set Spark properties, and upload scripts to a file system. This gives you more flexibility and control over your Spark workloads without affecting the default settings of the workspace. As part of GA, we're making various improvements to environments, including API support and CI/CD integration.","feature_name":"Create and attach environments","last_modified":"2024-05-21","product_id":"a731518f-36ca-ee11-9079-000d3a341a60","product_name":"Data Engineering","release_date":"2024-05-21","release_item_id":"435d0880-f2d6-ee11-9079-000d3a310f67","release_status":"Shipped","release_type":"General availability"}