Things under legendu
All packages in a virtual environment must be installed by conda (rather than pip) so that the environment can be packed using conda-pack.
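As a minimal sketch of the workflow above (the environment name `myenv` and the package list are hypothetical; these commands require a working conda installation):

```shell
# Create an environment and install packages with conda only (not pip).
conda create -y -n myenv python=3.10 pandas

# Install conda-pack and pack the environment into a portable archive.
conda install -y -n myenv -c conda-forge conda-pack
conda pack -n myenv -o myenv.tar.gz
```

The resulting `myenv.tar.gz` can be shipped to machines (or Spark executors) that do not have conda installed.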
When using a conda-pack virtual environment with PySpark, the Python package
`pyspark` that comes with Spark is automatically injected into `PYTHONPATH`, so users do not have to install `pyspark` into the virtual environment themselves. As a matter of fact, the `pyspark` that comes with Spark is always used, even if you have a local copy installed, when you submit a PySpark application with a conda-pack virtual environment. For more discussion, please refer to this issue.
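A sketch of submitting a PySpark application with a conda-pack archive (the archive name, application file, and YARN master are assumptions; adapt them to your cluster):

```shell
# Ship the packed environment to executors; Spark unpacks the archive
# into a directory named "environment" in each container's working dir.
spark-submit \
    --master yarn \
    --deploy-mode cluster \
    --archives myenv.tar.gz#environment \
    --conf spark.yarn.appMasterEnv.PYSPARK_PYTHON=./environment/bin/python \
    my_app.py
```

Note that `myenv.tar.gz` need not contain `pyspark`: as described above, the `pyspark` shipped with Spark is injected into `PYTHONPATH` automatically.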
References
Pack a Conda Virtual Environment