mirror of
https://github.com/bitnami/containers.git
synced 2026-02-17 11:17:17 +08:00
[bitnami/spark] Release 3.2.4-debian-11-r154 (#54770)
Signed-off-by: Bitnami Containers <bitnami-bot@vmware.com>
@@ -8,10 +8,10 @@ ARG TARGETARCH
 LABEL com.vmware.cp.artifact.flavor="sha256:1e1b4657a77f0d47e9220f0c37b9bf7802581b93214fff7d1bd2364c8bf22e8e" \
       org.opencontainers.image.base.name="docker.io/bitnami/minideb:bullseye" \
-      org.opencontainers.image.created="2023-12-16T10:52:53Z" \
+      org.opencontainers.image.created="2024-01-15T11:45:06Z" \
       org.opencontainers.image.description="Application packaged by VMware, Inc" \
       org.opencontainers.image.licenses="Apache-2.0" \
-      org.opencontainers.image.ref.name="3.2.4-debian-11-r153" \
+      org.opencontainers.image.ref.name="3.2.4-debian-11-r154" \
       org.opencontainers.image.title="spark" \
       org.opencontainers.image.vendor="VMware, Inc." \
       org.opencontainers.image.version="3.2.4"
@@ -28,7 +28,7 @@ SHELL ["/bin/bash", "-o", "errexit", "-o", "nounset", "-o", "pipefail", "-c"]
 RUN install_packages ca-certificates curl libbz2-1.0 libcom-err2 libcrypt1 libffi7 libgcc-s1 libgssapi-krb5-2 libk5crypto3 libkeyutils1 libkrb5-3 libkrb5support0 liblzma5 libncursesw6 libnsl2 libreadline8 libsqlite3-0 libssl1.1 libstdc++6 libtinfo6 libtirpc3 procps zlib1g
 RUN mkdir -p /tmp/bitnami/pkg/cache/ ; cd /tmp/bitnami/pkg/cache/ ; \
     COMPONENTS=( \
-      "python-3.10.13-13-linux-${OS_ARCH}-debian-11" \
+      "python-3.10.13-14-linux-${OS_ARCH}-debian-11" \
       "java-11.0.21-10-6-linux-${OS_ARCH}-debian-11" \
       "spark-3.2.4-13-linux-${OS_ARCH}-debian-11" \
     ) ; \
@@ -9,7 +9,7 @@
   "arch": "amd64",
   "distro": "debian-11",
   "type": "NAMI",
-  "version": "3.10.13-13"
+  "version": "3.10.13-14"
 },
 "spark": {
   "arch": "amd64",
@@ -10,7 +10,7 @@ fi

 script=$1
 exit_code="${2:-96}"
-fail_if_not_present="${3:-y}"
+fail_if_not_present="${3:-n}"

 if test -f "$script"; then
     sh $script
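Read as a standalone helper, the logic changed above behaves roughly like the sketch below. The function name `run_script` and the error message are illustrative, not taken from the repository; the point is that the commit flips the third argument's default from `y` (fail when the script is missing) to `n` (skip silently).

```shell
#!/bin/sh
# Sketch of the runner logic in the diff above: execute a script if it
# exists; otherwise either fail with the given exit code or skip silently.
run_script() {
    script=$1
    exit_code="${2:-96}"            # exit code used when the script is missing
    fail_if_not_present="${3:-n}"   # the commit changes this default from "y" to "n"

    if test -f "$script"; then
        sh "$script"
    elif [ "$fail_if_not_present" = "y" ]; then
        echo "Script $script not found" >&2
        return "$exit_code"
    fi
}

# With the new default, a missing script is no longer a fatal error:
run_script /tmp/does-not-exist.sh 96 n && echo "missing script skipped"
```

With the old default (`y`), the same call for a missing script would have returned the exit code instead of succeeding.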
@@ -75,6 +75,43 @@ docker build -t bitnami/APP:latest .

### Environment variables

#### Customizable environment variables

| Name                                      | Description                                                                          | Default Value                                  |
|-------------------------------------------|--------------------------------------------------------------------------------------|------------------------------------------------|
| `$SPARK_MODE`                             | Spark cluster mode to run (can be `master` or `worker`).                             | `master`                                       |
| `$SPARK_MASTER_URL`                       | URL where the worker can find the master. Only needed when Spark mode is `worker`.   | `spark://spark-master:7077`                    |
| `$SPARK_NO_DAEMONIZE`                     | Do not daemonize the Spark process (run it in the foreground).                       | `true`                                         |
| `$SPARK_RPC_AUTHENTICATION_ENABLED`       | Enable RPC authentication.                                                           | `no`                                           |
| `$SPARK_RPC_ENCRYPTION_ENABLED`           | Enable RPC encryption.                                                               | `no`                                           |
| `$SPARK_LOCAL_STORAGE_ENCRYPTION_ENABLED` | Enable local storage encryption.                                                     | `no`                                           |
| `$SPARK_SSL_ENABLED`                      | Enable SSL configuration.                                                            | `no`                                           |
| `$SPARK_SSL_KEYSTORE_FILE`                | Location of the key store.                                                           | `${SPARK_CONF_DIR}/certs/spark-keystore.jks`   |
| `$SPARK_SSL_TRUSTSTORE_FILE`              | Location of the trust store.                                                         | `${SPARK_CONF_DIR}/certs/spark-truststore.jks` |
| `$SPARK_SSL_NEED_CLIENT_AUTH`             | Whether to require client authentication.                                            | `yes`                                          |
| `$SPARK_SSL_PROTOCOL`                     | TLS protocol to use.                                                                 | `TLSv1.2`                                      |
| `$SPARK_METRICS_ENABLED`                  | Whether to enable metrics for Spark.                                                 | `false`                                        |
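For instance, the worker-related variables above can be overridden when launching a container. The container and network names below are illustrative:

```console
docker run -d --name spark-worker --network my-spark-net \
  -e SPARK_MODE=worker \
  -e SPARK_MASTER_URL=spark://spark-master:7077 \
  bitnami/spark
```

Any variable not passed explicitly keeps the default listed in the table.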

#### Read-only environment variables

| Name                      | Description                    | Value                                   |
|---------------------------|--------------------------------|-----------------------------------------|
| `$SPARK_BASE_DIR`         | Spark installation directory.  | `${BITNAMI_ROOT_DIR}/spark`             |
| `$SPARK_CONF_DIR`         | Spark configuration directory. | `${SPARK_BASE_DIR}/conf`                |
| `$SPARK_WORK_DIR`         | Spark workspace directory.     | `${SPARK_BASE_DIR}/work`                |
| `$SPARK_CONF_FILE`        | Spark configuration file path. | `${SPARK_CONF_DIR}/spark-defaults.conf` |
| `$SPARK_LOG_DIR`          | Spark logs directory.          | `${SPARK_BASE_DIR}/logs`                |
| `$SPARK_TMP_DIR`          | Spark temporary directory.     | `${SPARK_BASE_DIR}/tmp`                 |
| `$SPARK_JARS_DIR`         | Spark JAR directory.           | `${SPARK_BASE_DIR}/jars`                |
| `$SPARK_INITSCRIPTS_DIR`  | Spark init scripts directory.  | `/docker-entrypoint-initdb.d`           |
| `$SPARK_USER`             | Spark user.                    | `spark`                                 |
| `$SPARK_DAEMON_USER`      | Spark system user.             | `spark`                                 |
| `$SPARK_DAEMON_GROUP`     | Spark system group.            | `spark`                                 |

Additionally, more environment variables natively supported by Apache Spark can be found [in the official documentation](https://spark.apache.org/docs/latest/spark-standalone.html#cluster-launch-scripts).

For example, you could still use `SPARK_WORKER_CORES` or `SPARK_WORKER_MEMORY` to configure the number of cores and the amount of memory to be used by a worker machine.

When you start the Spark image, you can adjust the configuration of the instance by passing one or more environment variables either in the docker-compose file or on the `docker run` command line. If you want to add a new environment variable:

* For docker-compose, add the variable name and value under the application section in the [`docker-compose.yml`](https://github.com/bitnami/containers/blob/main/bitnami/spark/docker-compose.yml) file present in this repository:
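The fragment below sketches what such an addition could look like; the service layout and the variable chosen are illustrative, not copied from the repository file:

```yaml
services:
  spark:
    image: bitnami/spark:latest
    environment:
      # Example of setting a variable; any variable from the
      # tables above can be added or overridden the same way.
      - SPARK_MODE=master
```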
@@ -96,28 +133,6 @@ docker run -d --name spark \

```console
  bitnami/spark
```

Available variables:

* SPARK_MODE: Cluster mode to start Apache Spark in. Valid values: *master*, *worker*. Default: **master**
* SPARK_MASTER_URL: URL where the worker can find the master. Only needed when Spark mode is *worker*. Default: **spark://spark-master:7077**
* SPARK_RPC_AUTHENTICATION_ENABLED: Enable RPC authentication. Default: **no**
* SPARK_RPC_AUTHENTICATION_SECRET: The secret key used for RPC authentication. No defaults.
* SPARK_RPC_ENCRYPTION_ENABLED: Enable RPC encryption. Default: **no**
* SPARK_LOCAL_STORAGE_ENCRYPTION_ENABLED: Enable local storage encryption. Default: **no**
* SPARK_SSL_ENABLED: Enable SSL configuration. Default: **no**
* SPARK_SSL_KEY_PASSWORD: The password to the private key in the key store. No defaults.
* SPARK_SSL_KEYSTORE_FILE: Location of the key store. Default: **/opt/bitnami/spark/conf/certs/spark-keystore.jks**
* SPARK_SSL_KEYSTORE_PASSWORD: The password for the key store. No defaults.
* SPARK_SSL_TRUSTSTORE_PASSWORD: The password for the trust store. No defaults.
* SPARK_SSL_TRUSTSTORE_FILE: Location of the trust store. Default: **/opt/bitnami/spark/conf/certs/spark-truststore.jks**
* SPARK_SSL_NEED_CLIENT_AUTH: Whether to require client authentication. Default: **yes**
* SPARK_SSL_PROTOCOL: TLS protocol to use. Default: **TLSv1.2**
* SPARK_DAEMON_USER: Apache Spark system user when the container is started as root. Default: **spark**
* SPARK_DAEMON_GROUP: Apache Spark system group when the container is started as root. Default: **spark**

More environment variables natively supported by Apache Spark can be found [in the official documentation](https://spark.apache.org/docs/latest/spark-standalone.html#cluster-launch-scripts).
For example, you could still use `SPARK_WORKER_CORES` or `SPARK_WORKER_MEMORY` to configure the number of cores and the amount of memory to be used by a worker machine.

### Security

The Bitnami Apache Spark Docker image makes it easy to enable RPC authentication, RPC encryption, and local storage encryption by setting the following environment variables on all the nodes of the cluster.
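For example, a single node could be started with all three features enabled as shown below; the secret value is a placeholder and must be identical on every node of the cluster:

```console
docker run -d --name spark-master \
  -e SPARK_RPC_AUTHENTICATION_ENABLED=yes \
  -e SPARK_RPC_AUTHENTICATION_SECRET=replace-with-shared-secret \
  -e SPARK_RPC_ENCRYPTION_ENABLED=yes \
  -e SPARK_LOCAL_STORAGE_ENCRYPTION_ENABLED=yes \
  bitnami/spark
```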