The Complete Guide to Running OpenSearch on Docker
Updated: Nov 7
This comprehensive tutorial will walk through how to containerize OpenSearch using Docker for easy deployment and management.
What is OpenSearch?
OpenSearch is an open source search and analytics engine that is a fork of Elasticsearch. Key features include:
Powerful full-text search capabilities
Options for analytics and visualizations
Ability to scale to handle terabytes of data
REST APIs for easy integration
Plug-in ecosystem to extend functionality
Running OpenSearch using Docker containers provides portability and simplifies deployment on any infrastructure.
Overview of the OpenSearch Docker Image
The official OpenSearch Docker image is available on Docker Hub:
docker.io/opensearchproject/opensearch
This image contains:
The OpenSearch server
Bundled plugins, including the security plugin and Performance Analyzer
Sensible default configuration
Note that OpenSearch Dashboards, the project's fork of Kibana used for visualizations and analytics, ships as a separate image (docker.io/opensearchproject/opensearch-dashboards) and is covered later in this guide.
The image allows customization using environment variables covered later.
Why Run OpenSearch on Docker?
There are several key benefits to using Docker for OpenSearch:
Simplified Deployment - Getting OpenSearch up and running takes just one docker command. No need to install binaries and configure dependencies.
Portability - The OpenSearch Docker image runs on any platform that supports containers: Linux, Windows, cloud VMs, Kubernetes, and more.
Isolation - OpenSearch runs in an isolated container with dependencies provided by the image. No version conflicts.
Manageability - Easily manage, monitor, back up, and upgrade the OpenSearch instance through Docker.
Scalability - Scale up OpenSearch by running multiple container replicas managed by Docker.
Configuring the OpenSearch Docker Container
The official OpenSearch Docker image comes with sensible defaults for configurations like heap size and network settings. However, the container can be customized as needed using environment variables, including:
OPENSEARCH_JAVA_OPTS - Sets JVM command-line options, including the heap size (for example -Xms512m -Xmx512m). Use this to tune the JVM for your usage.
OPENSEARCH_INITIAL_ADMIN_PASSWORD - Sets the admin password for the bundled security plugin; required on OpenSearch 2.12 and later.
DISABLE_SECURITY_PLUGIN - Set to true to turn off the security plugin for local testing; not recommended for production.
Memory limits for the container itself are set with Docker's own --memory flag rather than an OpenSearch variable. Settings from opensearch.yml, such as discovery.type, can also be passed as environment variables. Additional options are available - refer to the Docker Hub page and the OpenSearch documentation for specifics.
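For example, the heap can be capped at 512 MB by adding one extra flag to the docker run command shown in the next section (512 MB is just an illustrative value; size the heap for your workload and host memory):
-e "OPENSEARCH_JAVA_OPTS=-Xms512m -Xmx512m"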
Running an OpenSearch Docker Container
To start an OpenSearch Docker container, run the following Docker command:
docker run -p 9200:9200 -p 9600:9600 -e "discovery.type=single-node" docker.io/opensearchproject/opensearch:latest
Let's break this down:
-p 9200:9200 - Exposes port 9200 from the container to the host for HTTP requests
-p 9600:9600 - Exposes port 9600, used by the bundled Performance Analyzer plugin
-e "discovery.type=single-node" - Configures OpenSearch to run in single-node mode
docker.io/opensearchproject/opensearch:latest - Uses the latest official OpenSearch image
This will pull the latest OpenSearch Docker image if not already present and start a container with ports exposed.
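One caveat: on newer image versions (OpenSearch 2.12 and later), the bundled security plugin requires a strong admin password at startup, so the command above will likely need one more flag or the container exits during startup (replace the placeholder with a password of your own):
-e "OPENSEARCH_INITIAL_ADMIN_PASSWORD=<strong-password>"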
Interacting with OpenSearch Docker via the API
With OpenSearch Docker running, you can now interact with it via the REST API. For example, to check the health status:
curl -X GET "localhost:9200/_cluster/health?pretty"
This will return a JSON response with the current health status and other info.
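Note that with the bundled security plugin enabled (the image default), the API is served over HTTPS with basic authentication, so the request looks more like the following; -k accepts the self-signed demo certificate, and the password is whatever you set via OPENSEARCH_INITIAL_ADMIN_PASSWORD:
curl -ku admin:<your-password> -X GET "https://localhost:9200/_cluster/health?pretty"
The plain HTTP form above works if the container was started with -e "DISABLE_SECURITY_PLUGIN=true", which is reasonable for local experimentation only.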
You can use the OpenSearch REST API to:
Index and manage data
Run searches
Perform analytics
Manage indexes and documents
Monitor cluster health
Take and restore snapshots
See the OpenSearch REST APIs for details.
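As a quick illustration, the commands below index one document and then run a match query against it. The index name my-index and the document fields are made up for this example; add the HTTPS URL and credentials shown earlier if the security plugin is enabled:
curl -X PUT "localhost:9200/my-index/_doc/1" -H 'Content-Type: application/json' -d '{"title": "Hello OpenSearch", "views": 42}'
curl -X GET "localhost:9200/my-index/_search?pretty" -H 'Content-Type: application/json' -d '{"query": {"match": {"title": "hello"}}}'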
Accessing OpenSearch Dashboards for Visualizations
Visualizations and analytics are provided by OpenSearch Dashboards, which runs as a separate container from its own image rather than inside the OpenSearch container.
Once a Dashboards container is running and pointed at OpenSearch (a sketch follows below), the web interface is available at:
http://localhost:5601
This allows creating visualizations, dashboards, exploring data and more via the browser.
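A minimal sketch of wiring the two containers together, assuming both share a user-defined Docker network and the demo security configuration; the network and container names (opensearch-net, opensearch, opensearch-dashboards) are just examples:
docker network create opensearch-net
docker run -d --name opensearch --network opensearch-net \
  -p 9200:9200 -p 9600:9600 \
  -e "discovery.type=single-node" \
  -e "OPENSEARCH_INITIAL_ADMIN_PASSWORD=<strong-password>" \
  docker.io/opensearchproject/opensearch:latest
docker run -d --name opensearch-dashboards --network opensearch-net \
  -p 5601:5601 \
  -e 'OPENSEARCH_HOSTS=["https://opensearch:9200"]' \
  docker.io/opensearchproject/opensearch-dashboards:latest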
Persisting Data with OpenSearch Docker
Any data indexed or stored in OpenSearch will be lost when the Docker container is removed. To persist OpenSearch data, mount a host directory when running the container:
docker run -v /my/custom/path:/usr/share/opensearch/data ...
This stores OpenSearch indexes and cluster state in the provided directory, so the data persists across container restarts and upgrades.
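A named Docker volume is an alternative to a host path and is often easier to manage; the volume name opensearch-data below is just an example:
docker volume create opensearch-data
docker run -p 9200:9200 -p 9600:9600 \
  -e "discovery.type=single-node" \
  -v opensearch-data:/usr/share/opensearch/data \
  docker.io/opensearchproject/opensearch:latest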
For production deployments, back this directory with fast persistent storage, such as SSDs, for optimal performance.
Running OpenSearch Docker in Production
For production environments, the following best practices are recommended:
Use a dedicated Docker network for inter-node communication
Replicate data across nodes for failover
Schedule periodic snapshots for backups (see the example at the end of this section)
Monitor container resource usage
Vertically scale nodes if needed by using more powerful hosts
Horizontally scale out with Docker replicas behind a load balancer
This provides high availability, failover, increased throughput and redundancy for mission-critical use.
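For example, snapshots are driven by the snapshot REST API. A rough sketch of registering a shared-filesystem repository and taking a snapshot follows; it assumes the cluster setting path.repo points at a directory backed by a mounted volume, and the repository name, snapshot name, and path are illustrative:
curl -ku admin:<your-password> -X PUT "https://localhost:9200/_snapshot/my-fs-repo" \
  -H 'Content-Type: application/json' \
  -d '{"type": "fs", "settings": {"location": "/usr/share/opensearch/snapshots"}}'
curl -ku admin:<your-password> -X PUT "https://localhost:9200/_snapshot/my-fs-repo/snapshot-1?wait_for_completion=true"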
Recap of Key OpenSearch Docker Concepts
To summarize, the key points about running OpenSearch on Docker covered in this guide:
Official Docker images available for OpenSearch and OpenSearch Dashboards
Customizable configs using environment variables
Exposes port 9200 for HTTP requests and port 9600 for the Performance Analyzer
Interact via the REST API or the OpenSearch Dashboards UI for analytics
Persist OpenSearch data by mounting host directory
Follow production best practices for mission-critical use
Conclusion
Containerizing OpenSearch with Docker simplifies deployment while providing portability across environments. Use this comprehensive tutorial to get up and running with your own OpenSearch Docker containers and take advantage of the powerful search and analytics capabilities.