Elasticsearch, Kibana for IoT Data Visualization and Analysis
Launch an Elasticsearch instance in the cloud, or run Elasticsearch on your own Linux system
The first step is to create an Elastic Cloud account. If you don't have one, you can register for a trial here (no credit card required). Once you log in, you can create a new deployment, choosing the size of the Elasticsearch instances you want to use.
Use Elasticsearch and Kibana in the cloud: Easiest way!
You can follow this link to create an instance using the free-trial service: https://www.elastic.co/docs/deploy-manage/deploy/elastic-cloud/create-an-elastic-cloud-hosted-deployment#ec-prepare-production

After launching the Elasticsearch instance, you can access Kibana on Elastic Cloud Hosted: https://www.elastic.co/docs/deploy-manage/deploy/elastic-cloud/access-kibana

Install Elasticsearch and Kibana on your own (hard unless you have a Linux system)
If you have a Linux system with Docker installed, you can use the Docker-based setup described below. Otherwise, follow this link to install Elasticsearch: https://www.elastic.co/docs/deploy-manage/deploy/self-managed/installing-elasticsearch

To install Kibana, follow this link: https://www.elastic.co/docs/deploy-manage/deploy/self-managed/install-kibana
Install Elasticsearch and Kibana using Docker on a Linux system (the quickest way)
1. Install Docker and Docker Compose. First remove any conflicting packages:

sudo apt remove $(dpkg --get-selections docker.io docker-compose docker-compose-v2 docker-doc podman-docker containerd runc | cut -f1)

Then add Docker's official GPG key and apt repository and install the Docker packages, following the commands in Docker's official installation guide: https://docs.docker.com/engine/install/ubuntu/

Type the commands one by one into the terminal; Docker will then be installed and running.
2. Execute:

curl -fsSL https://elastic.co/start-local | sh
Then Elasticsearch and Kibana will run!
Congrats, Elasticsearch and Kibana are installed and running in Docker!
Open your browser at http://localhost:5601
Username: elastic
Password:
Elasticsearch API endpoint: http://localhost:9200
API key:
IoT simulation and data visualization
Now you can access http://localhost:5601 to see the Kibana home page.
Now we use Python to simulate IoT data. First, install the elasticsearch library:

pip install elasticsearch
We write a Python script:

from elasticsearch import Elasticsearch
Then we fill in the ELASTIC_PASSWORD and run it.
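The full script is not shown above; here is a minimal sketch of what such a simulator could look like. It assumes the index name iot-waste-metrics and the field fill_level_pct, both of which are used later in Kibana; the bin IDs, extra fields, and 5-second interval are illustrative assumptions.

```python
import random
import time
from datetime import datetime, timezone

INDEX = "iot-waste-metrics"  # same name used later as the Kibana index pattern

def make_reading(bin_id: str) -> dict:
    """Simulate one smart-bin sensor reading (fields are assumptions)."""
    return {
        "@timestamp": datetime.now(timezone.utc).isoformat(),
        "bin_id": bin_id,
        "fill_level_pct": round(random.uniform(0, 100), 1),
        "temperature_c": round(random.uniform(-5, 40), 1),
    }

if __name__ == "__main__":
    # Requires `pip install elasticsearch` and a running local cluster.
    from elasticsearch import Elasticsearch

    # Fill in ELASTIC_PASSWORD with the password printed by start-local.
    es = Elasticsearch("http://localhost:9200",
                       basic_auth=("elastic", "ELASTIC_PASSWORD"))
    bins = [f"bin-{i:03d}" for i in range(1, 6)]  # five simulated smart bins
    while True:
        for b in bins:
            es.index(index=INDEX, document=make_reading(b))
        time.sleep(5)  # emit a batch of readings every 5 seconds
```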
Here we can see a lot of data arriving in Kibana:

Then we click Dashboards to create a data view:

When creating a data view, we fill in the Index Pattern with iot-waste-metrics, the index name used in the Python script.

We save the data view to Kibana.
Then we create a dashboard, as shown in the figure below:

We create a dashboard and then a visualization. After that, as shown in the figure below, we can drag fill_level_pct onto the center of the canvas to create a view.

On the right side, we pick Gauge and choose a data aggregation method: Median, Average, Maximum, etc.
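To make the choice of aggregation concrete: each method reduces all matching fill_level_pct values to the single number shown on the gauge. The sample values below are hypothetical.

```python
from statistics import mean, median

# Hypothetical fill-level readings (percent) for one smart bin
fill_levels = [12.5, 40.0, 55.0, 61.5, 90.0]

print(median(fill_levels))  # Median  -> 55.0
print(mean(fill_levels))    # Average -> 51.8
print(max(fill_levels))     # Maximum -> 90.0
```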
We can also see different fill levels for different smart bins, as shown below:

Further Exploration
Using this tool, we can choose Machine Learning from the left-hand menu:

Various functions are provided, and on top of them we can train our own customized models.
Other Python libraries for machine learning and deep learning are also highly encouraged, especially PyTorch and TensorFlow.
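As a toy illustration of what a custom model might do with this data, here is a least-squares sketch in plain Python (no ML library) that predicts how many hours until a bin reaches 100% full from hypothetical (hour, fill_level_pct) pairs; PyTorch or TensorFlow would take over for anything more realistic.

```python
def fit_line(xs, ys):
    """Ordinary least-squares fit: returns (slope, intercept)."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    slope = (sum((x - mx) * (y - my) for x, y in zip(xs, ys))
             / sum((x - mx) ** 2 for x in xs))
    return slope, my - slope * mx

# Hypothetical readings: the bin fills at roughly 10% per hour
hours = [0, 1, 2, 3, 4]
fill = [5.0, 15.0, 25.0, 35.0, 45.0]

slope, intercept = fit_line(hours, fill)
hours_until_full = (100 - intercept) / slope
print(round(hours_until_full, 1))  # -> 9.5
```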


