diff --git a/CHANGELOG.md b/CHANGELOG.md
new file mode 100644
index 0000000..6ed645e
--- /dev/null
+++ b/CHANGELOG.md
@@ -0,0 +1,29 @@
+# Change Log
+All notable changes to this project will be documented in this file.
+This project adheres to [Semantic Versioning](http://semver.org/).
+
+
+
+## [1.1.0] - 2019-06-20
+#### Added
+- Update ELK stack: v6.3.0 -> v7.1.1
+- Multiple modifications to the ingestor service:
+ * Move ingestor to `extensions` folder
+ * Modify VulntoES to record MAC addresses, if present
+ * Update ingestor container from python2.7 to python3.7
+ * Simplify the ingestor invocation: `docker-compose run ingestor`
+ * Minor refactoring to `VulntoES.py`
+#### Fixed
+- Time filter field (`time`) now available
+#### Removed
+- Remove extensions/logspout
+
+
+## [1.0.1] - 2018-10-17
+#### Fixed
+- Modify VulntoES to only ingest open ports
+
+
+## [1.0.0] - 2018-07-16
+#### Added
+- First Public Release
\ No newline at end of file
diff --git a/README.md b/README.md
index 88caa19..6e4980c 100644
--- a/README.md
+++ b/README.md
@@ -14,7 +14,7 @@ A full walkthrough that led me to this setup can be found at: [https://www.marco
```
❯ git clone https://github.com/marco-lancini/docker_offensive_elk.git
```
-2. Create the `_data` folder and ensure it is owned by your own user:
+2. Create the `_data` folder (if not present) and ensure it is owned by your own user:
```
❯ cd docker_offensive_elk/
❯ mkdir ./_data/
@@ -25,27 +25,8 @@ A full walkthrough that led me to this setup can be found at: [https://www.marco
docker-elk ❯ docker-compose up -d
```
4. Give Kibana a few seconds to initialize, then access the Kibana web UI running at: http://localhost:5601.
-5. During the first run, [create an index](#create-an-index).
-6. [Ingest nmap results](#ingest-nmap-results).
-
-
-### Create an Index
-
-1. Create the `nmap-vuln-to-es` index using curl:
-```bash
-❯ curl -XPUT 'localhost:9200/nmap-vuln-to-es'
-```
-2. Open Kibana in your browser ([http://localhost:5601](http://localhost:5601)) and you should be presented with the screen below:
-
-
-3. Insert `nmap*` as index pattern and press "_Next Step_":
-
-
-4. Choose "_I don't want to use the Time Filter_", then click on "_Create Index Pattern_":
-
-
-5. If everything goes well you should be presented with a page that lists every field in the `nmap*` index and the field's associated core type as recorded by Elasticsearch.
-
+5. Start [ingesting your nmap results](#ingest-nmap-results).
+6. During the first run, [create an index](#create-an-index).
@@ -55,7 +36,7 @@ In order to be able to ingest our Nmap scans, we will have to output the results
Once done with the scans, place the reports in the `./_data/nmap/` folder and run the ingestor:
```bash
-❯ docker-compose run ingestor ingest
+❯ docker-compose run ingestor
Starting elk_elasticsearch ... done
Processing /data/scan_192.168.1.0_24.xml file...
Sending Nmap data to Elasticsearch
@@ -64,3 +45,19 @@ Sending Nmap data to Elasticsearch
Processing /data/scan_192.168.3.0_24.xml file...
Sending Nmap data to Elasticsearch
```
+
+
+
+### Create an Index
+
+1. Open Kibana in your browser ([http://localhost:5601](http://localhost:5601)) and you should be presented with a screen similar to the one below:
+![elk_index_1](.github/elk_index_1.jpg)
+
+2. Insert `nmap*` as index pattern and press "_Next Step_":
+![elk_index_2](.github/elk_index_2.jpg)
+
+3. As the "_Time Filter field name_", choose `time`, then click on "_Create Index Pattern_":
+![elk_index_3](.github/elk_index_3.jpg)
+
+4. If everything goes well, you should be presented with a page that lists every field in the `nmap*` index and the field's associated core type as recorded by Elasticsearch.
+![elk_index_4](.github/elk_index_4.jpg)
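
As an aside to the patch above: the ingestor changes it describes (only open ports since v1.0.1, MAC addresses recorded when present since v1.1.0) can be sketched in miniature. This is a simplified illustration, not the actual `VulntoES.py` code; the `to_documents` helper and the output field names are assumptions, though the element names follow nmap's XML output format.

```python
# Simplified sketch of an nmap-XML -> Elasticsearch document transform.
# NOT the real VulntoES.py: to_documents() and the field names are
# illustrative assumptions; element names follow nmap's XML schema.
import xml.etree.ElementTree as ET


def to_documents(xml_text):
    """Yield one document per open port, with the host MAC if present."""
    root = ET.fromstring(xml_text)
    for host in root.findall("host"):
        # nmap emits one <address> per type: ipv4, ipv6, mac
        addrs = {a.get("addrtype"): a.get("addr")
                 for a in host.findall("address")}
        for port in host.findall("./ports/port"):
            state = port.find("state")
            if state is None or state.get("state") != "open":
                continue  # v1.0.1 fix: only ingest open ports
            service = port.find("service")
            doc = {
                "ip": addrs.get("ipv4"),
                "port": int(port.get("portid")),
                "protocol": port.get("protocol"),
                "service": service.get("name") if service is not None else None,
            }
            if "mac" in addrs:  # v1.1.0: record MAC addresses, if present
                doc["mac"] = addrs["mac"]
            yield doc


SAMPLE = """<nmaprun><host>
  <address addr="192.168.1.10" addrtype="ipv4"/>
  <address addr="AA:BB:CC:DD:EE:FF" addrtype="mac"/>
  <ports>
    <port protocol="tcp" portid="22"><state state="open"/><service name="ssh"/></port>
    <port protocol="tcp" portid="23"><state state="closed"/></port>
  </ports>
</host></nmaprun>"""

if __name__ == "__main__":
    for doc in to_documents(SAMPLE):
        print(doc)
```

In this sketch the closed port 23 is dropped and only the open SSH port survives, carrying the host's MAC address; the real ingestor would then send such documents to the `nmap-vuln-to-es` index in Elasticsearch.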