Summary
As mentioned in the previous post, the destinations for the retrieved information will be:
- A simple log
- An Elasticsearch node, for further data indexing and analysis.
How do we accomplish this? Very easily.
As we saw, during the project creation, we included the dependency spring-boot-starter-data-elasticsearch. This dependency will provide us with:
- A "template" API to perform REST operations against an ES node, without all the REST boilerplate.
- An embedded ES node, which works great for running tests and helps you understand ES.
Launching an ES node along with your Spring-Boot application:
If you want to launch an embedded ES node, you just need to:
- Enable it in your application.properties file by setting spring.data.elasticsearch.properties.http.enabled=true.
- Check that it is running properly with this curl command:
- $ curl -XGET 'http://localhost:9200/_cluster/health?pretty=true'
- Some cluster health statistics will be displayed (name, shards available, etc.)
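As a reference, the relevant configuration boils down to a single entry (property name taken from the step above; any other settings shown here would depend on your own setup):

```properties
# application.properties
# Expose the embedded Elasticsearch node over HTTP on the default port (9200)
spring.data.elasticsearch.properties.http.enabled=true
```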
Sending the information to ES
If you recall, our Spring Integration flow ended in a Service Activator. This component specified that @ELKClient.pushToELK() would be invoked with the payload of the received message (that is, the CSV quotation). Well, let's see the code used to perform the sending (which I think is pretty much self-explanatory).
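The original code embed is not shown here, so below is a minimal sketch of what such a service could look like, using the ElasticsearchTemplate provided by spring-boot-starter-data-elasticsearch. The class name and method signature follow the post; the Quote bean and its fromCsv() parsing helper are assumptions for illustration.

```java
import org.springframework.beans.factory.annotation.Autowired;
import org.springframework.data.elasticsearch.core.ElasticsearchTemplate;
import org.springframework.data.elasticsearch.core.query.IndexQuery;
import org.springframework.data.elasticsearch.core.query.IndexQueryBuilder;
import org.springframework.stereotype.Component;

@Component
public class ELKClient {

    // Template API from spring-boot-starter-data-elasticsearch:
    // handles the REST calls to the ES node without the boilerplate.
    @Autowired
    private ElasticsearchTemplate elasticsearchTemplate;

    // Invoked by the Service Activator with the message payload
    // (the CSV quotation line retrieved by the integration flow).
    public void pushToELK(String csvQuotation) {
        // Hypothetical helper that maps the CSV line to the document bean.
        Quote quote = Quote.fromCsv(csvQuotation);

        // Build an index request for the bean and send it to Elasticsearch;
        // index and type names come from the bean's @Document annotation.
        IndexQuery indexQuery = new IndexQueryBuilder()
                .withObject(quote)
                .build();
        elasticsearchTemplate.index(indexQuery);
    }
}
```

Because the template resolves the target index from the bean's annotations, the activator code stays free of any Elasticsearch-specific details beyond this one class.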
Building the document to be indexed
Finally, we just need to create a simple bean with some annotations specifying which fields are to be sent, analyzed, and stored, and in which format:
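Since the original bean listing is not reproduced here, this is a hedged sketch of such a document class using the spring-data-elasticsearch annotations of that era; the index name, type name, and fields are assumptions chosen to match a stock quotation.

```java
import java.util.Date;

import org.springframework.data.annotation.Id;
import org.springframework.data.elasticsearch.annotations.DateFormat;
import org.springframework.data.elasticsearch.annotations.Document;
import org.springframework.data.elasticsearch.annotations.Field;
import org.springframework.data.elasticsearch.annotations.FieldIndex;
import org.springframework.data.elasticsearch.annotations.FieldType;

// Maps this bean to an Elasticsearch index/type (names are illustrative).
@Document(indexName = "quotes", type = "quote")
public class Quote {

    @Id
    private String id;

    // Stored as-is, not analyzed: we want exact matches on the ticker symbol.
    @Field(type = FieldType.String, index = FieldIndex.not_analyzed)
    private String symbol;

    // Numeric field, so Kibana can aggregate and plot it.
    @Field(type = FieldType.Double)
    private Double price;

    // Date field with an explicit format, used as the time axis in Kibana.
    @Field(type = FieldType.Date, format = DateFormat.date_time)
    private Date timestamp;

    // Getters and setters omitted for brevity.
}
```

Marking the symbol as not_analyzed and the timestamp as a proper date field is what later lets Kibana group quotations by ticker and draw them over time.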
Next Steps
Once we run the application, it will load some files containing historical data and query the spreadsheets stored in Google Docs. Each entry will be converted to a Java object and sent to Elasticsearch. In the next post we will see how we can build some nice graphs using Kibana (part of the ELK stack) and the data we have indexed. One example (showing the historical data):
Moreover, we will see how we can deploy the micro-service, the Elasticsearch node and Kibana to the cloud, so we can continuously gather stock quotations and perform better analysis.