Thursday, 13 July 2017

ODAX September 2017 Iron Condor adjustment

Summary

The ODAX September 2017 Iron Condor reached its adjustment point on the Call side (30 delta) today, as the DAX has climbed more than 300 points since I opened the spread a week ago.

DAX index on 12/07/2017 (source: Investing)
This is a pretty bad scenario: a relatively quick move against you before much time has passed and with almost no Theta decay collected.
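For illustration, here is a minimal sketch of the 30-delta check in Scala; the deltas below are hypothetical values, not actual quotes:

// Minimal sketch of the 30-delta adjustment rule (hypothetical numbers).
case class IronCondor(shortCallDelta: Double, shortPutDelta: Double)

def sideToAdjust(ic: IronCondor, threshold: Double = 0.30): Option[String] =
  if (ic.shortCallDelta >= threshold) Some("call side")
  else if (math.abs(ic.shortPutDelta) >= threshold) Some("put side")
  else None

// A position like today's, after the DAX climb:
sideToAdjust(IronCondor(shortCallDelta = 0.30, shortPutDelta = -0.10)) // Some("call side")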

Tuesday, 11 July 2017

Trading ODAX stock options

Summary

In the unlikely case that you have followed this blog since its inception, you might have noticed that it contains different "learning notes" that I have compiled while learning different technologies and tools, all of them software related, which I wanted to share with anyone interested.

Apart from software development, I have other hobbies, one of them being stocks, derivatives and the markets in general. A while ago I started learning about stock options trading, and after some studying I began, at the start of 2017, placing actual trades using options on the DAX30 index (the index holding the 30 largest publicly traded German companies). They are called ODAX, by the way.

So today I would like to introduce you to the system I have followed for the better part of this year, and which I would like to polish as I continue learning about stock options trading. Feedback is always welcome!

Sunday, 7 May 2017

Zeppelin + Scala: Consuming a HTTP endpoint

Summary

In our last post, we discussed how to execute Spark jobs in Zeppelin and then create nice SQL queries and graphs using the embedded SQLContext (provided along with the usual Spark context).

Today, we will see a way to populate your Spark RDDs/DataFrames with data retrieved from an HTTP endpoint/REST service. We will focus on parsing the resulting JSON response. Additionally, we will learn how to import additional libraries into Zeppelin.
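As a preview, here is a minimal sketch of the core pattern; the URL is a placeholder, and sc and sqlContext are the contexts Zeppelin provides out of the box:

// Fetch the raw JSON from the endpoint (placeholder URL).
import scala.io.Source
val json = Source.fromURL("https://api.example.com/stocks").mkString

// Wrap the JSON string in an RDD and let Spark infer the schema.
val df = sqlContext.read.json(sc.parallelize(Seq(json)))
df.printSchema()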

Friday, 13 January 2017

Fast prototyping with Zeppelin, Spark & Scala

Summary


It's been a while since I wrote anything here (again), but I don't have much free time nowadays. Currently I'm taking some training on Scala and Spark, which, by the way, is what brings us here today.

A recipe for quick prototypes for Data Analysis: Scala + Spark + Zeppelin


If you remember well, I wrote some time ago about some personal learning projects I was working on, which basically picked stock price information from the Web (using Spring Integration) and ran a couple of Spark analyses whose results were later displayed in an AngularJS interface.

Nothing complicated at all, but rather verbose and time consuming to set up, especially if you just want to learn the subject.
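To give an idea of how lightweight the Zeppelin approach is by comparison, here is a hypothetical two-paragraph note doing a similar analysis; the file path and column layout are made up:

// Paragraph 1 (%spark): parse a CSV of stock prices and expose it to SQL.
import sqlContext.implicits._
val prices = sc.textFile("/tmp/prices.csv")
  .map(_.split(","))
  .map(f => (f(0), f(1).toDouble))
  .toDF("symbol", "close")
prices.registerTempTable("prices")

// Paragraph 2 (%sql): Zeppelin charts the result automatically.
// select symbol, avg(close) from prices group by symbol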



Sunday, 23 October 2016

How to set up e-mail notifications for Cron

Summary

Last week I found myself setting up some jobs in a Unix environment (CentOS 7) for which I wanted error email notifications, and I ran into some trouble getting the configuration right, so I would like to take the chance and share the steps I took in case anyone finds themselves in the same situation.

Please note that we are going to use Postfix as the mail service here!

Initial Crontab setup

This was the setup I was facing: some scheduled jobs that might fail occasionally, and one that will always fail because it is badly written:
$ crontab -l

# Daily at 15:30; may fail occasionally
30 15 * * * java -jar /path_to_your_jar_app/job.jar
# Daily at 13:45; may fail occasionally
45 13 * * * /path_to_some_script/script.sh
# Daily at 19:45; always fails (the command does not exist)
45 19 * * * obviously_wrong_command
I wanted to be notified by email of any failure (and as I was using Cron, the expected behavior was to get an email notification whenever a job failed; strictly speaking, Cron mails whatever output a job produces, and failing commands typically report their errors on stderr).

This is an example of an email that should be generated for the above setup:
Message  1:
From cron_user@homepc  Sun Oct 23 19:45:01 2016
Delivered-To: john.doe.the.sysadmin@mailservice.com
From: "(Cron Daemon)" 
Subject: Cron  obviously_wrong_command
Content-Type: text/plain; charset=UTF-8
Auto-Submitted: auto-generated
Precedence: bulk
X-Cron-Env: XDG_SESSION_ID=3289
X-Cron-Env: XDG_RUNTIME_DIR=/run/user/3001
X-Cron-Env: LANG=en_US.UTF-8
X-Cron-Env: MAIL_TO=admin_user_alias
X-Cron-Env: SHELL=/bin/sh
X-Cron-Env: HOME=/home/cron_user
X-Cron-Env: PATH=/usr/bin:/bin
X-Cron-Env: LOGNAME=cron_user
X-Cron-Env: USER=cron_user
Date: Sun, 23 Oct 2016 19:45:01 +0200 (CEST)
Status: RO

/bin/sh: obviously_wrong_command: command not found
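One detail worth noting in the headers above: the crontab sets MAIL_TO, but the variable Cron actually honours is MAILTO; anything else is simply passed through as an ordinary environment variable. A minimal sketch of the crontab with it set (the address is a placeholder):

MAILTO=john.doe.the.sysadmin@mailservice.com
30 15 * * * java -jar /path_to_your_jar_app/job.jar
45 13 * * * /path_to_some_script/script.sh
45 19 * * * obviously_wrong_command

With MAILTO in place, Cron hands any job output to the local mail system, which is where Postfix comes in.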
Keep reading if you want to get the same setup! 

Sunday, 16 October 2016

Introduction to Spring Cloud Dataflow (I)

Introduction

This is the first of a series of posts on how to develop data-driven microservices with Spring Cloud Dataflow (SCDF from now on). For now, we will see the approach proposed by this framework and how to build the basic components locally: Source, Sink and Processor.

Also, if you are familiar with Spring, we will take a look at the ready-made components that are available for you to use, so you don't have to reinvent the wheel.
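To give a flavour of the composition model before diving into the code: in the SCDF shell, wiring one of those ready-made components (an http source) into a file sink is a one-liner. A sketch, with an illustrative stream name:

dataflow:> stream create --name stockprices --definition "http | file" --deploy

Each piece of the pipe is an independent Spring Boot application, and messages travel between them over a binder such as Kafka (see the prerequisites in the contents below).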

Contents
  • What is Spring Cloud DataFlow?
  • Introduction to the API: A simple Producer (Source) / Consumer (Sink) setup.
  • Prerequisites: Kafka and Zookeeper.
  • Coding a simple Source: HTTP Poller that retrieves live stock prices.
  • Coding a simple Sink that stores the results in a file.
  • Writing an additional Batch Source.
  • Summary & Resources.

Sunday, 28 August 2016

How to clean up ElasticSearch with Curator 4.0.6

Summary

Today, I would like to share with you a quick introduction to a tool that keeps your ElasticSearch cluster clean and free from old data: Curator (thanks flopezlasanta for the tip on this!).

This tool, along with ElasticSearch itself, evolves very quickly: when I was looking for information on the subject and found this blog entry from 2014, I noticed how much the way of working with the tool had changed. Anyway, kudos to @ragingcomputer!

Installation instructions

Before you install Curator, you need the Python package installer (pip), so Python is another requirement. Note that if you are running Python 3.4 or newer (or 2.7.9 or newer on the Python 2 line), pip already comes bundled with Python. More info here.

Note: You need to be super user to perform the installation.
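With pip in place, the installation itself is a single command (pinning the version from the title; check the official docs for the current release):

$ sudo pip install elasticsearch-curator==4.0.6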

And you are ready to go! Now let's check the other two files needed to make it work.