The loss function is straightforward to implement: it is simply the pinball (quantile) loss function.
import theano.tensor as tt  # `tt` is assumed to be theano.tensor, matching the Theano-backend usage here

def tilted_loss(q, y, f):
    e = y - f
    # Positive errors are weighted by q and negative errors by (q - 1);
    # averaging over the batch gives the pinball loss.
    return (q * tt.sum(e) - tt.sum(e[(e < 0).nonzero()])) / e.shape[0]
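To see what the loss is doing, here is a plain-NumPy sketch of the same pinball loss in its equivalent max form (the toy targets and predictions below are made up for illustration):

```python
import numpy as np

def pinball_loss(q, y, f):
    """NumPy version of the pinball loss, for checking values offline.

    Under-predictions (e > 0) are weighted by q,
    over-predictions (e < 0) by (1 - q).
    """
    e = y - f
    return np.mean(np.maximum(q * e, (q - 1) * e))

y = np.array([1.0, 2.0, 3.0])   # hypothetical targets
f = np.array([1.5, 1.5, 1.5])   # hypothetical predictions
print(pinball_loss(0.5, y, f))  # symmetric: half the mean absolute error
print(pinball_loss(0.9, y, f))  # penalises sitting below the targets more
```

At q = 0.5 the loss reduces to half the mean absolute error, which is why minimising it yields the median.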
When compiling the neural network, simply pass the loss in:
model.compile(loss=lambda y,f: tilted_loss(0.5,y,f), optimizer='adagrad')
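Training one model per quantile is the idea behind the compile call above. As a self-contained sketch of that idea (a plain-NumPy linear model trained by subgradient descent stands in for the Keras model here; the synthetic data and learning-rate settings are assumptions for illustration):

```python
import numpy as np

rng = np.random.default_rng(0)
x = rng.uniform(0, 1, 500)
y = 2 * x + rng.normal(0, 0.3, 500)  # synthetic data: line plus symmetric noise

def fit_quantile(q, lr=0.05, epochs=3000):
    """Fit w*x + b to the q-th quantile by subgradient descent on the pinball loss."""
    w, b = 0.0, 0.0
    for _ in range(epochs):
        e = y - (w * x + b)
        # Subgradient of the pinball loss: q where e > 0, q - 1 where e < 0.
        g = np.where(e > 0, q, q - 1)
        w += lr * np.mean(g * x)
        b += lr * np.mean(g)
    return w, b

# One fitted line per quantile, analogous to compiling one Keras model per q.
params = {q: fit_quantile(q) for q in (0.1, 0.5, 0.9)}
predict = lambda q, xv: params[q][0] * xv + params[q][1]
```

After fitting, the 0.9 line should sit above the 0.5 line, which sits above the 0.1 line, giving a prediction interval around the median.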
I chose 0.5, which is the median, but you can try whichever quantile you are after. A word of caution that applies to any quantile regression method: the output may be extreme or unexpected when you take extreme quantiles (e.g. 0.001 or 0.999).
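The instability of extreme quantiles is easy to demonstrate even without a neural network: with 500 samples, the 0.999 quantile is effectively determined by the single largest observation, so its estimate fluctuates far more across resamples than the median does. A small sketch (sample sizes and seed are arbitrary choices):

```python
import numpy as np

rng = np.random.default_rng(1)

# Estimate the median and the 0.999 quantile from 200 independent
# samples of 500 standard-normal draws each.
med_est = [np.quantile(rng.normal(size=500), 0.5) for _ in range(200)]
ext_est = [np.quantile(rng.normal(size=500), 0.999) for _ in range(200)]

print("std of median estimates:        ", np.std(med_est))
print("std of 0.999-quantile estimates:", np.std(ext_est))
```

The spread of the extreme-quantile estimates dwarfs that of the median, which is exactly the behaviour to expect from a quantile-regression network asked for q = 0.001 or 0.999.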
A more complete working example can be found here: https://github.com/sachinruk/KerasQuantileModel/blob/master/Keras%20Quantile%20Model.ipynb