mheimann / SVMSGDWrapper / 0.1.0

This is a wrapper function for a linear SVM trained with stochastic gradient descent (SGD), designed to be passed into Bayesian optimization. It takes an array of four hyperparameters: a regularization coefficient, an initial learning rate, a weight decay parameter, and the number of SGD iterations to perform. The data, a collection of SMS messages to be classified as "spam" or "ham", is supplied manually and split into training, validation, and test sets (only the training and validation sets are used here). A model is trained with the given hyperparameters on the training data, and the output is the classification error on the validation data.
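A minimal sketch of such a wrapper, using scikit-learn's `SGDClassifier` with hinge loss as the linear SVM. The function name `svm_sgd_wrapper` and the mapping of the four hyperparameters onto `alpha`, `eta0`, `power_t`, and `max_iter` are assumptions for illustration; the actual wrapper may parameterize the model differently.

```python
import numpy as np
from sklearn.linear_model import SGDClassifier

def svm_sgd_wrapper(params, X_train, y_train, X_val, y_val):
    """Train a linear SVM with SGD and return validation error.

    params: array-like of four hyperparameters, assumed here to be
      [regularization coefficient, initial learning rate,
       learning-rate decay exponent, number of SGD iterations].
    """
    alpha, eta0, power_t, n_iter = params
    clf = SGDClassifier(
        loss="hinge",                 # hinge loss => linear SVM
        alpha=alpha,                  # regularization coefficient
        learning_rate="invscaling",   # eta = eta0 / t**power_t
        eta0=eta0,                    # initial learning rate
        power_t=power_t,              # decay exponent (stands in for weight decay)
        max_iter=int(n_iter),
        tol=None,                     # run for exactly max_iter epochs
        random_state=0,
    )
    clf.fit(X_train, y_train)
    # Bayesian optimization minimizes, so return validation error, not accuracy.
    return 1.0 - clf.score(X_val, y_val)
```

A Bayesian optimization loop would repeatedly call this function with candidate hyperparameter arrays and use the returned validation error as the objective value to minimize.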

This algorithm is intended as a demonstration for Bayesian optimization. Stochastic gradient descent shines most on larger-scale data, but each job here finishes quickly enough that Bayesian optimization with a "reasonable" number of jobs completes in a "reasonable" amount of time.