The following code learns general kernel combinations by gradient descent optimization built on top of standard SVM solvers. The code is in Matlab and uses LIBSVM as the internal SVM solver. The code is quite flexible: you can plug in your own kernel function and regularizer, and I've included example functions that compute sums and products of base kernels subject to l1 or l2 regularization. It is also possible to use precomputed base kernel matrices. The code computes a single common set of kernel parameters across the entire training set, and it is fairly straightforward to modify it to compute a different set of kernel parameters per training data point.
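To make the kernel combinations concrete, here is a minimal sketch (in Python/NumPy rather than the Matlab release, purely for illustration) of the two example combinations described above: a weighted sum of base kernels and an elementwise product of exponentiated base kernels, together with l1 and l2 regularizers on the weights. The function names and signatures are my own and do not correspond to the released Matlab files.

```python
import numpy as np

def rbf_kernel(X, gamma):
    # Gaussian base kernel: K_ij = exp(-gamma * ||x_i - x_j||^2)
    sq = np.sum(X**2, axis=1)
    d2 = sq[:, None] + sq[None, :] - 2.0 * X @ X.T
    return np.exp(-gamma * d2)

def sum_kernel(base_kernels, d):
    # K(d) = sum_k d_k * K_k  (linear combination of base kernels)
    return sum(dk * Kk for dk, Kk in zip(d, base_kernels))

def prod_kernel(base_kernels, d):
    # K(d) = prod_k K_k ** d_k  (elementwise product; for RBF bases
    # this amounts to a weighted sum of distances inside the exponent)
    K = np.ones_like(base_kernels[0])
    for dk, Kk in zip(d, base_kernels):
        K = K * Kk**dk
    return K

def l1_reg(d, sigma):
    # r(d) = sigma' d over d >= 0; gradient is simply sigma
    return sigma @ d, sigma

def l2_reg(d):
    # r(d) = 0.5 * ||d||^2; gradient is d
    return 0.5 * d @ d, d
```

Each regularizer returns both its value and its gradient, which is the pair the outer gradient-descent loop needs.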
The function that performs the gradient descent is COMPGDoptimize.m, while GMKLwrapper.m shows how to call it. Please go through the comments in both files before running the code. By setting the parameters in GMKLwrapper you can tackle various problems, such as classification and regression.
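The overall optimization alternates between the two levels described above: for the current kernel weights, solve a standard SVM; then take a gradient step on the weights using the SVM dual solution. The sketch below (Python/NumPy, illustration only; it does not reproduce COMPGDoptimize.m) shows this for the sum-of-kernels case with an l1 regularizer, using a deliberately crude projected-gradient dual solver in place of LIBSVM and dropping the dual equality constraint to keep things short.

```python
import numpy as np

def svm_dual_solve(K, y, C, iters=500):
    # Crude stand-in for an SVM solver: projected gradient ascent on the
    # C-SVM dual  max_a 1'a - 0.5 a'Qa  s.t. 0 <= a <= C, with Q_ij =
    # y_i y_j K_ij. (The constraint y'a = 0 is omitted in this sketch.)
    Q = (y[:, None] * y[None, :]) * K
    a = np.zeros(len(y))
    lr = 1.0 / (np.linalg.norm(Q) + 1.0)
    for _ in range(iters):
        a = np.clip(a + lr * (1.0 - Q @ a), 0.0, C)
    obj = a.sum() - 0.5 * a @ Q @ a
    return a, obj

def gmkl_sum(base_kernels, y, C=1.0, sigma=1.0, steps=50, eta=0.1):
    # Outer gradient descent on kernel weights d >= 0 with an l1
    # regularizer sigma'd. For K(d) = sum_k d_k K_k, the gradient of the
    # regularized dual objective w.r.t. d_k at the optimal alpha is
    # sigma_k - 0.5 * (a*y)' K_k (a*y).
    d = np.ones(len(base_kernels))
    for _ in range(steps):
        K = sum(dk * Kk for dk, Kk in zip(d, base_kernels))
        a, _ = svm_dual_solve(K, y, C)
        ay = a * y
        grad = np.array([sigma - 0.5 * ay @ Kk @ ay for Kk in base_kernels])
        d = np.maximum(d - eta * grad, 0.0)  # project back onto d >= 0
    return d
```

In the actual release the inner solve is handled by LIBSVM, which is why the modified LIBSVM below returns the dual objective value: it is exactly the quantity the outer gradient step differentiates.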
This code is made available as is for non-commercial research purposes. Please make sure you go through the licenses. The LIBSVM source has been modified to return the value of the objective function at the optimum.

Download GMKL code in Matlab
Please send all feedback to Manik Varma [firstname.lastname@example.org].