New general-purpose optimization algorithm promises order-of-magnitude speedups (news.mit.edu)
126 points by _pius on Oct 23, 2015 | 17 comments


Very impressive work, here's the arxiv link: http://arxiv.org/abs/1508.04874


Thanks! Even just the abstract is a much better overview than the vague article.


In an interesting coincidence, I have another of their papers in my hand, and it's equally impressive (I'm using it to improve a related solver in econometrics).

One problem, though, is that the constant terms of these methods may take a while to come down (i.e., don't expect to see this in production anywhere for at least five years), but it's still great to see progress there.


Can you elaborate a bit on the applicability of those methods to general optimization? Are they significantly better than [stochastic] gradient descent or saddle-free Newton's method? For what kinds of landscapes/dimensions do they do well?


Well, the paper of theirs that I'm reading is about solving systems of linear equations (Ax = b) where the matrix has a special form (diagonally dominant), which is itself a huge contribution.

It also uses a similar idea to the convex optimization paper (find the "dual" problem and try to solve it), so I would guess it has similar issues:

- Very, very, VERY complex to code, compared to e.g. PCG (a baseline PCG sketch follows below).

- Although it is asymptotically faster, the constant terms that you have to amortize might be very large. For instance, imagine you could sort in O(n) instead of O(n log n), but the constant in the first case were a thousand times larger than in the second.

All in all, I am very interested in it, but it will take some valiant effort to implement efficiently.
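
For reference, here's roughly what the PCG baseline looks like in a few lines of Python with SciPy. This is the conventional solver, not the paper's algorithm; the SDD matrix and the simple Jacobi preconditioner are toy choices:

    import numpy as np
    from scipy.sparse import diags
    from scipy.sparse.linalg import cg, LinearOperator

    n = 1000
    # Symmetric diagonally dominant tridiagonal matrix (1-D Laplacian + I).
    A = diags([-1.0, 3.0, -1.0], [-1, 0, 1], shape=(n, n), format="csr")
    b = np.random.rand(n)

    # Jacobi (diagonal) preconditioner: M^{-1} v = v / diag(A).
    d = A.diagonal()
    M = LinearOperator((n, n), matvec=lambda v: v / d)

    x, info = cg(A, b, M=M)           # info == 0 means it converged
    print(np.linalg.norm(A @ x - b))  # residual norm

A dozen lines, and every step is standard; that's the bar the new method has to clear in implementation effort.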


Thanks. I don't have much knowledge of the field; I was wondering what important problems involve large linear systems and/or convex optimization. Here's an excerpt from the wiki if anyone's interested:

"Convex minimization has applications in a wide range of disciplines, such as automatic control systems, estimation and signal processing, communications and networks, electronic circuit design, data analysis and modeling, statistics (optimal design), and finance."

I wonder if we could use convex optimization techniques to solve non-convex problems (and occasionally jump out of local minima traps?)
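
The crude version of that idea is just restarting a local solver from random points and keeping the best result. A toy sketch, with a made-up objective and SciPy's default local method:

    import numpy as np
    from scipy.optimize import minimize

    def f(x):  # non-convex: many local minima on top of a convex bowl
        return np.sin(3 * x[0]) + 0.1 * x[0] ** 2

    rng = np.random.default_rng(0)
    best = None
    for _ in range(20):
        x0 = rng.uniform(-10, 10, size=1)  # random restart
        res = minimize(f, x0)              # local descent (BFGS by default)
        if best is None or res.fun < best.fun:
            best = res
    print(best.x, best.fun)

Fancier approaches solve a sequence of convex subproblems (relaxations, majorize-minimize), but random restarts are the cheapest way to "jump out" of a local minimum.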


Maybe Spiral can automate much of this work?

http://spiral.net/

http://www.spiralgen.com/


Seems interesting, but after spending 5 minutes on the site I have no idea how to set it up or use it.


I don't think they have a free version.


These are algorithms for convex optimization, whereas stochastic gradient descent and saddle-free Newton are generally used for non-convex optimization (neural nets in particular).
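
For concreteness, here is the same SGD update rule that trains a neural net, applied instead to a convex least-squares loss, where it settles near the global optimum (toy data, made up for illustration):

    import numpy as np

    rng = np.random.default_rng(0)
    X = rng.normal(size=(500, 5))
    w_true = np.arange(1.0, 6.0)
    y = X @ w_true + 0.01 * rng.normal(size=500)

    w = np.zeros(5)
    for step in range(10_000):
        i = rng.integers(500)            # pick one sample at random
        grad = (X[i] @ w - y[i]) * X[i]  # gradient of 0.5 * (x.w - y)^2
        w -= 0.01 * grad
    print(np.round(w, 2))  # recovers roughly [1, 2, 3, 4, 5]

On a convex loss like this, any descent direction leads to the one global minimum; on a neural net the same update navigates a landscape full of saddle points and local minima, which is why the convex-optimization guarantees don't carry over.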


A very cool paper, especially how they are able to improve other algorithms. Each of those improvements could have been a paper by itself.


Wow, that is great news. Solving optimisation problems is one of the true benefits of IT that few people know about. From warehousing to routing, we use it every day in logistics and supply chain.

In finance and accounting, it is gradually spreading beyond just the big players.
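
A taste of what that looks like in practice: a tiny transportation problem (ship from two warehouses to three stores at minimum cost) posed as a linear program. All the numbers below are invented:

    import numpy as np
    from scipy.optimize import linprog

    cost = [4, 6, 9, 5, 3, 8]   # per-unit shipping cost, flattened 2x3
    A_ub = [[1, 1, 1, 0, 0, 0],  # warehouse 1 ships at most its stock
            [0, 0, 0, 1, 1, 1]]  # warehouse 2 likewise
    b_ub = [40, 30]
    A_eq = [[1, 0, 0, 1, 0, 0],  # each store gets exactly its demand
            [0, 1, 0, 0, 1, 0],
            [0, 0, 1, 0, 0, 1]]
    b_eq = [20, 25, 25]

    res = linprog(cost, A_ub=A_ub, b_ub=b_ub, A_eq=A_eq, b_eq=b_eq)
    print(res.x.reshape(2, 3), res.fun)  # shipment plan and total cost

Real warehouse and routing instances have the same shape, just with millions of variables, which is exactly where asymptotic improvements start to matter.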


Are there implementations available, yet?


I would bet not at all, and there probably won't be for a few years (if past experience is a good predictor).


Yup, and then someone will implement it in C. Then in a few months you'll have it in every popular language and many libraries.

People underestimate how much a reference implementation contributes to the real-world proliferation of an algorithm.


Did anyone else think it was strange, in an article about runtime efficiency, for them to say that problem-specific optimisations promise speedups of "several orders of magnitude"? Because, well, that doesn't change the order of the algorithm...


It's common to refer to speedups in powers of ten as orders of magnitude. Maybe that's it?
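
To put toy numbers on it: both cost models below are O(n log n), yet the second is a thousand times (three orders of magnitude) faster in wall-clock terms at every problem size. The constants are hypothetical:

    from math import log2

    for n in (10**3, 10**6, 10**9):
        slow = 1000 * n * log2(n)  # hypothetical large constant factor
        fast = n * log2(n)         # same big-O order, smaller constant
        print(n, slow / fast)      # 1000x regardless of n

So a "several orders of magnitude" claim is perfectly consistent with the asymptotic order staying the same.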



