Arguing that closures are great because they let you replace for loops is like saying the iPhone is great because it has a better texting interface than a Nokia 3310: it's entirely missing the point.
What makes closures great is that they open up a whole new way of coding. Two things specifically make this possible: they can be treated as data[1], and they can capture local variables.
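To make that concrete, here's a tiny illustrative Ruby snippet (not from the gist below) showing both properties at once: the lambda is a value you can store and pass around, and it captures the local variable greeting.

greeting = "Hello"
greet = ->(name) { "#{greeting}, #{name}!" }  # a closure, stored in a variable like any other value
puts greet.call("world")                      # => "Hello, world!"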
Closures are, for example, what makes the entire premise of OS X's Grand Central Dispatch framework possible. You can create a closure and seamlessly execute it on a background thread, on a different core. You can build complex queues and dependencies between operations and use all cores in parallel, while keeping a very simple interface to the whole system.
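GCD itself lives in Objective-C/C, but the pattern translates directly. Here's a rough Ruby sketch of the same idea (an analogy, not GCD's actual API): closures get pushed onto a queue as plain data, and worker threads pull them off and execute them in the background.

work_queue = Queue.new
results    = Queue.new

# Worker threads pull closures off the queue and run them concurrently.
workers = 4.times.map do
  Thread.new do
    while (job = work_queue.pop)
      job.call
    end
  end
end

# Each unit of work is just a closure, capturing whatever locals it needs.
(1..10).each do |n|
  work_queue.push(-> { results.push(n * n) })
end

10.times { puts results.pop }
workers.size.times { work_queue.push(nil) }   # tell the workers to stop
workers.each(&:join)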
Here's an example pulled from my toying around in Ruby, trying to implement a machine learning algorithm.
thetas = gradient_descent 0.1, 10 do |theta0, theta1|
  J theta0, theta1, training_set
end
This code calls a gradient_descent function and passes it a closure in order to find the values of theta0 and theta1 that minimize the cost function. gradient_descent automatically figures out how many arguments to minimize based on the closure's `arity`. The closure captures the `training_set` variable and uses its arguments to call the cost function `J`. This would not have been possible, or nearly as easy, without closures. Peruse the entire code, all under 60 lines, at [2].
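For the curious, here's a hedged sketch of roughly how such a gradient_descent could use the block's arity. This is an illustration, not the code from the gist; I'm assuming the two positional arguments are a learning rate and an iteration count, and I'm estimating the partial derivatives numerically.

def gradient_descent(alpha, iterations, &cost)
  n = cost.arity                     # how many parameters the closure expects
  thetas = Array.new(n, 0.0)         # start every theta at zero
  epsilon = 1e-6

  iterations.times do
    gradients = (0...n).map do |i|
      up   = thetas.dup; up[i]   += epsilon
      down = thetas.dup; down[i] -= epsilon
      (cost.call(*up) - cost.call(*down)) / (2 * epsilon)  # numerical partial derivative
    end
    thetas = thetas.each_with_index.map { |t, i| t - alpha * gradients[i] }
  end

  thetas
end

The point is that first line of the body: the function never needs to be told how many thetas there are, because the closure itself carries that information.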
Closures open up a whole new world of possibilities.
[1]: Most languages cannot serialize a closure, though. Lisp is pretty unique there.
[2]: https://gist.github.com/48a7006e138460b65173