Doesn't the graph on that site show that electric trucks would be a huge win for lower noise in the city? I would be interested in the standard deviations of these line plots, because I would guess that engine noise varies much more with the truck's acceleration. I am not much annoyed by a truck driving at constant speed, but much more by one accelerating hard. And maybe 9 out of 10 trucks passing by are just driving at constant speed, but the 10th, the one that is accelerating, will be the annoying one.
Could you please elaborate on this? I would really like to know whether autoencoders are still useful for classification if I have labels for only a small part of my training data. Is unsupervised pretraining still useful, or has it been completely replaced by other techniques, as the article seems to suggest?
A single-layer autoencoder with n nodes is equivalent to doing PCA and taking the first n principal components. If you're familiar with PCA in natural language processing, where it is called Latent Semantic Analysis (or Indexing), you know that projecting high-dimensional data onto a lower-dimensional surface can actually improve your features. This is because similar words project onto the same principal component, allowing you to capture some semantic information.
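Here is a quick sketch of that effect with scikit-learn, whose TruncatedSVD is the usual way to do LSA; the toy corpus and component count are made up for illustration:

```python
from sklearn.feature_extraction.text import TfidfVectorizer
from sklearn.decomposition import TruncatedSVD

# Toy corpus: "car" and "automobile" never co-occur in a document, yet LSA
# should map their documents close together because they share neighbors
# like "engine".
docs = [
    "the car engine roared on the road",
    "the automobile engine needs oil",
    "fresh bread from the bakery oven",
    "the bakery sells bread and cakes",
]

X = TfidfVectorizer().fit_transform(docs)   # docs x terms matrix
lsa = TruncatedSVD(n_components=2).fit(X)   # keep 2 latent "topics"
X_low = lsa.transform(X)                    # each doc as a 2-d point
print(X_low)  # the two engine docs should land near each other, likewise the bakery docs
```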
Autoencoders with more than one layer are more interesting because you end up doing what is essentially non-linear PCA, projecting your data onto a curved manifold. Hinton's famous paper, "Reducing the Dimensionality of Data with Neural Networks" [0], shows how much more linearly separable documents become once multi-layer autoencoders are used.
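A minimal sketch of the idea in PyTorch; the layer sizes, training loop, and random stand-in data are my own placeholders, not the paper's setup:

```python
import torch
import torch.nn as nn

# The encoder squeezes a 784-d input through progressively narrower layers
# down to a 2-d code; the decoder mirrors it. The nonlinearities are what
# make this "non-linear PCA" rather than plain PCA.
encoder = nn.Sequential(
    nn.Linear(784, 256), nn.ReLU(),
    nn.Linear(256, 64),  nn.ReLU(),
    nn.Linear(64, 2),                 # the low-dimensional code
)
decoder = nn.Sequential(
    nn.Linear(2, 64),   nn.ReLU(),
    nn.Linear(64, 256), nn.ReLU(),
    nn.Linear(256, 784),
)
autoencoder = nn.Sequential(encoder, decoder)

opt = torch.optim.Adam(autoencoder.parameters(), lr=1e-3)
loss_fn = nn.MSELoss()
x = torch.rand(128, 784)              # stand-in for a real unlabeled batch

for _ in range(100):                  # train on reconstruction error only
    opt.zero_grad()
    loss = loss_fn(autoencoder(x), x)
    loss.backward()
    opt.step()

with torch.no_grad():
    codes = encoder(x)                # 2-d embeddings, analogous to the paper's plots
```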
The old argument was that unsupervised pretraining helps you get to good weights faster, but this has largely been disproven. However, I do believe AEs assist in semi-supervised learning because they project the initial data into a more useful space. As you can see in the paper I linked, the projected data are much more linearly separable.
And as practical evidence: I used a 5-layer AE in the Kaggle Black Box competition [1] to eventually outrank a team of Hinton's grad students. The problem had a large unlabeled data set and only a small number of labels. Using the autoencoders before the MLP ended up nearly doubling our team's score.
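The recipe looks roughly like this; a sketch only, with made-up shapes and hyperparameters rather than our actual competition code:

```python
import torch
import torch.nn as nn

# 1) Pretrain an autoencoder on ALL data, labeled and unlabeled alike.
encoder = nn.Sequential(nn.Linear(100, 32), nn.ReLU(), nn.Linear(32, 8))
decoder = nn.Sequential(nn.Linear(8, 32), nn.ReLU(), nn.Linear(32, 100))
ae = nn.Sequential(encoder, decoder)

x_all = torch.rand(5000, 100)            # mostly unlabeled examples
opt = torch.optim.Adam(ae.parameters(), lr=1e-3)
for _ in range(50):
    opt.zero_grad()
    loss = nn.functional.mse_loss(ae(x_all), x_all)
    loss.backward()
    opt.step()

# 2) Train the classifier on top of the pretrained encoder, using only the
#    small labeled subset.
x_lab = x_all[:200]                      # the few examples we have labels for
y_lab = torch.randint(0, 10, (200,))     # stand-in class labels
clf = nn.Sequential(encoder, nn.Linear(8, 10))   # encoder weights are reused
opt = torch.optim.Adam(clf.parameters(), lr=1e-4)
for _ in range(50):
    opt.zero_grad()
    loss = nn.functional.cross_entropy(clf(x_lab), y_lab)
    loss.backward()
    opt.step()
```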
Thank you for the answer. That makes a lot of sense.
Just a side note: as far as I know, a single-layer autoencoder and PCA are only equivalent if all units have a linear activation function (i.e., no nonlinearity), which is usually not the case.
Don't you think nginx is better for serving static files? I first served static files directly from node with Express and ran a test in Chrome with a simulated modem connection speed. A simultaneous fast client was blocked because the slow modem connection, fetching a lot of JS files, exhausted the connections node.js/Express would handle...
I have also read many other blogs suggesting serving static files with nginx in front of node.js. Are you sure this does not make sense?
It's not so much a matter of 'better' vs 'worse' as a balance between complexity and speed. node has very fast IO - that's what it's known for - and unlike Python or Ruby it can handle static files at speed: http://i.stack.imgur.com/amngX.png (from the second post at http://stackoverflow.com/questions/9967887/node-js-itself-or... which provides some excellent discussion). nginx is faster, but small projects that don't need load balancing may benefit from having less software to install.
As the article mentions, you'll probably want a load balancer for high availability. Whether you use haproxy or nginx for that is a whole different discussion.
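For anyone who does want nginx in front of node, a minimal sketch of the usual split; the paths, domain, and port here are placeholders to adjust for your setup:

```nginx
# nginx serves static assets straight from disk and proxies everything
# else to a node process listening on port 3000.
server {
    listen 80;
    server_name example.com;

    # Static files never touch node.
    location /static/ {
        root /var/www/myapp;    # i.e. files live under /var/www/myapp/static/
        expires 7d;             # let clients cache aggressively
    }

    # All remaining requests go to the node app.
    location / {
        proxy_pass http://127.0.0.1:3000;
        proxy_set_header Host $host;
        proxy_set_header X-Forwarded-For $proxy_add_x_forwarded_for;
    }
}
```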
Faster than Ruby or Python doesn't mean anything if it's objectively slow. And it is. I've only been able to serve a small handful of concurrent users at a time with Express. I'm talking 5 or 6 before things get out of hand.
You should aim to have the bulk of your payload be static, and all of the static payload be served by apache or nginx. It's the difference between getting good feedback from a Show HN post and completely missing out and looking like a fool.
> I've only been able to serve a small handful of concurrent users at a time with Express.
Something's massively wrong with your node setup. node won't be as fast, but it should be on the same order of magnitude as nginx: https://github.com/observing/balancerbattle (or any other benchmark)
Yes. Also, if several nations have colonies on Mars, I would assume they would all claim independence at roughly the same time and soon after form some kind of pan-Martian government.
It just makes sense to increase collaboration between colonies when you are that far away from Earth. If there were a US colony and a Chinese colony on Mars, those people would probably interact more with each other than with any person or nation on Earth. Exchanging goods between colonies has huge advantages, since then not every colony has to produce every specialized item or mine every kind of resource.
Thank you for the link.
For other people interested in a comparison, there is a very nice chart on Wikipedia that compares the solar-cell efficiency of many different technologies and how they have improved over recent years:
https://en.wikipedia.org/wiki/Solar_cell_efficiency
And touchpad designers too. I have a Lenovo ThinkPad Yoga and the touchpad is unusable. It has a mechanical click where the whole touchpad sinks down, and while it does so the cursor moves slightly. That makes it hard to click precisely on anything when the cursor is moving during the click. They probably never tested this before deciding to put it into a ThinkPad, or they tested it and nobody cared to hear what the testers said...
Worse. They market-tested it on their best laptop (the X1 Carbon). It flunked in the market, with the 2nd-gen X1 receiving appalling reviews. The 3rd-generation X1 Carbon reverted to the old touchpad and its marvelous ThinkPad mouse buttons.
I am wondering whether this centralized infrastructure for financial news is actually a good idea. This could happen again and again: all the employees at these news companies get access to a mass of insider information which they could sell.
Wouldn't a decentralized news publishing service be a better alternative? Couldn't a company's CEO publish their financial news only on the company's own website on the given publication date? Why do these announcements need to be stored in some central news database days before their publication date?
And I mean these as honest questions, because I really have no idea what the advantage would be.
And another related question: wouldn't it make sense, with today's Internet infrastructure, to shorten the interval between earnings reports? Maybe it could even become something like continuous, automatic publishing of company financials: whenever some figure changes, it is published immediately. That way all investors would at all times have the same information as the insiders, so everyone would be on the same level. Of course, some extraordinary news like mergers or acquisitions might still give insider information to the people preparing the deal, but at least the quarterly earnings could not be insider information.
> Why is it necessary for these news to be stored in some central news database
Amusingly it's because of the hedge funds. They want to have a limited list of places to check for news to make sure no one gets there ahead of them.
This is why they were so upset when the Netflix CEO made something public on Facebook -- because they weren't watching his Facebook page for news (but they sure are now!).
Because of this, the SEC actually allows only a very limited set of places where you can release financial news.
I mean, hedge funds have the resources to monitor thousands of websites. Joe Retail Investor doesn't. Reg FD was introduced to level the playing field.
I was under the impression that it was because financial news is an extremely sensitive business. Twitter has the power to move the market now, and we already know the damage simple, fake websites can do[0]. There's a lot of power in publishing public financial information (as the article demonstrates) and that is why we have regulation. People need confidence in their information if they are to have confidence in their market.
Dear Patrick, are there any plans to incorporate SEPA payments and/or SEPA Direct Debit into Stripe? Europe is a huge market, and many people here don't even have credit cards. I think you would gain a lot of traction in Europe if you incorporated this...
Good to hear that Pirate Party member Martin Delius heads the parliamentary committee leading the investigation. I wonder whether we would be this well informed if a member of another political party were in charge, because transparency is one of the Pirate Party's main platform points.
Would it make sense to harvest asteroids for water and bring it to the moon?
I am thinking of just one spaceship that goes back and forth between some asteroid and lunar orbit. It shoots the water down to the moon, so it never even has to touch down and take off again. How much energy would such a mission need? It would probably be a very complex mission to get right, but once it is established you might have water for free on the moon.
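For a rough feel for the energetics, here is a back-of-envelope sketch using the Tsiolkovsky rocket equation; the delta-v and exhaust-velocity numbers are guesses picked for illustration, and the real values depend heavily on which asteroid you target:

```python
import math

def mass_ratio(delta_v, v_exhaust):
    """Tsiolkovsky rocket equation: departure mass / arrival mass."""
    return math.exp(delta_v / v_exhaust)

V_EXHAUST = 4400.0  # m/s, roughly a hydrogen/oxygen engine (Isp ~ 450 s)
DELTA_V = 3000.0    # m/s, a guess for near-Earth asteroid -> lunar orbit

r = mass_ratio(DELTA_V, V_EXHAUST)
print(f"mass ratio {r:.2f}: about {1 / r:.0%} of the departure mass arrives")
# With these made-up numbers roughly half the departure mass is propellant,
# so each delivered kg of water costs on the order of a kg of propellant,
# before accounting for the return leg back to the asteroid.
```

One often-proposed trick is to crack part of the cargo water into hydrogen and oxygen and burn that, so the tanker could refuel itself at the asteroid instead of hauling propellant out from Earth.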
Besides what tdy721 and monk_e_boy said, water would be tricky to get in our near-space neighborhood for two reasons. First, Earth, being a real planet, has cleared its surroundings of most free-roaming objects; besides the (domesticated) moon, there aren't that many wild objects around to look for. Second, water has a relatively low boiling point, and at Earth's distance from the Sun there is plenty of solar energy to melt the asteroid ice and boil it away (as we see happening on comets) before we can extract it, and the same problem makes the water difficult to handle even after we have it.