Scott Adams: The Ultimate Food and Exercise App (dilbert.com)
36 points by cwan on Sept 27, 2010 | hide | past | favorite | 41 comments


This is a bit impractical. Recognizing the food with few errors would be horribly hard. I think it is something like speech recognition or automatic translation: yes, it is easy to create a simple, not very good speech recognizer or translator, but it is incredibly hard to create a high-quality one. The only remaining option is to create a user interface where the user can type in his daily food intake very easily/intuitively. Maybe most users would want to do this on their computer's screen and not on a phone. Of course there is plenty of calorie-tracker, 'diet organizer' software on the market, like http://www.dietorganizer.com/. But I am not an expert in this market.


> Recognizing the food with few errors would be horribly hard.

I wonder if it would be possible to have some sort of hybrid system, where automatic recognition is attempted, with a fall-back to manual input where it fails. (The manual input could be to select from several automatic "guesses".) If the app stored all the data from all users, this could eventually lead to a really big database, and the automatic recognition might get better and better.

EDIT: Also, automatic vegetable/fruit recognition is actually in production in some supermarkets, for weighing/pricing. (I saw this in France.)
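A confidence threshold is the usual way to decide when to fall back to manual input; a toy sketch (the 0.8 cutoff, top-5 count, and food names are all invented for illustration):

```python
def classify_with_fallback(scores, threshold=0.8, k=5):
    """scores maps food name -> model confidence.
    Auto-accept the top guess when the model is confident;
    otherwise return the top-k guesses for the user to pick from."""
    ranked = sorted(scores, key=scores.get, reverse=True)
    if scores[ranked[0]] >= threshold:
        return ("auto", ranked[0])
    return ("manual", ranked[:k])

print(classify_with_fallback({"broccoli": 0.95, "kale": 0.05}))
print(classify_with_fallback({"cucumber": 0.4, "zucchini": 0.35, "pepper": 0.25}))
```

Every manual pick is a labelled training example, which is how the shared database would feed back into the recognizer.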


How will you know when you should fall back to manual? There's really no measurement of accuracy that I can think of. If you've reached the point in the algorithm where you know you've incorrectly identified the food it seems like you've already solved the problem.


That is a pretty common problem in machine learning. There are tons of solutions, but they are usually classifier specific. For example, if using a decision tree, one can take test data and determine for each leaf node in the tree what fraction of instances are classified correctly. For leaf nodes with accuracy below X, change the output of that node to "give up".
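The per-leaf accuracy check described above can be sketched with a toy stand-in for a trained tree (the leaf ids, labels, and 0.8 cutoff are all made up for illustration):

```python
from collections import defaultdict

# Hypothetical trained tree: each leaf has one predicted class.
leaf_prediction = {0: "broccoli", 1: "cucumber", 2: "zucchini"}

# Held-out test instances, each already routed to its leaf: (leaf_id, true_label).
test_data = [
    (0, "broccoli"), (0, "broccoli"), (0, "cauliflower"),
    (1, "cucumber"), (1, "zucchini"), (1, "zucchini"),
    (2, "zucchini"), (2, "zucchini"),
]

def leaf_outputs(leaf_prediction, test_data, threshold=0.8):
    """Replace the output of any leaf whose test accuracy falls below
    the threshold with 'give up' (i.e. fall back to manual input)."""
    hits, totals = defaultdict(int), defaultdict(int)
    for leaf, label in test_data:
        totals[leaf] += 1
        hits[leaf] += (leaf_prediction[leaf] == label)
    return {
        leaf: pred if hits[leaf] / totals[leaf] >= threshold else "give up"
        for leaf, pred in leaf_prediction.items()
    }

print(leaf_outputs(leaf_prediction, test_data))
```

In this toy data, leaves 0 and 1 fall below the cutoff, so only the reliable zucchini leaf keeps its answer.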


> EDIT: Also, automatic vegetable/fruit recognition is actually in production in some supermarkets, for weighing/pricing. (I saw this in France.)

I saw this in France. It sucked. I wish I'd documented it better, but as far as I could tell, the algorithm was "display the top five fruit/veg with a similar hue": you put a courgette (zucchini) on the scale and it offered you cucumber, lettuce, green pepper, cabbage, grapes. Pinnacle of machine vision algorithms, it ain't.


I disagree that people want to do this on a computer screen. Pulling out an iPhone is far less friction than remembering to enter this information the next time you're at a computer.


Is that white cream topping a low-fat yogurt or some kind of delicious artery-clogging velouté sauce? That distinction alone could make a world of difference if you're counting calories or fat; different methods of cooking food that ends up looking the same can drastically change its healthiness. In general, someone's time might be better spent learning the basics of maintaining a nutritious diet and everything that goes along with it, rather than snapping pictures of everything eaten, twice.


I have a hack that makes this somewhat simpler — hook into people's online grocery shopping. Then, all the stuff you eat at home can be counted automatically. Obviously, it doesn't work so well if you throw food away (so you have to tell the app that, or allow for a small amount of wastage) or share food (so you have to say how many people live in the household and their consumption profiles), but the data is available now.
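The per-person arithmetic behind that hack is simple; a sketch, where the 10% wastage figure and the sample numbers are purely illustrative:

```python
def daily_intake_estimate(purchased_kcal, days, household_size=1, wastage=0.1):
    """Estimate one person's daily calories from a grocery order,
    allowing for shared food and a small fraction thrown away."""
    return purchased_kcal * (1 - wastage) / household_size / days

# e.g. 35,000 kcal bought for a week, two people, 10% wasted
print(daily_intake_estimate(35_000, days=7, household_size=2))
```

The household-size and wastage parameters are exactly the two corrections mentioned above; everything else falls out of the purchase data.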

Also, you don't need to measure exercise to help with weight loss; instead, just work out how many calories the user is eating and then encourage them to restrict that until they start losing weight (basically we are building an eat-watch à la the Hacker Diet).
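A minimal sketch of such an eat-watch, using the Hacker Diet's exponentially smoothed weight trend to damp out daily noise (the alpha value and 100-kcal step are arbitrary choices, not from the book):

```python
def smoothed_trend(daily_weights, alpha=0.1):
    """Exponentially smoothed weight trend, which filters out
    day-to-day water-weight noise from the daily weigh-ins."""
    trend = daily_weights[0]
    trends = [trend]
    for w in daily_weights[1:]:
        trend += alpha * (w - trend)
        trends.append(trend)
    return trends

def adjust_target(calorie_target, trends, step=100):
    """If the smoothed trend isn't falling, tighten the daily calorie target."""
    return calorie_target - step if trends[-1] >= trends[0] else calorie_target

print(adjust_target(2000, smoothed_trend([80.0, 80.5, 81.0])))  # rising trend
```

The feedback loop is the whole device: measure intake, watch the trend, ratchet the target down until the trend points the right way.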


Good idea, except roughly 40% of people's food budgets are spent on food outside the home.


Supermarkets are cheaper than eating out.


I think what jeebusroxors was trying to say is that supermarkets still contribute a majority of your calories since you get more calories per dollar than you get eating out.


>If it seems impossible that an app could recognize food types, consider that software can already recognize faces, voices, specific songs, and fingerprints. Recognizing broccoli can’t be that much harder.

Assuming "broccoli" is a stand-in for "arbitrary food", that's one of the most wrong things I've ever seen Scott Adams say. Vision is a really hard problem, and there's far more variance in the appearance of food than there is in those of faces or fingerprints. (Recognizing songs is a non-sequitur.)


Perhaps if anything that wasn't just a 'wrapper match' could be farmed out to Mechanical Turk. You could charge a monthly fee high enough to cover the identification, and increase your profit margin as your recognition gets better.


Well, if we limit the visual input to the realm of visible light then maybe not, but what if your camera could see other frequencies, like infrared, and the broccoli gave off a unique set of frequencies that allowed it to be identified? Also, the app doesn't need to know about everything, just everything food-related, and not even all of that, just the major items.

It would also need a way to measure weight. It's all fair enough saying it's broccoli but you also need to say how much.


There are tons of probably-unsolvable problems with the idea (how can you tell if something has cream filling from a picture?), but what I'm saying is that even the vision one, which Adams explicitly seems to think is solvable, isn't.


> Well, if we limit the visual input to the realm of visible light then maybe not, but what if your camera could see other frequencies, like infrared, and the broccoli gave off a unique set of frequencies that allowed it to be identified

That could work, except it's completely and awfully wrong. Don't they teach physics anymore?


Meh. Just recording the weights I do along with my body weight (along with a graph of each) works fine, and it honestly couldn't be much easier [1].

[1] Shameless self-promotion: http://daytaapp.com


I love your app and wish I could find something similar for the Android platform.

EDIT: Or something similar on Windows/Linux/Web(not Daytum)


Maybe a web-based complement to Dayta for iPhone? It may be a possibility (though don't bet on it).


Ideally restaurants would upload the nutrition facts of their foods to an online database. (Maybe this should be legally required.)

Such a database would make this app more feasible, as the smartphone could use GPS to figure out what restaurant it's in. Maybe even better would be a numerical code next to each item on the menu that could be typed into a smartphone to pull up full nutrition facts. That seems more like an inevitability than a general picture scanner, offering superior ease of use and, let's face it, technical feasibility.
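The menu-code idea could be as simple as a keyed lookup; the restaurant name, codes, and nutrition numbers below are all invented for illustration:

```python
# Hypothetical shared database keyed by (restaurant, menu code).
NUTRITION_DB = {
    ("Joe's Pizza", 12): {"item": "specialty slice", "kcal": 310, "fat_g": 13},
    ("Joe's Pizza", 14): {"item": "cheese slice", "kcal": 270, "fat_g": 10},
}

def lookup(restaurant, menu_code):
    """The restaurant comes from GPS; the diner keys in the code
    printed next to the item on the menu."""
    return NUTRITION_DB.get((restaurant, menu_code))

print(lookup("Joe's Pizza", 12))
```

A missing key simply returns None, which is where the app would fall back to a generic "slice of pizza" entry.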


This mode of doing things is, for the most part, already available. There are apps that work around the same principle as CalorieKing, where common restaurants and foodstuffs are saved in a database with their respective nutrition details. This tends to be very convenient when eating at popular restaurants, or eating commonly purchased grocery items.

The GPS element is an interesting twist on the idea though. How cool would it be if you walked into a Qdoba (just as an example), and your phone vibrated and asked you what you were ordering?


Yeah, I've seen that kind of wiki-like functionality in the Livestrong Nutrition app, and let's face the facts: preparation styles vary wildly among different restaurants. If I type "slice of pizza" into that nutrition app, I get "large slice," "cheese slice," "small slice," and several tens of variations. It's okay, but not ideal. If I go to the local pizza joint, I want to know fairly certainly what I'm getting in terms of calories if I order one of their specialty pizzas. That's where I think a database would be useful.

As for the GPS, that would be very cool.

As has been said before, the biggest obstacle to food apps is mobilizing a sales force to convince all restaurants to embrace a product. And because many restaurants don't really have (or need) IT capabilities, it's actually costly for them to implement computers and train staff to learn the particular system. That's why OpenTable and others are having problems with adoption.


Sort of already exists... only in the other direction.

http://www.ideaconnection.com/blog/2009/04/japanesetoilet-an...


The biggest problem with a system like this is scale: how do you determine the size of the portion? Consider these two pictures of mashed potatoes:

http://www.mccormick.com/~/media/Images/Recipes/Recipe%20Det...

http://3.bp.blogspot.com/_6slcYXNa204/SwT183CdpjI/AAAAAAAABN...

We all know from years of experience that the first image is a larger bowl and probably contains 6-8 times the amount in the second picture, but image analysis would not.
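One workaround (not mentioned above) is to require a reference object of known size in the frame, such as a standard fork, and scale from that; a hypothetical sketch of the conversion:

```python
def real_size_cm(ref_real_cm, ref_pixels, food_pixels):
    """Convert a food's pixel extent to centimetres using a reference
    object of known real-world size visible in the same photo."""
    return food_pixels * ref_real_cm / ref_pixels

# An 18 cm fork spans 180 px, the bowl of potatoes spans 300 px:
print(real_size_cm(18.0, 180, 300))
```

Even this only gives a linear dimension; going from apparent size to volume, and from volume to calories, piles on further guesswork.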


It amuses me to no end how everybody assumes that the reason we eat unhealthy food is that we don't know any better.

In reality, it is because unhealthy food tastes better (not always, but most of the time) and is a lot more convenient.


I disagree. Just yesterday I showed someone that although they had switched from Coke to apple juice, the apple juice actually had more sugar. People don't all read labels.


Sure, it may have more net carbohydrates, but the real difference is the density of those calories and the types of carbohydrates they are.

Fast food contains saturated fats (and trans fats), which are incredibly difficult for the body to break down. Olive oil, on the other hand, is almost all monounsaturated and polyunsaturated fats, which the body can break down much more easily.

So yes, I think the GGP poster was more correct in that sometimes people just want a nice big fattening hamburger.


Health data could be a powerful next data layer, much as location has become. There are some devices and apps in the space, but I haven't seen anything yet that focuses on _publishing_ the data, as Foursquare focuses on publishing location and tips. And I think Adams has it right: it's got to be easy (and fun) or people will just give up.


Your figure publishes the data well enough.


I'm tired of calorie apps that think I only eat out at Chili's, McDonalds, and other chains or prepackaged foods such as Mac & Cheese, Sara Lee, and Oreos.

The ultimate food calorie app would know the calories in a Mission burrito.


It's so variable. I had a burrito last week at 24th and Shotwell that was more than half the size of my upper arm.


There are no apps that will save you from that kind of decision.


I am pretty sure I have seen an app that does food recognition already. Not sure if it was for iPhone or a traditional desktop.

The one sad aspect of this idea is that in theory, humans should be able to determine their needs without external monitoring devices - certainly when it comes to food.

Perhaps instead of an external device, it would be better to invest in some self-awareness? I realize, though, that it is not possible for everybody; in fact, I don't know a good way to learn it either.


Most humans are capable of eating healthily, just as most humans are capable of recognizing faces. But not all.


I almost always eat 100% of what I cook and put on my plate. After reading this article it struck me that some people, or even a majority of people, might not do the same. Do you throw away a significant amount of your food?

Of course, I live alone and most of the time what I buy stretches over more than one meal... but what I put on my plate always goes in my stomach. How could it be otherwise? And if not, why bother with a picture of what has been left on the plate?


It would be a pain to pull a smartphone out before each meal to snap a picture. To put some UX polish on this idea, it should be an embedded appliance fitted inside the stomach or other suitable organ. I'm not a doctor, but installing a device that "reads all input" would be more accurate, user-friendly, and not require food photography 3-6 times a day.


This takes "feature X shouldn't be too hard to implement" to a whole new level: "Recognizing broccoli can’t be that much harder."


A lot of this is already available. DailyBurn with scanner app will fill in nutritional info from labels. Scales transmit your weight to online services. Fitbit will track your movement levels and sleep. Runkeeper will track how much you run. Not perfect yet, but a synthesis is inevitable.


Perhaps your watch could display both the current time and how many days you have left if you keep living the way you are.

I think walking around with a death clock ticking away on my wrist would stress me out to the point of significantly lowering my life expectancy. Does the app account for that?


Someone just started working on Gourmet Goggles in their 20 percent time.


If this ever becomes popular, I expect orthorexia rates to soar.



