This is not quite the "AI" that's been hyped in recent years: the key component is OpenCV, which has been around for decades. A few years ago this might have been called Machine Learning (ML) rather than Artificial Intelligence (AI).



So it doesn't actually drop hats onto heads and doesn't use what most people would consider AI... I think I could probably rig up something to gracelessly shove an item out of an open window too, which is basically what we're left with. It'd take longer to create the app for booking appointments, and to set up everything for payment processing.


You have discovered a secret area of my personalized "pet peeves" level: just a few days ago I saw an article (maybe a video) about how "AI" tracks you in a restaurant. The screenshot was from an OpenCV-based app that drew a bounding box around each person and counted how many people were in the establishment, who was a waiter and who was a customer, and how long they had been there.
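
For anyone curious, the bounding-box part really is classic computer vision. A minimal sketch of that kind of person detection and counting, using OpenCV's built-in HOG people detector (the waiter-vs-customer and dwell-time logic would have to sit on top and isn't shown; the filename is made up):

    import cv2

    # Built-in HOG + linear SVM person detector that ships with OpenCV
    hog = cv2.HOGDescriptor()
    hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

    frame = cv2.imread("restaurant.jpg")  # hypothetical still frame
    boxes, weights = hog.detectMultiScale(frame, winStride=(8, 8))

    # Draw a bounding box around each detected person and count them
    for (x, y, w, h) in boxes:
        cv2.rectangle(frame, (x, y), (x + w, y + h), (0, 255, 0), 2)
    print(f"people in frame: {len(boxes)}")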


Image recognition is AI.


There's an old saying: "Yesterday's AI is today's algorithm". Few would consider A* search for route planning or Alpha-Beta pruning for game playing to be "Capital A, Capital I" today, but they absolutely were at their inception. Heck, the various modern elaborations on A* are mostly still published in AI venues (e.g., AAAI).


https://en.wikipedia.org/wiki/AI_effect

We already have a name for it; it just needs to propagate properly until there's no value left in calling things 'AI'.


This is a fair point, and maybe someone more well versed can correct me, but pretty much all state-of-the-art image recognition is trained neural networks nowadays, right? A* is still something a human can reasonably code, so it seems to me there is a legitimate distinction between these types of things nowadays.
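
To illustrate the "a human can reasonably code it" point, a bare-bones A* on a 4-connected grid fits in about twenty lines (a sketch, assuming the grid is just a set of walkable (x, y) cells and using a Manhattan-distance heuristic):

    import heapq

    def a_star(grid, start, goal):
        """Shortest path on a 4-connected grid; grid is a set of walkable (x, y) cells."""
        h = lambda p: abs(p[0] - goal[0]) + abs(p[1] - goal[1])  # Manhattan heuristic
        frontier = [(h(start), 0, start, [start])]  # (f, g, node, path)
        seen = set()
        while frontier:
            _, cost, node, path = heapq.heappop(frontier)
            if node == goal:
                return path
            if node in seen:
                continue
            seen.add(node)
            x, y = node
            for nxt in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
                if nxt in grid and nxt not in seen:
                    heapq.heappush(frontier, (cost + 1 + h(nxt), cost + 1, nxt, path + [nxt]))
        return None  # no path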


Apparently there was a big scare that AI would take programmers' jobs away... decades ago, when the first compilers came out.


Yes, no more machine code. Everything was to be written in BASIC. ...how we laughed at that outlandish idea. It was so obvious performance would be... well... what we have today pretty much.


IKR? If you can't hand-pick where instructions are located on the drum, you may have to use separate constants, and if that's the case what is even the point?


If you spend a few hours writing a bit of code that has to run for decades, millions or billions of times per day on hundreds of thousands or millions of machines, it seems quite significant to use only the instructions needed to make it work. A few hundred thousand extra seems like a lot. One would imagine other useful things could be done with quintillions or septillions of cycles besides saving a few development hours.


We will likely develop more accurate names for the different shades of AI after the fact. Or the AI will.


A* is definitely AI... Why would someone say it isn't?


As a data point, in my early-2010s computer science bachelor's program it was taught to me as the A* algorithm.


Right, in an AI class. For example, lecture 5 in 6.034: https://ocw.mit.edu/courses/6-034-artificial-intelligence-fa...


No, in an introduction to data structures and algorithms class. It’s pretty odd behavior to disagree with someone who is simply sharing their lived experience.


Yeah sorry, rereading, that came off as way aggressive for no reason. Rereading the chain, I think I just meant that it’s an algorithm that was frequently taught in AI classes, so at least some profs think it counts, even though it was called an algorithm.


Same class name with the same algorithm for me.


Same as the parent: it was taught to me in an introduction to algorithms class, and no one during my studies ever referred to it as AI.

I don't disagree that it certainly meets certain AI criteria, just saying that particular phrasing (A* is AI) was never used.


Maybe it is easier to define what isn't AI? Toshiba's handwritten postal code recognizers from the 1970s? Fuzzy logic in washing machines that adjusts the pre-programmed cycle based on laundry weight and dirtiness?


Historically, we often call something AI while we don't really understand how it works. Once we do, it quietly gets subsumed into machine learning or another area and called the X algorithm.


Those both sound like AI to me

An example of something similar a computer can do that isn't AI would be arithmetic.


Adding two numbers, each with 100 digits? Reciting the fractional part of π on and on? I have only seen that done by talented people on TV shows. Seems like AI.


Looks like the key component is Roboflow (a computer vision/AI platform), and the user trained and deployed a YOLO deep-learning model.
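
For what it's worth, running inference with a YOLO-family model is only a few lines these days. A rough sketch using the ultralytics package and a generic pretrained checkpoint, not the author's actual Roboflow-trained weights (the image filename is made up):

    from ultralytics import YOLO

    # Generic pretrained checkpoint; the project presumably used custom
    # weights trained on its own "head" images via Roboflow.
    model = YOLO("yolov8n.pt")

    results = model("doorway.jpg")  # hypothetical camera frame
    for box in results[0].boxes:
        cls_name = model.names[int(box.cls)]
        x1, y1, x2, y2 = box.xyxy[0].tolist()
        print(cls_name, round(float(box.conf), 2), (x1, y1, x2, y2))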


That's my point: legislation seldom defines AI rigorously enough to exclude work like OpenCV. I presume that leaves it to courts or prosecutorial discretion.


Thank you! I was wondering how they managed to wedge an AI model into a RasPi. And I couldn't figure out what the AI was needed for.


Be it "AI" or not, these mostly fall under "AI" legislation, at least under the new EU AI Act. Which is IMHO a better way to legislate than tying laws to specific algorithms du jour.


This has been going on for a while:

https://www.smbc-comics.com/comic/ai-9



