We never hear about broken and worn-out products. Pretty much all gear nowadays is baseline OK; it's the negatives that really set things apart.
For once, let's turn it all upside down:
We should build a collection of how things break: review broken and worn-out products to teach people how to identify cheap products. That's why I built failscout.co
It's simple: You upload your broken products and quickly describe how long you owned them, how often you used them, and where they failed.
Everything breaks eventually, but when it does, can you easily repair it? That's why users can suggest a fix for a broken or faulty product.
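As a rough illustration, a failure report like the one described above might be shaped something like this. The field names are my own guesses, not failscout.co's actual schema:

```typescript
// Hypothetical shape of a single failure report, based on the fields
// described above (ownership time, usage, failure point, suggested fixes).
// These names are illustrative guesses, not failscout.co's real schema.
interface FailureReport {
  productName: string;        // e.g. "Kenmore Elite 665 dishwasher"
  ownedForMonths: number;     // how long the reporter owned it
  usageFrequency: "daily" | "weekly" | "monthly" | "rarely";
  failedComponent: string;    // where it failed, e.g. "control board"
  description: string;        // free-text account of the failure
  suggestedFixes: string[];   // community-suggested repairs, if any
}

const example: FailureReport = {
  productName: "Kenmore Elite 665 dishwasher",
  ownedForMonths: 13,
  usageFrequency: "daily",
  failedComponent: "control board",
  description: "Computer died after 13 months of normal use.",
  suggestedFixes: ["Replace the control board with a salvaged one"],
};
```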
What could we do with all this data?
- Identify the common failure modes of a product
- Collect fixes for common product failures
- See whether a product's quality has changed or declined at some point
- Add a simple JSON API so other sites and projects can leverage our data (a rough sketch of a client follows below).
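To make the JSON API idea concrete, here is a minimal sketch of what a client might look like. The endpoint path, the response fields, and the product ID are all hypothetical, not an actual failscout.co API:

```typescript
// Minimal sketch of consuming a hypothetical failscout.co JSON API.
// The endpoint path and response shape are assumptions for illustration.
interface FailureModeSummary {
  component: string;           // e.g. "control board"
  reportCount: number;         // how many users reported this failure
  medianMonthsToFail: number;  // median time to failure across reports
}

async function topFailureModes(productId: string): Promise<FailureModeSummary[]> {
  const res = await fetch(
    `https://failscout.co/api/products/${encodeURIComponent(productId)}/failure-modes`
  );
  if (!res.ok) throw new Error(`API error: ${res.status}`);
  return (await res.json()) as FailureModeSummary[];
}

// Usage: list the most common failure modes for a product.
topFailureModes("kenmore-elite-665").then((modes) => {
  for (const m of modes) {
    console.log(`${m.component}: ${m.reportCount} reports, median ${m.medianMonthsToFail} months`);
  }
});
```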
Throwing in my kudos. This is such a needed kind of site. What is your business model? If you can't spill the beans due to seeking funding, I understand completely.
I had a similar web-app idea after the computer on my Kenmore dishwasher broke after 13 months of use.
My goal was to identify products that have a catastrophic failure after an unreasonably short amount of time, and to pressure the manufacturers into improving quality control.
Tangent:
I also had an idea for people to log instances of items stolen from their luggage with the goal of identifying airports where this is frequent.
I had this idea after flying into Paris and finding my Leatherman multi-tool gone. After some research, I suspect that it wasn't actually stolen, but legally confiscated because it had a knife and France has laws against folding knives.
Consumers see some failures. Service people see all failures, over a statistically valid population size.
Be Glassdoor for repair service-professionals. And get the legal shielding / anonymity right.
It consistently amazes me how much undocumented knowledge those folks have. "The switches on these Kenmore models go bad all the time." "The CV joints on these years leak." If you want the real story, find the person repairing the thing all day, every day.
I just assume the oval "made in America" sticker with the US flag on it implies the item is from that cartel, and is therefore garbage. (Many made-in-America appliances do not have it!) After that, I double-check the list.
> We never hear about broken and worn-out products.
I disagree. When I read reviews (mostly on Amazon) I click on the 1 stars and read those first. Those will mostly all be about how crappy the products are.
That's a different aspect: people write reviews about products that are crappy right away; this is about products which are fine initially but break or wear out later, possibly years later.
But first filter out all the legions of irrelevant 1-star reviews ("The courier left the box in the rain", "I couldn't return it because I waited too long to unpack it because it was a surprise present - SO MAD!!!!", etc.).
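For illustration, even a naive keyword filter would catch a chunk of those. The keyword list below is an arbitrary assumption; real filtering would need something smarter:

```typescript
// Naive heuristic for filtering out 1-star reviews that are about
// logistics rather than the product itself. The keyword list is an
// arbitrary assumption for illustration.
const LOGISTICS_KEYWORDS = ["courier", "delivery", "shipping", "return", "package"];

function isAboutTheProduct(reviewText: string): boolean {
  const text = reviewText.toLowerCase();
  return !LOGISTICS_KEYWORDS.some((kw) => text.includes(kw));
}

// Usage: keep only the 1-star reviews that mention the product, not the courier.
const oneStarReviews = [
  "The courier left the box in the rain",
  "The hinge snapped after two weeks of light use",
];
console.log(oneStarReviews.filter(isAboutTheProduct));
// -> ["The hinge snapped after two weeks of light use"]
```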
Why is it failscout's domain to solve? That should squarely be laid at the feet of AMZN, not some 3rd party. It seems like a strange twist of logic that this is even a question.
It's why reviews, in particular AMZN's, are just not worth the trouble. Too many fake ones, so people have come up with "clever" rules for themselves to only read the negative ones. Somehow, they believe the good ones are gamed, yet the negative ones are not? Another strange twist in logic. Once you believe the system is gameable, then the whole system is suspect.
You got me wrong. Failscout is a reviews platform. How does it solve the "competitor sabotage" problem on its own platform? I never implied failscout should fix Amazon's.
Apologies, yes, I did read that like a suggestion of 3rd party reviews fixing AMZN reviews.
To continue the thread now that we're on the same page: I still think reviews are not something that humans can handle correctly. The people looking to game the reviews will always try harder than the people trying to stop them. I just don't trust online reviews from random internet people. I might ask people I know personally, whose assessments I'd be willing to trust. Randos on the internet just can't prove anything, because of course a bot could say or do everything a person would try in order to prove they are not a bot.
For those that are willing to put faith in reviews, have fun. Don't let my pessimism stop you.
> I just don't trust online reviews from random internet people.
What we need is to be able to give extra weight to reviews we can (somewhat) trust. This means that we need to start keeping "files" on each other, and we need a tool for looking up authors of messages and traversing chains of trust ...
Right?
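One way to picture such a tool: treat vouching as a graph and discount each reviewer's weight by their distance from you in the chain of trust. A toy sketch, where the graph shape and the per-hop decay factor are arbitrary assumptions:

```typescript
// Toy web-of-trust weighting: a reviewer's weight decays with their
// distance from "me" in a vouching graph. Purely illustrative.
type TrustGraph = Map<string, string[]>; // user -> users they vouch for

function trustWeights(graph: TrustGraph, me: string, decay = 0.5): Map<string, number> {
  const weights = new Map<string, number>([[me, 1]]);
  const queue: string[] = [me];
  // Breadth-first traversal: each extra hop multiplies trust by `decay`.
  while (queue.length > 0) {
    const user = queue.shift()!;
    const w = weights.get(user)!;
    for (const next of graph.get(user) ?? []) {
      if (!weights.has(next)) {
        weights.set(next, w * decay);
        queue.push(next);
      }
    }
  }
  return weights;
}

// Usage: I vouch for alice; alice vouches for bob. Bob's reviews count
// at 0.25 of my own weight; users with no trust path get no weight.
const graph: TrustGraph = new Map([
  ["me", ["alice"]],
  ["alice", ["bob"]],
]);
console.log(trustWeights(graph, "me")); // me: 1, alice: 0.5, bob: 0.25
```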
Love this idea. I could imagine this being a great way to alert people to product recalls, or even start class action lawsuits in extremely serious cases.