Mm. Google Maps will sometimes confidently suggest roads which, without that recommendation, would have seemed quite unsuitable. So people put their instincts aside and think Google must know better ... only to regret it later.
It would be helpful if Google could add a few qualifiers like "Note that this is a very steep and mostly single-lane unpaved road" or "This road leads through an unsafe neighbourhood", followed by "An alternative route would be ..."
Ha... It's not that they can't do it now, it's just that Google fears the uproar the woke brigade will inevitably make about how dare Google classify poor black neighborhoods as unsafe. But yeah, I agree with you, that feature would be damn helpful, especially for someone like my wife driving home at night.
I think it is unrelated. Regardless of how dangerous an area is, there are still people who live there and have to go there, or through there, for various reasons.
It is not like life stops when places have a higher crime rate. People still conduct business, have family outings and whatnot.
Besides, the feeling of safety is totally subjective. Some white neighborhoods you might consider safe as a white person might be unsafe for people of color going there, because they will be viewed through the bias of systemic racism and might be seen as a threat.
Absolutely. Same with the favelas – locals are safer than strangers.
Incidentally, the UK government website contains specific warnings about using GPS navigation in Brazil:
---------
The security situation in many favelas is unpredictable. Visiting a favela can be dangerous. Avoid all favelas, including favela tours marketed to tourists and any accommodation, restaurants or bars advertised as being within a favela.
You should:
– make sure the suggested route does not take you into a favela if you’re using GPS navigation
– avoid entering unpaved, cobbled or narrow streets which may lead into a favela - tourists have been shot after accidentally entering favelas
If you’re unsure about a location, check with your hotel or the local authorities.
Yep, I just experienced it today at work.
Used ChatGPT to rephrase my text.
My boss: "You used ChatGPT, right? Next time I'll just ask ChatGPT to send your weekly."
Otherwise someone would need to write a plugin for it, which would probably be pretty simple - I imagine it would look a bit like the llm-mistral plugin but adapted for the Ollama API design: https://github.com/simonw/llm-mistral/blob/main/llm_mistral....
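For context, Ollama exposes a local HTTP API (by default on port 11434), so the core of such a plugin would just be posting JSON to it. Here's a minimal sketch of that call, assuming a locally running Ollama server; the function names are my own, not from any plugin:

```python
import json
import urllib.request

# Default endpoint for a locally running Ollama server
OLLAMA_URL = "http://localhost:11434/api/generate"

def build_payload(model: str, prompt: str) -> bytes:
    """Build the JSON body for Ollama's /api/generate endpoint.

    stream=False asks for a single JSON response instead of
    newline-delimited streaming chunks.
    """
    return json.dumps(
        {"model": model, "prompt": prompt, "stream": False}
    ).encode("utf-8")

def ollama_generate(model: str, prompt: str) -> str:
    """Send a prompt to Ollama and return the generated text."""
    req = urllib.request.Request(
        OLLAMA_URL,
        data=build_payload(model, prompt),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        return json.loads(resp.read())["response"]
```

An actual `llm` plugin would wrap something like this in the plugin's model-registration hooks, the way llm-mistral does for Mistral's API.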
Which honestly is the easiest option of them all if you own an Apple Silicon Mac. You just download Ollama and then run `ollama run mixtral` (or choose a quantization from their models page if you don't have enough RAM to run the default q4 model) and that's it.
I always force it to suggest main routes, even if it suggests a route with less traffic.