Agreed. Couldn’t black holes warp spacetime to the extent that there is no such place as “inside”? Time dilation is infinite at the event horizon, after all.
As you approach the event horizon, your frame of reference slows asymptotically to match that of the black hole while the universe around you fast-forwards toward heat death. I’d expect the Hawking radiation coming out at you to blueshift the closer you got, until it was so bright as to be indistinguishable from a white hole. You’d never cross the event horizon; you’d be disintegrated and blasted outward into the distant future as part of that Hawking radiation.
The time dilation at the event horizon is infinite for an external observer. It appears that the person falling into the black hole slows down and never passes the event horizon. They redshift until you can't see them anymore.
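For concreteness, that divergence is just the textbook Schwarzschild time-dilation factor for an observer hovering at radius r outside a non-rotating, uncharged black hole (a standard formula, not anything specific to this thread):

    \frac{d\tau}{dt} = \sqrt{1 - \frac{r_s}{r}}, \qquad r_s = \frac{2GM}{c^2}

Here dτ is the hovering observer's proper time and dt is the distant observer's coordinate time; as r approaches r_s the factor goes to zero, which is the "infinite time dilation" at the horizon.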
For the unfortunate person falling into the black hole, there is nothing special about the event horizon. The spacetime they experience is rotated (with respect to the external observer) in such a way that their "future" points toward the black hole.
In a very real sense, for external observers there isn't really an interior of the black hole. That "inside" spacetime is warped so much that it exists more in "the future" than the present.
Professor Brian Cox also says that from a string theory perspective there isn't really an inside of a black hole, it's just missing spacetime. I tried to find a reference for this but I couldn't find one. Perhaps in his book about black holes.
I'm no physicist so happy to be corrected on any of the above!
> For the unfortunate person falling into the black hole, there is nothing special about the event horizon.
This is from a simplified model using black holes with infinite lifetime, which is non-physical. Almost all textbook Penrose diagrams use this invalid assumption and shouldn't be relied upon.
Fundamentally, external observers and infalling observers can't disagree on "what happens", just the timing of events. If external observers never see someone falling in, then they didn't fall in.
> Fundamentally, external observers and infalling observers can't disagree on "what happens", just the timing of events. If external observers never see someone falling in, then they didn't fall in.
This isn't true. As long as the two observers can't communicate with each other, they absolutely can have different results. To put it in simpler terms, the requirement of physics is that an experiment has a unique result according to some rule, but different experiments can have different results even if they break our intuitions.
So, if you measure the position of a particle falling towards a black hole, you will see it disappear at the event horizon, and perhaps be radiated out later as Hawking radiation from that same event horizon. If you measure the position of the same particle while you yourself are passing through the event horizon, you will record no special interaction and will see the particle moving completely normally. Since you can't perform both experiments at once, and you can't relay any data from one to the other, there is no contradiction.
This is just another case of a duality in physics, similar to how some experiments measure electrons as point-like particles completely localized to a certain place, and others measure them as waves spread out over a very large area.
If that were true, then black holes would appear as extremely bright balls of redshifted radiation, since every particle whose trajectory ever carried it toward the center would still be visible. That is obviously not the case: black holes appear as completely black objects, sometimes with an extremely bright ring or halo of orbiting matter.
Sure, timing of the events, but infinity kind of breaks that. If external observers don't see someone falling in, then they didn't fall in yet; and if external observers see that falling in takes an infinite time (as in this case), then that is in some sense just a difference in the timing of events. However, from the perspective of the observer for whom the fall takes a finite time, they will also get to observe what happens afterwards.
Black holes don’t live forever. In principle, an external observer could watch you until the black hole evaporates. As mentioned above, if they never saw you fall in, then you never fell in. GR allows for disagreement on durations of events but not the events themselves.
Most that I know would say that it was disappointingly too big and too general to make specific predictions tied to this specific universe we occupy, although it had early promise.
Brian Cox didn't even make the Wikipedia page, so it's difficult to claim he had any major role in perpetrating it as a large-scale fraud.
Laughing at someone who says “hey, an idea that isn’t falsifiable isn’t a good theory, and certainly not something that any other ideas or theories should be constructed upon” is, I think, more serious than not.
Ironically, this is a great way to build actual products (if you’re open to letting them grow).
Three years ago I created a simple app for my family and friends to share recipes together. I kept adding features they requested, and after about two years, the app was apparently good enough that people started sharing it by word of mouth.
By October, the app had grown big enough that I had to start charging new users to cover server costs. I’m now contemplating a future where I work on it full-time.
> I have a boatload of feature creep ideas for you to reject... Shall I post them here, or DM you?
Ha! Either, or feel free to email me at jeff@umami.recipes.
> Take a pic of a recipe, OCR if can
The app actually has an OCR recipe importer (tap "Add Recipe" > "Scan Recipe"). It uses Apple's native iOS OCR library, which inserts text directly from the camera.
It doesn't support importing from an existing photo yet though -- that's high up on my list to build (after Android).
> else host the pic?
You can upload photos for recipes!
> However, an interesting aspect of the "recipe" app logic is that it can apply to, say, mechanics
That's a really good point. I bet much of the underlying functionality could be re-used in a completely different app focused on something like mechanics. Another example I've thought about is a separate app specific to cocktail recipes, with features tailored around mixology and bar management.
I’m excited to start working on the Android version of Umami (https://umami.recipes). Like the iOS version, I’m building it 100% native, this time with Jetpack Compose.
I like to ask people _how_ non-deterministic laws of nature would give rise to free will, and listen as they stumble toward the realization that determinism is orthogonal to the subject.
I just added multi-ingredient keyword search across all of your recipes to Umami (https://www.umami.recipes). If you’re on iOS, I’d recommend checking it out!
Looks promising! I’ll give it a thorough test. Been working on my own recipe clean-up tool [1]. Does your app do all the scraping and parsing on the client side?
Wow, this is awesome! Your layout, subtle dividers, and serifed font really come together perfectly.
At the moment, Umami's scraping is done server-side. I'd really like to speed it up, so I'm going to start working on a 100% client scraper soon (for the native apps & browser extensions; the web version will always have cross-domain browser restrictions though).
Umami looks great, I just sent it to my wife, who currently uses Apple Notes for recipes. I’m curious, how can you offer it for free while having server-side processing / syncing etc?
There are few enough users right now that I can cover the couple of bucks per month in server costs. If more people start to use it, I'll make it paid (either upfront or subscription, not sure yet), but I'll grandfather all existing users in as free.
Interesting, does the server fetch the HTML itself? Or does the client POST it? The latter would be useful for paywalled sites like Cook's Illustrated.
I spent quite some time on ingredient labelling (what’s a unit, quantity, etc.). Feel free to ping me at hn[at]franz.hamburg
Yep, the client just sends the recipe URL and then the server fetches the HTML. Agreed, it would be better to have the client send the HTML for paywalled sites, which is another reason I just want to do it all on the client.
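For what it's worth, the core of an importer like that can be quite small when a site embeds schema.org JSON-LD, which most recipe sites do. A rough sketch in Node/TypeScript (Node 18+ for global fetch; this is illustrative rather than Umami's actual code, and a real importer would use a proper HTML parser and handle @graph wrappers, type arrays, microdata fallbacks, etc.):

    // Fetch a recipe page and pull out the first schema.org Recipe object, if any.
    async function importRecipe(url: string): Promise<Record<string, unknown> | null> {
      const html = await (await fetch(url)).text();

      // Grab every <script type="application/ld+json"> block (crude regex, fine for a sketch).
      const blocks = [...html.matchAll(
        /<script[^>]*type="application\/ld\+json"[^>]*>([\s\S]*?)<\/script>/gi
      )].map((m) => m[1]);

      for (const block of blocks) {
        try {
          const data = JSON.parse(block);
          const nodes = Array.isArray(data) ? data : [data];
          const recipe = nodes.find((n) => n?.["@type"] === "Recipe");
          if (recipe) return recipe; // name, recipeIngredient, recipeInstructions, ...
        } catch {
          // Ignore malformed JSON-LD and keep looking.
        }
      }
      return null;
    }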
> I spent quite some time on ingredient labelling (what’s a unit, quantity etc.)
I can relate to you there. It was a long process of trial and error for me to get right, and there are still plenty of edge cases left to handle. Long-term I think AI + NLP will make this kind of thing easier, but for me it wasn't fast, reliable, cheap, or portable enough to run in an iOS app in real time quite yet.
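Just to illustrate the shape of that labelling problem, here's a toy rule-based pass in TypeScript (the unit list and regex are illustrative only; real lines have fractions, ranges, "to taste", and all the other edge cases mentioned above):

    interface ParsedIngredient {
      quantity: number | null;
      unit: string | null;
      name: string;
    }

    const UNITS = ["cup", "cups", "tbsp", "tsp", "g", "kg", "ml", "l", "oz", "lb", "pinch", "clove", "cloves"];

    // Naive parse of a single ingredient line, e.g. "2 tbsp olive oil".
    function parseIngredient(line: string): ParsedIngredient {
      const tokens = line.trim().split(/\s+/);
      let quantity: number | null = null;
      let unit: string | null = null;
      let i = 0;

      // Leading number ("2", "1.5"); fractions like "1/2" would need extra handling.
      if (tokens[i] && /^\d+(\.\d+)?$/.test(tokens[i])) {
        quantity = parseFloat(tokens[i]);
        i++;
      }
      // Known unit word.
      if (tokens[i] && UNITS.includes(tokens[i].toLowerCase())) {
        unit = tokens[i].toLowerCase();
        i++;
      }
      return { quantity, unit, name: tokens.slice(i).join(" ") };
    }

    // parseIngredient("2 tbsp olive oil") -> { quantity: 2, unit: "tbsp", name: "olive oil" }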
Nice! I dabbled with deep learning and CRF for the sequence tagging. I ran into the same issues you mentioned. Current approach is hand-rolled. The "holy grail" would be linking ingredients mentioned in the directions. So, I could just tap on "…vinegar…" in an instruction and see that I need 2 tbsp.
Isn't there also the issue of the device overheating? Accidental water splashes? Oil splatter? Camera fogging? The battery dying while you're doing something and suddenly you're blind?
I think the Apple demo of people mostly sitting on a couch is not a lack of imagination, but a pragmatic approach of how this device should be used.
You may be right. My hope is the AVP is at least a step toward something that could make potentially dangerous physical activities, like cooking, safer (e.g., by showing you the temperature of a surface before you touch it with your hands, as suggested elsewhere in this thread).
It really is a pity the HoloLens didn't go anywhere. I wonder how long we'll have to wait for a device that actually aims for that space, rather than a portable display, which is what the AVP seems to be.
I feel like even if it is good enough it’s only a matter of time until it lags and you lose a finger. Lulling us into a false sense of security is the real danger.
I don’t know about you, but it’ll be obvious to me in much less than 1ms when the passthrough is lagging. You know your hand moves, so when you don’t see it move it’s incredibly jarring.
You might be special. (I'm not being snarky.) I'm a digital musician. 5 ms lag is just barely perceptible to most of us.
My intuition regarding proprioception is similar to yours, though -- give me laggy input of my own hands and I'm still pretty likely to get things right.
Sound travels around 1.7 meters in 5 ms. Acoustic musicians can play fine while being more than that distance apart, although there is a tendency to slow down unless they consciously keep the tempo up (everyone individually feels they are rushing a little bit, but that's how they keep the pace).
There is a huge difference between consistent latency and jitter, but even then I doubt that 1ms latency + 1ms jitter would be very noticeable.
Damn, a bunch of worry warts in your other responses. I feel like I could safely manage a knife if I closed my eyes, or if I was cooking at night and my power went out! Like, I wouldn't immediately chop off a finger... I could manage to set it down. Sheesh.
I'm planning to be a Day 1 user of AVP to see what I can build with it, and I look forward to your cooking app!
Even if it’s almost fine, do you really want to encourage someone to hold a knife with your app and a big black box in front of their eyes, ready for any litigation?
I’m definitely getting a vision pro the moment I can, but I am most definitely not going to wear one when I’m handling anything other than a keyboard on my couch.
I'm open to it! Right now it runs on firebase and a unary gRPC service (nodejs). Probably firebase could be swapped out for supabase, and gRPC is easy enough to host.
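For anyone wondering what the "unary gRPC service (nodejs)" part looks like, here's a minimal sketch with @grpc/grpc-js and @grpc/proto-loader (the recipe.proto file, package/service/method names, and handler body are all hypothetical, not Umami's actual API):

    import * as grpc from "@grpc/grpc-js";
    import * as protoLoader from "@grpc/proto-loader";

    // Hypothetical proto: service RecipeService { rpc ImportRecipe(ImportRequest) returns (ImportReply); }
    const packageDef = protoLoader.loadSync("recipe.proto");
    const proto = grpc.loadPackageDefinition(packageDef) as any;

    const server = new grpc.Server();
    server.addService(proto.recipes.RecipeService.service, {
      // Unary call: one request in, one response out.
      ImportRecipe: (call: any, callback: any) => {
        const { url } = call.request;
        // ...fetch and parse the page here...
        callback(null, { title: "Placeholder", sourceUrl: url });
      },
    });

    server.bindAsync("0.0.0.0:50051", grpc.ServerCredentials.createInsecure(), (err) => {
      if (err) throw err;
      console.log("RecipeService listening on :50051");
    });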
Looks cool. As a cook in my fifth decade, may I humbly offer some suggestions? I'm on Android and PC, so perhaps not everything will apply, but here goes:
1. Import from other recipe apps.
Recipe apps and sites have a finite lifetime. It's great that you already support export, but import from others would be great too. Pepperplate, for example, is SQLite on Android. Other apps might be harder, e.g. the New York Times recipe box. But without import, it's hard to commit to a new system.
2. Cooking mode
This is a mode that avoids the situation where you are following a recipe and need to manipulate your phone but you can't because you have hands covered in flour/chicken juice or whatever. Voice control would be awesome too: "scroll down", "set a timer for 20".
3. Notes
Some place where you can comment on the recipe, like "use peanut oil not canola"
Definitely agree. A couple days ago, someone asked for a Paprika recipe importer, which I've added here: https://www.umami.recipes/paprika-import
I'll have to take a look at Pepperplate and NYT recipe box (though, you can already import any NYT recipe link!).
> Voice control
Love this idea.
> 3. Notes
There is indeed a notes section! Though I'm guessing you mean inline comments, like google docs style?
On #1, there are so many potential import sources that you probably can't do them all solo... but attacking the top few and documenting your import format would be a win. In short, let motivated members of the community with an itch to scratch help. Good ones could be folded into the mainline if you agree with the submitter, maybe.
On #3, notes can either be for personal use (e.g. in Cozi's recipe handler) or a shared conversation on a recipe (like what the NYT does). I feel the former is more important, but your users might have different opinions.
But yes, you are right, as far as native apps go. FWIW, I just got an Android test phone so I can start on that version; it should be ready in a few months.
I tried that and it was super buggy... Thankfully I built the whole thing in SwiftUI so that I can port it to Mac fairly quickly. Moving as fast as I can!
^ Colors in that space only just recently started getting supported by displays/browsers. And while P3 colors work in CSS on Safari, I had to use a single-pixel background PNG to make it work in Chrome.
Just last week, I added the ability to export Umami[1] recipes as Recipe JSON Schema[2]. Writing the code for it was quite pleasant thanks to schema-dts[3].
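In case it's useful to anyone else, the export looks roughly like this (the UmamiRecipe shape and field names are made up for illustration; only the schema.org side reflects the actual vocabulary):

    import type { Recipe, WithContext } from "schema-dts";

    // Hypothetical internal shape, just for the example.
    interface UmamiRecipe {
      title: string;
      ingredients: string[];
      steps: string[];
    }

    // schema-dts only ships types, so this is plain object construction that the
    // compiler checks against the schema.org Recipe definition.
    function toSchemaOrgRecipe(r: UmamiRecipe): WithContext<Recipe> {
      return {
        "@context": "https://schema.org",
        "@type": "Recipe",
        name: r.title,
        recipeIngredient: r.ingredients,
        recipeInstructions: r.steps.map((text) => ({ "@type": "HowToStep" as const, text })),
      };
    }

    const json = JSON.stringify(
      toSchemaOrgRecipe({ title: "Pancakes", ingredients: ["2 cups flour"], steps: ["Mix.", "Cook."] }),
      null,
      2
    );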