> Apple Pencil latency is dropping from 20ms to 9ms,
This is why I respect Apple. Most tech forums would have been flooded with people whinging about how mere humans can't perceive such low latencies and how improving them is a waste. Meanwhile, Apple knows not to leave anything on the table for UX.
Actually I think it highlights how well Apple manages its message and keeps people focused on it: most people probably don't notice or actually care about the difference in latency, and 9ms is not the best in the market, yet it becomes a 'good news story' validating Apple.
Latencies below 16ms require a refresh rate higher than 60Hz.
For 9ms you need at least 90Hz. I don't know many who do that, especially if you take into consideration the battery hit that the faster refresh rate requires.
In the worst case, 60Hz + this new pen will add up to (16 + 9) 25ms of latency, right?
If the screen refreshes at T=0, 16, 33, 50 and you create input at T=8, then the input will be processed at T=17 (8+9), so you will need to wait for T=33 before it shows up, for a total latency of 25ms (33-8).
Which is a lot better than the potential total latency of 36ms with a 20ms pen.
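That back-of-the-envelope math can be sketched as a tiny model (an illustration only: it assumes a fixed pen-processing latency and an idealized display that shows input at the first refresh after processing; `display_latency` and the exact numbers are my assumptions, not Apple's figures):

```python
import math

def display_latency(input_t_ms, pen_latency_ms, refresh_hz):
    """Total ms from pen input to the first refresh that can display it."""
    period = 1000.0 / refresh_hz          # 60Hz -> ~16.67ms per frame
    ready = input_t_ms + pen_latency_ms   # when the input has been processed
    next_refresh = math.ceil(ready / period) * period
    return next_refresh - input_t_ms

# input at T=8 with a 9ms pen on 60Hz: processed at T=17, shown at the
# next refresh (~T=33), for roughly the 25ms described above
print(round(display_latency(8, 9, 60), 1))

# near-worst case with a 20ms pen: processing finishes just after a refresh,
# so you pay the pen latency plus nearly a full ~16.7ms frame (~36ms total)
print(round(display_latency(13.34, 20, 60), 1))
```

With an exact 16.67ms frame the first case comes out slightly above 25ms; the thread's round numbers come from treating the frame as 16ms.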
Refresh rate isn't latency. Though you are right that you do have to wait for the screen to redraw.
There is screen latency as a completely separate number, though, and those would add up. Many LCD monitors can have up to 20ms of latency, and those numbers would add up like you suggest. It's why gaming monitors can charge a bunch for their 1ms latency at 60Hz.
It is also possible to have 120Hz and 20ms latency. The numbers are not related.
No, refresh rate directly affects latency. At a refresh rate of 60Hz (without gsync/freesync), with input arriving at arbitrary times, you have an average of half a frame (and a maximum of a whole frame) of unavoidable latency, above and beyond whatever other latency may exist.
If you have enough control over your devices, you can arrange for your input to arrive at just the right time to be included in each frame without extra latency, though.
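The "average of half a frame" figure is easy to sanity-check numerically (a sketch under the same idealized model: input is uniformly likely to arrive anywhere within a frame, and becomes visible at the next refresh boundary):

```python
period = 1000.0 / 60        # ~16.67ms per frame at 60Hz
N = 100_000                 # input times spread evenly across one frame

# wait until the next refresh boundary for each possible arrival time
waits = [(period - i * period / N) % period for i in range(N)]
avg = sum(waits) / N

print(round(avg, 2))         # ~8.33ms: half a frame on average
print(round(max(waits), 2))  # just under a full ~16.67ms frame at worst
```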
The kind of latency you describe is that internal to the display; it could be called the video-signal-to-photon latency. The parent you're replying to is explaining the action-to-video-signal and compute-to-video-signal latency, and its effects on the action-to-photon latency.
Well I didn't even consider any latency of the screen itself.
But someone please explain how a refresh rate less than infinity doesn't create some sort of delay. In my naive understanding a 60Hz monitor means it refreshes the screen every 16ms. If you miss one refresh, you'll have to wait 16ms for the next one. This would be an extra 16ms it takes between moving the pen and seeing the reaction, right?
Yeah, I was going to say. The iPad Pro display has a 120Hz refresh rate and it's highly noticeable, and IMHO far superior to any other mobile device display.
Why would it be important to know? The previous situation was already industry best (compared to other tablets, Wacom, etc.); the new one is several ms better.
> Someone said "I want to know end to end latency", to which I comment that it isn't important to know this to consider this a positive improvement, given that there's a measurable marked reduction in latency.
It might not be necessary information to know to understand that an improvement was made, but knowing the end to end latency would be interesting to see. The gp expressed an interest in the end to end latency and you responded with a dismissive comment which didn't answer their question.
>The gp expressed an interest in the end to end latency and you responded with a dismissive comment which didn't answer their question.
It wasn't meant to answer their question; it was meant to point out the relevance (or lack thereof) of the answer to that question for appreciating the improvement.
In other words, the gp seemed to dismiss the improvement, because what would be important for them would be to know the "end to end" latency.
The feel of the thing is determined by end-to-end latency, not the latency of any individual part. Without knowing that, we can't evaluate how much of an improvement this is in relative terms: is the overall latency 50% of what it was, or 95%?
We know that the display is 120Hz and that the previous generation pencil/OS already took advantage of that to show less latency, so we already know it's on the 60Hz-to-120Hz scale...
Will there be a multi-user mode? Perfect for separating work from home or multiple clients... It would make iPad so much more usable as a single device for both work and fun.
Also whenever you want to update an app you purchased for the kids you have to log back into your account, and you just don't want kids poking into your stuff in general.
This is pretty cool. It keeps them within the app I want them to be in. You can even set it up so that when they're watching a movie, you can disable taps so they can't exit the video accidentally.
It took my 3yo niece about a week to learn how to escape Guided Access. At that point it became more of a hassle than a help. It might be fine for toddlers, but for any kid over 2 you need a fully sandboxed profile if you're not handing them a dedicated tablet.
Yup, she brute-forced it. This was back before touchID and faceID and IIRC GA pins were only four digits. The game quickly changed from let’s watch cartoons to let’s figure out how to change apps.
I mean, I just checked and now the pin is 6 digits long, and there's a 10 second time out between attempts.
Your original comment implies there's some kind of fundamental flaw with GA that makes it useless, which apparently couldn't be further from the truth.
Even with a 4-digit pin it takes 5 seconds to change it if it's figured out, and if the child has enough time to brute force 10,000 possible combinations without you being able to intervene and rebuff that behavior, I'd say that's its own problem.
10,000 combinations aren't required. The child stops when the answer is right. And if they've ever noticed one or two digits when watching the parent unlock the iPad for them to use, it drops to 1,000 or 100 max.
Maybe you're not aware, but technically you can guess any number in one try; saying you brute forced something with 10,000 combinations doesn't mean it took 10k tries.
10k is a representation of the fact that it's not likely to be guessed easily, and the new GA lock increases that number to 1 million combinations with 10-second lockouts.
Not to mention, yes, you should generally avoid showing a pin to the person you're keeping out.
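For what it's worth, the back-of-envelope numbers behind the PIN argument (the 10-second lockout figure comes from the comments above; I'm just doing the arithmetic):

```python
# expected guesses for a uniformly random PIN: about half the search space
print(10**4 // 2)  # 4-digit pin: ~5,000 tries on average
print(10**6 // 2)  # 6-digit pin: ~500,000 tries on average

# exhausting the full 6-digit space with a 10-second lockout between attempts
worst_case_days = 10**6 * 10 / 86400
print(round(worst_case_days, 1))  # ~115.7 days of nonstop guessing
```

Of course, as noted above, a child who has glimpsed one or two digits shrinks the space dramatically, so these are upper bounds.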
You don’t need to enter the pin to use GA. It’s also not the same pin used to unlock the iPad as your comment seems to imply.
Just wipe the iPad clean and see where the finger smudges happen, in combination with finger movements. I was doing this on my dad's HTC palmtop as a kid to access the browser.
If you sit down with them and watch/play together, absolutely. If you just plop the kid down in front and let Youtube's autoplay algorithm do the work, that's a very different matter.
It's stressful enough trying to keep up with all the expectations society puts on new parents without the additional hassle of strangers on the Internet criticizing your parenting style because it doesn't agree with their preconceived notions of how you ought to raise your children. Let parents parent.
With tvOS, you need different profiles for different members of the family. Otherwise your five-year-old ends up seeing a lot of mature TV shows in their recommendations, and the adults see various recommendations meant for small children. If you have small children and Netflix but don't have multiple user accounts set up, you know what I mean.
This is the simple and easy solution for tvOS, because they can't realistically expect you to purchase multiple Apple TVs for the multiple members of your family.
On the other hand, they very much do hope you’ll purchase multiple iPads.
This is what is necessary for me to buy an iPad Pro. I really like the idea of the device, but I can't justify the price unless I can share it and have personal and work stuff on there too.
Yay! I am sick and tired of crappy iPhone ports to the iPad where it wants to turn my iPad pro into a ginormous phone screen. At least now I hope that crap goes away.
It has been fascinating to watch the evolution of the iPad version of iOS and the desktop version of Windows 10 to better support a screen/pen/detachable keyboard sort of world.
I don't think this will do anything to help with the "crappy iPhone [app] ports to the iPad", at least not for a while. Those apps will still be there, and they'll likely still be approved in the future.
From my understanding, "iPadOS" is essentially just a marketing signal that the iPad-version of the OS will have more features unique to iPad. For developers, Xcode will still have the option to compile the same codebase for iOS and iPadOS. It's up to the developer to take advantage of more screen real estate when available, though SwiftUI will probably make that easier on the developer (i.e., more likely to show up in the apps you use).
This feels like it could become a repeat of Android Honeycomb, but iOS is a much more mature operating system now than Android was at the time, so I guess we'll have to wait and see.
Nonsense. iPad has had "iPadOS" since iPad-specific features like Split View landed in iOS 9. This is just Apple communicating more formally about that strategy with its userbase, and increasing support for iPad-centric workflows.
Consider that most of the iPadOS features shown are refinements of or much-asked-for enhancements of existing features:
- Multi-document Split View
- Slide Over app switcher
- Network share and external media support in the Files app
etc.
[edit: props to snazz, who corrected me on the iOS split view release.]
It is an interesting strategy, given that others have tried to unify their OSes (WinRT/UWP, Honeycomb -> Ice Cream Sandwich).
Honeycomb and WinRT were viewed as fragmentation, I think somehow it won’t be viewed the same with iPadOS
I think the tooling that’s rolling out to support building apps for each platform flavor (Marzipan, Catalyst, SwiftUI) make this a lot easier. There’s also always been a bias in some dev circles towards making dedicated iPad experiences and not just scaling the iPhone app UI. I think Apple has had the benefit of market position to let it take a long time to come to this point rather than needing to differentiate platforms to try to leapfrog over a competitor.
>Honeycomb and WinRT were viewed as fragmentation, I think somehow it won’t be viewed the same with iPadOS
iPadOS runs on at most 10-20 devices (and all are iPad models). Honeycomb and WinRT ran on hundreds and hundreds of devices, with widely different capabilities.
WinRT ran on the original Surface RT and maybe 2 Asus devices. After less than a year Asus bounced, as nobody wanted a Windows equivalent of a Chromebook. Even MS themselves killed the OS well before discontinuing the Windows Phone division altogether.
Gingerbread, Android 2.3, was released December 2010 for phones.
Honeycomb, Android 3.0, was released Feb 2011 for tablets. It was a radical new UI and introduced several new APIs. It was the largest Android update ever, except that no devices were updated to it. Android tablets didn't sell very well and developers had little to no reason to support Honeycomb.
Ice Cream Sandwich, Android 4.0, October 2011. This refined the Honeycomb changes and was appropriate for use on phones. It did work on tablets but wasn't really optimized for them until 4.1/Jelly Bean.
Honeycomb wasn't a fork or anything, but the strong divergence and poor sales of Android tablets meant that no one wanted to target Honeycomb until they were actually targeting Ice Cream Sandwich, and at that point it was barely worth supporting Honeycomb tablets, but still worth supporting Gingerbread phones.
I don't think it's particularly relevant to the iPadOS situation.
Technically, there was at least one device updated to Honeycomb - the HTC Flyer was originally released with Gingerbread and was later upgraded. I know, because I've still got mine, as handed out by Google at workshops where they tried to persuade everyone to use Fragments for everything. I think they've largely stopped trying by now...
Also relevant, that Honeycomb wasn't released into AOSP until ICS, which meant there were no third-party builds or even support from off-name brands.
Honeycomb was a rushed release to get proper tablet support out, and essentially wasn't released for phones (it worked there, but badly). Once the catch-up on the mobile side was done, you got a unified 4.0.
The same split happened with the original iPad – iPhoneOS 3.2 was only released for the iPad and then they merged them into iOS in version 4.0. Both were well supported though.
AFAIK, Honeycomb was primarily for tablets due to the (then current) UI scaling terribly and primarily being for portrait orientation. This was the era of 320x240 screens, after all, so there was bound to be some issues when scaling that up to tablet resolutions.
Since most of the improvements ended up being useful for the upcoming "phablets" as much as tablets, Google just merged those features into Jelly Bean and gave up on a separate tablet UI.
> AFAIK, Honeycomb was primarily for tablets due to the (then current) UI scaling terribly and primarily being for portrait orientation. This was the era of 320x240 screens, after all, so there was bound to be some issues when scaling that up to tablet resolutions.
No, not at all. Android's UI has scaled exceptionally well since long before then, and nobody on Android was using 320x240 screens at that time. Android had already long since done the high DPI jump from 320x480 (HTC G1) to 480x800 (Nexus One).
You need a UI that takes advantage of the extra screen real-estate, yes, but there was nothing about Android's core UI toolkit that failed to handle that in any way.
> Since most of the improvements ended up being useful for the upcoming "phablets" as much as tablets, Google just merged those features into Jelly Bean and gave up on a separate tablet UI.
Google never actually forked Android for tablets, so there was nothing to "merge." They just focused on the tablet UIs & devices such that the phone ones had bugs. You can boot up Honeycomb on a phone-sized device and poke about. Everything's basically there, it's just super buggy.
Honestly, that's exactly what the iPad needs. The focus on discoverability has crippled the iPad since the beginning, so it's about time Apple started making the iPad faster to use. Software designed to maximize productivity always has a learning curve.
I think an interface that presents you with multiple windows by default, and then lets you drag them, is a lot more discoverable than what the iPad is doing.
I agree. This is a challenge with the iOS (and Android) UIs, I think. Mice and trackpads have a more obvious and somewhat richer set of "verbs" for interaction, e.g., right-click versus "try tapping with two fingers," and menu bars make all their commands very easily discoverable in ways that toolbar-only UIs have a lot of trouble duplicating.
I'm not sure, though, what a better approach is on the iPad, assuming that we don't just give a menu bar. I'm pretty interested in a lot of these changes; as silly as it may sound, being able to display two documents from the same app side by side on the iPad will alone solve one of my huge frustrations with the system as it is now, and the changes to the Files app sound like they'll be really useful in practice. I know there will be a lot of folks, particularly in the HN crowd, who will be upset that iPads can't do [Insert Thing They Need], but iOS, er, iPadOS is reaching a point where there's very little it can't do that I personally need. (Adding my proviso that "the iPad can't do this thing I need to do the way I'm used to doing it" should not be conflated with "the iPad can't do this thing I need.")
> This is a challenge with the iOS (and Android) UIs, I think. Mice and trackpads have a more obvious and somewhat richer set of "verbs" for interaction
GTK+3 has a good approach there, I think. It turns the window titlebar into a thick "header bar", which includes a "title" part for easy grabbing as well as a handful of button-accessible menus (fewer than the top levels in a traditional menubar), including a kitchen-sink "hamburger menu" for lesser-used options. It becomes a bit less convenient for mouse&keyboard use, but the touch usability is absolutely there. Too bad that running Linux on tablet-like computers is still way too fiddly, it could be a fierce competitor to the iPad ecosystem for more pro tasks.
Added: And on recent GTK+3 releases, one can shrink the window horizontally and the buttons will simply shift to the bottom when there's not enough room for them in the top headerbar. This solves a flexibility issue with the previous headerbar-only approach (especially on smaller or lower-res screens), while still being quite intuitive.
> GTK+3 has a good approach there, I think. It turns the window titlebar into a thick "header bar", which includes a "title" part for easy grabbing as well as a handful of button-accessible menus
In case you don't know, that's modeled after macOS. (It has had it from the beginning, AFAIK.)
The one multi-window functionality I need is the ability to have the iPad in portrait mode and split the screen horizontally. That way I could have a book or movie on top and notes on the bottom. It seems that iPadOS still cannot do this very basic thing. It's frustrating beyond belief knowing that this device could be so much more useful with that tiny basic piece of functionality that has existed in GUIs for decades.
(And why on Earth does iOS allow me to split a portrait mode vertically, creating two thin strips that are seemingly useless for any work?)
> (And why on Earth does iOS allow me to split a portrait mode vertically, creating two thin strips that are seemingly useless for any work?)
Because developers can then present the iPhone interface.
> It's frustrating beyond belief knowing that this device could be so much more useful with that tiny basic functionality that has existed in GUIs for decades
Other systems have differently-designed interfaces designed for different kinds of interaction.
If the iPad interface allowed tiling apps top-and-bottom, the iPhone interface wouldn't be usable because the aspect ratio would be wrong and the iPad interface wouldn't be usable because navigation chrome would take up just about all the screen.
I could only see this working if Apple had developers using their iPhone landscape interface on that view, but so many iPhone apps don't have a landscape interface so barely any iPad apps would inherit this.
> And why on Earth does iOS allow me to split a portrait mode vertically, creating two thin strips that are seemingly useless for any work?
Depends on what work you're doing. I do it all the time. It's especially useful when you're writing something in Notes in one pane and referencing e-mail in another.
The bottom half of the screen is the keyboard space.
When talking about hidden UX, just remember the pull gestures on iPhone. I suppose a few more on iPad just add a little to the pile.
* Pull up a little to go home.
* Pull up more to get task list.
* Pull down from top to get notifications, date and clock.
* Pull down from right top corner to get quick controls.
* Pull down from middle of the screen to get siri suggestions.
* Pull right to see widgets.
yeah but once you learn it, you'll retain the knowledge
Pretty much the attitude of people learning Windows or Mac OS in the late '90s ("literacy"). It's gone from having to win an audience in the early iOS days to firmly having the audience in the company's grasp.
Well yeah, but don’t you think this is exceptionally hard with a touch screen?
Didn't Microsoft try to do this in Windows 8, with little icons in the corners? The problem is that little icons provide only marginally more information, if any, than the existence of the corner of the screen.
It’s a very hard problem and Apple works on it. They have abandoned previous standards on discoverability because of the inherent limitations of a touch interface on a small screen.
Sacrificing discoverability for usability is the right choice.
I think one issue is that some UI conventions have not been settled on, so I often have to look up how to do something on iOS.
The most recent example of this was “shake to undo.” After looking this up, not only did I learn how to undo typing, I also learned why I would occasionally get these inexplicable warning windows asking if I wanted to undo!
Honestly, these new changes (esp. the second-monitor-for-macOS support, if it works well) will probably finally get me to buy an iPad. This all looks very good to me. Real file system browser; much better multi-tasking; closer integration with the MacBook.
I think we've crossed the point where, for most people, an iPad (or surface) makes more sense for personal use than a laptop. I have one of the recent iPad Air models and find it more enjoyable to use for everyday tasks like web browsing, light photo editing and sharing, goofing off on youtube, etc. Had I known I was going to like it so much, I probably would have sprung for the more expensive iPad Pro model with face ID, the new pencil, etc. My personal laptop mostly sits unused these days. Admittedly there are some tasks, like booking travel for example, where I still prefer to use a laptop - but those seem to be less and less of an issue over time.
For development work, I don't see my MacBook Pro going away any time soon. But I do see the iPad becoming more incorporated into my professional workflow. It sits on my desk while I'm working and I use it to run things like iTunes, email, Slack, and personal reminders, where they feel like less of a distraction than when I run those things on my desktop. Integration between my MacBook and iPad is already pretty great when it comes to Handoff, AirDrop, Messages, Notes, etc., and Apple seems committed to continued improvement, as I think they realize that seamless integration is one of the things that makes people stick within their ecosystem.
This has been exactly my experience. If I’m working out of an office (I do a lot of on-prem consulting) I often leave my laptop at the desk during the week and use my iPad at home - it cuts down on the weight in my bag and has a small side benefit of helping me be better at separating work and personal time.
I might switch to writing code on an iPad. Even after all these years of writing code using a keyboard, I still think better with a pen. I realize that I'll be pushing text slower with a pen, but I'm hoping that the quality will improve!
This sounds.. crazy to me but hey if it works for you that's great and maybe you should post something! I'd be curious to see a workflow that uses handwritten code.
The “file system” is not new, the files app is at least a year old. What they added now was just zip support (really needed) and USB sticks (I don’t care but many love this).
Don't listen to all the whining podcasters who claim that the iPad doesn't have a file system. The Files app and its integration into many apps work amazingly well, especially in conjunction with iCloud Drive. And you can access the local directories, so you can put a video file in the VLC folder or a PDF in the Acrobat folder. You can of course also AirDrop to the iPad directly into these apps, but the Files app is there if you want complete control.
That ‘rumor’ was never prevailing - it was a theory, very frequently and quickly shot down by most Apple rumor sources, and by Apple itself at WWDC last year...
Calling it iPadOS feels like a step in the wrong direction for that. Calling it iPadOS could make people say "This is an iPad, not a Mac", whereas iOS is a more natural merger.
Never trust rumours. Rumours are tinged with bias.
Rather, prefer to trust the horse's mouth and its history. Apple has always said they wouldn't merge macOS and iOS, but they have a history of bringing iOS frameworks to macOS for consistency (AVFoundation was a big one).
Rumours are the fodder of those who cannot be bothered to learn about something but want the air of clairvoyance to impress friends; they are the life blood of those kids at school whose uncle definitely works at Nintendo and he has definitely played Pokémon Rainbow but you'll never get to see it because his uncle said it's classified.
Nah, then it wouldn't be a rumour, it'd be a leak. If you pay enough attention to the Apple ecosystem and horrendous tech journalism that describes it, you come to know the difference — rumours sound like Windows/Android-user wishlists that macOS/iOS become more like what they're used to, leaks sound like … well, like what Apple would do and does.
What's old is new again! Mac OS became Mac OS X became OS X became macOS. iPhone OS (or whatever it was called on the iPod touch) and iPad OS merged to be called iOS, and now there's a new iPadOS.
Your comment contradicts itself. It wasn't always called iOS if it was called iPhone OS "for a very short period of time."
And the period of time was not "very short" by my definition at least.
According to [0]:
> Apple announced iPhone OS 1 at the iPhone keynote on January 9, 2007, and it was released to the public alongside the original iPhone on June 29, 2007.
> Apple announced iPhone OS 2 at the iPhone software roadmap keynote in March 2008, and it was released to the public on July 11, 2008 alongside the iPhone 3G.
> Apple announced iPhone OS 3 in March 2009, and it was released to the public on June 17, 2009 alongside the iPhone 3GS.
> Apple announced iOS 4 in March 2010 and it was released to the public on June 21, 2010 alongside the iPhone 4.
It was called iPhone OS for roughly 3 years. About 25% of the time since the OS was released it was called iPhone OS.
The iPod touch launched 2 months after the original iPhone launched, and they both ran iPhone OS. Why are you so stubborn over something that can be easily searched and debunked?
iOS (and iPhone) were trademarks owned by Cisco, so Apple had to license them. As far as I remember, for iPhone they just released it and then bargained afterwards, with the exclusive VPN client being Cisco's. I don't know what they gave for iOS.
I owned a 2nd-generation iPod touch during the “iPhone OS” era. It wasn't always called iOS. I'm not actually sure what it was called on the iPod touch though. I found some old reference on Apple's site to “iPod touch FW 3”, but I don't remember how much Apple explicitly mentioned it ran iPhone OS.
Unbelievable sneaky 2004 Apple story: Engineers would come to our offices at midnight and practically slip machines under the door. One said, "Officially, this machine doesn't exist, you didn't get it from me, and I don't know you. Make sure it doesn't leave the building."
iPads still don't have a calculator app? I find that baffling.
Also baffling: People who will spend $600 on an iPad, but not 99¢ on a calculator app.
I sometimes wonder if Apple intentionally leaves these gaps in the ecosystem (Weather, Stocks, etc...), hoping that the market will bring forth solutions. It doesn't seem to work all the time.
A bigger screen to see memes on. iPads are NOT uber-popular because creators like them; they're popular because they're a giant beautiful screen you can watch Netflix/Instagram on.
PCalc does its math wrong (using floating point), and probably isn't what you want.
Calculator (infinity-symbol) and Calc HD Pro do it the right way.
(Check '(0.1x1024-102)x10-4' on a physical calculator and on PCalc/others: it should return '0' if the math is done using BCD (almost all physical calculators), or ~5.68e-14 with broken floating point math.)
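You can reproduce the discrepancy in a few lines of Python, with binary floats standing in for the "broken" calculators and the decimal module standing in for BCD-style arithmetic (a sketch of the rounding issue, not a statement about how any particular app is implemented):

```python
from decimal import Decimal

# binary floating point: 0.1 has no exact base-2 representation,
# so an expression that should be exactly 0 leaks a tiny error
fp = (0.1 * 1024 - 102) * 10 - 4
print(fp)    # ~5.68e-14, not 0

# decimal arithmetic (what BCD calculators effectively do) is exact here
dec = (Decimal("0.1") * 1024 - 102) * 10 - 4
print(dec)   # 0.0
```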
This is amazing! I've seen 2 recommendations for PCalc in this thread alone.
If an erroneous calculator app using floating point can be successful, that gives me hope to be successful as well. I mean no disrespect towards the app; it just gave me a bit of extra motivation.
I'm not sure how to ask this without it sounding anything other than genuinely curious, but it is: why would you spend money on a calculator app? What does it offer that other (free) calculator apps don't? Are free apps just so crippled that you need to pay for something full-featured and/or without something like ads? I'm not familiar with the iOS/iPadOS ecosystem.
You could ask a similar question to physical calculators, I think. Why would you pay for an advanced calculator when you can get the cheap-o solar powered calculators for free in many many places?
Because you want a quality product with some thought put in to it.
Well, I bought the nice physical calculators because I needed the functionality that they provided over the cheapo calculators.
The difference in hardware (processing and screen) between, e.g., a solar-powered four-function calculator and a TI-89 is representative of a difference in potential functionality and manifests as a difference in price (driven by real-world production costs).
However, one app doesn't have any fewer resources than another and doesn't have any hardware differences that would, e.g., add cost if you want to visualize a graph versus compute a number -- outside of the software costs. I guess I just assume all apps are a race to the bottom here, then, because I find it mind-boggling that there aren't free apps with functionality that mimics any and all paid apps (especially for a calculator app specifically, where the dev work is generally just _making it_ rather than inventing something new).
That all is to say: thanks -- I guess it's just a difference in features and quality comparing paid apps to free apps here. I don't really understand why that is the case (why don't people continually make cheaper/free calculators of equal quality?), but at least I understand the how. :)
I spent, what, $50 or more on a calculator toolkit for the Mac about 25 years ago, just because it was such cool software. Never got any value out of it, just fun.
We are absolutely spoiled by low software prices, and they’re literally spoiling the market.
Why would you NOT spend money on a calculator app if it’s better, or you just fancy it?
An iPad pro costs 1000 bucks and your time is probably worth around 100 bucks an hour, so if an app is good and costs less than 25 bucks you shouldn’t even stop to think before buying it.
A person with a well paid job that thinks twice about paying 5 bucks for anything he could potentially use for work even once is doing it wrong.
A paid calculator app might be better designed than a free one. Or it might not have ads, when a free one does. Or it being paid could mean that it'll be better supported with any necessary bugfixes and updates. Or the fact that it's a paid app could imply that it's better designed or better supported (without necessarily being true). Or perhaps you really care about something particular that the paid calculator app does right in your eyes, and the free ones simply don't have.
There are lots of reasons one might spend money on a paid calculator app instead of a free one, and I'm sure this list doesn't even cover all the reasons.
My point is that 'it costs money' is sadly not a way to gauge good calculators on ios, and a lot of the 'costs money' ones don't even have a free option to test/check-the-precision.
Note that I never said cost was a way to gauge good calculators. I was merely answering the question of why one might pay for a calculator app instead of using one of the many free ones. Again, this is why I very deliberately said "might" in all of my statements. By saying "might", I'm also implying "might not"...
There's also PCalc Lite, which is free. I bought full PCalc, but had them both on my phone for a while until I eventually deleted the Lite version. I rarely needed the full version, though I was happy to help support a developer doing good work by buying it.
I also like Wolfram Alpha. It has all the data, isn’t just a calculator, so I can enter in English `GDP of USA divided by land area` and get an answer. And a graph. And automatic currency conversions.
> Not sure how much of it requires network.
The downside is that literally everything it does requires an internet connection. Even `3+7=`.
IIRC phone-mode apps that allow it can run on an iPad in a framed mode so they don't look too weird. I get why they don't do that with the calculator because it'd be "poor UX" but "what do you mean I spent $1200 on this and there's no fucking calculator, my Apple phone and laptop both have one, you mean I gotta go sort through the app store to find something that's not gonna show me ads, then probably pay some more money on top of that $1200, to get a calculator, I mean there was one in Windows 3.1 FFS" is even worse UX.
Can someone recommend a good programmer's calculator for the iPad?
If I'm going to buy a calculator app, I might as well get a good one.
Bonus if it can do time calculations. (I used to have a physical time calculator that could do 15 hours + 3 hours, 21 minutes = x minutes, and it was surprisingly useful.)
Apple in 2009: Web browsing on a desktop is cumbersome and painful
Apple in 2019: We've brought the desktop browsing experience to the iPad
I mean I get it, they're maturing the platform. But window snapping and home screen widgets seem very "Windows Vista". That combined with the new swipe keyboard and finally adding storage support... it really just seems like playing catch up for things that have existed for over a decade now.
I would personally never use a device for professional work unless I had full control over the software on that device. Being locked in by the app store is still a non-starter for me.
At the surface level, it does. However, at the implementation level, I think Apple is slowly reimagining all of these things for a new paradigm — and the new paradigm has needed time to evolve slowly, for Apple to see what directions its users and developers wanted to go in, and to take the time to decide how to do things.
For example: the sidebar on Vista was next to useless because (A) its widgets didn't tap into any data people wanted, (B) the widgets weren't the right size to show any useful information, and (C) they didn't fare any better on Windows 7 because they were buried under windows. iOS widgets, on the other hand, directly tap into applications, can come in variable sizes, have a somewhat uniform presentation style, and one visits the Home screen many more times than one clicks the Show Desktop button on Windows.
Windows Vista gadgets and OS X Dashboard widgets seemed to be little toys, the styling to look like a little desktop accessory being more important than their actual utility. The way that iPadOS integrates Today Widgets with the home screen is more like putting useful information directly at users' disposal. Pair this with widgets like Shortcuts, Launch Center Pro, or Pythonista, and you also have a means for quickly starting workflows.
iPad's multiple windows isn't really window snapping in the same sense as on Windows. That's just an implementation detail. The actual concept, at least as of iPadOS (as opposed to iOS 11 and 12), is more like macOS' Spaces or Mission Control.
All in all, to appreciate something, you have to appreciate all the little details. To me, Today Widgets on the Home screen have the potential to offer productivity improvements similar to Alfred. I'm sure there'll be apps with widgets specifically crafted to take advantage of this.
I completely get what you're saying. Thinking about it more now, I was pretty off. Yeah, those were the reasons that Windows widgets failed and Dashboard Widgets on OS X were always hidden which made them pretty unusable. But, on Android I've got a ton of options for widgets and I still don't use them.
I think the only useful one I've really found is weather. Maybe that's because I don't use my calendar. Just going through the list of the ones I have available, I don't see any that really bring any value to what I'm doing.
What's replaced what I've used widgets for in the past is more context-aware tasks. I've got tasks in tasker for doing things like when I plug into the headphone jack, either my podcasts app or music app pops up (depending on which was accessed last).
Obviously I'm more of a power user, but I really think iOS/iPadOS could benefit from a workflow subsystem. Things like Activities in Android and the AppleScript system in macOS are insanely powerful for professional users. Unfortunately, it seems like most Apple apps on macOS are doing away with AppleScript APIs.
With iOS 13 it gets added as a default app and the shortcuts can be run automatically by triggers, so it's like a more user friendly and natively integrated version of Tasker.
It seems like the triggers in iOS 13b1 are: Time of Day, Alarm (stopped or started), Apple Watch Workout, Arrive/Leave Location, Connect to CarPlay, Airplane Mode, Wi-Fi, Bluetooth, Do Not Disturb, Low Power Mode, NFC Tag, and Open App (e.g. "When I Open Reddit").
iOS/iPadOS does have a workflow subsystem, which is how the now built-in Shortcuts and Siri Shortcuts work. What Apple needs is to offer more triggers for intents to be used like, as you say, a device being connected.
> iOS widgets, on the other hand, directly tap into applications, can come in variable sizes, have a somewhat uniform presentation style, and one visits the Home screen many more times than one clicks the Show Desktop button on Windows.
Android has supported exactly this since 1.5 (2009).
> The actual concept, at least as of iPadOS (as opposed to iOS 11 and 12), is more like macOS' Spaces or Mission Control.
I honestly have no idea what you're talking about here. Multiple workspaces aren't exactly a new innovation, and Mission Control just seems to be a fancier but less usable variant of alt+tab.
> Android has supported exactly this since 1.5 (2009)
But I think those widgets are still so Vista/Dashboard-like in nature; they're made to look good and present simple information, but rare are the widgets that are generally more useful than the features built directly into the launcher (of which my favourite Android launcher remains Nova).
> I honestly have no idea what you're talking about here. Multiple workspaces aren't exactly a new innovation
I didn't say the iPadOS implementation was a new innovation. You might be trying to read something into what I said that isn't there.
> Mission Control just seems to be a fancier but less usable variant of alt+tab
The difference is that Mission Control doesn't just handle open windows or apps, it also handles groups of windows (Spaces). It's also far from being less usable, especially if you are accustomed to using multitouch gestures — and with things like MacBook or Magic Trackpads, the crowd of people who're allergic to taking their hands off the keyboard no longer have the monopoly on productivity.
Basically, they took that productivity and brought it to a touch-first system, making it work for touch.
Not a new innovation, no. But the best kind of innovation: taking something that exists and improving upon it.
I am nervous this means Apple will hold back features from iOS to get people to buy an iPad. For example, they announced the new Files app with iPadOS but there is no reason that shouldn't all be available on an iPhone.
FWIW Garner classifies this error as "stage 1" (rejected) in a recent edition, so we're still safe for a while, at least in professional writing. However, I do see this mistake more often than I see either word used correctly in non-professional Web writing, including much work that is paid but neither written nor edited at anything like a professional level.
Garner attributes the error—among pros, anyway—to closeness to "leery". I'd guess it's more often just bad phonemic awareness when committed by the general public, though.
These two aren't homophones, but (in American English, anyway) "weary" does sound like "weird" rather than like "wear", while "wary" sounds like "wear" but not like "war", so it's kind of worse.
It specifically says "PS4" and "Xbox One S" controllers... I hope that just means "Bluetooth" and they're not doing some exclusive BS where only those two specific controllers will work.
I have a $30 8BitDo Bluetooth controller (that works on PS4, Xbox, Switch, Windows, macOS) that I'd love to just throw in my travel bag with my iPad when I'm on the road.
iOS has had MFi gamepad support, but it has never had general gamepad support; not sure it does even now. At least now you can have a gamepad that will work with iOS and all Mac games. Most Mac games do not work with MFi controllers.
I was also wondering how Apple will decide which features will go to iPadOS vs iOS. The Files app is a great example. Will there be a new trickle-down path from iPadOS to iOS?
Indeed. I really hope the SMB native support in Files comes to iOS. For example, I currently use Documents by Readdle, but I would love to eliminate that step.
This one boggles my mind. I probably wouldn't need it often, but there are some times where I wish I had it handy instead of needing to pick up my phone. (And without needing to install a 3rd party app.)
Files is an iOS feature. It was added in iOS 11. This new update just has new cosmetic features like the column view. I'm sure the external drive and server connections work on iOS as well.
This is because iPadOS is just a marketing name for iOS 13. iPadOS still runs the same SpringBoard as iOS and is referred to as iOS internally.
Indeed there's no reason those shouldn't be available on iPhone, so thankfully they are. The name is just marketing; if in doubt, just see what's new in iOS in general.
iPhone is Apple's biggest product by a large margin and it's in a super competitive space. There's no way they'll hold back features that reasonably work on it.
While I don't think that those necessarily mean it can't work well on iPhone, it does bring up an important point about feature differences between iPad and iPhone, and one that Apple surely considers: They are _different_ devices, and are _different_ sizes. Some functions will work better on one device than on the other, and vice versa, because of this. This is why it makes sense that iPadOS is now a thing, and why features might exist with one and not the other. It shows that Apple is carefully designing the OS for the different devices.
Alternatively, I used Tydlig a lot for a long while. Sadly it has been stagnant for some time (I'd like to be able to save "sheets", for instance). But its interaction mode (akin to a spreadsheet, in a sense) is very refreshing.
PCalc works as a notification widget, so it looks like you can have it floating on the new iPadOS home screen, or just pull down and swipe to access it everywhere else.
Surely it's not some kind of meddling with the device width to make responsive web design even more confusing. That would be very odd.
My hope is that it's some kind of user agent manipulation that thwarts device sniffing so you don't see a mobile phone stylesheet or get redirected to a mobile site while using your iPad.
Yep, that's it. Even major websites have this problem, with YouTube being a particularly bad one. If you have an iPad handy try loading up YouTube, then do a long-press on the reload button and request desktop site. It's a much better experience.
I don't have the betas, but I expect Apple is applying this automatically based on viewport size. Full screen Safari instance? Desktop page. Narrow viewport in split or slide-over multitasking? Mobile page.
Wow. I've somehow missed this little gem. I usually requested the desktop site via the share sheet option but this is far nicer. Ditto for a quick way to disable content blockers. Thanks!
This is what always confuses me when people say "macOS/iOS is soooo intuitive!" - discoverability of features is, well, horrible. You have to long/3d/long-3d press all manner of elements to discover them, not to mention single/two/three-finger swipes in differing directions, from screen, from edge.
Because these gestures are akin to keyboard shortcuts — they're not the only way to achieve something, just a convenient shortcut. One could always request the desktop site via the Share sheet.
It's worse because keyboard shortcuts you could learn from the menu bar. Mac convention has been that all shortcuts should be discoverable there, and you even see the menu title flash when you hit it.
To your point about request desktop site being in the share sheet, that's true. But recently closed tabs are not.
Advanced iOS features are definitely less discoverable. But it's a trade-off for how easy the essentials are.
Personally I'm always impressed that people put up with this kind of stuff from Apple. Press the home button 3 times, hold it for 3 seconds then swipe in a Z shape to take a screenshot!
Yes, this. This bridges some gaps for sites that are built as either mobile-specific (but not truly responsive), or as desktop-only. I would have thought that the whole device detection, mobile-specific thing would have died many years ago, but it is stubbornly persistent. True responsive sites have worked fine for ages, and should continue to do so.
On the desktop-only site side, it looks like Apple has also done some interaction refinements for problems touch-based users can hit on these sites, especially around distinct hover vs. click interactions, the need to "tap-twice" to essentially "focus then click", etc.
Honestly, just using the basics of "responsive web design" (search phrase) is most of it. That is, implementing your site using CSS media queries to adapt your layout across multiple device sizes. Contrast that to using hacky device detection to deliver some entirely separate and "mobile-specific" site. One anti-pattern that often happens is the "mobile" detection is wrong, and delivers some phone-sized UI to a tablet. Instead just leverage CSS to make your site look good at the user's current display width, whatever that is, without having to know anything about the device. This is also great for desktop users, e.g. who might have your site in a browser side-by-side with something else, or similar.
All of the "evergreen" browsers have a mobile design view built into their dev tools these days. You can enable that, switch between specific device screen profiles, and continuously drag the "screen size" handles to watch your site's behavior at different sizes. This is great for development, since your web inspector view stays at full size.
Combining CSS media queries with CSS flexbox is a hugely powerful way to customize the end-user experience for different devices. Flexbox not only makes basic layout adaptations straightforward (or free!), it lets you handle some otherwise tricky cases using tools like reordering support.
It's probably also worth searching for "mobile first web design" for further resources.
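To make the approach above concrete, here's a minimal mobile-first sketch combining a media query with flexbox. The class names and the 600px breakpoint are purely illustrative, not taken from any particular framework:

```css
/* Mobile first: one full-width column by default,
   then widen to two columns when the viewport allows it.
   Selectors and breakpoint values here are illustrative only. */
.cards {
  display: flex;
  flex-wrap: wrap;   /* let items wrap instead of overflowing */
}
.cards > .card {
  flex: 1 1 100%;    /* each card takes the full row on narrow screens */
}
@media (min-width: 600px) {
  .cards > .card {
    flex: 1 1 45%;   /* roughly two columns on wider viewports */
  }
}
```

Note that this keys off the viewport width alone, so the same stylesheet serves a phone, a tablet, or a desktop browser window narrowed beside another app, with no device detection anywhere.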
Thanks. Looking at a few CSS frameworks, I think I was confused by their coarse breakpoint values, which I took for an anti-pattern ("don't detect mobile with a hard-wired resolution"). Maybe my mistake was confusing resolution (after all, modern phones often match the resolution of many desktops) with CSS width, where mobile is usually portrait and desktop often landscape. Instead of a mobile-type presentation and a desktop-type presentation, just go with, say, one column if under 600px and two columns if more, and don't worry about whether it's a mobile device. Maybe that's what the CSS frameworks are actually doing.
It's already available with a long press on the refresh icon, "request desktop site". Probably it'll let you request a mobile version now and requests the desktop site by default.
I hope it's more than this. A lot of SaaS websites have features inexplicably broken on the iPad Pro, even though they display perfectly. And some Google sites refuse to load. This happens even if you request the desktop site.
But go to iCab Mobile, set the user agent to desktop, and the sites load and work perfectly.
No information on whether the iPad Pro keyboard is fully compliant in terms of arrow keys, etc. This really gets in the way of things like Cloud 9, hosted VS Code, etc.
As much as people (myself included) wish we could do native development on the iPad, I would quickly settle for just supporting Cloud 9. The only reason I have to bring two laptops and an iPad with me when I travel instead of just one laptop and an iPad is because Cloud 9 doesn't work on an iPad, for absolutely no good reason.
Off-topic: "Opening workspaces (including access to databases and app frameworks) will be disabled on June 30, 2019. Cloud9 will stop working on December 31, 2019." [https://c9.io/login]
Voice control (an accessibility feature for now) lays the groundwork for more pervasive computing for all, but mouse support fills a hole until that happens.
I don't think voice control will replace physical controls, but I do think it will grow in usage over time to become an important augment to physical controls.
Now I'm just waiting for pervasive handwriting input with realtime OCR and indexing (for instant search), which could replace all sorts of note-taking and annotation workflows (and the requisite printing to paper they now often entail).
The small print on the page that mentions this stat says that the measurements were taken on a new 12.9 inch iPad Pro using the Gen 2 Apple Pencil. Not sure what latency reductions may make their way to the other iPads, but nothing is stated right now.
It might be. On monitors, the difference between a 120Hz and a 144Hz refresh rate is only about 1.4ms of frame time (8.3ms vs 6.9ms between frames). Some people claim to be able to tell the difference.
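To put numbers on the refresh-rate discussion, here's a quick back-of-the-envelope sketch (plain Python, illustrative only): frame time is 1000ms divided by the refresh rate, and an input arriving at a random moment waits on average half a frame, and at worst a whole frame, before the next screen update.

```python
def frame_time_ms(hz: float) -> float:
    """Time between screen refreshes, in milliseconds."""
    return 1000.0 / hz

def added_latency_ms(hz: float) -> tuple[float, float]:
    """(average, worst-case) latency added by waiting for the next refresh."""
    ft = frame_time_ms(hz)
    return ft / 2, ft

for hz in (60, 120, 144):
    ft = frame_time_ms(hz)
    avg, worst = added_latency_ms(hz)
    print(f"{hz:>3} Hz: frame time {ft:.1f} ms, "
          f"adds {avg:.1f} ms on average (up to {worst:.1f} ms)")
```

Running this shows 120Hz and 144Hz frame times of about 8.3ms and 6.9ms, i.e. roughly a 1.4ms gap per frame, on top of whatever panel and input latency the display adds separately.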
Interesting development. I'd had the impression that they were moving towards macOS on all devices, but I suppose it comes down to naming, since so much is shared. Though it would be nice if a small MacBook were actually the same as a large iPad with a keyboard.
It seemed like they were on a path to make everything iOS, including the laptops. Granted, it was more of an iOS-ification, but that seemed like the general direction. Now I'm curious whether this is a shift in direction and they're going to split iPads from iOS.
I'm having a hard time finding any info about how this affects iOS / iPadOS app development. Do you still just build & submit once and it targets both platforms?
For a while Apple was trying to "bring iOS features to the desktop"; thankfully they've now realized each device form requires a special user interface.
I wish they'd introduce multiple accounts on the iPad. That, and let me directly and securely access files (and in particular my humongous photo library) on my Mac over WiFi, without all the iCloud bullshit. I know it's not going to happen, and Apple will continue to pretend this is not something users want, but one can have dreams, am I right?
One way to think about this is that iPadOS is now (presumably) a separate team at Apple. Each year a new version of iPadOS will come out, and that team will need to have features for it.
When it was just one platform, it felt like iPad features only got implemented when the iOS team had some spare time away from developing the clearly higher-priority iPhone features.
That looks like a fantastic feature, especially due to the fact that you can use Apple Pencil to control your Mac. I wonder if that will cannibalize Wacom sales. Having used both Apple Pencil + iPad Pro and a Wacom tablet (separately), I found it really hard to draw on the Wacom tablet without seeing what I was drawing under the pen.
Wacom also sells drawing tablets with integrated displays [1]. I've never used them or an Apple Pencil, so I don't know how they compare, but Wacom isn't totally caught out here.
On the other hand, I can see a lot of people buying (or already owning) iPad Pros and using this feature who might not buy a dedicated device for it.
Wouldn't Apple be okay with everyone switching over to their own iOS-based ARM products? They wouldn't have to deal with Intel for regular CPUs anymore, and they wouldn't have to support an extra software architecture on the development side of things.
I think the Apple of old that was willing to make bold decisions, even at the expense of part of its own product range, is gone. Soon Apple will start selling printers again!
i.e. I can connect multiple displays with Android via HDMI and a USB adapter, but DisplayLink technical support suggested that for an extended desktop I'd need to pester Google.
Probably worried too many apps would assume you have a mouse, so touch support would suffer from mixed-paradigm induced jank. See: the Surface. I wish they'd enable them only for remote desktop apps, though. Then I could sell my Macbook, get a Mac Mini, and remote in when I want to do desktop stuff.
What do you use for accessing macOS devices remotely? I've never had luck with VNC, very slow and choppy. The best solutions I've found are Chrome Remote Desktop, Parallels remote access, etc. Wondering if you found something that doesn't rely on an internet connection and can stay local to the LAN.
The built-in screen sharing for macOS (just called "Screen Sharing" and distinct from the "Remote Desktop Connection" program) is pretty good. I think it's just VNC though, so if you've not found that acceptable then yeah, probably not gonna work for you. An X client on iOS would be pretty cool, and someone might go to the trouble of making it if mouse support were added. Let me stick a BSD in a closet somewhere and client in from my iPad Pro with native-feeling X.
Everyone else's software/hardware is just that bad, so Apple can fuck around for years and switching's still a tough sell. That's why I'm still around, anyway. This WWDC looks like a move in the right direction for the first time in a while, at least.
Because we as developers also desperately want a UNIX-style system, and our lives are so much easier when we can choose one that won't get us laughed out of the room by commercial software vendors.
Because I can fully customise it using something like BetterTouchTool.
And so in IntelliJ I have custom buttons for executing common commands. I never remembered what each function key does in each app so it's useful for me.