Like you, I think that Xcode maybe gets a worse rap than it deserves, but it's also endlessly frustrating.
First, the performance is just bad. The difference in responsiveness compared to apps like VS Code or Panic’s Nova is night and day.
The attention given to the design of new features is piss-poor. Placing the AI functionality in the left sidebar makes no sense: all the other tools on the left are project management, while the "let me run weird functions and interact with stuff" UIs like the terminal, debugger, and logs live in the bottom panel. That's where it belongs, or maybe in a new tab in the main workspace area?
The SwiftUI preview canvas can't be floated as a separate window, making it all but useless on anything smaller than a 16" MBP (and only barely usable there). In fact, I think it might be impossible to use Xcode across multiple screens at all…?
Old simulator versions and cache files hang around forever; you need a third-party app like DevCleaner just to keep your storage from filling with nonsense. Cryptic messages like "copying symbols to device"… clear-cache that doesn't seem to clear-cache, that stupid list UI for info.plist…
I never thought I'd have anything nice to say about PNPM package management, but you can always just delete `node_modules` and reinstall and count on things working. Swift package management is a cryptic mess, and their insistence on using a GUI instead of a basic JSON manifest just compounds it. Like the info.plist thing, a lot of Xcode is based on a developer UI philosophy from the Mac Classic days that has mostly been abandoned by the rest of the world.
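To be fair, a standalone Swift package does get a code-based manifest (`Package.swift`) rather than JSON; the pain is mostly in app projects, where dependency info lives in the opaque `.pbxproj` and gets edited through the GUI. A minimal manifest (names here are placeholders) looks roughly like this:

```swift
// swift-tools-version:5.9
import PackageDescription

let package = Package(
    name: "MyLibrary",  // placeholder name
    products: [
        .library(name: "MyLibrary", targets: ["MyLibrary"]),
    ],
    dependencies: [
        // a pinned dependency, the same record the Xcode GUI keeps for you
        .package(url: "https://github.com/apple/swift-collections.git", from: "1.0.0"),
    ],
    targets: [
        .target(
            name: "MyLibrary",
            dependencies: [.product(name: "Collections", package: "swift-collections")]
        ),
    ]
)
```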
Mostly, I think the vitriol surrounding Xcode comes from the fact that Apple seems to think they're doing a good job, while their most ardent and adept users are insisting that they are not. Same boat as macOS, really.
> functionality on the left sidebar makes no sense
they really just need to get rid of 'sidebars' and go full-on panel oriented ui so i can put whatever inspector/tool on whatever edge of the window i want; i'm constantly switching between opening panels and closing panels and hunting and pecking for the right panel-within-a-panel with those tiny icons...
I'd like an option to make things like inspectors into floating utility panels like the ones that were common in Mac apps back in the OS X 10.0-10.6 era. This would be really nice for multi-monitor setups… your editors could use the entirety of the main window while inspectors get tossed over to the laptop's built-in screen or maybe onto one of those funky vertical strip external displays.
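For anyone who never used them: the kind of window I mean is just an AppKit utility panel. A rough sketch of one (plain AppKit, assumed to run inside an ordinary Mac app; not anything from Xcode itself):

```swift
import AppKit

// Inside a running AppKit app: a small floating utility panel, the
// pre-Lion style of inspector that hovers above document windows and
// can be dragged onto any connected display.
let inspector = NSPanel(
    contentRect: NSRect(x: 0, y: 0, width: 260, height: 420),
    styleMask: [.titled, .closable, .utilityWindow, .nonactivatingPanel],
    backing: .buffered,
    defer: false
)
inspector.title = "Inspector"
inspector.isFloatingPanel = true      // stays above normal windows
inspector.hidesOnDeactivate = false   // keeps showing when focus moves away
inspector.orderFront(nil)             // show it; park it on any screen
```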
> their insistence on using a GUI instead of a basic JSON manifest just compounds it
I think this is a big part of the problem. Apple owns the IDE and the programming languages; in theory this should lead to a great experience. In practice, because they insist you only use their languages with their IDE, and their IDE with their languages, it leads to lousy tool design.
Features that would be best implemented as part of the compiler suite are instead implemented in the GUI. File formats that could be simplified live on, because everyone is using GUIs in the IDE to edit them anyway.
Fixes that should be prioritized in the IDE get punted because the IDE isn't competing with any other IDE: it's the only way to develop the language, people will use it anyway, etc.
> modern incarnation with all of the flaws and seemingly missing QA process
… so I’ve been kind of biting my tongue on this thread because “works fine for me” is not interesting or helpful, but: it’s been working great for me since it was introduced in 2007.
Periodically a disk will get flaky or go bad, maybe once every 2-3 years. I’ll erase the drive and start over. I always have two backups running so there’s never danger of being completely unprotected.
I don't doubt the people having Time Machine problems, but they usually seem to involve some unusual setup like a NAS. But for every one person who has a problem and speaks up, I suspect there are hundreds or thousands who are just humming along without a hitch.
(and yeah, I do pray for a "Snow Tahoe," "oops all bug-fixes" macOS release, and I’d love to hear that there’s a team working not just to make Time Machine more resilient, but to expand it to do local backups of iPhones and iPads… a guy can dream)
> … so I’ve been kind of biting my tongue on this thread because “works fine for me” is not interesting or helpful, but: it’s been working great for me since it was introduced in 2007.
Is immediately contradicted by this
> Periodically a disk will get flaky or go bad, maybe once every 2-3 years. I’ll erase the drive and start over. I always have two backups running so there’s never danger of being completely unprotected.
Having to periodically erase the drive and start over is one of the problems we’re talking about.
In my experience, restoring files gets flaky before you reach the point of obvious backup failures, so you may be experiencing more problems than you know about if this is happening periodically.
As a current C# web developer, I think C# is amazing. I know multiple other languages (Java, JS, Python, and others) fairly well, and none of them measure up to modern C# in my opinion. Visual Studio is trash though.
Garbage collection hiccups are probably meaningless on a web platform. As far as real-time processing goes, one of the most significant figures in the history of C# thinks it's a bad fit. If you disagree with him, respond to the video, I guess.
Yeah, sure, there are some issues with game development and garbage collection. It's fine for a lot of other stuff though. He says that too, and still uses C#.
I also think there's probably a lot of skill issue involved. I've seen the code written by the average developer, and it isn't pretty. Very few developers actually write half-decent code. The vast majority write code that I'd just delete and rewrite rather than work with. Slow, buggy, messy, sloppy.
And then they write a game and there's a bunch of problems because their code is ass, and there's an angle where they can blame someone other than themselves, so they do. And so the authors of Mono and Unity etc. are held responsible: why can't they just fix their thing so that the bad developers can write bad code and still have a functioning game? And sure, if Swift can offer that, then it seems Swift is the better choice for this application.
But there are plenty of good games made with C#, so clearly it's also possible to do so.
You clearly saw some value in the convenience. Smartphone and smartwatch NFC offers that convenience everywhere. Even setting up palm authentication feels like unnecessary work.
I used it at Whole Foods because it applied my Prime code and charged me at the same time, without digging my phone out of my pocket. But my Whole Foods also has bad reception, so it's annoying to use the phone there.
I used Amazon One at my workplace all the time, but I only used it at the self-checkout line since I'd rarely get more than a few items, and the lines are shorter at this crowded neighborhood WF. There, I would scan all my items and use my palm to both log in to Prime and pay. Given that I would be scanning my own items, I much preferred it to phone or watch, as I didn't have to fish them out after scanning.
I am surprised nobody has mentioned the real joy of checkout at Whole Foods, which is that there is no annoying, incessant voice asking every self-checkout shopper, "Have you scanned your rewards card yet?" and "Please complete the transaction on the PIN pad." It must be sheer torture working all day with those going off constantly.
In theory, but not in practice. The devil is in the details. Yes, Apple Wallet and Google Wallet allow you to store loyalty cards, and those cards can be summoned using VAS and SmartTap, respectively.
But... while all payment terminals are compatible with VAS and SmartTap, very few have the firmware and a POS that can make sense of them. So, in practice, besides Walgreens and maybe CVS, it is not widely adopted.
I like to go running with nothing on me besides a house key, and it's useful to be able to stop by Whole Foods after the run and buy a snack without a phone, watch, or wallet.
I've consciously reduced my pocket contents from car keys+wallet+phone to driver's license+phone. I'd love to be able to get rid of the phone sometimes.
It all boils down to the tradeoff between convenience and security.
I don't think it is particularly easy to replicate a living hand with all the blood vessels.
And it is not particularly easy to get a NFC ring with a secure element compatible with payment terminals.
I thought that the engineering team at Amazon did a great job with Amazon One. I wish someone could pick up the tech and carry on.
For 2020s-era palm scanners you don't have to replicate a 3D hand -- just like a video chat doesn't replicate my 3D face. You just have to emit photons (some of them infrared, yes) in the correct pattern. The hack won't look like a 3D-printed hand; it'll look like a display panel that works beyond visible wavelengths. It'll probably be some device developed for a totally unrelated market, and then one day "whoops, all those palm scanners are 0wn3d" (in German, of course) will be a talk title at CCC.
But all this is academic. The real problem with biometrics is that when your password is a body part, you can't change your password.
I agree and I get it. But at the same time, it is only used for payment and discounts at the grocery store. Payment with a card is even less secure here in the US. So, I do not think that Amazon Go was particularly insecure, since it was just for credit card payment.
If someone manages to replicate my pulsing blood vessels from my hand and trick the scanner, that would be fine. I would dispute the purchase, and the store would just refund it without even pulling the camera footage.
Amazon Go was not used to hold access to bank accounts or crypto wallets. I think it was a good technology and balance between convenience and security, for the purpose (grocery loyalty and payment).
A twin or even sometimes a relative (son and mother) can unlock an iPhone and its banking apps using facial recognition. That is more concerning to me than Amazon Go palm scanning for groceries.
Set it up once with the credit card that has grocery rewards, hover your hand for two seconds, done.
Apple Pay on the phone or watch is super convenient as well, but it takes just a tad more time between selecting the pay option in the touch-screen menus and then selecting the matching card.
I save like 30s? Possibly.
Is this tech overkill? Most likely.
It’s not as central to the GOP platform as it used to be, though. Really, ever since the “War on Terror,” their messaging has mostly been about whatever the enemy du jour is. Small gov’t was a paleocon thing, and McCain and maybe Rand Paul are pretty much the last of them.
Bob Dole was the first and last Republican I ever voted for. I still think he was kind of a fun guy, although it’s good that his candidacy failed.
I’m kind of curious why Adobe hasn’t gone all-in on Linux by now. IIRC Adobe apps were built on cross-platform tooling like Java (InDesign) and AIR (all the CS-era UI), with the underpinnings in C++.
I guess Adobe doesn't exactly have much to gain from Windows failing, but their inaction does mean the open-source alternatives will continue to get better, and that will hurt them.
Every few years someone posts the same thing in the Adobe forums, which pisses off the moderators and Adobe "experts". They keep giving the same answer, which makes zero sense.
I'm paraphrasing here, but it's something along the lines of:
"Linux users are open source people and expect everything for free. They'll never pay for our products, so we see no need to port them for Linux users because so few of them would be willing to pay for our products."
A lot of other comments are cheeky ones like, "Adobe is already on a Linux OS, it's called macOS." *rolls eyes*
And you are correct, the OSS alternatives are getting better and closing the gap, but they're just not there yet. I hold out hope they will be one day.
> If 3 years ago you would have told me that Microsoft would singlehandedly sabotage their own OS, doing more Linux marketing than the most neckbearded Linux fanboy (or the most femboy Thinkpad enjoyer), I'd have laughed in your face
I have no idea what that Thinkpad burn is supposed to mean.
It's not a burn. Just acknowledging the memes about cross-dressing and programming/computer science that have been going around certain online circles for years.
I love this, but I’d rather have an indicator in the menu bar, an app notification, or maybe a nice flow around the "notch" like NotchNook. (https://lo.cafe/notchnook)
Can't really afford to play with an app that would have such an obvious hit on productivity and mental clarity.