Hacker News

macOS Catalina: Slow by Design? (sigpipe.macromates.com)

2030 points | posted 14 days ago by jrk | 901 comments
usmannk said 14 days ago:

It seems like there is a lot of confusion here as to whether this is real or not. I've been able to confirm the behavior in the post by:

- Using a new, random executable. Even echo $rand_int will work. Edit: What I mean here is generate your rand int beforehand and statically include it in your script.

- Using a fresh filename too. Just throw a rand int at the end there. e.g. /tmp/test4329.sh

I MITMd myself while recording the network traffic and, sure enough, there is a request to ocsp.apple.com with a hash in the URL path and a bunch of binary data in the response body. Unsure what it is yet but the URL suggests it is generating a cert for the binary and checking it. See: https://en.wikipedia.org/wiki/Online_Certificate_Status_Prot...

Here's the URL I saw:

http://ocsp.apple.com/ocsp-devid01/ME4wTKADAgEAMEUwQzBBMAkGB...

Edit2: Anyone know what this hash format is? It's not quite base64, nor is it multiple base64 strings separated with '+'s but it seems similar...

Edit3: Here is the exact filename and file I used: https://gist.github.com/UsmannK/abb4b239c98ee45bdfcc5b284bf0...

Edit4 (final one probably...): On subsequent attempts I'm only seeing a request to https://api.apple-cloudkit.com and not the OCSP one anymore. Curiously, there's no headers at all. It is just checking for connectivity.
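Regarding the hash-format question above: in OCSP, GET requests embed the DER-encoded OCSPRequest as base64 in the URL path, URL-escaped (RFC 6960), which would explain the almost-but-not-quite-base64 look. A minimal decoding sketch; the sample segment below is illustrative, not the actual Apple blob:

```python
import base64
from urllib.parse import unquote

def decode_ocsp_path(segment: str) -> bytes:
    """Decode the base64 blob that OCSP GET requests embed in the URL path."""
    raw = unquote(segment)            # undo %2B / %2F / %3D escapes
    raw += "=" * (-len(raw) % 4)      # restore any stripped '=' padding
    return base64.b64decode(raw)

# Illustrative segment only (first few bytes of a DER structure):
blob = decode_ocsp_path("ME4wTA%3D%3D")
assert blob[0] == 0x30  # 0x30 is the DER SEQUENCE tag an OCSPRequest starts with
```

A full parse of the request (issuer name hash, key hash, serial) would need an ASN.1 library such as pyasn1, which is beyond this sketch.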

varenc said 13 days ago:

Here's some shell script to use a random file name and have friendlier output.

  RAND_FILE="/tmp/test-$RANDOM.sh";
  time_helper() { /usr/bin/time $RAND_FILE 2>&1 | tail -1 | awk '{print $1}'; }  # this just returns the real run time
  echo $'#!/bin/sh\necho Hello' $RANDOM > $RAND_FILE && chmod a+x  $RAND_FILE;
  echo "Testing $RAND_FILE";
  echo "execution time #1: $(time_helper) seconds";
  echo "execution time #2: $(time_helper) seconds";
Introducing a network delay makes the effect much more obvious. Normally I see a delay of about 0.1 seconds, but after using the Xcode Network Link Conditioner (pf rules) to add 500ms of latency to everything, the delay shoots way up to ~2 seconds.

example output:

  Testing /tmp/test-24411.sh
  execution time #1: 2.32 seconds
  execution time #2: 0.00 seconds
With the Developer Tools permission checked, both executions report "0.0 seconds".
varenc said 13 days ago:

I tried just blocking "api.apple-cloudkit.com" with /etc/hosts. This reduces the delay but doesn't eliminate it; a connection attempt is still made every time. (I don't recommend making this change permanent. Just give your terminal app the "Developer Tools" permission instead.)
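For reference, the temporary /etc/hosts change described above would look something like the following (testing only; revert it when done):

```shell
# /etc/hosts — temporarily null-route the lookup (testing only; revert afterwards)
0.0.0.0 api.apple-cloudkit.com

# Then flush the DNS cache so the entry takes effect:
#   sudo dscacheutil -flushcache; sudo killall -HUP mDNSResponder
```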

After blocking that domain I can see that tccd and syspolicyd are logging some error messages to the console related to the failed connection. I don't recommend blocking because my guess is that'll put syspolicyd/tccd in some unexpected state and they'll repeatedly keep trying to make requests.

Try this for watching security related console log messages:

  sudo log stream --debug --info --predicate "processImagePath contains 'tccd' OR processImagePath contains 'syspolicyd' OR processImagePath Contains[c] 'taskgated' OR processImagePath contains 'trustd' OR eventMessage Contains[c] 'malware' OR senderImagePath Contains[c] 'security' "
syspolicyd explicitly logs when it makes the network request.

   syspolicyd: cloudkit record fetch: https://api.apple-cloudkit.com/database/1/com.apple.gk.ticket-delivery/production/public/records/lookup, 2/2/23de35......
(you need to enable private logging to see that url)
saagarjha said 13 days ago:

Enabling private logging is fairly annoying these days, unfortunately. (Interestingly, if macOS thinks you're AppleInternal, it will make it just as annoying to disable private logging…)

varenc said 13 days ago:

Wait a sec... I recognize that name. I only know how to enable private logging thanks to your detailed and informative blog post! Seriously, it's one of my favorite macOS things I've read in a while. I loved the step-by-step walkthrough using gdb you showed.

Though just today I saw that apparently an enterprise policy config can enable private logging in 10.15.3+ without having to disable SIP. https://georgegarside.com/blog/macos/sierra-console-private/

For reference for others: this is the blog post by OP on enabling private logging in Catalina. check it out! https://saagarjha.com/blog/2019/09/29/making-os-log-public-o...

saagarjha said 12 days ago:

I’m glad you appreciated it, but I think it also happened to be some of the fastest-to-deteriorate advice I’ve given :) I should go back and revisit this, as my system is currently stuck in a state where it unconditionally enables private data logging at boot (which means my crash logs have personal information in them unless I remember to turn it off with the workaround I’ve been using until now…)

krferriter said 13 days ago:

Huh this is crazy. 2 seconds is way slow and this shouldn't involve any network activity. Seems like a real problem.

Erlich_Bachman said 13 days ago:

They added artificial network latency to the config, just as they describe. That is the reason for the delay; it was made artificially long on purpose.

maremp said 13 days ago:

It’s not an unreasonable delay on a slow 3g hotspot. It’s problematic to have the performance tied to the network speed and suffer an overall slow performance because your network happens to be slow.

Erlich_Bachman said 13 days ago:

Have I written anything that contradicts that? I simply pointed out that in the example the delay was artificial, and that it was definitely due to the network, not something other than the network, as the comment suggested.

rurban said 13 days ago:

It's called lockdown for a reason. Apple was just the very first to implement centralized binary blacklisting and revocation. They call it notarization.

The problem is that they did it unannounced. There must be some really weird stuff going on in those managers' heads. How can they possibly think they'll get away with that?

dagmx said 13 days ago:

There were announcements about notarization around WWDC last year. They didn't get a lot of media traction, but there were specific pages detailing what's required from a developer and some basic details on how it would work.

From April 10, 2019: https://developer.apple.com/news/?id=04102019a

https://developer.apple.com/documentation/xcode/notarizing_m...

rurban said 13 days ago:

For each and every shell or perl script that I create and use privately? No, certainly not.

john_alan said 13 days ago:

Command line apps aren't affected by Notarization.

If you're compiling something yourself, the compiler won't put a quarantine bit on it and it will execute fine. Same with homebrew/friends.

Scripts don't need to be signed. There is something else going on here.

john_alan said 13 days ago:

It seems that, even though scripts aren't signed, IF YOU DON'T have the Developer Tools permission enabled for a given terminal, scripts are hashed and checked against known-bad digests.

Not a big deal, assuming no data is kept.

Also I wonder what it looks like if a script is deemed bad...
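The hash-and-lookup check described above might look, in spirit, like the toy sketch below. Apple's actual mechanism and digest list are not public, so everything here (the set contents, the function name) is hypothetical:

```python
import hashlib
from pathlib import Path

# Hypothetical local set of known-bad SHA-256 digests; this entry happens to be
# the digest of the empty file, purely as a stand-in.
KNOWN_BAD = {
    "e3b0c44298fc1c149afbf4c8996fb92427ae41e4649b934ca495991b7852b855",
}

def script_is_blacklisted(path: str) -> bool:
    """Hash the script's bytes and look the digest up in the local blacklist."""
    digest = hashlib.sha256(Path(path).read_bytes()).hexdigest()
    return digest in KNOWN_BAD
```

The interesting part of the thread's observations is that the real check appears to go over the network rather than against a purely local list like this one.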

kevinh456 said 13 days ago:

There was nothing "unannounced" about it. Notarization was introduced at WWDC 2018 and announced as required at WWDC 2019. Every macOS developer should have been aware of this requirement. It was a special project for my apps.

ghayes said 13 days ago:

I believe the concern here is that this is affecting not just macOS developers, but all developers who use macOS. That's an important distinction.

pjmlp said 13 days ago:

Developers who use macOS as a shiny GNU/Linux replacement are only getting what they deserve; they should have supported Linux OEMs to start with.

Those who show up at FOSDEM, carrying their beloved MacBooks and iPads while pretending to be into FOSS.

I use Apple devices knowing what they are for, not as a replacement for something else.

nottorp said 13 days ago:

Sadly it's not the "shiny"... it's the fact that macOS has a GUI that works.

I've been using Linux since the days you installed Slackware from floppies and recompiled your kernel to get drivers. The command line has always been bliss, but no one has managed to come up with a usable and consistent GUI yet.

Btw, does sleep work on Linux laptops these days? How's hi-DPI support?

green7ea said 12 days ago:

Sleep has been working on my last ~10 laptops and desktops; it's a non-issue at this point unless you have brand-new exotic hardware. I did have a motherboard issue on a first-gen Ryzen that required a BIOS update to get it working.

Hi-DPI works very nicely if you use GTK or Qt. For other apps, it really depends on how they are implemented. For me it has been working better than on Windows.

These are strawman arguments. Give Ubuntu 20.04 a try and you'll see stuff pretty much just works on any common hardware. You can even use Slackware and get everything working with a bit of fiddling.

macOS is a very nice OS, but it isn't FOSS and it isn't more capable at this point; it's just a personal preference. Pretending otherwise is disingenuous.

moe said 12 days ago:

> you'll see stuff pretty much just works

The problem is the "pretty much" part.

We all know what that means in practice. That's why OSX is popular.

fxtentacle said 12 days ago:

I switched my AI workstation to Ubuntu 20 last week, and the experience was fast and great. I can now run docker containers with cuda, use PyCharm to coordinate everything and have code completion as if the code was local, even if it's executing on a docker worker node in our data center.

200% scaling on my 4K screen looks great, wifi, network, sleep, gpu all worked out of the box. And the IDE behaves exactly like on OS X.

The only thing I disliked was the default Ubuntu color scheme, but that was easy enough to change.

happymellon said 12 days ago:

Then you cannot possibly have used macOS. There are plenty of flaky edges that actually don't work very well.

Fucking multiple desktop shit.

My MacBook Pro can't even remember the order of my monitors when it goes to sleep, or between reboots. Even Linux can do that.

bwat49 said 10 days ago:

macOS can only guarantee that everything works because Apple controls both the hardware and the software.

Windows can only guarantee that everything works because they have a monopoly, and therefore hardware vendors have to support Windows.

Most laptops don't ship with linux/are never tested with linux, so it's never going to work flawlessly on all possible hardware configurations. It's just not possible.

It does however, 'pretty much' work on most hardware.

And if you buy a machine from a vendor that actually supports/pre-installs/tests linux, all of the hardware will work out of the box.

runjake said 10 days ago:

It's that "pretty much" that's the debate.

I recently switched from macOS to Ubuntu 19.10 and then 20.04 as my daily driver and it's way flakier and has far more random app crashes than macOS.

That said, the system is fast, the UX is way further along than I expected -- in some ways it's got a better UX than macOS. It's way, way faster at nearly everything.

bwat49 said 10 days ago:

my point is that if you want to do better than 'pretty much', you should buy a machine from an OEM that actually supports linux

If you're installing it on a random windows laptop, you're never going to get better than 'pretty much', because the OEM doesn't support linux or test their hardware with linux.

happymellon said 12 days ago:

Sway does HiDPI nicely as well, so you don't even have to use the GNOME/KDE pair.

stilley2 said 13 days ago:

When was the last time you gave KDE a try? I just switched from using a tiling window manager and was impressed by how much stuff "just works" and the degree of customizability.

konart said 13 days ago:

>the degree of customizability.

That's part of the problem. Customizability is good, but in return you get inconsistency that you can't fix. And even if all system default apps look the same (they still look horrible in my opinion), 90% of 3rd-party apps look and feel different. You can hardly name a Linux (Qt or GTK) app that could be called elegant or at least thought through (UI-wise). Almost all applications still look like they were built to be used on some factory terminal.

nottorp said 12 days ago:

Last time I used KDE for a significant amount of time, something was distracting. Then I realized what it was: the "system tray" icons were erasing themselves, then getting redrawn one by one and readjusting their position with each redraw. Distracting as hell when you're trying to concentrate on the code in a nearby window.

Mind, that was in 2013, and hopefully KDE has improved since then. Perhaps it has even reached the level KDE 3 was at? It's been downhill from there.

Btw, I switched to Macs from running Linux with KDE as my desktop of choice full time.

Filligree said 13 days ago:

Sleep usually works, assuming you get a laptop that's known to work with Linux. The Arch wiki is good for this.

HiDPI is hit and miss. Some applications work, some (especially Java) break badly. Expect to need manual, fragile configuration. You also cannot set scaling per-screen, so you're SOL if you have heterogeneous monitors.

Personally I use Windows. I check back in Linux every few months, but WSL seems to be improving far faster than native Linux is, so there's not much reason to use it anymore.

Even once HiDPI works, assuming that happens, by that point I'll have HDR and VRR as requirements... and I have no confidence that those will work anytime soon.

cutemonster said 12 days ago:

Sleep has worked fine for many years, but the Hibernate button should be renamed to "Crash now, please, and again on the next restart".

IshKebab said 12 days ago:

Some people at my work use Linux laptops. Judging by the Linux Slack channel, no, sleep doesn't work reliably yet, external monitor support is terrible, and touchpads still suck. No idea about HiDPI, but I doubt it works reliably.

Whenever you bring anything like this up though you'll just get a load of "When was the last time you tried it? It works perfectly for me" replies. Linux users don't want to admit its flaws.

brmgb said 12 days ago:

It's pretty difficult to acknowledge a supposed flaw pointed out by a guy who knows a guy who uses Linux when you have never experienced it yourself.

I used Linux at work for years. Sleep just works, external monitor also just works. HiDPI was rough at the start but works fine now.

Touchpads do kind of suck. I generally really dislike the default mouse acceleration. Font rendering is still so-so if you don't have a HiDPI screen, and the most popular desktop environments are still kind of terrible.

But sleep definitely does work.

bwat49 said 10 days ago:

> Whenever you bring anything like this up though you'll just get a load of "When was the last time you tried it? It works perfectly for me" replies. Linux users don't want to admit its flaws.

Are you implying that those users are lying?

I'm sure sleep does work reliably for them.

'Does sleep work on linux' is a fallacious question to begin with, because sleep working/not working depends on the hardware.

On some configurations it works flawlessly, on others it doesn't. Therefore you will always have some people saying it works, and others saying it doesn't. FWIW, my current laptop is a machine that ships with linux (system76 darter pro) and sleep works 100% reliably.

In my experience, when sleep doesn't work reliably, it's usually due to buggy firmware behaviour because most vendors don't care about supporting anything other than windows.

Along those lines, since most OEMs don't ship/test linux, it's simply not possible for every single hardware configuration to work flawlessly with linux.

vetinari said 13 days ago:

You know, many things have changed since the days Slackware was installed from floppies. Even Macs got working virtual memory in the meantime.

pjmlp said 13 days ago:

It is hard to improve things when everyone is on other platforms.

I am mostly on Windows devices, and use a GNU/Linux aging netbook for travelling.

As far as this Asus 1215B is concerned, everything works, with the exception that the open-source AMD drivers were a downgrade from the binary blobs (OpenGL 4.1 => OpenGL 3.3, without video hardware decoding).

However I still kept it around, because although I don't target GNU/Linux as part of my work, I wanted to give Asus the message that selling GNU/Linux laptops might be a relevant business.

Eventually, when it dies, I will be a Windows/Android (and occasionally macOS) user/developer, but I am not using any of these platforms to emulate GNU/Linux; I use them for their own value.

vinceguidry said 12 days ago:

If you want a good experience with Linux on an ultrabook, you need to buy hardware designed for Linux. System76 or Purism are my recommendations. I don't trust Dell.

phatfish said 12 days ago:

This is the only way to do it.

The kernel devs or distros can't possibly support every hardware combination and BIOS bug for each hardware manufacturer.

For Windows, the hardware manufacturers have a reason to make the drivers bug-free; it's where they make most of their money, and Microsoft has the capacity to help them get things fixed if needed.

This doesn't exist for Linux unfortunately, unless you buy a laptop where Linux is fully supported (and you use the supported distro and kernel version most likely).

I have to say the main culprit for issues is usually power saving. I assume that's because ACPI is often badly implemented and power saving requires a lot of separate components to function together, to specification. Likely one doesn't, and the laptop comes out of sleep with the touchpad not working, or something worse.

komali2 said 13 days ago:

> Btw does sleep work on linux laptops these days? How's hi dpi support?

Both work out of the box with Ubuntu 18.04 running Gnome on a Thinkpad x1 carbon.

But having to flip a few switches is a funny excuse to handcuff yourself to OSX and the hardware required to run it.

fluffything said 13 days ago:

I've partially switched from macOS to Linux now that Wayland and PipeWire are reaching a mostly functional state, and I'm quite happy with it.

It took me maybe 150 hours during quarantine to do the switch, though, and I still haven't managed to properly connect to SMB at work...

vetinari said 13 days ago:

What problem do you have connecting to SMB?

It's one of the things that work better for me on Linux than on MacOS (no problem with browsing shares, no disappearing shares, no problem with non-normalized unicode filenames).

fluffything said 11 days ago:

It just doesn't connect/mount at all. Last time I tried to debug it, it was caused by a too-old SMB protocol version being used on the Windows side.

On macOS, I just click Connect to Server, and it works for me "as is".

vetinari said 11 days ago:

On MacOS, I get randomly appearing and disappearing servers in the sidebar (they disappear usually when I need them) and "cannot be opened because the original item cannot be found" for already mounted shares. It also keeps permanently mounted "photos" share on my home NAS and bad things happen when I try force unmounting it (but if it disappears because I'm not connected to my home network, that's ok for some reason). This got especially bad in Mojave and Catalina; there was a period of time (10.15.0 - 10.15.2) when I had to restart Finder if I wanted to mount share that was previously unmounted.

Never happened with Linux. What did happen is that there was a period of time on some distributions (circa Fedora 28-30?) when SMB1 discovery didn't work because SMB1 was disabled entirely. This was a security mitigation (EternalBlue/WannaCry/NotPetya), and Microsoft is doing the same in Windows 2016/2019/10[1][2]. In general, using SMB2/3 is a good idea anyway; Linux distributions/Samba eventually enabled SMB1 only for client-side discovery, and you can still enable SMB1 entirely if you need it for some reason - do you still have Windows 2003 someplace?

[1] https://blogs.technet.microsoft.com/josebda/2015/04/21/the-d... [2] https://techcommunity.microsoft.com/t5/storage-at-microsoft/...

bwat49 said 10 days ago:

> Last time I tried to debug it, this was caused due to a too old samba protocol version being used on the Windows side

IIRC, the only SMB version that would be considered too old is SMBv1 (which I'd hope they are not using on the Windows side... it's quite insecure and is deprecated by Microsoft).

uep said 13 days ago:

I'm on Linux now, very interested in using Wayland+Pipewire, but still stuck on Xorg. What distro are you using?

I was considering building a Wayland/Pipewire Desktop software stack from scratch since my distro doesn't support them yet. I have become partial to experimenting with new software this way because it allows me to switch back to my known-good distro software without rebooting (most things I care about preserving the state of exist in the console anyway).

If it is relatively supported in a specific distro, I'm sort of interested in trying it.

fluffything said 11 days ago:

I use Arch with Sway.

saagarjha said 13 days ago:

What if using macOS enables me to be a more effective FOSS contributor? What if I think that FOSDEM actually has many participants who aren't really into free software?

pjmlp said 13 days ago:

Then they are in the wrong spot to start with, and really didn't get the message about what FOSDEM is all about.

It is a bit hard to be an aspiring FOSS contributor given the foundations those contributions are built upon.

Those same Apple-loving users would be laughed at at FOSDEM if they demoed any of their stuff on Windows instead.

Yet there is hardly any difference between those corporations, going all the way back to their origins.

Somehow, after NeXTSTEP's adoption as OS X, NeXT and Apple's proprietary behaviour was forgotten and everything excused, because "hey, they are shipping a UNIX clone"!

Yetanfou said 13 days ago:

> What if using macOS enables me to be a more effective FOSS contributor?

How would that work? When you build a house on rented ground, the house may seem to be yours, but it can always be taken away from you.

saagarjha said 13 days ago:

I’m familiar with macOS and contribute to a number of FOSS projects from it. I’m less productive on other platforms.

Yetanfou said 13 days ago:

In that case you'd do both yourself and those who depend on your contributions a favour by taking some of that time to get acquainted with alternative platforms, seeing as Apple seems to be on a course that will make it harder and harder to use their platform for this purpose. Like the Boy Scouts (used to) say: "Be Prepared!" Install a few Linux/BSD distributions in a VM and use them for a while to get a feel for the platform and its strengths and weaknesses, so you have somewhere to land when the time comes.

saagarjha said 12 days ago:

I do use Linux for some of my work, especially when I’m working with ELF binaries. I’m just not as comfortable with it.

ecnahc515 said 11 days ago:

Your analogy isn't the best. This is like someone renting construction equipment to build a house on land they own, and finding out that the construction equipment phones home to the owners about how it's being used.

make3 said 13 days ago:

A developer who uses macOS != a macOS developer. I couldn't care less about what is announced at WWDC.

la_oveja said 13 days ago:

First? Windows SmartScreen has checked for malicious binaries since Windows 8.

yariik said 12 days ago:

> Problem is, that they did it unannounced.

No, the entire thing is the problem. Windows 10 can still open applications that were compiled in 1994, and that doesn't make it less secure.

m463 said 13 days ago:

Once you start something, it's hard to stop it.

Every software place I've worked gives a special urgency to security stuff.

And even if features don't come out regularly, security updates do. This is more of that.

jules said 13 days ago:

Isn't this what bloom filters are for?
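They could be, at least for the common negative case: a Bloom filter of known-bad hashes shipped locally answers "definitely not on the list" with no network round-trip, and only (rare) hits would need an online confirmation. A minimal sketch of the data structure, not anything Apple is known to ship:

```python
import hashlib

class BloomFilter:
    """k hash probes into an m-bit array; no false negatives, rare false positives."""

    def __init__(self, m_bits: int = 1 << 16, k: int = 4):
        self.m, self.k = m_bits, k
        self.bits = bytearray(m_bits // 8)

    def _probes(self, item: bytes):
        # Derive k independent bit positions by salting SHA-256 with the probe index.
        for i in range(self.k):
            h = hashlib.sha256(i.to_bytes(4, "big") + item).digest()
            yield int.from_bytes(h[:8], "big") % self.m

    def add(self, item: bytes) -> None:
        for p in self._probes(item):
            self.bits[p // 8] |= 1 << (p % 8)

    def __contains__(self, item: bytes) -> bool:
        return all(self.bits[p // 8] & (1 << (p % 8)) for p in self._probes(item))

bad = BloomFilter()
bad.add(b"digest-of-known-malware")       # hypothetical entry
assert b"digest-of-known-malware" in bad  # never a false negative
```

The trade-off is that a hit only means "possibly blacklisted", so a second, exact check is still needed; misses, which are the overwhelming majority of executions, would be fully offline.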

ComodoHacker said 13 days ago:

>Apple was just the very first to implement centralized binary blacklisting

No, AV vendors have done it for decades. In a more efficient way, though.

andy_ppp said 13 days ago:

Not sure it’s more efficient given how sluggish most AV software used to make my machine...

happymellon said 12 days ago:

Not as bad as Catalina

said 13 days ago:
[deleted]
kccqzy said 14 days ago:

OCSP is Online Certificate Status Protocol, generally used for checking the revocation status of certificates. You used to be able to turn it off in keychain access, but that ability went away in recent macOS releases.
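The plumbing described here can be poked at with plain OpenSSL: a certificate carries its responder address in the Authority Information Access extension, which is what clients read before issuing an OCSP request. A self-contained sketch using a throwaway cert (the responder URI is a placeholder, not Apple's; `-addext` needs OpenSSL 1.1.1+):

```shell
# Create a throwaway self-signed certificate that embeds an OCSP responder URI.
openssl req -x509 -newkey rsa:2048 -nodes -days 1 -subj "/CN=demo" \
  -addext "authorityInfoAccess = OCSP;URI:http://ocsp.example.com" \
  -keyout /tmp/demo-key.pem -out /tmp/demo-cert.pem 2>/dev/null

# Ask OpenSSL which responder the certificate points at:
openssl x509 -in /tmp/demo-cert.pem -noout -ocsp_uri
```

An actual status query would then be `openssl ocsp -issuer ... -cert ... -url <that URI>`, which is essentially the kind of request the thread captured going to ocsp.apple.com.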

VonGuard said 14 days ago:

Ah, Apple. When you can no longer innovate, just start removing features and call it simplicity...

throwaway851 said 14 days ago:

Another way to look at it is that Apple is making it harder to run the system in an insecure fashion. You may not agree with that decision, but I certainly appreciate how Apple is looking out for the safety and security of the user.

Tangent: as much as some developers hate that the only way to distribute apps for the iPhone is through the App Store, as a user I consider that walled garden of apps to be a real security benefit. When John Gruber says “If you must use Zoom or simply want to use it, I highly recommend using it on your iPad and iPhone only. The iOS version is sandboxed and reviewed by the App Store.” There’s a reason why he can say things like that and it’s because Apple draws a hard line in the sand that not everyone will be happy with.

userbinator said 13 days ago:

Another way to look at it is that Apple is making it harder to run the system in an insecure fashion. You may not agree with that decision, but I certainly appreciate how Apple is looking out for the safety and security of the user.

"Those who give up freedom for security deserve neither."

(Yes, I know the original intent was slightly different, but that old saying has gotten a lot more vivid recently, as companies are increasingly using the excuse of security to further their own interests and control over their users.)

The ability to control exactly what millions of people can or cannot run on "their" computers is an authoritarian wet dream. People may think Apple's interests align with theirs -- but that is not a certainty. How many times have you been stopped from doing what you wanted because of Apple? It might not be a lot so far, but can you break free from that relationship when/if it does turn against you?

roenxi said 13 days ago:

The quote isn't at all relevant to technical decisions though. Eg, there is enforcement that a program can't arbitrarily access any RAM it likes on the same machine. That is trading freedom for security and it is a good trade. And there isn't really an argument against gatekeeping software - users as a body don't have time to verify that the software they use is secure. I'd be shocked if the median web developer even reads up on all the CVEs for their preferred libraries. Gatekeepers are an overwhelmingly good idea for typical don't-care everyday users.

The issue is if it becomes practically impossible to move away from Apple to an alternative. Given that they have a pretty typical market share in absolute terms that doesn't seem like a risk right now. They don't even hold an absolute majority in what I assume is their strongest market, the US, let alone globally.

Wowfunhappy said 13 days ago:

Of course it's relevant! Software is a form of expression. Apple controls what types of expression are allowed on your phone.

A developer made a game depicting bad practices at FoxConn. Apple removed it for "Objectionable Content"[1]. How is this inherently different from Apple saying you can't use your iPhone to read a certain book?

Apple's restrictions also make it easy for authoritarian governments to ban software they dislike: https://news.ycombinator.com/item?id=21210678

[1] https://www.theverge.com/2012/10/12/3495466/apple-bans-anoth...

roenxi said 13 days ago:

It is identical, and if I considered my phone to be primarily a research platform I'd be really upset. I got really upset with YouTube mucking around curating what videos they allow on their platform because I want to choose my own videos.

But ultimately I own an iPhone because I need a GPS map, SIM card and web browser on the go. Apple doesn't exercise any creative control over those things. Apart from that they explicitly sell a highly curated platform. I expect them to make decisions I don't agree with; that is what curators do. That is the service they sell so I'm not going to complain.

If someone used that walled garden approach on my PC I'd be furious. On my phone, I give them hundreds of dollars for the privilege. If I were going to get upset about freedom and phones, which is reasonable, I have a loooong list of problems before I get to Apple's security model - starting with government interception of messages and moving down to having my name attached to my SIM card. Apple's activities don't really rate, and they have better incentives than Google.

PS. I'm not arguing against phones being scary. Look at the COVID tracking apps that some companies and governments are bringing out that might become mandatory one day. Or the way the US is known to use phone GPS to target drone strikes. Phones are terrifying. Apple's curating/censorship/what have you really doesn't rate on my threat model when dealing with a phone.

userbinator said 13 days ago:

If someone used that walled garden approach on my PC I'd be furious.

As this article shows, Apple is slowly moving in that direction for their PCs. They aren't going to be satisfied with locking down their phones only.

kiawe_fire said 13 days ago:

Are they really moving in that direction, though?

An App Store from which you can download software with confidence is a pretty sensible first step for most users.

Complementing that with a Notarization service for apps that can't live in the App Store, while still giving both users and developers confidence that the user is installing the "real" app, and not something malicious, seems like a pretty sensible way to protect most users outside the App Store.

And if all else fails, there are ways to allow running that un-Notarized, non-App Store app that you're sure you trust.

None of that seems like something that inherently means to take away your ability to run what you want on your PC, it just sounds like a common sense approach to giving your users confidence in what they run, and guiding them to do so safely by default, while allowing overrides as needed.

Are these ALSO things that Apple could use to lock down your PC completely?

Sure... but then, why bother with any of it if that was the intent?

They already have Mac App Store, and they already have the infrastructure to deal with a "whitelist only" approach, so why bother with this Notarization and Gatekeeper stuff at all?

Don't get me wrong, there's plenty of room to criticize Apple for their implementation. They are clearly figuring out some of this as they go, and trying to find a proper balance. That isn't easy, despite how many people make it out like it is.

Give the average user too many prompts or chances to override security, and they will do that, every time, without thinking it through.

On the other hand, bury the overrides too deeply, and risk making things miserable for the developers and power users who need to use your platform freely.

So far, I see only evidence that Apple is trying to find that balance, but no evidence that they intend to lock the entire platform down entirely.

Are they doing it perfectly? Clearly not. But I think if we're being honest, no other platform has either. I appreciate Apple's approach the most so far, but time will tell if they are able to figure this balance out or if another platform will at some point.

vetinari said 13 days ago:

> They already have Mac App Store, and they already have the infrastructure to deal with a "whitelist only" approach, so why bother with this Notarization and Gatekeeper stuff at all?

Change management. For the same reason why Ebay had to backtrack changing their background color and do it again, slowly.

kiawe_fire said 12 days ago:

That's certainly possible.

But as someone who has been using Macs on and off for about 10 years now, I've heard people shout that Apple was locking down Macs from the moment the App Store was created on iOS (and long before it came to MacOS). So far, that hasn't happened.

Is it possible this is the next step in a 10+ year plan to "boil the frog slowly"? Of course! Not sure how they would accomplish this without also losing the developers they need to continue making both MacOS and iOS viable platforms for users, but I guess if they just don't care and want to lock everything down, this could certainly be one more step towards their long-term nefarious goal.

But it also still seems like a reasonable step towards making their platform more trusted and secure for the average user while continuing to give devs and power users control.

So far, I see no evidence for the former, and enough evidence for the latter, that I'm not too worried.

davrosthedalek said 13 days ago:

Last time I checked, they force you to use the Safari engine for your web browser on iOS. Also, having a curated app store doesn't mean they have to disallow any other means of installing software. It's even OK if they say: you installed other software, no support for you. But making it impossible is a money grab.

pjmlp said 13 days ago:

Not at all; you are always free to buy computers, phones and tablets from other vendors.

Don't go buy Apple and then cry in the corner that you aren't getting the right set of toys to play with.

I use Apple devices and fully support not having random apps uploading my stuff to the world.

pinopinopino said 13 days ago:

Sure, you can buy whatever you want, you aren't living in a dictatorial country. Sadly enough, most people can't say this. Therefore it is important for you to fight decisions like this. If something doesn't exist, it cannot be abused by some regime.

I am going to say something very cynical now, if the reader doesn't like that, he should tune out now. But I guess Apple can't wait to have that special China deal. ^_^

pjmlp said 13 days ago:

Except Apple isn't a dictatorial country, and there are other computer vendors to choose from.

Apple isn't the Mafia, making personal visits to advise you to buy Apple computers or else accidents might happen.

Buying an Apple computer is a conscious decision.

I love how many around here make their decisions, and then feel entitled to complain and point the finger at big corporations, as if these corporations are the only ones to blame and they, poor souls, were misled.

pinopinopino said 13 days ago:

Multinationals are not countries, but they operate in multiple countries, and their actions can influence the people in those countries. If Apple builds a mechanism to stop certain software from being installed, then China can abuse that mechanism.

And I am entitled to complain about big corporations. That is the beauty of living in a free country; and even if it weren't free to complain about them, I would still do it.

I rather see them all burn today than tomorrow.

pg-gadfly said 13 days ago:

Buying a house and suddenly getting your water cut off because the county "doesn't feel like it" is also, similarly, a "conscious" decision, and similarly bites you only some time after you bought something.

You might say that's illegal, and I'd recommend thinking about why it has become that way. Things are deemed important to everyday life, and suddenly they aren't fair game.

pjmlp said 13 days ago:

Which fails again as an example, because legally it is not the same thing.

pg-gadfly said 13 days ago:

It's can vs. can't, which is perfectly comparable: in both cases you can't know what you get until afterwards, which is not acceptable. When the freedom to use your own devices is in question, it needs to be addressed.

Shifting the blame onto the victims by saying they should have known the county could do that is just sheltering yourself from an uncomfortable truth.

I don't want to feel like I'm being taken advantage of either, believe me. It's just better to fight back than to let it roll over you.

userbinator said 13 days ago:

When they force their proprietary standards on everyone else... https://news.ycombinator.com/item?id=23250831

saagarjha said 12 days ago:

Apple was the first major HEIC adopter, but it’s not really something proprietary they came up with: https://en.wikipedia.org/wiki/High_Efficiency_Image_File_For...

ecnahc515 said 11 days ago:

I agree. I'd take your point on gatekeepers being a good idea further.

Gatekeepers are a good idea even for experts. There's a reason it's still in your best interest to use battle-tested crypto libraries instead of writing your own, even if you're a security expert: experts make mistakes too, which is why auditing is so important.

Now for this to hold, we need to assume Apple has done a good job with their notarization system, and that it's regularly audited to ensure it's not causing too many issues.

In this case, I trust that Apple isn't doing these things to make developers' lives harder. They're doing it because it's incredibly difficult to make something both ergonomic for experts (developers) and secure/safe for non-experts (average end-users), and they would rather ship something less than perfect for developers if it's going to help non-developers.

kevinh456 said 13 days ago:

So keep a Linux box if you want. Don't shit on people for using a mac.

I can use macOS, Windows 10, and any Linux distribution I want without having to pick just one. That's freedom. I have choices. I choose all of the above in my personal setup. I'll fight to keep my free software, but at the same time, you can pry Logic on the Mac from my cold dead hands. I've been using it for 15 years and I'm not going to stop now. Use the best/preferred tool for the job you have to do.

VonGuard said 12 days ago:

I expelled Apple from my life 5 years ago and couldn't be happier. Before that, I'd been using their stuff for longer than you; I was quite close to the company for a time, covering them as a journalist full time. I have 3 Linux boxes and a Windows box. I shit on Apple from a great height. Their entire ethos has been lost, and they don't make anything easier. My folks continue to use them, and my father's business life has been nearly ruined by their CONSTANT updating of the OS and ending of support. He's almost 80; he's not going to learn anything new. But he hit one button accidentally when it prompted him, and now he's been updated to god knows what newer-yet-still-unsupported version of their OS, his email client stopped working, and his legitimately paid-for iTunes music stopped working. Apple has contempt not only for its users, but for its developers and fans. It treats them all like morons.

I thought this was computing for the masses.

austincheney said 13 days ago:

The original quote from Franklin was about liberty, not freedom. A subtle but vitally important distinction, as freedom requires security where liberty does not. If you sacrifice freedom for security you still at least have security, as in a despotism, but if you sacrifice security for freedom you have neither. Conversely, if you sacrifice liberty for security you have less liberty without any increase in security, resulting in a net loss.

austincheney said 13 days ago:

This is perhaps, strangely enough, the most contentious comment I have placed on HN. Last night when the comment was fresh it was quickly upvoted at least 7 times. This morning I awoke to find it downvoted back to its original 1 karma. I am unclear as to how this comment became so polarizing.

Here is the Franklin quote (I encourage you to read the whole article): https://www.washingtonpost.com/news/volokh-conspiracy/wp/201...

yesenadam said 13 days ago:

I always thought the two words are synonyms. (That belief somehow survived decades of philosophical reading, media, and more than a few moral/political philosophy courses.) Here in Australia, liberty sounds like a USA word. We talk of civil liberties etc, but not liberty on its own like that. That sounds 18th C and/or estadounidense.

Your distinction sounds like (what I learnt as) Berlin's negative and positive liberty:

"Negative liberty is the absence of obstacles, barriers or constraints. One has negative liberty to the extent that actions are available to one in this negative sense. Positive liberty is the possibility of acting — or the fact of acting — in such a way as to take control of one's life and realize one's fundamental purposes. While negative liberty is usually attributed to individual agents, positive liberty is sometimes attributed to collectivities, or to individuals considered primarily as members of given collectivities."

"The idea of distinguishing between a negative and a positive sense of the term ‘liberty’ goes back at least to Kant, and was examined and defended in depth by Isaiah Berlin in the 1950s and ’60s."

https://plato.stanford.edu/entries/liberty-positive-negative...

That article goes on:

"Many authors prefer to talk of positive and negative freedom. This is only a difference of style, and the terms ‘liberty’ and ‘freedom’ are normally used interchangeably by political and social philosophers. Although some attempts have been made to distinguish between liberty and freedom (Pitkin 1988; Williams 2001; Dworkin 2011), generally speaking these have not caught on."

Ah that's what I thought!

Also, referring to your other comment, if a "despot can do whatever he wants to you or to your family", like disappear you in the night, and it's not a loss of security, I'm not sure what you mean by 'security'.

delian66 said 13 days ago:

In despotism, you do not have security either - the despot can do whatever he wants to you or to your family.

austincheney said 13 days ago:

That is a loss of freedom, not security. Compare that to living entirely on your own in the wilderness, where you will enjoy maximal freedom with no security from people, nature, or starvation.

That distinction is why, in history, non-civilized people found civilization abhorrent and why other people would choose to live under a despot as opposed to living on their own. In the ancient world people were not friendly to the idea of abandoning freedoms for class distinctions, but once they had it, they were not willing to sacrifice personal security or quality-of-life increases for the risk of death and starvation.

That is why people claim freedom isn't free: many people, even now, are ready to abandon freedoms for increased security, as opposed to making the extra effort required to increase both.

gowld said 13 days ago:

That’s not close to the original quote. And it was just Ben Franklin politicking, not the word of god.

deathgrips said 13 days ago:

No one cares, it's the concept that matters. This is on the same tier as saying "haha hey buddy looks like you typed 'there' instead of 'their' haha #rekt".

mikeyjk said 13 days ago:

> No one cares, it's the concept that matters. This is on the same tier as saying "haha hey buddy looks like you typed 'there' instead of 'their' haha #rekt".

While the content / concept is the main point, facts matter. Even if it is ancillary to the intended message. Why suffer misinformation no matter how small?

zanethomas said 13 days ago:

Another way to look at it is that Apple is moving towards a future where all software for the mac must be purchased from the app store.

Bubye Apple, my next machine will likely be a Dell Ubuntu.

amatecha said 13 days ago:

Yeah, this is the future I've been foreseeing for years. Every new OS update just ever so slightly decreases your ability to control what software is on your device, and how you can use it.

For example, you used to be able to back up your purchased iOS apps to your computer, and restore them from your computer. In one iOS update (9 IIRC?), they removed the ability to back up the apps from your phone. In a later iOS/iTunes update, they removed the ability to restore backed up apps from your computer, making your existing backed-up apps useless, if you still had them.

Now, the only way to keep your software on your iPhone indefinitely is to never delete it, and never reformat your phone. Ohh and never update iOS because they will break backwards compatibility with apps you already have. For any app that is no longer supported by the developer, you're just out of luck (and I have purchased MANY such apps, being an iPhone user since 2009).

hoppeilene49 said 13 days ago:

> making your existing backed-up apps useless, if you still had them.

This isn't true. You can still install existing IPAs you have saved in the past by syncing it with Finder. You can also just AirDrop an IPA to your iOS device to install it.

> Now, the only way to keep your software on your iPhone indefinitely is to never delete it, and never reformat your phone.

You can still back up IPA installers by downloading them with Apple Configurator 2. https://ios.gadgethacks.com/how-to/download-ipa-files-for-io...

amatecha said 13 days ago:

I can't seem to find documentation about AirDrop installation of .ipa backups I have. Also that Apple Configurator 2 process appears to force me to update the apps before they are backed up (I have automatic updates turned off because of how often app updates tend to be regressions rather than improvements)... Also, how do I "sync it with Finder"? (what is "it"?)

pietrovismara said 13 days ago:

If I may ask, why do you still persist with Apple products then? Sounds like masochism from here...

amatecha said 13 days ago:

I have no intention of buying more at this point. The last was the iPhone 8 in 2017. No clue yet what I'll do in the future for a smartphone, because I don't see Android as an option at all. Hopefully this iPhone 8 lasts forever :)

pietrovismara said 13 days ago:

Personally I find smartphones less and less useful. I use them mostly to stay in touch with people or to read articles online, and I do all my work from a laptop anyway. I used to buy flagship Android phones but I realized that it's wasted money. Now I have a 200€ Samsung phone, it works fine, yesterday it fell and the screen glass broke a bit, I couldn't care less.

If I keep going at this rate, I think I will quit smartphones within a few years.

Yetanfou said 13 days ago:

Get a server or some hosting, load it with whatever you need - mail, web, cloudy things, media, communications etc - and use a portable terminal to access it when on the move. That portable terminal can be a phone with a browser or some future device which is more tailored to this type of application. With the current generation of SoC, Wasm and a capable browser (Firefox Nightly Preview is shaping up nicely) this setup is a viable replacement for most 'apps'. One of the advantages of such a setup is that those 'apps' do no get to track your every move - that is, as long as that capability is not built into the browser at some stage (persistent web workers etc).

vbezhenar said 13 days ago:

iPhone SE is iPhone 8 on steroids.

m463 said 13 days ago:

This is sort of an ecosystem pattern.

The first Xbox was offline; subsequent Xboxes were more intrusive.

The first Windows PCs were offline; now they have become spy ("telemetry") machines.

Apple has reined itself in (a bit), but they just as stubbornly put business decisions above user wants.

pmarreck said 13 days ago:

Mine is already on its way to being a Linux workstation since, in addition to all the developer hostility of the past few years, Catalina essentially killed off Mac gaming (something like 75% of Mac games are 32-bit?). Prior to that it was merely a joke, but it was nice to have an occasional game to play. Now? Nope: App Store and recently updated game code, or GTFO.

cjohansson said 13 days ago:

Dell Ubuntu is not a good choice, they don’t provide proper drivers and their support has zero knowledge about Linux

m463 said 13 days ago:

Ubuntu phones home a lot too.

motd-news, apport, snaps, whoopsie, kerneloops, ubuntu-report, unattended-upgrades, ...

cookiengineer said 13 days ago:

> Dell Ubuntu

A rolling distro such as Manjaro or Arch with the AUR is a better choice.

api said 13 days ago:

The problem is that there is more than one market here. There is a general market where people love the vendor looking after their security and doing things for them, and there is a pro/hacker market where people want to control things themselves and don't want a lot of this stuff.

rsj_hn said 13 days ago:

This. Yes the option of a walled garden is a great thing and I wouldn't recommend anything but an Apple device to my non-technical relatives. But if Apple also wants to make the $$ that comes from selling "pro" gear, they need to stop relentlessly consumerizing and turning OS X into iOS. I don't think they realize the level of ill will they are engendering in the developer/pro market.

Perhaps it's time for a "Pro" and "Home" Mac OS.

saila said 13 days ago:

I've been doing software development on macOS/OS X for quite some time now and the consumerization aspects don't bother me. I install almost everything I need via Homebrew, from software libraries to desktops apps, and the fact that there's an App Store isn't particularly relevant (although I do use it for consumer apps now and then).

I'm trying to think of how macOS is so different from 10/20 years ago. What's missing? What can I not do now? Maybe my brain has just been consumerized and I forgot something important.

I was going to switch to Linux 10 years ago when people were talking about the iOSification of OS X back then, but that never happened.

nrclark said 13 days ago:

Do you write much system-level software? I feel like Apple's changes don't affect the XCode crowd much - but under the hood, things are slowly getting worse for command-line developers.

How about when Apple removed /usr/include in its entirety from Mojave? Or when they decided to make the root filesystem read-only? Or when they removed the ability to permanently disable the "only run verified apps" option? Or when they even made that the default in the first place?

How about when they stopped supporting or updating the MacOS X11 server, which doesn't have proper GPU support and probably never will?

How about when Apple replaced gcc with a thin wrapper around clang, so that /usr/bin/gcc generates identical code to /usr/bin/clang? Or how they froze all GNU tools (including bash) at the last-released GPLv2 version, just so that they could retain the option to lock you out from modifying your OS install?

How about the fact that Apple has officially deprecated Python on MacOS?

How about the increasingly slow filesystem access? Not a big deal for app users, but terrible for shell scripts and system software generally.

How about when Apple removed the ESC key from two generations of Macbook Pro? And also how they replaced the function keys with a touchbar?

Did you know that Apple will soon be using zsh for /bin/sh? Without much regard to how many shell scripts have a #!/bin/sh hashbang and some bashisms in them? You can call those scripts buggy or poorly designed if you want - but they're plentiful and widespread, and will be broken so that Apple can steer clear of GPLv3 code. All so that they can block you from modifying your OS installation.
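
The /bin/sh hazard is easy to demonstrate. A minimal sketch, nothing macOS-specific: the portable POSIX spellings below behave identically under bash, dash, and zsh's sh emulation, while the bash-only equivalents noted in the comments can break under a stricter /bin/sh.

```shell
#!/bin/sh
# Portable POSIX forms of two common bashisms.
x="hello"

# bash accepts [ "$x" == "hello" ]; POSIX sh requires a single "=".
if [ "$x" = "hello" ]; then
  eq_ok=yes
fi

# bash accepts [[ $x == h* ]]; the portable prefix test uses "case".
case "$x" in
  h*) glob_ok=yes ;;
  *)  glob_ok=no ;;
esac

echo "equality: $eq_ok, prefix: $glob_ok"
```

Running it under `dash script.sh` and `bash script.sh` gives the same output, which is exactly what the bash-only forms don't guarantee.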

MacOS was a Unix nerd's dream 10 years ago. It was fast, reliable, and it had a good terminal paired with amazing hardware and software that "just worked". Over time, everything that attracted me to the platform has slowly eroded. I stopped buying or recommending Macbooks in 2016, and only use one now because my employer is an Apple shop.

oarsinsync said 13 days ago:

> Did you know that Apple will soon be using zsh for /bin/sh? Without much regard to how many shell scripts have a #!/bin/sh hashbang and some bashisms in them? You can call those scripts buggy or poorly designed if you want - but they're plentiful and widespread, and will be broken so that Apple can steer clear of GPLv3 code. All so that they can block you from modifying your OS installation. MacOS was a Unix nerd's dream 10 years ago

Yep. Sorry. I’m struggling to connect “Unix nerd” to “thinks /bin/sh and /bin/bash are the same”, especially as that’s very much a Linux distro created problem, and (the clue’s in the name) Linux Is Not UNix.

john_alan said 13 days ago:

Interesting analysis, thanks for sharing.

Command-line apps installed via Homebrew don't go through Gatekeeper/notarization though.

I don't know why people seem to think they do...

What am I missing? I'm on the latest Catalina and, for me, anything installed via Homebrew, or scripts/C++/Python/Rust I write and run/compile myself, just runs.

I also don't see any timing difference between my apps on Linux and macOS.

I use iTerm2 with Full Disk Access, and it's specified as a dev tool in Privacy.

What am I missing that's a big problem here?
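
One way to check directly, along the lines of the timing snippet at the top of the thread (the path is illustrative): create a brand-new script at a never-before-seen path and run it twice.

```shell
#!/bin/sh
# Self-check sketch: a freshly created, never-seen-before script is the
# case the post claims triggers an online assessment on first run.
f="/tmp/check-$$.sh"                 # fresh path so any cache is cold
printf '#!/bin/sh\necho hi\n' > "$f"
chmod +x "$f"
out1=$("$f")                         # first execution (possibly slow)
out2=$("$f")                         # second execution (cached)
echo "run1: $out1"
echo "run2: $out2"
rm -f "$f"
```

Wrapping each invocation in `time` (as varenc's snippet above does) makes any first-run delay visible; on an unaffected system both runs finish in a few milliseconds.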

Yetanfou said 13 days ago:

Maybe you're failing to foresee the future step in Apple's strategy which will make it harder, if not impossible, to run something like Homebrew? As far as I know there is no such thing on (non-jailbroken) iOS. Apple seems to be steering macOS in that direction: a curated platform instead of a general-purpose computing device.

ecnahc515 said 11 days ago:

You realize Apple employs engineers, right? The same engineers who use Homebrew for their own jobs. If they go down that route, it's likely they'll need to support something like Homebrew or similar.

Honestly, it wouldn't surprise me if it just meant that distributing a package via Homebrew requires signing it, much like any other package manager. Yes, you can get something similar with checksums, but checksums don't give you any way to authenticate the distributor.

Is it friction? Hell yeah. A pain? Yes. Is it purely bad? No. Does it have positives? Some. It's not black and white.

api said 13 days ago:

If they do that, I am gone. Parent mentioned that they feared that though 10 years ago and it never really happened.

Apple seems to be trying to walk a line with MacOS and keep all of its user bases happy, but it's a hard line to walk.

john_alan said 13 days ago:

Agree with you completely.

john_alan said 13 days ago:

I would move to Arch or Debian.

That said, how can they lock it down? You need macOS open to develop apps for their other devices.

They can’t get rid of homebrew et al, as they’d lose their iOS developers! Don’t you agree?

The fact they explicitly have a “Dev tool” category you can use here says a lot about their approach being open for power users.

pjmlp said 13 days ago:

By writing system-level macOS software, although I think you mean old-style POSIX UNIX stuff.

Here is the thing: already with NeXTSTEP, UNIX support was never the main attraction. NeXTSTEP was used for its Objective-C tooling and frameworks, like Renderman and Improv.

The UNIX stuff was just a way to get a quick ramp-up for their OS development and, just like Microsoft with Windows NT 3.1, to have a tick in the box when selling to the government.

Their famous commercial against Sun hardly touches on UNIX-style development.

https://www.youtube.com/watch?v=UGhfB-NICzg

You aren't going to see a CLI on that NeXTSTEP screen.

Just like the SDK is all about Objective-C-related stuff; even the device drivers were written in Objective-C.

https://www.nextop.de/NeXTstep_3.3_Developer_Documentation/

The only fools here are those who keep giving their money to corporations instead of supporting Linux OEMs, as Microsoft cleverly discovered.

In fact, had A/UX not been discontinued, or had Microsoft seriously supported their POSIX personality, Linux would never have taken off, as the same crowd would be happily using those systems.

sooheon said 13 days ago:

I feel everything you say, and still don't see a better alternative. They're just too good at the hardware and integration.

warrenm said 13 days ago:

Methinks you don't grok how Apple uses the term "Pro"

saagarjha said 13 days ago:

It comes in Space Gray?

warrenm said 12 days ago:

Sigh

No - it's for people who want to Get Stuff Done™ and not worry about all the crap under the hood.

ibeckermayer said 13 days ago:

Why can’t they have their walled garden App Store and also allow me to install other app stores?

It’s an authoritarian usurpation of the spirit of property rights. I should be able to decide for myself what software to run on my hardware, Apple HQ’s opinion should be irrelevant.

jerryzh said 13 days ago:

Why would any developer even want to release their app in a walled garden when they can do whatever they want by releasing elsewhere?

pinopinopino said 13 days ago:

Analogous question in the Linux world: why would anyone get something into the Debian package repository when they could just release their package on their website? Because it gets added support, a bigger reach, and a safer and easier installation for users.

vbezhenar said 13 days ago:

There are special people: maintainers. They collect software from around the world and package it for Debian. They are often different from the original developers; the original developers might not even know that their software was repackaged. It's possible because of free software licenses. Apple couldn't do that even if they wanted to: proprietary software typically does not allow redistribution.

pinopinopino said 13 days ago:

Good point, it wouldn't work that way with proprietary software.

pjmlp said 13 days ago:

Usually on the walled garden they get paid.

colejohnson66 said 13 days ago:

On macOS, they do. On a phone, if you want to side load, there’s the option of Android.

43920 said 14 days ago:

Wouldn't a sandboxed Zoom downloaded directly from them be equally secure?

zrm said 13 days ago:

> Wouldn't a sandboxed Zoom downloaded directly from them be equally secure?

More relevantly, wouldn't a sandboxed Zoom downloaded from Apple's store be equally secure even if you could install different apps from developers you trust more outside of the store?

Retric said 13 days ago:

Apple’s rejected a huge number of App updates for security reasons. It’s not a huge benefit, but it does exist.

cliffsteele said 13 days ago:

And they also allowed a jailbreak app into the iOS App Store. Yes, it only happened once (that I know of), but it still shows you can't blindly trust their review process.

colejohnson66 said 13 days ago:

So out of the millions of apps on the App Store, they slipped up once? Sounds like a really good success rate.

saagarjha said 13 days ago:

That's just the one jailbreak that ended up in the news. There have been many other bad things that were pulled.

cmdshiftf4 said 13 days ago:

>been many other of bad things that have been pulled

A jailbreak app making it to the App Store being bad, and "Apple's walled gardens are bad", are fundamentally incompatible.

saagarjha said 13 days ago:

Apple can be bad at doing what they claim to be doing and also be doing the wrong things. The nice way this works is that Apple curates a bunch of software they think is safe, and I can run whatever I want on my device. The worst of both worlds is that I can't run what I want, but sometimes malicious things get through Apple's checks.

jasonlotito said 13 days ago:

Jailbreak apps are bad for Apple. Walled gardens are bad for users. It's not complicated.

neotek said 13 days ago:

I, a user, am extremely appreciative of Apple's walled garden. I've never once had to worry that the app I'm downloading is crammed full of malware because I trust that Apple's processes are robust and will work well in 99.999% of all circumstances.

davrosthedalek said 13 days ago:

A walled garden is not the same as a curated app store. You could have the same benefit if Apple allowed non-App-Store apps to be installed after flipping a switch, tethering with a Mac, or some other voodoo.

neotek said 13 days ago:

Apple does give you the ability to install non-app-store apps (some without tethering), e.g. sideloading or enterprise certificates, although I agree it's not as easy as flipping a switch.

They should also provide a way to downgrade iOS via Xcode for those with a dev account, but that's another story.

friendlybus said 13 days ago:

People who are precious about security never obtain apps that aren't generally approved and vetted by professionals anyway. Forcing this decision onto everybody is just going to push the people who want a free and open platform into places you don't want them. The benefits of openness don't go away just because Apple said so.

LaGrange said 13 days ago:

We get Zoom now; we used to install Java (remember when it was bundled with crapware in the hope you'd forget to uncheck a checkbox?). Companies routinely strong-armed users into getting malware. And I doubt popular game mods are all that rigorously reviewed by security experts, yet they're quite popular with tech people.

App Store policies are a poor replacement for collective action, of course, but let's not pretend we can become immune to hostile software by sheer force of will.

neotek said 13 days ago:

I care about security, but that doesn't preclude me from jailbreaking my iphone and running dozens of tweaks that haven't been "vetted by professionals", along with sideloaded apps that haven't been through Apple's vetting process either.

My MacBook runs homebrew which currently lists 84 packages installed plus their dependencies, very few of which will have been professionally vetted, and of the 127 apps in my /Applications folder only a third of them came from the Mac App Store, and I would estimate that a quarter of the others aren't even signed with a paid developer certificate.

I want the apps that I get from Apple directly to be safe. I want to know that when I put my faith in the App Store that I'm not lulling myself into a false sense of security. I want my parents and girlfriend, who are not technical people, to have that same sense of security without them having to learn entire programming languages to vet source code themselves.

The benefits of closed systems don't go away just because you say so.

throwaway851 said 13 days ago:

Yes, but would a typical user know or care if the app they downloaded from a web site was sandboxed and would otherwise have been approved by the App Store if it was submitted there? And if not, how could someone like John Gruber make that claim of safety on anything other than iPhone and iPad? Taking the Zoom example on a parent thread above, look at what happens when you’re installing a Zoom client on the Mac without the strict enforcements of the iOS App Store: https://news.ycombinator.com/item?id=22736608

ken said 13 days ago:

This just doesn't seem like a terribly difficult problem. Web browsers have figured it out. Any webpage that isn't served over SSL says "Not Secure" right at the top.

I can think of a dozen ways which the OS could prominently display "Not Secure" for non-sandboxed applications, in a way that wouldn't preclude or hinder users from using such applications if they really wanted to.

ithkuil said 13 days ago:

I wonder what's a decent way to do this with a CLI app

beowulfey said 13 days ago:

I don’t really understand this argument. Apple has long been heralded for its safety and security. It’s why, in three decades of owning Macs, we’ve never installed antivirus software.

What is the point of all this security these days? What are they protecting us from?

markdown said 13 days ago:

Who is this Gruber person you quote and why is he relevant here?

AlchemistCamp said 13 days ago:

He's the person who made the markdown format, which you've used as your username.

Other than that, he's mostly known for writing and talking about Apple.

markdown said 13 days ago:

> He's the person who made the markdown format, which you've used as your username.

That's news to me. My username is my name plus down (I use up for work-related accounts, and down for leisure).

> Other than that, he's mostly known for writing and talking about Apple.

Ahh, ok thanks.

said 13 days ago:
[deleted]
gameswithgo said 13 days ago:

if gruber wants to dictate what i run on my computer maybe he can pay for my computer instead of me.

monadic2 said 14 days ago:

Honestly, I'm trying to think of a reason you would WANT to disable OCSP. I'm having enough trouble thinking of more than two developers I know who can articulate how it works well enough to evaluate this. Not that it's complicated—it's just mostly invisible.

Even when OCSP is a problem, generally you're more worried about issuing a new certificate than an immediate workaround. What are you going to do, ask all your customers to go into keychain access to work around your problem?

This slowdown appears to be because Apple is making HTTPS connections synchronously (probably unnecessarily), and you'd only be potentially harming yourself by disabling OCSP.

Though I am often frustrated that FLOSS desktops and Windows don't allow the behavior I want—maybe this is just cultural.

feross said 14 days ago:

How about it's totally ineffective? OCSP is pointless if you "soft fail" when the OCSP server can't be reached. [1]

This is why Chrome disabled OCSP by default all the way back in the 2012–2013 era. Not to mention the performance cost of making all HTTPS connections wait for an OCSP lookup. [2]

[1]: https://www.imperialviolet.org/2012/02/05/crlsets.html

[2]: https://arstechnica.com/information-technology/2012/02/googl...

johnp_ said 13 days ago:

That's why there's OCSP stapling and OCSP must staple. Ever seen an nginx server fail HTTPS connection exactly once after rotating the certificate? That's nginx lazily fetching the OCSP response from upstream for stapling purposes.

saagarjha said 13 days ago:

Notarization has a similar "stapling" workflow as well.

cliffsteele said 13 days ago:

Well, security starts from the user. If you're not mindful of what websites you visit, or what files/apps you download and run, there's no OCSP or anything else there to save you.

OCSP enabled or not, you're still one website click away from being pwned to oblivion, giving full control to the hacker – which, of course, is inevitable to an extent, since bugs always find their way into software.

So why not make it easy to disable?

monadic2 said 13 days ago:

Well, are you going to manually look up certificate revocations yourself? This necessarily requires a network lookup—you can't just glance at the certificate. What's the benefit of disabling this functionality that actively alerts you to revocations?

> Well, security starts from the user. If you're not mindful of what websites you visit, or what files/apps you download and run, there's no OCSP or anything else there to save you.

Sure, but we're discussing good-faith security here. Presumably if people complain about a missing feature they can envision using it. The scenario here is not visiting a shady website and doing something stupid; the scenario here is something like a man-in-the-middle attack using a revoked certificate, which would by definition be difficult for the end user to detect.

> So why not make it easy to disable?

Because then people would disable it for no discernable good effect.

I mean let me be clear, if you're a security researcher you can just modify your own HTTP stack, run a VM, control the hardware, whatever. This isn't a blocker to investigating HTTPS reactions sans OCSP—this is about denying secure connections when they've publicly revoked the cert used to sign the connection. The only reason this is even considered a discrete feature is that most people have never written an OCSP request in order to then trust an HTTPS server—you're just opening yourself up to be misled without even realizing this (and this goes for most of my very network-stack-aware coworkers).

If you're in a browser, you want the browser to be using best practice security, which necessarily includes OCSP. If you know what you're doing this is trivial to bypass.

D-Coder said 14 days ago:

Feature-removal has been the most aggravating part of my Mac life for the past several years. Admittedly I tend to use unusual features, but it's just another PITA when they go away.

ngcc_hk said 13 days ago:

Not sure they have removed anything; rather, they've added something.

torstenvl said 13 days ago:

What happens if you edit /private/etc/hosts to point ocsp.apple.com to 0.0.0.0 and flush the DNS cache?

Myrmornis said 13 days ago:

This seems like an interesting line of inquiry.

AIUI doing what you said would permit the network request to proceed, and it would fail because nothing is listening on port 80. [1] We already know that the phone-home bails out when there's no network connection, so perhaps that code also bails out on connection failure?

Alternatively, is there some way to make DNS lookup itself fail for ocsp.apple.com?

Last resort, if we know how to fake the response, running a dummy server listening on localhost would be faster than allowing the request to go over the internet.

[1] Empirically, `curl http://0.0.0.0` yields a connection failure. I think I know that 0.0.0.0 is used in a listening context to mean "listen on all interfaces" but tbh I don't really know what it means in a sending context. Maybe someone can educate me?

IncRnd said 13 days ago:

Sending to 0.0.0.0 will fail immediately. This differs from sending to 127.0.0.0/8, which may connect to a server on the local machine.

Myrmornis said 13 days ago:

> Sending to 0.0.0.0 will fail immediately.

Right, and as far as we know that failure might be caught in the same way as "your computer doesn't have any network connection at all" is caught. Or would those likely generate the same error anyway? Either way, there's a chance that it would result in exec gracefully and quickly skipping the blocking phone-home, isn't there?

usmannk said 13 days ago:

0.0.0.0 is non-routable and generally only valid as a source, not a destination.

saagarjha said 13 days ago:

I think it is fairly likely that your system would not work at all.

saagarjha said 13 days ago:

I believe it's just Base64 encoded DER information, based on the code that seems to be similar: https://github.com/apple-open-source-mirror/Security/blob/70...

caf said 13 days ago:

Yes, that base64 decodes to:

  OCSP Request Data:
    Version: 1 (0x0)
    Requestor List:
        Certificate ID:
          Hash Algorithm: sha1
          Issuer Name Hash: 3381D1EFDB68B085214D2EEFAF8C4A69643C2A6C
          Issuer Key Hash: 5717EDA2CFDC7C98A110E0FCBE872D2CF2E31754
          Serial Number: 7D86ED91E10A66C2
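
For anyone who wants to reproduce that decode, here's a sketch. The only wrinkle (and the source of the "not quite base64" confusion above) is that the '/' characters of standard base64 appear as '_' in the URL path:

```shell
# The hash from the URL path, with '/' appearing as '_'.
s='ME4wTKADAgEAMEUwQzBBMAkGBSsOAwIaBQAEFDOB0e_baLCFIU0u76+MSmlkPCpsBBRXF+2iz9x8mKEQ4Py+hy0s8uMXVAIIfYbtkeEKZsI='
# Map back to the standard alphabet, decode, and dump the DER bytes as hex.
# (Older macOS base64 wants -D instead of -d.)
printf '%s' "$s" | tr '_' '/' | base64 -d | od -An -v -tx1 | tr -d ' \n'
echo
```

The Issuer Name Hash and Serial Number shown above appear verbatim in the hex dump; `openssl ocsp -reqin` would parse the same bytes into the structured form.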
usmannk said 13 days ago:

I can't edit anymore but it seems like the OCSP link could potentially be a red herring just checking the cert for the next request to https://api.apple-cloudkit.com/. It's worth looking further!

Darkstryder said 13 days ago:

I'm surprised nobody mentioned that Windows Defender does something very similar (checking for never-seen-before binaries at runtime, uploading them to Microsoft servers, then running them there): https://news.ycombinator.com/item?id=21180019

pinopinopino said 13 days ago:

God, this shit makes me laugh. Why are they doing this.

But from Edit2: Your hash is some sort of base64

     let str = 
"ME4wTKADAgEAMEUwQzBBMAkGBSsOAwIaBQAEFDOB0e_baLCFIU0u76+MSmlkPCpsBBRXF+2iz9x8mKEQ4Py+hy0s8uMXVAIIfYbtkeEKZsI="

Then we see seemingly random gaps in the alphabet used; not so weird, because not every character will be used in every string:

     Prelude Data.List> map head $  group $ sort $ str
     "+0246789=ABCDEFGIKLMOPQRSTUVXYZ_abefghiklmpstuwxyz"
If we fill these up then:

      Prelude Data.List> let xs = "+0123456789=ABCDEFGHIJKLMNOPQRSTUVWXYZ_abcdefghijklmnopqrstuvwxyz"
      Prelude Data.List> length xs
      65
So base64 with some non-standard symbols. I don't know what standard base64 is supposed to look like, to be honest, so perhaps it is standard base64. The = is definitely padding.
saagarjha said 13 days ago:

It decodes cleanly as base64.

ignoranceprior said 13 days ago:

Does this mean you can't run a custom shell script without an internet connection?

usmannk said 13 days ago:

If the connection fails it goes ahead and grants permission.

markandrewj said 14 days ago:

This isn't specific to the article, but another interesting place to look at system activity on macOS is the Console.

https://support.apple.com/en-ca/guide/console/cnslbf30b61a/m...

jwatte said 15 hours ago:

Let's assume that sending network packets to verify the trustworthiness of commands is a good idea. (It may not be, but that's a different discussion.) If you have a modern OS with sufficient virtualization, containerization, and indirection, you could optimistically let the commands run, and not commit their side effects until you get back a result. Create little write-logged mini-branches of your file system, and only actually pause when someone else wants to inspect your side effects. By then, an asynchronous check should have gotten back to you.
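
A toy user-space sketch of the idea (everything here is hypothetical illustration; a real OS would use copy-on-write snapshots, e.g. APFS clones, rather than a full copy, and `trust_check` stands in for the asynchronous notarization reply):

```shell
# Placeholder for the async trust check (e.g. the notarization reply).
trust_check() { true; }

# Run a command against a scratch "branch" of its working directory and
# merge the writes back only if the trust check comes back clean.
run_optimistically() {
  workdir=$1; shift
  branch=$(mktemp -d)
  cp -R "$workdir"/. "$branch"/    # stand-in for a copy-on-write branch
  ( cd "$branch" && "$@" )         # execute optimistically; side effects staged
  if trust_check; then
    cp -R "$branch"/. "$workdir"/  # commit the staged side effects
  fi
  rm -rf "$branch"
}
```

Usage: `run_optimistically /tmp/work sh -c 'echo hi > out.txt'` only materializes out.txt in /tmp/work if the check succeeds; the command itself never blocks on it.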

said 14 days ago:
[deleted]
moyix said 13 days ago:

Were you able to MITM the api.apple-cloudkit.com connection? I tried with MITMProxy but ran into a client error, which made me think they were doing cert pinning.

If you did get it to work could you paste the logs somewhere?

usmannk said 13 days ago:

Yes but it looks like there is no actual session, at least for shell scripts that don't have an app bundle ID. There is just an HTTP CONNECT, TLS negotiation, then nothing.

davidvartan said 14 days ago:

> a degraded user experience, as the first time a user runs a new executable, Apple delays execution while waiting for a reply from their server.

The way to avoid this behavior is to staple the notarization ticket to your bundle (or dmg/pkg), i.e. "/usr/bin/stapler staple <path>." Otherwise, Gatekeeper will fetch the ticket and staple it for the user on the first run.

(I'm the author of xcnotary [1], a tool to make notarization way less painful, including uploading to Apple/polling for completion/stapling/troubleshooting various code signing issues.)

[1] https://github.com/akeru-inc/xcnotary

xenadu02 said 14 days ago:

Xcode (the UI) is able to bypass GateKeeper checks for things it builds.

The "Developer Tool" pane in System Prefs, Security, Privacy is the same power. Drag anything into that list you'd like to grant the same privilege (such as xcodebuild). This is inherited by child processes as well.

The point of this is to avoid malware packing bits of Xcode with itself and silently compiling itself on the target machine, thus bypassing system security policy.

indemnity said 14 days ago:

Reminds me of the AV exception folder our corporate IT created for developers. Soon absolutely everything developers needed or created was installed into that folder. Applications, IDEs, you name it.

kunday said 12 days ago:

Guilty as charged. I try to keep it to an absolute minimum, like the Docker data-dir and my IDE. With that I can at least use my machine.

Otherwise this macOS notarisation, along with possible CPU heating issues from left-side Thunderbolt usage and corporate AV scanning, makes my machine next to useless.

LeoPanthera said 14 days ago:

Putting Terminal (and your favorite text editor) in this category and in "Full Disk Access" will change your life.

MrBuddyCasino said 13 days ago:

How does "Full Disk Access" help?

lloeki said 13 days ago:

You can browse Time Machine backup directory trees from the CLI again.

sneak said 14 days ago:

Yes, falling victim to ransomware is definitely lifechanging if you don’t have good backups.

LeoPanthera said 14 days ago:

That is a non-sequitur.

mperham said 13 days ago:

It's not; they are stating that if you bypass these security checks, you open the machine up to ransomware.

justinmeiners said 13 days ago:

Better not turn it on at all, to be extra safe.

grishka said 14 days ago:

So since these permissions apply to process trees, what happens if you put launchd in there?

aasasd said 14 days ago:

The computer will probably hang while it tries to solve the chicken-egg problem.

Isn't launchd Mac's ‘init’? I.e. run before anything else.

grishka said 14 days ago:

Yes, and that's the point — everything you run will theoretically inherit the permission from it.

acecilia said 13 days ago:

Can you advise on how to make the "Developer Tool" panel in "System Prefs, Security, Privacy" appear if it is not present? Cant find a way: https://stackoverflow.com/questions/60176405/macos-catalina-...

saagarjha said 12 days ago:
acecilia said 12 days ago:

Thanks for the link. Tried it, but that did not work

wila said 13 days ago:

GateKeeper only triggers the check for things downloaded from the internet. IOW, it checks if your binary has a quarantine flag attached via an extended attribute.

xenadu02 said 13 days ago:

That is not correct starting with Catalina.

make3 said 13 days ago:

How do I get a "Developer Tool" pane in System Prefs? Do I have to install Xcode? I would really rather not.

closeparen said 14 days ago:

This is life-changing. Thank you!

pindab0ter said 14 days ago:

What did you notice?

scottlamb said 14 days ago:

> The way to avoid this behavior is to staple the notarization ticket to your bundle (or dmg/pkg)

Maybe in some cases, but the article says "even if you write a one line shell script and run it in a terminal, you will get a delay!"

Shell scripts don't come in bundles. I don't think this kind of stapling is possible for them? I don't think it'd be reasonable to expect users to do this anyway.

davidvartan said 14 days ago:

The Gatekeeper behavior is specific to running things from Finder (not Terminal), and only if you downloaded it via a browser that sets the com.apple.quarantine xattr.

Two posts from Apple dev support (Cmd+F "eskimo") describe this in more detail.

https://forums.developer.apple.com/thread/127709

https://forums.developer.apple.com/thread/127694

nemosaltat said 14 days ago:

I recently learned that `xattr -cr path/to/my.app` solves the “this App is damaged would you like to move it to the trash” you get when you copy an app from one Mac to another.

rhizome said 14 days ago:

That might be the Windows-iest feature of OSX I've ever heard of.

cosmojg said 14 days ago:

It seems macOS is going downhill fast these days.

withinboredom said 14 days ago:

No, it’s just that they’re becoming more popular. When you become a popular desktop OS, governments and militaries want to start using it which comes with some strange requirements. It also means that you can’t rely on “obscurity” to provide any sort of security, where before you could overlook some things.

catalogia said 14 days ago:

Can you cite any sources for your claim that these things are being implemented to satisfy government/military requirements?

o-__-o said 14 days ago:

DISA?

I don’t know why grand op is downvoted. DoD requirements literally require a timeout setting for screensavers to begin locking. This has caught systems which have a race condition where you can move your mouse quickly and gain desktop access before it locks.

The long-term effects come from the changes to the development security model required to remain productive and profitable (it took MSFT a few OOB hotfixes and service packs to fix that example above; look at when GNOME, KDE, xscreensaver, etc. introduced that feature).

saagarjha said 13 days ago:

> This has caught systems which have a race condition where you can move your mouse quickly and gain desktop access before it locks.

I fail to see how this is a race condition rather than how a screensaver is supposed to work?

o-__-o said 13 days ago:

Because it's not; that's why I pointed to the xscreensaver feature implementation. Lock time is separate from screensaver activation time, which is separate from energy-saving activation time.

What defines when a locking screen saver is “locked”? 10m? Or 10m1s? You are making assumptions and that is what DISA spells out. Which forces the OS design to change in subtle ways. Like xattrs on files as great grand op was alluding to.

Does that provide clarity into how development security models evolve over the lifetime of an application?

noisem4ker said 14 days ago:

What would that mean?

bobbylarrybobby said 14 days ago:

It would appear to mean it's a hacky, over-technical solution to a problem that shouldn't exist in the first place, as copying things from one computer to another should just work™. This is one place where macOS used to shine but now seems to be falling behind.

JadeNB said 14 days ago:

> The Gatekeeper behavior is specific to running things from Finder (not Terminal), and only if you downloaded it via a browser that sets the com.apple.quarantine xattr.

The article says the described problem isn't limited in this way:

> This is not just for files downloaded from the internet, nor is it only when you launch them via Finder, this is everything. So even if you write a one line shell script and run it in a terminal, you will get a delay!

staticfloat said 14 days ago:

If you read the comments of the article and do your own testing, you will find that reality appears to be more complicated than the article suggests. Users have shown using both timing and wireshark that the shell scripts do not appear to be triggering notarization checks.

said 14 days ago:
[deleted]
reuben_scratton said 14 days ago:

Quinn The Eskimo at Apple's forums is a 10x support engineer; his posts have helped me fix dozens of problems.

Someone said 14 days ago:

Unless somebody took over his name he’s been at Apple for almost 25 years, and was already being interviewed as such 20 years ago (http://preserve.mactech.com/articles/mactech/Vol.16/16.06/Ju...)

His site (http://www.quinn.echidna.id.au/Quinn/WWW/) supports its claim “I'm not a great believer in web” :-)

saagarjha said 13 days ago:

It's interesting to see a time when Apple seemed to allow employees to have side projects…

saagarjha said 14 days ago:

He needs to be, because Apple Developer Technical Support is chronically understaffed.

xenadu02 said 13 days ago:

This is the way things worked prior to Catalina but is no longer the case.

oefrha said 14 days ago:

I mean, when I’m developing in a compiled language with the workflow edit code -> compile -> run (with forced stapling), changing it to edit code -> compile -> staple -> run doesn’t make it any less slow...

oefrha said 14 days ago:

An update: flat out denying network access to syspolicyd using Little Snitch could cut down on the delay. (Yes, syspolicyd does send a network request to apple-cloudkit.com for every single new executable. Denying its access to apple-cloudkit.com only isn't sufficient either since it falls back to IP address directly.) Note that this might not be a great idea, and it still has nonzero cost — a network request has to be made and denied by Little Snitch.

Here's my benchmarking script:

  #!/bin/zsh
  tmpfile=$(mktemp)
  cat >$tmpfile <<EOF
  #!/bin/sh
  echo $RANDOM  # Use a different script each time in case it makes a difference.
  EOF
  chmod +x $tmpfile
  setopt xtrace
  time ( $tmpfile )
  time ( $tmpfile )
  unsetopt xtrace
  rm -f $tmpfile
If your local terminal emulator is immune thanks to "Developer Tools" access (interestingly, toggling it off doesn't bring back the delay for some reason), you should be able to reproduce the delay over ssh.
davidvartan said 14 days ago:

I can repro this locally as well. It's interesting that this seems inconsistent with Apple's docs on when Gatekeeper should fire, as running stuff locally without distributing/downloading is somewhat out of scope for notarization.

Reached out about this to Apple dev support, hope to get more insight.

abathur said 12 days ago:

> interestingly, toggling it off doesn't bring back the delay for some reason

Noticed the same; it should come back if you disable it and reboot.

davidvartan said 14 days ago:

Notarization/stapling/etc. is for distribution only, not generally part of your dev workflow.

oefrha said 14 days ago:

But TFA and my personal experience do point to a noticeable delay after each recompile in dev workflows, and TFA claims this is due to notarization checks... So I guess I’m confused and you’re talking about something else?

rgrs said 14 days ago:

How does mac identify a dev workflow and normal workflow?

jmercouris said 14 days ago:

When you use XCode you have different compilation options.

ihiulll said 14 days ago:

I'm confused. Does the Mac send the executable to Apple servers, or just the hash?

saagarjha said 13 days ago:

Just the hash.

dahfizz said 13 days ago:

The way to avoid this behavior is to not buy a machine from a company that actively hates its users.

jaimehrubiks said 14 days ago:

In our company many of us have similar issues. I have always loved OSX, but this time it is driving me crazy. I thought the issue was some sort of company antivirus/firewall, or it could even be a combination of that and this issue (maybe my VPN plus the path to the company firewall is what magnifies the issue in this post). The thing is that some commands take 1 second, while others take 2 minutes or even more. Actually, some commands slow down the computer until they are finished (more likely, until they just decide to start).

For example, I can run "terraform apply" and it could take up to 5 minutes to start, leaving my computer almost unusable until it runs. The weird thing is that this only happens sometimes. In some cases, I restart the laptop and it starts working a little bit faster, but the issue comes back after some time.

For a few months now I've been running every command from a VM in a remote location, because I'm tired of waiting for my commands to start.

I have a macbook air from 2013 which never had this issue.

Any easy fix that I could test? Disconnecting from the internet is not an option. Disabling SIP could be tried, but I think I already did that and it didn't seem to help, plus it's not a good idea for a company laptop.

Don't we have some sort of hosts file or firewall that we can use to block or fake the connectivity to apple servers?
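
For reference, the hosts-file approach would look something like this (an untested sketch; the endpoint names are the ones observed earlier in this thread, blocking them may have other side effects, and there are reports that syspolicyd can fall back to raw IP addresses, so a firewall rule may be needed too):

```shell
# Stage the edit in a scratch copy first so it can be reviewed;
# the real file on macOS is /private/etc/hosts (needs sudo).
hosts_scratch=$(mktemp)
cat /etc/hosts > "$hosts_scratch" 2>/dev/null || true
printf '0.0.0.0 ocsp.apple.com\n0.0.0.0 api.apple-cloudkit.com\n' >> "$hosts_scratch"
tail -n 2 "$hosts_scratch"
# To apply for real, copy it into place and flush the DNS cache:
#   sudo cp "$hosts_scratch" /private/etc/hosts
#   sudo dscacheutil -flushcache && sudo killall -HUP mDNSResponder
```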

derefr said 14 days ago:

IIRC the big thing that changed with 10.15 for CLI applications is that BSD-userland processes (i.e. ones that don't go through all the macOS Frameworks, but just call libc syscall wrappers like fopen(2)) now also deal with sandboxing, since the BSD syscall ABI is now reimplemented in terms of macOS security capabilities.

Certain BSD-syscall-ABI operations like fopen(2) and readdir(2) are now not-so-fast by default, because the OS has to do a synchronous check of the individual process binary's capabilities before letting the syscall through. But POSIX utilities were written to assume that these operations were fast-ish, and therefore they do tons of them, rather than doing any sort of batching.

That means that any CLI process that "walks" the filesystem is going to generate huge amounts of security-subsystem request traffic; which seemingly bottlenecks the security subsystem (OS-wide!); and so slows down the caller process and any other concurrent processes/threads that need capabilities-grants of their own.

To find a fix, it's important to understand the problem in fine detail. So: the CLI process has a set of process-local capabilities (kernel tokens/handles); and whenever it tries to do something, it first tries to use these. If it turns out none of those existing capabilities let it perform the operation, then it has to request the kernel look at it, build a firewall-like "capabilities-rules program" from the collected information, and run it, to determine whether it should grant the process that capability. (This means that anything that already has capabilities granted from its code-signed capabilities manifest doesn't need to sit around waiting for this capabilities-ruleset program to be built and run. Unless the app's capabilities manifest didn't grant the specific capability it's trying to use.)

Unlike macOS app-bundles, regular (i.e. freshly-compiled) BSD-userland executable binaries don't have a capabilities manifest of their own, so they don't start with any process-local capabilities. (You can embed one into them, but the process has to be "capabilities-aware" to actually make use of it, so e.g. GNU coreutils from Homebrew isn't gonna be helped by this. Oh, and it won't kick in if the program isn't also code-signed, IIRC.)

But all processes inherit their capabilities from their runtime ancestors, so there's a simple fix, for the case of running CLI software interactively: grant your terminal emulator the capabilities you need through Preferences. In this case, the "Full Disk Access" capability. Then, since all your all CLI processes have your terminal emulator as a runtime ancestor-process, all your CLI processes will inherit that capability, and thus not need to spend time requesting it from the security subsystem.

Note that this doesn't apply to BSD-userland executable binaries which run as LaunchDaemons, since those aren't being spawned by your terminal emulator. Those either need to learn to use capabilities for real; or, at least, they need to get exec(2)ed by a shim binary that knows how.

-----

tl;dr: I had this problem (slowness in numerous CLI apps, most obvious as `brew upgrade` suddenly taking forever) after upgrading to 10.15 as well. Granting "Full Disk Access" to iTerm fixed it for me.
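
If you want to eyeball the effect yourself, a rough sketch: time a stat-heavy directory walk once normally and once after granting your terminal "Full Disk Access" (purely illustrative; the absolute numbers will vary by machine and macOS version):

```shell
# Create a directory of small files and time a stat(2)-heavy listing.
# Run before and after granting the terminal "Full Disk Access" and compare.
dir=$(mktemp -d)
i=1
while [ $i -le 500 ]; do : > "$dir/f$i"; i=$((i+1)); done
time ls -lR "$dir" > /dev/null
rm -rf "$dir"
```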

saagarjha said 14 days ago:

> IIRC the big thing that changed with 10.15 for CLI applications is that BSD-userland processes (i.e. ones that don't go through all the macOS Frameworks, but just call libc syscall wrappers like fopen(2)) now also deal with sandboxing, since the BSD syscall ABI is now reimplemented in terms of macOS security capabilities.

Is this actually new in macOS 10.15? I seem to recall this being a thing ever since sandboxing was a thing, even all the way back to when it was called Seatbelt.

> That means that any CLI process that "walks" the filesystem is going to generate huge amounts of sandboxd traffic, which bottlenecks sandboxd and so slows down the caller process.

Is this not implemented in the kernel as an extension? I thought the checks went through MAC framework hooks. Doesn't sandboxd just log access violations when told to do so by the Sandbox kernel extension?

> Unlike macOS app-bundles, regular BSD-userland executable binaries don't have a capabilities manifest of their own, so they don't start with any process-local capabilities (with some interesting exceptions, that I think involve the binary being embedded in the directory-structure of a system framework, where the binary inherits its capabilities from the enclosing framework.)

I am fairly sure you can just embed a profile in a section of your app's binary and call the sandboxing Mach call with that…

derefr said 14 days ago:

> I seem to recall this being a thing ever since sandboxing was a thing, even all the way back to when it was called Seatbelt.

Maybe you're right; I'm not sure when they actually put the Seatbelt/TrustedBSD interpreter inline in the BSD syscall code-path. What I do know is that, until 10.15, Apple tried to ensure that the BSD-userland libc-syscall codepath retained mostly the same behavioral guarantees as it did before they updated it, in terms of worst-case time-complexities of syscalls. Not sure whether that was using a short-circuit path that went around Seatbelt or used a "mini-Seatbelt" fast path; or whether it was by hard-coding a pre-compiled MAC ruleset for libc calls that only relied upon the filesystem flag-bits, and so never had to do anything blocking during evaluation.

Certainly, even as of 10.12, BSD-userland processes weren't immune to being exec(2)-blocked by the quarantine xattr. But that may have been a partial implementation (e.g. exec(2) going through the MAC system while other syscalls don't.) It's kind of opaque from the outside. It was at least "more than nothing", though I'm not sure if it was "everything."

One thing that is clear is that, until 10.15, BSD processes with no capabilities manifest still had pretty much exactly the same default set of privileges that they had before capabilities, which means "almost everything" (and therefore they almost never needed to actually hit up the security system for more grants). I guess all Apple really needed to do in 10.15 to "break BSD" was to introduce some more capabilities and then not put them in the default/implicit manifest.

I suppose what actually happened in 10.15 can be determined easily-enough from the OSS code that's been released. :)

> Is this not implemented in the kernel as an extension? // I am fairly sure you can just embed a profile in a section of your app's binary and call the sandboxing Mach call with that…

Yeah, sorry, you're right; updated my assertions above. I'm not a kernel dev; I've just picked up my understanding of this stuff from running head-first into it while trying to do other things!

danudey said 14 days ago:

It's new behavior that running 'find ~' triggers a macOS (GUI) permissions dialog when `find` tries to access your photos directory, contacts file, etc.

saagarjha said 14 days ago:

That is new, but I believe the groundwork for that was mostly laid in 10.14 and is also mostly in the kernel.

jfkebwjsbx said 14 days ago:

Why would sandboxing be slower?

They are definitely doing something way too slow.

derefr said 14 days ago:

Apple replaced the very simple (i.e. function fits in a cache line; inputs fit in a single dword) BSD user/group/other filesystem privileges system, with a Lisp interpreter (or maybe compiler? not sure) executing some security DSL[1][2].

[1] https://wiki.mozilla.org/Sandbox/OS_X_Rule_Set

[2] https://reverse.put.as/wp-content/uploads/2011/09/Apple-Sand...
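
For the curious, that DSL is still visible in the (long-deprecated) `sandbox-exec -f profile.sb /bin/ls` tool. A minimal profile sketch, based on the rule-set syntax described in the references above (operation names here are illustrative, not a complete working profile):

```scheme
;; Minimal sandbox profile in Apple's Scheme-like SBPL:
;; deny everything, then allow just enough to run a simple binary.
(version 1)
(deny default)
(allow process-exec (literal "/bin/ls"))
(allow file-read* (literal "/bin/ls"))
(allow file-read* (subpath "/usr/lib"))
```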

This capabilities-ruleset interpreter is what Apple uses the term "Gatekeeper" to refer to, mostly. It had already been put in charge of authorizing most Cocoa-land system interactions as of 10.12. But the capabilities-ruleset interpreter wasn't in the code-path for any BSD-land code until 10.15.

A capabilities-ruleset "program" for this interpreter can be very simple (and thus quick to execute), or arbitrarily complex. In terms of how complex a ruleset can get—i.e. what the interpreter's runtime allows it to take into consideration in a single grant evaluation—it knows about all the filesystem bitflags BSD used to, plus Gatekeeper-level grants (e.g. the things you do in Preferences; the "com.apple.quarantine" xattr), plus external system-level capabilities "hotfixes" (i.e. the same sort of "rewrite the deployed code after the fact" fixes that GPU makers deploy to make games run better, but for security instead of performance), plus some stuff (that I don't honestly know too much about) that can require it to contact Apple's servers during the ruleset execution. Much of this stuff can be cached between grant requests, but some of it will inevitably have to hit the disk (or the network!) for a lookup—in the middle of a blocking syscall.

I'm not sure whether it's the implementation (an in-kernel VM doesn't imply slowness; see eBPF) or the particular checks that need to be done, but either way, it adds up to a bit of synchronous slowness per call.

The real killer that makes you notice the problem, though, isn't the per-call overhead, but rather that the whole security subsystem seems to now have an OS-wide concurrency bottleneck in it for some reason. I'm not sure where it is, exactly; the "happy path" for capabilities-grants shouldn't make any Mach IPC calls at all. But it's bottlenecked anyway. (Maybe there's Mach IPC for audit logging?)

The security framework was pretty obviously structured to expect that applications would only send it O(1) capability-grant requests, since the idiomatic thing to do when writing a macOS Cocoa-userland application, if you want to work with a directory's contents, is to get a capability on a whole directory-tree from a folder-picker, and then use that capability to interact with the files.

Under such an approach, the sandbox system would never be asked too many questions at a time, and so you'd never really end up in a situation where the security system is going to be bottlenecked for very long. You'd mostly notice it as increased post-reboot startup latency, not as latency under regular steady-state use.

Under an approach where you've got many concurrent BSD "filesystem walker" processes, each spamming individual open(2)-triggered capability requests into the security system, though, a failure-to-scale becomes very apparent. Individual capabilities-grant requests go from taking 0.1s to resolve, to sometimes over 30s. (It's very much like the kind of process-inbox bottlenecks you see in Erlang, that are solved by using process pools or ETS tables.)

Apple should either have rethought the IPC architecture of sandboxing in 10.15, or made their BSD libc transparently handle "push down" of capabilities to descendant requests; it seems they forgot or deprioritized both.

comex said 13 days ago:

The Scheme interpreter only runs when compiling a sandbox. It's compiled into a simple non-Turing-complete bytecode, and that's what's consulted on every syscall. This has been the case since… 10.5 or something. It's always been on the path for BSD code. And Cocoa operations lower to BSD syscalls anyway. There's no system for them to get a "capability" for a directory tree; on the contrary, file descriptors ought to be able to serve as capabilities, but the Sandbox kext stupidly computes the full path for every file that's accessed before matching it against a bunch of regexes. This too has been the case as long as Sandbox has existed.

There is a bunch of new stuff in 10.15, mostly involving binary execs (and I don't understand all of it), but I'm pretty sure it doesn't match what you're describing.

saagarjha said 14 days ago:

> Lisp interpreter (or maybe compiler? not sure)

I believe it is actually a Scheme dialect, and I would be very surprised if it is not compiled to some internal representation upon load.

> This capabilities-ruleset interpreter is what Apple uses the term "Gatekeeper" to refer to, mostly.

I am fairly sure Gatekeeper is mostly just Quarantine and other bits that prevent the execution of random things you download from the internet.

lioeters said 13 days ago:

In the Apple Sandbox Guide v1.0 [1], it mentions Dionysus Blazakis' paper [2] presented at Blackhat DC 2011.

In the latter, Apple's sandbox rule set (custom profiles) is called SBPL - Sandbox Profile Language - and is described as a "Scheme embedded domain specific language".

It's evaluated by libSandbox, which contains TinyScheme! [3]

From what I could understand, the Scheme interpreter generates a blob suitable for passing to the kernel.
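To give a flavor of SBPL, here is a minimal sketch of a profile, written from the descriptions in the paper (treat the exact operation names as illustrative; they can vary across macOS versions). On macOS it can be applied with the deprecated-but-still-present sandbox-exec tool:

```shell
# Minimal SBPL sketch: deny everything by default, then allow just enough
# to exec /bin/ls and read shared libraries. Operation names follow the
# Blazakis paper and may differ between macOS releases.
cat > /tmp/minimal.sb <<'EOF'
(version 1)
(deny default)
(allow process-exec (literal "/bin/ls"))
(allow file-read* (subpath "/usr/lib"))
EOF
echo "wrote /tmp/minimal.sb"
# On macOS, apply it with:
#   sandbox-exec -f /tmp/minimal.sb /bin/ls /usr/lib
```

libSandbox evaluates this Scheme source and hands the kernel the compiled blob; the profile file itself is just the human-readable input.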

---

[1] https://reverse.put.as/wp-content/uploads/2011/09/Apple-Sand...

[2] https://media.blackhat.com/bh-dc-11/Blazakis/BlackHat_DC_201...

[3] http://tinyscheme.sourceforge.net/home.html

saagarjha said 13 days ago:

That sounds about right. I was doing some work in this area very recently, which found a couple of methods to bypass sandboxing entirely, but somewhat humorously the issues did not require me to have any understanding of how the lower levels of this worked ;)

lioeters said 13 days ago:

Blazakis' paper is a fascinating investigative/exploratory work, delving deep into the sandbox mechanism. I learned more than I wanted to know!

saagarjha said 13 days ago:

Yeah, it's on my reading list :)

jfkebwjsbx said 13 days ago:

> Much of this stuff can be cached between grant requests, but some of it will inevitably have to hit the disk (or the network!) for a lookup—in the middle of a blocking syscall.

Running any kind of I/O during a capability check is a broken design.

There is no reason to hit the disk (it should be preloaded), much less the network (such a design will never work if offline).

dcow said 14 days ago:

A command like `terraform` shouldn't trigger the check because the quarantine system is bypassed altogether when you download and extract an archive. Maybe this is a red herring and your initial gut inkling is correct.

saagarjha said 14 days ago:

Try sampling the process as it starts; I doubt your issue is the one shown here.

acdha said 14 days ago:

> For example, I can run "terraform apply" and it could take up to 5 minutes to start, leaving my computer almost unusable until it runs.

On a clean Catalina install this does not happen. Does “terraform version” have the same delay? If not, check your remote configuration - maybe run with TF_LOG=trace. Terraform Cloud will definitely highlight the inherent performance problems of using a VPN.

jen20 said 14 days ago:

It is worth noting that `terraform version` connects to HashiCorp’s own checkpoint service by default so this may not be the best test.

totetsu said 13 days ago:

    docker run -i -t -v "$(pwd)":/project hashicorp/terraform:light apply /project/thing.tf
Maybe (if your project's terraform version is the latest)?

brendangregg said 13 days ago:

Adding network calls to syscalls like exec() is utterly insane. This road can lead to bricked laptops where you can't run anything to fix it (imagine an unexpected network error that the code doesn't handle properly). And crackers will just use ways to overwrite running instruction text to avoid the exec().

The comments on the article are annoying: it's good that there's a minimal way to reproduce, but please, use some further debugging like tcpdump (it still exists on osx, right?). Last time I summarized osx debugging was https://www.slideshare.net/brendangregg/analyzing-os-x-syste...

I'd also stress test it: generate scripts in a loop that include random numbers and execute them.
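A minimal sketch of that stress test (POSIX sh; the paths and loop count are arbitrary). Each generated script has unique content, so each first run should trigger a fresh check on Catalina:

```shell
# Generate never-before-seen scripts in a loop and execute each one's first run.
# On Catalina, wrap each invocation with `time` to see the per-script delay;
# in another window, `sudo tcpdump -n host ocsp.apple.com` should show the
# matching network traffic.
i=1
while [ "$i" -le 5 ]; do
  f="/tmp/stress-$$-$i.sh"
  printf '#!/bin/sh\necho run-%s\n' "$i" > "$f"   # unique content => unique hash
  chmod +x "$f"
  "$f"              # on macOS: time "$f"
  rm -f "$f"
  i=$((i + 1))
done
```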

xvector said 13 days ago:

There is no excuse for this except for sheer, utter incompetence. Everyone involved in writing and shipping this should be ashamed of themselves.

drvdevd said 13 days ago:

This is what I scrolled all the way down this thread for: to see if anyone thinks this is a good design/security decision on Apple's part. I'm trying to understand the reasoning for this particular decision and whether it actually makes the OS more secure in any meaningful way, or just degrades performance with very limited benefits. Are there any real benefits to this vs. the current security design in popular desktop Linux distros at this point?

HappyDreamer said 13 days ago:

Couldn't this have been a business decision, not about security (just what they say)?

To make non-App-store apps annoyingly unusable, so the App store will sell more apps, instead of people downloading in other ways?

Just like Apple cripples the Safari browser and PWA apps.

Long term, maybe Apple wants to be able to remote-forbid apps if Apple is developing their own competing app?

Whilst most developers working at Apple understand this, and don't like it? Maybe the developers even feel happy about people here at HN being disappointed, and think "now the business people here at Apple will notice that this causes disappointment"?

fluffything said 13 days ago:

Most of the apps that sell well originate from a developer solving a need they had, on the system they were using.

If this drives developers from OSX to other OSes, chances are they will develop apps for those OSes first.

Apple is too big to fail at this point, but driving developers away from your platform isn't a very clever strategy. You never know when you are going to hit a tipping point, and by the time you notice that people have stopped using macOS for development, it's already too late.

It took me ~150 hours to migrate to Linux, but my user and developer experience on Linux is much better than on macOS (emacs daemon "just works"!!!), so after all that work I wouldn't consider switching back in the next 5 years at least. I had a MacBook Air 2012, and because Apple still hasn't released a laptop that isn't a downgrade from that in some sense (keyboard, MagSafe, ...), I went with a ThinkPad instead. Tiny details, like having a webcam that doesn't suck, now prevent me from going back to macOS.

saagarjha said 13 days ago:

I don't think the people at Apple are actively trying to make non-App Store apps unusable because they want to make more money from the App Store or anything. It's just that they want code to pass through them, and as a by-product, code that has been vetted less, or that does things that could potentially be abused, is made more annoying to run. Such a change is divisive, as you may have guessed.

michaelmrose said 13 days ago:

That vetting will come at the cost of 30% of money paid for your software and any money earned within the software.

said 13 days ago:
[deleted]
saagarjha said 13 days ago:

It checks that executables have been notarized by Apple? I can't say I really think notarization is great, but I think it's clear from their perspective how it would be beneficial?

drvdevd said 13 days ago:

Sure. But as Brendan Gregg pointed out in his comment - doing this at the level of exec() on a UNIX-like OS is ... a questionable technical choice to say the least.

What’s the Linux equivalent of “notarization”? I’m not sure. Of course there’s probably more than one answer to that; let’s just take signing packages as an example.

In theory Apple could put their weight behind vetting some of the popular open source packages perhaps? Or delegate that to the maintainers of those repositories and make them trusted? Like homebrew, for example (maybe a poor example, but you see how I’m trying to compare this with Linux...)

This is after all, what actually makes macOS useful to people on the command line 99% of the time, anyway.

So anyway, I agree on the surface it seems like this might be beneficial to Apple, but it doesn’t appear to be well considered.

They could invest more time in better sandbox and/or container type features that let people define some of their own more granular security boundaries. But they aren’t I guess? What are they doing here?

pjmlp said 13 days ago:

Apple OSes were never about the CLI; pre-OS X you didn't have a CLI as a standard OS feature.

Selling the UNIX underpinnings was just a marketing move aimed at those willing to betray GNU/Linux and BSD in the name of a better laptop experience, instead of helping OEMs sell their stuff.

Something that NeXT also did against the Sun workstations market.

On the Linux side, this kind of security measure never works, because the moment someone introduces something like this, the distribution gets forked.

It works on ChromeOS and Android, because it hardly matters to userspace that Linux is the actual kernel; Google could embark on a kernel replacement project (and actually is doing so) and most stuff would just work.

saagarjha said 13 days ago:

I'm not sure I particularly appreciate your use of the word "betray" for the BSDs. Sure, macOS is not really a great adherent to the GNU philosophy, but for the BSDs it actually did fairly well for a while. (It's still true UNIX, if barely.)

pjmlp said 13 days ago:

Take as you wish, if those users were actually supportive of the BSDs, they would be giving their hard earned cash directly to OEMs selling proper FreeBSD, OpenBSD, NetBSD, DragonFly based devices.

One cannot give the money instead to Apple and then come back and complain about being misled.

NeXTSTEP was also a true UNIX, but that wasn't why most businesses bought it; rather, it was RenderMan and other graphics-based tooling.

I have used Apple platforms on and off since the LC II days, their commercial view was always quite clear to me.

trasz said 13 days ago:

The problem with BSD on the desktop isn’t the BSD, it’s the desktop. Open Source desktop environments are still ages behind OSX.

saagarjha said 13 days ago:

I am actually curious who sells BSD hardware these days.

pjmlp said 13 days ago:

Examples from Germany,

https://www.tuxedocomputers.com/

They do GNU/Linux, but the BSDs should probably work on their hardware, as mentioned in this old post (sorry, in German).

https://www.tuxedocomputers.com/de/Infos/News/OpenBSD-6-3-cu...

Or by getting in touch with companies like os-cillation.

https://www.os-cillation.de/en/opensourceprojekte/bsd-specia...

saagarjha said 13 days ago:

Thanks for the links. I probably won’t be buying any of those soon, but they looked surprisingly beefy for the price point. As an aside, the immature part of me giggled a bit to see the German for product dimensions:

> max 1,65cm dick

john_alan said 13 days ago:

Watching the notarization video from WWDC last year they explicitly said it wouldn’t affect command line apps.

saagarjha said 13 days ago:

I believe that some of the problems here have actually started affecting command line apps in Catalina.

john_alan said 13 days ago:

Only if you don't specify your terminal as a dev tool

will_pseudonym said 13 days ago:

Hey, malevolence can also play into this. Don't chalk everything up automatically to incompetence. /s

pmarreck said 13 days ago:

There’s going to be a big exodus of open source developers going to Linux-powered platforms instead of the standard Mac laptop because of this ridiculousness

jfkebwjsbx said 13 days ago:

> the standard Mac laptop

There is nothing standard about a Mac laptop, both technically and in market share.

pmarreck said 13 days ago:

Well, I'd say 90% of the computers I've seen at the last 10 confs I've attended were Macbook Pros

https://hackernoon.com/why-do-developers-run-macs-9ad81d58d1...

jfkebwjsbx said 13 days ago:

Look outside the US.

saagarjha said 13 days ago:

At Silicon Valley technology companies? A Mac is generally the computer that you're likely to get.

jfkebwjsbx said 13 days ago:

Silicon Valley is a very small dot in the global scale.

IshKebab said 12 days ago:

It's not just Silicon Valley. In the last two companies I've worked in in the UK everyone had Macbooks.

saagarjha said 13 days ago:

A fairly influential one, nonetheless.

jfkebwjsbx said 13 days ago:

Influential in technology output? Yeah. Influential in Mac market share? Not in the slightest.

Companies around the globe don’t care one bit about which laptops SV companies are buying.

cageface said 13 days ago:

This is happening at my company already because docker performance on Macs is terrible.

millstone said 13 days ago:

On the one hand, of course it is, because Macs are slow at running Linux stuff in the same way that Linux is slow at running non-Linux stuff.

On the other hand, Apple should decide if they care about Docker performance. The answer seems to be "a little" (Hypervisor.framework) but much less than, say, Microsoft.

Apple doesn't talk about their future plans. Today we see stagnation, yet with spikes of exotic ideas (e.g. L4, which would permit efficient L4 Linux).

Per Apple's style, a big kernel change on the Mac side would absolutely be tied to a hardware change, to break things once and not twice. Build a new Mac with a Linux-friendly kernel (perhaps Linux, perhaps modified L4, or something new), put it on their beastly ARM CPUs, and I'm drooling.

Then again I don't work at Apple.

pmarreck said 13 days ago:

Is that slowness possibly related to the OP's issue? And might it benefit from the same workarounds posted here?

saagarjha said 13 days ago:

> And crackers will just use ways to overwrite running instruction text to avoid the exec().

This would require breaking your code signature and as such requires extra entitlements in the hardened runtime.

xenadu02 said 11 days ago:

That's not quite correct. If network access is unavailable or fails then the exec is allowed. The behavior has been improved over time, putting stricter limits on how long the check is allowed to take before giving up.

The Mac remains a Mac: if you turn off SIP it also disables this behavior. You are free to choose less security for more convenience if that is your preference.

ridiculous_fish said 13 days ago:
saagarjha said 13 days ago:

…with everything to do with the sandbox left out.

ridiculous_fish said 13 days ago:

Fair point. These tarballs may be, err, editorialized.

If exec is blocking in the kernel on IPC to some daemon, that should be observable (e.g. Instruments with kernel traces enabled).

saagarjha said 13 days ago:

Yeah, I'm sure a good spindump would be able to find what the code is blocked on. Sadly I run with SIP disabled so that I can attach to things, which means I probably cannot reproduce the issue…

m463 said 12 days ago:

Most of the important parts are left out.

At this point, open source and Apple are sort of on life support.

millstone said 13 days ago:

Well NFS and SMB exist, you can exec() on such mounts.

nromiun said 14 days ago:

> This is not just for files downloaded from the internet, nor is it only when you launch them via Finder, this is everything. So even if you write a one line shell script and run it in a terminal, you will get a delay!

> Apple’s most recent OS where it appears that low-level system API such as exec and getxattr now do synchronous network activity before returning to the caller.

Can anyone confirm this? Because honestly this is just terrifying. I don't think even Windows authorizes every process with a server. This doesn't sound good for either privacy or speed.

mbreese said 14 days ago:

There are two new Security/Privacy Settings that I just noticed last night.

"Full Disk Access" to allow a program to access any place on your computer without a warning. A few programs requested this, so it looks like it's been around for a while.

The other one is "Developer Tools" and it looks pretty new. The only application requesting it is "Terminal". This "allows app to run software locally that do not meet the system's security policy". So, my reading of this is that in Terminal, you could run scripts that are unsigned and not be penalized speed-wise.

oefrha said 14 days ago:

I don't see it on macOS 10.15.4 (19E287). The full list of categories on my Privacy tab:

  - Location Services
  - Contacts
  - Calendars
  - Reminders
  - Photos
  - Camera
  - Microphone
  - Speech Recognition
  - Accessibility
  - Input Monitoring
  - Full Disk Access
  - Files and Folders
  - Screen Recording
  - Automation
  - Advertising
  - Analytics & Improvements
Granted I don't typically use Terminal.app (iTerm 2 user), so I launched terminal and did some privileged stuff. Had to grant Full Disk Access to, say, `ls ~/Library/Mail`, but "Developer Tools" never popped up.

Are you running a beta build or something?

---

Update: Okay, I checked on my other machine and that one does have it (Terminal is listed but disabled by default). What in the actual fuck?!?

xenadu02 said 13 days ago:

You can make the category appear and put Terminal in it with this command:

sudo spctl developer-mode enable-terminal

saagarjha said 13 days ago:

It'd be nice if this were documented somewhere :/

hanche said 13 days ago:

I was going to be that guy and say “man spctl”, but that usage isn’t listed there. If you run spctl with no arguments, it will tell you, however. The man pages on macos really do leave something to be desired.

acecilia said 13 days ago:

This does not make the "developer tools" panel show up in my machine :( tried everything already

saagarjha said 14 days ago:

I don't see it on my machine. Do you happen to have System Integrity Protection disabled?

oefrha said 14 days ago:

No, SIP is fully enabled on both the machine with the Developer Tools category and the one without.

Interestingly, I rebooted the machine without it after some benchmarking and experimentation with syspolicyd (see https://news.ycombinator.com/item?id=23274903), and after the reboot the category mysteriously surfaced... Not sure what triggered it. Launching Xcode? Xcode and CLT were both installed on the machine, but I'm not sure when I last launched Xcode on this machine. Another possible difference I can think of: the machine without it was an in-place upgrade, while the other one IIRC was a clean install of 10.15.

In the worst case scenario, you can probably insert into the TCC database (just a SQLite3 database, located at ~/Library/Application Support/com.apple.TCC/TCC.db) directly:

  INSERT INTO access VALUES('kTCCServiceDeveloperTool','com.apple.Terminal',0,1,1,NULL,NULL,NULL,'UNUSED',NULL,0,1590165238);
  INSERT INTO access VALUES('kTCCServiceDeveloperTool','com.googlecode.iterm2',0,1,1,NULL,NULL,NULL,'UNUSED',NULL,0,1590168367);
(Should be pretty self-explanatory. The first entry is for Terminal.app, the second entry is for iTerm 2.)

Back up, obviously. I'm not on the hook for any data loss or system bricking.
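A sketch of that workflow, using the path and the first SQL statement above (the guard makes it a no-op on anything that isn't macOS, and editing this database still requires SIP to be disabled):

```shell
# Back up TCC.db, then add the Developer Tools grant for Terminal.app.
# Requires SIP disabled; proceed at your own risk.
DB="$HOME/Library/Application Support/com.apple.TCC/TCC.db"
if [ -f "$DB" ]; then
  cp "$DB" "$DB.bak"   # back up first
  sqlite3 "$DB" \
    "INSERT INTO access VALUES('kTCCServiceDeveloperTool','com.apple.Terminal',0,1,1,NULL,NULL,NULL,'UNUSED',NULL,0,1590165238);"
else
  echo "TCC.db not found; nothing to do"
fi
```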

saagarjha said 14 days ago:

> In the worst case scenario, you can probably insert into the TCC database

Does this not require disabling SIP?

oefrha said 14 days ago:

Yes. I got mine to appear through mysterious yet fully SIP-enabled means, but if all else fails for you you can temporarily disable SIP to change this.

Sangeppato said 14 days ago:

Maybe you need Xcode, try running "mkdir /Applications/Xcode.app"

oefrha said 14 days ago:

As mentioned in a reply to a sibling, Xcode has been installed (for like five years) on this machine, and launching it doesn't help. The next step would be to compile and run an application with it, which I haven't bothered.

saagarjha said 14 days ago:

I would expect checks for Xcode to go through xcselect rather than a simple directory check. Installing the command line tools (sudo xcode-select --install) might actually be a better idea to test this.

Sangeppato said 14 days ago:

I thought the same, but actually this method worked for me when I wanted the Spotlight "Developer" option to show up (the CLT were already installed). I have the Developer panel under "Privacy" as well, even though I never installed Xcode on my machine.

mbreese said 14 days ago:

Maybe if you ran Terminal.app once it would work?

(I'm also on 10.15.4 (19E287))

oefrha said 14 days ago:

No, I played around with Terminal.app for quite a while already. Actually the category does show up on another machine of mine (see edit)... I suspected that maybe I never ran Xcode on the first machine since I upgraded to Catalina, so I launched Xcode, but again, no luck. I'm at a complete loss now.

asdff said 14 days ago:

Terminal actually gives an error if you poke into the top-level Library folder with Full Disk Access disabled, with no prompt to change it; I had to look on Stack Overflow for the solution.

0x0 said 14 days ago:

I wonder what "Developer Tools" grants in practice. Clicking the (?) for viewing built-in help does not mention this particular setting; it skips right over it, going from "Automation" above it to "Advertising" below it.

said 14 days ago:
[deleted]
saagarjha said 14 days ago:

I believe it means the process will no longer check for the Quarantine xattr.

0x0 said 13 days ago:

But the quarantine xattr has nothing to do with checking notarization?

ether_at_cpan said 12 days ago:

via https://lapcatsoftware.com/articles/catalina-executables.htm..., I've added an entry in my /etc/hosts to block requests to api.apple-cloudkit.com:

    127.0.0.1 api.apple-cloudkit.com
(Note that /etc/hosts does not support wildcards, so a line like `127.0.0.1 *.api.apple-cloudkit.com` has no effect; blocking whole subdomains needs a local DNS resolver such as dnsmasq.)
ken said 14 days ago:

Full Disk Access was added in 10.14 (2018), so it's relatively new.

jhrmnn said 14 days ago:

I'm using the Kitty terminal, and observed the script launch delay described in the blog post. After adding Kitty to "Developer Tools", the delay disappeared. Thanks!

dTal said 13 days ago:

Making this about speed is burying the lede. From a privacy and user-freedom perspective, it's horrifying.

Don't think so? Apple now theoretically has a centralized database of every Mac user who's ever used youtube-dl. Or Tor. Or TrueCrypt.

gitgud said 13 days ago:

Richard Stallman's ideals have become a bit less crazy for me now...

Either you have the ability to control the software, or it controls you

verytrivial said 13 days ago:

I think coming to this realisation about Stallman's ideas (not the man, mind) is something that most rational computer users are bound to do. It happens at different times for different people, but I think people very rarely go back after that "Hang on a second ....??" moment.

m463 said 12 days ago:

I remember once he said "proprietary software subjugates people" and I just sort of blinked a bit. It seemed sort of over the top. And over time I started to understand that the way things end up working out, it is very true.

hexchain said 12 days ago:

I always wonder why people usually choose to overlook privacy issues with Apple.

First, there was Apple scanning photos to check for child abuse[0] (which obviously got no attention on this site), then there was this one: Apple uploading hashes of all unsigned executables you run.

Do people really accept that company's "privacy" selling point?

[0] https://news.ycombinator.com/item?id=21180019, https://news.ycombinator.com/item?id=22008855

acecilia said 12 days ago:

Is it even legal that Apple is retrieving this information?

threeseed said 13 days ago:

Apple already has every iPhone user's photos, messages, browsing history, keychains etc.

Not sure how a list of installed apps is going to be worse than that.

saagarjha said 13 days ago:

Not if you choose to not sync them.

radicaldreamer said 13 days ago:

Yup, you can choose to not use iCloud backup and back up offline in an encrypted way (even over wifi) if you’d like.

ccmcarey said 14 days ago:

How could this possibly not be absolutely awful for projects that run hundreds of executables during their execution (e.g. some shell wrappers like oh-my-zsh call out to a large number of different scripts every time they run)?

parhamn said 14 days ago:

It looks like it is done once per executable lifetime. Changing the content doesn't cause it to rerun.
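That's easy to check with a small sketch (the filename is arbitrary; on Catalina, wrap the runs with `time` to compare):

```shell
# Time three runs of the same path: a first run, a repeat, and a run after
# rewriting the file in place with different content.
f="/tmp/rerun-$$.sh"
printf '#!/bin/sh\necho one\n' > "$f"
chmod +x "$f"
"$f"    # first run: on Catalina this is the one that may stall
"$f"    # second run: fast, result cached for this file
printf '#!/bin/sh\necho two\n' > "$f"   # rewrite in place, same path
"$f"    # per the observation above, still fast despite the new content
rm -f "$f"
```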

gowld said 13 days ago:

If you don’t trust Apple, don’t run a multi-gigabyte closed-source OS they provide.

parhamn said 14 days ago:

I can confirm that executing a trivial script takes 20-200ms longer on the first run. Using 10.15.

said 14 days ago:
[deleted]
neurobashing said 14 days ago:

Not sure if I'm lucky or somehow disabled something, but the trivial-script problem isn't affecting me on any of my machines. I am using Homebrew for a large % of command line/scripting, so maybe that's why?

greatjack613 said 14 days ago:

Privacy-wise it may be a plus, since in theory notarization provides some protection.

Speed, definitely not, this is going to make things slowwwww

tromp said 14 days ago:

> provides some protection.

That's security, not privacy...

sooheon said 14 days ago:

Although insecurity leads to less privacy as well.

ashtonkem said 14 days ago:

Insecurity leads to loss of privacy, but security does not lead to privacy. Things can be secure and non-private by design.

yjftsjthsd-h said 14 days ago:

Sometimes, but sometimes security measures lead to less privacy. Say, if executing local programs sends information to a remote server.

Razengan said 14 days ago:

If that information can’t be used to identify anyone then it retains privacy while being secure. Being slow would still be an issue.

simion314 said 14 days ago:

But you can't be 100% sure that the server where the information is sent is not putting in a database your IP, the app you run and whatever else. As a power user I would prefer a prompt before anything is sent.

gouggoug said 14 days ago:

I experienced this one day while tethering in the train. I was coding and running `go build` multiple times.

I could not for the life of me understand why go build would take upwards of 30 seconds to run and sometimes 100ms. I finally realized it was related to my internet connection being extremely spotty. I went online and searched whether anybody had the same experience with `go build` but couldn't find anything.

I finally know what happened. This is a pretty intolerable "feature".

lallysingh said 14 days ago:

Does it work at all when unconnected?

enriquto said 14 days ago:

There seems to be a delay of about 5 seconds, then it "gives up" trying to notarize your program.

gouggoug said 14 days ago:

I don't remember if it did or not, but I'm fairly certain it did. (otherwise I'd probably remember it, I think...)

unown said 14 days ago:

As someone living in China, this is my result when I connected to my VPN (this is my normal life, thus I can visit sites like HN):

> Hello

> /tmp/test.sh 0.00s user 0.00s system 0% cpu 5.746 total

> Hello

> /tmp/test.sh 0.00s user 0.00s system 79% cpu 0.006 total

And even if I didn't connect to my VPN:

> Hello

> /tmp/test2.sh 0.00s user 0.00s system 0% cpu 1.936 total

> Hello

> /tmp/test2.sh 0.00s user 0.00s system 78% cpu 0.005 total

That's just ridiculous and unbearable.

Apple should provide a way to disable this notarization thing, and the user should still be able to enable SIP while disabling it.

additional information:

- macOS version: 10.15.4

- terminal: iTerm2 3.3.9

- didn't install any "security" software

neonate said 14 days ago:

Is HN blocked in China?

unown said 14 days ago:

HN has been blocked in China since about 9 months ago.

https://news.ycombinator.com/item?id=20676573

wux said 14 days ago:

I'm curious what your results would be with the stock Terminal. Do you have the settings that others have talked about under "Security > Privacy > Developer Tools" with Terminal.app listed? If so, and the results are better with Terminal, then it'd be interesting to see if the issue is fixed when you add iTerm2 to the list of exempted apps as well.

unown said 14 days ago:

I have tried what you suggested. Granting "Developer Tools" access definitely FIXED THIS ISSUE for the specific application.

Here is the new result (I only run once for each case):

    ╒══════════╤═════════════╤═══════════════════════════╕
    │          │             │ +"Developer Tools" access │
    ╞══════════╪═════════════╪═══════════════════════════╡
    │ terminal │ 1.448/0.004 │ 0.016/0.004               │
    ├──────────┼─────────────┼───────────────────────────┤
    │ iTerm2   │ 1.240/0.006 │ 0.024/0.007               │
    ╘══════════╧═════════════╧═══════════════════════════╛
`1.448/0.004` means the first time it is `1.448 total`, and the second time it is `0.004 total`.

(It seems I have "good" VPN/internet connection condition at this time)

airstrike said 13 days ago:

Upvoted for ASCII table alone

ccmcarey said 14 days ago:

It doesn't work when there's no network connection. I wonder whether it would be possible to filter out and automatically block the notarization traffic, or if it's all encrypted with cert pinning to prevent this type of MITM+filter.

Karliss said 13 days ago:

Dropping packets when there is an otherwise working connection could potentially make the delay even worse depending on timeout or retry strategy used by Apple code. I assume that in the fast case without network connection it checks the network status flag and doesn't try to do any network connection at all.

ttsda said 14 days ago:

I'm still on 10.14, but I guess it will show up on Little Snitch. Unless they bundle it with some other more essential traffic.

chipotle_coyote said 14 days ago:

Okay, I've tried this test on my MacBook Air 2020 several times, first by saving the "echo Hello" shell script in an editor and then, because I wasn't getting the results the author experienced, trying again exactly as he wrote it. Essentially the same result:

    airyote% echo $'#!/bin/sh\necho Hello' > /tmp/test.sh
    airyote% chmod a+x /tmp/test.sh
    airyote% time /tmp/test.sh && time /tmp/test.sh
    Hello
    /tmp/test.sh  0.00s user 0.00s system 74% cpu 0.009 total
    Hello
    /tmp/test.sh  0.00s user 0.00s system 75% cpu 0.007 total
Is it possible that Allan Odgaard, as good a programmer as he unquestionably is, has something configured suboptimally on his end? Because it just strikes me as super unlikely that Apple has modified all the Unix shells on macOS to send shell scripts off to be notarized. (From what I've read, while shell scripts can be signed, they can't be notarized, and Gatekeeper is not invoked when you run a shell script in Terminal -- although it is invoked if you launch a "quarantined" shell script from Finder on the first run, but it treats the shell script as an "executable document." This is the way this has worked for years, as I can find references to it in books from 2014.)

I have my complaints with macOS Catalina, and I know that Apple's "tighten all the screws" approach to security is anathema to a lot of developers (and if there was a big switch that I could click to disable it all, I probably would), but I'm using Macs running Catalina every day and I gotta admit, they just don't seem to be the dystopian, unlivable hellscape HN keeps telling me they are. At least off the top of my head, I can't think of anything I was doing on my Macs ten years ago that I can't do on my Macs today. ("Yes, but doing it today requires an extra step on the first run that it didn't used to" may be inconvenient, but that's not the same thing as an inability to perform a function -- and an awful lot of complaints about modern Macs seem to be "the security makes this less convenient." There's an argument to be had about whether Catalina's security model strikes the right balance, of course.)

Sangeppato said 14 days ago:

I don't experience a delay in Terminal.app either, but I've tried running the script with a fresh install of iTerm2 while capturing with Wireshark and it does look like the script triggers a connection to an Apple server

varenc said 13 days ago:

I initially saw the delay in Terminal.app, but then it went away! I've made sure Terminal doesn't have the "Developer Tools" permission, but the network request and its delay are still absent.

However, I was able to reproduce this by downloading a whole new terminal app, Alacritty. With the random script and file path I can always reproduce the delay in Alacritty. My guess is Terminal.app might have some special case behavior?

See my comment above on some shell script that does the random file name stuff for you.

false_kermit said 14 days ago:

I just ran the same script on iTerm2 and had no delay.

Sangeppato said 14 days ago:

I had no delay either, until I reinstalled iTerm2; I have no idea why

chipotle_coyote said 14 days ago:

Obviously I can't say that's impossible, it would just be... very weird, and would seem to contradict what Apple Developer Relations was saying on Apple's devrel forums as recently as this year.

defnotashton2 said 14 days ago:

So it's an actual, documented fact that it happens. I agree that overall Mac OS X still has a very nice UX and I'll never go back to Windows. But it's very clear Apple is platforming their OS to the degree they have iOS. It's not weird that it's happening; it's real life...

grishka said 14 days ago:

> and if there was a big switch that I could click to disable it all, I probably would

First, disable SIP to allow yourself to modify the system. Then, disable AMFI, the component responsible for code signature checking, entitlement enforcement and all that very useful stuff, with a kernel argument:

    sudo nvram boot-args="amfi_get_out_of_my_way=0x1"
Then you should be done.

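
For completeness, the whole sequence looks roughly like this (macOS-only; csrutil has to be run from the Recovery OS terminal, reached by holding Cmd-R at boot):

```
# 1. In the Recovery OS (Cmd-R at boot), open Terminal and turn off SIP:
csrutil disable
# 2. Reboot into macOS normally, then set the AMFI boot argument:
sudo nvram boot-args="amfi_get_out_of_my_way=0x1"
# 3. Reboot once more for the boot-args to take effect.
```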
nightowl_games said 12 days ago:

That argument reads to me like the implementer knew this stuff was obtrusive.

jaykru said 13 days ago:

I might be wrong about this but if you're running a shebang'd script directly as an executable, they wouldn't need to modify the behavior of the shell itself but rather the executable loader. It would be interesting to see whether, e.g., `bash test.sh` doesn't phone home where "./test.sh" does.
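
A quick sketch of that experiment (the filename here is made up; use a fresh path each run, per the tips upthread, so caching doesn't mask the first-run delay):

```shell
#!/bin/sh
# Compare interpreter invocation (the script is read as data by sh) with
# direct execution (the kernel exec()s the file via its shebang).
F="/tmp/probe-$$.sh"                      # fresh, previously-unseen filename
printf '#!/bin/sh\necho Hello\n' > "$F"
chmod a+x "$F"
time sh "$F"     # interpreter path: "bash test.sh" style
time "$F"        # exec path: "./test.sh" style
rm -f "$F"
```

If only the exec path shows the first-run delay, that would point at the loader rather than the shell.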

ehutch79 said 14 days ago:

10 to one says this is because you've run something calling /bin/sh before.

If he switched /bin/sh out for /bin/zsh or /bin/bash, whichever his default shell was, he wouldn't have seen the first delay.

chipotle_coyote said 14 days ago:

That's plausible -- but I'd be (mildly?) surprised if Apple hadn't pre-okayed binaries they supply with the OS. Even if you flip the Super Paranoia switches in privacy settings, you don't need to give macOS explicit permission to launch Apple-supplied binaries from the Finder.

mrits said 14 days ago:

Most vendors have separate engines for detecting malicious scripts. I'd assume notarizing is more about executables, in which case it would be checking the signatures around the shell binary.

Also worth noting that "echo" doesn't spawn a process; it's a routine in the shell itself. If you replaced echo with something that does spawn a process (like scp), it would be interesting to see the results. And if that doesn't introduce latency, then I'd try it with some hello-world programs with a UUIDv4 in the binary to ensure they haven't seen the hash before.

saagarjha said 14 days ago:

> Also worth noting "echo" doesn't spawn a process but is a routine in the shell itself.

In Bash echo is a builtin but /bin/echo also exists if you do actually want to spawn a process.
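
A quick way to see the difference (exact wording of the output varies by shell):

```shell
#!/bin/sh
# The builtin runs inside the shell process; /bin/echo is a separate
# executable that goes through exec(), i.e. the path Gatekeeper can hook.
type echo          # reports the builtin
/bin/echo Hello    # spawns a real process
```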

mrits said 13 days ago:

Maybe OP edited a few times but it doesn't look like they are doing that to me

saagarjha said 13 days ago:

I'm not sure I understand?

said 14 days ago:
[deleted]
fxtentacle said 14 days ago:

try again with a randomized filename

saagarjha said 14 days ago:

There was a thread on the almost-forgotten Cocoa-dev list about this: https://lists.apple.com/archives/cocoa-dev/2020/Apr/msg00008...

Catalina has a huge number of things that synchronously block application launch, and if any of them fail you get nothing but a hung app. A friend and I have a running discussion of the many ways an application would just hang, and we’d send samples and spindumps to each other trying to figure out the right daemon or agent to kill to get the process to start responding again. It’s madness.

twhb said 14 days ago:

I tested whether running a script you just wrote really contacts Apple to “notarize” it. It does.

I first used the author’s timing method. First runs are consistently about 300 ms, subsequent runs consistently about 3 ms. Something is happening at first run.

Some in the comments are saying it’s “local stuff”, so I tested timing again with internet off. First runs go to about 30 ms, subsequent remain the same. So there is “local stuff”, but it doesn’t explain the delay.

Just to be entirely sure, I installed Little Snitch and got clear confirmation: running a script you just wrote results in syspolicyd connecting to api.apple-cloudkit.com. syspolicyd is the Gatekeeper daemon.

I don’t know what exactly is being sent. Maybe somebody else can do a proper packet analysis.
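
For anyone picking that up, a possible starting point (macOS-only commands; the predicate syntax belongs to the unified logging system, and exact output varies by build):

```
# Terminal 1: watch the Gatekeeper daemon live while running a fresh script
sudo log stream --predicate 'process == "syspolicyd"' --info

# Terminal 2: capture its traffic for later inspection in Wireshark
sudo tcpdump -i any -w /tmp/notarize.pcap host api.apple-cloudkit.com or host ocsp.apple.com
```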

mindfulhack said 13 days ago:

I still love macOS, a lot. Since moving over after the disaster that was Windows 8 (and by then I was already using MacBook hardware), I've become a loving power user e.g. with AppleScript and setting up hotkeys or other ways to do absolutely anything I want on the screen. It really is still as powerfully customisable as Linux. Turn off SIP if need be.

My only problem in moving to Linux software is that I prefer Apple's hardware. I'm on the 2019 16-inch MBP. Linux's compatibility with all the T2 and SSD hardware isn't there yet, but apparently it almost is.

If Linux on the T2 MBP becomes solid and stable in the next 1-2 years, after extensive testing I may move over permanently. I already use Linux on secondary computers, and I love and value its privacy. Same with my phone. I just love my privacy.

My needs are a high bar though. Productivity must be held back by nothing. I use macOS Notes extensively and it syncs with my iPhone, which is an extremely useful tool for me to note things down, both in audio and in text. It needs to be reliable and - heh - 'just work'. I just discovered the cross-platform 'Standard Notes' app; with a bit more money paid out to Linux-compatible services like that, maybe it can all work. Casual Photoshop can be taken care of via a VM.

Surprisingly, macOS Catalina is itself a disrupter to my productivity. It seems buggy as hell - glitchy, and weirdly slow for many extremely basic things - all since Catalina. I just don't get it. Is it caused by this article's observation? Something's definitely going on.

Maybe Apple will fix this in the next release? Like how they fixed the keyboard?

Either way, I still want to move to Linux on this fabulous (fixed) hardware that is the 16-inch MBP. (T2 issues aside.)

fphhotchips said 13 days ago:

I have a 2019 Macbook Pro 16in and I hate it. It runs exceptionally hot (leading to massive performance problems), doesn't get enough power from the adapter to start with no battery, doesn't play nicely with my display, needs restarting every couple of days so Chrome doesn't crash and takes forever to boot.

That's just the technical problems. I'm willing to give the UI a break, since it's probably as much me adjusting as it being bad.

This is my first Apple anything, and if this is what "just works" looks like, I don't want it. I could be more productive on an Android tablet at this point.

mindfulhack said 13 days ago:

Actually, I do agree with you with some of those observations. Apple's been trying to fix their terrible T2 issue and I suspect some of the problems lately have been them trying to prevent the T2 reboot crash, while ruining other parts of the experience in the process as a necessary compromise. It may get worse (or better) as they move to all-Arm architecture.

I also am sick of the touch bar now - after 2 years living with it. I have to press it twice to actually pause my media, because it's an LCD screen and it has to auto turn off to prevent burn-in. That's a regression from the old hard media button in the Fn row which was both instant and far easier to press. At least we got 'Esc' back.

But man, their trackpad...nothing beats it. Still.

saagarjha said 13 days ago:

> it's an LCD screen

OLED.

mindfulhack said 13 days ago:

I hear OLED can be just as bad if not worse. So same diff.

saagarjha said 13 days ago:

Much worse. Just explaining why that would be a problem.

arkis22 said 13 days ago:

Mine starts spinning up the fan (there's kind of a pattern as to when), heating up the entire computer. The computer had previously been fine.

I usually have to restart and reset the "SMC" to stop the fan from nuking the computer.

I can let the computer drop to 5% battery life and the fan will turn off and the computer will cool down. Which is the opposite of what you want if it was actually overheating.

carnitas said 13 days ago:

Counterpoint, I also have the 16 inch 2020 MBP as my first Mac work laptop and absolutely love it. No issues, it works perfectly, and I’m 2x as productive on it as I was on my previous Ubuntu setup.

ochoa said 13 days ago:

Do you write anywhere online about your workflow setup using AppleScript? It sounds interesting. I’d like to configure my macOS experience more.

mindfulhack said 13 days ago:

Oh it's not like I have a Cmd+<X> for every single possible task you can imagine; it's a very tailored and customised set of sometimes complicated scripts for my weird personal needs that I've built up over the years.

Each time I want to do something, I goddamn will spend 8 hours figuring it out if have to. E.g. this: https://apple.stackexchange.com/a/381441/163629 - one hotkey to change macOS Notes text into a specific hex colour (and/or bold etc). It took me a day but I worked it out. Where there's a will there's, 99 times out of 100, a way.

You can seemingly do almost anything with AppleScript. Emphasis on almost.

Here's another example: Right after I plug in my iPhone via USB, I have one hotkey to automate a little-known feature of macOS where you can turn your Mac into a speaker dock for the iPhone. Awesome thing when you have the dramatically improved 16-inch MBP speakers. Here's my AppleScript for that; just customise according to your iPhone name near the bottom and try it out: https://pastebin.com/raw/9BY710Y6

YMMV; if you have additional audio devices in Sound prefs you may need to change the code a bit.

AppleScript also has the ability to run Unix shell scripts and commands, so with Homebrew able to install most common Linux packages, you can go wild if you want.

I'm definitely not 'advanced' applescript level, I'm intermediate. Hundreds of HN readers would know more than me. I just google and think until I find a way. I'm not a programmer.

I have other shortcuts e.g. to control the MPV media player even if it's not the currently active window. Again, weird personal needs, but awesome. AppleScript to the rescue.

FastScripts is how I assign universal hotkeys to any of my applescripts.

guildmaster said 12 days ago:

Would be great if you could write about the scripts you hack to optimize your workflow

ronyfadel said 14 days ago:

I hope Apple currently has a team focused on macOS perf.

I worked on the team in charge of improving iOS (13) perf at Apple and IIRC there was no dedicated macOS “task force” like the one on iOS.

Luckily some iOS changes permeated into macOS thanks to some shared codebases.

bentcorner said 14 days ago:

I agree. This kind of behavior certainly smells like teams doing their development work on high-capacity low-latency networks without much performance oversight.

yariik said 14 days ago:

> I hope Apple currently has a team focused on macOS perf.

Apple doesn't give a fuck about macOS since 2015.

cjsawyer said 14 days ago:

I wonder what % of their users are developers only begrudgingly sticking around for iOS builds.

pier25 said 14 days ago:

> IIRC there was no dedicated macOS “task force” like the one on iOS

It's not surprising. Macs are less than 10% of Apple's revenue.

https://www.macrumors.com/2020/04/30/apple-2q-2020-earnings/

robenkleene said 14 days ago:

Except all of Apple's other devices are built on macOS. Apple's clear de-prioritization of macOS based on revenue numbers is so insane I can barely believe it's happening. If developers, who use Macs in large numbers today, go to another platform, there's very real risk that their entire empire starts to come apart at the seams. And, this may just be me being naive, but it doesn't seem like that much work to keep macOS going, all they have to do is stop trying to turn it into iOS. They are literally doing a tremendous amount of active engineering work that drives developers away from their platforms.

They are risking their entire empire because (apparently) someone at Apple has an axe to grind with macOS's Unix underpinnings. And until they start seeing real consequences (developers leaving in huge numbers), it doesn't seem like it's going to stop. The tragedy is, if they ever do reach that point, where developers are leaving in huge numbers, it'll be too late. Platforms are a momentum game: you're either going up, or you're going down. And once you're going down, you're as good as dead.

fxtentacle said 14 days ago:

Agree. That's probably also one reason why more and more people want to use cross-platform app frameworks instead of developing for iOS natively. That way, you can do most of the dev work on Windows and Android, and you'll only need a Mac & Xcode for compiling the iOS binary.

And I'd wager that some iOS games are released without the developer ever touching Xcode: https://docs.unity3d.com/Manual/UnityCloudBuildiOS.html

saagarjha said 13 days ago:

Signing and submitting apps to Apple is fairly annoying to do without Xcode.

fxtentacle said 13 days ago:

Unity has a service where they do it for you.

saagarjha said 13 days ago:

Where you give them your key?

fxtentacle said 13 days ago:

Yes. The procedure is explained in the link that I posted.

saagarjha said 13 days ago:

I'm not sure I'd be entirely comfortable with that, to be honest.

gubikmic said 14 days ago:

100% agree! If more people understood this, I hope this narrative would gain some traction and eventually reach Apple management.

To me, the idea that an OS is mostly finished is completely bananas. There's so much room for improvement and hardly any of that potential was tapped into in what's starting to feel like a decade.

And if Apple had invested into a successor for Cocoa, there might be a larger gap between native apps and (Electron) web apps, leading to some lock-in. Instead most new stuff is not native and for good reasons (and I do dislike the way they don't adhere to Mac conventions, but still).

I think ultimately the problem is Tim Cook. He's too attached to Apple's stock price. I think that's the one metric that he believes rates his performance. But inertia is a bitch. Like in politics, the effects might hit hard only once he's out and it could be too late to fix by then.

If I think about how much this impacts the economy overall (i.e. make millions of knowledge workers a little bit less efficient) then I can only hope that I'll see more sophisticated organizational structures in my lifetime that prevent such erosion.

indemnity said 13 days ago:

Tim Cook is Apple’s Ballmer, who is their Nadella?

plmu said 14 days ago:

I was thinking exactly this, 8 years ago. I moved from an imac + mbpro to linux only.

It took longer than expected. I even intended to buy put options, but someone I trust told me otherwise and to invest in equity instead, which I did, because I know that most buy decisions are not made rationally.

But it looks like the time has come now? On the other hand, I have been off by several years before. People are crazier than you think, especially when it comes to status, association with brands, and self-confirmation of past decisions. They might well put up with Apple's moves for a few more years.

robotresearcher said 14 days ago:

But at Apple scale: 9% of $58 billion = $5.2 billion Mac revenue last quarter.

ksec said 14 days ago:

Yes, that is what drives me crazy whenever people say the Mac is only 9% of revenue and that they don't care about it.

If the Mac revenue was separated out on its own, it would be about Fortune 120, that is higher than Kraft Heinz. With plenty more space for growth. Apple only has 100M Active Mac users. There are 1.4B Windows PC.

pier25 said 14 days ago:

OTOH when Apple was a much smaller company the mac was much more important to them and it showed.

Maybe it's not related to revenue per se, but clearly since iOS became their main thing the Mac has suffered tremendously.

said 14 days ago:
[deleted]
valuearb said 14 days ago:

Apple's Macintosh division is the most profitable PC business in the world and has been for at least a decade. In fact, the Mac is likely more profitable than all other PC companies combined.

Less than 10% is no excuse.

pier25 said 13 days ago:

Like I said in another comment, it's not about the revenue per se, but it's undeniable that the more popular iOS is, the less Apple cares about the Mac.

_underfl0w_ said 13 days ago:

Do you have a source for that claim?

goatinaboat said 14 days ago:

> It's not surprising. Macs are less than 10% of Apple's revenue.

Without Macs for developers and other content creators that other 90% doesn’t exist.

ARandomerDude said 14 days ago:

Exactly. Especially given the Xcode lock-in nonsense.

qppo said 14 days ago:

It's surprising that they don't improve the developer experience for their own developers using their own tools, including hardware.

saagarjha said 14 days ago:

Apple uses the same tools you do. They just might not be using it like you are; you can find a lot of features that clearly have no reason to exist outside of Apple nonetheless shipping with their software.

yariik said 14 days ago:

> Apple uses the same tools you do.

No. A special directory can be created at the root of the file system called /AppleInternal. Then, if you work at Apple, you can put some special files there that do stuff. I've read somewhere that they are able to easily disable all of this privacy protection crap and other annoying stuff.

saagarjha said 13 days ago:

There's nothing really special about /AppleInternal, it's just a fairly normal directory that a couple of tools change in order to do things like offer more detailed diagnostics or the option to create a Radar. On a normal internal install there are some internal utilities, many of which are listed here: https://www.theiphonewiki.com/wiki/Category:Apple_Internal_A.... But their code is all Xcode projects and stuff, it's not like they're really using special tools for themselves except in certain cases. There are a couple of internal tools that possess entitlements to bypass security, but more often than not engineers just run with the security features disabled, which you can do yourself.

qppo said 14 days ago:

That's kind of my point - it's surprising to me that they're shipping slow hardware and software, when they're used to develop that same hardware and software. Developer time is expensive.

saagarjha said 14 days ago:

I would actually be quite happy if the engineers were forced to work on four-year-old MacBook Pros and develop against Display Zoomed iPhone 7 and the second generation Apple Watch, using the toolchain and software they push to their developers.

asdff said 14 days ago:

Is there a list somewhere of Apple's in house dev environments or workflows? I wonder what cool tricks they use internally that could be pretty useful generally.

saagarjha said 14 days ago:

Nothing special that can really be talked about without internal context. You can get a hint at how they use their own tools, though (which are available externally), if you pay careful attention to their public appearances and presentations.

ronyfadel said 14 days ago:

Very messy internally, every team has their own.

callinyouin said 14 days ago:

I wouldn't be surprised if they've determined that developers will generally put up with a bad experience in order to have access to the massive iOS market.

arvinsim said 14 days ago:

There isn't much incentive to improve because they know that people will buy their hardware regardless.

Not to mention people defend and market their products for free.

pier25 said 14 days ago:

Maybe internally they are using a different version of macOS?

saagarjha said 14 days ago:

It’s basically the same ones you’re running, possibly a couple builds ahead and with all the security features turned off.

azinman2 said 14 days ago:

Nope

codeisawesome said 13 days ago:

I find it funny how people are downvoting your innocent comment pointing out a fact... out of anger and hate for the actual fact :D

markdog12 said 14 days ago:

What changes permeated into macOS? What did your team do to improve iOS perf?

ronyfadel said 14 days ago:

So many of the frameworks have shared code between macOS and iOS (e.g. MapKit, Foundation, Contacts etc..), so a perf fix in iOS pays dividends on macOS too.

Perf changes are too numerous to mention; I’d recommend watching last year’s WWDC keynote describing the iOS 12 vs. 13 perf advancements.

neuronic said 14 days ago:

They set "fast = true" as a global constant variable.

shripadk said 14 days ago:

I would give anything to have my Mac be fast again. I have no idea what changed but even 10.14 feels a whole lot slower than it was earlier. Haven't upgraded to 10.15 seeing all the negative reviews it is getting when it comes to perf. Apple needs to seriously give perf a priority for Mac. Do they really expect developers to use a Mac to develop Apps when it is slow as molasses? I shudder to think what will happen to the Apple ecosystem if developers migrate to another OS for development. Apple will come crashing down. I don't wish for that to happen but looks like there is absolutely no one at Apple focused on making it better.

acdha said 14 days ago:

Remember, people don’t write blog posts saying nothing changes. The negative reviews tend to be one of two things: spotlight reindexing shortly afterwards, or attribution error where every new thing is blamed on the OS upgrade and similar old behavior is mentally discounted. App development didn’t suddenly get “slow as molasses” and for most users the install was a reboot and back to work.

leephillips said 14 days ago:

This is completely insane. I am so glad I decided years ago to leave closed operating systems behind.

This design seems to cement the trend at Apple to position their products as consumer appliances, not platforms useful for development.

Nextgrid said 14 days ago:

> I am so glad I decided years ago to leave closed operating systems behind.

The problem is, there's nothing else out there. Everything is going to shit in one way or another. Windows is now a disaster, Linux was always a disaster in terms of user experience and isn't improving.

Mac OS was the last bastion of somewhat good, thoughtful design, user experience and attention to detail and now they've gone to shit too.

julianeon said 14 days ago:

If you add "unfixable" to "disaster" the problem becomes more clear.

Windows is an unfixable disaster; you can't fix it, sorry.

Mac OS is now an unfixable disaster; you also can't fix it, sorry.

Linux may be a UX disaster, but you can, uniquely, modify it. You can change your UI. You can attempt to fix the problem, and have a real shot at doing so.

Linux is the only one where you can do something about the problem - which is a strong reason to prefer it.

gurkendoktor said 14 days ago:

Not only can you modify Linux in theory, it is actually getting _easy_ to do so.

The biggest reason I enjoy elementary OS as a distro is that everything lives on GitHub, package releases happen through GitHub Actions, etc. Fixing a bug can be faster than merely filing a radar in the Apple ecosystem.

oscribinn said 13 days ago:

>Linux was always a disaster in terms of user experience and isn't improving

I'm honestly pretty baffled as to what keeps this meme alive, as KDE and GNOME are both very popular and provide simple, intuitive interfaces for the typical user. Plasma is only complex if you're the type that really wants to customize, but there its complexity is (mostly) necessary for its wide range of possible configuration. People have this idea that desktop Linux users are all a bunch of dorks playing around with Arch and tiling window managers all day and then posting their anime wallpaper setups on /r/unixporn, but that hasn't actually been true for a long time.

nightowl_games said 12 days ago:

Yeah Linux is awesome. I don't get the hate either. I have like 5 apps I use in Linux Mint, and they look exactly the same way they do in MacOS (Spotify, Discord, Firefox, Godot, Sublime, VSCodium, Terminal)...

The settings UIs in Mint are easily way better than in Windows and Mac.

Yetanfou said 14 days ago:

> Linux was always a disaster in terms of user experience and isn't improving.

Nonsense, 'Linux' can be what you make it. You can have it as sleek as something straight out of the fruit factory or as spartan as a VT100 and anything in between. If you're new to the game the pre-packaged 'consumer' distributions might be a good starting point but for those with a bit of nix savvy - of which I assume there to be many on this board - those bells and whistles probably just get in the way.

If my 8yo daughter and my 82yo mother can use Linux - the latter through a remote X2go session from her kitchen table in the Netherlands to my server under the stairs in Sweden - I'd say people around here can be assumed to be able to handle it. The nice thing about 'Linux' is that you can change out those parts which you find disagreeable for whatever reason for those you like better, this in contrast to that last bastion of *somewhat good, thoughtful design, user experience and attention to detail* which by your own statement has been changed into excrement. Just take out the shitty bits and replace them with something better... oh, no, not possible...

That is why the parent poster is right in this sense, things in 'Linux' land might not be perfect - and can never be 'perfect' since one person's perfection is another's nightmare - but at least you get to do something about it.

kick said 14 days ago:

> Linux was always a disaster in terms of user experience and isn't improving.

Curious: what have you tried? People who use "Linux" as a catch-all in terms of UX usually have only tried a single distribution with a single desktop environment.

tsukurimashou said 14 days ago:

I feel like people still have in mind what the Linux desktop was 15-20 years ago. It has improved a lot in recent years: battery life on laptops got better, Ubuntu, which was already very stable and feature-complete, also gained a lot in recent releases, and I've personally been running Arch on my main computers for 5+ years now and haven't had any major issues while upgrading.

defnotashton2 said 14 days ago:

Try using the latest version of software that has a more frequent release cycle than Arch. If you hit an incompatibility, there goes your install.

I have yet to see a distro do multi-monitor hi-DPI with readable fonts out of the box.

This gets updated yearly - https://itvision.altervista.org/why.linux.is.not.ready.for.t...

ubercow13 said 13 days ago:

This list is quite comprehensive, but also quite boring. It's just a list of bugs and things that are suboptimal on Linux. You could write one about any operating system. Some of the items like 'such-and-such needs to be configured using a text file' are also not even real problems.

What do you mean by 'there goes your install'? There are multiple ways you could run bleeding-edge software before it's packaged for Arch. See for example every 'xxx-git' package in the AUR. Or Flatpak.

hexchain said 12 days ago:

Arch does not have a release cycle, sorry.

m463 said 14 days ago:

People who have used ubuntu might want to just once try arch linux.

I had an Ubuntu machine that took a while to boot even with an SSD. Later I installed Arch Linux on the same machine and boom! It would be at the desktop in seconds. It was night and day.

zozbot234 said 14 days ago:

Debian is just as quick, and does not have the problematic "rolling" updates of Arch. (It does have the "testing" and "unstable" channels which are roughly comparable, but the Debian folks won't tell you to use them in production.)

m463 said 13 days ago:

> problematic "rolling" updates

Rolling updates for me have not been problematic.

I've had a few updates that gave an error message, and they were easily fixed in one minute after searching the arch website.

I think one was a key expired - I had to manually update it and redo the update process.

The other I can recall was a package that had become obsolete/conflicting and a question had to be answered.

In general rolling updates are a tiny blip every few months.

In comparison, the several debian based distributions I've run have been a "lost weekend" type of upgrade for major updates.

kick said 14 days ago:

Debian is not just as quick (significantly slower and higher resource usage), but Arch isn't all that fast nowadays, either.

Yetanfou said 13 days ago:

Debian - or Devuan if you don't want systemd - can be made as spartan as you want. It boots in those mentioned few seconds on my 15yo T42p (Pentium M@1.8GHz, 2GB). Use Sid/Unstable if you want more up-to-date software with the accompanying larger flow of updates.

catalogia said 14 days ago:

> Debian is not just as quick (significantly slower and higher resource usage)

In which respects? Are you talking about apt vs pacman or something? Default DEs?

kick said 13 days ago:

Default install; a default Debian install has about 3x as many processes running.

the_af said 14 days ago:

Moreover, I've been running Linux for decades now, both in my personal laptop and at work, and Ubuntu has been (mostly) frictionless for me. I'm not an average user, of course, but for most users a friendly distro would work just as well as Windows (browsing the internet, using whatsapp web, watching movies). In some cases I've had a better user experience with Ubuntu than with Windows or OS X, namely seamlessly installing a wireless HP laser printer.

hrktb said 14 days ago:

I only tried Ubuntu, a few months ago. For the day or two I spent with it:

- multi-language support requires a lot of work to get to the same point as macos.

In particular I use third party shortcut mappers to get language switching on left and right command keys (mimicking the JIS keyboards, but with an english international layout). That looks like something I’d have to give up on, or code myself.

- printer support is not at the same level.

Using a xerox printer, some options that appear by default on macos were not there on ubuntu. I’m sure there must be drivers somewhere, or I could hunt down more settings. But then my work office has two other printers. It would be a PITA to hunt down drivers every time I want to use another printer.

- HiDPI support is still flagged as experimental, and there’s a bunch of hoops to jump through to get a good setup in multi-monitor mode. Sure it’s doable, but still arcane.

- sleep/wake was weird. It would work most of the time, but randomly stayed awake after closing the lid, or didn’t wake up when opening it. Not critical, but still not good (I’d hate to have the battery depleted while traveling)

Overall if I had no choice that would be a fine environment. But as it is now, with all its quirks, I feel macos is still a smoother environment.

the_af said 14 days ago:

Fair enough. I'm not a Mac OS X user so I don't know how it would compare. I can only compare it with my past experience with Windows, and I think it's superior (for me) to Windows circa 7 -- I stopped using Windows entirely at that point, so I wouldn't know how later versions of Windows fare.

Portability is also a fair issue to raise, but it's simply not a problem for me. When I say Linux "on the desktop", I literally mean it: to me a laptop is simply a slightly more portable desktop computer. I sometimes take my work laptop to/from the office, and the battery lasts long enough for that. I'm not worried about longer trips, since I don't use laptops for that. Again, if you do care about this (which is completely fair), I'm aware many Linux distros still have issues with battery life. You certainly can't compete with a Macbook Pro, that's for sure!

I do note that my experience with printers is opposite to yours. Like I said, when trying to connect to an HP wireless printer, Ubuntu autodetected and self-downloaded the necessary drivers; however, it took a lot of patience to get it to work with a Macbook Pro. Now that I have it configured for my Ubuntu laptop and my wife's Macbook Pro, the Mac will sometimes fail to print (the print job simply gets stuck in limbo) while my laptop prints reliably. Who knows?

And like I said in another comment, I game (or used to, anyway) a lot with Ubuntu, and many games are even AAA (though they tend to arrive later than on Windows).

So I really have a hard time believing Linux is not "ready for the desktop". It is, and has been for many years now.

edit: one last thing. You mentioned HiDPI modes, multimonitor, multilanguage... none of those are for average users. My mom would be comfortable browsing the net, reading mail and watching movies on Ubuntu. She doesn't even know what HiDPI is, nor does she want external monitors. (Spoiler: she still uses Windows because she can't learn anything else at this point... I've thought of tricking her by theming Ubuntu to look like Windows, but that would just be mean).

bgorman said 13 days ago:

Without HiDPI support lots of applications become useless on a HiDPI display. Steam, for example, does not respect HiDPI settings in GNOME 3 even when setting custom environment variables.

the_af said 13 days ago:

It's probably a case of "I don't miss what I don't use" then. I'm a power user, I cut my teeth with MS-DOS and I've been using Linux for work and gaming for more than a decade (and less intensive usage before that) and I really never noticed anything about HiDPI. That has to mean something :)

hrktb said 13 days ago:

Thanks for the additional details.

For the printers, you are right in that it’s far from being a solved problem on macos. I had an EPSON all in one before, and it was also a pain to get everything working. If I remember correctly the generic driver could print, but we didn’t get “advanced” options without going through the EPSON pkg installer and all the garbage coming with it. I’d totally imagine the linux driver being done cleaner than that.

For the record I’ve worked with a decent number of devs using linux workstations, so I totally vouch for your use case. I’d just temper the niche nature of multi-language support; that’s an everyday need for basically all of Asia. Granted my use of shortcuts is niche (I wouldn’t need them if I had enough keys), but looking at maintenance projects’ annual reports there seems to be a sizeable amount of quality of life fixes still on the way.

the_af said 13 days ago:

Right. I forgot about Asia. In that case it must be painful, agreed!

bgorman said 13 days ago:

With Linux you have to pay for proper support. HP is by far the best company in terms of supporting Linux printers. It isn't the Linux ecosystem's fault that other printer companies do not care.

BruceEel said 14 days ago:

Interesting. I regularly use RHEL (server/CLI only) but have not tried desktop Linux in a while.

I get a fair bit of weekly exposure to Windows 10 and well, it's not like heaps of fun, UX wise.

I'm reluctant to drop Apple mainly because I'm so 'tied up' with the rest of the ecosystem, iphone, Apple Music, iCloud etc.. They are not irreplaceable (for sure) but it always feels like moving away will cost way too much effort and be a pain... Well played, Apple.

The_Colonel said 14 days ago:

> I'm reluctant to drop Apple mainly because I'm so 'tied up' with the rest of the ecosystem, iphone, Apple Music, iCloud etc.. They are not irreplaceable (for sure) but it always feels like moving away will cost way too much effort and be a pain... Well played, Apple.

This is why I don't want anything by Apple.

addicted44 said 14 days ago:

This is a good point.

It's really hard for me to use non-i3wm-supporting OSes now, even though I have to use Windows for work, and have used Macs for the better part of the last 2 decades personally and in college.

lone_haxx0r said 14 days ago:

I use Linux everyday, and it's a UX disaster. I have tried Gnome, Xfce, Cinnamon, KDE, I like none of them. The only DE that I somewhat liked (Unity) was discontinued.

Linux sucks, but I use it because it sucks less than Windows, for programming at least.

KajMagnus said 13 days ago:

How interesting, I like Cinnamon and Gnome and KDE, but didn't like Unity. Instead, for me, the problem is poor printer support.

dmitriid said 14 days ago:

> Curious: what have you tried? People who use "Linux" as a catch-all in terms of UX usually have only tried a single distribution with a single desktop environment.

Yup. You've just described a disaster. How many permutations of <hundreds of distros> x <dozens of DMs> must a user try before finding a good UX?

kick said 14 days ago:

Mac is a BSD. OpenBSD exists. FreeBSD exists. NetBSD exists.

Because there are at least four BSDs, Mac therefore isn't good.

Do you see how ridiculous applying that logic to any operating system is?

Linux isn't a disaster. It's a kernel. There are Linux distributions with great user interfaces and great UX, developed by people who are great at it. There are also distributions that aren't.

BruceEel said 14 days ago:

> There are Linux distributions with great user interfaces and great UX

Could you name some? No sarcasm, actually interested!

kick said 14 days ago:

It sort of depends on what really fascinates you, right? I'll avoid naming some of the most popular ones, because it's likely that you've already tried them. If you haven't, I'd really recommend giving them a try. Many people seem to really love them.

In terms of defaults:

I've heard really good things about Solus, and its use of AppArmor seems really cool. Never touched its package manager, so I won't recommend it, but it might be worth checking out. Its desktop environment is really snappy and has an interesting design philosophy.

Elementary is really cool as a boutique distribution; I don't personally feel any urge to use it seriously (I dislike apt as a package manager), but I always keep its live environment on a flash drive, because it works without any setup on basically anything I throw it at, painlessly, and without error. It's got a cool indie app store full of curated Elementary-centric free software, and overall just feels great. Using it, you'll probably notice a few areas that it clones Mac on, and a few that feel delightfully different.

Clear Linux (Intel's desktop distribution) is pretty popular right now because of how simple it is & how Intel seems to be going to great lengths to optimize it and make it a serious contender, but I don't like its desktop environment (vanilla GNOME 3 as far as I'm aware) all that much.

ChromiumOS is probably the best-designed desktop operating system on the planet right now technically, and I say that as a person who really hates Google. UI-wise it's so-so, but UX-wise it's really something special.

But more interesting are desktop environments in general, since they can be used with any variant of Linux you feel the urge to use. There's an exception there, though, in that Elementary's DE and Deepin's DE tend to not work so well or nicely on platforms that aren't Elementary or Deepin.

There are modern environments:

Plasma has hands-down the best UX of any sort of desktop operating system assuming you've got an Android smartphone; you say you're coming from Apple's environment, so imagine the interop between your Mac and your iPhone, but going both ways instead of just Mac -> iPhone. Texting, handling calls, taking advantage of the computing resources of connected devices, using your phone as an extra trackpad, notifications, unlocking your PC, painless file-sharing, pretty much anything you'd like. There are a bunch of distributions that ship with Plasma by default.

Solus's Budgie is kind of neat in that it takes the main benefit of GNOME 3 (ecosystem) with far fewer downsides.

There are also retro environments, if those are your thing. There's a pretty much perfect NeXTSTEP clone (including the programming environment, not just the UI), amiwm is still pretty interesting, there are clones of basically every UNIX UI under the sun, so on.

I'm not the best person to answer your question, because for the most part I don't go out of my way to use new desktop environments and distributions, and nothing above is my first choice. (In terms of window management, I usually stick with 9wm & E just because I have ridiculous ADHD and 9wm forces me to focus while E allows me to tile painlessly if I ever need it. I use three distributions overall, none of which are very popular at the moment, pretty much solely because I'm really picky with package managers & design philosophies.) That's a "me" issue rather than a Linux issue, though.

BruceEel said 14 days ago:

This is excellent and indeed largely novel information, thank you.

It sounds like finding the right combination of DE and package management solution plays a big role here. I don't remember much of my experience with Gentoo's package manager in the early 2000's other than finding it generally did its job (if a bit slowly)... Experience with package managers on Mac (brew, macports) hasn't been great so I'm eager to play around with modern ones on Linux. Same goes for the DE actually: stock, out-of-the-box, macOS is essentially unusable for me until I get my customization (scroll, trackpad, KeyboardMaestro) done exactly right. I can't imagine this not being better on Linux, if anything for the ability to switch among the various DEs.

I'm starting to contemplate this (fully untested) strategy: try out a few distros, install the one I like best on VMware Fusion, and then try to use it as much as possible, falling back to macOS if I get stuck or am short on time, but gradually replacing Mac-specific stuff as I find suitable replacements. TextMate, the masterpiece of Allan Odgaard (author of the article being discussed here), is probably going to be the toughest one. If I'm successful, I should eventually be able to let Linux 'out of the box' and run it on real hardware..

PS: amiwm! This is going to be a must. I do miss the Amiga, a fair bit..

kick said 14 days ago:

My favorite package managers, personally:

xbps

apk (terrible interface; wonderful technically)

pacman (wonderful interface; so-so technically; dislike the distro that uses it because of technical choices)

InstallPackage (GoboLinux is kind of cheating, because InstallPackage isn't a "real" package manager, but that's kind of the point)

I love TextMate, too! Something you might find nice is how easy it is to run Mac in a VM on Linux; there are scripts that manage the entire thing for you, and it's pretty painless (and so fast; I was surprised). Useful if you have a few packages you can't find replacements for.

You mention Apple Music elsewhere, which you might be interested to know has an Android client and a web client, and you can probably get a native client on Linux, though I'm not immediately aware of one.

BruceEel said 14 days ago:

> I love TextMate, too! Something you might find nice is how easy it is to run Mac in a VM on Linux; there are scripts that manage the entire thing for you, and it's pretty painless (and so fast; I was surprised).

That would be excellent! I like the idea of swapping host and guest with this VM strategy, sort of evolutionary platform switching.

kick said 14 days ago:

Take a look at this! It's pretty simple; it just fetches macOS and then gives you a shell script that launches qemu with a few flags:

https://github.com/foxlet/macOS-Simple-KVM

Really, really fast, and fairly painless.
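For anyone else curious, usage is roughly the following (based on the repo's README at the time of writing; the flags may have changed since):

```shell
git clone https://github.com/foxlet/macOS-Simple-KVM.git
cd macOS-Simple-KVM

# Fetch the macOS installation media (flags like --mojave or
# --high-sierra select other releases).
./jumpstart.sh --catalina

# Boot the VM with qemu using the generated settings.
./basic.sh
```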

BruceEel said 14 days ago:

It's fetching the disk image right now. Gold... Thank you!

BruceEel said 13 days ago:

..and it works, high sierra, is back!

lioeters said 13 days ago:

Thank you for writing this overview of interesting Linux distributions, their UX and package managers, such good info.

The last few years I've run Linux VMs on a Macbook, but I'm transitioning to a Linux desktop probably running a macOS VM, which you mentioned in another comment - didn't know there was a practical solution.

It sounds like distros like Elementary and PopOS might suit me as a gentle transition from Macs.

3combinatorHN said 14 days ago:

Stable distributions (Fedora, Manjaro, Ubuntu) with any of the major desktop environments (GNOME, KDE, Xfce) all work, UI/UX-wise.

dmitriid said 13 days ago:

> Do you see how ridiculous applying that logic to any operating system is?

Somehow, when you ask a person about PC or a Mac, the answer is: Windows or MacOS, and then the discussion is about their quirks, or advantages, or deficiencies.

You ask about Linux, and this is what you get:

> Linux isn't a disaster. It's a kernel. There are Linux distributions with great user interfaces and great UX

So, once again: which one of the hundreds of permutations of <distro> x <DM> has a great UX?

kick said 13 days ago:

Ask a person about UNIX, they'll list Mac, Solaris, whatever. All UNIX distributions! I listed a bunch elsewhere in this subthread. Feel free to check them out, but for some reason I'm beginning to suspect that you're probably not going to.

saagarjha said 14 days ago:

macOS is actually kind of mediocre at being a BSD these days ;)

the_af said 14 days ago:

Ubuntu pretty much works out of the box for a lot of "regular" users (I'm excluding gaming, which also works but is not as easy).

I'm sure there are other user-friendly distros that similarly let average users browse the internet, write documents, listen to music and watch movies painlessly.

captainbland said 14 days ago:

I'd say gaming on Ubuntu LTS (if not Linux in general) is quite easy provided you stay in the safe haven of games that natively support the OS, which to be fair is a pretty solid selection of games these days albeit one which is pretty much a strict subset of the games on Windows. As soon as you go outside that area and start messing with Wine or whatever all bets are off, though.

the_af said 14 days ago:

Agreed! I play a lot of games on Linux, bought via Steam or GOG, occasionally with help of WINE but mostly without. I excluded gaming because if one thing is likely to cause more problems than on Windows, it's games. But yes, I use Ubuntu even for gaming.

The fact I can install Steam and play an AAA like Mad Max or Shadow of Mordor mostly seamlessly makes me wonder why people still claim Linux on the desktop is a no-go.

konart said 13 days ago:

>The fact I can install Steam and play an AAA like Mad Max or Shadow of Mordor mostly seamlessly makes me wonder why people still claim Linux on the desktop is a no-go.

Because they and a few others are exceptions? Can you play the latest CoD? GTA V? Assassin's Creed maybe?

the_af said 12 days ago:

I think you're missing the point. I'm not arguing that Linux is the best platform if your use case is primarily gaming. Nothing beats Windows -- or a console! -- if gaming is the most important thing to you.

> GTA V?

I honestly don't know, but it wouldn't surprise me if I could using WINE. A huge library of Windows AAA games work on WINE.

> Assasin's Creed

I don't know, but Mad Max and Shadow of Mordor are pretty much the same kind of game as Assassin's Creed, following the same kind of gameplay and using the same kind and complexity of 3D graphics/engine.

In any case, these are not exceptions. I forgot to mention the XCOM remake, Alien: Isolation (this is interesting because it has tons of graphics effects, including chroma aberration -- it looks awesome on Linux), SOMA, Victor Vran, Warhammer 40K Dawn of War II, L4D2, and many others. There are tons of Linux games on GOG and Steam, many of them AAA games. If you count indie games or 2D platformers there are literally thousands of them, but I guess that's not what you're after.

konart said 12 days ago:

My questions were mostly rhetorical.

My point is that you can't actually run most AAA games, and many of those you can will give you enough problems (like frame drops or some graphical features being unavailable).

And I really don't understand what's the point of being able to run some games. I want to play the games I'm interested in, not the ones that 'are playable'.

>I don't know, but Mad Max and Shadow of Mordor are pretty much the same kind of game as Assassin's Creed, following the same kind of gameplay and using the same kind and complexity of 3D graphics/engine.

Not sure what your point is here. You can't replace one with another just because they have similar mechanics.

Steam/GOG have many games that run on linux and macos (by the way), but most of them are indie platformers or things like that. People don't play random games just to kill some time (well, some do), they play TITLES.

> I forgot to mention

more exceptions. They will stop being exceptions when you can run 80% of titles without any issues, and not sooner.

Gaming is not important to me; I've been a PS4 guy ever since the macOS switch. I'm just pointing out that gaming still has little to do with Linux unless we are talking about rare AAA titles and the indie scene.

the_af said 12 days ago:

My point is that Linux is a valid gaming platform with many AAA titles and tons of indie games, not that it's the best or ideal gaming platform. Of course Windows is better for gaming.

> And I really don't understand what's the point of being able to run some games. I want to play the games I'm interested in, not the ones that 'are playable'.

With this definition neither Windows nor the PS4 are valid gaming platforms, since not every game can be played on them.

> They will stop being exceptions when you will be able to run 80% of titles without any issues and not sooner than that.

So now it's 80% when before it was "a few exceptions"? Sorry, I'm uninterested in discussing your arbitrary definitions with you. Nice try moving the goalpost.

PS: re: "without any issues", back when I used Windows for gaming, there was always some issue. The graphics card, drivers, config issues. I guess Windows is not a gaming platform either then?

catalogia said 14 days ago:

> Yup. You've just described a disaster.

Hardly. The existence of a distro I don't like doesn't degrade my experience using a distro I do like. You may as well be upset at an ice cream shop for having dozens of flavors when you only like strawberry. Choose the one you like and ignore the ones you don't. It's not rocket science, even children can figure that out.

wtallis said 14 days ago:

> The existence of a distro I don't like doesn't degrade my experience using a distro I do like.

The problem under discussion here is not that of using a distro you like, but finding a distro that you like.

catalogia said 14 days ago:

If an icecream shop only has one flavor, I might get lucky and discover it's the flavor I like. But more likely, I'll just be screwed and have to settle for something I don't like. Only an icecream shop with variety can hope to give the most amount of people an optimal experience.

dmitriid said 13 days ago:

Unless the ice cream shop provides you with hundreds of flavours, 90% of which are nearly indistinguishable from each other. And hardly anyone on this planet can answer a straight question of "Which flavour is good".

catalogia said 13 days ago:

If they're 90% indistinguishable, how is that distinguishable from an icecream shop that simply has fewer flavors?

swebs said 14 days ago:

Linux has been a delight to use for me. Things were rough 10-15 years ago, but it's pretty amazing now.

BruceEel said 14 days ago:

Any distro in particular you'd recommend?

markosaric said 14 days ago:

Fedora 32 Workstation is pretty good if you want to see the best of what Linux can offer. It may not be the lightest and fastest distribution but it is easy to install and everything works. You'll get to experience Gnome which is the most original Linux desktop environment and the best one in terms of user experience in my opinion.

If you want something more traditional with the start menu or dock or desktop icons, perhaps something like KDE Neon is better place to start. It might feel more familiar. Will be lighter/faster too.

Put each of them on a USB and run them live on your machine for a few minutes each and see which one makes more sense to you.
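Writing the live image to a USB stick is the same for all of them; a minimal sketch (the ISO filename is just an example, and /dev/sdX must be your USB device - double-check with lsblk first, since dd will happily overwrite the wrong disk):

```shell
# Identify the USB stick first (e.g. /dev/sdb); dd is destructive.
lsblk

# Write the ISO and flush it to the device before unplugging.
sudo dd if=Fedora-Workstation-Live.iso of=/dev/sdX bs=4M status=progress conv=fsync
```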

vetinari said 14 days ago:

Ubuntu, Pop!_OS, Fedora...

Each of them has something done better than the others, but all of them are delight to use.

tsukurimashou said 14 days ago:

not him but same experience, from my previous comment:

I would recommend: Ubuntu, Linux Mint, Elementary OS, Pop!_OS

if you want: nice experience out of the box

I would recommend: Arch, Gentoo, Debian Net inst, Void

if you want a base system and install things you want on top of it

BruceEel said 14 days ago:

Thank you @all for the suggestions! I'm going to set aside some time to experiment with these and see how far I get.

tsukurimashou said 14 days ago:

Nice, I would like to hear your experience with it once you do that

BruceEel said 13 days ago:

Well, my head is spinning, but I've made a bit of progress. I thought I'd start by trying out a few of the ones you and others have characterized as user-friendly as well as one of the more bare-bones ones.

The (hopelessly unscientific) test plan was:

Challenge 1 - write live system ISO to USB drive and boot it on my 2015 MacBook Air (which, though old, still counts as exotic, I guess.)

Challenge 2 - make sure display, network, trackpad and keyboard (+ intl. layout) work correctly. Be able to SFTP to my Mac

Challenge 3 - with little to no docs reading (how is the package manager invoked from CLI?), use the terminal to set up the right environment for a couple of relatively portable hobby projects I've been recently working on (on Mac), compile and test them. This includes, among other things, installing clang or g++, SDL2, Wine (to run an ancient ARM assembler) and finding a usable GBA emulator.

Limitations:

   A: 8GB RAM. More ambitious stuff (KVM macOS, Visual Studio Code) will have to wait for an actual install.
   B: Deliberately avoiding exposure to the docs is silly but I thought 
      such an approach would give me an indication as to whether 
      there exists a distro that uhm, "thinks like me".

Candidates: Ubuntu, Mint, Fedora, KDE Neon (which, if I'm not wrong, is Ubuntu LTS preconfigured as the latest KDE) and Void.

Results:

Challenge 1: unremarkable. All worked right off the bat except for Void, which made it as far as showing the mouse pointer but then froze.

Challenge 2: well, boring ;) All distros were pretty much ready to use and required minimal tweaking. With the tweaking part ranging from effortless (Mint) to minor headscratching (Neon). Not sure whether /etc/X11/XF86Config still exists but I did not miss editing it today.

Challenge 3: more interesting:

Neon: all worked as expected except some trial and error required to get Wine working: wine32 was required but it wasn't getting installed by default, apparently. (Not a whole lot easier on Mac anyway, with separate downloads & installs for Wine and XQuartz)

Ubuntu: I failed, as apt refused to acknowledge the existence of the packages I needed. This is weird, as I believe Neon relies on the same package database. Though undoubtedly my fault for not reading the manual, it is perhaps a bit interesting that I could not readily find my way around the problem.
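A guess at what went wrong (an assumption, not verified against that live session): the live environment's package lists may be empty or stale, and some of those packages live in the 'universe' component, which is not always enabled. Something like this usually sorts it out:

```shell
# The live session's package lists may be empty or stale; update first.
sudo apt update

# If apt still can't find the packages, enable the 'universe'
# component and update again.
sudo add-apt-repository universe
sudo apt update

# Example packages from the test plan above.
sudo apt install clang libsdl2-dev
```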

Fedora: everything worked except for Wine, as the live system ran out of memory (disk space) on installing it. Not a big deal, everything else worked very well. Aside: I'm an avid runner and "DNF" is not the most likeable of names for a program I have to use very frequently! j/k..

Mint: everything worked at take one.

I know this isn't even scratching the surface of the surface but I think for now I'm going to go ahead and play more with Mint and Fedora after installing them on MB Air hardware or MB Pro VMware.... with a mind of getting back to KDE/Neon eventually.

tsukurimashou said 13 days ago:

Interesting! Thanks for posting your feedback. I think Mint is really great; I'm an Arch Linux user but I like having Mint installed on some laptop. The installation is very straightforward and I feel it's way less bloated than Ubuntu, for example. And pretty much everything worked out of the box with the laptops I've installed it on (mostly Dell laptops).

I haven't used Ubuntu much lately but I remember always having to add a community repository to get some package I needed. (Also one of the reasons I love Arch: a lot of packages there are updated more quickly than on most distros, plus the AUR for everything not present in the official repos.)

BruceEel said 13 days ago:

Aye, very happy to have found what look like really viable alternatives, this is promising. And if I manage to make the transition, I will eventually want to try out more sophisticated distro's like Arch, I am quite sure of that.

BruceEel said 13 days ago:

I shall post my findings.

RockIslandLine said 14 days ago:

Gentoo needs vastly better documentation to be useful.

tsukurimashou said 13 days ago:

I would say that Archwiki covers a lot of things for a lot of distros, but yeah I would only recommend Gentoo to 'advanced' users, or if you really want to get into it the hard way.

bproven said 14 days ago:

IMO Fedora or Ubuntu. I've used Fedora now for the last few years on Thinkpads (currently Carbon X1 6th gen) and it has been pretty much "just works"

t289yhoi said 14 days ago:

The trick is to go all-in on KDE if you want that Windows feeling where things just work.

distances said 13 days ago:

And in that case the distro choice should be KDE Neon.

BruceEel said 13 days ago:

...added to the list.

2OEH8eoCRo0 said 14 days ago:

Fedora or Ubuntu

coldpie said 14 days ago:

I think the fact is there simply isn't a solution that works for both the "layperson" and highly technical people who want to do development. Laypeople cannot be trusted to admin their machines, but experts need access to those bits. Leaving a backdoor to real admin access for the experts just means laypeople will abuse those backdoors and mess up their machines again, with dire consequences for the entire planet. You see the same problem with power user UI features vs dumbing down for phones and average users. People keep trying to bridge this divide and I'm just not sure it can be done.

AlexandrB said 14 days ago:

> Laypeople cannot be trusted to admin their machines

Yeah, but they're the ones who paid for their machines. So... you're saying they're not allowed to use them how they wish?

> Leaving a backdoor to real admin access for the experts just means laypeople will abuse those backdoors and mess up their machines again

Remembering the last 20 years of computer history, most of the critical fail wasn't caused by "laypeople abusing backdoors" but horrible security holes in popular, widely used software packages: Outlook, Flash, Acrobat Reader, Internet Explorer. Apple/Microsoft are not locking down their OSs to protect users from themselves, but rather from other developers. We, software engineers, seem to have completely failed our users as a profession.

saagarjha said 14 days ago:

Someone being tricked into installing malware doesn't usually make the news.

bitcharmer said 14 days ago:

> Linux was always a disaster in terms of user experience and isn't improving.

This is as true today as saying Java is slow. Why not just try it? You might get pleasantly surprised.

chacha2 said 13 days ago:

I've tried it recently and still find it true. Death by a million paper cuts.

bitcharmer said 12 days ago:

What did you try recently? Java or Linux?

saagarjha said 14 days ago:

Chrome OS?

leephillips said 14 days ago:

I happen to enjoy using linux on my laptop. In fact, I think it’s pretty great. But that’s because I can customize it to work the way I want—something that I found hard or impossible to do back when I was using macOS.

paddlesteamer said 13 days ago:

I hate bloated OSs and unfortunately Mac OS is one of them. I know how everyone wants everything to work out of the box, and I know it's very natural to want that, but I cringe if I find out my OS is doing something behind my back. That's why I'd never use Windows, Mac OS, Ubuntu, etc. They all violate my privacy and slow my system to do so.

I use Debian, I like Debian. When I run Wireshark I don't see unknown requests destined to debian.com. That is the definition of simplicity for me. And yes, it doesn't always work out of the box, you have to install some drivers, change configurations but it's getting better and easier. Yet, I'm a software developer so I understand and like that stuff.

> Linux was always a disaster in terms of user experience and isn't improving.

No, you can't define it as a disaster, it's not. If you're an end-user who understands nothing of computers, maybe you can, but otherwise it's not a disaster. It's just harder, and getting easier by the day.

dhruvkar said 14 days ago:

>> Linux was always a disaster in terms of user experience

Try Pop!_OS. I switched from macOS and it's been a relatively painless experience with some tweaks.

t289yhoi said 14 days ago:

The funny thing is, Linux has amazing User Experience if you go all-in on the latest KDE and its associated tooling.

bitwize said 13 days ago:

I set my Mac-loving girlfriend up with Kubuntu for this reason.

3combinatorHN said 14 days ago:

I’m pretty sure that you have never used Linux... Just try it.

godzillabrennus said 14 days ago:

Buy a Mac and put ElementaryOS on it to avoid the slowdown and have a slick experience.

https://elementary.io/

zozbot234 said 14 days ago:

Might want to make it a used/refurbished Mac. Newer Macs don't run Linux well (at least as of yet); the whole T2-chip based stuff on newer machines is especially problematic.

Terretta said 14 days ago:

Summarizing the comments, roughly: are you running third-party "security" tools?

> Is there any "security" software running on your Mac? I've seen this sort of thing caused by that, but not in general.

> I ran the two line test and it had no delay at all. The Mac doesn't check for notarization on shell scripts or any non-bundle executable. I just did it again with a new test2.sh and Wireshark capture and there is nothing.

> I do a lot of Keychain code and I've also never seen those delays. The reason I suspect they told you not to use that API is that it's in the "legacy" macOS keychain. They really want everyone to move to the modern keychain but lots of people, myself included, still need the older macOS specific features.

> I'm not saying you are crazy, but all of these things though are the trademark reek of kernel level security software that is intercepting and scanning every exec and file read on the system. We had an issue with Cisco AMP once that took Xcode builds from under 10 seconds to over 5 minutes until we were able to get it fixed.

oefrha said 14 days ago:

The only kernel-level security software on my systems is Little Snitch, and I’m pretty sure it doesn’t do anything unless there’s network activity, so it doesn’t explain anything.

oasisbob said 14 days ago:

Reminds me of the terrible delay I faced after having Sophos installed on my Mac.

Having to wait 5-10 seconds for a new terminal tab as Sophos churns (checking autocomplete scripts, rbenv, etc.) was infuriating. Oddly, there was fate sharing with its Internet interception, so there was a good chance the browser was getting dragged down too, and vice versa.

Convincing corporate IT of how bad the problem was was maddening. Based on what this author says, 10.15 on rural internet sounds like hell.

jwlake said 14 days ago:

The funny thing is it's not transitive. No slowdown if you invoke bash explicitly with a new shell.

% rm /tmp/test.sh ; echo $'#!/bin/sh\necho Hello' > /tmp/test.sh && chmod a+x /tmp/test.sh

% time bash /tmp/test.sh && time bash /tmp/test.sh

Hello

bash /tmp/test.sh 0.00s user 0.00s system 83% cpu 0.004 total

Hello

bash /tmp/test.sh 0.00s user 0.00s system 77% cpu 0.003 total

vs the one from the article:

% rm /tmp/test.sh ; echo $'#!/bin/sh\necho Hello' > /tmp/test.sh && chmod a+x /tmp/test.sh

% time /tmp/test.sh && time /tmp/test.sh

Hello

/tmp/test.sh 0.00s user 0.00s system 2% cpu 0.134 total

Hello

/tmp/test.sh 0.00s user 0.00s system 73% cpu 0.004 total

(edited for formatting)

saurik said 13 days ago:

When you run "bash hello" you are calling exec() on bash, passing "hello" as an argument, which bash then reads; when you run "./hello" you are calling exec() on hello: the kernel then treats "hello" as an executable, but notes that "hello" starts with "#!" and then will run the specified interpreter for you, passing "./hello" as an argument. The kernel doesn't think of "hello" as a program when you run "bash hello".
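To see the distinction in one place, here's a small sketch combining both invocation styles (the random filenames are just there to dodge the per-file cache on repeat runs):

```shell
# Sketch: the first-run check only hits when the kernel exec()s the script
# itself; "bash script" exec()s bash, and the script is just an argument.
# Fresh random filenames avoid the cached "already seen" fast path.
direct="/tmp/direct-$RANDOM.sh"
indirect="/tmp/indirect-$RANDOM.sh"
printf '#!/bin/sh\necho Hello\n' > "$direct"
cp "$direct" "$indirect"
chmod a+x "$direct" "$indirect"

time "$direct"          # exec() on the script itself
time bash "$indirect"   # exec() on bash; the script is just an argument
```

On an affected Catalina system the first `time` is the one that shows the extra delay.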

said 14 days ago:
[deleted]
azinman2 said 14 days ago:

Are you sure it's just not cached from the prior result? If I run the article's commands twice in a row, the 2nd time is faster.

halotrope said 14 days ago:

I am using Ubuntu 20.04 on a ThinkPad X1 Extreme Gen 2 and you would be surprised how "normal" it feels as a development machine. Sure, there are some little annoyances: the touchpad behaves a little worse than on Windows, sound is a little worse. But the most important things, keyboard and screen, are excellent. The system in general does not feel like the horror stories that people keep telling about Linux on the desktop (notebook). Now that WSL2 is getting CUDA, even Windows looks workable. Their new terminal app is amazing. After a decade of Mac notebooks it was quite liberating, and I would not switch back even if the flaws in macOS were fixed. macOS is for sure the nicest of the big 3 operating systems, but for development work Ubuntu is hard to beat for me. YMMV, but it won't hurt to look around at what else is out there.

kristopolous said 14 days ago:

I've been seeing the trajectory of Windows (pre-2012 or so) -> Mac (2012 - ~2019 or so) -> Linux (~2018 - now) play out with quite a few people without any issues.

And I don't mean developers. They're all pretty educated people but it's taken me by surprise. They come to me in frustration over Mac, they don't want to return to Windows and they really, really, really want linux. I've been using linux since about 1997 so they come to me. I usually push back, thinking "do you really want a unix workstation?!" but they insist.

My strategy has been some x2xx lenovo (like x230 or so) for about $300 from ebay, 8/16gb of ram or so with an SSD, the extended battery pack, putting mint on it and then just handing it over. Everyone, much to my continued surprise, has loved it and are really happy with it.

It's happened 4 times now and I'm still shocked every time. They've told me they use youtube to figure things out.

They're fine with libreoffice, gimp does what they need, supposedly spotify works on it fine, they don't know what bash or the kernel is and it's all fine. Incredible.

alluro2 said 14 days ago:

Adding to the anecdotes: same trajectory for me, for web development. Really happy with Manjaro on a Razer Blade 15 for a year now.

azinman2 said 14 days ago:

I recently _really_ tried adopting Linux on a hobby development machine that I built back in 2016 (hardly new hardware -- and a desktop, not a laptop). Sleep never worked, graphics sometimes borked, the UI felt janky and inconsistent, icons are super fugly and often too theme-y to the point of being undifferentiated at a glance, HiDPI support is a giant mixed bag (in 2020), the machine would randomly freeze (mostly elementaryOS; Ubuntu didn't freeze as much), Hauppauge drivers rarely worked consistently and often required reboots, I hated the mouse acceleration curves and was horrified to learn they were effectively hardcoded in X (I'm not talking just speed, which is tweakable), GStreamer was a nightmare to develop for, the Ubuntu & elementaryOS stores are a joke, and the mix of apt/snap/nix was very frustrating and the opposite of user-friendly.

I switched back to my 2012 MBP and it's predictably gone well since, plus I get iMessage integration with my iPhone.

YMMV

bproven said 14 days ago:

Yeah - the hardware really has to be curated. I haven't tried using a machine cobbled together from various parts (custom desktop), but off-the-shelf quality laptops have worked fine for me for the last 2 years or so, with none of the issues you mentioned. Emphasis on quality - not cheapo models. I think if you treat Linux the same as OSX and run it on known-good hardware that's well supported by Linux, you are fine today, IME.

> HiDPI support is a giant mixed bag

I will say that this is still a thing, although with experimental GNOME fractional scaling support it works pretty well now.

Honestly I have a 2019 macbook pro 15 and have more problems with it than I do with my Thinkpad X1 Carbon 6th gen with Fedora 32.

kristopolous said 13 days ago:

See, that's the response I was used to and the one I expected to get from everyone.

The crazy thing is that I haven't heard it yet from the people I helped. Times may actually be changing now, just not swiftly. Perhaps it's the "decade" of desktop linux.

It's also not because linux is so great but because windows and apple are constantly stumbling over their own shoelaces and shooing customers away.

FullyFunctional said 14 days ago:

True. Amusingly, I was always trying to make Windows behave more like Unix, but now I'm trying to make Linux behave more like Mac (just a few things, like the global keyboard bindings).

The major pain points are nearly all related to lack of integration with my iPhone (with Messages being the big one, followed by Notes).

said 14 days ago:
[deleted]
neuronic said 14 days ago:

Not associated with it at all, but because I love it I wanted to share Photopea, since you mentioned GIMP.

https://www.photopea.com

kristopolous said 14 days ago:

try this:

$ google-chrome --app=https://www.photopea.com

peferron said 14 days ago:

Seconded. I used to work on a Mac laptop for years, then started using a beefy Linux desktop tower on the side for some work that benefited from higher hardware resources. A few months later I realized that I had slowly grown into doing all my work on Linux, even when I didn't need the hardware, mostly because i3 and apt were so much better than the Mac equivalents, and that I was only opening my Mac laptop to walk into meetings. After realizing that I ditched the Mac laptop for a Linux laptop and haven't looked back.

I still use a Mac at home for entertainment (I'm typing this comment on one), and I have to say it works much better used that way. I don't have to worry anymore about random Mac OS upgrades breaking functionality that Apple doesn't care about because it's not part of their vanilla out-of-the-Apple-Store experience, but is vital to me as a developer such as 3rd party window management, dock improvements, keyboard tweaks, or not delaying every new execution by phoning home (LMAO).

kstenerud said 14 days ago:

Yup. Ubuntu 20 is the first desktop linux OS that just worked. Every other Linux desktop before it has had suspend/resume issues, wifi issues, sound issues, 3d issues, ratchet settings (things that can be set but never unset without some arcane magic), weird desktop behaviors, buggy software that crashes all the time, etc etc. Yes, I've tried ALL of them, including pop os and deepin.

This year marks the first year that I can just use linux without having to debug it.

zozbot234 said 14 days ago:

These things are highly hardware-dependent. Typically it takes a few years until support for new hardware devices, features or platforms stabilizes. But it can even take way more than that, and some less common and lower-quality hardware may fail to get support altogether.

mindfulhack said 13 days ago:

But macOS is very hardware dependent too.

huffmsa said 14 days ago:

Been putting off upgrading from 16.04; I finally got it working a while back and was afraid to touch it.

Might give 20 a shot

julianeon said 14 days ago:

Longtime Linux user (Manjaro) and I never thought I'd see the day when I could pitch it as noticeably superior to MacOS, considering Apple's once-legendary attention to user interfaces. It seems like those days are behind us, now.

Linux as an actually better experience, without gigantic embarrassing flubs like this, is looking better by the day.

cerberusss said 14 days ago:

A slowdown when you run an app for the first time, for security reasons -- I wouldn't categorize that as a "gigantic embarrassing flub". I haven't noticed it, actually. But I don't run new apps every day.

julianeon said 14 days ago:

I think you're misunderstanding the problem, respectfully. This is not a problem for end users. This is a problem for developers - and calling something this bad a gigantic, embarrassing flub is justified.

Think that's hyperbole? Look at this, from the link:

> The first time a user runs a new executable, Apple delays execution while waiting for a reply from their server. This check for me takes close to a second.

> This is not just for files downloaded from the internet... this is everything. So even if you write a one line shell script and run it in a terminal, you will get a delay!

Consider a developer in this situation.

If your job involves lots of scripting - not unusual, for a dev - and you create dozens of scripts a day, or more - every single one will take about a second, and up to 7 seconds (!), to run the first time. And that could easily happen upwards of a dozen times a day, because it will happen for each script you create.

That's pretty terrible, for a developer. I don't think you can normalize a 1-second startup time for some hacky script as pretty okay or not noticeable. Certainly not if you're talking about a high-end work machine.

Times that bad are associated with some junk laptop that's 15 years old - that's not supposed to be Apple.

Even if you build apps (I do), you might have the need to create scripts now and then, possibly even a lot of them (I do, for testing). I don't consider it acceptable to wait 1 sec+ each time I run one. It really does suggest that Apple has gotten extremely careless about their developer audience.

So, yeah - compared to that, Linux performs way better, and looks like a premium work machine by comparison.

marssaxman said 14 days ago:

I never intended to switch away from Mac OS; it just sort of... happened. As Mac OS has grown more paternalistic over the years without adding any notable capabilities that I care about, it's felt steadily easier to just go use Linux instead. It has its own frustrations, but it can always be made to do what I want, and then it just behaves. Starting around Ubuntu 16.04, I found that the balance of frustration was tipping; these days I don't really bother to use my personal Mac any more. I still have one for work, but I'd certainly rather use Linux there too if I had the option.

slaw said 14 days ago:

For touchpad issues in Ubuntu uninstall xserver-xorg-input-synaptics and keep only xserver-xorg-input-libinput installed.

chacha2 said 14 days ago:

Isn't Ubuntu much worse than this with the push for Snap packages? It can take 10-30 seconds to open software installed through it.

simion314 said 14 days ago:

From what I've heard, the Snap package complaints are mostly FUD; Ubuntu still uses normal packages for everything except the application store. You can always use Debian or Kubuntu if you prefer function over form.

seertaak said 14 days ago:

I have a ThinkPad with Ubuntu 19. I'm very happy with it; it's nice to have apt, and to be able to, e.g., use minikube with the Docker driver rather than a VM.

It's also true that the trackpad isn't as good as Windows. (It used to be that Mac had the best, but Catalina managed somehow to screw up the trackpad and make it laggy. Catalina has not been good for me!)

levesque said 14 days ago:

Windows is still very much subpar, even with support for CUDA in WSL2. Loading packages is terribly slow in Windows, for some reason. Also don't get me started on package management (no, Anaconda doesn't cut it).

seertaak said 14 days ago:

I got pretty good results with chocolatey.

But I agree that even WSL2 didn't cut the mustard, and I doubt GPU support will fix it. MS is advancing too slow, I think.

doktrin said 14 days ago:

I've gone full circle. Went from desktop linux (mostly Arch) to OSX ~7 or so years ago, and now due to a combination of frustration with the butterfly keyboards and then a slew of issues with macOS itself, I'm back to linux desktop for my dev machine.

From my perspective as a quote-unquote power user, it feels like Apple just constantly insists on shooting themselves in the foot with unnecessary and ill conceived innovations. Either way, I'm happy with my new setup and probably won't go back to macbooks anytime soon.

Myrmornis said 14 days ago:

I would love to switch back to Linux but Apple's Retina displays are absolutely beautiful and there is no way I could enjoy going back to anything with noticeably lower pixel density on a laptop. I'd like to be told I'm wrong, but as far as I know it's not really possible to recreate a comparable high pixel density experience under Linux on a laptop.

tsar9x said 12 days ago:

Well, it is possible. However, it's much easier with resolutions suited to 2x scaling, so 4K on a 15" XPS works great. As for fractional scaling (needed for 4K on 13/14" panels), it's still kind of a work in progress; I think it will be ready when Chromium on Wayland finally lands (I expect at least 1 more year). If you don't use Electron/Chrome, you can use it right now.

Obviously you can use less elegant solutions like changing fonts but it won't work with multiple displays with different resolutions.

cosmojg said 14 days ago:

Two years ago, I helped a friend install Ubuntu Linux on a Retina Macbook Pro, and it worked like a charm. If you're looking for a new laptop entirely, there are loads of 4K+ Linux-compatible laptops out there (ThinkPads are probably your best bet).

Myrmornis said 13 days ago:

Thanks. What do you think about this post? The author sounds knowledgeable and I think it contradicts what you said to some degree (in that the experience and app support are not good even though Linux is installed on a machine with a high-DPI display):

https://news.ycombinator.com/item?id=22958647

cosmojg said 13 days ago:

I don't know about Ubuntu, but my experience with Gnome on Arch Linux and Arch-derived distributions has been pretty good as far as high-DPI displays go. I've only had to make minor tweaks to a few configurations here and there depending on the application.

If you want to avoid tweaking, stick to native applications, and perhaps more importantly, go for a manufacturer with proper firmware support for high-DPI screens like System76 (Adder WS), Dell (XPS 13), or Lenovo (ThinkPad P1/P53/X1).

davrosthedalek said 14 days ago:

It seems the new Dell XPS machines finally have a touchpad that is close to the ones on the MacBooks. The touchpad and display are the two things holding me back from switching away from Apple.

ubercow13 said 13 days ago:

Many of us who have been using Linux just fine on desktops and laptops for decades find those horror stories to be overstated...

mosburger said 14 days ago:

I would definitely consider moving to Linux for my next laptop - unfortunately I do a decent amount of iOS development, which I realize isn't impossible to do on Linux, but I can't imagine it'd be worth the hassle. :/

kstenerud said 14 days ago:

When I switched, I just made the macbook not suspend on lid close, plugged it in and left it running 24/7. Then I just screen shared or ssh'd in whenever I needed to do something iOS related.

Sangeppato said 14 days ago:

The dual GPU is a pain in the butt since Nvidia still doesn't support Optimus on Linux (and probably never will).

rudiv said 14 days ago:

Have you tried 19.10 or 20.04? Before that I had a lot of issues with my Dell XPS 9560 because of optimus, but it got a lot better in those versions. YMMV but it actually worked out of the box with nary a hint of manual configuration when I installed 20.04 recently.

Edit: should note, when I say work I mean you can switch between GPUs/launch an app on the dedicated GPU with ease.

Sangeppato said 14 days ago:

I've tried 19.10 and Arch Linux and the only option still was to statically choose only one GPU and reboot. How does the offloading work now? I haven't heard anything about it

hvis said 14 days ago:

19.10 added the "NVIDIA On-Demand" profile in Nvidia Settings. It needs the driver version 435 or newer.

It works okay, but you have to launch processes with a specific set of env variables to use the Nvidia card.
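For reference, a tiny wrapper along those lines (a sketch: the env variables are the documented PRIME render offload switches, and `prime_run` is just a hypothetical helper name; some distros ship a similar `prime-run` script):

```shell
# Hypothetical helper: run a single command on the Nvidia card via PRIME
# render offload (needs driver >= 435 and the "NVIDIA On-Demand" profile).
# The variables only matter to GLX, so arbitrary commands pass through.
prime_run() {
  __NV_PRIME_RENDER_OFFLOAD=1 __GLX_VENDOR_LIBRARY_NAME=nvidia "$@"
}

# e.g.: prime_run glxinfo | grep "OpenGL vendor"
```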

halotrope said 14 days ago:

That is not true anymore. With 20.04 it supports hybrid graphics just fine. The only issue I had was sharing cuda and OpenGL context since GL ran on the Intel card. This should not be a concern for most people I assume.

Sangeppato said 14 days ago:

Can you run everything on the iGPU and only activate the Nvidia GPU to do the render offloading on single apps? If you can, I should try 20.04 on a laptop

halotrope said 14 days ago:

Yes exactly. This way you have all the GPU memory available for accelerated apps. Not sure if it works for all use cases but worked for me.

kebman said 14 days ago:

OSX used to be the OS that started really quick, and ran really smoothly. Certainly far better than Windows. Also search was lightning fast. It was a selling point on its own. But recently it has slowed to a crawl. And I have to ask, what business is it to Apple whether I store a script somewhere? I don't even want them to have a checksum. And I don't want to go through the bother of having to change settings for it either. Do they even ask if this is OK? For me this is just yet another reason to steer well clear of Apple products in the near future. Very sad, because I really used to love their stuff.

haunter said 14 days ago:

>OSX used to be the OS that started really quick

Cold-booting Windows 10, from pushing the power button to reaching the login screen, takes 7s for me (i7-7700, M.2 SSD, 32GB RAM).

I never ever had quicker startups on OSX.

kebman said 14 days ago:

When I tried out Mac OS X for the first time during the late 2000s, it was really striking how much better OS X was compared to Windows, especially for "creative professions" - video, design and the sort. But since then, I have to hand it to Microsoft; they've really stepped up their game. They even seem to be fixing some of the non-UX incompatibilities now. Granted, it's nowhere near good enough, but with PowerShell it's workable, at least for the projects I'm currently working on. For the more demanding stuff, I'll probably still VBox a Linux distro, whereas that remained completely unnecessary for me on OS X. (I'm speaking about the whole personal experience and package deal here, so that's why I'm not mentioning things like Docker.)

zozbot234 said 14 days ago:

> OSX used to be the OS that started really quick, and ran really smoothly.

It was quite slow compared to OS 9, but even most Linux installs have way better performance on equivalent hardware. Windows really is dog slow by comparison.

kebman said 14 days ago:

This is true, but then Linux has a whole host of other issues that makes it nigh unusable for Muggles and non professionals. Thus, if they're not an avid gamer, I'd usually recommend OS X, until about 2016. Then I stopped doing that.

oefrha said 14 days ago:

Damn, I too have noticed that when developing in compiled languages (C, C++, Go, Rust, what have you) the first execution after a recompile is always noticeably delayed. I thought it was odd but didn’t bother digging into it. This must be why! (Can’t recall having this problem with scripting languages, but maybe subsequent modifications don’t trigger a notarization check? Edit: Yeah TFA does mention this.)

dkmar said 13 days ago:

For anyone looking for more information on what happens on the first run of an app in Catalina, see [0]. Here's a direct link to the diagram [1].

[0]: https://eclecticlight.co/2020/01/27/what-could-possibly-go-w...

[1]: https://eclecticlightdotcom.files.wordpress.com/2020/01/appf...

dcow said 14 days ago:

Can anybody actually confirm these claims? I'm no fan of the new notary system, but in my experience the behavior described is not how things work. Has there been an update or change in behavior recently?

I've been running a Debian thinkpad for the last meaningful stretch of time, but from what I recall, macOS quarantines downloaded files via an extended attribute, `com.apple.quarantine`. Quarantined files are not allowed to be executed by Gatekeeper. It's not about a network check; they just can't be executed. If the user removes the quarantine attribute, then Gatekeeper will shut up and the files will execute normally. Alternatively, if a file has a signed hash stapled to it, i.e. if it has been notarized, then Gatekeeper will also allow execution after verifying the signature. This doesn't require a network check either.

Interestingly, the way to bypass the quarantine behavior is to unarchive a folder. Archives themselves carry the quarantine attribute; however, files extracted from the archive using a terminal program (a "developer tools" program) don't, and so macOS doesn't care. Also, tools like `curl` don't apply the quarantine bit to downloaded files, so curling a binary or shell script still works just fine.
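A sketch of poking at that attribute with macOS's `xattr` tool (the helper name `dequarantine` is hypothetical, and it's guarded so it degrades quietly on systems without `xattr`):

```shell
# Clear the com.apple.quarantine flag that Gatekeeper consults.
# Guarded with command -v so it no-ops off macOS or on unflagged files.
dequarantine() {
  if command -v xattr >/dev/null 2>&1 &&
     xattr -d com.apple.quarantine "$1" 2>/dev/null; then
    echo "cleared"
  else
    echo "not quarantined"
  fi
}

# e.g.: xattr -l ~/Downloads/some-tool   # inspect the attributes first
#       dequarantine ~/Downloads/some-tool
```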

saagarjha said 14 days ago:

Notarization is an additional check that ensures that Apple has not revoked permission for the software to run.

inimino said 14 days ago:

It looks like my time with MacOS is rapidly coming to an end. Any Linux distro recommendations these days?

markosaric said 14 days ago:

I switched almost 2 years ago after 15 years on Macs.

Fedora 32 Workstation is pretty good if you want to see the best of what Linux can offer. It may not be the lightest and fastest distribution, but it is easy to install and everything works. You'll get to experience GNOME, which is the most original Linux desktop environment and, in my opinion, the best one in terms of user experience.

If you want something more traditional, with a start menu or dock or desktop icons, perhaps something like KDE Neon is a better place to start. It might feel more familiar, and will be lighter/faster too.

Put each of them on a USB and run them live on your machine for few minutes each and see which one makes more sense to you.

jcadam said 14 days ago:

I switched from MacOS to Linux years ago. For a developer workstation these days I'd probably either go with Ubuntu LTS or Fedora (my personal choice). Either runs fine on my XPS 13.

Note: I really wanted to like WSL, but it just didn't work for me.

_fullpint said 14 days ago:

Have you looked into WSL2?

I just recently switched from Mac OS to windows and it really hasn’t been a bad experience.

I would go full Linux but the drivers for the GPU on my laptop seem to be a bit of a mess currently.

jcadam said 14 days ago:

GPU switching (NVIDIA Optimus and the like) seems to be a major headache to get working on Linux. My current laptop (XPS 13) only has an integrated GPU, so I ssh into a desktop for running CUDA stuff.

But no, I haven't tried WSL2; I'm comfortable with my Linux setup, so not too keen on messing with it at the moment :)

j45 said 14 days ago:

Ubuntu 20 has been a pleasant surprise; it seems to have turned a corner on productivity and speed. I've been getting lost in it for hours on end and forgetting to use my MacBook.

The feeling reminds me of the first Macbooks I used when switching away from Windows Vista.

mindfulhack said 13 days ago:

It feels amazing to finally hear some good Ubuntu news. We need it. The only sleeker options (Windows and macOS) are horrendous for privacy. Thanks for sharing; I might try out Ubuntu 20 then. Might it be as sleek as Linux Mint?

j45 said 13 days ago:

It’s funny you mention Linux Mint; it was the only other distraction I could get lost in for hours. I’d still be fine with Mint for personal browsing. At the time, I was running Mint in a VM on macOS to try it out, and Cinnamon was much more performant than Ubuntu 18. Ubuntu 19/20, however, seems to have narrowed or closed that gap.

So far Ubuntu has been great as a default dev/staging workstation. It’s nice not to have to fight with Homebrew or Docker permissions or other issues like on the Mac; I can spin up most anything and it just works.

gnalck said 14 days ago:

Fedora "just works" and has the some of the more sane defaults. Only tweaks one typically needs to do is add the RPM Fusion repos and, at some point, disable/tune-down SELinux when it is a bit too paranoid.

swebs said 14 days ago:

Give Pop OS a look. It's based on Ubuntu with some additional UI polish.

https://www.youtube.com/watch?v=QGcvHMNaDd0

dhruvkar said 14 days ago:

Pop_OS!

By far the best Linux I've tried for getting feature parity with macOS.

m463 said 14 days ago:

After you've gotten used to Linux, you might want to try Arch.

It is lightweight, since you choose everything that is installed, sort of opt-in.

It has all the latest software.

It has "rolling releases" which means there is never a giant lost-weekend distribution upgrade.

It has the AUR (arch user repository) for just about any software ever.

zozbot234 said 14 days ago:

I've never lost a weekend to a Debian dist-upgrade. Just read the release notes carefully beforehand, take a full backup of your data (which you should be doing anyway), make a note of any non-Debian applications you're using on that machine (that's the stuff that will need the most extensive testing post-upgrade) and it should simply work.

m463 said 13 days ago:

I have: Debian, Raspbian, Ubuntu. A few times it has gone well, only for me to find cruft left over from previous installs.

"it should simply work" is not a given on any linux.

I'm not denigrating those distributions, there are lots of reasons to have a stable release without a lot of things changing (especially development).

It's just that changing lots of assumptions at once is fragile.

inimino said 14 days ago:

I used Arch on a server once (still running) but found the experience on Debian was more to my taste, and somehow never liked pacman. Maybe it's time to take another look. I never tried it on the desktop.

sergeykish said 14 days ago:

Interesting, I have opposite experience. Pacman looks so much simpler than aptitude, apt-get, apt-cache, dpkg. And makepkg - it just works. I have not managed to create packages on Ubuntu.

No outdated packages, no PPAs, no big release upgrades. The install is rough, but it nails how simple the system is.

Ubuntu is a good starting point. But there is so much more.

m463 said 13 days ago:

I agree about makepkg / PKGBUILD -- I've casually made packages.

https://wiki.archlinux.org/index.php/PKGBUILD

For debian/ubuntu it is not as straightforward.
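To give a sense of how small one can be, here's a minimal PKGBUILD sketch (every name, URL, and source file here is a hypothetical placeholder, not a real package):

```shell
# Minimal PKGBUILD sketch; all names and the URL are placeholders.
pkgname=hello-script
pkgver=1.0
pkgrel=1
pkgdesc="Tiny example package"
arch=('any')
url="https://example.com/hello"
license=('MIT')
source=("hello.sh")   # a local script shipped next to the PKGBUILD
sha256sums=('SKIP')

package() {
  # install the script into the package root with mode 755
  install -Dm755 "$srcdir/hello.sh" "$pkgdir/usr/bin/hello"
}
```

Run `makepkg -si` in the same directory to build and install it.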

speedgoose said 14 days ago:

Windows 10 with WSL if you have a laptop.

Debian or similar or ArchLinux if you have a desktop.

inimino said 14 days ago:

For reasons of personal prejudice, I'll never install any Windows version on any hardware I own. Debian was always my first choice back in the desktop linux days, and still is for servers, but I haven't looked at the landscape recently. It seems to have become more consolidated, which is not surprising but still mildly disappointing.

Edit: and WSL is not Linux

yjftsjthsd-h said 14 days ago:

> WSL is not Linux

It is Linux as of WSL2, it's just also Windows, so you lose many of the advantages that would make a person recommend Linux in this thread.

inimino said 14 days ago:

TIL. But yes, for me, not having Windows installed is the primary advantage of any non-Windows OS.

lgl said 14 days ago:

Debian is also my first choice for servers, and I've used it several times on the desktop, so it would be my recommendation even for a desktop these days.

Plus, if you're already familiar with how Debian works, it should be a no-brainer. None of that Ubuntu or other Debian-derived distro business, with extra sugar and bloat that often differs from actual Debian in just the right way to keep you scratching your head.

Even Debian "stable" is pretty good for desktop these days which in the past was always notorious for having super outdated packages but has greatly improved in that regard. Obviously, "sid" is still also a good pick for a desktop if you really need to always run the latest of mostly everything.

inimino said 14 days ago:

Debian still feels like home. Unless I try a BSD or something without systemd I think this is probably where I'll end up.

lgl said 14 days ago:

Well, Debian does use systemd by default now, unless you want to go through some hoops to remove it (which I believe is still possible, but I'm not sure).

I personally have really no issues with systemd and now even go as far as completely removing the ifupdown, isc-dhcp-client, resolvconf and ntpd packages in favor of having my entire network stack configured by systemd-networkd, systemd-resolved and systemd-timesyncd instead.
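For anyone curious what that setup looks like, the minimal shape is something like this (a sketch; the interface name `enp3s0` is a placeholder, check yours with `ip link`):

```
# /etc/systemd/network/20-wired.network  (sketch; "enp3s0" is a placeholder)
[Match]
Name=enp3s0

[Network]
DHCP=yes
```

then enable the services with `systemctl enable --now systemd-networkd systemd-resolved`.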

It's pretty much a standard now across the board and I can't really find any arguments against it besides old habits so I've embraced it. Although it's obviously a bit opinionated, there is a good deal of functionality and flexibility on that thing.
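
For anyone wanting to try the same swap on Debian, a minimal sketch (the unit names and file locations are the standard systemd ones, but the `en*` interface match is an assumption; adjust it to your NIC, and run as root):

```shell
# DHCP on all wired interfaces via systemd-networkd:
cat > /etc/systemd/network/20-wired.network <<'EOF'
[Match]
Name=en*

[Network]
DHCP=yes
EOF

# Point /etc/resolv.conf at systemd-resolved's stub resolver:
ln -sf /run/systemd/resolve/stub-resolv.conf /etc/resolv.conf

# Enable the replacements for ifupdown, dhclient, resolvconf and ntpd:
systemctl enable --now systemd-networkd systemd-resolved systemd-timesyncd
```

`networkctl status` and `resolvectl status` are handy for checking that the new stack actually picked up the interface and DNS servers.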

inimino said 13 days ago:

> there is a good deal of functionality and flexibility on that thing.

That's also what seems worst about it. Unfortunately there seem to be few other choices these days.

speedgoose said 14 days ago:

I understand but for laptops it's pretty bad these days if you want all features your laptop is providing, and a good energy management.

On mobile it's much better with Android, but Android isn't adapted to laptops. I haven't tried ChromeOS but it's pretty restricted from what I understood. WSL2 on Windows is Linux and it works great for me but I understand if you don't want windows in your life.

yjftsjthsd-h said 14 days ago:

Depends on the laptop. I've had good experiences with thinkpads and business class Dells on Linux (and BSDs, for that matter).

speedgoose said 14 days ago:

Probably. My ThinkPad has so many issues and unsupported features according to the ArchLinux wiki that I don't even want to try.

inimino said 14 days ago:

Same.

3combinatorHN said 14 days ago:

>paying for windows to install linux

tsukurimashou said 14 days ago:

I would recommend: Ubuntu, Linux Mint, Elementary OS, Pop!_OS

if you want: nice experience out of the box

I would recommend: Arch, Gentoo, Debian Net inst, Void

if you want a base system and install things you want on top of it

wetpaws said 14 days ago:

Mint has been my daily driver for a year, does a fine job so far

sergiotapia said 14 days ago:

https://www.linuxmint.com/

It's Ubuntu without the bullshit monetization.

nightowl_games said 14 days ago:

And with a better default DE

valeg said 14 days ago:

Kids love Manjaro these days.

andarleen said 14 days ago:

If in doubt just switch to ubuntu (there are better alternatives, but its a good starting point). I’m done with macos (tho i really loved it).

KevinSjoberg said 13 days ago:

Thought I was going insane seeing delays myself on a daily basis since Catalina. Turns out I'm not insane but a victim of Apple's continuous neglect of Mac OS.

How can something as damning as this ever reach end consumers without getting detected?

marcinzm said 14 days ago:

If Microsoft wasn't doing ever worse privacy things with Windows I'd seriously look into switching away from Mac OS given the ever growing issues it's been having with every release.

lol768 said 14 days ago:

The set of possible operating systems to consider does not contain two items.

gfxgirl said 14 days ago:

It does depending on what software you want to run.

There is no actually good alternative to Photoshop. GIMP is not remotely in the same league. Pixelmator and Affinity Photo are brought up, but they're also like nano vs emacs. Photoshop doesn't run on Linux AFAIK. I'm sure for a graphic designer the same is true for Illustrator. The cheaper alternatives exist and you can maybe get by, but they're missing so many features.

If you're into games there is really only Windows. Same for VR.

I'm sure there are other categories.

I did serious dev on Linux and that dev didn't require any games or apps so it was great and I loved it. It ran my editor of choice and otherwise I only needed a browser and a terminal. But as soon as I step out of that small subset it's pretty much MacOS or Windows only, at least for the things I want to do with my computer.

mindfulhack said 13 days ago:

I wonder how viable just running Photoshop in a VM is these days, if you have the extra RAM and are OK with the extra minute to boot up the VM each time you use the program?

VirtualBox has a 'seamless mode' as well, I wonder how well it works on a Linux host and a macOS/Windows guest.

nsxwolf said 14 days ago:

I find Linux to be a usability nightmare. Weird cut and paste behavior, difficult to resize windows, terrible trackpad support. macOS and Windows will have to get a lot worse before I switch.

Accacin said 14 days ago:

I found at least in Gnome and KDE Plasma window management works pretty much just how Windows works. Cut and paste it just cut and paste - Do you mean how you can select text and use middle click on the mouse to paste without even needing to do anything but select?

rrdharan said 14 days ago:

There are two X clipboards. They are implemented differently (as in "ownership" model of the content) and the implementation bleeds out everywhere.

You can't remove or change this behavior because some people love it.

EDIT: FWIW the above statements are oversimplifying the situation of course: https://en.wikipedia.org/wiki/X_Window_selection

And more here: https://unix.stackexchange.com/questions/13585/how-can-i-use...

Most fans of Linux will claim the fact that you can choose any number of clipboard managers to customize things to your liking is a critical aspect that draws them to the platform.

Others among us (whether reformed or uninitiated) will commonly cite this same stuff as the reasons we avoid Linux on the desktop.
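
For the curious, the two selections are easy to poke at from a terminal. A small sketch, assuming a running X session with the `xclip` package installed (`xsel` behaves similarly):

```shell
# PRIMARY is what select-then-middle-click uses; CLIPBOARD is what
# explicit Ctrl-C / Ctrl-V (or menu Copy/Paste) uses. They are
# independent buffers, owned by whichever client last set them.
printf 'middle-click paste' | xclip -selection primary
printf 'ctrl-v paste'       | xclip -selection clipboard

xclip -o -selection primary     # prints: middle-click paste
xclip -o -selection clipboard   # prints: ctrl-v paste
```

Clipboard managers essentially sit on top of this, watching both selections and optionally syncing them.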

C1sc0cat said 14 days ago:

This is why I prefer the three-button UNIX-style mouse, and I don't ever recall having problems with window resizing on UNIX and Unix-like systems.

tsukurimashou said 14 days ago:

How many DEs did you try? You have a variety of choices now; I would recommend trying a popular one such as Ubuntu / Elementary OS / Linux Mint.

You should get a very nice experience out of the box with these, which can be reproduced quite easily with less "bloated" distributions such as Arch or Gentoo if you prefer to install things yourself

said 14 days ago:
[deleted]
lostgame said 14 days ago:

Without WINE, and its associated instability, which operating system, other than MacOS or Windows, would run Ableton, Logic Pro, Adobe Premiere, or Final Cut Pro, all applications I depend on for my income and, due to the fact that my clients use this software, for which a FOSS equivalent or alternative doesn’t exist?

Now imagine the millions of other people in my situation and rethink your comment.

lol768 said 13 days ago:

> Without WINE, and its associated instability, which operating system, other than MacOS or Windows, would run Ableton, Logic Pro, Adobe Premiere, or Final Cut Pro, all applications I depend on for my income and, due to the fact that my clients use this software, for which a FOSS equivalent or alternative doesn’t exist?

> Now imagine the millions of other people in my situation and rethink your comment.

The comment still holds. Linux should still be considered. I didn't proclaim that it would be a realistic alternative in every case, but I'd wager that for a large proportion of software engineering roles, it would be.

Is there software that may also be suitable for basic image and video editing work and therefore fine for a subset of these creative professionals you refer to? Absolutely. I've seen great results from folks using Blender, Inkscape, OpenShot, GIMP, Krita and others.

We shouldn't just dismiss an OS immediately, and that's what my comment was trying to get at.

wl said 14 days ago:

At least 10.14 is supported for now.

It's really frustrating to see Apple make all these poor decisions; they are almost never willing to admit their mistakes and go back. In the rare case when they do (e.g. butterfly keyboard, Mac Pro), it takes them years to turn around.

astronautjones said 13 days ago:

> it takes them years to turn around.

or until they need something to throw out for investors. "dark mode" did not come about because of a technical breakthrough

ksec said 14 days ago:

That has been my view as well. It isn't that Apple is particularly good with anything software (I will grant they have an edge in UX). But Microsoft is just so horribly bad that every time I look at it, it makes macOS look good.

philwelch said 14 days ago:

Switch to Linux then.

marcinzm said 13 days ago:

The other thread reply on this topic notes the reasons Linux is not considered a viable desktop replacement for many people.

Personally I'd need to run a VM for a bunch of software or fight Wine. That's assuming my machine has the right hardware support for everything and even then the trackpad support is likely to not be great.

philwelch said 12 days ago:

shrug I’m not gonna play a game of “why don’t you”/“yes but”.

650REDHAIR said 14 days ago:

Ew

kar1181 said 14 days ago:

I completely understand why things are going the way they are as our computing environment has become ever more hostile. But I am very nostalgic for the time when I would power up a Vic-20 and within seconds be able to get to work.

Teaching my daughter to program on a modern computer, we spend more time bootstrapping and in process, than we do in actual development.

tragomaskhalos said 14 days ago:

That computers are just slower to interact with now is such a truism that we hardly remark upon it any more. It seems utterly insane that in the early 90's I could just run Windows 3.1 on a bit of kit that in all likelihood wouldn't even power a toaster today, and the experience was, well, frictionless. I don't recall ever thinking "wtf is this thing doing?", whereas today, by contrast, if I have the audacity to be afk for long enough for my Windows 10 box to go sleep I know I am in for an infuriating waste of minutes' worth of disk thrashing before the bloody thing even deigns to reacknowledge my existence.

karatestomp said 14 days ago:

I remember being able to watch network traffic, and if you (or some other actual person on your network) weren't doing anything, nothing would be there. Yes, even if you had a few webpages open but weren't clicking anything. Now your machine is "idle", you capture on your network interface, and it scrolls at hyperspeed.

kar1181 said 14 days ago:

I've been doing some network programming lately, specifically low level raw socket work. Sitting there with wireshark running the sheer volume of traffic with applications dialing home was kind of shocking.

I mean, I know it's happening, I (sadly) expect it to happen now. But seeing all the bits whizzing over the wire brought home just how much your machine is reporting about what you're up to.

dvfjsdhgfv said 14 days ago:

This is upsetting for me, too. And for a few others. But actually very few people care because they just don't see it. The people who designed it this way take care that users at large have no idea what is going on.

saagarjha said 14 days ago:

It's really very sad, because users have no idea what is going on and there is no incentive for bad programs to improve (actually, there is generally incentive in the opposite direction, because it's work to write well-behaving apps). Users just know that they need to keep buying new computers and that their battery life is worse, but they can't figure out why so they point fingers at everyone but who they should actually be blaming.

karatestomp said 14 days ago:

Remember when shitty user-hostile spying wasn't a library you included that assured you in its readme it was "made with [heart] in California"? Ah, the days when only criminals and bigcos casually engaged in shady crap.

saagarjha said 14 days ago:

That's a somewhat unrelated discussion, but yes, I am not very happy with the current state of software, where people think they are entitled to opt-out analytics information coming off my machine.

dvfjsdhgfv said 13 days ago:

Well, I remember the days when a message cropped up in Windows (standard at the time when a program crashed) saying: "Do you want to send the error report to Microsoft?" and my boss called me, asking a bit concerned: "Please, tell me honestly, what do you think - should we send them this error report?"

blyry said 14 days ago:

I switched to a linux desktop full time last week because of this exact problem. VPN w/ windows would flake out on me all the time, and I got sooo tired of just...waiting. Remember when windows search worked? Like, you could press the windows key, type what you were looking for and find it? Quickly?

Being able to turn the computer on, type in my password and have it be just..ready is so incredibly refreshing. Having a terminal with 0 latency, where copy/paste is sane? Worth a zillion dollars to me right now.

Currently playing with opensuse tumbleweed, i'll probably get frustrated by something and move to arch, so I can fix that something and also be frustrated by a hundred other things.

cjsawyer said 14 days ago:

Windows search turning into bing search is one of the most frustrating little things. You used to be able to instantly pull up files by name but now it just dumps you random garbage from the internet.

1123581321 said 13 days ago:

It’s still really fast if you disable Cortana and Internet search results. I launch most programs by hitting the windows key, a few characters and enter.
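
For reference, the usual way to restore local-only search is a couple of registry values. A hedged sketch (Windows cmd, run as the current user; `BingSearchEnabled` and `CortanaConsent` are the commonly documented toggles for older Windows 10 builds, and `DisableSearchBoxSuggestions` is the policy value for builds 2004 and later; back up your registry and verify these apply to your build before relying on them):

```shell
:: Takes effect after restarting Explorer or signing out and back in.
reg add "HKCU\SOFTWARE\Microsoft\Windows\CurrentVersion\Search" /v BingSearchEnabled /t REG_DWORD /d 0 /f
reg add "HKCU\SOFTWARE\Microsoft\Windows\CurrentVersion\Search" /v CortanaConsent /t REG_DWORD /d 0 /f
reg add "HKCU\SOFTWARE\Policies\Microsoft\Windows\Explorer" /v DisableSearchBoxSuggestions /t REG_DWORD /d 1 /f
```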

cjsawyer said 12 days ago:

I’ll look into it, I would love to have that functionality back

fetbaffe said 14 days ago:

Rumors on the internets have spoken positively about openSUSE Leap & Tumbleweed - any truth to that?

blyry said 14 days ago:

I don't have a ton of experience with other options, but 2 weeks in and Tumbleweed has been pretty plug and play! 0 issues getting my netcore/python/golang/docker dev stack up. I get a weird popping noise in my USB DAC at the login screen, but that's the only issue I've had so far. Teams screen sharing even works perfectly! I chose it over Ubuntu 20 because I knew I wanted KDE and it seems like a first-class citizen in Tumbleweed, while still being vaguely stable. Not-quite-bleeding edge! I ran FreeBSD/KDE for fun back in the halcyon days of the LAMP stack, and GNOME never felt... right to me when I would test-drive Ubuntu desktop.

fetbaffe said 12 days ago:

Good to know. Personally I think that Ubuntu has gone downhill. I preferred Unity over GNOME. On a fresh install of Ubuntu, GNOME is confusing, with its split into two taskbars that have some overlap in functionality.

ChuckNorris89 said 14 days ago:

Another vote from me for tumbleweed.

WrtCdEvrydy said 14 days ago:

I call this 'Outsourcing the cost of development to the user'...

Getting knowledgeable people costs money so we build more abstractions that lower the cost of development and pass the costs of development from the company to the user in the form of requiring more hardware to do the same thing.

How come I need 16GB of RAM these days when 8GB did it yesterday? How come my phone needs 4GB of RAM while my 2012 tablet had 1GB? Sure, the hardware is cheaper, but we're still not using the hardware to its fullest.

karatestomp said 14 days ago:

My 256MB RAM, 900MHz Duron machine (single core, naturally) in ~2002 (IIRC?) could do just about everything my modern one can. We even had video chat! It was just much lower res. The limiting factor in online stuff was, by far, connection speed, not the power of my hardware. That was about the point where the hardware was fast enough and had enough memory that I could multitask in a modern way without hitting problems like popping/stuttering audio or bad swap issues. Aside from legitimate increases in memory use for higher-res media, most everything since then, from my perspective, has been pure bloat. Why does 16x that memory and two cores at double the clock feel insufficient for extremely similar workloads and software feature-sets? Fucking bloat is why. Largely, but far from solely, web-tech infesting everything.

Before that, my 64MB RAM, 100MHz Pentium could usually have a couple things open before it'd hit swap too badly. I'm talking like Word and a web browser, not calc and notepad. None of the equivalent programs to those can even open all on their own in a footprint smaller than 64MB these days, let alone with other programs and the OS in the same space. Hell, how many operating systems fit in that with a GUI as capable and usable as, say, Win98se (let alone something really incredible on the performance front, like BeOS)?

aclsid said 14 days ago:

I agree with the main sentiment, but I have made my peace with it. Mainly with Java- and Electron-based apps, because they provide us with a nice thing that was impossible years before unless you wanted to become a digital hermit: Linux on the desktop.

I can now use simplenote, discord, slack, the jetbrains dev suite, visual studio code, and this is without including separate developments like Steam, which has made it effortless to switch between Windows, Linux and Mac.

That being said, I still consider Mac OS the superior OS (this call-home issue from the article aside), mostly because the font rendering still works better after all these years, Windows and Mac still have better-quality software available for them, and Mac still does not have the forced updates that Windows does. Also, I have noticed that in Ubuntu, in some Electron apps like Simplenote, the copy and paste of text is funky at times, like not even letting me select stuff.

coliveira said 14 days ago:

The reason is very simple: developers don't want to develop anymore, they just want to offload real programming to third party libraries, where what used to take 100 lines of code to accomplish will take 10K or more (because, obviously, the library will do the most general version of what it wants to do). All this is considered "good development practices", which means that programs will inflate to take whatever memory is available and run slower for as long as we continue to use the same practices.

astronautjones said 13 days ago:

and is absolutely encouraged by google and amazon, as delivering that bloat makes them money

said 14 days ago:
[deleted]
valuearb said 14 days ago:

What’s the point of cheaper disk and ram, and faster systems if not for supporting higher level abstractions?

npongratz said 13 days ago:

To watch more, higher-def cat videos faster. No need to get lost in the weeds of higher level abstractions to do that.

jcelerier said 14 days ago:

is this a serious question ?

rhizome said 14 days ago:

And now that "the web is the internet" even more than ever, developers and designers are giving us spinners/loading indicators ALL THE TIME. At least in my tabs they are.

The web is much, much, much slower than it used to be.

Domenic_S said 14 days ago:

> Windows 10 box to go sleep I know I am in for an infuriating waste of minutes' worth of disk thrashing before the bloody thing even deigns to reacknowledge my existence.

Yeah, what the heck is this? I use a win10 box solely for gaming, and every single time I wake from sleep, the Antimalware Service Executable keeps my machine from doing anything for several minutes. It's infuriating.

Spooky23 said 14 days ago:

Silly user. The computer exists to update itself. Whatever trivial task you want to do is a secondary concern.

saagarjha said 14 days ago:

You joke, but there is a surprising amount of software that does not have its user as the primary thing it cares about.

aclsid said 14 days ago:

Just get a proper antivirus and it will probably disable the built-in security suite for you

saagarjha said 14 days ago:

While making your computer even worse?

andai said 13 days ago:

For many years, I had a very nice experience with NOD32. By far the best antivirus I have used in terms of UI and resources. Well, admittedly not that high of a bar... but they really seem to care about efficiency and elegance.

Considering the built in one is pretty slow (and gives useless notifications), I expect it would be an improvement.

zeroimpl said 14 days ago:

I recall windows 95/98 being pretty slow to boot. I also recall being warned by teachers not to move the mouse while things were booting as that would allegedly slow things down further. These days the only real time I wonder "wtf is this thing doing" is when I'm waiting about 5-10 seconds for my mac to wake up from sleep.

shanemhansen said 14 days ago:

Surprisingly, wiggling the mouse actually speeds up some windows operations.

https://retrocomputing.stackexchange.com/questions/11533/why...

TheOtherHobbes said 14 days ago:

Win 95 and its descendants had legendary poor boot times.

Things finally improved with XP, but W3.1x and W95 were anything but fast - unless you were playing Solitaire.

WillPostForFood said 14 days ago:

Here is a 200MHz Pentium starting Win95: only about 20 seconds from "Starting Windows 95" to the login screen, 40 seconds including the full powerup/BIOS sequence. Not too bad.

https://www.youtube.com/watch?v=PwRR7-P-8fc

npongratz said 14 days ago:

> It seems utterly insane that in the early 90's I could just run Windows 3.1 on a bit of kit that in all likelihood wouldn't even power a toaster today, and the experience was, well, frictionless. I don't recall ever thinking "wtf is this thing doing?" ...

I generally agree, but I sometimes ran Windows 3.0 on a 386SX-16 in the early 90s, and often wondered why it ran so slow on my admittedly underpowered but supported system.

At some point I read (perhaps in Compute! or BYTE) that Windows made something like 20 or 30 syscalls to draw one line of a window's border. That seemed exceptionally inefficient to me, so I stopped using Windows. I generally worked in DOS, but if I wanted a GUI, Geoworks provided an experience at least ten times better (subjectively) -- smooth UI, ability to multitask, a surprisingly good word processor and other well-designed software included.

andai said 14 days ago:

Are you on a hard disk drive? I have bestowed upon myself the unique misfortune of running Windows 10 on a spinny disk.

bscphil said 13 days ago:

This has quietly become a pretty serious issue. Most software developers have simply stopped caring about systems with traditional HDDs. This is even true on Linux - I found out a while back that all the KDE developers are using SSDs, which is why they weren't fixing issues where startup time is affected by disk latency. I eventually gave in and bought a 250 GB SSD for my old laptop, there was simply no other option.

massysett said 14 days ago:

If that’s what you really want, grab a used ThinkPad and put Arch Linux on it. It will boot in a few seconds and is much more powerful than a Vic-20.

yjftsjthsd-h said 14 days ago:

Still doesn't give you a programming environment, unless you want to do bash.

armatav said 14 days ago:

How does that even make sense? It’s an OS, go grab a Desktop Environment and download nvim, VSCode or whatever.

yjftsjthsd-h said 14 days ago:

The original line that I was responding to was

> Teaching my daughter to program on a modern computer, we spend more time bootstrapping and in process, than we do in actual development.

Arch Linux does not help with this, unless you make it boot into a VIC-20 emulator or something. Arch can help with boot speed, but once you're booted you're back in a full modern OS. So fine, install VSCode and Python... okay, now you get to figure out libraries. Manage terminals. Arrange a filesystem. This is not getting you closer to the VIC-20 or C64's "boot into BASIC".

cosmojg said 14 days ago:

This is very possible on Arch Linux, moreso than other distributions. After installing Arch, just run the following two commands:

  sudo pacman -S xonsh

  chsh --shell /usr/bin/xonsh

Bam! You're booting straight into a full Python environment when you turn on your computer. This is similarly achievable with other languages as well, including BASIC.

smcameron said 14 days ago:

How about Processing. https://processing.org/

goatinaboat said 14 days ago:

How does that even make sense?

Because that was the experience on those old machines. Switch it on, straight to BASIC prompt in a second or so. If you want to program it’s frictionless. And you can’t break it because BASIC is in ROM.

harpratap said 14 days ago:

Flexibility vs complexity is a slippery slope.

cycomanic said 13 days ago:

If you want that today get a BBC microbit, switch on and you're directly in a python environment

gorrillaribs said 14 days ago:

Doesn't arch come with python & gcc out of the box?

yjftsjthsd-h said 14 days ago:

No, although `pacman -Syu python base-devel` isn't exactly a burden. But then what? If you're trying to get back to a simple "turn on computer, land in simple programming environment", how does it help that you have python and gcc available? You still have to manage libraries, learn to use a compiler, and all the other joys of modern development. The only thing Arch Linux gained you was a bit simpler OS and maybe better boot times.

Throwaeay2928 said 14 days ago:

Yes it does. When you pacstrap, you include base-devel. From that moment onwards you will have a full programming environment ready to rock and roll on your installation.

yjftsjthsd-h said 14 days ago:

Yes, and you have a full operating system and all the joys of modern development. You absolutely do not have anything like a VIC-20 that you can power on and have a basic programming environment 5 seconds later. At best, you turn it on and 5 seconds later have a Python shell, where you can do a certain amount of development before you get to experience the joys of managing libraries and dependencies. Thus bringing us back to what I perceived as the primary complaint: that there's way too much setup and baggage required just to get to the actual programming part.

californical said 14 days ago:

You can use python without needing to manage any packages -- you'll have to write most things from scratch, but isn't that the hardware BASIC non-internet experience regardless?
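
In that spirit, the standard library alone goes a long way before any package management is needed. A small sketch (the data here is made up for illustration):

```python
# Stdlib-only: no pip, no virtualenv, no third-party packages --
# closer to the "turn it on and just program" experience being discussed.
import json
import statistics

data = [3, 1, 4, 1, 5, 9, 2, 6]
summary = {
    "mean": statistics.mean(data),      # 3.875
    "median": statistics.median(data),  # 3.5
    "max": max(data),
}
print(json.dumps(summary))
```

Everything above ships with a plain Python install, so it runs the moment you land in the shell.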

amelius said 14 days ago:

We're moving away from general purpose computing, and Apple is one of the greatest forces in this.

Also, they are a threat to a free market for software, as they regulate their walled garden with arbitrary rules and skim off a lot of value.

I honestly don't understand why a large portion of developers have so much love for Apple. I'm personally a proud owner of a desktop PC with an ASUS motherboard. It serves me fine, and gives me full control over the software installed on it. I'm not a laptop-person but I believe there are many perfectly capable non-Apple laptops out there.

pjmlp said 14 days ago:

Because for those of us that care about graphics and selling desktop applications, it is mostly Apple, Google or Microsoft platforms.

kens said 14 days ago:

At the Computer History Museum, I use an IBM 1401 mainframe (1959). When you hit the power button, relays go ch-ch-chunk and it's immediately ready to use. Because it has magnetic core memory, it even has the previous program already in memory, preserved over power-down. Computers have taken many steps backwards as far as startup time. Of course, loading a new program from punch cards is slow, so some things have improved :-)

kar1181 said 14 days ago:

I've surely spent coming up on years watching and reading all the content you've either created or helped produce. Indeed some things may have improved, but I sure enjoy the heck out of reading and watching all your exploits with 'legacy computing'!

gorgoiler said 14 days ago:

Watch a repl.it boot. It is the new joy, for children, to see an entire machine appear before their eyes and be able to instantly code away on it.

downerending said 14 days ago:

On the plus side, emacs now starts far faster than most computers.

blondin said 14 days ago:

> I completely understand why things are going the way they are as our computing environment has become ever more hostile.

care to elaborate a bit? what did you understand?

i just can't get my head around this idea that most non-mobile OSes have become such hostile environments...

yes, the population at large only uses their phones and tablets and doesn't care much. but they would be left without any entertainment if it wasn't for those of us who still need decent non-mobile environments.

chooseaname said 14 days ago:

So, the question is will people get to a point and say enough is enough? And if so, will enough people be saying it for it to make a difference?

hota_mazi said 14 days ago:

It takes less than five seconds for my Windows 10 to go from asleep to ready for work, and that includes logging in with Windows Hello (the fingerprint reading is crazy fast).

konart said 14 days ago:

I had been using Linux distros (~5 years of Ubuntu and ~3 years of Arch) before switching to macOS somewhere around 2013-2014. And now, years later, I'm thinking about moving back. But every time I think about this, I start by digging into the current Linux situation, and every time I realise that it is still a horrible system for anything outside of work, especially if you can't really do without a decent UI/UX.

Apple's ecosystem is also an issue. iOS + macOS is still much better than anything on the market (no alternatives really).

PKop said 14 days ago:

Switched from macOS this year, having used it for about 8 years, first to Pop!_OS and now Manjaro. Both were great (GNOME environments) and very productive for both development and general use. I really like the streamlined, "get out of your way" UI.

I would say go for it. I'm glad to not be dealing with any of this nonsense, or paying a premium for it.

konart said 14 days ago:

I've seen both of them, but the "get out of your way" UI is a limited feature. Apps still do not respect the rest of it.

You install this new distro (like Elementary, if it's still alive) and fall in love with the new Finder clone. But then you install a Twitter client, a torrent client, and a dozen other everyday apps. And they all look terrible. And feel even worse. People still don't care.

As much as I hate certain things about macOS, I'd still choose it over Manjaro, for example (haven't really tried Pop).

And not to mention things like Continuity and Handoff. I can live without being able to copy-paste a token from my phone to my computer, but it is so convenient T_T

PKop said 14 days ago:

Makes sense... especially if you're still hooked into iOS. I had already given up the iPhone a couple of years earlier, so it was easier, I imagine.

I just use messages.google.com and save it as an app shortcut, and Telegram native app, and both work well. And generally am fine with web apps if a native app doesn't look right. But finding the right native app for the desktop environment can be an issue. The GNOME skinned apps are pretty nice.

And Manjaro has the AUR for plenty of available tools and such. But that's more dev-focused.

sergeykish said 14 days ago:

Yes, the UI is consistent mostly in terminal and chromeless applications. It really shows how bad the alternative OSes are.

Seriously though with i3, beautiful fonts, so much in the browser it's not bad.

jfkebwjsbx said 14 days ago:

> twitter client, torrent client and a dozen of other everyday apps

I don't install any of that on work machines, and I'd hope most devs don't either, especially if the company owns the device.

If you really need those, why cannot you use the browser?

> continuity and handoff

Why do you need that for development?

Even if your workflow requires it for some strange reason, why don't you use an alternative? There are plenty of ways to pass data between devices.

konart said 14 days ago:

I think you are missing a point here.

tl;dr: I don't have and don't want to have two PCs for two use cases.

I have my personal MacBook that I use for work (development) and everything else. I use it when I have to be at the office or when I want to work outside of my apartment. Needless to say, I want my personal computer to have the applications that I use, for both work and... not work.

>> continuity and handoff

>Why do you need that for development?

I don't. I don't use a computer only for development (see above). But even during development it can sometimes come in handy. For example, when you are working on a service that has SMS auth. Can I just put in 6 digits by hand? Sure. But having them copied from your phone for you is very convenient.

jfkebwjsbx said 13 days ago:

That is definitely not wise.

Many companies lock down devices for good reason. For starters, to prevent employees doing that and risking the entire company.

konart said 13 days ago:

Many companies also take a screenshot of your screen every 10 seconds to "keep you in shape". I'm not taking part in this shit show thankfully. I've had my time in corporations that do this or similar stuff. Never again.

And the only channels connecting me to the company are email and a self-hosted GitLab. Now tell me how a twitter client on my work machine can harm this. Not in some fictional once-in-a-lifetime scenario out of Mr Robot.

jfkebwjsbx said 12 days ago:

Don't mix privacy and security. Privacy-invading policies have nothing to do with the discussion.

As for examples, you have many, including ones discussed in HN regularly.

inimino said 14 days ago:

I use my work machine for work and my personal equipment for everything else. My iPhone is more standalone than it used to be. I don't see any reason why I'd ever connect my personal phone to my work computer. So I don't see many downsides to making the switch.

konart said 14 days ago:

Well, I don't have 'work' computer. I have my personal macbook and even more personal iMac.

Obviously, if you work only at the office, or use your computer mostly (let's say 90% of the time) for work, then there is no problem.

inimino said 14 days ago:

When I used my personal machines for everything, I isolated my work from everything else. Remote servers are perfect for this: you can just ssh in from any machine and do your work.

bitcharmer said 14 days ago:

Linux on the desktop has been my daily driver for years (mainly xfce and gnome).

I use linux to watch movies, create music, play games and everything else. What exactly makes it a "horrible system outside of work" for you?

konart said 14 days ago:

>Linux on the desktop has been my daily driver for years

Same for me, I've even been a maintainer of one (ONE! lol) AUR package.

>especially if you can't really do without a decent UI\UX.

Outside of a few Electron-based apps and maybe a few native gtk\kde ones, everything looks like the work of a high schooler. Nobody thinks about the UI\UX.

Compare Things3 with something from the linux world. Or Bear. Or Twitterrific\Tweetbot.

But go no further than your system's settings: https://imgur.com/a/p0kl7wM - wtf is this? You have a window that takes 80% of your screen, with some huge-ass controls that still take some 20% of the whole view. Who thought this was a good idea?

Gnome 3 is even worse (I loved gnome2 back in 2009)

formercoder said 14 days ago:

PC + WSL + somewhat illicit OS X VM has been a dream for me as a former Mac user.

konart said 14 days ago:

My mother asked me to help her out with her win 10 installation on her work notebook. This was terrible.

The UI is still inconsistent between apps; sometimes it feels like you are using 3 different OSes from 3 different time periods. But you can get used to that, I guess.

OS settings are still a strange place that seems designed to make an average user (or someone who hasn't been using the OS for more than a decade) feel like an idiot.

No, among the Big Three, Windows is the last place I'd look at moving to. At least Linux gives me freedom at the expense of UI\UX. Windows gives me... well, games. I can't think of any other reason to install Windows except competitive gaming.

formercoder said 14 days ago:

Interesting it’s possible that we have different priorities, but I’m not bothered by UI inconsistencies. I use chrome, office, adobe suite, a trading application, games, VSCode, they all have different interfaces that I know how to navigate. I agree that the settings can be tough. Half the time you are in “new” stuff and half the time you’re pulling up the screens from XP. I just google what I need to do though, and never have trouble getting it done.

konart said 14 days ago:

> priorities

Not priorities but rather attitude, maybe? (Not sure if that's the best word, but it's the best I can think of with my English; hopefully it doesn't sound offensive or tactless.)

Imagine you have a car. Great engine, relatively comfortable seats, a new set of tires, and a body so ugly you want to ram it into a wall every time you are behind the wheel. It does its job well, but you don't enjoy your time with it.

Being able to enjoy my time with a device or an OS (or any other thing or person for that matter) is what I want. Obviously sometimes the issue is on my part.

halotrope said 14 days ago:

Give windows 10 and WSL2 a try. With the new terminal and editor it is really a neat setup. macOS is hard to beat in terms of smoothness and looks but unfortunately it gets more and more clunky for working.

jfkebwjsbx said 14 days ago:

> iOS + macOS is still much better than anything on the market (no alternatives really).

The Windows + Linux combo is way better for all productivity, gaming and development than the mess macOS has become since Jobs passed away.

konart said 14 days ago:

I'm not much into gaming these days, PS4 is enough for me.

As for the rest I've commented about win10 https://news.ycombinator.com/item?id=23274273 and Linux distros: https://news.ycombinator.com/item?id=23274492

I still find macOS to have best balance of productivity, development and feel. Windows is still terrible and linux is just for work.

jfkebwjsbx said 13 days ago:

The issue is that you claimed that "there is no alternative to macOS", but you are talking about your particular use cases (not gaming) and subjective opinions (does not like Win10, does not like Linux).

macOS’ only strength for development is the ability to target iOS. For the majority of developers, a Windows/Linux setup is better because it covers everything. Linux is the best environment for most dev fields. Windows is the best for some of them (graphics, gamedev, C#).

konart said 13 days ago:

No, I didn't claim this. Unless you are trying to take one phrase out of context and dance on it.

What I said is that macOS is the only OS that provides the needed balance of everything (except gaming). Other platforms are not alternatives because you have to choose: either you get a good dev machine that is not enjoyable to use for other use cases, or you get Windows, which is not enjoyable for the reasons I've described in the other comment. The only two reasons to choose Windows (as I see it) are gaming (and game development, maybe) and Windows (often enterprise) development.

To sum it up with an analogy and close the topic: a truck is not an alternative to a volvo s60 just because it is also a car and can do even more than a volvo s60.

PS:

>macOS’ only strength for development

This is your second comment where you for some reason ignore most of my comments and focus just on what suits you.

zimpenfish said 14 days ago:

Their "see!" shell script example is a bit rubbish because I get 0.012s, 0.005s on this Mac laptop whilst getting 0.022s, 0.023s on Linux box 1 and 0.006s, 0.006s on Linux box 2.

Changing the filename to test2.sh on the Mac (which should trigger the delay, right?) gets 0.006s, 0.006s.

I don't think the shell scripts are doing what they claim (and wouldn't the second run be faster anyway because of caching?)

egorfine said 14 days ago:

If they are caching based on inode, this will not invalidate the cache. Do cp test.sh test2.sh and try again.

saagarjha said 14 days ago:

I feel like cp might do an APFS CoW and this might still cause problems…

ken said 14 days ago:

No, even "cp -c" creates a new inode.
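
If you want to sanity-check the inode claim yourself, a quick sketch (paths are arbitrary):

```shell
# Create a small script, copy it, and compare inodes. A cache keyed
# on inode would treat the copy as a brand-new executable.
printf '#!/bin/sh\necho Hello\n' > /tmp/test.sh
chmod a+x /tmp/test.sh
cp /tmp/test.sh /tmp/test2.sh
ino1=$(ls -i /tmp/test.sh  | awk '{print $1}')
ino2=$(ls -i /tmp/test2.sh | awk '{print $1}')
echo "$ino1 vs $ino2"   # different numbers: cp made a new inode
```

On APFS a clone can share data blocks with the original, but it still gets its own inode, which is what matters for an inode-keyed cache.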

zimpenfish said 14 days ago:

Sorry, when I said "changing the filename to test2.sh", I meant in the commands run, not `mv test.sh test2.sh`. i.e. I have both `test.sh` and `test2.sh` in `/tmp` now.

Nextgrid said 14 days ago:

I've been forced to update to this pile of shit because latest iOS requires latest Xcode which in turn requires Catalina. It's a nightmare.

First off the new apps (music, podcasts, etc) are terrible. They killed off iTunes but replaced it with much worse. These apps don't behave like standard macOS apps, the UI is full of inconsistencies and is just so empty. This website has nice examples of the failures of modern Mac OS: https://annoying.technology

For some reason after updating the "new updates" badge was stuck on the system preferences icon (and even on the preference pane itself) despite no updates being available. I ended up having to delete a plist and reboot to fix it, apparently a common issue.

The Mail app will now randomly play the "new mail" sound. I can't confirm it for sure, but I'm assuming it treats existing, read mails as new when they are moved to the trash/archive, and likewise newly created drafts. They screwed up the Mail app, a problem that had been solved for decades. WTF? The worst is that I see no major changes in there, so why touch the mail client in the first place if you're not even going to give me additional features in exchange?

Xcode was stuck upgrading in the App Store. It would start the process and never make any progress. Cancelling it had no effect. Rebooting cancelled it but the second attempt, while making progress, ended up failing with a generic error message with no actual information. Logs are useless because they're being spammed by all the background processes even during normal operation making it impossible to find anything. Finally the third attempt succeeded.

1Password now takes 5 more seconds to unlock my password database. Somehow this disgrace of an OS slowed down the password hashing process by an order of magnitude.

Switching screen resolutions or connecting to an external screen takes a good 10 seconds of flickering and frozen UI before everything starts working again. This is now actually worse than both Windows and Linux. I dread moving the laptop or touching the USB-C cable (also because USB-C is so brittle) when it's connected to an external monitor out of fear that it'll disconnect/reconnect and I end up in a 30-second cycle of flickering.

I upgraded a couple of days ago, so those are not early bugs. Apple had a year to fix all of this. The Xcode thing might be an isolated issue but there's no excuse for the general performance penalty or the stuck update badge which has many hits on search engines suggesting it's a widespread issue.

BruceEel said 14 days ago:

> I've been forced to update to this pile of shit because latest iOS requires latest Xcode which in turn requires Catalina. It's a nightmare.

I'm literally halfway there as I type this, Xcode 'installing components'. Having to upgrade essentially everything just to get the right dev tools for the current iOS is madness, feels like buying a new house to fit the new coffeemaker...

saagarjha said 14 days ago:

I install new versions of Xcode about every two weeks on average. The amount of time it takes to have a new Xcode running is at least an hour: first you download a massive XIP, then the system "verifies" it forever when you try to open it, then it takes forever to unarchive because it's huge, then you need to copy it from ~/Downloads to /Applications which takes another couple of minutes. Then you hit the component installation part… (I think this step has something to do with installing new MobileDevice frameworks?)

Throwaeay2928 said 14 days ago:

Forcibly relocated to a refugee camp tent with leaking water pipes next to your air mattress. But at least everything around you in your tent is white, flat, and material and your coffeemaker works.

saagarjha said 14 days ago:

> The Mail app will now randomly play the "new mail" sound.

It’s not quite random: it plays the sounds as it gets new email, but then it takes anywhere between a couple of seconds to a minute for the new email to be visible in the UI. Infuriating.

> Xcode was stuck upgrading in the App Store. It would start the process and never make any progress. Cancelling it had no effect. Rebooting cancelled it but the second attempt, while making progress, ended up failing with a generic error message with no actual information.

I just normally kill the store-related daemons when that happens.

davidvartan said 14 days ago:

Re: downloading Xcode, this page has saved me hours: https://stackoverflow.com/questions/10335747/how-to-download.... It's just a list of direct links to each version of Xcode at apple.com. Mystery why Mac App Store downloads still can't be bulletproof after all these years.

eklavya said 14 days ago:

This one drives me nuts. I mean what in the hell is that downloading doing that it manages to fail arbitrarily. This is downloading files, how the fuck can it be so complicated and broken.

Nextgrid said 14 days ago:

I actually prefer the App Store approach because that way the majority of my updates are in one place and can be done automatically in the background. The problem is that it used to work fine and they managed to break it.

sixstringtheory said 14 days ago:

I usually keep at least one prior release of Xcode on my machine, up to the latest patch for its series. So right now I have 11.5 and 11.4.1. I've hit so many problems with new versions in the past. I wish I could just let MAS handle it for me, but it's just never been an option, aside from the issues it has actually working.

dmix said 14 days ago:

I don't share your issues with Catalina [1] but I have to agree Podcast app's UI design is very strange. The primary interface should be the "Episodes" tab.

Just like Twitter's UI, app developers think they know what content is best for you with a 'feed' or 'featured'... they've completely abandoned chronological ordered lists of content unless you click 2-3 buttons.

[1] Catalina has been painless for me, not sure why my experience was different than everyone else

inimino said 14 days ago:

I also upgraded days ago, assuming they would have had time to fix the bugs. However, I can say the USB-C external screen flicker was plaguing me before the upgrade and hasn't gotten worse. Turning off hot corners, oddly, helped, although the problem hasn't gone away.

maevyn11 said 14 days ago:

I've had a similarly painful experience upgrading last week. Though it doesn't seem quite so bad as the posters above, and after making a few fixes most everything is back to normal.

My one remaining serious annoyance is that my external monitor color settings are screwed up and there appears to be no fix. Reds are purple and everything is just a little washed out, which is a shame for a 4k monitor that was beautiful with Mojave.

Strangely, right before the computer restarts, or if booted in safe mode the color starts to look perfect again, but I can't seem to replicate that in normal operation.

SlashmanX said 14 days ago:

I have this issue constantly, even the laptop screen itself will get 'washed out'. The solution is to go to Displays > Colour Profiles and change the profile to any other one and then change back to the default.

Nextgrid said 14 days ago:

> My one remaining serious annoyance is that my external monitor color settings are screwed up

Could it have something to do with Night Shift? Have you tried enabling and disabling it and see if it fixes that?

2ion said 14 days ago:

Our help desk is wise enough to keep existing Mac users on the oldest supported macOS version, but inevitably at some point in the future they'll have to roll out the latest version. That will be the week when I exchange my MacBook for a Windows 10 ThinkPad. A lot of our dev teams have already moved to this setup, using WSL or a VM for Linux when really needed, and it has been really smooth (our help desk is also staying on top of the Active Directory and Windows Update management game).

mst said 14 days ago:

If WSL turns out to be insufficient, https://multipass.run/ is worth a look.

cosmojg said 14 days ago:

Or, you know, just run Linux outright.

fxtentacle said 14 days ago:

Do you know of anything similar that supports GPU acceleration?

neuronic said 14 days ago:

I share almost all of these issues. What drives me super nuts is the multi-display support which NEVER "just works".

I have to disconnect and reconnect USB-C 3 times, turn off the second monitor, switch inputs, restart the €3000 machines twice or whatever. So annoying, how does this pass QA at all?

Also, don't set up and use multiple users at the same time. That's really messy as well.

ourcat said 14 days ago:

Since Steve left us, I've witnessed so many issues crop up in the Apple ecosystem over time, for users/customers and developers alike, and it's clear that there's nobody to be shit-scared of anymore at Apple.

So many recent things would have pissed him off.

There's no way the 'notch' would have appeared. Nor the fact that the iPhone camera design stopped the device sitting flat on a surface.

unix_fan said 14 days ago:

if Steve were still alive, iOS would never have been as open as it is today.

FireBeyond said 14 days ago:

They don't give a shit if you're not using an Apple monitor. Witness the ProDisplay, which doesn't even have a power button, and talks to the computer to turn on.

fredsted said 13 days ago:

Your experience certainly sounds bad, but none of this is normal; mail sound, USB-C cable brittleness, 1password slowness, all of it works nicely for me.

ehutch79 said 14 days ago:

Have you actually done anything to try and fix these issues? Because this is not typical.

I use 1Password and it doesn't take 5 seconds to open. Did I accidentally install Linux or something? Because if it's the OS causing your delay, it would be causing me the same delay.

xcode installs just fine for my entire team. Just did the update myself, worked just fine.

I plug into a dock and undock constantly during the day, and while it could be quicker, 10 seconds and flickering is NOT my experience.

and what the fk are you doing to your connections that you consider usb-c brittle?!?

inimino said 14 days ago:

There's a lot more non-determinism in a modern MacOS install than you imagine. "WFM" doesn't invalidate the anecdote to which you reply. TFA is about putting network requests in system calls ffs.

yyyk said 14 days ago:

OP is a typical Apple "You're holding it wrong" reaction. It's never Apple's fault when its OS doesn't work right - it's always the user's fault. Despite the user paying a premium for Apple, or Apple having control over hardware its OS works with.

Nextgrid said 14 days ago:

I've just tried connecting to my external monitor again and 10 seconds is exactly how much it took - no exaggeration there. The internal monitor goes blank for 1 or 2 seconds, then both monitors turn on and it takes another ~8 seconds for the UI to adjust and the windows to be moved to the proper place.

> you consider usb-c brittle?!?

It's much easier to unplug USB-C than HDMI or DisplayPort, for one. USB-C itself is a terrible mess that requires an engineering degree to figure out what's compatible and what's not. And maybe it's just me and I have a shit hub, but I had an external hard drive crash midway through a file transfer due to power issues despite being powered by an Apple charger (the hub and all the peripherals went dark and the laptop stopped charging, then started cycling on and off, where every time the drive tries to start up again it kills everything).

gmanley said 14 days ago:

What makes you think that your experience is the typical one? I've had these problems as well, and so have a lot of people I've talked to. Obviously that's just more anecdotes and doesn't prove anything, but neither does your comment.

chadlavi said 14 days ago:

> You can test this by running the following two lines in a terminal:

>

> echo $'#!/bin/sh\necho Hello' > /tmp/test.sh && chmod a+x /tmp/test.sh

> time /tmp/test.sh && time /tmp/test.sh

Am I missing something here?

I just did this, and the timing between the first and second run was barely noticeable -- in fact, the first run was slightly quicker:

> echo $'#!/bin/sh\necho Hello' > /tmp/test.sh && chmod a+x /tmp/test.sh

> time /tmp/test.sh && time /tmp/test.sh

> Hello

> /tmp/test.sh 0.00s user 0.00s system 55% cpu 0.006 total

> Hello

> /tmp/test.sh 0.00s user 0.00s system 41% cpu 0.010 total

This is on macOS 10.15.4.
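
As usmannk notes at the top of the thread, reusing the same /tmp/test.sh path can hit a cache from an earlier run and hide the effect. A sketch that uses a fresh, never-before-seen filename each time (the path scheme is arbitrary):

```shell
# A fresh random path defeats any per-file caching of the first-run check.
SCRIPT="/tmp/test-$RANDOM-$$.sh"
printf '#!/bin/sh\necho Hello\n' > "$SCRIPT"
chmod a+x "$SCRIPT"
time "$SCRIPT"   # first run: may include the notarization lookup
time "$SCRIPT"   # second run: should be fast once the result is cached
```

If the first run is still fast, the behavior may depend on macOS version or network conditions, as varenc's latency experiment upthread suggests.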

vegardx said 13 days ago:

I had put off upgrading for a long time because nothing good can come from running the latest stable release. They've never been stable. But Apple sort of forced me to update recently since I wanted to back up my phone before switching to a new one. I imagined it would be better after a year. Boy was I wrong, and I regret doing it so much. It has been a constant pain ever since; bluetooth is completely broken.

- My external trackpad isn't able to connect, at all. Audio devices require that I kill coreaudiod before connecting, otherwise they just disconnect after a few seconds.

- I can wake the laptop with a bluetooth keyboard, but when it's awake the keyboard stops working. Flipping the switch on the backside of the keyboard lets it reconnect again.

- There are transitions that you cannot disable that make your laptop feel super slow. In Mojave you could disable them; in Catalina you can't unless you want to run with SIP disabled.

- There's also a super fun bug with mobile hotspot failing to activate, and there's no way to just manually connect to your own hotspot; it has to go through this bluetooth activation, even though your mobile hotspot is visible and connectable from all other devices. You end up in a situation where you connect to your friend's hotspot and they connect to yours, since neither of you is able to connect to your own.

I've given up. The quality control at Apple is down the drain, and has been for quite some time. I'm fixing to downgrade to Mojave this weekend; hopefully that will make it more stable. But I'm not holding my breath. To add insult to injury, I'm on my third broken keyboard now. Next time it breaks I might just use the consumer laws and make them refund the laptop so they'll have to take a big loss for creating such a flawed device.

1123581321 said 13 days ago:

Those all sound like unusual problems. What external hardware and phone are you using?

hitekker said 14 days ago:

> Another way to reduce the delays is by disabling System Integrity Protection. I say reduce, because I still do get some delays even with SIP disabled, but the system does overall feel much faster, and I would strongly recommend anyone who thinks their system is sluggish to do the same.

The tone of this article reminds me of a passage from the seminal Google+ Platforms Rant:

> Like anything else big and important in life, Accessibility has an evil twin who, jilted by the unbalanced affection displayed by their parents in their youth, has grown into an equally powerful Arch-Nemesis (yes, there's more than one nemesis to accessibility) named Security. And boy howdy are the two ever at odds.

> But I'll argue that Accessibility is actually more important than Security because dialing Accessibility to zero means you have no product at all, whereas dialing Security to zero can still get you a reasonably successful product such as the Playstation Network.

https://gist.github.com/chitchcock/1281611

rtomayko said 13 days ago:

I made the jump to a System76 Adder WS laptop and pop!os for development after buying the lemon first gen MBP with the terrible keyboard. It was my seventh and possibly last MBP (including powerbooks before it).

I was considering one of the new 13” MBPs but that seems unlikely if injecting network latency into syscalls is the direction things are going.

If you’re not building Mac/iOS apps, find a Linux laptop you can tolerate for development and an iPad Pro for everything else.

justinclift said 13 days ago:

Thinking about it, this probably also gives Apple a fairly accurate set of usage stats for software.

All they'd need to do - and it's very simple - is count the number of requests of each given hash lookup.

Since they know the hash for each of their own executables, that gives a direct count of "most used" through to "least used" programs.

Not sure if they'd have the hash for third party executables though, to know what the given hash request corresponds to.

If they receive the hash for 3rd party executables when developers sign things, then Apple seems like it's able to generate usage stats for their entire OS and 3rd party app ecosystem.

grandinj said 13 days ago:

This seems like a natural outflow of a company design process that (a) prioritizes security highly, (b) prioritizes regular users over developers, (c) does not allocate sufficient resources to the product to thoroughly cover all the bases, and (d) is developed by people in North America, for whom the USA === the whole world, and who are used to near-100% seamless internet connectivity with latency < 20ms.

I love macOS, but their software generally has issues with flakey internet connectivity and long latencies - down here in South Africa, ~400ms RTT is not uncommon.

soraminazuki said 14 days ago:

Up until the release of Catalina, I've always upgraded to the latest version of macOS within a month or two. But some of the changes this time are really stopping me from upgrading.

As of Catalina, there's no sane way to install the Nix package manager without losing functionality because macOS now disallows creating new files in the root directory[1]. Nix stores its packages in the /nix directory and it's not possible to migrate without causing major disruptions for existing NixOS and other Linux users. This is too bad, since apart from Nix being a nice package manager, it also provides a sane binary package for Emacs. The Homebrew core/cask versions only provides a limited feature set[2][3].

[1]: https://github.com/NixOS/nix/issues/2925

[2]: https://github.com/Homebrew/homebrew-core/issues/31510

[3]: https://github.com/caldwell/build-emacs/search?q=support+is%...

yalogin said 14 days ago:

Brew never had this problem because they chose a sane path without corrupting the system directory. It's a bad design on the part of NixOS, and one could even say the changes in macOS were designed to encourage good/sane design.

skohan said 14 days ago:

For me it's Aperture. I like the interface better than Lightroom's, and I don't want to pay a monthly fee to have access to my photo library, which I only add to once in a while. It's a shame because it's a great piece of software, and even the UI doesn't feel dated, but I just won't be able to run it if I upgrade.

glofish said 14 days ago:

IMHO the original choice of the path seems incredibly ill-advised and the main burden lies with the original developers.

sometimes old errors and mistakes come back and bite

lilyball said 14 days ago:

You can install Nix without losing functionality, it’s just annoying because it requires setting up a separate volume, and if you want it encrypted and available before the GUI session restores then you have to use a login script to force-mount it. Personally I just keep my Nix volume unencrypted because I don’t build any proprietary software in it and I don’t care if someone can see what I have installed.

I really wish Apple would give third parties the ability to create firmlinks (or at least give Nix one), or barring that, give us a sane way to mount encrypted volumes at the same time that the system volume is unlocked.

joosters said 14 days ago:

You can create permanent symlinks inside / by creating a file called /etc/synthetic.conf - 'man synthetic.conf' has the full documentation. This sounds like it would solve the issue?
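
From that man page, each line of /etc/synthetic.conf is either a bare name (an empty directory at the root, usable as a mount point) or a name plus a tab-separated target (a symlink). A hypothetical example (the `data` entry is illustrative; changes take effect on reboot):

```
# /etc/synthetic.conf -- fields must be separated by a TAB character.
# A bare name creates an empty directory /nix, which a dedicated
# APFS volume can then be mounted onto:
nix
# A name plus a target creates a symlink, e.g. /data -> /Users/Shared/data:
data	Users/Shared/data
```

For Nix specifically, the mount-point form plus a separate volume is what the workarounds in the linked GitHub issue converge on, since a plain symlink can confuse paths baked into the store.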

mjhoy said 14 days ago:

It's funny, I just had to do this a few days ago.

This comment has worked for me on two machines: https://github.com/NixOS/nix/issues/2925#issuecomment-539570...

said 14 days ago:
[deleted]
mkchoi212 said 14 days ago:

I understand the purpose of notarization, but I feel like they could've come up with a much better solution to this. A network call __every time__ someone runs an executable is not acceptable. But for the cases where the user is offline, Apple must keep a list of notarized apps on the machine...

thedanbob said 14 days ago:

Nearly every article I see about macOS or Windows these days further confirms to me that switching entirely to Linux was the right call. Maybe 2020 will be the year of the Linux Desktop by default.

luckydata said 14 days ago:

anyday now...

aflag said 12 days ago:

Did Apple make any comments on this? I haven't been able to find any public response from them. I'm really interested in reading their side of things. This is quite jarring; it's hard to believe it is a thing. However, as I read through tests people did, it seems just as bad as it sounds.

I was actually getting a Mac mini now that I'm working from home (I thought I'd get better integration with some of the company's wfh infrastructure while still having a unixy environment, so a win/win situation), but I cancelled the purchase after reading this. I get that you can jump through some hoops and set some Apple-specific flags so that it works better, but the reason I wanted a Mac was to make things easier, not to dig into obscure APIs and features to get simple things working. I was really looking forward to that, but I don't feel that sort of investment will be justified with issues like this in their OS :/

pram said 11 days ago:

This is frankly hyperbole. A single checkbox in a GUI menu that is routinely accessed for managing other system-wide sandbox privileges isn't exactly obscure. It also isn't some difficult, inconvenient task. It needs to be done once.

ambernightcrush said 13 days ago:

This is also the case with APFS on rotational disk drives. Why does APFS perform so much worse on HDD vs SSD? Will Apple fix it? https://bombich.com/blog/2019/09/12/analysis-apfs-enumeratio...

cmckn said 13 days ago:

APFS was not designed for spinning disks. No, they won't fix it; because they don't even sell a computer that ships with only a spinning disk (asterisk on the iMac's hybrid drive). HFS+ is still available, just use it if you need to format a spinning disk. I think this is a very different type of issue, with much more reasonable trade-offs.

dkmar said 13 days ago:

Perhaps related: "How come someone notarized my app?"[0]

It mentions that anyone with an apple developer ID can notarize a qualifying app and submit this notary to the Apple Notary Service. However, the proof of notarization—the notarization ticket—might not be stapled to the application.

In the case of no stapled ticket, Catalina contacts the notary service to see whether a ticket exists. If so, the app is good to go.

[0]: https://eclecticlight.co/2020/05/22/how-come-someone-notariz...

EDIT. More informative link here[1]. It specifically outlines what happens on first run of an app. (and there's a great diagram if you scroll down)

[1]: https://eclecticlight.co/2020/01/27/what-could-possibly-go-w...

tozeur said 14 days ago:

I feel like the continual development of MacOS is making it worse and worse. Similar to Windows, where every extra feature causes more and more complications.

But alas the 1000s of engineers gotta be put to work somehow.

saagarjha said 14 days ago:

There are significantly fewer than 1000 engineers working on macOS.

sneak said 14 days ago:

Increasingly I find macOS only to be tolerable with iCloud (and Siri, location, suggestions, bug reporting, et c) entirely disabled, and Little Snitch’s built in/automatic whitelisting for Apple services disabled, and most of the background processes entirely denied networking access. It phones home constantly even with all of the services disabled/opted out.

It’s indeed a huge mess, from a privacy standpoint too, not just a performance one. It’s sad also to lose things like AirPlay or iMessage as collateral damage in the process. :/

I just can’t tolerate a machine that hits the network hundreds of times a day when doing normal computing tasks that do not involve the network. They even tolerate this sort of spyware in App Store apps, too.

Is it too much to ask for a polished workstation OS that lets me boot and edit a local text file of notes and save and quit without notifying 4 different parties that I did so?

m463 said 14 days ago:

and there are a lot of background processes.

running just firefox and terminal, ps -ef|wc -l is 198

and many of them have no reason to be on my system.
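A quick way to see those numbers for yourself (works on both macOS and Linux; the counts and names are machine-specific):

```shell
# Total process count, then the five most common executable names.
total=$(ps -eo comm= | wc -l | tr -d ' ')
echo "processes: $total"
ps -eo comm= | sort | uniq -c | sort -rn | head -5
```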

cmckn said 14 days ago:

I run a pihole at home, which has intermittent issues. When macOS can't resolve a hostname, almost every user-facing UI grinds to a halt. It's truly bizarre. Applications won't launch, menus don't respond, etc. Feels like a decade ago when your spinning disk was going bad. Not cute :(

skykooler said 14 days ago:

If it checks with Apple servers every time you execute a new binary, what happens if you don't have an Internet connection? Are you just unable to run new code?

nromiun said 14 days ago:

> One way to solve the delays is to disable your internet connection.

I think it just skips the checks if internet isn't available. But doesn't that kind of defeat the point of notarization?

OskarS said 14 days ago:

The linked website isn't loading, so I don't know what it says, but: if we're talking about notarization, you can "staple" the notarization to a .app or a .pkg, which means you don't have to do the internet lookup at all, and you can run the apps without having access to the internet. I'm not sure about the technical details, but I would assume you add some sort of signature that's like "This .app with hash X has been notarized and it's fine" signed by Apple's secret key.

EDIT: how to staple: https://developer.apple.com/documentation/xcode/notarizing_m...
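For reference, the stapling step itself is a one-liner with Apple's `stapler` tool; a minimal sketch assuming a hypothetical, already-notarized MyApp.app (macOS with the Xcode command-line tools only):

```shell
APP="MyApp.app"   # hypothetical, already-notarized bundle

if command -v xcrun >/dev/null 2>&1 && [ -d "$APP" ]; then
  # Attach the notarization ticket so Gatekeeper can verify it offline:
  xcrun stapler staple "$APP"
  # Confirm the ticket is now attached:
  xcrun stapler validate "$APP"
else
  echo "stapler requires macOS with the Xcode command-line tools; skipping"
fi
```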

cpncrunch said 14 days ago:

The article says "One way to solve the delays is to disable your internet connection" so I assume it just doesn't bother with notarization when you do that.

enriquto said 14 days ago:

> If it checks with Apple servers every time you execute a new binary, what happens if you don't have an Internet connection? Are you just unable to run new code?

It waits 5 seconds while trying to connect, and then it gives up and caches the program as un-notarized, allowing it to run faster on later executions.

Notice that notarization seems to be disabled if the network is disabled from within the OS. To observe the 5-second delay you need to cut the connection outside the machine (e.g., at your router) while the Mac still thinks it is connected. I observed it by running Catalina inside VirtualBox and disabling its network.

ken said 14 days ago:

> With internet enabled, it was reproducible by relaunching the application and triggering the code that called SecKeychainFindGenericPassword.

I have issues with a lot of APIs, but SecKeychain has got to be one of the worst. I don't think it's gotten any love in many, many years. Unlike literally every other Apple API that a Macintosh application might reasonably use, you call its functions (even from Swift) by passing strings as (length:UInt32, data:UnsafePointer<Int8>?) pairs, and getting results out by passing (length:UnsafeMutablePointer<UInt32>?, data:UnsafeMutablePointer<UnsafeMutableRawPointer?>?) pairs, and checking OSStatus return values. Every aspect of it is painful.

In Apple's "Documentation Archive" there are three "Sample Code" downloads related to Keychain. The newest one is for Touch ID, and the oldest is for PowerPC. This is an area of the OS that doesn't get much attention.

> This issue has been reported to Apple and assigned FB7679198. Apple has responded that applications should not use this function, though the documentation for SecKeychainFindGenericPassword does not state that it is deprecated

I see that it's now grouped in a section of the docs called "Legacy Password Storage", but not actually "deprecated". Strange. That means you won't get any indication of its non-current status from Xcode, or even reading the release notes.

I like that there's a newer (and presumably less awful) interface. I don't look forward to having to rewrite/retest that corner of my application. Seeing all the CFString/CFDictionary casting and OSStatus checking with the new functions, it still doesn't look all that great.

xvector said 13 days ago:

What a ridiculous feature. The people involved in making this decision ought to be fired.

parhamn said 14 days ago:

I'm showing 20-200ms longer on first run of the exec. Modified the test script a bit to show that it doesn't happen again if you modify the executable's contents.

    echo $'#!/bin/sh\necho Hello' > /tmp/test.sh && \
    chmod a+x /tmp/test.sh && \
    time /tmp/test.sh && \
    time /tmp/test.sh && \
    echo 'echo Hello2' >> /tmp/test.sh && \
    time /tmp/test.sh
eugenekolo said 14 days ago:

Another slight modification to make this show the effect every time:

    f=$(mktemp) && \
    echo $'#!/bin/sh\necho Hello' > $f && \
    chmod a+x $f && \
    time $f && \
    time $f && \
    echo 'echo Hello2' >> $f && \
    time $f

On my system:

    Hello

    real 0m0.131s
    user 0m0.001s
    sys 0m0.002s
    Hello

    real 0m0.004s
    user 0m0.001s
    sys 0m0.002s
    Hello
    Hello2

    real 0m0.004s
    user 0m0.001s
    sys 0m0.002s
unilynx said 14 days ago:

I got hit by this yesterday: borgbackup (installed using Homebrew) had a 5-second delay on every invocation.

Setting Terminal as a Developer Tool in Security & Privacy fixed it.
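Reportedly the same change can be scripted; a hedged sketch, since `spctl developer-mode` is only lightly documented (macOS only, needs admin rights):

```shell
MODE_CMD="spctl developer-mode enable-terminal"
if command -v spctl >/dev/null 2>&1; then
  # Adds Terminal to the Developer Tools list in Security & Privacy,
  # exempting processes it spawns from some Gatekeeper assessments.
  sudo $MODE_CMD
else
  echo "spctl is macOS-only; skipping: $MODE_CMD"
fi
```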

blackrock said 14 days ago:

One frustrating experience on the Mac is keyboard shortcuts.

Yes, they have polished the GUI, which makes it easy to navigate by mouse. But, when you need to work in speed mode, then you reach for the keyboard shortcuts.

The problem, is that there are plenty, too much sometimes, and they are often inconsistent between applications.

And yes, the Mac has a keyboard shortcut assignment tool, but it often doesn’t work correctly.

I must give credit to Microsoft here. They at least seemed to have perfected most of the common keyboard shortcuts.

Some good features about Windows shortcuts.

1. Alt-Spacebar to open the windows control menu, to move, minimize, maximize, or close the window.

2. Alt combinations are used to control the active Window application itself.

3. Alt-F4 to close the window. But, I would have preferred Alt-Escape instead, to close the window.

4. Control key for shortcuts inside the application. Like, Ctrl-C for copy. O for open. P for print. Etc.

5. Then the Windows key, to control Operating System level shortcuts. Like Win-M to minimize all windows. Win-L to lock the computer. Win-R to launch a command.

Some feature I would like are to use, Win-Spacebar to open a command search, similar to Win-R, but with the ability to list all possible commands. Similar to activating the command palette on VSCode.

And Ctrl-Spacebar, to activate keyboard commands for the active window. Kinda like Emacs, where I can run macros on it, like highlighting the words that I want, and execute something on it, like changing to uppercase, or converting to comma separated, or whatever else is needed.

astronautjones said 13 days ago:

this has always been the case. The underlined shortcuts in menus are a godsend in non-macOS OSes. I am still astonished at the hostility of macOS when it comes to Yes/No dialogs: you usually can't hit Y or N! This changed at some point after Snow Leopard. If I could run HDCP on my old MacBook, I'd still be using Snow Leopard. Aesthetically, they have made no innovations of use since then.

jakearmitage said 14 days ago:

This seems to be, once again, a case of user experience being degraded due to lack of attention, testing and measurement of impact by security engineers.

inimino said 14 days ago:

Once you have security engineers, security is no longer the responsibility of all engineers equally, and you've already lost at security.

HugoDaniel said 13 days ago:

I have been running OpenBSD for all my dev work in a VM for quite some time now.

This just makes me wanna start using it for more things besides dev work :(

herova said 14 days ago:

Windows + VSCode + WSL2 + Terminal + PowerToys = Just one love, never looked back.

xyst said 14 days ago:

The only problem I have with that is "Windows"

I'm currently trying to figure out how to emulate windows from a *nix distribution using qemu. I plan to use this as a "home lab" (k8s cluster or just plain fucking around), but still retain the ability to play an occasional AAA game.

rb808 said 14 days ago:

The weird thing is the price of windows laptops have skyrocketed with the shortages. New MBPs are cheaper than X1 Carbons and XPSs with 10gen chips.

asdff said 14 days ago:

New MBP with a 10th gen chip is a $600 upgrade over the base model with an 8th gen chip.

jarjoura said 14 days ago:

Every other week Lenovo has some crazy 25-50% off coupon for their laptops.

PopeRigby said 13 days ago:

Just did a test using the command the author listed. Benchmarked on Arch Linux and got 0.00s. I then did the same test on a MacBook Pro and got 0.332s. I feel like that's pretty bad. 0.332s might sound inconsequential, but that's just for a single echo command. I would imagine it gets worse as your executable grows in complexity.

swiley said 13 days ago:

How do people put up with the complete brokenness in commercial OSes? Is this really better than having to edit the occasional config file?

saagarjha said 13 days ago:

Personally, I know which process to kill when things go south. It's not easy to acquire this information, though.

inimino said 14 days ago:

Last year I was preaching that if you can't develop in a submarine or a space station (or on the metro), from a fresh git clone to your next git push, then your development environment is broken and you should burn it to the ground and start over.

It'll be interesting to see how much power we developers will let Apple take from us before we jump the garden wall.

saagarjha said 14 days ago:

Interestingly, I hear that iPads cannot be used on the ISS because apps will stop launching if you disconnect from Apple's servers for too long.

mnm1 said 14 days ago:

I'm getting 10-15 minute beach ball of death freezes on a month old MBP 16". That recur until I hard reboot. I can't open the 'force quit applications' window during this nor the apple menu. Can't reboot or shutdown from the cli or otherwise. Some apps lose network connections, some don't. The entire system becomes unusable. It requires a hard reboot. I think it's related to Intellij IDEA and similar IDEs somehow, but profiling those shows the slowdown is not in their apps but in the OS. It won't start with anything plugged into the USB ports, not even just power. Been trying various things but if it doesn't go away, I will return this when the Apple store here reopens. The only good thing about this coronavirus is that I've had more than 14 days to test this and find out what a clusterfuck this OS is even on a $4400 brand new mbpro. Do they even test anything anymore?

jrochkind1 said 13 days ago:

Do you think developers make up a significant portion of Mac buyers? I think it's possible, but I'm not sure.

I am pretty sure the laptop market has been shrinking generally (as more people have a phone but no laptop). And most developers I know have macs. They probably don't want to make the OS significantly worse for developers...

vsskanth said 13 days ago:

After this, you can be sure the developer interest will go down even further

gautamcgoel said 13 days ago:

This is why having a vibrant open-source ecosystem is so important. Firstly, the needs of users are the main priority (as opposed to profit, liability minimization, advertising...), and secondly, users have many options to pick from. For example, if you don't like systemd, you are free to pick an OS without it.

mleonhard said 13 days ago:

I don't want to send over the Internet a record of every program I run. Is there a way to opt-out completely?

dahfizz said 13 days ago:

Buy a machine not from Apple.

sfj said 13 days ago:

Unplug from the internet.

vortico said 14 days ago:

I used to use a Mac pretty heavily for design and audio work, but around 10.14, because of Apple changing the way they do things, I switched entirely to Windows for that, and Linux for everything else. I just don't want to deal with the nonsense described in this post, among several other things.

headmelted said 14 days ago:

“ Another way to reduce the delays is by disabling System Integrity Protection. I say reduce, because I still do get some delays even with SIP disabled, but the system does overall feel much faster, and I would strongly recommend anyone who thinks their system is sluggish to do the same.”

Nope.

jasoneckert said 14 days ago:

"Another way to reduce the delays is by disabling System Integrity Protection."

Definitely agree on this one here - I've noticed a big speed improvement when disabling SIP debugging with "csrutil enable --without debug" while in recovery mode.

I should note that the main reason I disable SIP isn't for speed, but to install the yabai window manager to make Aqua far more useful as a developer. I wrote a recent blog post on this, actually (https://triosdevelopers.com/jason.eckert/blog/Entries/2020/5...).
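Before flipping anything, the current SIP configuration can be checked safely; `csrutil status` is read-only, unlike `enable`/`disable`, which must be run from Recovery (macOS only):

```shell
# Read-only check of the current SIP configuration.
SIP_TOOL="csrutil"
if command -v "$SIP_TOOL" >/dev/null 2>&1; then
  "$SIP_TOOL" status   # e.g. "System Integrity Protection status: enabled."
else
  echo "$SIP_TOOL is macOS-only; skipping"
fi
```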

saagarjha said 14 days ago:

I believe disabling System Integrity Protection actually carries over to everything you boot off the computer.

heinrichhartman said 14 days ago:

> [...] it appears that low-level system API such as exec and getxattr now do synchronous network activity before returning to the caller.

WTAF. If this is really true, it is a reason for me to leave the platform for good. This is just unacceptable in so many ways.

enriquto said 14 days ago:

> a degraded user experience, as the first time a user runs a new executable, Apple delays execution while waiting for a reply from their server.

Wow, this is extremely infuriating! I just ran the "hello world" test script with the network connection disabled and it took 5 seconds to run!

     $ echo $'#!/bin/sh\necho Hello' > /tmp/test.sh && chmod a+x /tmp/test.sh
     $ time /tmp/test.sh && time /tmp/test.sh
     Hello
     /tmp/test.sh  0.00s user 0.00s system 0% cpu 4.991 total
     Hello
     /tmp/test.sh  0.00s user 0.00s system 77% cpu 0.005 total
crazygringo said 14 days ago:

I'm so confused about the comments here.

There are a bunch of people who can't reproduce the slowness at all, but they're nearly all downvoted, or you have to wade through hundreds of comments to get to them.

The majority of comments are just dumping on Macs, nothing whatsoever to do with the content of the article, and seem to be blindly assuming it's true.

And I can't seem to find any substantive discussion of whether this is actually real or not, or just some weird bug on the author's machine.

I don't see any evidence that Catalina is "slow by design", just a single anecdote from the author. I was definitely hoping for some more substantive critique/discussion...

defnotashton2 said 14 days ago:

OP linked validated bug reports, one of which Apple responded to with "by design", which is where OP derived the title.

The downvotes are because it seems pretty clear that the people who don't experience it have long-lived installs of the OS and likely have grandfathered or disabled security settings. A lot of people are saying it's pretty easy to replicate on a fresh OS.

And it is; I just did it. Did you?

tinco said 14 days ago:

Did you run the test yourself? Why do you assume people are blindly assuming it's true? For me first run was 0.5s, second run was 0.004s, so there's definitely something going on.

saagarjha said 14 days ago:

> There are a bunch of people who can't reproduce the slowness at all, but nearly all downvoted or you have to wade through 100's of comments to get to them.

It's possible that they have certain security features disabled.

> The majority of comments are just dumping on Macs, nothing whatsoever to do with the content of the article, and seem to be blindly assuming it's true.

Welcome to Hacker News…this is common on any discussion on any topic, especially one that many people can understand in some way.

vbsteven said 14 days ago:

With Apple degrading the developer experience with each release and Microsoft working hard on things like WSL(2) and the new "package manager" I think within a year or 2 lots of developers will go back to Windows-based machines.

xvector said 13 days ago:

As a security engineer myself, what Apple is doing here is completely fucking insane. I honestly cannot believe that anyone thought it was a good idea.

said 13 days ago:
[deleted]
jaykru said 12 days ago:

Has anybody in the tech media picked up on this? It doesn't seem like it from a cursory browse of my favorite sites (HN, do your magic). This seems like something that Apple really ought to be taken to task for. I'm sure the privacy concerns, if not the performance, will rile up the broader non-HN public if only the information reaches them. Perhaps then we can get Apple to move to a less stupid system.

sorryitstrue said 14 days ago:

An issue I've been dealing with forever on my mbp 2013 is the machine just pausing input for 2-4 secs (video and audio don't hitch, just keyboard/mouse input).

I recently took the trouble to completely wipe the disk and reinstall macOS Mojave, and it's still happening, so it's not due to cruft installed over time in OS X. I dunno. I'll deal with it until it gives up the ghost and probably move to a Windows machine, given the work they're putting into WSL2.

rch said 14 days ago:

High quality laptops shipping with Linux have been available for some time now. I know of a couple of companies that are providing an option for employees to switch.

harpratap said 14 days ago:

This coupled with the horrible docker 100% cpu usage bug (https://github.com/docker/for-mac/issues/3499) might be the top reasons why I hate WFH right now. My Linux desktop in office was so much faster at everything (granted its desktop vs laptop but still, it's a laggy mess developing on OSX now)

csomar said 14 days ago:

It gets even worse. I was doing some web dev in the last couple months and I noticed that my "localhost" was ridiculously slow. At first, I thought it was NPM/Gulp but then I noticed that it behaved irrationally, sometimes it is slow and sometimes it works.

The problem was Parental Controls. Apparently, every request was being checked, which slowed the whole thing down. Needless to say, at least a couple of days were wasted on this.

sub7 said 13 days ago:

Just switch to Windows and WSL. For most cases, it works just great/not noticeably slower.

There's a lot of bullshit on Windows too but nothing near OSX levels of wannabe big brother shit.

Can't think of a better long term short right now in the market than Apple (and sister cult Tesla but the electric story is at least in the early days so they may do ok)

kasabali said 12 days ago:

Windows has SmartScreen and MAPS (which was previously called "SpyNet") turned on by default, on top of telemetry level that goes to eleven and cannot be turned off in consumer editions.

They're not implemented in a braindead way that's being discussed here but they're at the same level big brotherness-wise, if not worse.

trollied said 14 days ago:

The only time I’ve seen similar delays is when my Mac decides it needs to do something on an external disk that needs to spin up. I have a 12 TB external that can take 10 seconds to spin up, so I get a 10-second stall waiting for I/O once in a while.

I do wonder if the author has something similar going on, either with a directly attached disk or a network share.

trashburger said 14 days ago:

Did the site get hit by the Slashdot effect? Can't access it.

Archive: https://web.archive.org/web/20200522164507/https://sigpipe.m...

blinkingled said 14 days ago:

Apple has an opportunity here - to fix all these issues in the first release of ARM macOS and disable some more functions that "don't really work well" or are "insecure" - all of a sudden ARM Mac will be so much better there will be many blog posts and videos about it smugly proclaiming how Intel could not keep up!

sigjuice said 13 days ago:

I intend to stay on Mojave for as long as possible, but I am curious to try out Catalina. I believe it is easy enough to install Catalina on an external SSD. My concern is whether this would be safe enough and if my computer would remain unmodified (e.g. could there be changes to firmware settings or firmware updates?)

crazygringo said 14 days ago:

Sorry but it's just not happening for me, on macOS 10.15.3, on my late 2016 MBP. (And I've certainly never done anything like disable SIP.)

I run the commands and get:

  Hello
  /tmp/test.sh  0.00s user 0.00s system 8% cpu 0.045 total
  Hello
  /tmp/test.sh  0.00s user 0.00s system 75% cpu 0.005 total
If I'm reading this correctly, the first run takes less than a twentieth of a second, and the second a two-hundredth? I've never experienced anything like "have the entire machine freeze for 1-2 seconds every 10th minute". And I have the slowest internet package I can buy.

The only delay that's ever noticeable is when running a program I've installed for the first time, which yes usually seems to take a few seconds, before often telling me the application couldn't be verified or something, do I want to run it anyways. Which makes sense if you're running a checksum on a 400 MB application binary. But after that first time, starting an app is always instant.

Can anyone else elucidate what the author is talking about? They're presenting it as a universal, but maybe there's something else going on with their machine? Clearly something's wrong on their end, but possibly it's just some kind of bug. I'd avoid jumping to conclusions that executables taking a second to launch is "by design".

EDIT: switching from zsh to sh gives more granular results:

  Hello
  
  real 0m0.009s
  user 0m0.002s
  sys 0m0.003s
  Hello
  
  real 0m0.005s
  user 0m0.001s
  sys 0m0.003s
john_alan said 13 days ago:

I can see the delay when I remove my terminal from the DevTools permission in Security preferences.

So it's real.

However, scripts are NOT notarised, so what is it doing?

EDIT:

So after digging the scripts are being "checked" for malware, as part of XProtect.

This is interesting, it seems to be hashing scripts and testing to see if its known malware.

Anyway, easy to disable, but weird stuff.
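One way to watch these checks as they happen: the unified log's `com.apple.syspolicy` subsystem appears to carry Gatekeeper/XProtect assessment activity (subsystem name taken from public write-ups, not official documentation; macOS only):

```shell
# The subsystem name is an assumption based on public write-ups.
PREDICATE='subsystem == "com.apple.syspolicy"'
if command -v log >/dev/null 2>&1; then
  # Show the last two minutes of assessment activity; launch a fresh
  # script in another terminal to see new entries appear.
  log show --last 2m --predicate "$PREDICATE"
else
  echo "the unified 'log' tool is macOS-only; skipping"
fi
```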

anderspitman said 14 days ago:

"Modern" OSX, iOS, and Android are so secure and safe they even protect you from using your computer.

kup0 said 14 days ago:

10.15.1 and then 10.15.4 both introduced random kernel panics on my iMac. Only way to solve was to reinstall MacOS on top of itself (via Recovery, kept files/apps intact).

Still no idea what or why the panics would happen, or why the reinstall solved it.

Catalina has been a very bumpy road for me so far.

apatheticonion said 13 days ago:

Just wanted to drop this here but WSL & WSL2 makes a compelling case to move to Windows.

mattbillenstein said 13 days ago:

Man, I think I was having this issue earlier in the year and thought it was some funkyness with the firewall or application -- custom golang apps.

Who at apple thought it was a good idea to hop on the internet when invoking an application without any warning? This is loony.

mshockwave said 14 days ago:

I don't think they do the notarization for shell scripts and program you build from source. I've been doing large scale software development on my Catalina for quite some time and I observed zero performance degradation compared to previous OS X version.

e40 said 14 days ago:

I really hope the mess that is Catalina is fixed in the next round, or I might be on Mojave until I can switch to another OS. I've been on macOS for a long time, and I really like it. I'm productive on it. But Catalina... no, I won't touch that.

s800 said 14 days ago:

Anyone have packet captures of this behavior? I'm still on 10.14, or I would check it myself.

commandlinefan said 14 days ago:

I can't upgrade IntelliJ any more, because it's trying to write to privileged file locations that I (the owner of the computer) no longer have access to. Believe me, I've tried to work around this, macOS has it locked down completely.

stephenr said 14 days ago:

... Can you elaborate? I use IntelliJ on a daily basis on Catalina, and I have zero issues updating it.

dfabulich said 14 days ago:

The latest IntelliJ 2020.1.1 works out of the box on macOS 10.15.4, without disabling System Integrity Protection (SIP).

Whatever problem you're having, it's a problem specific to your machine.

tebruno99 said 14 days ago:

I use and upgrade IntelliJ fine. Install Jetbrains Toolbox and everything is installed in your home dir. What kind of locations are you having troubles with?

noworriesnate said 14 days ago:

I agree: use Jetbrains Toolbox.

A few months ago I installed Rider (an IntelliJ-based IDE) on my Mac without toolbox, and upgrading it was a pain. I don't remember the details, but using JetBrains toolbox makes upgrading as simple as clicking a button and waiting until the download / install is complete.

ehutch79 said 14 days ago:

Why do you need access to the areas protected by SIP?

commandlinefan said 14 days ago:

Beats me - it's a common problem, though: https://stackoverflow.com/questions/40251201/upgrading-intel.... The only thing that ever worked was uninstalling and reinstalling the whole thing.

mschuster91 said 14 days ago:

You can disable SIP in recovery mode.

discourses said 13 days ago:

I have this kind of issue on Mojave. I blamed the firewall. With ethernet disconnected, everything runs smoothly. Connected: random freezes of 1-2 seconds.

Why does it need the internet all the time?

sj4nz said 10 days ago:

Did anyone try setting the terminal to "Developer Tools" permissions and find that things got worse?

dre-hh said 13 days ago:

Upgraded only in spring. Waited long enough. Never have I been so wrong. Now when I want to reboot my computer I just try to pair my Bluetooth headphones: instant hard reboot.

saagarjha said 13 days ago:

Does this literally panic your machine?

msie said 13 days ago:

Lack of upgradability of MacBook Pros, numerous bugs in Catalina (Image Capture, I'm looking at you), T2 chip and secure boot issues. It all adds up...

markdog12 said 14 days ago:

Can we get a MacOS @BruceDawson0xB up in here?

https://twitter.com/BruceDawson0xB

soapdog said 13 days ago:

If Microsoft was doing this there'd be a riot, but since it is Apple, people will rationalize this bad behaviour and say it is for the best.

gitgud said 13 days ago:

Why would they send off binary hashes synchronously before execution of the program?

Are they checking if the app is dangerous? Are they logging all my activity?

bad_user said 13 days ago:

I like the fine grained permissions on Catalina, but along with dropping support for 32 bits binaries, this is getting ridiculous.

fulldecent2 said 14 days ago:

NSA had a "hardening macOS" guide on GitHub that I can't find.

I wonder if that defeats the phone home that this article is highlighting.

AlexanderDhoore said 14 days ago:

I noticed recently that the first `git` command I run takes longer. This is insane. What's the status of debian on macbook?

ben-schaaf said 14 days ago:

Last I heard you can't even access the SSD on newer macbooks. If you want a good experience with running Linux on a laptop, don't use a Mac.

stephc_int13 said 14 days ago:

Wow, this is incredible and clearly a huge step in the wrong direction.

I clearly won't switch to their system anytime soon...

MintelIE said 13 days ago:

When will computer and OS companies start telling us exactly what data they’re taking and who they give it to? I was an Apple user from 2002 until last year. I just can’t be spied on and telemetized any more. It’s not beneficial to me and I can see all kinds of downsides. Especially since big tech has it in for anybody politically to the right of Bernie.

mickotron said 13 days ago:

My 2011 era MacBook Pro has run Linux most of its life. It runs super fast compared to its performance under MacOS even a year into its existence.

I've heard people ask me, "why bother with Linux when macOS is Unix?" Well, technically it is, by heritage, but it gets less Unixy by the day.

LeoNatan25 said 14 days ago:

Disabling SIP and amfi kills all the process startup delay and limitations.
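For reference, the steps usually cited for this, shown as a sketch that only prints the commands: both must be run from the Terminal in Recovery mode, the AMFI boot-arg is community lore rather than anything Apple documents, and turning these protections off is a real security trade-off.

```shell
# Print (not run) the commands people report using; executing them
# outside Recovery mode has no effect or fails.
cat <<'EOF'
csrutil disable
nvram boot-args="amfi_get_out_of_my_way=0x1"
EOF
```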

seemslegit said 13 days ago:

The slowness seems like the smallest concern here

bfrog said 14 days ago:

I feel like this is one of those times, a wut moment.

dwighttk said 14 days ago:

How many new applications are you people running?

zapf said 14 days ago:

One more reason to stay away from corporate OSes

RyanShook said 13 days ago:

So should we disable SIP on our Macs?

rmrfrmrf said 13 days ago:

By this logic, HTTPS is "slow by design" and a nefarious plot by Big Certificate to siphon money away from tech companies.

zelly said 14 days ago:

Linux is waiting for you.

waynesonfire said 13 days ago:

now I understand the importance of niche OS.

Craighead said 14 days ago:

People please check how hot your devices are.

andarleen said 14 days ago:

I switched to a sleek amd based setup and ubuntu, 64 gigs of ram, tons of nvme storage and for a decent price. Sad to see macos go out my daily toolkit, but fortunately i no longer have to deal with this kind of crap. I still use mac occasionally but day by day it becomes less relevant.

znpy said 14 days ago:

congrats on realizing that your MacBook Pro 16" is a $4000 Facebook machine.

dang said 13 days ago:

Please don't post unsubstantive comments and/or flamebait here.

https://news.ycombinator.com/newsguidelines.html

3combinatorHN said 14 days ago:

Beyond me how people (and especially "power users") are still paying for Mac and Windows botnets; just switch to Linux, everything just works.

shmerl said 14 days ago:

Switch to Linux and forget about it.

beders said 14 days ago:

You should know by now:

Apple is the Father, Apple is the Mother.

After Apple has re-invented or re-written the MSFT playbook of the 90s, nothing surprises me anymore.

Yet I cling to these machines, that take away the freedom to do with my hardware as I please. It's odd.

inimino said 14 days ago:

The UX is good. Freedom has always been a little more subtle.

bluedino said 14 days ago:

In many unrelated ways, Mac OS X has just always been slow.

The first computers I ran OS X on were a Pismo Powerbook and one of the first iMacs. Both with upgraded hard drives and maxed out RAM. They were almost unusable, and we'd put classic OS back on them, a new release of OS X would come out, and repeat.

I later got a chance to use a shiny new G5. I couldn't believe how slow it felt. Same goes for the PowerBook G4. The first Intel MacBook Pro didn't feel any faster.

Somewhere around the i5, Mac OS started to feel 'okay'. But I'd always still feel blown away at how fast a similar machine felt running Windows or Linux.

But I've stuck with it ever since 2010. I remember talking about my 16", saying "It's really fast...for a Mac."

api said 14 days ago:

All of these complaints are about security features.

Yes these features could be better implemented, but I'm happy they're there. It's very important to be able to opt out of them, but I like that they're the default.

Notarization needs a cleanup pass and the rest of it seems like it needs an optimization pass.

P.S. The rationale for notarization is to avoid distributing, and thus advertising, the filters and detection mechanisms Apple uses to detect malware. If these were distributed, malware authors could analyze and evade them. Security through obscurity makes a certain amount of sense here: the Church-Turing thesis means there are an infinite number of ways to implement any given thing, including malware, so no single filter or analytical step can detect all possible malware permutations.

inimino said 14 days ago:

Being able to run arbitrary software on the hardware Apple has graciously lent me is an annoying level of power that I'm not fully comfortable with either. I'm liable to shoot my foot off if Apple the all-seeing doesn't save me from myself.

philwelch said 14 days ago:

The OS phoning home for every executable I want to run on my machine is a “security feature” the same way a key logger is.

JadeNB said 13 days ago:

> the Church-Turing thesis means there are an infinite number of ways to implement any given thing

That's true (or else there are 0 ways), but it's not what the Church–Turing thesis says.