The audacity of it! (pun intended) Fork it, call it Audacious and be done with it. There’s no reason for an offline application to collect data “necessary for law enforcement”. We’re already in an age where software is increasingly tilting towards becoming SaaS. We don’t need offline applications to behave as if they’re online or SaaS ones.
If this is only about telemetry or analytics, make it an opt-in choice on first run.
I wasn’t aware that Audacity had been acquired by The Muse Group recently. [1] I don’t think this will go well for users when private interests and concerns conflict with public ones.
The next obvious question would be about equivalent alternatives that are free and open source.
I've forked the codebase and removed most of the code related to networking, crash reports or sentry telemetry.
While I have been cautious and tried to make sure that no such code could run by mistake (via stray environment variables or ifdef checks down the line), I have to say that The Muse Group's intent clearly seems to have been that the code can be built without any of the tracking.
All networking-related features seem to have been optional and can be disabled via environment variables like `HAS_NETWORKING` or `HAS_SENTRY_REPORTING`.
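To illustrate what that looks like in practice, here's a rough sketch of such a compile-time guard - apart from the flag name, the identifiers are invented for illustration and are not Audacity's actual symbols:

```
// Illustrative sketch only: apart from HAS_SENTRY_REPORTING, these names are
// hypothetical, not Audacity's real code. When the flag is off, the reporting
// call compiles away to nothing.
#if defined(HAS_SENTRY_REPORTING)
    #include "SentryReport.h"                    // hypothetical header
    #define REPORT_EXCEPTION(msg) SendSentryReport(msg)
#else
    #define REPORT_EXCEPTION(msg) ((void)0)      // no-op without reporting
#endif
```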
I've detailed this more in my comment in the GitHub repository's issue [1] to make sure others will read about it, too.
Personally, I think this is a classic miscommunication error, and it seems as if the intent of Muse Group was to be as legally compliant as possible. Of course, a dialog where end users could opt in to everything would've been nicer than enabling/disabling it via build environment flags - but that could've been a simple, rational discussion.
My current temporary (and building) fork is available at GitHub, too [2] and I would love to see this project being governed by a community of developers rather than a single owner that can potentially dictate integration of tracking mechanisms on others.
"Personally I think this is a classical miscommunication error, and it seems as the intent of Muse Group was to be as legally compliant as possible."
To be clear, this legal compliance is with the laws and government of Russia, where the parent company is domiciled.
>"For the purposes of this Notice, WSM Group with registered office at Moskovsky pr-t,40-1301, Kaliningrad, Russia, 236004 (“Audacity“, “us“, “we“, or “our“) acts as the data controller for the Personal Data that is collected via the App and through the App."
>"All your personal data is stored on our servers in the European Economic Area (EEA). However, we are occasionally required to share your personal data with our main office in Russia and our external counsel in the USA."
Yes, there are crash reports, but AFAICT they are all but universally ignored.
Example: Verizon's "Message+" for the Mac crashes for almost any reason, including "you just selected an old conversation." I allowed its auto crash report to work, several times.
I gave them a negative rating on the Play Store, citing this. They wrote and asked me for details on "the problem" so they could fix it. I told them to start testing their software.
If you're a developer on the other end of these things for some company, please speak up. How many are actually looked at?
I formerly worked on $LARGE_SOFTWARE for $COMPANY and would frequently get bug reports for crashes, and spent a reasonable amount of time analyzing memory dumps to understand what was causing those crashes so that they could be fixed.
I think correlating those automated crash dumps with reviews in a software distribution channel would have been non-trivial, and not something I would have spent time on though.
In this case, though, I think you would have said about the review "Oh, yeah, that's the embarrassing bug where it crashes when you do something totally ordinary. Fixed (or fixing) that."
That said, I'd say I fixed 95% of the crashes, even the trivial ones. For the remainder, the cause was difficult enough to understand (e.g., out of memory and we got caught holding the stick) that the risk of change outweighed the fix.
"Everybody"? Emacs does not. vim does not. OpenSSH does not. Python does not. ffmpeg does not.
I think this is a culture-clash question. Open source software generally does no automated crash reporting. Some software like the Rust compiler will print out a crash report and ask you to copy/paste it into a GitHub issue on your own; you can decide not to. The general cultural assumption is that if you want the software to get better, you have to participate, and if you don't participate, the developers don't owe it to you to fix your crashes. In fact, with the software I mentioned above, you generally don't even get it directly from the developer; you get it from a redistributor like a Linux distro, who may have modified it. Therefore it may not even make sense to send crash reports directly to the original developers, and the social contract is between you and the redistributor.
Some open-source software like VLC does prompt you to send crash reports if you want, but it doesn't send them by default, and their data collection terms say nothing about law enforcement (https://www.videolan.org/privacy.html).
Some open-source software that operates at a scale that resembles traditional "shrink-wrapped" (conceptually) software does even more telemetry - Firefox, for instance, but even so it doesn't say anything about collecting data for law enforcement: https://www.mozilla.org/en-US/privacy/firefox/
There's also a world of closed-source software where there isn't any culture of participation and where asking a user to file a bug makes little sense. This is the world of Windows and macOS and mobile apps, all of which do automated crash reports. This is the world of web analytics tools that track exactly where your mouse cursor pointed so developers can build more intuitive websites, and more generally, the world in which software runs online and server-side because it helps developers. The social contract here is that the developers improve the software and fix bugs without your active participation, and in turn the software needs to send them data about how you use it - both crash reports and general usage data - on its own, without you doing anything.
The real question is which world Audacity now lives in. A clause like "data necessary for law enforcement" is perfectly standard for websites, and probably they copied and pasted it from some standard terms of service document. But can you imagine Emacs or Python having such terms?
>Emacs does not. vim does not. OpenSSH does not. Python does not. ffmpeg does not.
And yet, Fedora [0], Ubuntu [1], and KDE [2] do. So what you're saying doesn't paint a full picture: even if upstream of those projects you mentioned doesn't take the automated bug reports, there is likely some downstream that is using them to aggregate crash data and turn those into bug reports. It is not feasible, on a large scale, to manually comb through crash dumps and ask people to figure out how to use gdb. Open source projects don't have any special sauce here.
Also, it seems very strange to complain about companies collecting data that they're required to collect by law. "Forking" there is a fool's errand, any other company legally doing business in the same areas will be following the same laws.
I think the question is about automatically reporting bugs versus giving the user tools to report bugs in a consistent manner if they choose to.
Fedora's tool does not automatically report bugs - it just automates the process so users can do so if they choose: it handles generating stack traces locally or remotely (configurable), deals with core dumps and submits them if so chosen (again configurable), and, if you choose to submit the data, tries to tie crashes to existing bug reports to prevent devs from being flooded.
It even has some pretty great functionality to help make sure no private/sensitive data leaks into the report, and lets you go through things line by line, flagging entries based on a default or custom list of keywords that may indicate sensitive data (such as passwd, key, etc.), before you finally choose to submit it.
And it's all done on a case-by-case basis, so for each crash you can choose whether to submit any data or not, in case you're running one app that you'd be concerned about leaking sensitive data versus another.
> Also, it seems very strange to complain about companies collecting data that they're required to collect by law.
Thanks for clearly phrasing this, because I think this is exactly what I'm complaining about.
Audacity is, in fact, not required to collect anything by law, and for a very long time, it did not. It's a piece of software that runs on your computer. It doesn't even require you to have a network connection. It doesn't require the developers to even know that you exist.
No law (in any slightly free country, at least) prevents the existence of software that doesn't phone home. WordStar wasn't illegal. Emacs and Python and ffmpeg aren't illegal.
It is true that once you start collecting data, legal requirements start to apply, and you need to comply with requests for that data. That's a really good reason not to collect it in the first place.
If all you have is a stack trace, and you have nothing personally identifying (including filenames), then the data you have is useless for any other purpose besides fixing the bug. You're protected from legal requests, you're protected from illegal requests, you're less of a target of hacks, and so forth.
Fedora's ABRT, for instance, goes out of its way to ensure that any crash data they collect is suitable to make public: https://abrt.readthedocs.io/en/latest/ureport.html#ureport That's a pretty good way of squaring this circle. If the data is okay to make public, then by definition it's not a problem if anyone (law enforcement or otherwise) wants to see it, and you don't have to rely on their good intentions.
KDE doesn't claim to collect information about crash dumps; it just collects statistics about how it's used.
It is, of course, harder to debug things from just a list of symbols. And as you say, it's not feasible to teach everyone how to use gdb. So there's a tradeoff. You acknowledge that the software isn't going to be as good as possible in terms of bug-free-ness (or you find other ways to address that problem, like fuzzing or static analysis), but in exchange, you make the software better in terms of how it treats the user's privacy.
I don't understand why you're comparing Audacity to Python. ffmpeg would be a much better comparison to emacs or python – and I admit I'd be surprised if ffmpeg was emitting telemetry like crash reports.
Audacity's more comparable peer would be Adobe Photoshop. I'd expect crash reports in a GUI app like either of these. I'd also expect that anyone with data on a server would need a clause like "data necessary for law enforcement".
I would not expect that law enforcement regularly hounds Adobe for data from crash reports.
Well, I'm making that comparison because Audacity is - historically - an open-source app created by volunteers (originally created by a CMU professor and grad student). That's why I think it's a culture thing. (Emacs is a GUI app too, so the line isn't quite "GUI apps." Or maybe consider Signal vs. Facebook Messenger.) Is it part of the culture of open-source programs like Emacs and ffmpeg, or is it part of the culture of closed-source programs like Photoshop and Facebook?
I do actually think Firefox is a very interesting comparison here, because despite coming from the open-source culture, they collect extensive telemetry and they do work in a model where they can't expect user participation. But they in fact have no clause about data necessary for law enforcement.
(I'm also not defending Firefox as perfect, and a complication of Firefox is that it's designed to be used with an internet connection anyway. Audacity is an app that makes perfect sense to use offline. So was Photoshop, for that matter.)
In any case, the clause itself is not the interesting part; the interesting part is the implication that they have enough data to be interesting to law enforcement. Adobe and Facebook certainly do; whether or not it's used, it's there, and there are so many other reasons to be concerned about the data existing (rogue employees, hacks/leaks, etc.). The developers of Emacs clearly don't. I would guess the developers of Firefox try to avoid it, which is why Firefox doesn't need such a clause.
I intended "KGB" as a stand-in for "Government Intelligence and Law Enforcement Bureaus" but in the specific case of the FSB I doubt they'd give a rat's ass what a Terms of Service contains if they really wanted to come after somebody.
EDIT: I am worried about governments coming after some people, but I am not worried that Audacity's telemetry or ToS will be pivotal to their success in doing so. Just look at Assange – they got him on falsified rape charges, not his history of torrents or website visits.
The rape charges against Assange may have been partly exaggerated, but calling them "falsified" is so much of an exaggeration that it itself comes closer to being a falsification than the charges ever were.
Hey, just wanted to tell you that the code you have currently doesn't compile.
`SentryHelper.h` is required for the `ADD_EXCEPTION_CONTEXT` macro, which can be defined as empty.
`HelpMenus.cpp` has a syntax error in preprocessor directives and an extra comma.
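For anyone else hitting this, a no-op stub along these lines should be enough to keep the call sites compiling (a sketch only, assuming the macro's result is never used as a value):

```
// Hypothetical replacement for the stripped SentryHelper.h: every use of
// ADD_EXCEPTION_CONTEXT(...) expands to a harmless no-op.
#pragma once
#define ADD_EXCEPTION_CONTEXT(...) ((void)0)
```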
I thought I amended those changes to the latest commit, but I seem to have forgotten to push those fixes. Originally I just injected the macro via command line as it didn't do anything when reporting was disabled.
Can you please confirm that your HEAD contains the ":bug: Fixes" commit? [1] - I'll upload the built Arch package to the GitHub releases as soon as it's built again.
I was using the "audacity-git" [2] AUR package's PKGBUILD file (replaced repo URL with my one) to generate my package and it seemed to work without issues with those fixes applied.
There was also a "pthread_cleanup_pop(1)" patch in the PKGBUILD for the alsa library which I wondered about why it hasn't been pull requested upstream...but no idea what's up with that tbh.
Genuine question here, why go through all the trouble of manually removing the networking code?
Wouldn't it be much easier to replace the import/include/whatever of the low level networking bindings with a stub library where all functions just immediately return with a networking error, as if the computer was off line?
That would make it all but trivial to keep up with upstream code changes, and keep the disconnected version up to date.
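Roughly what I have in mind, with invented names standing in for whatever the real low-level wrapper exposes - every call returns immediately, as if the machine were offline:

```
// Sketch of a stub networking layer. The names are hypothetical; a real fork
// would mirror the signatures of whatever wrapper upstream actually uses.
#include <string>

struct NetResponse {
    int status = 0;            // 0 = transport-level failure, no HTTP status
    std::string body;
    bool ok() const { return false; }   // the stub never succeeds
};

// Every entry point fails instantly with an "offline" result.
inline NetResponse HttpGet(const std::string& /*url*/) { return {}; }
inline NetResponse HttpPost(const std::string& /*url*/,
                            const std::string& /*payload*/) { return {}; }
```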
Nah, I think it still makes sense to have a connectionless version. There are Linux folks who object to the concept of application firewalls and don't use any.
I used it for a long time. To my surprise, it is still alive and, judging by the screenshot, seems to have evolved a bit. It looks good; I think I'll look into it.
The issue with these kinds of forks is that they'll eventually lose steam and no one will maintain them any more, or they'll fall far behind. In this case, the person has forked it and has already lost interest, per the readme. On top of that, it's some 200+ commits behind.
I disagree. You likely want to evolve the plugin API over time to let people implement more effects, you need to update to keep up with the preferred audio API for each platform (Windows, Linux and Mac each have at least one), and you need to support new input/output formats. Without that, Audacity will be as useful as any abandonware application - it likely works, but it will limit your way of working in the modern world.
The author has only changed the license text and README. No changes to the code base have been made. The networking capabilities of Audacity seem to be disabled by default anyways when building from source.
that's good enough for me. someone who makes sure that this network stuff is never activated without my knowledge.
as a linux user, i expect my distribution to take care of that, but for others it's good to promote a publicly available branch that safeguards our privacy
You don't write networking code just for kicks and giggles.
The new Audacity owners (named elsewhere) are moving in a certain direction. A potentially unpleasant one. We should be doing as this post is, keeping an eye on each strange/potentially nefarious change made.
Corporations are the bacteria/viruses of the open source ecosystem. If one gets infected, the project is left behind to die, while a new project is recreated.
What's the scandal? There are companies that build enterprise features on top of open source software, there are companies that hire lead devs of open source software to work on it, etc.
It's about what they do to it after acquisition. If they mistreat it and it's popular enough, it will be forked.
Yup. In fact, one of the original intents of GNU licenses was to protect user freedoms in the context of codebases getting routinely bought and sold. There has never been any obligation to develop in the open. The internet barely existed, at that point.
Since they have had the copyright for most contributions assigned to them, they can rewrite the few parts that they don't have copyright for and relicense the whole thing, though.
But so what? The fact that it is (or was) open source means that anyone can still take and re-use the code, regardless of whether it gets re-licensed in the future. The re-licensing prevents nothing.
Did you read the article? It sounds like a fantastic way to get full time engineering devoted to software that will stay free and open source, by a company that has a track record of making free and open source software for music professionals.
I'm assuming that's the company that bought MuseScore and seems to have locked it all behind paywalls. It seems they're iterating on taking people's open source/open license contributions, buying up the project, and locking its value away for private gain.
Is there a FOSS license that inhibits private ownership?
There's a reason most open source licenses say they grant a "non-revocable" license. But that doesn't mean that the copyright holder has to release future updates under the same license, nor does it mean anything about their ownership of the associated trademarks (which you seem to be conflating with some kind of overall "ownership").
Audacity now has quite a bit of networking code. Some of the more interesting parts involve something called "Sentry". Code for that went in 26 days ago.[1] I'm not sure when this runs, but there is code here [2] to read the contents of the URL
No, it can't. Read the damn file you linked. It's a script that uploads release files to Sentry so they can be used to correlate crash reports when a release is made.
And nobody in their right mind would ever send crash reports to Sentry from a CI. That’s completely nonsensical.
viraptor was not really responding to the discussion about the shell script at all (which is weird in its own right), but was rather talking about stuff like https://github.com/getsentry/pytest-sentry
It depends on what you're trying to achieve. It's totally reasonable to report stack traces from the CI, especially from the master branch, for a quick overview of flaky tests. I don't know anything about their setup specifically, but that approach does have use cases.
For me it solves the issue of tracking flaky tests. It's the easiest way to do it if you're already using the same error reporting platform in production. Please consider others' use cases before declaring this nonsensical.
You beat me to it! But I doubt Sentry is the vector - it's a generally legitimate service that helps software devs track failures in the wild without the users having to report anything - I've worked with it for single page web applications. Of course, it could be abused to track anything.
The right short-term thing to do is deny it networking access with LittleSnitch (mac) or the equivalent, while buying time to produce a privacy-preserving fork. FWIW, I don't think Sentry is appropriate for this kind of application. OSS should have opt-in error reporting only.
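To make the opt-in point concrete, here's a rough sketch of gating crash reporting behind an explicit user choice using the sentry-native C SDK - the preference lookup and the DSN are placeholders, not anything Audacity actually has:

```
// Sketch only: UserOptedInToCrashReports() and the DSN are placeholders.
// Nothing is initialised - and no network traffic happens - unless the user
// explicitly said yes on first run.
#include <sentry.h>

bool UserOptedInToCrashReports();   // hypothetical: reads a stored preference

void MaybeInitCrashReporting() {
    if (!UserOptedInToCrashReports())
        return;                     // default is off: no reporting at all
    sentry_options_t* options = sentry_options_new();
    sentry_options_set_dsn(options, "https://publickey@example.ingest.sentry.io/0");
    sentry_init(options);
}
```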
It would also be a good opportunity for the fork to factor out the UI into something like Electron, because the Audacity UI has always been pretty bad, IMHO.
(I suspect the biggest obstacle to producing a viable fork will be fights over which pun to use as a replacement name.)
I'm curious what's different about OSS in this regard?
I thought this was an argument about the user's privacy, so what does it matter how the software was built?
Also a sidenote: this whole thread feels quite pedantic and counterproductive to me. Sentry is an error reporting tool that helps solve issues in software, the alternative is your software having many, many more bugs that never get fixed. It's not used for tracking or "spying".
I feel that blowing your stack about a tool like this weakens your argument against true privacy invasions. It trains people to stop listening because we whine about things that are not actually a problem.
OSS code is generally made by unpaid volunteers. There may be some operational effort behind it, too, in terms of building and distributing binaries. But gathering remote telemetry is NOT something I would expect to be part of typical project operations. After all, who is gathering the data? What is the organization, what is the locus of control? That's easy to answer for a business, but hard for an OSS project.
The "telemetry" mechanism I have come to expect from OSS is the "bug report", where telemetry will be produced explicitly by me. I understand exactly what this means, what the data is, and why it's being collected. Moreover, I have expectations about how it will (and will not) be used (although, now that I think about it, I think those expectations need to be examined. I can see how bad actors could be using the data in these bug trackers for nefarious ends!)
This is a material privacy invasion because it violates the norms around what we expect from our software. I don't expect Audacity to phone home, and I don't expect the entity that distributes binaries to include functionality designed to satisfy the Russian government's need for data, for example. The fact that Audacity is used to gather and process sound data makes this particularly concerning, as this can be very sensitive data, and they are clearly serving more than one master here.
> The "telemetry" mechanism I have come to expect from OSS is the "bug report", where telemetry will be produced explicitly by me. I understand exactly what this means, what the data is, and why it's being collected.
I understand what you're saying but in practice there is a big problem with user-driven bug reports: they don't happen.
So there's a gap there between your expectations of "only user sanctioned network usage" and developing a product that is better to use (for you!). Besides, does everyone share your expectations? I don't.
Your last paragraph goes off the rails quite a bit and is exactly the kind of thing that is counter productive IMO. Suddenly the Russians are involved when we're talking about logging some exceptions to Sentry? This is what makes people stop listening.
We need to find a balance between privacy and practicality if we want the general public to use more OSS; this absolutism is effectively gatekeeping and makes me sad for the missed potential.
First, it is necessary to state my assumption clearly, especially when it seems it was at least partly false. Good faith means that you're interested in learning, not winning. But the remaining points show your motive is winning, not learning.
Second, you argue against the straw-man that I think everyone shares my expectations. I don't, and in my experience this is precisely the kind of mistake someone wanting badly to "win" makes.
Third, you attack me personally by characterizing one of my statements as "going off the rails", and follow that up with your own statement that assumes you know what "people" will and will not listen to. Again, it's the rhetorical baggage of the sophist, not the reasoned debate I would prefer.
Go ahead and keep arguing that passive phone-home behavior in FOSS is fine and that arguing against it is actually harming the public in general, and their privacy in particular. I think it's a ridiculous argument on its face and I don't think the person making it is making it in good faith, so I am not going to discuss it further with you.
Your "good faith" statement only exists to elevate yourself, it adds nothing to the discussion, neither does the rest of this personal attack.
Stating someone's argument is "ridiculous on its face" shows that you're actually not trying for reasoned debate, you clearly haven't even given the point made a single thought. This is very much my point: it's absolutism that leads to aggressive lines of argument and yes, I believe this is harmful to the general public.
If YOU don't want it, you can turn it off. That's the beauty of open-source. You are clearly trying to argue that they should not include any sort of telemetry - that everyone shares your expectations.
Is that still true? Iirc, most contributions to the notable foss projects I know about are industry sponsored, whether by companies or trade organizations
I think at this point it's clear we need to stop calling those OSS, and instead call them "volunteer-developed" (or some other euphemism for the owner externalising the cost of dev and test).
I'm not sure what you're getting at. Do you think that most not-hobby FOSS projects are somehow 'profiteered' by freeloading companies who own them but don't invest any time or labor into them?
If you don’t know the history: most open source software is freely available and built as a community effort by volunteers. FOSS has, by necessity, had a strong liberal culture from the outset. The genesis of many projects has been about sticking it to the man and winning back the freedoms taken away by big companies, especially hardware manufacturers. I’ll pay you to build me a minicomputer, but I’m damned if I’m letting you dictate what I can and can’t do with it.
Telemetry is about gathering data and sending it to a single place. Having a central focal point of potentially personally identifying information, controlled by one entity more privileged than the others, is anathema to the ethos of many traditional open source projects.
> The genesis of many projects has been about sticking it to the man and winning back the freedoms taken away by big companies
I'm not sure it's working as intended. The current trend is for big companies to take FOSS, put it on a server, and run it as a service with extra quality-of-life trinkets on top - like GitHub or AWS.
> Telemetry is about gathering data and sending it to a single place. Having a central focal point of potentially personally identifying information, controlled by one entity more privileged than the others, is anathema to the ethos of many traditional open source projects.
This is the most clear eyed description I've seen yet, thanks.
I guess my angle is not that of a FOSS developer but a regular user out in the world who has to get shit done.
Most people (rightly, in that context) do not care about this one iota. This kind of absolutism limits the reach of the software, preventing it from reaching the stated goal of "sticking it to the man and winning back the freedoms taken away by big companies".
When you need to report errors from an offline product, you create a dump file, zip it, and ask the user to send it if they want the problem investigated.
File creation can be done automatically from a segfault handler, for example.
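A minimal sketch of that approach (Linux/glibc; the file name and report format are arbitrary, and calling backtrace() in a signal handler is a pragmatic compromise rather than strictly async-signal-safe):

```
// Write a bare stack trace to a local file on SIGSEGV; the user can then
// decide whether to attach it to a bug report. No data ever leaves the machine.
#include <csignal>
#include <execinfo.h>
#include <fcntl.h>
#include <unistd.h>

static void crash_handler(int sig) {
    int fd = open("crash-report.txt", O_CREAT | O_WRONLY | O_TRUNC, 0600);
    if (fd >= 0) {
        void* frames[64];
        int n = backtrace(frames, 64);
        backtrace_symbols_fd(frames, n, fd);   // raw frames only, no user data
        close(fd);
    }
    signal(sig, SIG_DFL);                      // fall back to the default handler
    raise(sig);                                // so the crash still terminates the process
}

int main() {
    signal(SIGSEGV, crash_handler);
    // ... application code ...
}
```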
Less than 1% of crashes are reported by users. And only around 50% of users who do report a bug will provide follow up info such as crash reports or diagnostic information.
Before I had automatic crash reporting, I was blissfully unaware how bad my app was.
Well, what you said is correct, but there is a difference between a commercial product and free software.
I would expect users to care more about the latter.
For example, I am specifically talking about open source games, where often users report errors with zipped stacktrace attached
In my experience it's the other way around - but the difference is not that big anyway. Commercial programs typically imply paid support, which can tease out more details when the user contacts them. The users themselves are also more invested, as they already committed their money.
Open source software faces the same problems about a lack of bug reports. Check out this article and the related HN discussion:
Yes, there are some users that are really helpful. I also thought that my users are great at reporting bugs. But when I added an automatic crash reporter, I was shocked how many crashes happened that nobody bothered to report.
Maybe 1 in 10 crashing bugs was reported at all. A serious bug that made the app all but unusable was reported by 2 or 3 people (out of at least 100 people who were affected before I pulled the broken update).
Most users actually don't care about the particular program itself. They care about the problem said program claims to solve. If the program gets in the way, they get frustrated and they may even move on.
Only a fraction of users will voluntarily report bugs. Even less will do it in a meaningful and detailed way.
Often they don’t because they are running hundreds of programs and don’t particularly care about this one. Not everything is a high-profile app that a lot of people pay attention to.
What percentage of Sentry's customers are collecting the data unethically?
At some point the tool vendors share responsibility.
It's like saying it's not AdWords' or Facebook's fault there are so many ads everywhere, because it's the people who pay to run the ads who are at fault.
Reading this thread I was surprised about the amount of conjecture when someone could literally just read the source - or read your comment - to know precisely what the intent is.
This is entirely correct. The `sentry-cli` binary is primarily used to achieve two things:
1) to upload commit metadata, so that it's easier to trace a problem to the commit graph
2) to upload debug symbols (or similar), so that Sentry can turn machine data into something human readable, again to make it easier to diagnose issues
Sentry is a poorly designed crash reporting service that can be a massive vector for supply chain attacks. E.g. a Sentry crash report makes an interesting target since it can save an adversary a lot of time finding exploitable parts of an application. Sentry claims that GitHub and Disney are using their service ("among many other fortune-500 companies"). The problem isn't so much that there are potential security holes in Sentry but that a service like Sentry should never be a SaaS (or if it is, it should come with a massive disclaimer - but since they just raised 69 MM last year and want to be seen as the GitHub of crash reports, they're not gonna talk about it). Outsourcing something like that and sending it over the network demands a huge amount of trust. One might say that anti-virus companies or certificate authorities demand the same, so where is the problem? The problem is that nobody has raised this (yet), so they don't even position themselves as a service that ought to take security seriously.
There is also the elephant in the room of how many Sentry employees get to see the crash reports of these companies, what security clearance they have and how well they're vetted. Scrubbing of PII in the crashes is left to the client using the service, and it often happens that PII still leaks anyhow. Whoops!
Ignoring all this for a moment, Sentry also has other problems limiting its usability: it can't be isolated from the main application logic, and its ability to be useful is compromised at exactly the time of a crash (when crash reporting is actually needed most). The correct approach to do something in the observability space today would be to make it live inside an eBPF VM (and not have the crash reporter run in the same context as the application).
Edit: The (genius) growth-hacking and business model of sentry is to convince companies that it provides value by creating a dependency of Sentry API calls in their clients application code. This is hard to pull off but once a customer has been convinced that making sentry.io calls is useful, they're basically joined at the hip. This is what makes them a highly valued company (from a financial pov) since getting rid of it is as much effort (and even potential risk) as putting it in place.
You are misrepresenting Sentry pretty heavily here. It's an old crash reporting system and it works well. You can install it on premises completely for free and not use any of the SaaS stuff. Your data never leaves your data center.
There are also various mechanisms to prevent PII and passwords from being sent in crash reports, regardless of whether the installation is on-premises or in the cloud.
I don't really know what you mean about sentry dependencies either. For most cases you don't add any code at all, you just add it to your project and it catches exceptions and reports them. If you don't want to use it anymore, you just remove it. I'm sure they would love to get a hook in you, but I haven't seen it yet :)
> You are misrepresenting Sentry pretty heavily here.
instead of generalizing and claiming that this is a misrepresentation, why not engage with my specific points that make it abundantly clear where the problems are?
> It's an old crash reporting system
there is nothing "old" about sentry. it's new tech using old (outdated) means to monitor an application crash. Your comment is at best ill-informed and at worst whataboutism because most fortune-500 companies in their list do not use on-premise Sentry but their SaaS service. So getting pwned with sentry is not a question of if but when.
Meanwhile my post has been flagged which is normal considering that several sentry employees are on HN with high karma. That doesn't change how dangerous the ideas are that they are peddling. And I'm not talking about the wannabe-FOSS (but technically not really FOSS) on-premise service. My beef is with how they commercialize this via a SaaS (recurring revenue / subscription model) when the whole thing should be a bash 3 liner written by your friendly internal sysadmin.
This post and the fact it got flagged into oblivion will be used as an "I told you so" the moment the first fortune-500 companies gets hacked via Sentry. It's not a question of if but when considering their authentication and roles are split across 2 continents and 3 countries already and there isn't any strategy for the existing 200+ engineers on how to access highly sensitive customer information and (accidentally leaked) 2nd party PII.
I will let HN users educate folks on what Sentry is and isn't, but as the creator..
> there is nothing "old" about sentry. it's new tech using old (outdated) means to monitor an application crash. Your comment is at best ill-informed and at worst whataboutism because most fortune-500 companies in their list do not use on-premise Sentry but their SaaS service. So getting pwned with sentry is not a question of if but when.
It's 13 years old - I wouldn't call that new in the age of technology - and consistently using modern techniques and technology to improve upon a timeless problem.
What on earth? Sentry is a high quality, open source, self hostable crash collection tool and service. Your comment is FUD through and through, to the point where I'm wondering what your agenda is. You work for newrelic or something?
Once the declared period has passed, yes, it’ll be open source, though I’d still say that “X is open source” and “N-years-old versions of X are open source” is a distinction worth maintaining. However, AFAIK there isn’t a single version of a single BSL-licensed project (... that I’m interested in?) that has been released as open source, so far, so while I want to assume good faith from their authors, I’ll suspend my judgment on that part until the open-source release pledge becomes business-relevant for them.
I’ve written elsewhere on HN (https://news.ycombinator.com/item?id=27496025) why I consider the essence of open source (as defined in the OSD) to be the author saying something akin to “I will not use the legal tools of copyright to stop anyone else from doing what I am doing with my stuff”. The terms of the AGPL or even the Open Watcom license (however much I’d prefer to avoid them) qualify; the (by almost all measures more permissive) terms of the BSL or even the Unsplash license do not. That (and not a sentimental attachment to the OSD) is why I protest a meaning of “open source” that would allow conditions “discriminat[ing] against fields of endeavour” such as those you listed.
Everything the above link says about how my unwillingness to accept this dilution of terminology does not imply I’m passing a moral judgment on those releasing the software also applies here. I believe this makes your last sentence irrelevant.
That said, my original motivation for the “essence” above is a belief that people ought to be able to study software and do whatever they want with the knowledge. If we take the delayed open-source release pledge at face value, the BSL then occupies a curious borderline position between satisfying and not satisfying this desideratum in the same way that delayed open access does. I’m ... not sure what I think about either, in general (as opposed to the specific practice of applying the term “open source” to the former). This is part of why I’m suspending my judgment regarding this part—if, for example, the industry eventually converges on the delay being practically infinite, as copyright and patent terms already are, my mental resources are better spent elsewhere. Also, hopefully, in the intervening time smarter people than me will say things that I’ll be able to steal.
> Once the declared period has passed, yes, it’ll be open source
No, it's open source software on day 1. It becomes FOSS after the allotted time.
And the post you were commenting on also mentioned the need to find a solution to let FOSS become economically feasible in the age of oppressive copyright law. I too am looking for a solution to this as a proponent of FOSS, while also doing the best I can politically to declaw copyright law in my region.
Unless you can see their private config, we don't know where SENTRY_DSN_KEY is pointing. You can host sentry yourself. (https://develop.sentry.dev/self-hosted/)
I get you're worried about crash reports being shared, but that warrants a big "if they're using the hosted version" qualifier.
I mean, you can easily see what the DSN is in any distributed version. This isn't a SaaS. And by the way, it's not a key; it's not even remotely secret.
> All your personal data is stored on our servers in the European Economic Area (EEA). However, we are occasionally required to share your personal data with our main office in Russia and our external counsel in the USA.
Piggybacking on this, the first thing I wanted to look at was what data exactly they collect. Well, they do not explicitly list it (not sure if that would be considered legal in countries with strict privacy laws). I quote from the official announcement -
> Personal Data we collect
> Data necessary for law enforcement, litigation and authorities’ requests (if any)
This, and this alone is enough for me to neither use this piece of software, nor recommend its usage.
There's more however. Not only do you not know what information they leech off of your machine, it may be sold to a third party for a price.
> Who does Audacity share your Personal Data with?
> to a potential buyer (and its agents and advisers) in connection with any proposed purchase, merger or acquisition of any part of our business, provided that we inform the buyer it must use your Personal Data only for the purposes disclosed in this Notice;
That's pretty much what I assume is true of any software I use these days, and it doesn't bother me.
As long as governments need a warrant to access my data, I'm okay with their ability to do so - it's incumbent on me to be careful about the information I allow into software.
Product usage telemetry and crash reports from an open source audio editing app is an obvious whitelist from me.
If they start scraping the contents of my hard drive, that's another thing, but Occam's Razor doesn't indicate that motivation here.
The problem is as more and more people like yourself acquiesce to this kind of surveillance, we will slowly approach the day when another version of you is fine with a government scraping your hard drive data as long as it requires a warrant to access it.
"This kind of surveillance" meaning… crash reports from audio-editing software?
I am absolutely NOT fine with audio-editing software scraping my hard drive, and would uninstall it immediately and request my friends do the same if that came to light.
It's sad to see "telemetry" getting infused into everything, and disappointing that developers are happy with --- or even encouraging --- this behaviour. I think Microsoft really started the trend, before that it was just called "spyware".
I have a very old version of Audacity, that I still occasionally use, and see no reason to "upgrade". Now I'm even more convinced not to.
True that Google was doing it on its sites before MS, but there's an implicit sense of that being more acceptable because it's a remote site and not software which should otherwise have no use for a network connection phoning home.
Some commenters in the GitHub thread have pointed out that this license requires audacity to ban people 13 or under from using the app for privacy/consent reasons. This means that it’s not usable for education now, and also that audacity is in violation of the gpl. However, as the copyright owners, does that matter? How does that work?
> audacity is in violation of the gpl. However, as the copyright owners, does that matter? How does that work?
IANAL, but the license is something the rightsholder distributes so that others know what they can and can't do with a copyrighted work. It doesn't apply to the rightsholder past putting down to paper what they can't get litigious at you for. (EDIT: reading the GitHub thread, it seems a bit more complicated than that due to outside contributions back to Audacity still being GPL)
GPL still applies to the last version it was distributed with.
> license requires audacity to ban people 13 or under from using the app for privacy/consent reasons
I don't know about today's youth, but I know in my generation we did a fair bit of checking boxes that said we were 18+ when we weren't.
What’s stopping a public school district from straight-up ignoring that bit of the license agreement and using it in their classrooms anyway? Who will enforce that clause?
It's actually really funny because you get to see how poor everyone's technical literacy is. They pretend to be very smart, but then claim a bash script that's part of the build is selling your soul to Russia or something.
Then this is what they should be focusing on. Although, for example, to clean up excessive modals with a lonely OK button that annoy you after every operation, you don’t need to fork wxWidgets, and you need exactly zero telemetry to know it’s annoying.
After a decade+ of use, I somehow haven't installed Audacity on my newest laptop yet, because I've found sox surprisingly fitting for all my work. It's obviously not a full replacement (cli rather than gui) but if you just want to merge tracks, slow things down, truncate files, etc, you may find that you don't need Audacity either.
I used to use ecasound for stuff like that and simple effects and multitrack recording. I found that the simplicity changed my focus to what I was playing rather than how it sounded, and in turn it resulted in me being happier about the music I made. It may not work similarly for everyone though, but for me it made a big difference.
The reddit thread raises a good question I have often wondered. What is my best option for an application level firewall? I want to deny network connections by default and specifically enable who can speak to the outside world.
> I want to deny network connections by default and specifically enable who can speak to the outside world.
I also want to filter the network data. I want my firewall to inspect what the software is sending over the network and delete, randomize or nullify all data that isn't strictly necessary for it to perform the desired function. Like uBlock Origin but for the network stack.
This would enable normal operation of the software while also at least partially subverting the "legitimate business interests" of these corporations.
> I want my firewall to inspect what the software is sending over the network and delete, randomize or nullify all data that isn't strictly necessary for it to perform the desired function. Like uBlock Origin but for the network stack.
You'll have to TLS man-in-the-middle yourself for this to be viable.
Absolutely. Interception is okay when we're the ones doing it. Software exists to serve us, it doesn't have the right to establish private communications with third parties against our wishes.
I've intercepted the traffic of many mobile apps to see what information they're sending. I hear developers are pinning certificates now. I can just replace the pinned certificate, right?
Could also work with file access. Most applications never need to access the whole filesystem. My browser mostly needs access to the profile-specific data and the downloads folder. A music player doesn't need access to anything outside the music directory and also no networking unless maybe that one URL it uses to load album cover images.
Although I'd only prefer this approach if it's actually done right. Android, Flatpak etc. mostly showed ways to do it badly.
Agreed. We need ways to filter every Linux system call. Not just disallow the system calls themselves. We need the ability to apply policies to parameters and filter I/O in a transparent way.
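seccomp gets part of the way there - it can match on syscall arguments, though it cannot transparently rewrite I/O contents. A sketch with libseccomp (link with -lseccomp), blocking only internet sockets while leaving everything else alone:

```
// Allow all syscalls by default, but make socket(AF_INET/AF_INET6, ...) fail
// with EACCES, so the process keeps unix sockets yet cannot reach the network.
#include <seccomp.h>
#include <sys/socket.h>
#include <cerrno>

bool DenyInternetSockets() {
    scmp_filter_ctx ctx = seccomp_init(SCMP_ACT_ALLOW);
    if (!ctx) return false;
    seccomp_rule_add(ctx, SCMP_ACT_ERRNO(EACCES), SCMP_SYS(socket), 1,
                     SCMP_A0(SCMP_CMP_EQ, AF_INET));
    seccomp_rule_add(ctx, SCMP_ACT_ERRNO(EACCES), SCMP_SYS(socket), 1,
                     SCMP_A0(SCMP_CMP_EQ, AF_INET6));
    bool ok = (seccomp_load(ctx) == 0);
    seccomp_release(ctx);
    return ok;
}
```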
On Windows I use henrypp/simplewall. It is a lightweight UI on top of the Windows Filtering Platform. My only complaint is about all the self-updating programs that keep changing their binaries, so I need to re-enable them periodically...
One point I rarely see raised when something like this happens is the question of why the software was sold. It's likely that if, for example, people had a culture of giving to OSS developers, the acquisition might not have happened or even made economic sense.
It's easy to say "bad companies, let's fork!", but who's going to maintain it in the long run?
I thought Timidity would be a good name for a fork -- but it turns out that's already been taken by another, more appropriately named, audio application: https://en.wikipedia.org/wiki/TiMidity%2B%2B
I have a wild theory here. Some Taliban guy recorded an MP3 message tagged with authorship by Audacity, and someone in the State Dept said "what do you mean we can't get his IP address?"
From the notice: https://www.audacityteam.org/about/desktop-privacy-notice/ It looks more like they want to get basic analytics info about users, which means legally they need this notice, and also, of course, the govt may come after that data. I don't know if the feds can force you to collect info about users if your software didn't already.
Their way of writing it is very careless, including having "litigation" and "etc" in the sentence. It sounds like they can spy on you for their own purposes (collecting evidence against you) or based on copyright (copyright litigation) or basically whatever.
Mentioning "capitalism" in any even slightly negative connotation is a surefire way to garner downvotes in HN. Makes sense, of course, given that the audience is mostly entrepreneurs and entrepreneurs-at-heart, emotionally attached to capitalism and hoping to gain from it.
Oh give the guy a break, people. It's not like he's wrong here. Detach your emotions from your proof-of-stake for a minute. We're all greedy bastards but please let's be honest about it.
Upvoted because yes, Occam's Razor suggests that a new owner hunting for a profit model by selling Audacity user data is a more likely explanation than the multi-part conspiracy theory I just pulled completely out of my ass. There are times to blame "government" and other times to blame "capitalism" - we should jettison that dichotomy at this point in America, because they're one and the same. A market-dominated state and a state-dominated market aren't all that different. But if common people knew who to hold accountable for dark patterns, or even knew about the patterns operating on their lives, our civic bodies would be in better shape.
[edit] I'm still an entrepreneur. I like making money and hate paying taxes. But plenty of 'capitalist' people recognize that sneaky data theft, corporate/gov't collusion, and unethical EULAs are actually a short- and long-term hindrance to our ability to innovate and earn.
Occam's Razor would be: They seriously just want telemetry to understand where to focus on. What is being used. What is crashing.
(While I don't think telemetry-driven development is good: with an audio workflow there are some filters one always uses by default, but the relevant features are the filters one rarely uses, yet which make a difference when they are used.)
Is this going to be freenode in software form? The first bit of fuckery with freenode wasn't enough to drive most people away. Nor the second, even. But it KEPT HAPPENING, over and over, and eventually critical mass was hit.
You'd think after the last time the Audacity vultures would have learned their lesson, but of course they don't care. It's just another thing to destroy in the vague hope of making money.
Audacity should have no online components. NONE. And as such, no such notice should be needed.
That's not an accurate representation of the freenode situation: freenode was taken over, the old staff disagreed with the decisions of the new owner and left to create Libera.Chat, and then the new ownership started banning people and taking over channels for talking about Libera, changed the server software, effectively deleted the nickname database, and is producing really delusional messaging, such as https://freenode.com/news/introducing-irc .
it might be a bit offtopic but there is a somewhat easy solution to isolate networking for some applications if needed;
sudo ip netns add lonly
sudo ip netns exec lonly sudo -u $USER -i
this should start an interactive shell session with isolated networking, the shell will only see a separate loopback interface without further ado. anything started like this will share the same isolated network environment but you can add and use as many separate network namespaces as needed...
huh, for some reason i did not know about this utility but it really looks like a better approach. i feel a bit unsure if running in a new user namespace could break things in subtle ways somehow though. however, it works without user namespace separation too. the cgroup namespace separation is required on my machine to make it work without root for some reason yet unknown to me.
The bad actor in question, Ultimate Guitar, has a known history of smash-and-grab "audacity" with copyright and guitar tablature. It sounds like some of the historically aggrieved parties need to get together and do something about these trolls.
For me, I just need the audio equivalent of notepad or nano. A simple editor with the basics.
For those playing along at home, be aware that sox has a typosquatting link on their own page to http://sorceforge.net/ (The U was missing in both the text and actual URL so make of that what you will).
People are apparently selling Audacity in what are basically just re-skins, then expecting the original project to provide support. This is likely one trigger for all the monetisation paths going on. The arguments around "it's GPL so it's OK" start getting a bit weak when the for-profit offerings are probably just thin re-skins. I think people need to learn ethics. Just because it's legal doesn't make it right.
Keep reading that issues/5 thread, especially the replies from Muse Group's Head of Strategy workedintheory. It doesn't explain this Audacity stuff, but the musescore-downloader response makes sense (TLDR: copyright holders forced them to add a paywall or remove non-public domain content).
That doesn't excuse the behaviour from MuseGroup. They were essentially trying to do the same thing that rights holders tried to do to youtube-dl, with (somehow) even less standing, since MuseGroup does not even represent the rights holders in this case.
Fraudulent DMCA claims and scare tactics ("Information about users who use this extension on musescore.com will be shared with local police"[1]) are inexcusable regardless of how they are justified.
Agreed on the police threats and DMCA attempt. They seem to have succeeded in getting it removed from the Chrome web store, due to its name violating MuseScore's trademark.
> They seem to have succeeded in getting it removed from the Chrome web store, due to its name violating MuseScore's trademark.
That can't be the reason, since the name "musescore-downloader" doesn't violate MuseScore's trademark. If you have a trademark, there is no prohibition on me using the mark to refer to you. Doing that is the whole point of your trademark.
But would it cause user confusion over whether the software comes from the same organization as MuseScore itself? Saying it’s just a reference doesn’t seem like it would get you off the hook that easily?
But in this case the maintainers of the Chrome Web Store get to decide. If they think it’s too close, it’s too close.
MuseScore tried to do a DMCA takedown on the project. Only the copyright holders themselves, or their legal representatives, can file those according to the law. MuseScore is not the copyright holder for tabs and scores they're asserting the project author is infringing on.
Doesn’t this tie in with the recent YouTube video by the guy who critiqued music composing programs? And Audacity hired him to clean up the UI which added telemetry or something?
It's a great illustration of how celebrity is leveraged by the powerful. Muse Group picked Tantacrul, a figure celebrated for his strong opinions, and told him, "here's how you can do good." And he can do lots of good, but this isn't a tasteful way to do it. This is using him not as an experienced UI consultant in music software, but as a smokescreen for an exploitation of open source culminating in sending user data to Russian law enforcement.
He was hired as the project lead to improve the UI/UX of Audacity, yes. I don't think the spyware ideas are strictly his, but he is the face of the project whether he wants it or not.
I was analyzing the network traffic for the app (as I use a self-hosted AdGuard Home), but couldn't find which domain it wants. I figured blocking all internet access should be fine, as Audacity works perfectly fine offline.
Seriously you add crash reporting to your app and all the idiots who can't read swarm like flies. Reddit and HN actually sell your info, and they're using this for crash reporting.
"Good people make good things" is 'fear, uncertainty, doubt'? Have you been huffing paint fumes again aspaceman? Try as you might, you'll never get to space that way; all you're doing is killing your braincells (which is probably why your head hurts!)
First of all, learn how to talk on a public forum without coming out like a total jackass. Second, read the actual text.
> Personal Data we collect
> Data necessary for law enforcement, litigation and authorities’ requests (if any)
They don't disclose exactly what they collect. That is the very definition of shady, and grounds enough for a privacy sensitive user to abandon the software. It's not just "crash reporting" as you proclaim.
Moreover, if you think they won't sell it, I quote again from the original announcement (which, if you made the effort to read, would have saved us this conversation)
> Who does Audacity share your Personal Data with?
> to a potential buyer (and its agents and advisers) in connection with any proposed purchase, merger or acquisition of any part of our business, provided that we inform the buyer it must use your Personal Data only for the purposes disclosed in this Notice;
It's basic telemetry. This is all FUD. Like, massive FUD.
It's a standard disclosure notice for when you collect any telemetry data. It's also opt-in, so it doesn't apply unless you opted in to telemetry. They're not selling you.
Renderdoc also has one. We gonna throw a fit over that?
The telemetry fits are really infuriating because they conflate privacy violations with perfectly reasonable behavior.
From a quick look at the Github releases page and the acquisition articles, it looks like Audacity 3.0.0 and 2.4.2 are the most recent versions of Audacity people can fork / download from before the acquisition.
Sounds like there are already other forks on their way though.
Does not seem very much in line with their response at the time of "basic telemetry" addition [0], or at least that's my impression. Telemetry can be useful, but can also be the first step towards something much worse that we could see with this project now. Is this data worth their image and Audacity's longstanding fame as a reliable, no-fuss program?
Linking to a Reddit thread is a lazy substitute for writing a blog post and submitting that to HN instead. This topic is serious and deserved better. One coherently presented viewpoint essay would have meant a lot more, and had a lot more impact, than this incoherent Reddit discussion.
I recently moved from linux to mac and was looking for an alternative to Audacity - it turns out the audio editing portion of Da Vinci Resolve does what I need just fine.
In case someone wonders - I moved to mac because of the macbook air m1. It's modern powerful hardware and silent, which is great for creative work. Previously I was on an HP business laptop from 2013 :-P
The deal to be acquired by Muse Group was announced 30 April, and "Muse Group was formed just a few days prior to the Audacity announcement, on 26 April"
The latest release "Audacity-3.0.2" tag on github is 14th April, as is the date that the Windows 3.0.2 installer was signed. So the current latest release predates the announcement and formation of Muse Group.
The privacy notice was updated 2nd July. We don't know if that's when the new clause was added, however issue #1213 on github suggests that it was.
But I've not looked at the code, so don't know what telemetry was already in place before the project was acquired. Anyone?
I agree with news cracker, and I would like to know what open source options are available aside from this that would run in a Linux environment.... googling
Audacity is a GPL-licensed offline audio editor. There is nothing to compel such a nebulous, community-driven software effort to behave as you suggest. Not everything is made by companies, you know. This community software is being hijacked by a slimy corporate operator who is doing things we don't have to accept; such a code license means that they are not the owners of the code, we are. All that is required is a telemetry-free fork, bad publicity, and removal from distros for the bad corporate actor. Ultimate Guitar are bad actors. Denounce them. They certainly are cheeky, putting out something they didn't make with a new skin that phones home.
Except Audacity is a fully local audio editor. It doesn't need to know that the internet is a thing at all. It has no business creating network sockets.
I’d take a shoddy UI and no spyware over a good UI that’s spying on me despite being an offline application. I’ve used Audacity for years and I shan’t be upgrading until a spyware-free fork is available.
Ultimate Guitar have a history of abusive behaviour towards open source projects and I sincerely hope this is a PR catastrophe for them.
Like loan sharks care about helping poor people. No thanks to predators offering help. Audacity ui worked fine for me for over a decade. Your opinion otherwise doesn’t justify such rapacious behavior on the part of Ultimate Guitar, who basically hijacked a community project for their own ends. Bad faith actor, and I’m having trouble seeing your comment as innocent. Who do you work for here? I find your association of good ui and telemetry to be highly suspect and certainly not logical. Feel free to correct my ignorance and elaborate on your strange proposal. NSA backdoors don’t make themselves, there’s always that special someone on every committee
> Courts can compel them to log this information, so all claims about not keeping logs are just theater. The second they're ordered to by a court in the US, they will.
But the consensus around here seemed to be that courts can't do that[2].
Maybe the story is that people now know that courts can, indeed, do that.
Are you a lawyer? Why are you more credible than anyone else on this matter?
We have, in fact, seen companies publicly push back on doing additional work by claiming that it was an undue burden on them when they were not a defendant. If I recall, this was one of Apple's arguments when they were involved in their well-publicized case with the FBI.
[1]: https://www.prosoundnetwork.com/business/audacity-acquired-b...