akappa's comments | Hacker News

That's interesting. Are you aware of any stereotype about/against Italians?


Anecdata of 1: I used to work for an Italian boss who worked her ass off to become a professor. I had a lot of respect for her. But I could tell that at some level she had been fighting stereotypes (on both the nationality and gender fronts) her whole life. She hired an Italian postdoc who was basically a walking stereotype. He never came in to work, and did shoddy work when he did come in. It was a disaster. Generally I thought of her as a very "tough but fair" boss; there was one time where she got extremely upset at grad students for using the wrong instrument for critical experiments (but really, they should have known better). But. I could see steam coming off her head in her interactions with him. She eventually quit being a professor, but didn't tell him that she was quitting; he basically showed up to work one day with no job.

My understanding is that there's a north/south work ethic stereotype within Italy, and maybe that the regional attitude differences are not as well appreciated outside of Italy, more so outside of Europe. She was from the north. I don't know where the postdoc was from.


In Canada the stereotypes I heard about Italians weren't the nicest ones, but I can see the reasons behind them.

But the perception is usually neutral, it seems.


What kind of reasons? Old stuff like the Mafia, or instead something (semi-)factual like our relaxed relationship with time management?


The Mafia is not "old stuff" in some parts of Quebec (and maybe other places)


An ironic thing to say in a seed accelerator-owned website.


I think there's a sizable camp of HN members who just come here to read the various interesting tech articles, with no plans to join the cult of SV.


I think there's an obvious, more general question to be answered here: if a piece of software is good enough, meaning it "disappears" when you are trying to do your job, then why bother to change it?

Yes, the new version might enable a more efficient workflow, or might be faster to boot up.

But people hate change, and you need to invest considerable time in upgrading a piece of software (the upgrade itself + learning to navigate the new system + solving whatever goes wrong during the process). And this isn't limited just to "novices": I've heard wonderful things about Arch Linux, but my Ubuntu system works well enough and I've been using it forever, so the incentives are pretty low and the time to be invested pretty high.


Profit. How else is Microsoft (or any other company) going to make money off software no one needs or wants? Other than rehashing the same old product and pushing it onto users for more money, more surveillance, or both, they have nothing of value to sell on the software side. Even if they did have something of value to sell, over 90% of the people have already bought their product. Gotta keep the cash rolling in so they tweak the UI, maybe in a way that people hate like removing the start menu. Or they add surveillance features you can't turn off and monetize that way while pretending to somehow be secure. Or forget all that, let's just push the new update forcefully onto people's computers and pretend that it's not malware.

When you have no real, defined product that solves specific problems to sell anymore, you sell garbage that pretends to be a real product. Microsoft is far from alone here. Almost every successful software product that doesn't know when to stop building and start maintaining suffers from this degradation later on in life.


There's been some really good progress made on the tablet mode in Win10. It's not really apparent if you're using it as a Desktop OS.

Specifically, the Surface Book does a really good job of making the Windows "experience" work in a keyboardless scenario. I'm sure the same is true of the other convertible hardware that's been popping up lately.

I'll concede that this isn't the majority of users but I would argue it's not just a rewrapping of existing software.


> There's been some really good progress made on the tablet mode in Win10. It's not really apparent if you're using it as a Desktop OS.

It's very apparent to me: it puts itself back in tablet mode after every restart, and it's a pain to get that slide-out menu working with a mouse. I bet a lot of people unknowingly get stuck in it.


I don't quite follow. You seem to be saying that Microsoft makes changes just to justify the release of a new OS version. But they announced quite a while ago that they would never release a new major version. There will never be a Windows 11; you will never pay to upgrade Windows to a new version again. They adopted the macOS model, where the OS sticks to a single version and gets an incremental annual update (macOS has been on version 10 for 16 years, with no plan to ever create "OS XI", and stopped charging for updates 4 years ago). So why would change for change's sake relate to profit at all?


Because MS makes money off of selling software, as opposed to Apple's model of making money off of selling hardware. If they continue to emulate Apple's policy of free upgrades, they'll see an eventual revenue drop since they'll be dependent upon users upgrading their hardware.


The main money comes from commercial support, not from individual Win 10 disks ...


Right now Win 10 Home is $119.99. If Microsoft actually said what you claim, they're lying. Either way, they're doing it for the money. Whether that money comes from the actual software, hardware, ads, user data, or some other way, it's the reason why Microsoft will continue to push their software onto people, including people that don't want it. In fact, with the addition of ads and user tracking, I wouldn't be surprised if they drop the actual dollar cost of the software in favor of data cost a la Chrome OS and other Google products. And it still wouldn't change anything. Profit would still be the reason for them adding new features and forcing people to upgrade.


> macOS has been on version 10 for 16 years

OSX/macOS promoted their point releases to something between service packs and major version releases, keeping it at version "X" for marketing purposes. From what I can see, between Cheetah and Sierra the expected workflow of the system has changed significantly a few times.


And yet, they have set an End of Life for Windows 10 in 2025. https://support.microsoft.com/en-us/help/13853/windows-lifec...


Microsoft changes its policy far too quickly and too often (e.g. Xbox One features, Windows Phone) for this announcement to be taken at face value. Maybe it's true, but if Windows 10 market penetration does not go as expected (as seems to be the case), the strategy will change.


"you will never pay to upgrade"

I don't recall them saying that.


Obvious answer: switch to a subscription-based model. They are already doing it with Office.


That's because in most cases the "change" is 90% costs externalized by the vendor, IT, or both.

Imagine if it was 1955 and we thought it was a good idea to randomly go into every company every year or two and say "Alright, this inventory system, this time tracking system, this customer service system... All of it, right out. Progress is here folks, and you don't want to be change averse. Time for some new fresh paper."

It's ludicrous. One thing you learn very quickly in business is that companies have little or no stomach for doing things that do not make them money. Every minute or dollar spent on them looks like money down the toilet. So if Microsoft wants people to use Windows 10, they need to offer more than hand-wavey bullshit about features and progress and security, because all of that stinks to high heaven of a company trying to convince its customers to give it more money because it wants more money - not because it's offering anything of real value to the customer.


> I think there's an obvious and more general question here to be answered: if a piece of software is good enough, namely, it "disappears" when you are trying to do your job, then why bother to change it?

I find I care about the applications I use: my IDE, git client, console, and the various web apps I use. I don't so much care about my browser UI, as long as it gets out of the way and shows me what URL I'm on and renders the page.

Likewise, I don't care about the UI of the OS itself, as long as it does its job properly: lets me start and switch between apps, connects to wifi, manages the display power, and sleeps and wakes up.

When I'm looking at the computer screen, I'm looking at what I'm doing, not the UI chrome. Even now, I'm only paying attention to this comment text box and your original message. Everything else -- all the other tabs I have open, the visible bits of the OS, the window open on my other monitor -- all blends into the background and gets ignored.


Interesting. Readers of HN should know that upgrading to the newest version of Windows and installing updates is essential for software security. Newer versions of Windows incorporate security measures not found in older versions -- sometimes even taking advantage of new security features in Intel hardware.

For example, Target and Home Depot were hacked because they failed to upgrade their point-of-sale hardware from Windows XP embedded to Windows 7 embedded or later, an upgrade recommended by Microsoft. Windows XP embedded had a security flaw that was patched in later versions of Windows.


People vastly underestimate how massive, complex and heterogeneous the stacks at the likes of Target, Home Depot and Walmart are.

I've worked with people very used to working with enormously complex systems and even they say Walmart etc is on the higher end of that scale. We're talking weeks of people on site to get new software stood up.

This isn't to diminish your point about the need for upgrades, but it's nothing like a push button process.


Shouldn't these stores standardize on vendors? Why would they use different vendors which only adds to complexity unless they've bought out a different store chain and are integrating existing systems?

Also, they should pay the vendors for maintenance contracts instead of trying to do the upgrades themselves. The vendors are generally more likely to do the necessary testing and to have the skills from upgrading systems across various customers.

At any rate, as you put it, eventually they still should do the upgrade.


Even if they do standardize on vendors, they're then also dependent on, say, Oracle Retail supporting a given platform.

If you want your POS to talk to your marketing automation system, that's another integration and maybe another vendor who Oracle may or may not wish to support etc.

There's no way for a bank to standardize on vendors, as its back-office systems might have been designed in the 1980s. If it wants to add an iPhone app or mobile payments, it has to rely on another vendor almost automatically.

Making these stacks work generates huge revenues for people like CA, Automic, IBM. Process automation is big business. Big meaty huge Fortune 500 companies held together with the software equivalent of sticky tape.


Not to be crude but what you call "standardizing on vendors" I call a great way to spend your days getting screwed by said vendor(s).


Sometimes one has to pay that price. Can you suggest an alternative that reduces complexity?


This is interesting. Could you give a bit more detail?



Of course, but I was talking about the incentives of an average user, and I'm not sure they care about security as much as we do.

Just to make what I'm saying more concrete: most people in my home town in Sicily think it's perfectly OK to bring their phones and laptops to some random guy owning a tech-assistance shop and tell him their Facebook/Email passwords straight away so he can reinstall stuff and save their login for them. Do you think they care about upgrading because of "improved security"? I'm not sure they even understand what a security issue is...


You describe the proximate cause of those breaches, but perhaps the ultimate cause was the difficulty of updating that OS in the first place?


The point-of-sale terminals have vendors that produce them in large quantity and they should have the expertise to upgrade the software of the machines that they built.

One can always hire experts with a proven track record to help with the install of the new OS.

Incidentally, many people may have trouble with upgrades to a new OS because they are 1. running old hardware, 2. not running quality hardware (e.g. for Windows laptops, traditionally ThinkPads), or 3. not doing a fresh install -- i.e. instead of upgrading in place, back up the data, wipe the disk, and do a fresh install.

I use a Mac and did a fresh install of Sierra 10.12.1, then in-place upgrades for the point releases.

I have also been running Windows under Parallels and on pre-2011 ThinkPads, and haven't had problems with new versions of Windows.


Details on the Target and Home Depot attacks that would have been prevented by the upgrade from Windows XP embedded to Windows 7 embedded or later:

http://www.dailytech.com/Appalling+Negligence+DecadeOld+Wind...


Most HN readers probably don't even use Windows. I think it's objectively the worst OS for a tech enthusiast.


All the time I've invested in acclimating to Windows 10 feels like a burden, while all the time I have spent tweaking OS X feels delightful. People don't hate change... we hate change that is painful. Windows 10 (to me) embodies painful change.


A lot of people resist change to things they are familiar with.

Between each significant revision of Apple's desktop OS, my mother would get frustrated at settings that were moved elsewhere/renamed/removed entirely, for example. OSX is no stranger to unnecessary changes (or at least that's what my mother thinks of them).


Here's one small change to macOS that drives me nuts. At some point they changed how application hiding works.

It used to be that cmd-h hid an application and moved it to the end of the application stack. If you wanted it back you could use cmd-shift-tab.

Some suit, in their wisdom, decided to change that so cmd-h now only hides the app.

It's a tiny change... that's a stone in my shoe every single day. IMHO that change was completely unnecessary and breaks my workflow.


I get (maybe) your argument when it comes to Windows 7 -> Windows 10, but I think you're overestimating how hard it is to get up and running with Arch Linux. Following an Arch install tutorial shouldn't take anyone here on Hacker News more than 2 hours to get their DE, networking etc. up and going.

You should try it. It's fun!


Arch isn't that hard to set up - I'll agree with you there. Where it falls down, though, is when you need to install something where there isn't a package for it.

Now, of course, you could just go old-school and "manual install" whatever you want. But let's say you didn't want to do that, because you got tired of having a system full of cruft due to all these funky pieces of software scattered about the system that the package manager had no clue about - oh my!

So - you want to install the software as a package under Arch. Oh - and just to make things more fun, the software is something proprietary and closed-source. There are more than a few bits of useful Linux software out there like this - much of it niche software that a) probably no one else in the Arch community uses, and b) is proprietary - so you can't distribute it anyhow.

What does Arch require you to do in this situation? Well - last time I looked anyhow - and that was a couple of years ago, so maybe something has changed... It seemed to me to create an install package for a piece of software, the process (according to the docs) was to install the software "manually" and keep track of where everything goes. Then, once you have done that (and the software is running fine), you are supposed to create your package manifesto (or whatever it was called) that tells the system how to install it, then take all the parts and bundle them up (as a compressed file of some sort) with that manifesto. Oh - and then manually "uninstall" your software you installed, then use your new package to re-install your now-packaged proprietary software.

That's basically the process I read about for any kind of software package for Arch; it was a very heavy manual process. While other package management schemes do require some manual effort, none of them that I recall seemed to require as much effort as Arch's did. This wouldn't normally be an issue with most open-source software, because once you made that package, it could be distributed and used by the community. But for a proprietary software package, only you could use it - so it was a ton of extra work for little gain in the end.

Don't get me wrong - I liked Arch, and their community support and forums, wiki, etc - is pretty top-notch (I like to use that part of the ecosystem myself for help and hint purposes when I need it). I honestly think it is a great distro, but it does have some drawbacks to it (and they probably don't have good solutions, either).


> It seemed to me to create an install package for a piece of software, the process (according to the docs) was to install the software "manually" and keep track of where everything goes. Then, once you have done that (and the software is running fine), you are supposed to create your package manifesto (or whatever it was called) that tells the system how to install it, then take all the parts and bundle them up (as a compressed file of some sort) with that manifesto. Oh - and then manually "uninstall" your software you installed, then use your new package to re-install your now-packaged proprietary software.

This might be technically correct (not completely sure), but it's hugely misleading. Packages are defined by PKGBUILD scripts. When you run `makepkg` to create a package, it creates a directory called $pkgdir. In your PKGBUILD, you just delegate to whatever installer came with the source code & tell it to install into $pkgdir, e.g. `make DESTDIR="$pkgdir" install`. makepkg takes all the files in $pkgdir and tars them and bam - there's your package.

If you look in most PKGBUILDs, they contain some metadata (where to get the sources, etc), and then it's more or less:

    build() {
        cd "$srcdir" && make
    }
    package() {
        cd "$srcdir" && make DESTDIR="$pkgdir" install
    }
That's all you need for quite a few packages.
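
If it helps to make that concrete, here's a minimal sketch of how such a PKGBUILD is typically exercised (run from the directory containing the PKGBUILD; the comments describe standard makepkg/pacman behavior):

    # fetch/extract sources, pull in build dependencies via pacman, build the package
    makepkg -s
    # install the resulting package file
    sudo pacman -U ./*.pkg.tar.*
    # or build and install in one step
    makepkg -si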


cr0sh posited that the application being packaged was not in source form. So perhaps the hugely misleading thing is to argue against xem based upon how one packages up things that are in source form. (-:


Well as long as it ships with an installer, the above method still works - just omit the build step altogether (do a chroot if the installer doesn't offer a choice of where to install). If it doesn't have an installer, but comes self-contained, then just `cp $srcdir $pkgdir/opt` & call that good.

And if you're dealing with a non-source package that doesn't have an installer and isn't self-contained, then installing/packaging it is difficult on any OS - not just Arch!

For reference, here's a PKGBUILD for a non-source, proprietary application. It's pretty much as above, but includes some fancy .desktop files & such, too. https://aur.archlinux.org/cgit/aur.git/tree/PKGBUILD?h=disco...
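
To make the binary-only case concrete, a minimal PKGBUILD along those lines might look something like the sketch below ("someapp", its version and paths are made up purely for illustration):

    # Hypothetical binary-only package; names and paths are illustrative
    pkgname=someapp-bin
    pkgver=1.0
    pkgrel=1
    pkgdesc="Hypothetical proprietary app, packaged from a binary tarball"
    arch=('x86_64')
    license=('custom')
    # tarball sits next to the PKGBUILD; makepkg extracts it into $srcdir
    source=("someapp-$pkgver-linux-x86_64.tar.gz")
    sha256sums=('SKIP')
    package() {
        # copy the self-contained tree into /opt and expose a launcher on PATH
        install -d "$pkgdir/opt/someapp" "$pkgdir/usr/bin"
        cp -r "$srcdir/someapp-$pkgver/." "$pkgdir/opt/someapp/"
        ln -s /opt/someapp/someapp "$pkgdir/usr/bin/someapp"
    }

As the linked example shows, real ones typically add .desktop files and icons on top of this.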


In fairness, that's pretty much how you are going to have to package binaries-only software with most package management systems. Install the binaries. Make a package description of those binaries.

It's manifest, by the way.


Way more painful to make packages for Debian, IME, but I avoid making packages altogether:

> Now, of course, you could just go old-school and "manual install" whatever you want.

Yep, stop right there, do that.


I worry more about the long term support workload than the initial install.


I found this to be a real problem with using Arch in production. Updating is a bit of a catch-22: if you upgrade frequently, the workload of keeping up with the minutiae of not breaking your system is overwhelming. And the longer you leave it, the closer the chance of breakage gets to 100% when you finally do. Result: you inevitably fall off the update treadmill at some point, usually during some sort of crunch time, and then never update again, because who wants to tell their boss they spent a day debugging a self-inflicted problem? Safer to leave it alone and get on with your work.

This is why I switched back to Debian. Updating is stress-free and there's no downside to doing it as often as you like.


fwiw, I find the long-term support pretty minimal. I'm only slightly more conservative in my use of `pacman -Syu` than I would be with `apt-get dist-upgrade`. If I notice a kernel or DE upgrade, I might hold off until the weekend on the off-chance that it needs a bit of extra maintenance. I appreciate handling these changes in smaller batches, as opposed to the 6-month system-wide major upgrade cadence. Of course, if I were just jumping between LTS releases, maybe only dealing with serious maintenance every other year would be an appealing trade-off.


(of course, I don't mean to undermine the argument for hesitating to jump to Arch on the grounds of maintenance. It certainly takes thinking about maintenance more often.)


> I think you're overestimating how hard it is to get up and running with Arch Linux

Not really.

Arch was a PITA. FreeBSD has a smoother install (not a compliment to Arch).


And OpenBSD installation is even smoother than FreeBSD!


I didn't touch OpenBSD from, let's call it, the latter half of George Bush's presidency through the first half of the Obama administration. My first install of OpenBSD in years was a moment of "oh this old thing still?" followed by "huh, that was just as easy as I remembered."

It's all about audience. OpenBSD's installer is a thing of beauty for technical people. I can see where it would be daunting and texty to someone who's never touched a command line, but that isn't the audience.

At this point, I feel forcing technical users through a plodding GUI install wizard is as cruel as making grandma install OpenBSD with a complicated RAID configuration and volume encryption.


I installed Arch earlier this week. It took two beers' worth of time to do.

Followed the instructions, hit a few snags, and resolved them by reading the wiki.


Indeed, Win7 being good enough and not degrading in any way is a big challenge facing Win10 adoption, and that has been said enough times before. Once Win7 is out of support, be it by popular applications or simply for security updates, we may see things changing.

There's also the factor of familiarity to consider. People buying new machines will learn the new system, and over time, as they learn how to use it, they will prefer it to older system versions. However, this is another long-term factor, and as such Windows 10 seems doomed to only slowly gain adoption.

Note:

Considering the reputation Windows 10 got for itself at launch, this is less surprising. For non-technical users, the upgrade to Win10 has probably been pointless or frustrating: new concepts, new default applications and system UIs that these users have had to learn in order to perform their ordinary tasks. What's more, for many users, upgrading their system to Win10 left them with an unusable machine. Win10's launch was a mess.


Some say Windows 2000 was good enough already. Everything worked well.


Windows 2000 was the best version I've ever used. Back in the day, when XP was relatively new, if I bought a machine and it came with XP I would try and retrofit 2000 onto it. The 2000 installation would be noticeably faster for what appeared to be a similar set of features.


Currently checking the history of Windows 2000. It seems W2K was aimed at enterprises while ME was aimed at consumers. A bit less than 2 years after W2K's launch, WinXP came out and unified the consumer and enterprise lines.

Then, almost 8 years after WinXP's launch, MS released Win7. There are a lot of improvements between the two, but I wonder what the major factors were that made users upgrade to Win7. I only remember a few pain points in XP that were relieved in Win7: file/app search, connecting to the internet, system updates, and window management.


Personal reasons for upgrading to 7 in order of importance:

64-bit memory space support (crucial, considering enormous browser bloat after the FF 3.5 era)

TRIM support for SSDs (I switched to 7 when consumer SSDs started to become affordable)

DX11+ for gaming (DX10 was a wash in Vista, but by the time I switched to 7, there were some games that actually did look noticeably better with DX11)

Half-Open connection fix for torrenting.

But I'm a power user, so reasons for regular users may vary.


I upgraded to Windows 7 over XP for mainline x64 support and the new window management features (Aero Snap). Also, being able to sort system tray icons, as well as taskbar items, was very welcome.


I upgraded from XP to 7. For me the main plus was stability - with XP you had to reinstall from an image when it got screwed up quite regularly.


Is Windows XP still "good enough"? Even if it still got security fixes, the actual security model it was built on is definitely no longer adequate for the modern world. I'd very much argue Windows 7 is rapidly falling into that territory as well. It's easy to make this "forever Windows 7" argument today, just like it used to be that people could say they were never leaving XP.

But in reality, we're rapidly approaching the point where we absolutely need to be able to sandbox apps and segregate them from the platform like with UWP.

While the basic functionality of the OS you need may not change much, the constantly moving target of security definitely will, and both 8 and 10 introduced significant improvements in security.


And before that people were vowing never to upgrade to XP, and before that people said 98 would be the last version of windows they would ever run...


The difference is that after XP, the majority of users didn't upgrade.


Windows 7 did get a majority of the Windows market.

Windows 10 will get a majority of the Windows market. In fact, in some areas -- the USA and the UK, for example -- Windows 10 overtook Windows 7 late last year, on StatCounter's numbers.


Yes, when you trick and manipulate users into upgrading, it results in a substantial short-term boost in market share.


I don't think moving people from one version of Windows to another counts as an increase in market share.

And since they gave away a free upgrade for the life of the device, at a considerable cost in terms of servers and bandwidth, it wasn't profiteering either.


And since they gave away a free upgrade for the life of the device, at a considerable cost in terms of servers and bandwidth, it wasn't profiteering either.

Pro tip: when a for-profit company does something that appears to be uncharacteristically altruistic...

...Aw, heck, you know what? Never mind. Enjoy the garden.


There were advantages for users and advantages for Microsoft. Both sides won.

As for your "garden", anybody can build their own Windows PC and sell it, without restriction. Anybody can write Windows software, and sell it. Anybody can use a Windows PC and software for any purpose they choose.

If you want to attack people for restrictions, you can find far more of them in other parts of the tech world.


> Is Windows XP still "good enough"?

If you have hardware that only runs in XP you don't care.

I have more than a few Windows 7 installations that are nothing but hosts for virtual machines that run XP.


I agree with you, but looking at the problem from Microsoft's perspective, they have to sell you something, whether you 'need' it or not. It's interesting that Microsoft is the only company still selling a consumer OS, everything else comes free with the device, including updates. This problem might be part of the reason for that.


The best way to maximize sales is to align your interests with that of your customers. I wish Microsoft would realize that, because Windows 10 is heading in the opposite direction.


The world has moved heavily into mobile, touch-screen operations, use of sandboxed apps from online stores, AI assistants, cloud integration, biometric security etc etc.

Windows 10 has moved heavily into mobile, touch-screen operations, use of sandboxed apps from online stores, AI assistants, cloud integration, biometric security etc etc.


Microsoft has successfully trained me that OS upgrades can be very traumatic. I consider myself an early adopter, and I always run cutting edge on my phone or win laptop, but I only ever upgrade my primary workstation when it dies and I have to.

Maybe in place upgrades are fine now, but I haven't trusted anything but a fresh install for as long as Windows has existed.


Not just OS upgrades - even updates alone are problematic.


Users may perceive the system as operating satisfactorily, yet it may not be so, e.g. because of security issues, which also pose dangers to other users, so the "it doesn't matter to me" attitude doesn't hold.

Given that the upgrade was free, just as in the case of a Linux upgrade, the "good enough" argument should take these factors into account as well.


I'm sure you would agree that security updates/upgrades are an exception to your rule.


Forced, untimely, security updates are certainly unwelcome and the #1 reason I'm off Windows completely at home.



MS forces the change to force a sale


You're aware that the overwhelming majority of those forced updates were free, right?


Pro tip: If it's "free," it just means you're the product.


Because developers want to make more money. This is why the app store format is so incredibly popular amongst the developer crowd - it forces people to keep paying you to upgrade the app to follow breaking Apple/Google changes, just to keep its functionality the same and keep it appearing for users of new phones. Fifteen-year-old software that works well and that you buy once does not bring in a steady income.


I think that you're wrong about why developers love app stores. It's about someone else solving all of the problems which aren't "make your app". It handles updates, feedback, discoverability, CC info, password resets, etc.

I've never had the experience you describe; which app(s) are you talking about which need to be bought repeatedly with no new functionality?


Not a fan of the app store model, but I've literally never heard of people paying for updates on app stores. In fact I've heard many developers complain about the lack of paid upgrades on the Mac App Store.


that was... anticlimactic, to say the least.


What did you expect from a very simple ARG called 'save the date'?


Anything except... sigh, I guess nevermind. I fell for it.


I'm more impressed that they actually created those numbers to be seen by satellites


They didn't. Go to the actual coordinates on Maps and they're not there.

It's Photoshop.


Actually, look at the videos on the ending page. You can see that they did create them. There are people in some of the gifs.


Well, yes and no.

As you know, the Italian train system can be divided into two layers: "Le Frecce", fast trains that only stop in important cities, and "Regionali", (relatively) slow trains that stop even in tiny towns. The former are fast, pleasant and reasonably priced (especially the Frecciarossa, the fastest trains in Europe IIRC), while the latter are extremely cheap thanks to state subsidies, but the experience offered is awful: perpetually delayed, dirty and overcrowded.

People usually complain about the "Regionali", also because they are used on a daily basis by commuters.


> perpetually delayed, dirty and overcrowded

The regional service deserves this reputation because it has been neglected for years as Trenitalia built the Frecce, but I think if you return to Italy you'll be pleasantly surprised: Trenitalia has been quietly upgrading the regional trains to newer, double-decker cars (almost 2x the capacity), and from personal anecdotal experience the on-time performance seems better than it used to be, as well.

Obviously YMMV depending on what region you're in, the time of day, etc. but there's definitely been improvement on the regional front.


This was a very interesting read, thanks for sharing it with us. As an Italian, however, I have to say that wolf-calls here are definitely perceived as very rude and as a thing of the past.


What's a wolf-call?


It's not clearly defined, but it ranges from, narrowly, the howling that Tex Avery's wolf character does when he sees an attractive female, to, more generally, any equivalent reaction: whistling, clapping, an openly appreciative remark.

Practitioners generally describe it as a positive reaction; feminists see it as sexist and objectifying, up to classifying it as sexual aggression.

