I've been putting together fanless PCs for my own use for a few years now, also disliking the aircraft-taking-off sound of many desktop PCs when they are switched on. I started with the Zalman Reserator[0] water cooling tower, then moved to the Zalman TNN 500-AF[1] where the case was a giant heatsink (it was an amazing piece of kit), although more recently I have been using ready-made fanless PCs (which are more widespread now that most of the components run much cooler). The site at http://www.silentpcreview.com/ used to have useful information for people building quiet PCs, and https://www.quietpc.com/ still has some good components (and ready-made machines) too.
I can recommend the NoFan passive heatsinks. The CR-80EH costs less than many premium HSFs and will easily cope with an i7-6700. It's now possible to achieve dead silent operation without compromising on performance or spending a fortune on exotic parts.
Different people have different noise thresholds, and different abilities to filter out noise. I couldn't sleep next to a running computer, even one using large low-speed fans, until I built one using a Reserator. I have friends who can sleep next to what are essentially helicopter turbines. The big annoyance of the Reserator is that it's a pain in the ass to carry around.
The TNN didn't use watercooling; it used heatpipes. These are not waterblocks, they're heat absorbers. The part that's bonded to the case would be a heat spreader.
A regular case wouldn't be that good at dissipating heat though.
Funny/scary anecdote: I have a PC whose CPU fan stopped working about 18 months ago. The thing has been running as my home server, reliably, and it is very stable.
Of course, the fan sits on top of a huge heat spreader, which still works passively, the case is open, and the machine doesn't have all that much to do serving me. On hot summer days, the system log frequently tells me that the CPU has throttled itself to prevent overheating, but that's about it. And that's a Core 2 Quad, so it has not exactly been engineered for this scenario. And still, to my utter surprise, this thing runs along merrily, with an uptime of currently 148 days.
If I can create a "fanless"[1] computer by accident that works and is - in the face of the rather modest load I put on it - rock-solid stable, building one intentionally should not be that hard.
I own two notebooks, for Pete's sake, that are fanless, and they work well.
So, yes, the idea that a computer must make some kind of noise to show you it is working is gradually becoming a thing of the past. If sufficient cooling can be achieved without moving parts, that is preferable.
OTOH, the sounds devices emit used to be indicators of what was going on inside them. With SSDs, you can no longer hear your hard drive work. When I sit at one of our CAD workstations and watch a designer model something in Autodesk Inventor, more and more fans are spinning faster and faster as the temperature rises - the noise level correlates to the workload of the system.
If the computer becomes absolutely silent, we need substitutes for those metrics. Blinkenlights, maybe.
[1] Strictly speaking, the power supply still has a fan, and that one is not all that far away from the CPU, which might improve the situation somewhat.
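Regarding the blinkenlights idea above: a software substitute is easy to hack up. Here's a minimal sketch that prints a CPU-load bar in the terminal once a second; it assumes the third-party psutil package is installed (pip install psutil), and whatever actual indicator hardware you'd drive is left as an exercise.

    # Minimal "blinkenlights substitute": a CPU-load bar, refreshed every second.
    # Assumes the third-party psutil package is installed (pip install psutil).
    import psutil

    def load_bar(width=40):
        while True:
            pct = psutil.cpu_percent(interval=1)      # average load over 1 second
            filled = int(width * pct / 100)
            print("[" + "#" * filled + "." * (width - filled) + f"] {pct:5.1f}%", end="\r")

    if __name__ == "__main__":
        load_bar()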
I believe a lot of fans in modern systems are sized for worst-case rather than average-case situations.
As far as I know, there's nothing in the Dell et al. warranty that prohibits you from running a high-load system in a confined nook in a dust-clogged room with a high ambient temperature. So cooling needs to be designed to be sufficient even in pathologically stupid situations.
> If the computer becomes absolutely silent, we need substitutes for those metrics. Blinkenlights, maybe.
At least once a month, I notice my MacBook Pro (with the normally silent fan) going very loud, I investigate, kill the offending process (usually Chrome or something from Adobe) and continue with my life.
On the other hand, that fan is the only thing that can break, and replacing it is probably expensive, so I have mixed feelings about passive cooling.
Perhaps the new keyboard-screen-bar-thing can be programmed to show system load.
The configurations that allow multiple 60Hz 4K monitors appear to depend on having either an NVIDIA GeForce GTX 950 or an NVIDIA Quadro M4000 graphics card.
But the NVIDIA GeForce GTX 950 says it needs a 350W power supply [0]. The NVIDIA Quadro says it needs 120W and has an "Ultra-quiet active fansink" [1] ... i.e. quiet, but it still needs a fan.
Can anyone therefore confirm if the Airtops are completely passively cooled? Is it just the CPU that is passively cooled, or the graphics card too?
It is entirely passively cooled. The GTX 950 has a TDP of 90W [0]. The 350W recommendation is for your entire system; it's not asking you to add 350W to the size of your current power supply. The Airtop can passively cool about 200W, so they put in a GPU with a 90W TDP and a CPU with an 85W TDP.
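In other words, the budget works out roughly like this (TDP figures from the comment above, not measurements):

    # Back-of-the-envelope check of the Airtop's passive cooling budget.
    PASSIVE_BUDGET_W = 200   # claimed passive cooling capacity of the case
    GPU_TDP_W = 90           # GTX 950
    CPU_TDP_W = 85           # the i7 option

    total = GPU_TDP_W + CPU_TDP_W
    print(f"CPU + GPU: {total} W of a {PASSIVE_BUDGET_W} W budget, "
          f"{PASSIVE_BUDGET_W - total} W of headroom for RAM, drives and VRMs")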
Awesome. I bought a NoFan from quietpc.com last year but I struggled to get a completely fanless setup capable of driving 4K at 60Hz (there was exactly one fanless card they offered which had DisplayPort 1.2).... Now though it looks like Airtop will be my next upgrade!
An ordinary GPU can be made fanless with an aftermarket cooler. The Arctic Accelero S3 will handle 135W TDP cards without a fan. That's sufficient to cool a GTX 1060. You will need a large case with plenty of ventilation.
My next system is going to be an Airtop as well! I'll be in the market at the beginning of next year, so I'm hoping that they'll have a new model by then with the 1050 or 1060, and that they'll get one of the newer processors that packs more performance per watt.
Their video makes it look like they remove both the CPU and GPU default cooling and replace it with their case. So it wouldn't matter if NVIDIA says their card ships with a fan -- these folks rip that cooler off and replace it with their own.
Funny, I was looking at these guys' latest offerings just earlier today, before the HP announcement.
This is built by the same guys that built the FitPC (http://www.fit-pc.com/). I've had one of their little 5-watt fanless systems running 24/7 in my home for a couple of years now. I love it. Never had a problem with it. It's easily one of my favorite pieces of computing equipment.
I was looking at their stuff earlier because I'm considering getting some more from them.
I find the hum of lots of computer fans comforting. I did a lot of my work at uni very late at night in the computer labs when it was empty, just the constant noise of a hundred or so desktop machines. It's my ideal working environment.
That's awesome. http://mynoise.net has similar ambient sounds; they have a section on "Transports" that contains (to name just a few): Spaceship, Railroads, Sailboat, Aircraft cabin, Flying Fortress, etc.
It's awesome when I'm writing short stories; they've also got stuff like RPG Dark Forest, Dungeon and Battlefield that helps me get in the mood if I'm writing a fantasy story.
IIRC it's a sound engineer's hobby side project. I've got nothing to do with the project, just been a happy user for years.
My fear with these designs is that because the chassis is the heatsink, there is the issue of the continuous deformation (expansion and contraction) of the chassis body/heatsink, and you get mechanical stresses on the mobo that eventually cause internal vias to break. So yes, it may work fine for 12 months, but the overall lifespan of the product might be significantly shortened.
Not going to happen. They're too small to expand much.
Aluminium is about 22 microstrain per degree, apparently. Say it rises to 60°C and is 30cm long: you're looking at about 0.26mm. Easily accommodated by flex in the mountings.
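For anyone who wants to redo that back-of-the-envelope calculation (assumed values: ~22 microstrain per kelvin for aluminium, a 30cm chassis, heating from roughly 20°C room temperature to 60°C):

    # Thermal expansion of an aluminium chassis, rough numbers only.
    alpha = 22e-6        # linear expansion coefficient of aluminium, per kelvin
    length_mm = 300      # chassis dimension
    delta_t = 60 - 20    # rise from ~20 C room temperature to 60 C

    expansion_mm = alpha * length_mm * delta_t
    print(f"Expansion: {expansion_mm:.2f} mm")   # ~0.26 mm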
As long as there's enough area to dissipate over (if it's only warm to the touch), there shouldn't be enough heat build-up to cause it to wear faster, right?
I own and have built a number of these and have never seen this. The cases are much thicker and more rigid than you'd expect. 5+ years for a few of them now.
Last PC I built I looked very hard at fanless, but eventually decided to choose my fans well and run them as slowly as I could get away with to keep the computer cool.
It works well - the swoosh of the 120mm on the back is just audible when the computer is working hard (with the computer 4ft away from me). Otherwise I don't hear it.
Oh - and when I game occasionally I hear the graphics card fan, but that was expected. The fan only runs when needed (ASUS Strix) so is silent in normal use.
Yup, large (100+mm) and slow (<1000 RPM) fans are the way to go. Put it under the desk (desktop to deskunder?), use SSDs and a noise-insulated case, and that's a fairly quiet machine. And that's before going overboard with liquid cooling.
Gaming has lots of sound to mask any kind of noise from the case, unless it's something quiet and emotional. I have 1000W 7.1 speakers, and never notice case noise.
I have a four year old GTX 680 with the blower reference cooler (fairly quiet). I've blown air into the outside vents to get the dust out, but a few weeks ago, I took off the shroud for the first time, and was amazed that there were no dust bunnies anywhere, even though I hadn't blown the dust out for at least a year.
I recently got my parents a fanless desktop computer. There are a lot of options, but I went with a cheap Shuttle XPC slim barebone [1]: it is tiny, has a really nice industrial design and two front-facing COM ports (!).
Yes, I noticed that too. I just ended up going for a cheap ~6 Euro USB adapter. On the plus side you get an SD card reader. (And besides Bluetooth it has great connectivity: 2x LAN + WLAN with external antennas.)
"Two serial ports: Many PCs do not have these legacy ports any longer, since they have been superseded and replaced by USB for most consumer applications, but they are still commonly used for applications such as industrial automation systems, scientific analysis, POS systems and other such fields of application."
I am using this https://www.amazon.com/Qotom-Q190G4-Celeron-Processor-Barebo... machine as a Linux router and also as a media player. It's astonishingly cheap for what it is, and it is not an Atom, a CPU family I really dislike because of its incredibly slow single-thread performance. This is not fast either, but it's about on par with an i3-4020Y, so don't be put off by the "Celeron" marking. It's very similar to the Shuttle @dpfu mentions in this thread, except a little different in ports and appearance and 20% cheaper. The Shuttle one has two DIMM slots while this only has one.
Qotom boxes are great! I run one as a router with pfSense, and it gives me no trouble. If I ever start using my TV again, I plan to get another for media center use.
One note of interest: depending on the shipper, you may need to pick it up in person, as photo ID may be required if it comes through customs. This was no issue for me since the DHL depot at BWI is less than a mile from the light rail stop - I even got to see a couple of 777 and 747 freighters up close! - but may be inconvenient depending on shipper and depot location. Still very much worth it IMO - the price/performance of Qotom hardware is extremely impressive, and nothing I could more easily source compares.
Yeah, when I looked at the price I was extremely skeptical of getting four Intel Ethernet controllers, but there they are... LOL, I remember when a four-port Ethernet card cost more than this whole machine :D
I'll second the Qotom boxes. I use one as a firewall and run Sophos UTM Home. It's fanless and the integrated heatsink gets kinda warm to the touch, but with some air flowing over it (ceiling fan) it works great.
Checked this out last night when it was linked to in the HP post. I like the direction these fanless PCs are going in, but as others have said, it's more or less a solved problem.
Fanless laptops are where I'm now looking. The UX360CA Zenbook is fanless, low-power, and very reasonably priced (at least in the USA - still waiting on price to 'settle' over here). Silence would be very welcome for night reading and viewing.
> Fanless laptops are where I'm now looking. The UX360CA Zenbook is fanless, low-power, and very reasonably priced
The fanless Zenbooks are fantastic. But they're also really low power with the Core-M CPUs with a TDP of 4.5 Watts [0]. That's in a power budget that could fit in a tablet.
The Airtop-D has a Core i7 CPU, which has a TDP of 65 Watts (and can be tuned down to 37W) [1]. You can roughly expect it to have 10x+ more computational power.
You will be sorely disappointed if you expect those CPUs to have a 10x difference. First, you have a 2.75x clockspeed difference and then double the core count, so 5.5x is what you could expect if they were the same architecture. The 6Y75 is Skylake (don't be misled by the "Core M" moniker; these are the same architecture as their larger-wattage brethren, just with a lower TDP - they used to mark them with a Y postfix like i3-4220Y, and the Y is still in the moniker) and the 5775C is Broadwell, but it has eDRAM, so the architectural difference is negligible here http://www.anandtech.com/show/9483/intel-skylake-review-6700.... So in the real world, you can expect to see a 3-5x speedup.
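The naive scaling in that estimate looks like this (base clocks are my assumptions: Core m7-6Y75 at 1.2 GHz, i7-5775C at 3.3 GHz, both treated as the same architecture):

    # Naive upper-bound speedup: clock ratio times core-count ratio.
    core_m_clock_ghz, core_m_cores = 1.2, 2   # Core m7-6Y75 (assumed base clock)
    i7_clock_ghz, i7_cores = 3.3, 4           # i7-5775C (assumed base clock)

    naive_speedup = (i7_clock_ghz / core_m_clock_ghz) * (i7_cores / core_m_cores)
    print(f"Naive upper bound: {naive_speedup:.1f}x")   # ~5.5x; real world more like 3-5x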
Yes, 3-5x (peak performance) sounds quite reasonable. But 10x is not out of the question under high load for extended amounts of time (like compiling big software) when you take the form factor thermals into account. A laptop like that will start throttling quite aggressively quite quickly.
The Zenbook I have isn't a powerful computer by any measure. It's a fine laptop for light use but the CPU and thermal characteristics aren't suitable for intensive computation. I'd expect any fanless desktop computer to wipe the floor with it when it comes to perf.
This is why I rent a server at Hetzner. Considering you can get an i7-3770 for 33.61 EUR a month, it's pretty economical. It also serves as a backup (2x3TB) and occasional webhost.
Ah, that explains it. Bear in mind I'm thinking of upgrading from an AMD E-350 Fusion CPU, so I know I'll be blown away by the performance of a new machine. It's just picking the right form factor.
I can't wait to see how quickly mutt and vim load. [Removes tongue from cheek]. In all seriousness I do run a VM, a situation I'm trying to address by writing the main software tool I use in Windows as a Vim plugin. The VM works for now, but is obviously a strain on a minimally resourced system.
And if I'm completely honest I'd love to be able to run my business and do all my work from a Raspberry Pi and/or netbook (think Chromebook pricing but entirely Linux friendly). I'm only editing text, why complicate things!?
Yes, absolutely. I'm not comparing it to the Core-M/iX range though, just making a point that in my situation either of those will be a vast improvement. Despite the fact that the E-350 does what it needs to just fine right now, it could be a touch quicker.
Thanks for that. It's something I've been trying to get to the bottom of, with a very limited understanding of CPU comparisons. I understand clockspeed is not something that can be directly compared, where MIPS might be more of an apples-to-apples comparison?
In that light, the M7 turbos up to 3.1 GHz, where the i7 turbos to 3.7 GHz. That's with double the cores/threads. On paper not a 10x improvement, but in terms of MIPS it could well be. In reviews people don't seem to be fazed by the new CPUs; they say they can watch HD videos and run VMs as normal, but nobody has yet given a succinct summary of the limitations. Not to mention that in these specific 2-in-1 Zenbooks I've seen, the CPUs are Core M3s, not M7s.
Aside: I didn't know the Core-M CPUs couldn't handle more than 16GB RAM, that's an interesting stat from that spec sheet.
The decision between new desktop or laptop is still to be made. A situation where something like the Zenbook could remote into a more powerful desktop sounds good but in practice would it work - I sometimes prefer to work without an internet connection, completely remotely. Sure I could sync the machines with git, rsync or similar, but the laptop still draws me in. Particularly the 2-in-1 form factor. My Kindle (keyboard!) recently went to Davy Jones' locker and I'm quite happy reading on my netbook if the settings are right. So then the tablet + desktop combination comes to mind. I don't know. Power vs portability still an issue in 2016. Hopefully not for much longer!
> I understand clockspeed is not something that can be directly compared, where MIPS might be more of an apples-to-apples comparison?
There's a joke that MIPS is an acronym for "Meaningless Information Provided by Salesmen".
It's just not a good measure of overall performance, because there's no guarantee that it translates to real world performance.
The only way to get good performance indicators is looking at benchmarks, and with an emphasis to benchmarks that resemble the workloads you expect to be running.
For ballpark estimates (of similar hardware), the power consumption is a good measure because the performance per watt figure is typically quite close for similar CPU architectures. It doesn't work for GPUs or CPUs of very different architectures, though.
For the comparisons you linked on the Surface, it will tell you how the CPU works on that particular form factor. But it's not a good estimate of the CPU power in general, because a small device like that is always going to be thermally constrained. You can't take this conclusion and apply it to another kind of laptop/desktop.
To go down the rabbit hole even further, common benchmarks will run with one-off drivers to give the appearance of better performance.
I've seen vendors skip mem-zero, skew buffer sizes, overclock, disable thermal limits and all sorts of other shenanigans specifically targeted at gaming benchmarks.
The only real way to evaluate hardware is to run the exact loads (or a simulacrum of them that's not disclosed to the vendor), which is probably overkill for consumer hardware.
I have been using a laptop and a docking station for a while now. I think this is the optimal configuration.
- no / very little noise
- no need to synchronize between work / home (Given that I also have a docking station at home, I essentially use my laptop as a portable hard drive with a screen)
- I can take my current work to meeting rooms anytime.
Doesn't that heavily depend on usage? At least I've never had a laptop which wouldn't turn into a noise generator once you start compiling, computing or heavy imaging/editing.
But yes, a single machine plus a docking station at home and one on the job isn't a bad solution.
> Doesn't that heavily depend on usage? At least I've never had a laptop which wouldn't turn into a noise generator once you start compiling, computing or heavy imaging/editing.
Exactly. For the past decade I've built desktop computers that are silent no matter the load (I don't do any GPU heavy work). I really cannot say that for my work laptops.
This is awesome but expensive. The base configuration is $800 ($795) for case+motherboard+power_supply. I'm used to $50 case, $200 mobo, $100 ps. $800 is a lot more than $350.
Maybe you'll get $450 of enjoyment out of a sweet silent workstation.
http://airtop-pc.com/product/airtop-customized/
Well, you should at least compare apples to apples. A fanless power supply will cost you more like $200, a quality small case will cost you $200, and a fanless CPU cooler like the NoFan will cost you $100, so you're at about the $700 range, which is not too far off. A $100 premium to get all of this integrated, and able to cool 200W (as opposed to about 100W of cooling in the generic solutions), is a lot less extravagant.
I've got a MintBox 2, which is a fanless PC produced by CompuLab (same manufacturer as the Airtop).
From what I can gather, CompuLab's primary market is industrial/embedded PCs. As such, they have extremely long SKU lifetimes.
The MintBox 2 for example comes with a 5 year warranty, and the "Long Term Availability" of the Intense PC SKU (released in 2012) is 2019. So they will still be selling 3rd Gen Intel parts in 2019.
The MintBox 2 cost $599 US when released, and if you check the price today it's still the same $599 as on release day. So don't expect the models to get any cheaper over time.
The only issue I've had thus far with the unit is that it's very picky on memory. You have to install modules from their QVL or you'll end up with memory errors and crashing.
Otherwise, if you want a compact fanless PC, you don't want to build it yourself, you want a long warranty, and you're willing to spend Apple-level money, then CompuLab has the product for you!
Arguably they've been around for a while now (I have two fanless shuttle mini PCs running on my desk, and one laptop made fanless by stabbing a screwdriver through the vent holes into the fan). It's just that whenever you want the last three inches of extra performance, you pay a heavy price with respect to thermal power.
I was hoping for something like this with the Skylake Iris 550. Top-of-the-line perf/watt (for a CPU+GPU). GPU fast enough for light gaming/WebGL, or older games.
There's an Intel NUC that's close (M.2 slot + Iris 540), but it's a little pricey at $340 and uses the 15-watt version (which throttles very quickly) instead of the 28-watt version that can actually sustain a decent level of performance.
Fast, quiet, decent CPU+GPU shouldn't really be that hard.
Seems like GPUs are getting there, with the GTX 1050/60/70 having tiny half-length cards available, but I've seen very few cases designed for them.
I've owned a couple of NUCs. One that was fanless (Gigabyte BACE-3000), but a bit slow. I upgraded to a Gigabyte Intel i5 NUC, which has a fan but I can't hear it at all. Brilliant machine.
I think the "quiet pc issue" is a solved problem provided you aren't building a massive gaming machine.
Interesting. The whole case is a heat sink directly applied to the CPU.
Although 100W of cooling does nothing for the R9 390, which is something like 275W TDP. I have a veritable wind tunnel below my table, blowing really hot air up my shorts.
I wouldn't call it insane (except for the cable cutting, that seems like it's just unnecessary when you can just remove the component or likely modify with software), but I also think it's ill-advised for most use cases that aren't yours, which is likely why most people are saying it's nuts.
A lot of modern computing is just greedy with resources, and will gladly eat up as much as is available, be it games or web browsers. In my first computer-building experience I didn't fasten the heatsink to the mobo properly, and it would lose contact with the CPU when the case was upright; powering the computer on when it was upright would let it load for ~10 seconds and then shut down with no warning. CPUs and other high-performance components hit their thermal limit fast, meaning they get really hot and stay really hot without some mitigating factor. To me it seems ill-advised not to give the components all the help you can to keep them cool.
Granted, my first home build was with an E8400 Wolfdale, and the power and thermal efficiency of CPUs has come a long way since then, but they still get toasty without a bit of assistance. Well-planned passive cooling is great - my first GPU was passively cooled and it served me well for a good 5 years until a decent night of blackjack in Vegas afforded me a new GPU. But I could still see how it struggled when the card started to reach its thermal limits. The card ended up getting donated to a friend of mine, and I think she's still using it successfully to this day. That being said, the passive cooler on the card was nearly triple the size of the card itself.
Modern CPUs will just downclock themselves. I mean it's not insane; you won't damage the components, but you will be running A LOT slower than you could be.
Do you run tools to see how far down your CPU's frequency scaling is taking it?
I was about to write the same thing. The CPU won't be damaged, but it may underperform. The bigger concern would be other components, especially motherboard VRMs which seemingly tend not to have any thermal failsafe (on occasion, there are reports of them catching on fire).
I would be surprised if any downscaling takes place. I run some heavy computation software from time to time and compare the performance to other computers. Many of them in datacenters. I would probably notice if a machine suddenly downscales.
That said, I would be happy to measure it and post the results. Running the computations now. According to "top" it saturates 2 of my 8 CPUs. So far, "lscpu" reports the CPU at 1600 MHz. "sensors" reports that the CPU temperature is slowly rising. It was 48°C at idle and is 50°C now.
15 minutes later: the CPU temp was rising so slowly that I stopped it. I don't think it will ever get to anything critical. It was at 52°C. CPU speed was still at 1600MHz.
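If anyone wants to log this instead of eyeballing lscpu/sensors, here's a minimal Linux-only sketch reading the same figures from /proc/cpuinfo and sysfs; the thermal zone path varies between machines, so treat it as illustrative only.

    # Log per-core clocks and a CPU temperature on Linux.
    def cpu_mhz():
        with open("/proc/cpuinfo") as f:
            return [float(line.split(":")[1]) for line in f if line.startswith("cpu MHz")]

    def cpu_temp_c(zone="/sys/class/thermal/thermal_zone0/temp"):
        with open(zone) as f:
            return int(f.read()) / 1000.0   # sysfs reports millidegrees Celsius

    if __name__ == "__main__":
        print(f"Clocks: {cpu_mhz()} MHz, temp: {cpu_temp_c():.1f} C")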
A lot of CPUs will cycle-skip which causes worse performance than just manually reducing the multiplier. Especially for real-time tasks like audio or video.
It works in the sense that the hardware will survive quite a while, but you are likely giving away a lot of performance to thermal throttling (at least if you have a mini-PC; it can be a factor of 5-10 or so).
I used 120mm fans in my previous build and they worked fine. However, the enthusiasts seem to have moved to 140mm. They fit all the normal-size cases and move either more air at the same RPM, or the same air at a lower RPM. My favorite thing about them is that the pitch of the hum is just a bit lower.
There are discrete graphics cards with passive cooling available. The power-hungry models fitted with dual 120mm fans and a large heatsink are not too noisy either.
> "Whereas IBM's PC (and almost all PC compatibles) had a power supply in a corner of the main case, the PC1512's power supply was integrated with that of its monitor. The monitor had sufficient venting to cool itself by convection, instead of needing a fan. The PC1512 was therefore quieter than other PCs. Rumours circulated that an Amstrad PC would overheat, and while existing owners would note that this did not happen, new buyers were discouraged. As a result, later models had a cooling fan integrated into the main case."
This is not all that new of an idea. I had one of these around 2005/2006: https://www.quietpc.com/hfx-mini-metal - completely fanless and (virtually) silent. Back in those days there were no SSDs, so one had to content oneself with slow-spindle-speed HDDs mounted on soft rubber to minimise vibrations.
Despite the fact that the mfr appears to think it's a "revolutionary" idea, I'm not really seeing any material difference between that and the case above, from over 10 years ago. Perhaps I'm wrong though?
I had a couple of machines from Hush Technologies (long since out of business), also ~10 years ago. They were passively cooled through a finned chassis, connected to the CPU and other hotspots by heat conducting pipes.
Like yours, sounds like the same sort of thing as these Airtop machines.
If this thing can dissipate 200W of heat, then I must say it's truly impressive. I can't say I want to buy one (I'm not that sensitive about noise), but from a technical point of view it seems like a little marvel.
Seems to me that if someone could figure out how to create a stable Hackintosh install package that exactly fit one of their builds it would be a thing of major usefulness.
Does anybody understand the copper heatpipes? It mentions a "virtual vacuum", but my understanding is that heatpipes are typically filled with a liquid that boils and turns to gas as it heats, thus moving up along the pipe and carrying heat further away to allow a larger contact surface with the heatsink. Are Airtop's heatpipes somehow different? What is a "virtual vacuum", and how does it work?
On a slightly related note: does anyone have recommendations for embedded PC distributor/manufacturer in EU with online order system? I don't want to waste time talking to sales reps.
From time to time I need a PC to stuff inside places a regular desktop wouldn't survive too long (think factory floor).
I've ordered from IPC2U but they introduced a minimum order of 10 for their Intel NUC based systems.
A fairly powerful, fanless computer with a 6 year warranty that can run multiple 4K displays for less than a well built MBPro is probably worth it to quite a few people. Besides people who just "want" a fanless PC, there are lab applications that require noiseless computers. Noiseless lab computers are VERY expensive. I'm sure there are many other professional use cases where $1800 isn't that much money. Maybe video/audio production?
Too many compromises, and too expensive. For this money you can buy a normal i7 K-series processor with a Noctua cooler, an Asus Strix video card with DirectCU and a PSU by 'be quiet!'. Put all this stuff in a good case and you will get a PC with great performance and almost no noise.
What is the use-case for this? It seems pricey for entry level, but has a lot of nifty options for power-users.. but, then why would I risk having no airflow or extra power (200watt PS) for when I get cookin' with storage or fast video?
I love the fans in my PC, the way they make almost no noise.
But they do make noise. And worse, they are a marvel of engineering: if some accident of dust or bumping were to upset their balance, I fear I will be back to the loud old days.
Worst of all, they are moving parts that might simply fail. Having a thing that is fanless from the start is much less likely to suddenly find its assumptions about heat dissipation violated.
I'm a 'fan' of silence, but a Silencio ATX case is <100 and I never notice my computer, ever. I also spent a lot of time finding just the right keyboard and mouse, a sweet-spot of solid action without too much click... us techies sure are picky now that I think about it, and for a good reason, tools of the trade.
> I'm a 'fan' of silence, but a Silencio ATX case is <100 and I never notice my computer, ever.
Sounds like you have a pretty high "silence" threshold. I have a Define R4 with undervolted front fans, and while it's good enough when awake I couldn't sleep next to it.
For starters there are presumably use cases for which silence is of critical importance. The computer at the production desk in a broadcast or recording studio, church, or theatre comes to mind as one example.
I've never had any issue with PC fan noise - I have a fish tank in the room where my desktop lives and the PC can't be heard over the sounds of the aquarium's filter and air pump, which sound quite pleasant to me.
I'm kind of disappointed that it relies on an external power brick. If I'm spending a premium on a desktop, I don't want another bit of crap cluttering up the desk or floor.
I am not the target market for this. I'm pretty sure I might be in the minority, but I enjoy the sound of desktop PC fans, especially the larger, noisier ones.
Looks like we took the site down. I have an Intel Skull Canyon NUC and love it, will be using these from here on out. Be nice to see more companies offer fanless cases.
No. 10dB is the volume of calm breathing, 0dB is the threshold for auditory perception of a sound at 1kHz.
With regards to a CD, the decibel measures the gain rather than the volume. 0dB in this context simply means "full volume", while -10dB means 10% of full power, and -20dB means 1% of full power.
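Quick sanity check of those ratios (a decibel figure here is 10*log10 of a power ratio):

    # Convert a dB gain figure to a fraction of full power.
    def db_to_power_ratio(db):
        return 10 ** (db / 10)

    for db in (0, -10, -20):
        print(f"{db:>4} dB -> {db_to_power_ratio(db):.0%} of full power")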
On the other hand, there are no fans pushing dust inside that PC (in fact, the whole inside of this thing can be sealed airtight), and the amount of dust that naturally settles on the radiators outside should be much smaller than on any actively cooled PC.
These (Airtop, Fit-PC, etc.) are completely sealed machines, so no dust enters. It does build up on the case - depending on your environment - but doesn't affect the thermal characteristics at all.
It looks like the airtop has open air pipes that work like chimneys, letting the heated air inside flow up and out of it. These ought to be easy to clean with compressed air though.
200 W for a gaming card? Just ridiculous. Even the GTX 1050 needs a 300 W PSU. And modern quiet fans in quiet cases are really quiet - below 20dB, so it's difficult to hear them at all. My son has one with a GTX 1080 - that's a real gaming card.
The GTX 1050 is just 75 watts; NVIDIA is just being conservative because many people have crappy power supplies or tons of power-consuming extras.
Assuming an Intel CPU (mostly 65 watts or less), an SSD (a few watts), some DIMMs, and the normal overhead for USB/power conversion, etc., you can get away with quite a bit less than 300.
Card: 75W, CPU when gaming: 80W, memory: 6W, 2 SSD drives: 6W. Total: 167W. A 200W PSU with 80% efficiency: 160W. A PSU with 95% efficiency: 190W. Step left or right (any hard gaming moment) and bye-bye. Looks like NVIDIA's recommendation is not so dumb. And that's for a low-end gaming card.
> A 200W PSU with 80% efficiency: 160W. A PSU with 95% efficiency: 190W
That's not how efficiency is computed (in general, but especially in PSUs).
The power rating (200W in your example) is what the PSU can provide to the system as DC power, the efficiency is the difference between AC input (wall-draw) and DC output.
An 80%-efficient 200W PSU will provide 200W internally but draw 250W from the wall-plug (and waste the extra 50W as heat).
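Worked example with the numbers from the thread (the rating is DC output; efficiency only affects wall draw):

    # An 80%-efficient PSU rated for 200 W of DC output.
    rated_dc_output_w = 200
    efficiency = 0.80

    wall_draw_w = rated_dc_output_w / efficiency
    print(f"To components: {rated_dc_output_w} W, from the wall: {wall_draw_w:.0f} W, "
          f"wasted as heat: {wall_draw_w - rated_dc_output_w:.0f} W")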
If your 200W PSU maxes out at 160W it's not because it has 80% efficiency it's because it's a worthless piece of shit. You should throw it away as there's significant risk it will blow up (figuratively or literally) and/or catch on fire. Here's an article (in french but you can look at the pictures) checking out garbage PSU exhibiting that sort of behaviour: http://www.x86-secret.com/dossier-36-3000-Alimentation_Nonam... spoiler: shitty components, detonations and burnt smells.
In order to run it as a hackintosh natively, you would need compatible hardware and drivers, which this doesn't have. A niche alternative (starting to be more documented) is to install Linux and run an OSX virtual machine. Traditionally, that gave you terrible performance, and specifically terrible graphics.
A hypervisor is a very lightweight wrapper around a virtual machine that either requires a modified virtual machine (not the case here) or hardware support in the CPU so you get to run a VM with a performance hit on the order of 3%.
The newest Intel CPUs have added hardware virtualization support, so if you have a discrete GPU (like for this box), you can give the integrated graphics to the host Linux system and dedicate the discrete GPU to the virtual machine (again, performance within a few percent of native).
All of this is to say that you can run a virtual OSX system on top of linux, and get almost native performance like running a hackintosh, but without having to have a perfect match of already supported hardware components.
There are fanless NUC cases available. I have one from Akasa, it's a beautiful piece of kit. Solid black metal with heatsink ridges on the top, it looks and feels like a hi-fi amplifier.
The only NUC that I've heard that's loud under load is the new skull canyon which has the IRIS 580 and looks like a black plastic GPU, and sometimes sounds like a GPU.
The "normal" NUCs that are approximately mac mini in size seem quite in a wide range of use cases.
The site is currently reporting an "Error establishing a database connection." If at all possible, please try to make simple content sites like this static using a tool like Netlify or http://stout.is. Static sites are faster, more reliable and cheaper to operate.
I got so annoyed with the complexity and poor performance of web frameworks that I sat down for a day last year and made my own little database-less CMS for this kind of usecase. It's not quite as fast as a purely static solution, but I think it would be fast enough for HN:
It'd depend on the shared hosting provider and your site.
Having hundreds of MB of content that people are going to want to load, even if it's all served statically, will probably still have you in a bad way with the cheapest of hosting providers.
But it'll have a much better chance of surviving than someone running wordpress or something that's got zero caching and needs to do a ton of calls to a shared MySQL host.
Then yes, it would probably be fine, within reason.
When running a static site at very high traffic loads it becomes about how many connections the webserver can handle and how much bandwidth you have to serve the site itself, rather than pure power of the server to process all the requests a dynamic site would generate.
You can start to chew through bandwidth allocations pretty quickly when you get a couple thousand concurrent visitors, so a shared hosting plan might run out pretty quickly if the cap is small, even with a fairly small site. And something like Apache would need tweaking a fair bit to handle that number of connections without eating all the RAM. Nginx is better in that regard and could pretty easily handle thousands of concurrents.
So a small DigitalOcean VPS can easily handle millions of 'hits' per day if set up with a little care, and more than that if set up well, but you just have to watch you don't saturate the connection.
I mirrored a few sites that had faced the reddit hug of death (just to see what the load was like) and I found you can easily hit a 200Mbps* sustained connection requirement with all the visitors it brings just from the comment thread, which people don't enter as much as the list posting.
*(depending on the size of the page, obviously; the bigger the page, the more Mbps it'll need to serve. The average was around 20-50Mbps.)
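A rough way to estimate that requirement for your own page - the page size and visitor rate below are illustrative assumptions, not measurements:

    # Sustained bandwidth needed to serve a traffic spike.
    page_size_kb = 500            # total transfer per visitor (HTML + assets)
    visitors_per_minute = 2000    # assumed arrival rate during the spike

    bits_per_minute = page_size_kb * 1024 * 8 * visitors_per_minute
    mbps = bits_per_minute / 60 / 1_000_000
    print(f"~{mbps:.0f} Mbps sustained")   # ~137 Mbps for these assumptions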
A VPS is quite a bit different than regular shared hosting though.
Like the absurdly cheap single-digit-per-month hosting plans with 'unlimited' everything. They're run to push the most customers onto the fewest boxes possible, and offer incredibly large feature sets.
So, while you might be tuning your static content to be as small as possible, you're probably on a box that's serving a heck of a lot of poorly optimised sites. Disk IO is probably going to be an issue if you can't keep your content in memory (remembering that memory pressure is probably pretty high).
Ah, you're right, disk IO is one I hadn't considered. I'm so used to the availability of cheap VPSes and SSDs that I'm not sure when I last used actual shared hosting myself.
Those sorts of shared hosting plans have a funny way of disabling your site if you get a big surge of traffic, despite the 'unlimited' claims.
I was told once, "Yes, your plan is unlimited, but not infinite," when I had a post on a long-forgotten site go mildly popular.
I suppose on those types of machines the sheer number of connections would get you in hot water too, especially if they're running vanilla apache and it starts burning through memory there.
If it's static, which is really recommended for content like this, I'd opt for Netlify (we love it for our projects) or a comparable service.
A traditional shared host will likely run into trouble once you reach a significant traffic spike, which is exactly the moment when you absolutely don't want to experience an outage.
Netlify, even on their cheapest 10 bucks/month plan, will let you use their CDN network and offer performance that is far ahead of shared hosts, let alone database backed websites.
That's what I do, with GitLab. I chose GitLab because it allows you to use true/full https with custom domains thanks to letting you upload your own certificates.
I know you can get what looks like https with Cloudflare, custom domain and GitHub but it's not full end-to-end.
GitLab also has built in CI (rather than something external like Travis with GitHub) so you can simply push a commit and have a free 'runner' (really a Digital Ocean instance) spin up, run a build script then deploy to GitLab Pages, all for free. It's pretty amazing what you can do there to be honest.
I'm using it with Hugo but there are 'runners' for just about any SSG. I think many people have switched to using it for the ability to use Jekyll plugins unlike over at GitHub.
From the connection times I'm guessing the servers are somewhere on the East Coast of the US (maybe they're still on Azure?), so I can hit sub-second loads in Europe and the US, and just about get under 1.5 secs in Australia/Asia.
It's impressive for the grand old price of 'free'!
[0] https://www.quietpc.com/reserator1-v2
[1] https://www.quietpc.com/tnn500af