My own thoughts on the negligence and indolence of the PC industry are full of rage. But this guy makes my rants seem a little tame. I love it!
I believe I have an unpopular opinion about desktop PCs. The conventional thinking is that desktop computing is boring because a modern PC does everything it is intended to do just fine. That may be true, but the problem is that the industry is not interested in establishing new usage patterns—new things the PC should do.
At the end of last year, I started a series of rants about how modern technology sucks [1] with particular emphasis on the frustrating stagnation of desktop computing and the bothersome way every new portable computing device wants to be a center of attention.
I was pleasantly surprised that the author of the linked article hits the target squarely when he lists off what PCs need. The first item: better displays. He may be speaking more about laptops (and they are deserving of the shame), but allow me to rant a bit about my preferred computing medium—desktops.
The stagnation of desktop displays is, and has been for a decade, the crucial failure of desktop computing. Display stagnation is the limitation that allows all other limitations to be tolerated. It is the barrier that leads the overwhelming majority of users (and even pundits!) who tolerate mediocrity to declare everything else—from processors, to memory and GPUs—as "good enough." I absolutely seethe when I hear any technology declared good enough (at least without a very compelling argument).
Desktop displays, and by extension desktop computing, are so far from good enough that this should be self-evident to anyone who observes users interacting with tablets or mobile phones(!) while seated at a desktop PC. Everything that is wrong with modern computing can be summarized in that single all too common scene:
1. Desktop displays are not pleasant to look at. They are too small. They are too dark. They are too low-fidelity. And they often have annoying bezels down the middle of your view because we routinely compensate for their mediocrity by using more of them, side-by-side.
2. The performance of desktop computers is neglected because "how hard is it to run a browser and Microsoft Office?" This leads to lethargy in updating desktop PCs, both by IT and by users ("I don't want the hassle"). In 2013, I suspect many corporate PCs in fact feel slower than a modern tablet or even mobile phone.
3. Desktop operating systems are actively attempting to move away from (or at least marginalize) their strong suits of personal applications and input devices tailored for precision and all-day usage.
4. Desktop computers--and more accurately personal home networks--have lost their role as the central computing hub for individuals to a misguided means of gaining application omnipresence: what I call "the plain cloud." This is because no one in the desktop industry (Microsoft most notably) is working to make personal networks appreciably manageable by laypeople.
5. Mobile phones and tablets are often free of IT shackles and therefore enjoy more R&D (more money to be made).
Desktop displays stopped moving forward in capability in 2001, and in large part regressed (as the article points out) since then. Had they continued to move forward--had the living room's poisonous moniker of "HD" spared computer monitors its wrath--I believe we would have breathtaking desktop displays by now. In that alternate universe, my desktop is equipped with a 50+" display with at least 12,000 horizontal pixels.
Desktop computing needs to leverage immersion (without nausea; VR goggles need not apply, yet). Large form-factor super-high-definition displays would bring all manner of new technology needs with them (some rough numbers follow the list):
1. Gesture controls.
2. Ultra high-bandwidth wired networking (win for wired network folks) to move super high definition files.
3. Ultra high-capacity storage.
4. Extremely fast processors and GPUs to deal with a much greater visual pipeline.
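To put very rough numbers on items 2 and 4, here is a back-of-the-envelope sketch in Python. The 16:9 aspect ratio, 60Hz refresh, and 30-bit color are my assumptions; the original only calls for 12,000 horizontal pixels.

    # Back-of-the-envelope budget for a hypothetical 12,000-pixel-wide desktop display.
    # Assumptions (mine, not the post's): 16:9 aspect ratio, 60Hz refresh, 30-bit color.
    width_px = 12000
    height_px = width_px * 9 // 16              # 6,750 rows
    bits_per_pixel = 30
    refresh_hz = 60

    pixels_per_frame = width_px * height_px     # 81,000,000 (vs ~2,000,000 for 1080p)
    raw_gbit_s = pixels_per_frame * bits_per_pixel * refresh_hz / 1e9

    print(pixels_per_frame)                     # 81000000
    print(round(raw_gbit_s))                    # ~146 Gbit/s of uncompressed video

Even with aggressive compression, moving and rendering on the order of a hundred gigabits per second of pixels is exactly why faster interconnects, bigger storage, and beefier GPUs end up on the list.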
Such a computing environment is a trojan horse for today's tablets: it turns tablets into subservient devices as seen in science fiction films such as Avatar. The tablet is just a view on your application, allowing you to take your work away from the main work space briefly until you return. I say trojan horse, but that's not quite right because I actually want this subservient kind of tablet very much. I do not want a tablet that is a first-class computing device in its own right (even less do I want a phone to be a first-class computing device). I only want one first-class computing device in my life, running singular instances of applications for me and me only, and I want all my devices to be subservient to that singular application host.
For the time being, that should be the desktop PC. In the long haul, it could be any application host (a local compute server, a compute server I lease from someone else, or maybe even a portable device as envisioned by Ubuntu's phone). But for now, the desktop should re-assert its rightful role as a chief computing environment, making all other devices mobile views.
One thing desktop displays desperately need is a way for each pixel to become brighter than its surrounding pixels at will. A lot brighter. Like, 50x brighter.
Dynamic brightness range is a necessary step for writing a 3D renderer that makes you feel like you're looking out a window. 256 levels of brightness aren't nearly enough.
We don't need the ability to specify that a pixel should be brightness level 64823 vs 64824. It doesn't need to be that fine-grained. What we need is the ability to overdrive the brightness of specific pixels. That way sunlight filtering through tree leaves will actually give the impression of sunlight filtering through tree leaves.
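To make that concrete, here is a tiny illustrative sketch (the numbers are invented): an 8-bit, display-referred pixel clips a "50x brighter than white" highlight to plain white, whereas a scene-referred float value keeps the intent so a capable display could overdrive that pixel.

    # Illustration only: why 8 bits of display-referred brightness can't say "50x brighter".
    white = 1.0       # ordinary paper-white
    sunlight = 50.0   # a highlight 50x brighter in the scene (sun through the leaves)

    def to_8bit(value):
        # Conventional encoding: anything above white clips to 255.
        return min(int(round(value * 255)), 255)

    print(to_8bit(white), to_8bit(sunlight))   # 255 255 -- the highlight is gone
    print(white, sunlight)                     # 1.0 50.0 -- the 50x intent survives as a float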
Tangentially related, it reminds me that OLED has utterly failed to become a thing on the desktop, and it breaks my heart. It was a decade ago when I read that OLED was the next hot thing and it would bring unprecedented contrast and brightness to displays.
Today, I like OLED mobile phone displays.
But my Dell U3014s are disappointing crystal-over-backlight garbage. Not only that but expensive crystal-over-backlight garbage.
OLED will come. It's just now finding its way into TVs, and still costs a fortune... When you think about it, a 5" OLED screen comes in a $700 phone, and that's about what most PCs cost. Not many people would be willing to pay $2,000 for a laptop with an OLED screen. Some people maybe (I know I would), but most wouldn't.
I think in 2-3 years you'll see Samsung Chromebooks and Ultrabooks with OLED screens...
It's not just the number of brightness levels. On an LCD display, if you brighten the backlight by a factor of 50, a black pixel will be about as white as a white pixel with the backlight at a brightness factor of 1. One of the most frustrating things about LCD displays is their inability to completely transmit or completely occlude light. This can be seen most obviously by trying to watch a movie in a dark room.
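A quick worked version of that point, assuming a panel with a ~1000:1 native contrast ratio (an assumption for illustration, not a measurement of any specific monitor):

    # An LCD pixel never fully blocks the backlight; at 1000:1 contrast, ~0.1% leaks through.
    native_contrast = 1000
    white_at_1x = 1.0                        # fully open pixel, backlight at 1x
    black_at_1x = 1.0 / native_contrast      # "closed" pixel still leaks: 0.001
    black_at_50x = 50.0 / native_contrast    # crank the backlight 50x: 0.05

    print(black_at_1x, black_at_50x)         # 0.001 0.05
    # Blacks become 5% of full white -- a washed-out grey in a dark room --
    # and the leakage scales with the backlight no matter how many code values you have.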
"We don't need the ability to specify that a pixel should be brightness level 64823 vs 64824."
Actually, what is needed is an anatomically defined range - a "true brightness" - analogous to the 24-bit "photographic" (tri-chromatic) range of colors. It would be a range defined by the anatomical limits of the human eye: there is a limit of absolute darkness (relative to the eye), and there is also a limit to the brightness of light that the human eye can safely be exposed to. Within that range there is a limit to the granularity of brightness the human eye can distinguish. I am not aware whether such a definition exists, but I am aware that a display that perfectly respects this "true brightness" is impossible (for the mere fact that the light reflected from our faces falls on the dark portions of the display, and those dark portions cannot perfectly absorb that extra light).
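For what it's worth, a crude estimate of that range is easy to do, assuming the eye spans very roughly 10^-6 to 10^8 cd/m^2 across all adaptation states and can distinguish luminance steps of about 1% (both figures are rough assumptions, not an official definition):

    import math

    # Rough assumptions: total luminance range the eye handles, and a ~1% noticeable step.
    l_min, l_max = 1e-6, 1e8   # cd/m^2, across all adaptation states
    jnd = 0.01                 # ~1% Weber fraction

    steps = math.log(l_max / l_min) / math.log(1 + jnd)
    print(round(steps), round(math.log2(steps), 1))   # ~3240 steps, ~11.7 bits

So perceptually spaced code values get you there in roughly 12 bits; it's the spacing and the peak brightness that matter, not tens of thousands of linear levels.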
It's funny how your opinion contrasts against mine.
From what I've seen, displays have become a commodity, and that's a good thing. I can go buy any kind of display, choose a size, and I'm probably fitted with more pixels than I'll ever need; the panels are neat, flicker-free and flat, and the best part is that they cost next to nothing. You buy a laptop and you choose the size based on how much hardware you want to carry around — not because you need a huge screen, because all laptop screens are "good enough," as you mentioned: I haven't had a laptop with less than 1440x900 for... a little less than a decade. And the resolution has never been inadequate for browsing, coding, drawing, writing and watching movies, which is what I mostly do.
This is completely the opposite of what we had in the 90s, when a 15" CRT was the baseline: you were always a bit short of resolution, and you never had the money to buy that huge one-cubic-meter display that could do 1280x960 at 60Hz or something, and for which you probably had to upgrade your graphics card and probably your PC too. That totally, totally sucked. One could live with the basic resolution and screen size, but I remember the agony of something better always being almost within reach. These days screens are something you don't think about twice. Everything is, again, dreadfully good enough, and if you need something professional you can get that too, and it probably won't cost you the price of a small car.
In the last 10 years or so I haven't considered once whether I should soon upgrade to a "better" display or to a laptop with a "better" screen. That's bliss, IMHO.
Get a phone with a higher resolution screen, and maybe your opinion will change. My phone is 4.7" with a 1280x720 display. I now prefer to use my phone for a lot of reading over my 1920x1080 17" laptop display because the text is so much crisper that the laptop screen is annoying. Even more so the 23" screen I'm writing this on, which seems extremely fuzzy and pixelated.
My next phone will likely be a 5" phone with 1920x1080 given current prices/specs for full HD display phones - at that point I expect my laptop and desktops will annoy me even more. Current relatively low end tablets are now starting to get substantially higher resolutions than that.
I actually have such a 720p screen on my phone. It's hard to compare because I look at the phone at a much closer distance than the laptop screen. I usually read using my laptop because the phone can't fit as much text on the small screen anyway, regardless of the number of pixels in it.
I can only say that I don't see the pixels on my laptop screen either and text looks "good enough" (as in, not pixelated and I can see serifs which also look natural) and that the phone would be quite unusable if viewed from the same distance as I look at my laptop screen.
This is a really good point. When I've heard others cite their satisfaction level with desktop displays, I ask them to go through an experiment.
I'll put aside the size portion of the debate since I can't really fathom how anyone could argue against a desktop display that can fill their entire field of view, assuming such a display were available and economically priced. I think those who argue for small displays on the desktop enjoy being contrarians, and they bring up matters of taste and style (such as "I don't want something so large on my desk.").
But in terms of clarity, the argument posed by those satisfied by the status quo usually is composed of these points:
1. Users sit at a distance of 2 to 3 feet from a desktop display.
2. Therefore, high-DPI is not meaningful because the human eye cannot perceive additional clarity at that distance.
3. High-spec IPS LCD screens are good enough.
So the experiment is simple enough (the arithmetic behind it follows the list):
1. Find an iPhone 5+, Galaxy Nexus+, or Lumia 920+. Something with a high-DPI wide contrast-ratio display. Open up a web site or document or whatever.
2. Do the same on your PC.
3. Hold the phone up side by side at the same distance. Zoom the phone's text to match the text size seen on the desktop display.
4. Behold how the phone's display is considerably more readable. For most combinations of phone versus desktop display, the phone's display will be crisper, have better contrast-ratio, and better color accuracy.
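Here is the angular-resolution arithmetic behind step 4, using a ~332ppi phone and a ~101ppi 30" desktop monitor as examples (the ppi figures are approximate):

    import math

    def pixels_per_degree(ppi, distance_inches):
        # Pixels packed into one degree of visual angle at a given viewing distance.
        return 2 * distance_inches * math.tan(math.radians(0.5)) * ppi

    distance = 30   # inches, i.e. the "2 to 3 feet" from point 1 above
    print(round(pixels_per_degree(332, distance)))   # ~174 for the phone
    print(round(pixels_per_degree(101, distance)))   # ~53 for the desktop panel

A common rule of thumb puts 20/20 acuity around 60 pixels per degree, so at the same distance the phone is nearly three times past that threshold while the desktop panel falls just short of it.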
My Lumia 920 (not even an OLED, just a nice high-DPI IPS display) utterly shames my Dell U3014 IPS desktop LCDs.
Reading text rendered with high-DPI, high-quality displays at 2 to 3 feet is an absolute delight. Not only that, you can lean to see greater detail. Yes, 2 to 3 feet is the typical distance, but sometimes I like to get closer to my desktop display to work with fine details.
I would pay a steep premium (but obviously I wouldn't break the bank) for a desktop display that matched the clarity of my phone's display. If I could just tug at the edges of my mobile phone and make it magically grow to fit my desktop, I would be so happy.
You're just wasting your battery life on pixels you never actually see on such small screens. Do you feel happy being manipulated by phone manufacturers' marketing?
I have a 22'' screen, Full HD, on my desktop and I never complain about it. Maybe you just have a crappy LCD one.
The point is that it does not make that much of a difference in visual quality, but it does heavily impact your GPU performance and your power consumption, so the benefit/cost ratio is very, very low.
They haven't stagnated, they've regressed. 1920×1200 Dells were 15". 24" desktop displays that were 1920×1200 are now almost all "full HD", which cuts 120px off the bottom and doesn't add a thing. IBM had >4K displays in 2001 (the T220 was 3840×2400; 4K is 3840×2160, and the T220 was a 22" monitor too, not a 40" TV). Try to find anything even as good today.
I had one of IBM's P70s, bought second-hand in 2001. It was amazing: it had 1600x1200 at a 60Hz refresh and it was able to push gaming resolutions (1024x768) at a 120Hz refresh. I still cannot find as smooth a gaming experience.
It's really hard to find a decent 15-inch laptop with a high-resolution screen - unless it's targeted at gamers.
For some reason, a few years ago the PC industry decided that 720p is good enough, as if people only watch movies on their laptops. Trying to squeeze a full-featured IDE into 720p is like looking at the screen through a keyhole.
You raise an interesting issue. But is there enough money in large displays?
Let's look at use cases, and focus on the average person (assuming that the gamer market is too small to justify opening a top-tier manufacturing line for large displays).
Use case 1: movies. Even assuming there's value in 4K resolution (which is not certain), you still need to align and improve so many industries to make it work. Really hard.
Use case 2: games. Most games plain folks play are casual games, and playing Angry Birds at high resolution maybe doesn't justify spending that much money. And even if there are ideas for such games, it's still a chicken-and-egg problem.
Maybe the right strategy is to sell 10" retina displays, used at a short distance, to let people experience quality cheaply, build the relevant industries, and then offer large retina displays.
I believe you missed the target here. The argument against 4K gaming is more about the fact that GPUs have to render this increased number of pixels; this is why the current generation of consoles only renders at 720p.
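A quick pixel count makes the point (plain arithmetic on the standard resolutions):

    # Pixels per frame; GPU fill and shading work scales roughly with these.
    uhd = 3840 * 2160      # 8,294,400
    full_hd = 1920 * 1080  # 2,073,600
    hd720 = 1280 * 720     #   921,600
    print(uhd / full_hd, uhd / hd720)   # 4.0 9.0 -- 4x the pixels of 1080p, 9x those of 720p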
Gaming also hasn't peaked in any way at current resolutions, as only the very top end of cards can handle AA, AF, etc. in games. GTA 5 shows the need for every ounce of processing power by the fact that it looks horrible on both consoles.
As far as movies are concerned, the jump to 4K has already begun, but I do believe you're right that there needs to be an alignment between the industries. Blu-ray still competing with DVDs shows that; hopefully the new console generation will provide enough of a push for a serious shift in physical media.
Man do I agree with you. I think the disconnect is that consumer tablet/mobile devices lack the ability to be views for our desktops. PCs are powerful enough to be the locus of consumer computing; we just need a good architecture to make them so.
Not to mention: when it comes to real productivity, you can't beat the desktop. It isn't the profitable sector for manufacturers, but it's still the productivity toolset.
We just need to make them exciting to the overall ecosystem.
This is one way WebRTC will be really useful (since, IMO, consumer IT is mostly Web or mobile apps). With signalling services in place, we'll be able to run hosts from desktops without registering domain names or establishing fixed IPs. The user-centric, desktop-hosted systems can grow out of that.
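A minimal sketch of what that could look like in Python, assuming the aiortc library; send_via_signalling and receive_via_signalling are hypothetical stand-ins for whatever signalling service brokers the connection. The point is only that the desktop can accept a peer through NAT with no domain name or fixed IP:

    from aiortc import RTCPeerConnection, RTCSessionDescription

    async def host_on_desktop(send_via_signalling, receive_via_signalling):
        # The desktop is the application host; ICE handles NAT traversal,
        # so only the signalling channel needs to be reachable.
        pc = RTCPeerConnection()

        @pc.on("datachannel")
        def on_datachannel(channel):
            # Echo back whatever a remote "view" device sends, just to prove the link.
            channel.on("message", lambda msg: channel.send(f"desktop says: {msg}"))

        offer = await receive_via_signalling()            # offer from the phone/tablet view
        await pc.setRemoteDescription(
            RTCSessionDescription(sdp=offer["sdp"], type=offer["type"]))

        answer = await pc.createAnswer()
        await pc.setLocalDescription(answer)
        await send_via_signalling({"sdp": pc.localDescription.sdp,
                                   "type": pc.localDescription.type})

Run it under asyncio once real signalling callables exist; everything device-specific stays on the other side of the data channel.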
> Desktop displays stopped moving forward in capability in 2001
Are you kidding? In 2001, it was pricy to get a 15" 1280x1024 LCD monitor. Are we in the same state of affairs today? Would you be willing to wager that the quality, viewing angle, etc of that LCD monitor compares favourably to today's models?
I am speaking of the state-of-the-art of 2001 versus today. IBM's 2001-era T220 was pushing 3,840 horizontal pixels for something around $7,000. A massive investment to be sure. But had computer monitor R&D continued unhindered by the taint of "HD", T220-class monitors would have eventually become mainstream and the state of the art would have marched forward still. I would have bought a "T220" in, say, 2002 or 2003 at $2,000. But the advance of technology seemed to halt, and that never became an option for me.
Yes, prices have (mostly) come down. Yes, we have IPS versus TN. Thank goodness. But in terms of the top-tier specification of displays—resolution—we've stagnated and regressed. As others in this thread have pointed out, laptop resolutions in 2001 were higher than they were in ~2011. I don't own Apple products, but I give them credit for ending the tyranny of HD. Thank you, Apple!
Also, prices are not uniformly better, or at least they were not until very recently when some Korean manufacturers decided to shake up the monitor cartel. The consumer high-end in particular had been stuck with 2560x1600 30" monitors at ~$1,100 for about seven years.
The price has moved down, but the tech hasn't moved up, at least not a whole lot. I bought a Sony 20" flat Trinitron in 2002 from a dotbomb property sale. It was capable of 1600x1200 and up to 120Hz. Traditionally, high-end displays come down in price and the volume sales pay for R&D on the next high-end displays. Today this is not the case: look at the displays going into $300 tablets compared to desktop displays. Tablet resolutions are rocketing while laptops are static at 768, 900, or 1440 (if you pay for it). The GPUs are more than capable of 4K now, yet how many 4K displays are available?
Of course, the problem here is the software. It's not that such a system can't exist; it's that the tools to create and use one easily don't exist.
We can make applications that do what you describe, but it is a hell of a lot of work. It would require a custom affinity system (which desktop is the tablet attached to, how do I authenticate it), a custom server-client system (how do I talk to the desktop regardless of how I am connected, including NAT traversal, etc.), a custom user interaction system (how do I deal with two users using the same program at once?), etc., for every device. Of course the key problem is that tablets, phones, etc. use a myriad of different technologies.
So the solution today, the high-intensity one, is that application developers spend hundreds of thousands of man-hours writing tens of different versions of each piece of code. This is how Netflix, Google, Facebook, and others do it. These applications have integration across every device: TVs, smartphones, desktops, tablets, smart boards, cars, etc. This is not cheap, this is not easy, and this is not very useful.
The better solution (one that I am working on personally, albeit slowly) is to build tools that allow us to write one large piece of multifaceted application code against a myriad of idealized DSLs (one for rendering, one for communicating, one for user identity, etc.), which can then be auto-magically (note the magic) ported to every device. This is what Unity does, as a domain-specific example.
The poor man's solution is of course HTML, but that doesn't work well with current desktops due to NAT.
I agree that software is a principal constraint. As a programmer myself, I find implementing what I would like to see as well beyond my capability, even if I invested significant time. As you point out, it would require a holistic ecosystem-wide approach. This is why I'd like to see a software titan like Microsoft attempt something along these lines. They have the resources to unify all the devices in a person's life into a single computing experience.
I very much appreciate what you're attempting to do and I think it will measurably improve matters. However, where I personally differ is on the key point I made above: I don't want applications to run on multiple devices. I want applications to be singular instances available everywhere. If I begin typing an e-mail at my desktop, I want to view my e-mail application on my tablet or phone and see the exact in-progress e-mail—to the letter—available and interactive in real-time. As I type on any device viewing my e-mail application, the letters appear on all other views instantly. Presence information (for the purposes of notification sounds and the like) could follow whichever device I interacted with most recently.
A premise of MVC that we have in large part forgotten is that views can be plural and diverse.
As you point out, countless developer hours are used porting applications to myriad devices. I'd rather conserve that effort. Have the computational guts be on a high-performance, high-connectivity, well-known (e.g., x86) application host. Then only the views need to be made plural, in a manner similar to (but obviously more comprehensive than) responsive web design. More comprehensive because some devices will be touch-enabled, some will be large, some will be small, some with a keyboard, others without, etc.
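A toy sketch of that shape (all the names here are invented for illustration): a single model instance broadcasts every change to whatever views happen to be attached, whether that's a desktop window, a tablet, or a phone.

    # Toy version of "one application instance, many live views".
    class EmailDraft:
        """The single authoritative instance, living on the application host."""
        def __init__(self):
            self.text = ""
            self.views = []              # attached views: desktop, tablet, phone, ...

        def attach(self, view):
            self.views.append(view)
            view.render(self.text)       # a newly attached view sees the in-progress state

        def type(self, more):
            self.text += more
            for view in self.views:      # every keystroke shows up on every view
                view.render(self.text)

    class View:
        def __init__(self, device):
            self.device = device
        def render(self, text):
            print(f"[{self.device}] {text!r}")

    draft = EmailDraft()
    draft.attach(View("desktop"))
    draft.type("Dear team, ")
    draft.attach(View("tablet"))          # pick up the tablet mid-sentence...
    draft.type("the meeting has moved.")  # ...and both views now update together

The real thing obviously needs networking, authentication, and per-device input handling, but the shape is the point: the state lives exactly once, and views are cheap to add and drop.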
All that said, I do like what you're talking about and building. I look forward to seeing that project come to light!
Oh I completely agree with what you describe. (I use exactly the MVC pattern you describe when I work on contracts to pay the bills.)
Personally, however, I maintain that even at the level you describe, the tools just don't exist to reduce the time significantly unless you give up native application feel, i.e., streaming the view and just sending the input back. The web is a great start to such a system, but it still has a lot of hurdles to cross if it wants to be that platform (including performant (i.e., native) 3D rendering, NAT traversal for user-run applications (just switch to IPv6 already!), saving data client-side, threading!, different input and display methods like you mentioned, etc.).
I also have other, fundamental, problems (more like pet peeves) with the way the web is designed to work. But my main motivator is the massive problems I have with programming languages. I should probably start a blog...
Sitting behind a triple-monitor setup with large displays, I don't think display tech is what is holding us up; neither are gesture control or higher bandwidth.
It's simply 'good enough' for just about anything that I'd want to do with a PC (and then some). The problems - if you can call them problems; I'd prefer to call them challenges - are about reducing the need for all these interfaces.
The best computer would work like Siri does, only it would be really intelligent. That sort of quantum leap would transcend any mere improvement in hardware. All this eye candy and visual stuff does not let me work any more productively than I could 20 years ago with just a 15" green-phosphor CRT. Displays are not the problem.
I think you have to realize that nobody wants to make "desktop computers" because the margins you have to make in order for that to work aren't available at the moment.
What I have seen is that it started out being computers (look at the Altair, for god's sake, with its switch panel!), and then it became a computer and some 'office applications', then 'office applications' and you could develop on it, and now 'applications.' The best way to develop for a tablet or Win/RT system is on a workstation with some development tools.
What is interesting to me is that the "PC" overtook the "Workstation" (Which was very much a dedicated development device) and killed it. Now as the "PC" market moves to more turnkey solutions, nothing has yet backfilled the void being left behind.
I see that many folks believe that the workstation of the future runs a virtual instance on the other side of a network connection; your "terminal" is a couple of specialized applications running on your application-running device. I can easily see Google taking the Pixel and making it work sort of like 'spaces' used to work on the Mac, except when you zoom into your 'Workstation' space with your xterm windows, your debugger, and your documentation browser, it's really hosted by some RDP- or VNC-like protocol to a service back in the cloud somewhere. It isn't a diskless workstation; it's a terminal with a really rich serial protocol that runs at 10 megabaud.
You claim "But for now, the desktop should re-assert its rightful role as a chief computing environment, making all other devices mobile views."
And I suspect it will, if the price of doing that is better than the price of doing it "in the cloud" (or remoted to the network).
> Now as the "PC" market moves to more turnkey solutions, nothing has yet backfilled the void being left behind.
I don't understand. I can buy a very powerful PC at very low cost from any nearby mall. I can build one custom from components I buy at NewEgg or Amazon. I can get a Mac with its great fit and finish. I can choose Windows, Linux or OS X.
What void? Nothing went away. Workstations are better and cheaper than ever. Go buy one.
I would like a better screen, I'm not interested in the Metro interface, and the new MacPro has no internal slots. There's not much excitement happening in desktops/workstations, sure. But they didn't go away.
> I think you have to realize that nobody wants to make "desktop computers" because the margins you have to make in order for that to work aren't available at the moment.
Let's regress further. Why are margins so slim in the desktop computing market? Well, for starters, aside from Apple, there are no non-Windows desktop vendors worth talking about. And Apple doesn't license their OS.
So the Windows tax has basically crippled the PC market innovation-wise. And Microsoft has done little to promote progression in PC standards in the past 10 years.
Microsoft and Apple have given up on the desktop market (Apple because they discovered they could print money by making smartphones, and Microsoft because they're a monopoly that gets its money, innovation or not).
I can imagine there being a large market for large, slightly curved or similar displays running on ultra-fast processors that seamlessly create environments like this http://blog.100percentgeek.net/wp-content/uploads/2012/08/de... without looking aesthetically ugly.
A sort of super-iMac, if you will.
They would probably be hugely popular among the IT/sysadmin/monitoring crowd too.
I definitely agree that desktops can have a huge win in terms of immersion. In addition to the ideal desktop screen being very high-res and very large, I think they would also use Parallax Barrier technology to show 3D imagery without requiring you to wear shutter glasses.
Perhaps once again Apple could be the one shaking it up. They already lead the charge with retina displays on the laptops, and we can hope it will come to desktop as well.
If the new Mac Pro is any indication, they might try to do innovative things on the graphics side. As they are in a position to force a boost in the graphics performance of all their hardware, they might be the only ones in a position to do so.
On the control side, I found the Magic Trackpad to be really different and a lot more usable for basic tasks and gestures (I'd use a mouse for gaming and a pen for drawing, but everything else is easier on the trackpad).
Too bad no company seems to be really advancing on the personal networking side. Google would have the hardware and software knowledge to pull it off, but as long as advertising is their core business it wouldn't make any sense. For now, a Synology-like NAS with pluggable apps seems to be the least cumbersome solution for getting devices to talk to each other.
Thanks! I have come to accept my ideas as a little crazy—perhaps if I were charitable with myself, I'd say unorthodox—so I appreciate your kind words. My blog is: http://tiamat.tsotech.com/
[1] http://tiamat.tsotech.com/technology-sucks