> Intel customers who use Ubuntu will not see any regressions as we will simply continue to support XMir in the Intel driver as part of Ubuntu.
This is an issue only if you wish to use XMir outside of Ubuntu. Ubuntu controls their own ecosystem so they can just patch/compile/repackage their distribution as needed, like they do for many other packages (e.g. Mesa, xf86-video-ati, and xf86-video-nouveau). The core issue is that they wanted to push XMir support upstream so that other distros could use XMir.
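Concretely, the distro-side workflow is roughly this (a sketch assuming the stock Debian tooling; the driver package is named xserver-xorg-video-intel on Ubuntu):

    # Fetch the packaged driver source and rebuild it with a local patch.
    apt-get source xserver-xorg-video-intel
    cd xserver-xorg-video-intel-*/
    # Drop the XMir patch into debian/patches/ and append its filename
    # to debian/patches/series, then build new .debs:
    dpkg-buildpackage -b -uc -us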
Cryptic messages are making people draw their own conclusions as to what the issue is.
> We do not condone or support Canonical in the course of action they have chosen, and will not carry XMir patches upstream. - The Management [1]
> I've said it before, I'll say it again. You will not make your open source project better by pulling another open source project down. - Michael Hall [2]
I think we are not seeing the full discussion happen in public light and there is a lot of guess work happening.
In all likelihood, few other distros will be interested in shipping XMir. Pushing upstream is usually a better thing to do because it helps reduce the maintenance burden - you're not maintaining a patchset.
Intel's decision is basically saying they don't want any of the burden of helping maintain (X)Mir. Canonical/Ubuntu make a rod for their own back every time they settle on these Ubuntu-only solutions imho.
I don't really see why Ubuntu is getting bent out of shape about this. Maybe this is just a media/blogger-generated grudge. Many very popular open source packages are maintained out of tree before getting pushed upstream. DRBD [1] was a very popular package which, after years of patches, finally got accepted into the kernel. I think there is a burden of proof required before asking someone to maintain your patch set upstream.
I don't think "Ubuntu" is particularly bent out of shape. Obviously, as the authors of these patches, they're already maintaining them in their own tree. All distros have long lists of local feature patches they maintain. And X.org/Intel aren't particularly concerned either; all they did is say "no". Patches get rejected every day.
This is just the community yammering away over a particularly flameworthy fork. It's a proxy war over the "Why did Ubuntu write Mir?" question. The actual path to upstream for the patches is mostly irrelevant.
I think you're mistaking what is probably largely a political issue for a technical one. The sense I get here is that there is nothing technically wrong with the patch; it's not as if Intel went back to Ubuntu and said you have to fix X, Y, and Z to get this merged. They initially accepted the patch on technical grounds, and only afterwards came back and rejected it because Mir isn't the horse they're backing.
If anything I would guess Intel is the one getting bent out of shape here, though I couldn't possibly know for sure due to lack of details.
Intel is acting in the best interests of the community, which almost unanimously backs Wayland.
Linux is not about choice[0]. It's 2013, and the hard lesson that choice is death to a reliable platform should have been well-learned by now, if by nothing else than the sheer fact that Apple steamrolled the Unix workstation market into oblivion. Mac OS -- which provides comparatively little choice -- grows YoY, whereas non-Mac desktop Unix, including Linux, has faded into statistical noise.
If non-Android Linux is going to claw its way back into relevance, then the community needs to decide on one framework for graphics, one for sound, etc. and back them wholeheartedly. It is a technical issue because focusing all developer effort on one good thing produces better technical results than splintering it among a dozen mediocre things. Apostasy needs to be punished, and that's what Intel has done.
If Linux is ever to become mainstream the way OS X and Windows are, then 'apostasy' must not be accommodated.
Now you can debate whether the philosophy of the Linux and open-source community is for creating a standardized product that can be easily adopted and used by the mainstream user-base. But for companies like Intel, I understand why they want to push it in that direction.
Linux already is mainstream, just look at the success that Android has had.
And there are still competing frameworks out there that are objectively better than their decided-on counterparts.
ALSA replaced OSS, then OSS4 came out. OSS4 is more stable and can do more on its own than ALSA can, such as per-application volume control, yet it's only used in the BSD sphere.
Likewise, OpenRC is a great replacement for SysVinit that keeps the overall structure of the init system, unlike systemd, which attempts to consolidate all of those parts into 'optional' modules that other projects are making compulsory; e.g. GNOME Shell now has a hard dependency on systemd, since logind can no longer be used on its own, to say nothing of the large changes to the init design.
Linux doesn't need to become mainstream the way OS X and Windows are. Linux is mainstream the way Linux is, and will continue to be with Android, Steambox, and a huge chunk of the smart TVs, Blu-ray players, set-top boxes, and other embedded device markets.
I'd say it's understandable in this instance - I sympathise because it was initially accepted, and it's not a large patchset. I suspect if there hadn't been the various tech articles about how Intel were accepting XMir, the patchset could have stayed in (supposition, of course).
I'm confused. I've seen this a bunch of times, and I don't understand why Canonical submitted a patch to a video driver to "support" their Xmir window server. Why does a video driver need to have code in it to support a window server? Isn't it supposed to be the other way around? Isn't the window server supposed to work on top of the video driver (and all the other hardware drivers)? What's going on?
On Linux it is actually the window server that provides the interface between applications and the video driver.
Linux does not have a standard graphics driver interface like Windows does; at the moment it is sort of defined by X, and since Wayland and Mir aim to replace X, the interface will be replaced too.
This means that every driver needs to interface with each of the window managers.
I would agree that is a silly idea, but such is the world when applications need very low level access to hardware to attain real time performance. You just get a tangly mess.
> On Linux it is actually the window server that provides the interface between applications and the video driver.
This hasn't been true for a long, long time. Modern Linux applications directly call into the graphics driver, either through OpenGL or an intermediate library such as Clutter.
> This means that every driver needs to interface with each of the window managers.
This is also not true. Window managers are driver-independent, and you can easily write your own window manager without any knowledge of the underlying hardware. You might mean "display server" instead of "window manager", but there's only one display server used in Linux, and that is X11.
X11 was designed in an era before standardized graphics or input APIs, so it doesn't use OpenGL. As a result, it needs its own set of drivers for nearly everything (keyboards, mice, graphics, overlays). That's why Linux graphics drivers need to have special X11 support.
Wayland and Mir are an attempt to replace X11 with a much thinner layer, handling only the bare necessities of getting the driver and the user application together so they can talk OpenGL to each other.
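To make that concrete, here's a minimal sketch (mine, not from any of these projects) of the modern path, where a client reaches the driver through EGL rather than through an X11 driver module. It assumes only the standard EGL headers and library (e.g. Mesa's); build with gcc egl-probe.c -lEGL.

    /* egl-probe.c: open the driver directly via EGL, no X required. */
    #include <EGL/egl.h>
    #include <stdio.h>

    int main(void)
    {
        /* This call lands in the vendor's EGL/GL library,
           not in an X11 driver module like xf86-video-intel. */
        EGLDisplay dpy = eglGetDisplay(EGL_DEFAULT_DISPLAY);
        EGLint major, minor;

        if (dpy == EGL_NO_DISPLAY || !eglInitialize(dpy, &major, &minor)) {
            fprintf(stderr, "no usable EGL driver found\n");
            return 1;
        }
        printf("EGL %d.%d from %s\n", major, minor,
               eglQueryString(dpy, EGL_VENDOR));
        eglTerminate(dpy);
        return 0;
    }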
Ok, that makes more sense (or at least explains why). I'm puzzled that there hasn't ever been anything done to make this less of a tangly mess; ironing something like this out seems like it'd be a huge step towards making Linux a better platform. However, that's probably just wishful thinking, as I have no idea what it'd take technically or politically to make that happen.
There have actually been many tries, Wayland being the most recent of them, and now Mir. But because the earlier ones relied too much on the old X code, they were not successful, and Wayland has not yet secured support from the graphics card manufacturers (except Intel). Mir is promised to change all that, but we will have to see.
Yes, I don't understand this, either. Note that a) the Intel X driver does not, AFAICT, have any special code to support Wayland and b) the code Ubuntu want to add appears to be for XMir, which is an X server that runs on top of Mir, not for the Mir display server itself. An X server running on top of a different display server should not need specific support in the drivers of the X display server; that XMir apparently does makes me worried Mir is going to be an architectural clusterfuck.
This code is an optimization, not a strict requirement, as you're right that in theory XWayland or XMir could be supported in a hardware-agnostic manner. For XWayland this is the (formerly "wlshm") xf86-video-wayland X driver: <http://cgit.freedesktop.org/xorg/driver/xf86-video-wayland/>, but it is much slower than the hardware-specific options (which also exist for the radeon and nouveau X drivers). I don't know if there's an equivalent hardware-agnostic X driver for Mir, but in no case would a production-quality windowing server have that as its only method of compatibility for an important hardware segment--both Wayland and Mir developers expect X-dependent applications to stick around for quite some time, so they must perform as well as possible.
For what it's worth, my hat is firmly in the Wayland camp; its design is simpler, its development more open, and its motivations more clear than those of Mir. Canonical does a bad job of software stewardship and I'd hate to see them in control of the dominant graphics solution for Linux.
Indeed; X.Org is also X11-licensed (an MIT license variant).
Wayland and Mir driver support is pretty much a mystery, and Canonical has been trying to engage hardware designers (Nvidia, Intel, AMD) for support. This recent spat between Canonical and Intel may be indicative that the process isn't going well.
Ubuntu's next LTS release is 14.04, so it's 8 months away, and if there's any surefire way to piss Canonical off, then Intel is definitely getting that job done. I'm not trying to defend Canonical, but it's hard to ignore the reality of the situation.
I wouldn't be surprised if BSD licensing is on the table front-and-center during negotiations, considering the legal barriers surrounding GPLv3.
Also, it makes me wonder where this leaves Valve. I get the distinct impression that they've been getting cozy with Canonical and they might be getting nervous about the situation.
Hmm, I would say the opposite. The recent spat between Canonical and Intel might indicate that Canonical's engagement with competitors NVidia and AMD is going well. Canonical doesn't really need to cooperate with Intel, because Intel already makes excellent open source drivers for their graphics cards.
I think Valve is the reason that Canonical can work together with NVidia and AMD, and that they're probably not nervous about this at all.
Not really. I much prefer the GPL, since I think copyleft is (at least partly) what has made Linux the massive success it is, and I think a copyleft licence for a display server would similarly be better. But in this situation, I have to support Wayland over Mir. The reason largely being that Wayland vs. Mir actually gives credence to the argument that Linux can't succeed on the desktop because of fragmentation. Normally in response to that argument, I'd say it's not that big of a deal, and that things like different package managers and desktop environments keep a healthy level of competition, while still having a great amount of cooperation. But when it comes to display servers, we're talking about something that requires driver support, which is already one of (if not the) top problems on GNU/Linux. Forcing the small teams that deal with Linux graphics driver development to choose between using their already limited time to support the general standard or what the most popular distro uses is a horrible decision.
Not really. GPLv3 will prevent hardware manufacturers from locking down their devices if they intend to ship Ubuntu Mobile. This is a good thing for users of those devices. I can't defend Canonical's support of proprietary drivers, especially on these devices that are known to be used for such pervasive spying, but at least a non-hardware-locked device allows for workarounds (e.g. deployment of your own free drivers).
I'm sorry, but I must be missing something. How does a GPLv3 display server prevent locking down the hardware? AFAIK, the anti-TiVoization stuff only applies to the software that's GPLv3, not the stack it's running on.
Can you point me towards some people arguing this? It's patently false. If it were true, every single application on your typical GNU/Linux box would also be GPL.
Yep. However, if you're trying to replicate something that's already built on top of an existing piece of software using the existing APIs, I'd assume you wouldn't have to patch the platform just to build on top of it.
However, as tinco explained[0], it seems that there is no "standard graphics api" for linux, and the "link" between window servers and graphics drivers is hard coded. So you do in fact have to patch the video driver to provide support for a new window server (on Linux at least).
I find the dissonance of this conversation and the optimism of Gabe Newell in the "Gaming on Linux" article fascinating.
Graphics has been screwed up on Linux ever since the 3dfx Voodoo 1 and nVidia TNT wars. It spilled over into the ARM SoC space.
I don't know what the answer is, but I find it mind-boggling that a chip company would decline to support someone's efforts to make software useful to more people on that company's chips.
It's about maintaining a clean codebase. If a maintainer believes that a proposal is genuinely the wrong way to achieve a goal, it is counterproductive to accept it. This happens all the time in the Linux kernel, for instance, where one or more further iterations are required to get something accepted. It's just juvenile to stamp one's feet because some upstream disagrees with a particular approach. You argue the technical merits, and then you live with their decision by going your own way or revamping your submission. It's hard to imagine open source working any other way.
So you're saying Wilson thought Mir would allow a clean codebase a few days ago when he accepted a Mir patch, and now he doesn't, despite not giving any technical reasons for the change in opinion? (Me: complete outsider, but the turn of events sure sounds fishy.)
The answer to un-screwing Linux graphics is Wayland, and every major player in the ecosystem agreed to it. Then Ubuntu decided to go their own way so the other players are freezing Ubuntu out.
> Based on the grant of rights in Sections 2.1 and 2.2, if We include Your Contribution in a Material, We may license the Contribution under any license, including copyleft, permissive, commercial, or proprietary licenses. As a condition on the exercise of this right, We agree to also license the Contribution under the terms of the license or licenses which We are using for the Material on the Submission Date.
That doesn't make sense. At the time, Wayland was more ready than a project written from scratch. Their initial manifesto also called out a bunch of things as failings of Wayland that were patently false. The Wayland core devs also did not know that Ubuntu was having these issues, IIRC.
So either:
1) Ubuntu didn't understand Wayland at all, and decided to write their own from scratch without talking to or consulting the Wayland devs.
Or
2) Ubuntu has an ulterior motive (e.g. Not-Invented-Here syndrome).
The hardware supports it. The specs are open. But they don't have to take responsibility for any patch that rolls their way. When you accept a patch like this, it means you have to be responsible for fixing it if it breaks. If Ubuntu wants to make their own display server separate from everyone else, they can maintain their own Intel driver patches too.
I wish Linux video would go the way of Linux audio.
Linux audio just works. There's a single userspace API (PulseAudio), a single kernel driver API (ALSA), and all the old nastiness of the '90s (ESD, aRts, OSS) is dead and gone.
In contrast, getting video to work properly involves figuring out what hardware you're running (nVidia? ATI/AMD? Intel? misc other?), then hunting down the drivers, installing them, rebooting, hoping the hardware gets detected correctly, and so on until the user gives up and settles for a half-broken system. I consider myself fairly skilled with Linux stuff, and I still can't figure out how to get my desktop to have both kernel modesetting and accelerated OpenGL at the same time.
Except that it doesn't just work. Ignoring the hardware aspect altogether for a moment, PulseAudio itself is inherently buggy. If you run it for long enough with its volume control app open, the daemon sometimes crashes and everything stops playing sound. (It looks like that might be due to a bug in the peak volume reporting code resulting from an incredibly misleading variable name, but I'm not sure - none of it's commented, so it's impossible to figure out how it's meant to work.) Resampling glitches out randomly and requires a restart of the daemon. There was a patch to the resampling code in the latest release which was outright incomplete and broken but got included anyway, and since the PulseAudio devs don't do stable point releases, the only release with the fix in it will be a major one with a whole bunch of new regressions. I've encountered major bugs with fixes in the Fedora package that weren't submitted upstream at all, even though the patches were written and added by Lennart Poettering himself.
Then there's the hardware-specific fun: headphone outputs that don't mute the speakers when they should due to driver or hardware quirks, outputs and inputs that plain don't work, weird volume glitches where exiting one app causes all the others to shift in volume due to PulseAudio's info about the hardware volume control being wrong...
Another problem with PulseAudio is with latency and routing, which is why a lot of professional audio programs on Linux use JACK instead of PulseAudio. It's sometimes very hard to get JACK to play nice with PulseAudio, and this is exacerbated by the fact that ALSA only allows one program to use one audio device at a time. So it doesn't always "just work"...
Isn't it kinda the kernel's job to allow multiple processes to access shared resources? Imagine if only one process at a time were allowed to access the disk.
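In fairness, plain ALSA can share one device between programs via its dmix software-mixing plugin. A minimal ~/.asoundrc sketch (the card/device names are assumptions; check aplay -l for yours):

    # Route everything through dmix so several programs can share
    # the card without PulseAudio.
    pcm.!default {
        type plug
        slave.pcm "dmixer"
    }
    pcm.dmixer {
        type dmix
        ipc_key 1024          # any unique integer
        slave {
            pcm "hw:0,0"      # first card, first device (an assumption)
            rate 48000
        }
    }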
I've heard that you can get PulseAudio to connect to JACK, which in turn connects to ALSA. The idea here is that latency-sensitive applications that use routing will connect to JACK, while applications that connect to PulseAudio (like Skype, Firefox, etc.) don't care about latency. However I haven't been able to get this to work properly (why isn't this set up as the default on any of the distros I've tried this on?)
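The recipe I've seen for this, assuming your PulseAudio build ships the JACK modules (module-jack-sink/module-jack-source), goes roughly like so:

    # Start JACK against the hardware first (device name is an assumption):
    jackd -d alsa -d hw:0 &

    # Then hook PulseAudio up to JACK instead of the raw device:
    pactl load-module module-jack-sink
    pactl load-module module-jack-source
    pactl set-default-sink jack_out   # the JACK sink's default name

My guess as to why no distro ships this by default: jackd wants realtime privileges and a running user session, which is awkward to arrange out of the box.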
Frankly both audio and video "just work" in Linux, and have for years. It has been several years since I bothered to look up what chips a computer had and what Linux support was like before making a purchase.
I just buy the hardware I want, assume it will work, and find that it does. Maybe I've been getting lucky, but I don't think so.
The only real remaining pain in the ass is printer support, but who the hell uses printers these days?
Audio just works if you're not a power user, or use only programs compatible with each other. Try using a JACK application alongside an ALSA one and you'll see it doesn't "just work".
Even then, if you stick to ALSA applications, you'll have a few issues:
1) You're going to be limiting yourself to a lot fewer possibilities (Skype uses Pulse, IIRC, for example)
2) You can't have simple settings such as per-application volume control.
Sure it "just works", but it does if you use a set of programs which only the people who say "it just works" (a lot of people, I know) restrict themselves to. While it may work for you, different people have different needs.
I'm still waiting for the day Linux audio (and video, but less hopeful because of proprietary drivers) simply works in the same way it does under Windows/Mac.
Footnote: I completely forgot to mention latency issues, which is awful for audio processing.
JACK is not a part of the standard Linux sound experience for anybody but sound technicians or musicians. I'll fully concede that this particular situation may be shit, but it does not represent the typical Linux user, or even the typical Linux power user, experience.
Can't say I've ever noticed Pulse latency, even while playing Quake, except when using Pulse's networking features. Maybe I have crude ears.
I'm trying to convey the fact that you need something like JACK in order to do real time audio DSP, while in Windows/Mac, it just works. More importantly, in 2013, per-application volume control should be a given. It's possible under Pulse AFAIK, just not under ALSA.
It may not represent the typical Linux users, but while audio processing also doesn't represent the typical PC user, it works under mainstream operating systems. Until such a simple thing also "just works", I won't be happy with sound on Linux.
Basically, I'm from the group of people who want things to work on Linux for the same use cases they work on mainstream OSs before I state that it "just works". I believe saying things "just work" only because they work for most of my needs is detrimental, as it gives a wrong image and may frustrate people trying to do something different, as has happened to me in the past; I've caught myself trying to do non-standard things a lot (read: most) of the time.
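For the record, per-application volume is there once PulseAudio is in the picture; it's raw ALSA that can't do it. A quick sketch with pactl (the stream index below is a made-up example):

    # List running playback streams and their indices:
    pactl list short sink-inputs

    # Drop one stream (index 42 here, an assumption) to half volume:
    pactl set-sink-input-volume 42 50%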
> I'm trying to convey the fact that you need something like JACK in order to do real time audio DSP, while in Windows/Mac, it just works.
There's a ton of Windows applications that take over audio and do a lot of driver weirdness. There are still ASIO incompatibilities, latency issues, etc. In fact, it happens in the same space where PulseAudio and JACK intersect on Windows.
Having something that just works similar to the way CoreAudio just works would be a killer feature to have in Linux.
Instead, we get ALSA and PulseAudio, the odd couple of audio interfaces.
Did you try nVidia graphics + UEFI bios + encrypted disk?
Wasted a whole day at work with that one. The nVidia drivers don't support framebuffer consoles, only VGA, but UEFI doesn't provide VGA console I/O. And an encrypted disk needs human interaction to type the passphrase in at bootup, meaning some kind of working console is required. Meanwhile, the non-proprietary drivers crash the X server if you so much as jiggle a window about.
Your best bet with desktop Linux is still to buy old, cheap, common-denominator hardware. Anything recent is a gamble unless you're very technically adept or have access to professionals.
Not with proprietary drivers. AMD/NVidia's failure to produce usable drivers of their own isn't particularly relevant as far as I am concerned. In fact, it serves to highlight just how good the FOSS drivers are at "just working".
"Meanwhile, the non-proprietary drivers crash the X server if you so much as jiggle a window about."
That makes absolutely no sense whatsoever. Frankly, I do not believe it.
My HTPC drives the desktop on the TV, and frequently the HDMI audio doesn't work.
With my ATI graphics card on my desktop, I have a script that runs on startup to force the display driver to detect my 1920x1080 monitor; otherwise it displays the screen with big black bars all around the edges.
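For the curious, such a script typically looks something like this -- the output name and the modeline here are assumptions (cvt 1920 1080 60 prints a suitable modeline; xrandr -q lists the real output names):

    #!/bin/sh
    # Hypothetical startup script: register and force a 1080p mode.
    xrandr --newmode "1920x1080_60.00" 173.00 1920 2048 2248 2576 1080 1083 1088 1120 -hsync +vsync
    xrandr --addmode HDMI-0 "1920x1080_60.00"
    xrandr --output HDMI-0 --mode "1920x1080_60.00"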
Proprietary driver I assume? That always has been, and always will be (as far as my crystal ball permits me to see), a clusterfuck. If you opt-in to that sort of pain, don't act surprised when you receive pain.
Yeah, I was given a Poulsbo netbook as a "present" a few years ago... at the time I thought, cool, Intel has great Linux support... not!
I'll stick to NVidia for now, despite all the fuck-yous they get.
I've had real problems with the display stack once in the last maybe 10 years. And that was simply a case of the software support for Ivy Bridge not being released far enough before the hardware, and thus not supported out of the box on the latest stable release of anything at the time. (Of course our criteria for working might be different -- I care about stuff showing up on my monitor and X not crashing. Couldn't care less about kernel modesetting).
In contrast, audio tends to randomly completely break after every other update. Most recent major breakage I had: for one whole Fedora release, the headset mic input would not work in the one app where I need it to work (Skype). Worked fine on the previous and next releases, with the same Skype binary.
Minor problems that can be worked around just randomly come and go. Right now, on one machine, all audio playback stops working every few days until pulseaudio is killed. Another computer randomly goes into a mode where all audio playback from Chrome is sped up by (IIRC) 7%. Just enough to make you uneasy due to things being off somehow, but not enough to make it obvious that there's a problem. This persists until pulseaudio is killed and restarted, of course, which magically fixes things.
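The ritual in question, for anyone lucky enough not to know it:

    # Kill the per-user daemon; with autospawn enabled it comes back
    # on the next client connection:
    pulseaudio -k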
The only time I have to hunt down video drivers for Linux is when I decide to use the closed nVidia driver. Other than that, it's all kernel/X-supplied drivers.
You must be drunk or something... I, for example, owned a machine where audio only worked after I installed OSS4 in the kernel, AND removed PA and ALSA so they stopped crashing some Wine stuff.
It won't be like Linux audio, because all Linux audio drivers are open source. If Intel can't get AMD and NVidia to support Wayland, and Canonical spoke the truth when they said that AMD and NVidia will support Mir, then what reason will people have to use Wayland? It will only work with Intel cards.
Intel is actively taking a stand against open source. I know that's unlike them, but their stance just makes no sense from a free software perspective.
When you look at it from the perspective of Canonical enabling AMD and NVidia to cooperate with introducing a new graphics driver interface, a shadowy doubt is cast over Intel's motivations.
This is the part that irritates me most about Mir:
By rolling their own display server, Canonical has set back efforts to get all display vendors on board for supporting Wayland.
If Ubuntu had switched to Wayland (along with the other distributions), there would have been a very strong incentive for all vendors to support it in their drivers.
If Red Hat had supported .deb packages maybe packaging wouldn't be such a pain in the ass and we wouldn't have a .tar, .deb and a .rpm for every program.
There are a lot of things in Linux land that don't make sense. I think picking on Canonical for choosing to do what they want, as opposed to all the others who are choosing to do what they want, is very un-Linux-like.
The difference is that the choice of package system is completely in the domain of open source software. Proprietary drivers require support by hardware vendors.
If we had solid, feature-complete open source graphics drivers this wouldn't be an issue.
If this were the case, then why does a large swathe of proprietary drivers support only SurfaceFlinger?
If it's the most popular display framework in terms of use, then why not simply make it the default? Neither Wayland nor X11 could garner the numbers required to overtake SurfaceFlinger.
Can someone summarize the story in a way that doesn't presuppose prior knowledge of what Mir or XMir is, and who isn't familiar with the Ubuntu-Intel history?
The X Window System (known as X11, after the 11th version of its protocol) is currently the predominant way for Linux users to get a graphical interface, specifically a windowed interface.
It is old and was designed for a purpose that matches neither current usage nor current technology. For this reason, a number of projects have arisen to replace X11.
'Wayland' is the best supported of these projects, whilst 'Mir' is similar but has much less support. Mir was started specifically for Ubuntu, and has caused significant controversy in the community.
Until these new projects are completed, and in order to support legacy software written for X, bridges that allow X11 compatible software to run on Wayland or Mir have been created. These are called XWayland and XMir respectively.
== Current Story ==
The open source Intel drivers support X11, and will increasingly support Wayland (not sure of the current state of this).
Canonical wrote a patchset that adds support for Mir to the Intel drivers, and requested that this be included in the main Intel project.
Intel has, for seemingly political reasons, denied that request. Canonical will thus have to maintain the patches themselves (it is common for many distributions to maintain patchsets in this way). If anyone else wanted to use Mir, and use Canonical's XMir bridge, they would need to source it from Canonical directly rather than from the main Intel repository.
It seems that Intel doesn't want to support multiple new windowing projects, and has backed Wayland over Mir (as have many others in the community).
Does anyone know if Intel could just expose an API that is entirely agnostic of the display server, so Wayland, Mir, XMir, or some random other project could all use the same interface and get full functionality?
X.org is an ancient beast that works, but is creaky. Intel has really been leading the charge to develop a new display server named Wayland. After much publicity from none other than Mark Shuttleworth about how Ubuntu was going to adopt Wayland, Canonical decided to write their own display server from scratch, named Mir.
They want to use the same code for Ubuntu on phones, desktops, tablets. Intel doesn't much care to support anything other than Wayland because after 10+ years of X.org, they know better than anyone (most of all Canonical) what is best when it comes to display servers. They are putting all of their weight behind Wayland and don't see XMir as a viable future. As a result, they made a decision to not support it upstream.
Their driver, their call. Seems relatively cut and dried to me. As usual, another case where Canonical can't play with the rest of the open source community.
Edit to add all below:
I would also highly suggest anyone curious to read:
Those two articles are both critical of Mir, but are also from some very senior developers working on the low-level Linux/X plumbing. I'd argue they are more qualified to speak about the failings of X than the Mir developers, primarily because they've been the people fixing its bugs for the past many years.
Maybe some people have it out for Canonical no matter what, but the fact that Wilson accepted the patches only to reject them days later while blaming some anonymous "management" seems to pin this entire "controversy" on Intel. If the only accepted course for Canonical is to shit-can Mir and take whatever Wayland eventually becomes (I mean, who's to say the same wouldn't happen to any Wayland patches Canonical produces?), it shouldn't surprise anyone that Canonical prefers working on projects they have some opportunity to influence.
[1] http://cgit.freedesktop.org/xorg/driver/xf86-video-intel/com...
[2] https://plus.google.com/109919666334513536939/posts/QzAyr5fo...