StarLite 12.5-inch Linux tablet (starlabs.systems)
629 points by focusedone on Aug 17, 2023 | 332 comments


A while ago I experimented with an old Surface tablet, throwing some Linux distros at it to see what the experience was like. I tried Ubuntu, Manjaro, and Fedora, and all were somewhere between extremely janky and downright broken. The on-screen keyboard was completely broken under Wayland: some apps, like Firefox, never popped it open, while others ignored its input. X had trouble with screen rotation. Fedora wouldn't let me past the login screen unless I connected a keyboard, compounded by a bug that locked the screen every sixty seconds. Manjaro lasted the longest, but by that point I had given up trying to fix things and reverted to using the Surface as a laptop with a touch screen.

I don't know how much touch-screen-only computers are considered during the development of desktop environments and display servers, but my experience left me with the distinct feeling that I had strayed way outside the bounds of these applications' target user. Maybe I just had bad luck. If this truly is an under-served side of the Linux desktop experience, hopefully a successful product like this will help push the ball forward on improving support for the niche. I wonder how much the popularity of the Steam Deck is pushing the KDE team to improve touch screen support as well.


I just bought a ThinkPad X1 Tablet Gen 3. Everything works great: the on-screen keyboard actually functions properly and appears on demand every time, and screen orientation is handled properly and promptly. The only letdown so far is battery life, but I need to make some adjustments to help there.

I'm running the latest version of Fedora Workstation, under Wayland too.

This thing is the only truly repairable tablet I've encountered so far: 8 screws and the display pops off, no glue, no can opener required like on a Surface Pro. 9 more screws (all captive) to get the heatsink off and you're at the full-size NVMe SSD.

https://mos6581.com/pictures/thinkpad/x1-tablet.jpg


I love my X1 tablet gen 1, but I need to repair the screen on it. Not sure if it's still true for the gen 3, but the gen 1 had a lot of modular parts you could add, like an extra battery that also gives you an HDMI port. Plus, I had this ridiculously nice solid carrying case.


What’s the battery life for you? You didn’t specify. Is it bad compared to tablets, or compared to that same device with Windows installed? And how bad exactly: is it 3–4 hours, or is it 10 hours?


3-4 hours right now, but there is about 20% battery wear and I've not set up TLP yet.


I don't know how recently you tried this, but I had the entirely opposite experience with an old Asus Transformer Mini (which has similar detachable-keyboard shenanigans to the Surface). Ubuntu had some issues rotating the screen, but Fedora worked out of the box. The only complaint was that sometimes you had to coax it into bringing up the on-screen keyboard; everything else, from screen rotation to the GNOME finger-swiping gestures, worked really well in both laptop and tablet mode. Maybe it's Microsoft using a particularly strange hardware stack, or just luck.


>On screen keyboard was completely broken with Wayland, with some apps like Firefox just not popping it open

Did you check whether FF was actually running in Wayland mode? Not trying to blame you; the current situation is pretty dire. The only reliable way of finding out whether something is actually using Wayland is running xeyes and seeing whether the eyes move above the target window.


> The only reliable way of finding out if something is actually using wayland is running xeyes and seeing whether the eyes move above the target window.

In Firefox all you have to do is navigate to about:support and verify that Window Protocol shows wayland.


I think the parent meant in general.


>running xeyes and seeing whether the eyes move above the target window

Or running xlsclients
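For anyone who wants a list rather than the xeyes dance: in a Wayland session, native Wayland windows are invisible to X11 tools, so everything xlsclients reports is running through XWayland. A minimal sketch (the `xwayland_clients` helper name is mine):

```shell
#!/bin/sh
# In a Wayland session, native Wayland windows are invisible to X11
# tools, so every client xlsclients reports is running through XWayland.
# This helper reduces xlsclients-style lines ("<host>  <name>") to a
# de-duplicated list of client names.
xwayland_clients() {
  awk '{print $2}' | sort -u
}

# Only query the live session if xlsclients is actually installed.
if command -v xlsclients >/dev/null 2>&1; then
  xlsclients | xwayland_clients
fi
```

An empty list means everything on screen is talking Wayland natively.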


Been using wayland on Fedora for years and I don’t know what you’re talking about. Intel and AMD graphics here.


"Wayland" works fine, but as soon as you start mixing X11 and Wayland apps, it starts getting complicated due to different levels of functionality support for the two APIs.

I started running into them pretty fast when I added a 4K monitor to my Ubuntu 22.04 system and enabled fractional scaling. All of my Wayland apps look fine, and all of my XWayland clients look awful, blurry and obviously scaled. Working one by one to switch them over to native Wayland has been a hassle for various random reasons, most notably:

* Slack: works from the icon in the taskbar, but when auto-started at login does so in X11 mode

* VS Code: works fine when run from Terminal, but when launched from taskbar icon shows up as a different "app" in the taskbar. Launching from the terminal starts in X11 mode

* Zoom: requires more setup in order to share the screen. The Zoom UI for screen sharing doesn't work, so it's not obvious how to stop sharing. It also freezes or crashes all the time for no discernible reason, including locking up when joining some meetings three times in a row and then succeeding on the fourth try.

So if you're not really leaning into what Wayland offers it works fine, but even just fractional scaling has been a four-month hassle to get things working as expected.


I've been using fractional scaling and only native Wayland apps since May 2021 (on Fedora) though I don't use Slack, VS Code or Zoom.

As you know, "native Wayland" means the app has been modified so that it can talk to the graphics hardware using the Wayland protocol.

Emacs is the hardest of my apps to persuade to talk Wayland protocol because Fedora's "emacs" package was compiled without the code that talks Wayland.

To persuade Chrome to talk Wayland protocol, I start it with specific command-line arguments, which used to cause bugs, but I haven't noticed any bugs for months.
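For reference, these are the usual knobs (flag names as of recent Chrome/Chromium builds; the `choose_backend` wrapper is purely illustrative):

```shell
#!/bin/sh
# Firefox opts into native Wayland via an environment variable:
#   MOZ_ENABLE_WAYLAND=1 firefox
# Chrome/Chromium pick their backend through Ozone flags, e.g.:
#   google-chrome --ozone-platform=wayland
# (recent versions also accept --ozone-platform-hint=auto).
# Tiny illustrative wrapper: derive the flag from the session type.
choose_backend() {
  if [ "$1" = "wayland" ]; then
    echo "--ozone-platform=wayland"
  else
    echo "--ozone-platform=x11"
  fi
}

# Usage: google-chrome "$(choose_backend "$XDG_SESSION_TYPE")"
choose_backend "${XDG_SESSION_TYPE:-x11}"
```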


I suspect this is largely because the Microsoft Surface products contain a lot of proprietary hardware components that aren't well supported by the kernel. KDE and friends have pretty good touch screen support, but it's all for naught if you have no drivers.


> Microsoft Surface products contain a lot of proprietary hardware components that aren't well supported by the kernel

and, to be honest, the Surface itself is not repairable at all.


I found the Ubuntu Wayland touchscreen keyboard very broken on a Dell laptop with decent Linux driver support. It's a real shame; it worked perfectly with X.


No OSK on X11 though, on KDE.

Gnome has on both.


I am using a Surface Go 2 with Arch / Wayland / GNOME. I use the linux-surface kernel and all of the hardware works except the IR camera and webcam. Overall, it is a good experience. I love using it as a notebook with the pen and Xournal++. Battery life is 7 hours or so.

The OSK pops up when I need it to. There is a GNOME extension to make it work better, adding Ctrl, Alt and cursor keys for instance.

I also use a GNOME extension to force apps to open maximized.

I use it primarily as a tablet at home and a travel laptop for work. I am quite enamored with it.


I'm using Debian Sid with KDE Plasma Wayland on an acer 2-in-1 laptop/tablet. It works fine, Firefox needs to be started in "wayland" mode. The on-screen keyboard could be better, but it does pop up predictably. Screen rotation just works.

edit: Touch controls also work, including gestures. It has a pressure sensitive stylus, which also works OOB.


The Surface lineup is a weird one because Microsoft uses non-standard firmware, but even still, the support on many devices is great. I have a Surface Go 3 and despite the poor processor it flies on Fedora; I use it daily for light tasks. Perhaps try the linux-surface kernel?


As someone who just this week tried to put Linux on a Surface (laptop 4), I can categorically say I'm not going to try that again. Totally broken, not even booting properly.


To be fair...Windows 11 on the Surface Laptop line is janky as shit.

I'd be far more interested in what experiences people are having with Linux on a proper Surface Tablet.


I remember working at an IT desk and trying to help a poor college student with their Surface Tablet, it would immediately thermal throttle just opening Microsoft Word, the whole unit slowed to a crawl. I helped them remove the bundled antivirus trial crapware and turn off a bunch of startup junk, but between the performance and the uncomfortable keyboard and trackpad it seemed like a miserable device to use for any serious purpose.


To be fair it’s running great here.


>I don't know how much touch screen only computers are considered during development of desktop environments and display servers

Probably close to zero. Gnome/KDE devs daily drive desktops/laptops, and for phones/tablets they don't run FOSS Linux devices but rather iOS and Android.

Desktop Linux is already a niche market, and FOSS Linux tablets/phones are even more niche. The only "Linux" built and polished for touch from the ground up is Android, but I put Linux in quotes for good reason there.


To answer the general point first:

>I don't know how much touch screen only computers are considered during development of desktop environments and display servers

Phosh, Plasma Mobile and SXMO are DEs for phones and tablets, so they support touch fine. I run Phosh (also on pmOS) on my phone and have none of the problems you described. GNOME is also making its own GNOME Mobile.

---

Now, for your specific hardware: I don't know how old your "old surface tablet" is, but I run postmarketOS on the very first Surface RT and it works fine. It uses the kernel from [1] and requires a one-time, semi-complicated procedure to bypass Secure Boot and switch away from Windows [2], and even then it has some issues, with CPU scaling etc. not supported. So I won't recommend it as a general-purpose Linux tablet, but it's good enough for what I use it for.

>On screen keyboard was completely broken with Wayland

It's a Tegra 3 chipset, so I don't run a Wayland compositor on it (it would need CPU rendering), just Xorg with i3. I haven't tried an OSK inside i3, but the OSK at boot time for entering the disk encryption password (unl0kr) works fine.

If you do run Wayland, Wayland-native programs should be able to auto-launch the OSK because they invoke the input-method protocol, and XWayland programs probably won't. This is the behavior I see on my phone running Phosh: Firefox and foot (a terminal) showing Wayland windows trigger the OSK (squeekboard), whereas Chromium using XWayland does not. Even then, though, Chromium does respond to OSK input when I trigger the OSK manually.

>X had trouble with screen rotation.

The grate kernel supports reading the tablet's accelerometer sensor, so I just wrote a script to listen to iio-sensor-proxy signals and run `xrandr` to rotate the screen accordingly.
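A sketch of what such a script can look like. The orientation names are the ones iio-sensor-proxy's monitor-sensor tool prints; the left/right mapping is the common one but may need swapping depending on the panel:

```shell
#!/bin/sh
# Map iio-sensor-proxy orientation names onto xrandr rotation names.
# The left/right pairing below is the usual one but can be mirrored
# on some panels; swap the two if your screen rotates the wrong way.
orientation_to_xrandr() {
  case "$1" in
    normal)    echo normal   ;;
    left-up)   echo left     ;;
    right-up)  echo right    ;;
    bottom-up) echo inverted ;;
    *)         echo normal   ;;
  esac
}

# monitor-sensor (from iio-sensor-proxy) prints lines like:
#   Accelerometer orientation changed: left-up
if command -v monitor-sensor >/dev/null 2>&1; then
  monitor-sensor | while read -r line; do
    case "$line" in
      *"orientation changed:"*)
        xrandr -o "$(orientation_to_xrandr "${line##*: }")"
        ;;
    esac
  done
fi
```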

[1] https://github.com/grate-driver/linux

[2] https://openrt.gitbook.io/open-surfacert/common/boot-sequenc...


My point was more towards the 2-in-1s and tablet PCs, not really phone or 'true' tablets. Devices that have been pretty popular over the last decade, like Surface Pros (I had used a Pro 2 and Pro 3 during my experiment) or Lenovo Yogas.

----

Didn't really consider postmarketOS. I haven't played with installing mobile operating systems on desktop hardware in a long time.

----

Yeah, I think what screwed up my testing was applications using XWayland without me noticing. The stable Firefox snap at the time (which is preinstalled on Ubuntu) apparently uses it. I had only used Xorg before, as I've never felt the need or desire to step into the tarpit that is migrating to Wayland.

----

Neat workaround for your tablet, glad it works well for you.

----

End of the day, I'm sure that I could have puttered on it and eventually got it all working (aside from the hardware on the surface that appears to be completely locked down), but it's a pretty poor showing out of the box. YMMV and all that, and I hope that more investment in the space makes it a better experience in the future.


Phosh and SXMO at least (not sure about Plasma Mobile, but probably it too) generalize to running on large screens with external keyboards, so they should work on such hardware as well; you can try them if you're still interested.


> I wonder how much the popularity of the Steam Deck is pushing the KDE team to improve touch screen support as well.

It works like a charm on my OLED touchscreen laptop. I am actually surprised by how much better KDE looks compared to Windows or macOS, and how stable it is (KDE 5.27.4, that is).


I have a Surface with Ubuntu and the Surface kernel (https://github.com/linux-surface/linux-surface), and it works really wonderfully. I will say, before installing the Surface kernel it was very janky.


I plan on putting MX Linux on an old Surface Pro tablet, probably in a couple weeks when I have some time. Any gotchas you encountered that I should be aware of?


Interesting, because I have been running Arch Linux on a Surface Pro 7 and it works flawlessly apart from the camera. I use the linux-surface kernel; instructions can be found here: https://github.com/linux-surface/linux-surface/wiki/Installa.... Highly recommend it!


Why did they have to use a micro HDMI port? Couldn't they have done dual USB-C and made one the DisplayPort output? Micro HDMI is not a very nice connector: it's a lot of pins under mechanical stress from the heavy, rigid HDMI cables people generally attach to it.


From <https://us.starlabs.systems/pages/starlite-specification>:

  Micro HDMI
  USB Type C 3.2 with Power Delivery 3.0
  USB Type C 3.2 with Power Delivery 3.0
  Micro SD Memory Card Reader
  3.5mm Headphone Jack
  HDMI version: 2.0
  USB-C Interface: Display Port (DP Alt Mode)
  USB version: 3.2 Gen 2 (up to 10 Gbps)
Maybe I am missing something, but it seems to have Display Port for their USB-C?


Oh nice! That makes it a lot more interesting


It's very odd that their 12-port dock also shows a DP port in the image, but it's not listed in the specs.

But the 2x HDMI are visible + listed. :(


DisplayPort has the huge advantage of being incredibly easy to passively convert into the ultra-legacy-weird-difficult HDMI, whereas the legacy-centric-pita-gross HDMI requires absurd active adapters to turn into DisplayPort.

100% more usb-c please, with alt modes. USB4 mandates every port be able to do DisplayPort output.

I really hope we start to see phones and tablets which have >1 USB port. Lenovo has an absurd beast phone, a Legion phone, with both dual batteries and dual USB-c (USB3) ports. If we get to 2030 and phones can't plug in to GPUs something is f-ed and the system is broken, tech has ossified grossly. Hopefully happens sooner, and hopefully we see dual ports emerge midway to then too. Would be such a great capability set.


> If we get to 2030 and phones can't plug in to GPUs something is f-ed and the system is broken, tech has ossified grossly.

Why? That’s an incredibly minor use case. If you’re expecting something this niche to become the norm then you’re destined to be disappointed.


We had USB 3 with the Samsung Galaxy Note 3 in September 2013. It has now been almost a decade. If rumors are true, Samsung will announce a USB 4 (Thunderbolt 3) phone before the end of 2024. I agree with the grandparent. While most people don't care too much for eGPU (?), I'm sure if we let people innovate, good things will come.

My dream is much smaller. I just want whatever hardware circuitry is required to accomplish the scenario where a phone that is plugged in to a good power source runs directly from the wall, shutting off battery charge completely, not trickle charging all day and night.


One of the tragedies of USB-C is how it can be anything from dumb charging-only cables to ones capable of 40 Gbps data transfer, among everything else.


It makes working out which hole to plug something in to rather tricky. Especially when the little lightning bolt or other graphics have rubbed off.


Seconded.

And even before then.

I have an M1 Macbook Air which will only output video over one of its 2 USB-C ports, not the other. There is nothing visible on the case or in the OS to indicate this.

I have had an Arm and an AMD ThinkPad which both have only dual USB-C, and both unpredictably switch between one or the other being bootable, with no discernible pattern.


> I have an M1 Macbook Air which will only output video over one of its 2 USB-C ports, not the other.

Weird; ever since I've had a USB-C based Mac Mini or MacBook (two Intel, two M1), they could reliably output video on any of the two, three, or four ports (as long as I don't go past the limit). They're essentially symmetric on all features.


USB4 mandates 40 Gbps. It mandates DisplayPort. These would be pretty helpful baselines to expect, reasons for consumers to want USB4: they know it will be fairly featureful.

PCIe transport ("Thunderbolt") and Power Delivery are both optional though, I think.


This. People can get away with implementing really, really bad USB-C ports and you're just supposed to expect that no two USB-C ports are born equal.


My Lenovo Y700 does passthrough power. It also has a mode where it only starts charging if the battery is below 40%, and it'll stop at 60%.

Unfortunately it's not a phone; it's a mini tablet. But I find the size ideal for daily use: browsing, reading PDFs, small sketches. (No SIM slot though.)


> It also has a mode where it only starts charging if the battery is below 40% and it'll stop at 60%

Wish every battery powered device would have a setting for this.

As opposed to always charging when charger is plugged in, and always charging up to 100% (non-configurable).


My sample size isn't big, but on Linux every laptop I've seen has amazing battery reporting: battery level, sure, but also various assessments of wear, and things like real-time charge or discharge rates.

But more notably, I think around half also have charge control. It's just been on/off. But it would be a pretty basic bash script to make this happen.
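That "basic bash script" can be sketched roughly as below, assuming a laptop whose firmware exposes charge thresholds in sysfs (ThinkPads do via thinkpad_acpi; the BAT0 name and the 40/60 window are assumptions):

```shell
#!/bin/sh
# Sanity-check a charge window before writing it to sysfs.
valid_window() {
  [ "$1" -ge 0 ] && [ "$2" -le 100 ] && [ "$1" -lt "$2" ]
}

BAT=/sys/class/power_supply/BAT0   # battery name varies per machine
START=40  # resume charging when the level drops below this
STOP=60   # stop charging once the level reaches this

if valid_window "$START" "$STOP" &&
   [ -w "$BAT/charge_control_start_threshold" ]; then
  echo "$START" > "$BAT/charge_control_start_threshold"
  echo "$STOP"  > "$BAT/charge_control_end_threshold"
fi
```

On machines without those sysfs files the script is simply a no-op.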


I have a Huawei MateBook 16, and under KDE (Ubuntu) I can set a charge limit in the energy settings.

But it depends on the laptop. From what I understand, not all laptops have the drivers for power control, i.e. the ability to tell the laptop from software to stop charging.


Pretty sure almost any phone already has the hardware for software-defined charging. Android just doesn't have an API call for it, unless it does and I just haven't heard yet.

Maybe we should be filing feature requests for it.


And it was so bad they went back to Micro-USB on the Note 4.


All Galaxy S models were Micro USB until they switched to USB-C on the S8 series.


I was talking about the Note 4. And the Galaxy S5 had USB 3.


You may be onto something, as Lenovo shut down the super-badass ultra-phone. https://www.androidauthority.com/lenovo-legion-phone-shutdow...

But reciprocally, what's the new ultra-hot device? Gaming decks. Lenovo just announced theirs today. Steam, Asus, and dozen of others have amazing devices. And there are still gaming-centric phones galore.

This should be an easy ask. It should be a lock. Android, alas, is kind of a weird, divergent, hard-to-use OS that makes everything difficult. Apple hasn't supported anyone else's GPUs in a long, long time. But in general this should just be easy, simple, and doable, were it not for the sins of these ultra-bizarre, not-PC-but-so-close systems. ChromeOS finally found jesus and is now running on Wayland, because it was obviously the correct and only sensible choice all along, and it has huge advantages, such as a huge world of people optimizing and making the system better. I don't know how Android ever pivots (they just steal everything ChromeOS, which runs Android great, is doing), but it should, so that it can make ideas like this unimaginably easy and simple, versus today, where this sort of idea is a painstakingly slow and awful endeavor to make happen.

So much of computing is a story of niche finding leverage & finding adherents. The early adopters are just those who see potential, and they continually have reshaped what computing is. Writing off "niche" as uninteresting & minor ignores how sensible & clear & obvious so many advancements are & should be.


I wish Android and Linux would just merge. Give people access to a Linux userland that lets you install Nix packages (most anything but Nix or similar would be a step backwards from Android's reliability during updates), and then Android could very easily be most people's only OS.

If people could plug in their phone and run Linux apps on a full size monitor they'd probably pay a lot more for the phone, knowing they didn't then also need a nice laptop.


The general population only need to run a browser on their external monitor, with Office 365 or Google Docs and Gmail. There is no need for anything else. If one is not a software developer, which Linux apps do you feel are needed?

I used Samsung DEX a few times on my tablet. The mouse plus touchscreen combination works well. The keyboard is a little worse because there is no Esc key on Android and Ctrl [ is uncomfortable.


Gimp/Krita, FreeCAD, more and more games, Inkscape, Ardour, Audacity, Calibre, Sound Converter, Converseen, PrusaSlicer, etc, plus all the other stuff that people would probably discover.

If one IS a software developer of course, then you probably need a lot of Linux apps, and it would be cool to have them alongside a platform that's way more reliable than Linux, so that your browser and email and calendar and music are always there and don't bog down.


So, maybe AOSP should become another Linux desktop?


Because that’s what standards are for: a single base of universal rules that allows you to build what you want on top of it.

The point is not that phones should be connected to GPUs, that’s a silly idea.

But nevertheless for plenty of reasons including ecology and autonomy of people, it should be the case that any device should work with any other device as long as anyone is able to develop a driver. And there should be no permanent lock against custom drivers.


DP can only be passively converted to HDMI if the source uses DisplayPort++/Dual Mode, which is not supported by the DisplayPort Alt Mode over USB-C spec (Why? Who knows).

Every usb-c to HDMI adapter has to actively convert the signals which is why one end is usually much larger than the other.


Aren't there too few pins for that? Plus, it would just be needless extra hassle when their goal is to eventually drop HDMI entirely.


>I really hope we start to see phones and tablets which have >1 USB port

We're much more likely to get phones with 0 USB ports (justified by 'security', 'water proofness' or 'simplicity') where the only charging is wireless.


I haven't seen one of those in many many years! I wish HDMI in general would die and let displayport take it place.


> I wish HDMI in general would die and let displayport take it place.

That would be an interesting thing for the video production industry. Basically the entire market is divided between "professional" equipment, in which SDI continues to dominate (with some movement towards SMPTE 2110 aka IP), and "amateur" equipment which is all HDMI - with very few products on the boundary and supporting both connectors.

Consumer-grade digital cameras have only recently (10 years, maybe less) been able to output the live video feed over HDMI. Before that, believe it or not, I've stumbled upon MANY camera models from 2013 or before that had an HDMI port, but all it was good for was displaying pictures from the memory card on a TV.

It would certainly make life easier in a lot of "small streamer" setups. Currently, if you don't want to torture your camera's battery, you need to get a silly (often third-party) "dummy battery" that you can (hopefully!) plug into an ordinary USB power supply; and on top of that, a separate mini/micro HDMI -> full HDMI cable to plug into a capture card. If you could reduce that to a single USB-C with PD and DP - trust me, every silly cable you can eliminate from your setup is an enormous win.

Even better, if these cameras could talk the regular USB "webcam" protocol in addition to DP, eliminating the capture card for the overwhelmingly common setup of "I just want to look very good on video calls".

But that opens a can of worms: in any non-trivial setup, a camera (one camera) is merely a small piece of a much more elaborate puzzle. Even seemingly simple setups end up converting the signal back and forth between some crazy stuff. On one job we needed to run an SDI or HDMI cable between floors, but couldn't do either because the building was untouchable, so we used a couple of HDMI -> HDBaseT converters to run the signal over existing Ethernet cables. It turned out it was no longer possible to convert the resulting HDMI signal again to SDI (we tried many converters; all failed), which limited our choice of video mixers. Would the signal have made it through if it had originated as DP? Your guess is as good as mine.

Broadcast is a strange place. I still laugh whenever I think of Quad-SDI; only the broadcast industry could ever come up with that. Things need to work with one another and even if every single person in the world agreed that HDMI must die, starting today, I'm fairly certain we'd still see new equipment being made in 2033 that supports it.


I have spent the last few months doing a deep dive on broadcast / audio engineer standards. The lack of reliability and strange standards are interesting...

It seems like the last few standards started really robust and open because of the lack of compatibility, and then greed got involved and vendors just slipped in something to make it difficult cross connect. I assume so people would have to buy more of their stuff.

The focus on "realtime" makes the standards have worse quality in practice (bad handling of dropped or bad bits), and makes it much harder for the IP based standards to be routed (network congestion from high bitrate through uplinks). WebRTC by comparison can be quite nice.

I seriously don't have any hope for sanity in that market.


What's the practical difference? With USB-C there's power delivery. DisplayPort has..?


> DisplayPort has..?

Consistently good colors. Sometimes even PC monitors won't negotiate properly with the PC and start displaying washed-out colors. IIRC this had to be forced on the Pi side on a Raspberry Pi. Not all monitors can be adjusted.

Consistently no under/overscan. For some reason, the TVs we had in conference rooms figured it would be smart to cut the borders of the image and zoom in, so you get missing bits and whatever's left is a blurry mess.

For the TV situation, you usually don't have a full remote to adjust it, if it even supports that. You often don't have time to look things up in the TV's typically crappy menu system. I've usually found the option to disable overscan, but using full-range RGB seems less common.

Also, HDMI seems to lag DisplayPort capabilities when it comes to higher resolutions and refresh rates. When my 2013 MBP came out, it could drive a 4k@60 screen over DP. HDMI required the 2.0 version to do that, which, IIRC, came much later.


>Also, HDMI seems to lag DisplayPort capabilities when it comes to higher resolutions and refresh rates.

I think that one very much depends on when you choose to look at it. HDMI 2.1 has more bandwidth than DisplayPort 1.4, enough to do 4k@120, which DP 1.4 couldn't do without dropping color down to 4:2:0. DisplayPort 2.0 devices are starting to come out, but even Nvidia's RTX 4000 series still doesn't have DisplayPort 2.0 (though it does have HDMI 2.1), while TVs started supporting HDMI 2.1 around 2019, with the PS5 and Xbox Series X having HDMI 2.1 ports.

So while DisplayPort may be ahead now with DisplayPort 2.0, HDMI was ahead for at least 4 years with HDMI 2.1


That's a fair point. I admit I was judging by the availability on the PC side (I don't follow the console market).

Although, if I'm not mistaken, my particular PC monitor initially didn't support HDMI 2.0, even though it's a 4k panel. There was a further revision which included it. I have that revision, but support is still somewhat wonky, in that it can't seem to switch on its own from 2.0 to 1.4b.


Display support is always last in this chicken-and-egg problem. As far as I know, there are still no displays supporting DisplayPort 2.0, so any of the higher-end monitors require Display Stream Compression. I can't even get a DisplayPort 2.0 MST hub so I can chain multiple 1440p@144 monitors, which is definitely something HDMI can't do.


Ok so that sounds like DP has better negotiation protocol spec and/or implementations.

I've only encountered overscan on TVs, not monitors, but I don't give conference room presentations, where it would be very annoying. On Macs there's compensation for that.

I've had trouble with colors where it uses YPbPr rather than RGB, but that seems to be an Apple thing where it's done on purpose for non-Apple-approved displays and it happens for both HDMI and DP. Generating a custom EDID profile fixes that. Can't recall having trouble with color range, sometimes the display has a setting but the default always looked better to me.

I've used 4k@60 HDMI just fine (and 4k@30 on an early Apple adapter), but more often use 1080p anyway. I use USB-C with my 4k displays which likely runs DP on them.


> Ok so that sounds like DP has better negotiation protocol spec and/or implementations.

Right. So... DP is better than HDMI?

The color issue I had was not with a Mac but with an HP laptop on an HP monitor. My understanding is that there's something about "broadcast colors" or something, which is "regular" RGB only with a narrower range. I think the PC thought the monitor was a TV with a limited range, which the monitor was not.

With the TVs I plug into, it's usually harder to judge since their color rendition tends to be all over the place anyway and tend to have the reverse issue (too much contrast).

I remember the overscan control on the mac, but it still was a PITA to have to fiddle with that instead of, you know, just plugging the screen in and being in business.

While I've also had numerous positive experiences with HDMI where things seemingly "just worked", I've never had an issue with DP. It always worked. Hell, even my gaming GPU, which came out a while after HDMI 2, and supported it out of the box, connected to my monitor with full HDMI 2 support, still has weird colors compared to DP. No tweaking in the AMD drivers managed to get me the proper output, so I went and bought a cheap Chinese DP KVM instead. Which worked with no fuss.

All this makes me automatically pick DP if given the choice, and discount any computer or screen that only does HDMI. Which makes it pretty tough to buy a TV, so I just watch movies on my computer monitor.


Yeah totally agree - I’ve had quite a few situations where a monitor or tv looks blurry and washed out with HDMI and it’s immediately fixed with a DP cable.


I've read this article from HN about it: https://hackaday.com/2023/07/11/displayport-a-better-video-i... It's video done right!


USB-C is the connector / cable, DisplayPort is one protocol that can run within such a cable. HDMI (a proprietary licensed protocol) can also. https://en.wikipedia.org/wiki/DisplayPort#Comparison_with_HD...


DisplayPort has MST.


Probably due to cheaping out on a few dollars on the BOM; not a great sign, to be honest.


But not cheaping out on pixels that you can't even see with a magnifying glass. 1920p on 12.5 inches, lol.



That doesn't explain the micro HDMI port, though.


And it already has dual USB-C, just without video out support. Pretty disappointing.


It has DP alt mode support you need to click the big + on https://us.starlabs.systems/pages/starlite-specification next to connectivity


I also scrolled down to look for this (in the market for a low-power long-lasting laptop that will be 99% docked).

I might get one anyway, if they ship to SA. At least micro HDMI is still dockable.

(Just out of curiosity - does anyone know of any other option to use an external display with this tablet? Are wireless displays a thing?)


Click the big plus on https://us.starlabs.systems/pages/starlite-specification next to connectivity to see

> USB-C Interface: Display Port (DP Alt Mode)


There are 2 USB-C ports, but it's worded so that possibly only 1 port supports DP out. So even considering just the USB-C ports, there is a likelihood they cheaped out somewhat.

Did you assume it definitely will support DP out for both?


USB-C is equally bad. I really wish they would have the rubber housing of the USB-C go inside the case like an IEC power connector. I break about one USB-C cable every week. Broke one today just by putting it in my backpack while connected to a power brick.

Power bricks of the 1990s didn't have this problem. The barrel connectors were almost indestructible. You could drop bricks on VGA connectors and they'd still work.


Wow.

I have never yet broken a USB-C cable. Not one.

When I used an iPhone for a couple of years, I went through approximately a Lightning cable per month, sometimes more. At one point I took a carrier bag of broken Lightning cables to the electronics recycling.

You must treat equipment exceptionally roughly.


I am full USB-C. I have a kid. Mine are treated very roughly. Not a single one ever broke. Comparatively, microusb has died on me quite a lot from regular usage, and always on the device side, which is so much worse.

(Now the cat chewing on cables is another matter xD)


Yeah, I've never once seen a USB-C cable break in that the physical connector itself was damaged. I've been through plenty that just seem to stop reliably connecting if they are in high-motion environments (the cable connecting my phone to my car's entertainment system, for instance)

I will say that I really wish they had managed to not have that middle section sticking up in the female connectors - cleaning out my phone's USB-C port is about 20x harder than cleaning out a Lightning port because I feel like I'm going to break that little thing in the middle.


No, I treat equipment as any consumer would.

Like if my phone is low on battery I plug it into a portable battery and shove the phone and portable battery (connected by a USB-C cable) into my pocket, like any consumer would. I then go hiking, snowshoeing, biking, sleeping, like any consumer would. Cables have bent and broken in my pocket in these scenarios, among thousands of others.

My parent comment was yesterday; today's USB-C cable broke when it got stuck in an office chair's wheel caster, bending the sheet metal housing. This doesn't happen with ANY connector of the 1990s. BNC, RCA, DB-9, 1/8" headphone jacks, 5.5x2.5 barrel connectors, they ALL withstood office chair crushing and were all probably designed with consumer abuse like office chairs in mind.

I've even run over DB-9 and BNC connectors with heavy duty carts, they were fine.


That sounds like a lot more active life than an average consumer. It would be nice if USB C could handle that kind of duty, but for most people, the 1/8 and RCA, or at least the cheap versions, might die of fatigue before the USB C has an issue. I remember non-pro audio cables as being insanely unreliable, like, you could break them in a week of light use.

USB C doesn't have that issue. There are high end cables and just kind of OK cables, but at least with the braided ones I normally buy, there aren't many terrible ones at any price. They decoupled reliability from craftsmanship and materials.

Pockets should be considered industrial duty environments.

Right next to a person is a much harsher environment than people think it is, stuff people carry tends to break and wear.

I'm not surprised the cables break in pockets in that kind of scenario, although I haven't experienced it myself with my admittedly much lighter and less varied lifestyle (lots of walking, occasional hiking, maintenance work sometimes including light carpentry, but no sports or biking).

There seems to be a very large variance in perceived reliability of cables and tech in general. Are you buying the same brand every time you need to replace something? Do more athletic people with more muscle have problems because their weight or the level of force they commonly apply is more likely to break stuff? Probably just unrelated factors, but it does seem like people who lift are more likely to hate new tech. Is it about what clothes you wear? Is it the phone itself having something about the connector?

Obviously USB C has a problem with a few specific environments, but I'm not exactly sure what those cases are.


I've found barrel connectors always fail within 18 months on a lot of laptops I've seen people buy (mostly from cheap brands such as Acer).

While not saying USB-C is a great choice instead. Personally I just wish everyone could have generic magnetic Magsafe style connectors.


Barrel connectors are easily one of my most disliked. Even if they don’t fail outright, they grow loose and finicky over time even if you’re trying to take care of them. On top of that, there’s numerous different sizes and pole configurations so even if an adapter is physically compatible, you have to look closely to make sure you’re not going to fry your device.

Just terrible overall.


So just agree on 5.5x2.5 and federally ban 5.5x2.1. The biggest source of finickiness is 5.5x2.5 plugged into a 5.5x2.1 socket.

Inventing a new standard with a weak sheet metal connector and 19 pins to deliver + and - isn't a solution to the above.


Most new devices don't have enough space for 5.5x anything. And 2.1mm is the more common one from what I can see.

Regardless it's a crappy choice for consumer electronics because it's got no voltage negotiation. You can fry stuff with it, and chargers aren't universal.

2.1mm is a wonderful connector for switches, sensors, solar panels, and power you need to daisy chain, but PD is pretty much perfect for point to point medium power devices.

USBC seems to break far less often than anything else that's cheap and small. I don't see why 19 pins is an issue unless you just really like simplicity.

It absolutely looks like something that would fail all the time... But it doesn't. Somehow they made it so even the cheap knockoffs are durable.


USB-C DisplayPort altmode would have required one to two extra chips for the port supporting it and some extra routing on the PCB.

So cost and complexity increase. MicroHDMI does suck, but despite that raspberry pi 4 has two of them.


Raspberry Pi also put a nasty old microUSB on the Pi Pico, so I'm not surprised they used a crappy connector, when there's way less reason to still use MicroUSB.

They're a great company but MicroUSB? Really?


At the time Pi Pico came out MicroUSB connectors were around ten times cheaper than USB-C. Now it has dropped to 2 - 4 times cheaper. In volume those differences do matter.



1st class Linux support tablet, coreboot firmware, & decent specs for ~$500? I consider that a great deal. I'm very keen on this.


No one has mentioned the 2880x1920 resolution display. That's fantastic in this price range. $500 laptops are (nearly?) impossible to find with displays above 1080p.


A friend of mine recently (2019 or '20) bought a laptop, new, with a 768p LCD display. It's ridiculous. In 2014 you could get a phone with a 1080p display, and it was 5" so it had a high pixel density too! Why does it feel like consumer electronics are going backwards?


Looking for a laptop for my Dad more recently (early 2023), there were no models with less than 1080p that were not also incredibly crummy in other respects (dog-slow eMMC drives, too little RAM to run current Windows well, especially once it swaps to that slow drive, awful-looking keyboards, etc.). We checked for lower-resolution screens because with his eyes higher is pretty pointless (and it might have reduced the price), but went with 1080p and set it to be scaled in the OS.

> In 2014 you could get a phone with a 1080p display

You can buy a full laptop for a fair amount less than many phones with high-resolution screens though, possibly more so back then. What spec was his machine beyond the screen? Also: most people are closer to their phone screen in use than they are a laptop, perhaps making the case for higher resolutions there stronger.

Reliably producing those 720/768 displays at common laptop & tablet sizes was cheap so making laptops/tablets around them was cheap, and more than enough people thought it good enough (or didn't know better). Given our experience above, the economies of scale on 1080+ panels have changed such that the 1080 screens are in the sweet zone.


The fact that $1,000+ 15-17in 1080p laptops are still being sold today is just atrocious.


I mean, yeah that resolution is ridiculous, but anything above 1080p on a screen less than ~14 inches is going to have diminishing returns and most of those returns will be in how "pretty" it is, not how useful it is.


Perhaps because people realized that the pixel inflation is at some point counterproductive. If you can't distinguish pixels, what difference does it make?


It feels like those cursed 1366x768 panels will still be hanging around somewhere or another forever.


It costs much more to make bigger high resolution screens with good yields.


What was the laptop?


My question is if that resolution is usable with 2x integer scaling (1440x960 effective screen real estate), because fractional scaling under Linux is still a bit rocky.
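(For reference, the effective logical resolution at an integer scale factor is just the panel resolution divided by the factor — a quick sketch of the arithmetic, assuming the advertised 2880x1920 panel:)

```python
# Effective logical desktop size at an integer scale factor,
# assuming the advertised 2880x1920 panel.
def effective_resolution(width, height, scale):
    """Return the logical desktop size at a given integer scale."""
    if width % scale or height % scale:
        raise ValueError("panel does not divide evenly at this scale")
    return width // scale, height // scale

print(effective_resolution(2880, 1920, 2))  # -> (1440, 960)
```
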


Fractional scaling works well on Plasma under Wayland if you're using an up to date distribution.


The DE itself, yes. This is true of GNOME as well once the hidden setting is enabled. Unfortunately there's still individual programs which don't behave correctly, which for now means that 1x or 2x displays are still best if a trouble-free, low-tinkering experience is the goal.


The GNOME setting is not hidden anymore, fractional scaling seems to be a first class citizen now.


My experience is that it makes a lot of things visibly blurry. Has that improved at all?


I use 150% on my main monitor and everything looks perfect except for two proprietary apps installed through flatpak: Zoom and Spotify. These two are not really known to be quality software in the first place so I think it is safe to blame them for this quirk.


Indeed; it works perfectly on my postmarketOS PinePhone, which has the most recent stable GNOME release.


Though other things don't work well. Like Plasma's whole toolbar with clock widget etc. It freezes, and you can't interact with the frozen widgets. Only answers are kill and restart, reboot, or switch back to X. It'll get there, but it's still not at a level I would say is daily driver (for me - I like my toolbar clock to actually tell me the current time).

(Kubuntu with KDE project's PPA to be on latest stable)


I'd try with a rolling release and see if you still encounter the problems. I use Plasma panels with the clock widget and haven't experienced freezes or had to restart anything.


No. It really isn't. I can tap a keyboard shortcut and change my desktop resolution in .1 increments and everything just works and scales up and down. I've run my internal laptop display at 1.7 for 3+ years. (Sway)


> No one has mentioned the 2880x1920 resolution display.

fyi: On their specification page[0] they list both 2880x1920 (which should be a 3:2 aspect ratio) and that the screen is a 16:10 aspect ratio. I'm sure the 16:10 aspect ratio is a typo since 2880x1920 is used in multiple places.

[0] https://us.starlabs.systems/pages/starlite-specification


I don't need another computer right now but this gives me a warm feeling that the Universe cares about me.

May I dream that by the time I do need this they will also have a combo offer along with a fully integrated linux mobile?


> fully integrated linux mobile

It already works fine with PureOS/Mobian.


So I've been using a Tablet PC running Linux for over two years as a primary device, my advice is that if your needs are more as a workstation get something with a reasonable CPU and fans. HP used to make a nice tablet PC, the Asus flow z13 is the second best choice.

My specific setup is here: https://www.reddit.com/r/ErgoMobileComputers/comments/vzs8mm...

Dell also has been selling a fanless XPS 13 tablet for the past year but I haven't heard of anyone using it. This said it's nice to see more vendors considering the tablet form factor with appropriate ports and display resolution.


And what about the software? I mean software that can be used without mouse and keyboard?

Currently the main problem is not the hardware; we simply don't have usable software. I have been daily-driving a PinePhone for many months (still do), but mostly with programs that I wrote for myself after a lot of frustration with the available software, because a keyboard and mouse are mandatory with 99.9% of the available Linux applications. Such basic (UX) things are missing and/or fatally broken, it's not even funny. This makes a new, n+1th Linux tablet a bit pointless in my eyes.


Sometimes I want to buy a PinePhone. Do you mean that one can't send SMS, place phone calls, browse the web, read (not write) emails without a keyboard ? (these four are 99% of my phone usage).


You can make a phone call, browse the web, send messages. Note that the modem keeps rebooting itself randomly (sometimes once a day, sometimes once every 10 minutes, with all firmware versions - this means entering the PIN again, finding the network, etc), so if you have to make phonecalls at least somewhat reliably, PinePhone is not for you. Same for PinePhonePro - the same crap modem is used in that also. Personally I have a dumb Nokia with me all the time, in case I have to use a phone, and the PP has a data-only SIM.

As of 2023 there is 1 actively supported email client with a mobile and touchscreen optimized "interface": Geary (at least that I know about). The scare quotes are there for a reason: you can't set up your account without connecting the phone to an external monitor, because the Next/Cancel buttons are off the screen during setup. Normally I would laugh, but now I don't feel like it. Have no idea if it works afterwards; I just deleted it at that point.

Note if you'd decide to get one: at least original PP is kind of abandoned. It is still a nice toy, but I wouldn't hold my breath that anything widely supported and usable will be published on it.


Drew Devault (Sway and Sourcehut guy) wrote this on his experience with Linux phones: https://drewdevault.com/2023/06/16/Mobile-linux-retrospectiv...

tldr: it works and it's great, but it's not reliable in phone/SMS in the way a phone needs to be reliable.


I've been using the Nokia N9 and some of the Jolla Sailfish offspring to this day. The phone part has been very reliable. The rest of the ecosystem is very minimal, almost non-existent, but I can't complain about the basic phone functionality.

With these ARM-style devices that are quite closed down and need their own respective device tree, mass adoption seems a requirement for a company and ecosystem to hit critical mass and be financially viable. I don't think a "Linux phone" is enough of a value proposition to hit that critical mass without any specific features that would appeal to a non-technical crowd and can't easily be replicated by the iOS or Android behemoths.

So personally I think it's not that the open-source/Linux approach can't produce a phone that has good-enough basic functionality, but that there's no clear way to market and sell these phones to a sufficiently large crowd to offset the massive investment into a new platform.

Linux as a whole might be far ahead of iOS and Android as a general computing platform, but a viable device would have to break through the mass production hardware (and marketing!) barrier.


I used to use Sailfish too, and it was fun at the time. But its messaging capabilities felt like they'd stagnated and it's never gotten support for things like group texting, which has only gotten more common in the years since: https://together.jolla.com/question/75552/messaging-applicat...

I would gladly switch back to a Linux phone if it had reasonable battery life and good support for the modern assortment of at least the open messaging protocols. (Obviously things like Discord are out of the question, but SMS/XMPP/Matrix?)



I obviously don't have one, but presumably there's a fan header that they didn't use and didn't bother to remove the reference from the bios in case someone decides to use it...somehow... for some reason.

idk, but a bios setting for fan speed isn't actually indicative of a fan.

edit: could just be a really roundabout way to increase TDP


Same image as used on their laptop https://starlabs.systems/pages/starbook


The Hyper-Threading setting is also present even if Intel N200 doesn't have it.


Reminds me of the Pine Tab (v1) with the contact pins at the bottom, fold-up keyboard, etc. Having an Intel processor would make me heaps more confident about good kernel support out of the box. The A64 by Pine is essentially abandoned for Linux kernel development.

This device looks great, I would just take care expecting too much from that keyboard. It looks almost identical to the Pine Tab keyboard and that was not great.


Sadly, Pine has a long history of using poorly-supported chipsets and doing nothing to improve support. I want to like the company, and their hardware designs show a lot of promise, but ultimately they do not stand behind their product.

It's truly a shame. Their products have potential and could build a huge hobbyist community behind them, but extremely few people have both the skills and the time to troubleshoot buggy kernel modules.

I agree that StarLite's Intel processor will probably work a lot better.


> Sadly, Pine has a long history of using poorly-supported chipsets and doing nothing to improve support.

It wouldn't even be difficult either, they could just charge slightly more and then pay a kernel developer to solve these problems.

> Their products have potential and could build a huge hobbyist community behind them, but extremely few people have both the skills and the time to troubleshoot buggy kernel modules.

It's not just buggy kernel modules, it's also significantly buggy hardware too.

The only thing I would say they have got going for them is good connections for hardware and manufacturing which help them reduce costs, otherwise another player could easily take over.


i have too many computers.... i have too many computers.... i have too many computers....


Product quality has fallen devastatingly in this historic phase: if there is a good product around, better to consider it while it is available.


Yes, but what about tablets?


One more is unlikely to result in divorce.

Probably.


I feel your struggle pal!, I have too many computers too!


As a long time iPad user I’m apprehensive. As others have noted, Linux based touch experiences have so far been lacking.

That said, I love the thought of it. I’ve recently started using my iPad Pro as my daily driver (replacing a Mac Studio M1 Ultra no less) and haven’t looked back. The only gripe with iPadOS is it took me a while to get a smooth vscode setup for development - having that be easier out of the box with the StarLite is quite appealing.


What? How?

I have an m1 iPad Pro. It mostly collects dust. It’s just atrocious to use for anything more than playing games or basic web browsing. Multitasking makes me want to pull my hair out. Window behavior is horrific. Why doesn’t stage manager allow me to actually resize window and put them where I want them to go? File management is garbage, too. There are just so many weird limitations and missing basic functionalities to make it a viable DD device.

I tried multiple times to make android tablets work for me before Google apparently gave up on them. Never could, but even back in the Xoom days android tablets offered more functionality than my current iPad does.

And this comes from someone who dumped android for iOS years ago.

The iPad is just a disgustingly gimped product.


I’m using iPadOS 17 and it is vastly better than 16, so once it comes out you should give it a shot. I can absolutely understand being unhappy with it before. What makes 17 so much better is that you have the freedom to move windows around where you want, but it also does a great job of managing focus and organizing them. I’ve used full-screen exclusively on Mac since Lion, but now I’m all Stage Manager all the way.

I do agree about file management, but I don’t really work with files at all on my device anyway - code is in the dev box I SSH in to, and everything else is in Google Drive.

For reference, 80-90% of my work is slack/gmail/gdocs/zoom. The rest is development work done using a PWA for VSCode, code-server on my dev box and Blink Shell for SSH (I would prefer Termius but I can’t have SSH port forwarding done automatically). This works quite smoothly and gives me access to pretty much everything I need.


I’ve been running iOS and iPadOS beta 17 since day 1. It’s nominally improved stage manager window resizing, however it broke:

iCloud sync (won’t sync my photos until I restart).

Sharing (can’t email stuff unless I restart. Just stays at the “preparing” stage).

Search (both iMessage and spotlight).

Safari is incredibly unstable and tabs or the whole browser crashes all the time.


Interesting! That is not my experience at all. My only issue has been the right-click contextual menu behavior being inconsistent with Google Docs and VSCode. I hate to say this, but have you tried a fresh install? I did that and it resolved the issues I had early on, and then once I restored my backup it was fine.


Yup. My iPad running ios17 is nominally more stable than my iPhone but both have similar bugs.


> Linux based touch experiences have so far been lacking

My Librem 5 works fine with the touch screen.


I was referring to tablets but thanks for pointing out the Librem. I’ll check it out.


“Smooth vscode setup for development”

I assume you’re using some sort of remote dev server?


Yes. I have an instance running on the dev box. The only issue has been an issue with the right-click in Safari not properly handling PWAs like VSCode, causing it to show the iPadOS context menu over the VSCode menu.

It has been resolved in Webkit proper, but isn’t yet resolved in iPadOS, so I use a workaround to deal with it. I’m also enjoying trying out micro, helix, neovim and emacs as well for purely terminal based editors.


Of course an Apple fanboy shows up with this comment. ChromeOS on a tablet is a perfectly fine experience, gesture navigation and all.

(And, very important to me, runs a "real browser", not some limited mobile variant. And the full Linux CLI environment.)


iPadOS Safari is just as capable as desktop Safari. Why do you think otherwise?

Also, it’s not 2006 anymore, maybe stop calling people “fanboy” for preferring a different computer than you.


What motivated me to want a "real browser" was bookmark management. Managing & reorganizing bookmarks on mobile Chrome was frustrating.

And I absolutely will label anyone a fanboy if they barge into a Linux conversation going "Rah rah Apple best Linux sucks" without actually contributing anything insightful, useful, or specific. Please don't just repeat the age old "Apple did it best" line, we already understand you think so. To people actually interested in the topic, all it seems like is you wanted to mention Apple for the sake of mentioning Apple, and that's why I called you a fanboy.


I'm not the person you responded to. I don't think "Rah rah Apple best Linux sucks" is a good characterization of his comment though. "This looks interesting but Linux has historically been a worse touch experience than iPad" is a better paraphrase, and I think the sentiment belongs in this discussion (because any entrant to the tablet market is competing with the iPad, whether you like it or not).


Once again, ChromeOS on a tablet is a perfectly fine experience. If you, or anyone, wants to argue about why a specific Linux-based desktop on a tablet isn't great, do that, don't just restate generalizations like "Apple is best and everything else is unusable and it's always been like that", that comes across as FUD based on opinions and fanboyism.

I've used ChromeOS tablets for years. I haven't tried KDE/Gnome on a primarily touch based device, just ChromeOS, so I can't speak for those, but I see no inherent reason why they'd be worse. There is an iPad some 30 feet from me right now, so I think I am in a position to be able to compare to that.


> iPadOS Safari is just as capable as desktop Safari. Why do you think otherwise?

That would be an incredibly negative statement on the state of desktop safari if this was true.


Looks much better, and cheaper, than I expected. Quality can't be phenomenal at that price point, but it looks well thought out from the ground up. Impressive!

And thank you for the 3:2 display aspect ratio <3

Gonna keep an eye on this one.


I'm not really at all surprised by the price.

It's more expensive than the base level Steam Deck, which is also a Linux touchscreen tablet (with some extra input hardware...), and that's Zen 2, 16 GB RAM, GPU, etc though with smaller&worse screen. Adding a comparable 512 GB SSD to the Deck makes it $649, compare against StarLite's non-preorder $713 price.

A full Framework 13" laptop is $1049 and that's a hugely more powerful CPU etc.


> aspect ratio

The specs say 16:10...


Where?

I couldn't find the specs for it, but the resolution is stated as 2880x1920 which ought to be 3:2.


At https://us.starlabs.systems/pages/starlite-specification

yes the resolution seems to indicate 3:2, but they also write 16:10 - out of what, it is unclear.

The tablet size is 11.15'' x 8'', i.e. around 7:5.
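The resolution arithmetic is easy to check — a small sketch reducing a resolution to its simplest integer ratio:

```python
from math import gcd

def aspect_ratio(w, h):
    """Reduce w:h to its simplest integer ratio."""
    d = gcd(int(w), int(h))
    return int(w) // d, int(h) // d

print(aspect_ratio(2880, 1920))  # -> (3, 2)
print(aspect_ratio(1920, 1200))  # -> (8, 5), i.e. 16:10, for comparison
```
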


I had been monitoring that Intel N200, periodically checking news about low consumption chips:

Jan 2023; 0.01µ (10 nm-class) process; 384 / 2048 / 6144 KB cache; 6 W TDP; 4 cores / 4 threads

Previously used ( https://browser.geekbench.com/v5/cpu/search?page=2&q=Intel+n... ) in some "Google Nissa" (could not find further details about what it is), "Micro-Star ADL-N Cubi N MS-B0A9" minicomputer, "Asus MiniPc PN42", "Lenovo IdeaPad 82XB"...

With a 38Wh battery, 12 hours seem optimistic but achievable under many conditions of power economy.
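That estimate is easy to sanity-check — a back-of-envelope sketch, assuming the quoted 38 Wh battery and 6 W TDP (real systems idle well below TDP, but the display and radios draw power too):

```python
battery_wh = 38.0  # quoted battery capacity
tdp_w = 6.0        # Intel N200 TDP

# Worst case: CPU pegged at full TDP (ignoring display, radios, etc.)
print(battery_wh / tdp_w)  # -> ~6.3 hours

# Average total system draw needed to actually reach the claimed 12 hours
print(battery_wh / 12)     # -> ~3.2 W
```
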


> "Google Nissa" (could not find further details about what it is)

I could finally find some references: according to Ryan Whitwam ( https://hothardware.com/news/premium-chromebook-x-laptops-po... ),

> Chromebook X machines will have to be built on one of four hardware platforms with high-end x86 CPUs: AMD Zen 2+ (Skyrim), AMD Zen 3 (Guybrush), or Intel Core 12th Gen (Brya & Nissa). The Nissa chips are the Intel N-series

so the N100, N200 and i3-N300


Somewhat related - will we ever see linux on old iPads?

Old iPads can't get updated anymore, they can't easily browse the web, apps can't get updated or reinstalled... Basically recycle material or dust catchers.

I'd like to jail break or whatever my old iPads and have them as simple linux tablets.


The 80s kid in me cries when I see such a powerful machine become just electronic waste.


Yeah, it’s really a shame. I have the original iPad and the hardware remains solid. If I could have an updated browser that could display modern websites without crashing, I would still use it.


Those machines have 256 MB of RAM. Is there any way to run a modern web browser in that?


I had to double-check that, I thought it was 512 MB. Yeah, that's not very much.

I also recall that the final iOS version for that device (iOS 5.x or 5.1.1) left less memory for apps and I regretted agreeing to that upgrade.

I really wish I could install a minimal Linux and the best minimal browser available. That would at least be an improvement over the doorstop that I have.


One tab at a time


lite.cnn.com


Another option is to create more progressive enhancement websites which work with such devices.

Giving older devices new life is part of the reason I started developing my framework.

For example, the websites linked in my profile are backwards compatible all the way to Netscape 3.0, while still supporting more modern features like in-place updating vote counters and adding dialogs to a page without reloading it.

And LLM use will only make this type of website easier to build, once you can ask, for example, "operator, please ensure the website markup is compatible with my particular device, which is an iPad 2 running iOS 6.0."

I think we're about to experience an amazing renaissance of the Web, with sites and services which bend over backwards to accommodate each particular user, device, and abilities combination, rather than telling the long tail to fuck off.

And I think it's the right way to go. Each device-user deserves to be catered to and supported, the same way we support wheelchairs and baby strollers with elevators and ramps, even though they're less than 1% of the traffic.


I agree that these devices should be supported, but the only entity that can actually support them is apple.

It's not enough for sites to work, the underlying software also has to be secure enough to handle the internet. That can come from Apple providing software updates to the device to keep it secure, or Apple providing a supported mechanism for someone to install Linux, or some other operating system that can be updated.

Unfortunately, apple does not provide security updates or any way to actually "unlock" them, so they are unsupported. Tailoring your site to work on these insecure clients is in a sense encouraging them to venture onto the internet, encouraging them to try other sites which might have untrusted 3rd party ads that pwn them and steal their bank cookies..... In that sense, it's more responsible to make your site only work with newer secure browsers than the reverse.


For orphaned devices, this is impossible, because Apple has already decided to abandon them. I think they can be still be useful for non-critical information browsing within a closed network of safe websites.

I think the accessibility necessity of supporting older devices is often underrated severely, while the need for security is overstated. New, currently supported devices people use day-to-day are also exploitable.


We had to pass regulations to get wheelchair accommodations.


Or even just as a second display


Luna Display [1] works with iPads running at minimum iOS 12.1.

Which means iPad mini 2, iPad Air, iPad Pro 1, iPad 5 and greater.

[1] https://astropad.com/product/lunadisplay


iOS 12 is way too new.

I have a Retina iPad stuck on iOS 9, which is now essentially an ebook reader and movie player.

I had two of them stuck on iOS 5. Perfectly working, cosmetically perfect, lots of battery life, but bricks.

Whereas I have a 1st gen Pro and a 5th gen (when the stock iPad got the Air form-factor) which both run the latest iPadOS, but their battery life is now a couple of hours if that. They're nearly useless.

I had a new battery fitted to the 5th gen. It is no better but now the screen is damaged with multiple artifacts visible.

There is no battery replacement for the Pro: it's too big and too fragile.

In other words, in important terms, the old ones were BETTER but they are now useless because of an outdated OS.


Newer macs and iPads can do this. Totally seamless… mouse moves over to the 2nd screen, etc. totally wireless.



Would love to see it, but my guess would be only if there's ever regulation to force Apple to allow it under climate concerns or right to repair type policy.


I think there’s a reasonable argument that manufacturers should be required to open up hardware when they no longer support it. The problem is I doubt that would achieve what most people here seem to expect for older devices.

You might be able to run a really basic Linux on the older devices, but their very limited RAM will severely constrain what you can run on them.

The native OS is so optimised for the hardware, trying to get better performance, or even matching power consumption on generic Linux is just not realistic.

Frankly other than for hobbyist purposes the last version of the native OS is probably as functional as they’re likely to get.

Having said all that, more recent iPads are powerful machines that are likely to still be very capable for as long as they will run. Asahi Linux is compatible with M-series hardware.

So while I think getting latest-gen performance, such as modern browsing, out of legacy kit is a pipe dream whatever Apple did, I think there are real possibilities going forward.


There's a lot of use cases outside traditional tablet functionality, such as running a webserver or turning those excellent Apple screens into a secondary monitor.


Or you know, you lot could stop giving them money.


Whenever I read this kind of comment, especially about Apple, I always wonder: what are you suggesting we do?

If the answer is "just don't have a tablet and buy a FairPhone or a feature phone, have less tech", I think that's coherent.

But if the answer is just "don't buy Apple, but continue to buy", I'd say they're all just as bad at best, way worse in general. Apple devices have a longer supported life, although others tend to be more hackable.


Oh, I don't think tech professionals or software enthusiasts are a major share of Apple's bottom line.


That would be nice. For now, I really like using the iPad Blink app as a mosh/ssh client to a leased Linux VPS. With mosh, I can be in a Linux dev environment instantly and adding tmux I have several screens to bounce between. Adding a great Emacs setup lets me edit markdown manuscripts and code in Python and Common Lisp.

I am at a relative’s home doing a hospice thing sitting with a loved one who is dying. I only have a small iPad Pro with me, and that is sufficient, but only because I am using my VPS.

Having just normal iPad apps does not cut it for me.



How many heavily-NDA'd Broadcom chips are in those things? Good luck getting working drivers, even if you manage to jailbreak it: https://projectsandcastle.org/status


I really wonder about its battery life with this new Intel processor, both under average usage and in idle.

I had a tablet with an Intel Atom CPU about 10 years ago; the performance and battery life were good during usage, but it would last only 4-5 days in idle before running out of battery (even if I left it at 100%). But I can leave my iPad aside, come back to it two weeks later, and it still holds most of its charge.

Having said that, this tablet really looks nice and if it also turns out to have long battery life, I might give it a try.


I checked reviews for the Surface Pro and it seems the battery has gotten worse over the last couple of years. You can expect ~5-6 hours of use, which to me renders the device useless.


These look fantastic; what's the catch? Who are Star Labs, and why should I trust them? The "About Us" section doesn't tell me who "us" is.


Star Labs is a trustworthy but small hardware brand (I'm typing this on a Star Labs StarLite Mk IV right now). They're based in the UK.


What’s your experience with the StarLite IV? I’m thinking about buying it for heavy note taking: neovim/obsidian/browser (Google Docs and Excalidraw). What battery life can I expect? Is the keyboard nice to type on? Is stuff like sleep, Wi-Fi auto-connect and Bluetooth headphones really solved?


Not the parent but I have a StarLite IV as well.

The keyboard is very nice: short travel for keys but pleasant to use. I use iwd as the wireless daemon and wifi autoconnect works just fine. I do not have bluetooth headphones so cannot comment on that, but my bluetooth mouse works as expected.

Battery life depends largely on how you use it. I have a minimal setup with alpine linux/i3 and just typing in a terminal with screen at 33% brightness results in battery-reported power consumption of just under 2W which is great. Obviously a browser like firefox will impact it quite a bit. Hardware-accelerated video playback works fine.
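As an aside for anyone wanting to reproduce that measurement: the battery-reported draw comes from the kernel's power-supply sysfs interface. A minimal sketch, assuming a battery named BAT0 that exposes power_now in microwatts (some firmwares expose current_now/voltage_now instead):

```python
from pathlib import Path

def battery_power_watts(supply="BAT0"):
    """Return the battery-reported power draw in watts.

    Assumes /sys/class/power_supply/<supply>/power_now exists and is
    reported in microwatts, which is the common case but not universal.
    """
    raw = Path(f"/sys/class/power_supply/{supply}/power_now").read_text()
    return int(raw) / 1_000_000

# e.g. a power_now reading of 1_850_000 corresponds to 1.85 W
print(1_850_000 / 1_000_000)
```

On discharge this reflects total system draw, so screen brightness and radios show up directly in the number.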

There are a few minor downsides: the built-in speaker is quite awful (not a consideration for me), the webcam is not great and the microsd card reader is usb 2.0. UEFI secure boot is currently not supported on the coreboot fw. The documentation is sparse.

One weird gotcha is that the power button is one of the keys on the keyboard, so to avoid suspending your device accidentally it will only fire an interrupt after you hold it for a couple seconds. Took me a while to figure out.


That's basically my use case, and I'm quite happy. Battery life for me usually is somewhere between 6-7 hours. The keyboard is pretty decent - not quite as good as a Macbook keyboard or similar low-profile keyboards (e.g. Logitech MX Keys line), but good enough, though the oddball arrow key configuration and sizing for the right-hand Shift and Fn keys is a frequent annoyance. Sleep and wifi auto-connect work well for me on Ubuntu 22.04 w/ KDE Plasma. Can't vouch for bluetooth headphone usage, because I don't really use bluetooth with this device.


I was thinking it sounded extremely familiar before realizing I was remembering the fictional Star Labs where Barry Allen got his Flash powers.


I can't tell you who they are as individuals, but I can tell you I bought a Starbook from them in late 2021 and I love it. No problems whatsoever with performance, stability, anything.

(No affiliation, although they do operate out of a barn within walking distance of my house!)


Yeah but a 1 GHz N200 yeesh. Everything else looks great.


It's a tablet, not a workstation PC. I have a Surface Go w/ 4GB of memory running Fedora 37 that's plenty fast for what I use it for.

Apparently this also supports hardware accelerated AV1 decoding and h265 encode/decode.


I haven't used one myself but the N200 looks pretty ok for a tablet that's supposed to run a long time on battery. Quad core Skylake-ish cores that turbo to 3.7GHz?


Notebookcheck says it's equivalent to a Core i5-8250U. That is not good in 2023.

https://www.notebookcheck.net/Intel-Processor-N200-CPU-Bench...


It depends on what you're doing. The 4th-gen low-power i5 in my laptop and the i3-7100 in my main desktop are just fine for web browsing and development.


Yes, and in this case 6W TDP and cheap enough for a $500 tablet seem like key drivers of the N200's design. Of course the Apple M1 is probably twice as fast at similar cost and power but compared to everything outside Apple the N200 looks pretty decent.


If they make a Ryzen 7040 APU version, I'll take it as my daily driver.


The 7040 in its lowest TDP configuration is 15W; the Intel N200 is 6W. Even accounting for some differences in how the two companies measure TDP, only one of these can be passively cooled in this sort of form factor.

The lack of low-TDP products on AMD's side was also one of the reasons given by PC Engines for discontinuing their embedded line. AMD's last 6W APU was the 2-core R1102G, which is now a couple of generations old.


Where did you get 1GHz from? The N200 has a max clock of 3.7GHz.


From the Specification section of the website:

> 1.00GHz quad-core Intel Alder Lake N200

> Turbo Boost up to 3.70GHz, with 6MB Smart Cache


As someone who gave the JingPad a spin, I can say the Linux tablet has been painfully difficult to pull off. At least this one is based on Intel, ensuring that if the company goes belly up, it can be used for something other than a mirror.


IIRC, Ubuntu Touch has been ported to the JingPad, so there is one supported OS option.


Yeah, it has, but it has serious limitations for future ports because of its hardware model. It's using an SoC with proprietary PowerVR graphics, which will limit its future ability.

I wrote a whole thing about this about eight months ago: https://tedium.co/2022/11/30/fydetab-duo-jingpad-comparison/


Has anyone tried their Starfighter? They've been selling it for about a year; but I have yet to see a proper review. Wait time of 4-5 months is a dealbreaker for me; but perhaps someone has gone through with it?


I haven't ordered or used a starfighter, but ages ago when the ryzen sku first went up I e-mailed the company. I was asking if they had any video of the removable camera being stowed in the base. I was having trouble imagining it. That video [1] was finally posted at the end of May.

[1]: https://www.youtube.com/shorts/tzeL2CJTd6g


Cool! Thank you for sharing the video. Such a strange design choice!


I've been mulling over many tablet options recently with a goal to reduce the number of devices I travel with (terminal, email, playing HW accelerated video, reading comics). I considered everything from an iPad through a Surface with Linux and all the way to more exotic options like https://junocomputers.com/product/juno-tab-2/ or https://www.fydetabduo.com/ but they all lacked something. I was close to going with a Google Pixel with Graphene OS but Google's atrocious pricing in Europe made me reconsider. This one finally made me pull the trigger, mostly because it is Linux-first and came with no "This is a beta product" or "generally works, but..." asterisks. Fingers crossed that it delivers.


I was backed into a corner when it came time to replace my Samsung Galaxy Book 12 and had to buy a Samsung Galaxy Book 3 Pro 360 --- I really wish there was something like the Samsung Galaxy Tab S9 Ultra that ran Windows.


This looks amazing on paper, but plenty of linux phones/tablets do and end up being underwhelming for some reason or another. I'm going to reserve judgement till I can test out the full software and hardware compatibility and battery life (which will likely be never, because I doubt this will ever show up to a Best Buy near me).


Finally, a justification for Gnome!


I'm not familiar with the state of the art at the moment. Is a 12 hour battery life "good" for a device used for light duties (web browsing)?


Do they design/manufacture these themselves, or are they rebadged from an ODM? And if it's the latter, which ODM are they using?


Someone can correct me if I'm wrong, but they're actually one of the few vendors who design their own stuff - e.g., their Starbook line is their own, not some Clevo junk.

The wait times last I checked are dreadful but that could've changed. I really like their approach though, whenever I can finally get away from Apple's stuff they're likely what I'll opt for (unless System76 knocks it out of the park with their custom one...).


300 cd/m2 is not a great brightness for any screen.


300 cd/m2 is plenty for indoor use, unless you care about HDR.


A shame for an otherwise interesting tablet. The HP Dragonfly Pro Chromebook sets the benchmark for me with a 1200 nits matte screen.


Some years ago people often advocated using 120 cd/m2 for colour calibrated panels, so it could be worse.


That still is the default calibration target for Calibrite/X-rite calibrators.


Nice!

Is there an option for an active stylus?


I think they'd be promoting it if there was. A pity.


I asked support, and a stylus is supported by the digitizer --- they just don't make much of this fact, nor bundle one:

>the touchscreen does support a pen/stylus which would be an active capacitive pen type. This would need to comply with MPP 2.0 or WGP.


Wow, good point. I just expected it to support a stylus and was really hyped about buying one. Without a stylus, I don't think Linux on a tablet is really viable.


Even with a stylus, it's kind of clunky.

Cellwriter is a reversion to the comb-based printed inputs of the early 90s.

Maybe the Qt option is better?

https://doc.qt.io/qt-6/handwriting.html

(but it only shows printing)


I would have preferred their new Star Lite was an updated 12" fanless Linux laptop rather than a tablet.

Do any decent fanless 12" Linux laptops exist now?


What's the difference between a fanless laptop and a Linux OS tablet? A keyboard on hinges or something?


Looks like the StarLite needs a kickstand behind it to keep the display up; the hinge likely has no stickiness whatsoever - it's just a folded piece of the folio cover, like on a Microsoft Surface. You can't type in your lap, and it's hard to use in tight spaces like planes or busy coffee shops.

I have a couple of Pixel Slates with Brydge keyboards that have actual good hinges, that's much closer to a laptop. Still more top-heavy than laptops.


I'd love to get this - when I saw the price I was blown away, it really seems like a good deal!

The visual presentation of it is really delicious.


I can't help but feel like an x86 tablet is a non-starter; the architecture is just too power-thirsty compared to ARM.


I use an RPi3 with a 7-inch touchscreen to listen to web radio all the time. Just the other day I was curious and measured its power consumption. My laptop with a Ryzen 4700U and the screen turned off uses less power than the RPi with its screen on (~6W for the laptop vs ~7.5W for the RPi). I'm really thinking about ditching my RPis as a radio and even as a pihole, as my laptop can do much more while still sitting close to 0% CPU.

Sure, the screen on/off comparison makes this measurement a bit unfair, but calling x86/x64 energy inefficient doesn't sound particularly correct.


One major difference that could explain the power consumption is that the 4700u uses a 7nm process, and the old RPi3 uses a 40nm process.


I have a tablet with an i5-7200U (2016) and a smallish ~34WHr battery (originally; well worn since!) that easily gets 7 hours of usage. How is this "power hungry"? For an ancient dual-core chip & inefficient old storage drive.

As with 90% of everything, people take a bunch of highly visible indicators to form their opinions & then shit on anything & everything that doesn't meet their preset conception. People hate Electron because they think of Slack, but Slack has shit-ass architecture. Dump its web data and you'll find dozens of multi-megabyte IndexedDB databases with mostly duplicate data. It's just a shit app. You can build shit apps in anything. But VSCode being slick, fast & smooth doesn't register for anyone; it seemingly carries no weight. The negative conceptions dominate & rule, are the things that get posted, actively, and with energy. It's a damned shame.


I tried VSCode on a Librem 14 by Purism. It has a tenth gen Core i7 with 12 threads. With only very basic essential extensions like syntax, Git integration, and a Vim mode, I could type faster than the view updated.

Electron may be great for people who want to write Web apps, but browsers are RAM and CPU hungry beasts compared to the GUI toolkits they're trying to replace.

In short, which environments are running VSCode smoothly? Do I need next gen hardware to run a damn IDE smoothly?

Web apps have some advantages, but performance is nowhere in that list.


Your poor performance seems unexpected in the extreme. Folks at work are on M1s with quite a lot of extensions & everything feels snappy.

Having a huge git repo could potentially hurt; would be interesting to test. Who knows, maybe vim mode is partly to blame? I personally still mainly use vim, not VSCode, so I don't know what the usual suspects are. It'd be awesome if there were good profiling that could show which plugins take significant time in the core loop. We have brutal eslint to apply, but that for example all happens async. Once again, we're back where we started: one bad situation - some random plugin somewhere - creating a lightning rod for disdain.


Good luck finding an ARM tablet with an open device tree that you can compile/install mainline linux onto.


Somehow people keep forgetting about this. ARM support still sucks to this day.


According to this comparison the performance per watt is roughly the same for the CPU in this tablet (Intel N200) and the Apple M1: https://www.cpubenchmark.net/compare/5178vs4104/Intel-N200-v...


A Surface Go or Surface Pro, which this is essentially a clone of, will happily run for a school/business day, and that's with the weight of Windows on it.



Which gives you an hour or two of battery life, and effectively no camera support.

I have one, running Linux on it; I love it and consider it the best Linux device I've ever had, but I wish there were fewer quirks.


> Which gives you an hour or two of battery life

Interesting, but that must be some bad configuration, something to be fixed on that specific line of tablets. Such short battery life is not inherent to Linux.


And of course some rogues had to pass by and leave some judgement, and a "guess what triggered my reaction". In this case, the very annoying and utterly avoidable game of guesses is among the easiest: was it that it sounded like a banality? But not all of this public may have been there...

Edit: to expand on the implicit point of the original post («not necessary on Linux»): that «an hour or two of battery life» is quite extraordinary, and really calls for some expansion: what is it that drains the battery? That is not normal at all, and requires some more detail, just as "Android 12 tried to marry my cousin" or similar would.


x86 is pretty power efficient. Looks like it's using an Intel Alder Lake CPU, which means it has E-cores which can be even more power efficient.


Even better, it’s the N series which is all e-cores. They’re fast enough, and there’s no worry about the p-cores wasting battery in a tablet.


I don't know much about peak power consumption of these chips, but they are crazy efficient when idling / having little load. Plus, ARM support still sucks :-(


> the architecture is just too power thirsty compared to ARM

Maybe the architecture, but not the products.

You can consume 3 watts on some Intel based laptops.


This is essentially a renamed Intel Atom - those have been used for tablets before, and Gracemont is supposedly even better?


It’s a 6 watt processor that still scores well in benchmarks, very well picked.


Agreed - ARM put x86 on the clock, and Apple Silicon - the best implementation of ARM yet - is the death knell of x86. Every opinion to the contrary feels like cope, because it's coming, and fast.


A CPU is not a computer. Apple does not sell its CPUs to other manufacturers. A single vendor cannot supply solutions satisfying every person and every business requiring computers, no matter how good their tech is. Fast ARM merely keeps Apple relevant. It is not such an unfair advantage that everything else dies.


Latest Intel chips are faster than Apple Silicon in single and multi-thread benchmarks... (they use more power, but they are faster...)


These days I’ve started to care a lot more about performance per watt than raw numbers. Watts = heat = fans = noise.


You realize we're commenting on a post about a fanless device running an Intel CPU?


Great. To me, Gnome 3 always seemed a perfect tablet environment while a fairly imperfect desktop environment, so I always wanted a good tablet running it and never wanted it on a desktop/laptop.

I feel like buying this one. What inspires me even more is it seemingly having more than one USB-C port.


I think this might be _exactly_ what I've been looking for. Linux tablet that's easy to open and replace parts? Yes please. It's overpriced for what I want it for (and almost definitely overpowered as well), but if the parts ecosystem allows it to last long enough, that might be worth it.

Although if anyone has any ideas for a better fit, I'm open to suggestions:

I just need a tablet to stay in the kitchen to display recipes. I plan on running a self hosted recipe database. So actual CPU muscle needed is pretty minimal, but I'd really rather have the repairability and am willing to overpay somewhat to get it.


Just received notification and went to order. Shipping to my location (Nordic region) turned out to be expensive enough that I’ll have to pass on this for now and live with older hardware for a while yet. Too bad!


The keyboard is also extra, which bumps the price up to more than I originally thought.

Biggest downside is a 3-4 month lead time though, I'm too impatient for that unfortunately, especially given this would be a bit of a punt for me.


And you'll need the keyboard if you want BIOS passwords or full disk encryption. No OSK or bluetooth keyboard will be available until you're past those.


What about using a regular USB keyboard? With an A-to-C adapter.


Odd - shipping to Central Europe starts at 32 euro


I can't see anything about storage. Does this have an SSD, and if so, how large? Looks very interesting to me; I'm just not ready to give up storage like most of the Chromebook options.


It’s M.2 2242 Gen3 PCIe NVMe, 512GB in the base spec, configurable up to 2TB, and presumably end-user replaceable.

https://starlabs.systems/pages/starlite-specification


Missed that, thanks - certainly piques my interest at this price.


> WiFi 5 and Bluetooth 5.1

I wonder if these can be upgraded to the latest versions?


"Wireless" is listed for StarLite in [1]. I'd be concerned about driver/firmware stability and heat dissipation, though.

edit: Now that I look at it again, there are two different "Star Lite" entries, and I'm not sure which one is this tablet.

[1] https://support.starlabs.systems/kb/faqs/is-it-possible-to-r...


None of them. This is the Star Lite Mk V.


I was going to say "nope", but at below $500? Seems viable.

I just don't like this tablet form factor. It's a poorly engineered design - a flimsy, top-heavy structure that takes up a lot of space; what could go wrong? I can't count how many times I accidentally knocked my Surface off the table, or needed to type something quickly and didn't have enough space on the desk... and getting that thing on the lap, eh.


Up to 12 hrs battery life: screen on, or standby?


Surely screen on (judging by other not-too-distant products).

But very probably with a new battery, low screen brightness, network off...

Something reasonably less than 12h - say, 8-10h - is probably well achievable.


It looks nice.

The only thing I'd change is have a 4:3 screen (or at least 3:2). The extra height really matters on small screens.


I'm in the market for a tablet, but my primary use is media consumption so screen+speakers are of the highest importance. But that said, this still looks very promising. I signed up for email notifications, so it depends on when it becomes purchasable.


I’m using my sister’s iPad, and frankly the only good reason I see to get an iPad is the GoodNotes app. It’s really a killer-feature app.

Is there a linux equivalent of goodnotes?

Getting this tablet and running Linux would be ideal, as long as I could have something like GoodNotes.


It looks like a pretty cool device but, since I started working from home, I have no use for an ultralight device. I only have a laptop because that's what everybody gets, but if I've ever moved it a mere meter around, that was too much.


"5-years of updates - Backed by secure updates delivered via the LVFS."

Hmmmm. Why a five year cutoff? Does this have some custom firmware not in mainline Linux that would preclude updates beyond five years?


LVFS is how you ship firmware that installs permanently into flash on the hardware device (e.g. Coreboot).

Hopefully, at least Coreboot itself stays buildable for the motherboard in the long term so you can keep locally building core system firmware for many more years to come. Star Labs themselves are not committing to continuing to ship pre-built binaries from their side for more than five years.


Most PC motherboards and laptops won't get BIOS updates for longer than 3-5 years. It's unfortunately the norm for firmware support lifecycles.


I would have included GPS: those tablets are good as maps.


For what it's worth, some dedicated hotspot gadgets expose a GPS service over TCP (the protocol is often somewhat inaccurately called NMEA), and a standalone USB-connected GPS receiver can be <$20.

For tablets, I believe often the GPS receiver is on the same chip as the cellular modem, so you don't tend to get GPS without that.
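For the curious, those NMEA-style feeds are just comma-separated ASCII sentences, so consuming them needs no special library. A hedged sketch of pulling a position out of a $GPGGA fix (field layout per the common NMEA 0183 convention; the sentence below is made up):

```python
def parse_gga(sentence):
    """Extract (lat, lon) in decimal degrees from a $GPGGA sentence.

    NMEA encodes latitude as ddmm.mmmm and longitude as dddmm.mmmm,
    with hemisphere letters in the following fields.
    """
    f = sentence.split(",")
    lat = float(f[2][:2]) + float(f[2][2:]) / 60
    lon = float(f[4][:3]) + float(f[4][3:]) / 60
    if f[3] == "S":
        lat = -lat
    if f[5] == "W":
        lon = -lon
    return lat, lon

# A made-up fix roughly over London:
lat, lon = parse_gga("$GPGGA,120000,5130.0000,N,00007.0000,W,1,08,0.9,20.0,M,46.9,M,,")
print(round(lat, 4), round(lon, 4))  # 51.5 -0.1167
```

Real receivers also append a *hh checksum to each sentence, which production code should verify before trusting the fields.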


This looks pretty good for $500, although a 300 cd/m2 screen limits it to pretty much indoor areas out of direct sun.


I’m afraid i3/sway have spoiled me, and now I find any other WM just awfully bulky to say the least; but also, I can’t imagine how to use a tiling WM without a keyboard. Are there any projects exploring the idea of tiling WM usage on tablets?


There is at least one: SXMO [0]. There are a few distributions where it's available, at least on the PinePhone, so it's not inconceivable at all that it would be possible to use it on an Intel-based tablet.

0: https://sxmo.org/ "Simple X Mobile"

Extra quote from their docs:

> Sxmo >= 1.6.0 runs both on Xorg (using dwm) as well as on Wayland (using sway). The X in our name historically referred to Xorg, but is now open to whatever interpretation you prefer (eXtensible, eXcellent, eXperimental, etc...)

Note: I have yet to try it but I'm getting more interested now..


In some ways, tiling WMs are a natural fit for tablets — consider e.g. the iPad’s ‘split screen’ mode, where two windows take up the whole screen between them. It’s easy enough to imagine a WM where you can drag windows to move them around, and drag the splits between them to resize. On Linux, this kind of paradigm is implemented by e.g. Hyprland. I dislike it for desktop, but on a tablet I’d happily use it.


In my limited experience, sway is pretty easy to use on a tablet because it supports moving windows very well using a mouse/pointer. Obviously you still need an on-screen keyboard and a lot of config to replace other sway commands, but it seems like a decent base WM for a tablet.
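As a sketch of the kind of config that implies (hypothetical ~/.config/sway/config lines; wvkbd is one Wayland on-screen keyboard among several, and bindgesture needs sway 1.8 or later):

```
# Start an on-screen keyboard (wvkbd's international layout build)
exec wvkbd-mobintl --hidden

# Three-finger swipes to switch workspaces
bindgesture swipe:3:right workspace next
bindgesture swipe:3:left workspace prev
```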


Again with this stupid unlappable design


I disagree, it's potentially a much more versatile format.

What's needed for use as a laptop (aside from a disregard for ergonomics) is a k/b with an extra battery for weight/balance, and actual friction hinges to hold the tablet at the desired angle.

The tablet itself could then have smaller batteries and be lighter, maybe as an option.


> a k/b with an extra battery for weight/balance, and actual friction hinges to hold the tablet at the desired angle.

which doesn't exist. You can't just design half of a device and tell customers that 3rd parties will come up with the rest... in time.


Just a not fully considered thought, but maybe we could 3D-print hinges for this product?

Or BT keyboards with trackpads (they do not consume much power) that include hinges could come to market.


there are products like that, but all of them (that I know of) are designed for specific devices, either a Surface or an iPad (Brydge, Doqo)

Let me know if you find something on the market that may work with this specific tablet.


RYF certification or no sale.


What's RYF?


Free Software Foundation certification program: https://ryf.fsf.org/


I wonder if it can run Qubes OS and PureOS.


Finally. A proper Linux PC tablet.


No SIM card/4G/5G. Why is the state of telecom so bad on Linux?


Because carriers and hardware vendors don't want to play ball. Some carriers consider the user's modem as part of their network and want to control it, hardware vendors want to keep their precious firmware blobs and private APIs to themselves.

Linux has all the drivers and infrastructure in place to use cell networks efficiently, carriers and vendors just don't want to be part of that.

Source: my only Internet access at home comes from a LTE router running Debian I built myself.


Any guides or writeups I can follow to build a setup similar to yours?


Unfortunately I never got around to writing it down and as it has been running pretty much unattended for the past three years it is not fresh in my mind.

The gist is that since it is a router only (switch and access point are external appliances), it is not that difficult; there are plenty of resources on the web explaining how to turn a Linux box into a router. I had to configure a bit of nftables, dhcpd, dnsmasq and stubby, and that's pretty much it.

The server is a standard PC Engines APU4, and the modem in the mPCIe slot is a Huawei ME906s.

As I alluded earlier, the hardest part is finding a good modem and getting it to work. Once the modem shows up as a network interface the hard part is done.
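For anyone wondering what the "bit of nftables" amounts to, a hypothetical minimal sketch for this kind of router (interface names wwan0 and eth0 are assumptions, not the parent's actual setup):

```
# Forward LAN traffic out the LTE modem, drop everything else
table inet filter {
    chain forward {
        type filter hook forward priority 0; policy drop;
        iifname "eth0" oifname "wwan0" accept
        ct state established,related accept
    }
}

# Masquerade outbound traffic behind the modem's address
table ip nat {
    chain postrouting {
        type nat hook postrouting priority 100;
        oifname "wwan0" masquerade
    }
}
```

A real ruleset would also want input-chain filtering; dnsmasq then hands out DHCP leases on eth0, and the modem interface just needs a default route.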


I'm curious: why would this be a Linux issue?


Because the manufacturers switched to making cheaper hardware and moved trade secrets and regulatory compliance into the Windows driver: https://en.wikipedia.org/wiki/Softmodem


My personal feeling is, there are more modems that do not work under Linux than those that do. Or are detected but happen to randomly stop working during the day.


[flagged]


You can, it is explicitly supported along with a few distributions. But you just wanted to be snarky.


Why not a Ryzen APU?


Can you have that fanless in this form factor?


... on Legacy ISA.

I'd rather a RISC-V Pinetab-V, based on the same JH7110 as VisionFive 2.


> ... on Legacy ISA.

Since when is x86 legacy?

> I'd rather a RISC-V Pinetab-V, based on the same JH7110 as VisionFive 2.

Have you actually used one of those? They are excruciatingly slow - less performance than most Raspberry Pis. Maybe it works for your niche workflow, but for the StarLite to succeed they'll have to build a _usable_ product.

Also competitive ARM based alternatives (such as Snapdragon) require many closed source blobs to function.


>Since when is x86 legacy?

Since the first commercial RISC chips obsoleted x86 in the 80s.

>Have you actually used one of those?

Yes, JH7110 on VisionFive 2.

>They are excruciatingly slow. Less performance than most Raspberry Pi.

Not my experience, owning VisionFive 2 and all raspberry pi B boards (1,2,3,3+ and 4), all of which have been used heavily.

The CPU is much faster than RPi3b+'s while using way less power. Only a little slower than RPi4, and this is assuming the RPi4 has a huge heatsink like mine does. JH7110 doesn't require a heatsink nor throttle under load, for it stays under 75C even when running builds 24/7, yet is designed to survive 125C.

Whereas everything else in the SoC is better. Especially the GPU, which in the RPi4 is anemic, and can barely keep up even for 2d at 1080p. The experience is much better on VisionFive 2 for even simple tasks like web browsing, and it's not even close. Night and day.

Never mind the upstream support effort [0], which got this far in just a few months (Indiegogo delivery was in March) and follows standard RISC-V boot/firmware interface specifications, whereas Raspberry Pi is all bespoke and never cared much about upstreaming support.

>Also competitive ARM based alternatives (such as Snapdragon) require many closed source blobs to function.

Here I just do not understand what point you're trying to make.

0. https://rvspace.org/en/project/JH7110_Upstream_Plan


> Only a little slower than RPi4

That was my point. It is slower than an RPi4, which is already too slow to use for anything serious. You can't expect Star Labs to develop a tablet that is 'a little slower than an RPi4', because nobody but you will buy it. And yes, your workflow may be possible on a low-powered machine, but that is not how the majority of customers use a tablet. Until there is a serious competitor to modern x86 processors in terms of performance (which the JH7110 definitely isn't), RISC-V is not going to be offered in any commercial-grade product anytime soon.

> Here I just do not understand what point you're trying to make.

The point I was trying to make is that the other alternative, ARM, is also not suitable for Star Labs' mission due to the closed blobs. So really, x86 is the only viable option they have.


>RISC-V is not going to be offered in any commercial grade product anytime soon.

That sentence isn't gonna age well.

>That was my point. It is slower than a RPI4, which is already too slow to use for anything serious.

I am still using my RPi1, but I guess my uses would not meet this arbitrary "something serious" tablet requirement.

Joking aside, the RPi4 is actually very fast, and its shortcomings are elsewhere than the CPU.

E.g. its GPU is grossly inadequate. Its video codec block can't encode. It lacks cryptography acceleration, so LUKS and SSL have to be done on the CPU with non-specialized instructions.

These are issues JH7110 does not have. The experience is much better with JH7110.
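The crypto point is easy to verify yourself on any Linux box: the kernel exposes the CPU's feature flags, and the presence of an AES flag tells you whether LUKS and TLS can use hardware acceleration. A rough sketch (flag placement differs slightly between x86 and ARM, and the RPi4's Cortex-A72 as shipped does not have it):

```shell
#!/bin/sh
# Rough check for hardware AES support via the kernel's CPU flags.
# On x86 the "aes" flag appears under "flags" in /proc/cpuinfo;
# on ARM it appears under "Features".
if grep -qw aes /proc/cpuinfo; then
    echo "AES acceleration present"
else
    echo "no AES acceleration: disk/TLS crypto runs on generic instructions"
fi
```

If you want concrete numbers rather than a flag, `cryptsetup benchmark` will show the aes-xts throughput difference directly.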

>The point I was trying to make is that the other alternative, ARM, is also not suitable for Star Labs' mission due to the closed blobs.

Ah. They buy the FSF nonsense where closed blobs need to be burned into a ROM for the machine to qualify as "open".

Unfortunate, but I understand.

RISC-V is inevitable.


Honestly, when I look at their other models like the StarBook and StarFighter, I have a weird feeling that I won't be able to repair them once the battery or other parts break down. They have an internal battery similar in format to the A1278 MacBooks', which makes my spidey sense go off that it's not easily repairable without breaking a ribbon flex cable.

I don't need a bleeding edge gaming GPU, I don't need USB4, I don't need Thunderbolt. These are optional extras for me, I've given up on them already.

Currently I am kind of stuck with the old T440p because of this; all the other laptops I bought in the last 10 years have already died with no chance of repairing or fixing them, either because parts weren't available or because the GPU literally melted down (which happens more often on MacBooks than one might think, apparently).

From Ultrabooks to MacBooks, I've had them all, from Dell to HP to Apple and back... all died eventually, and I switched back to my T440p in 2019. This old laptop still lives, can be repaired, has a replacement-parts community [1], and doesn't break when I drop it on the floor. But it's now so outdated that I can't even order battery packs anymore: packs sold as "New" arrive with less than 20% capacity, because they're no longer produced and have been sitting in storage too long.

I think what I want is something like the MNT Reform laptop [2], where the battery cells are replaceable standard 18650 cells, but without the problems that come with ARM. I've even been thinking about getting an older ThinkPad just so I can let the guy from xyte.ch [3] build a better one out of it.

Honestly, I've spent more than 10k over the years on crappy Ultrabooks and MacBooks, which all died on me, and I'm sick of spending so much money when my laptop from 2013 still works.

Why can't we have nice replaceable battery packs, RAM, and m2 SSDs anymore? I mean, not even the Framework laptop has a replaceable battery pack? WHY?

Am I the only one wanting this?

[1] https://www.thinkstore24.de

[2] https://shop.mntre.com/products/mnt-reform

[3] http://web.archive.org/web/20221130010517/https://www.xyte.c... (currently has php problems)


You must have missed on their product pages where they say:

"Laptops designed for open-source software need open warranties. Our 1-year limited warranty allows you to take your computer apart, replace parts, install an upgrade, and use any operating system and even your firmware, all without voiding the warranty."

They also post disassembly guides for their products and, so far at least, they don't seem to use super fragile ribbon cables for the batteries. Here's the one for the last gen of the Star Lite: https://support.starlabs.systems/kb/guides/star-lite-mk-ii-s...


> Why can't we have nice replaceable battery packs, RAM, and m2 SSDs anymore? I mean, not even the Framework laptop has a replaceable battery pack? WHY?

https://guides.frame.work/Guide/Battery+Replacement+Guide/85


I think they meant batteries you can swap without taking the machine apart, like old laptops had


Turns out the 16" Framework can have an external battery, they designed the connector to also support that.

https://www.theverge.com/2023/3/30/23612467/framework-laptop...


Taking a framework apart is very easy though.


AFAIK pretty much all (most?) recent ThinkPads have relatively easily replaceable batteries. I just replaced the battery in my X1 Carbon (although it's a 2016 model, so not so recent) and it was really straightforward.


Is this any good for regular full stack programming, running a couple of docker containers?


Yes, performance is very close to the i5-8250U ( https://www.cpubenchmark.net/compare/3042vs5178/Intel-i5-825... ), which was released at the end of 2017. I have that CPU in the laptop I use for work; VMs, IDEs, and containers are running on my system all the time.

Works great, even though I have all CPU mitigations enabled, turbo boost disabled, and the CPU running in powersave mode, which I do to keep the fan from becoming audible. However, with all those options enabled (especially enforcing a strict powersave profile/max frequency/etc.) you'll get a bit of latency here and there, which I can put up with because I hate hearing the fans spin. With a fanless design and a smaller power budget (6W TDP vs 15W TDP), I wouldn't have to constrain my CPU to such an extent.
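For the curious, the throttling I'm describing is just a few sysfs knobs. A sketch assuming the intel_pstate driver (paths differ on other cpufreq drivers such as acpi-cpufreq):

```shell
#!/bin/sh
# Quiet-fan profile: powersave governor, turbo off, capped max frequency.
# Assumes the intel_pstate driver; adjust paths for other cpufreq drivers.

# Use the powersave governor on every core
for gov in /sys/devices/system/cpu/cpu*/cpufreq/scaling_governor; do
    echo powersave | sudo tee "$gov" >/dev/null
done

# Disable turbo boost
echo 1 | sudo tee /sys/devices/system/cpu/intel_pstate/no_turbo >/dev/null

# Cap the CPU at 60% of max performance (pick your own tradeoff)
echo 60 | sudo tee /sys/devices/system/cpu/intel_pstate/max_perf_pct >/dev/null
```

All of this resets on reboot, so persisting it means a systemd unit or similar.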

To put it in relative terms, the Apple M1 is ~3.8x more performant (if that helps).


It has more than enough RAM, but the CPU and GPU will probably slow you down; then again, that depends on what you're used to :).

But it's sold out already.


Depends on what you put in the containers ;)


No USB-A ports and no plain old Debian option... It would be cool if they offered a blank keyboard option, because right now I'd just have to pick some ISO layout, and since I use the Slovenian layout, if I bought it as is I couldn't go by the printed keys anyway...

Also the display seems a bit dim at 300 cd/m².

And since it's touch, stylus (digitizer) would be very welcome.


USB-A to USB-C adapters are cheap. USB-A is past its prime.

Your post contains only negative points.

There will never be a perfect device because everyone has different wants, so enjoy it for what it is.


I'm sorry, I guess I didn't fully express my thoughts. I'm genuinely excited about the device and had signed up to be notified when it's available. I will, of course, wait for the reviews, but at ~600€ with a keyboard, it's already leagues ahead of my current Chromebook, and this is running a "real" operating system!

I expressed my issue with the lack of USB-A ports because I prefer my devices to be self-sufficient (I hope that's the right word). The mouse I've had for many, many years is USB-A, all of my USB storage drives are USB-A, my Wacom tablet is USB-A, my external DVD drive is USB-A.

What I'm trying to say is that all of my devices are USB-A; I have a USB-C charger only for my phone. And I do find it absurd to have to use an adapter to connect a mouse to my computer. I would be slightly less bothered if the adapters at least came with the device, but they don't, which means it's something I have to buy (no, I don't have any USB-C to USB-A adapters, because only my phone has a USB-C port).

I don't believe USB-A is past its prime, I don't really even know what that's supposed to mean. I still love and use my 3.5mm headphone jack on my phone.

With criticism, it's much easier to improve than if you only hear praise.


Of course, your opinion is yours to have, but honestly, the world has moved on from USB-A for new computers/phones/peripherals. I just bought a couple of these: https://www.amazon.com/Basesailor-Thunderbolt-Converter-Gene... and they stay on the USB-A cable; the price is low enough that I can justify the expense for my 5 or so older peripherals.

The USB-A socket is too big for the slim form-factor of ultrabooks and tablets, and it won't make a come-back.


I would go this route :

USB C to USB Hub 4 Ports ($13)

https://syntechhome.com/products/usb-c-to-usb-hub-4-ports


I get where you're coming from; my mouse and keyboard are also still USB-A, but I don't think this is a major issue when you can just buy a decent USB-C dock. I've got one with 3 USB-C ports plus DisplayPort, HDMI, Ethernet, etc. I honestly find this much better organization-wise than having everything leeching off different parts of the laptop, and you're no longer beholden to each laptop variant having the ports you need (some don't even have Ethernet these days...).



