
All the code examples on this page are doubling any brackets ([], {}, ()). How have they managed that? Not a fantastic first impression of its capabilities...


Not only that, but the Golang example is full of errors. Parameter definitions don't have colons between name and type. The map for seen elements is declared as a, but later referred to as m. The append call references uniques, which is undefined. The return statement also references uniques.
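For the curious, a corrected version presumably looks something like this. This is a sketch reconstructed from the errors described above, not the actual marketing copy: "uniques" comes from the original, the other identifiers are my guesses.

    package main

    import "fmt"

    // unique returns the distinct elements of s in first-seen order.
    func unique(s []int) []int { // no colons between parameter name and type
        seen := make(map[int]bool) // one consistent name for the seen-elements map
        var uniques []int          // declared before it's appended to and returned
        for _, v := range s {
            if !seen[v] {
                seen[v] = true
                uniques = append(uniques, v)
            }
        }
        return uniques
    }

    func main() {
        fmt.Println(unique([]int{1, 2, 2, 3, 1})) // prints [1 2 3]
    }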


ah yes, so a typical example of generative AI output


The Go code shown now is both syntactically valid and semantically correct: https://go.dev/play/p/GDki9v78jMM

I can't check how it looked 15 hours ago.

Makes me wonder if it was manually corrected and is thus a fake example now.


I'm going to give them the benefit of the doubt and assume this is a marketing site problem and not a product problem, but either way, not a good look. Reminds me of Google's big Bard announcement including a wrong answer in the image.


GitLab is using Google's Vertex AI Codey models/API for the code completions.

Showcasing inaccuracies in AI responses must be a Google requirement.


GitLab team member here. Thanks for flagging.

Our web team is working to resolve this issue here: https://gitlab.com/gitlab-com/marketing/digital-experience/b...


There was, for a little while, the idea that a mobile device could seamlessly plug into an external display, with external input devices, and swap between a "desktop" and mobile mode. I believe Mozilla tried to invest in this pretty heavily.

But it went nowhere. I suppose the technical challenges of the time were too great, and the mobile devices that won the space were locked down like the iPhone, where it was better to have the ecosystem that let you sync your mobile device to something more powerful if you really wanted.

But given the power inside mobile devices these days, Apple silicon especially, I think it is sad that this vision never really came to fruition. It seems like the perfect sort of device for most modern users. Something that is open and unlocked, and you can plug it into bigger things to solve the problems many other commenters are talking about around difficulty programming on small screens, etc.

But that's the whole of modern computing history, I suppose.


As others have mentioned, Samsung DeX does this very well. Plus, there are several apps out there for setting up a near-Linux desktop experience on the device. Previously, DeX itself supported running Linux applications; it's a shame they dropped that.

As a result, along with the ability to access my Proxmox VMs if needed, I've been able to retire my laptop in favor of just a Galaxy Tab Ultra.

It's overall pushed me heavily into Samsung's devices. They are no worse than most other Android phone makers, have nice integration with their other devices, but don't lock things down as much as Apple. The Wacom pen support across both the Ultra tab and phone has also been a huge draw for me. Makes me wish my desktop pen tablet also supported the same kind of pen for seamless interoperation.


Samsung has this reputation of being bloatware heavy with a lot of their default apps having ads. Is that still true?


Yes, they load up Android with their own versions of apps in an attempt at end-user lock-in to their ecosystem, plus various other padding I consider unnecessary that can't be uninstalled.

The new life that installing LineageOS breathes into an older Samsung device is, to me, proof of this.

Overall it's entirely subjective, however. I prefer minimal base Android so that I can choose which apps are allowed to annoy me with their various reminders to create an account, log in, look at these new features, rate me, etc.

Other (less technologically savvy) family members love their Samsung apps and the ecosystem, and just find my setup to be confusing or overly complicated.


Right. Whereas we want to strip the software down to some platonic ideal of a bare minimum, it turns out all the Samsung stuff is actually quite useful and is what most people want.


I have a midrange Samsung tablet (FE model) and the only "bloat" I've noticed is a Google Now replacement that I can't uninstall. That's very annoying, but that's also the only unnecessary application on there.

Samsung has to package certain applications, like Chrome, with their devices to be allowed to ship Google Play. Google has convinced people that their browser is the norm (not the perfectly fine Samsung Internet) and that any manufacturer making their own apps to compete with theirs is "bloat".

The well-integrated Samsung calendar is a lot better than Google Calendar, but Google forced companies for years to install their version as well, despite their failure to capitalize on the unique hardware features of some devices. The same is true for many other Google apps.

I don't think I've seen random ads in Samsung's pre-installed apps. Samsung Health on my (non-Samsung) phone started advertising their new phone at me, which disturbed me, but it seems to have stopped doing that soon after all by itself. My guess is that they received more backlash than anticipated.

I'm pretty sure I could give Samsung DeX to a lot of people and they'd be fine using it as a daily driver for their computer. It does everything most people want out of Windows (a browser and light office work) and it's not nearly as bad as some other attempts. With modern smartphone CPUs, the UI is smooth and responsive, and Samsung's design makes me a little jealous as a Linux user because it's honestly just better. I'd even go so far as to call it better than Windows 10's UI in some ways.

If you're ever near a store that has Samsung tablets on display, I recommend you try putting one of the demo units in DeX mode. It's a toggle in the quick toggles and it'll turn the tablet into a full desktop (which is perfectly fine for the massive screens tablets have these days!).

Hell, if JetBrains keeps developing their remote coding platform, I bet I'd be able to do programming work from a tablet and a sleeve as if I was working on my Linux laptop.


I think this depends on region, and the type of phone.

I am in Canada and have never seen ads or bloatware on Samsung phones. I currently use a Galaxy Z Fold 3 connected to a lapdock via USB-C video out. DeX is a very nice desktop environment for the phone.


Meh. Not really. I notice a few things here and there but avoiding Samsung apps is pretty easy.

With Samsung I can plug in a USB-C to HDMI cable and duplicate my screen with zero effort. And with DeX, apparently I can turn my phone into a desktop?! I'd never tried it before last week, but when I plugged in the aforementioned cable, I got a DeX prompt. Exit it, and I see the mirrored screen. Hit "enable DeX" and it switches to a mode that shows something like a desktop on the external display.

Perhaps the fact this isn't known or operationalized by companies means it has a ways to go, or that laptops are easier for remote workers. But it's cool stuff.


That reputation was why I had stuck to brands which stayed closer to a more vanilla Android experience.

My experience with their latest stuff has been that they do still have a lot of bloat and duplicated functionality, but the base experience is much closer to stock Android than what I remembered from their older phones. To put it differently, it looks and feels similar to the near stock Xperia 1 ii I switched from but has all sorts of little differences.

I actually find myself using a lot of the functionality too, so it feels a bit less "bloated" to me.

I haven't really noticed ads though. They do recommend stuff, e.g. their camera app has a bunch of modes and the "Expert RAW" mode was shown in the app but required a download from their app store. I don't really count those since they're somewhat more relevant than, say, Windows advertising Candy Crush on the Start menu.


I have a Samsung A32 and the bloatware has been completely unobtrusive for me so far, and no ads either.

My purchasing and usage habits for mobile devices are to get something new that is mid or low-range, use it until I notice performance issues, or the bloatware does something I don't like, then flash a custom ROM.

That's served me pretty well in the past, but I may still have hangups from teething pains with rooting devices and compatibility issues on custom ROMs back in the 2010s.


My Samsung phone doesn't look particularly bloated. And for any system app I can't uninstall, there is adb:

    adb shell pm disable-user --user 0 name.of.the.package

Just be careful not to disable the UI, adb or something else that is essential. I haven't found anything that can't be disabled yet. Always back up your important data in case you have to reset to factory defaults.
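If you want to check package names first, or re-enable something you disabled by mistake, these should work (the package name is a placeholder, as above):

    adb shell pm list packages | grep samsung
    adb shell pm enable --user 0 name.of.the.package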


This was my Samsung experience. I found their phones to be a poor experience full of apps I didn't want to use and couldn't uninstall.


still miles ahead of apple


Never was.


Unfortunately, even with Samsung DeX, the tablet or phone can only do 16:9 resolutions on the external screen, which makes me very sad.


> But that's the whole of modern computing history, I suppose.

It really is not. Current Apple in particular has a strong vision of computers as almost commodities/appliances. The iPod might have been the defining moment, and almost all products after that were defined by negative functional space.

The iPhone is a smartphone that was built around the idea of having neither a keyboard nor a stylus (and later no side-loaded apps). These characteristics were heavily touted on stage. Nowadays you can plug in a keyboard, but it won't help a lot.

The iPad was defined as having no advanced window management and no compiling, on top of iOS' other limitations.

The iMac was the original "only USB!" computer, and could still be defined today as the no-touchscreen computer. Even now, the Vision Pro is the headset that's touted as having no primary controllers.

But if you look outside of the Apple ecosystem, these limits only apply where the hardware can't do it. As many have cited, Samsung's phones can actually act as full computers, and some other Android phones can too. In the same way, Samsung's Android tablets have advanced window management, can load Linux subsystems, and do whatever a computer is supposed to do. Windows laptops have touchscreens.


iMacs and MacBooks are definitely not "computers as appliances", and you can absolutely run unsigned code or an independent OS (with a vendor-supported mechanism for alternative booting that has been opened up significantly to help Asahi along). On Intel Macs you could simply Boot Camp Windows if you wanted. Is a Windows PC an "appliance"?

Not having the ports you want doesn’t make them appliances either, and a touch screen is not a requirement for something to be considered a laptop. Nor, some would say, a positive thing at all.

Parasocial attachment (fanboyism) isn't just a positive thing; there is such a thing as negative parasocial attachment, and you are letting your fanboyism make you say silly things.

http://www.paulgraham.com/fh.html


For context, I've exclusively used macs as daily work tools for 2 decades. The reason I did so was because I got tired of trying to make BSD work, and also burned out from windows.

I've heard many people voice the same take: we wanted something that "just works".

But to step back, this also means we don't care as much about raw performance, we care less about new paradigms, we don't need the bleeding edge, and we expect our devices to be stable and usable for 10 years, basically to be "classics". And I also know that if my MacBook burns down today, I can go to a store, buy the latest 13" laptop and be back where I was in an hour at most. They're utterly replaceable.

That's basically what we want from appliances.

If you care in any way, shape or form about pushing the envelope, Macs won't be your choice. The best GPU won't be there, the best CPU isn't (Apple's ARM is good, but not as powerful as the top-of-the-line desktop CPU at full wattage), the best form factors aren't there, the most innovative apps didn't go there.

Some companies do crazy things with Macs, but they're an extreme minority. And Apple 100% doesn't care about markets that have no scale.


> And Apple 100% doesn't care about markets that have no scale.

The Mac Pro is an example of this.


Samsung Dex [0] is bringing that vision back, maybe?

[0] https://www.samsung.com/us/apps/dex/


Can confirm. I recently demoted my old Samsung S7 to be Wifi-only and work stuff-only. MFA, mobile email when I'm not at my desk, etc.

When I travel for personal stuff, I now just take my personal laptop and my phone. If I need to get into my work stuff, I'm pretty damn effective with all the cloud tooling I can get into from the browser under Dex. Screen real estate is the biggest issue which Dex handily solves, with a close second being things like MS Teams being a little clunky in Dex mode (not a dealbreaker). It's all more than sufficient when I get pinged on personal travel, since I'm not likely to need to be at 110% like I am if I'm working at 2pm on a Tuesday.

Why would I want to do this? Because every time I take time off, something goes off and nobody knows what to do about it, so I get pinged.


Why do you need DeX if you bring your laptop?


Personal laptop is much thinner and lighter, and I'm less concerned with losing it on travel compared to my work laptop. Also - accessing work resources requires a crap-ton of security policies applied to the device (full control including remote wipe, a stack of security & endpoint management software, just to be able to authenticate with my work account) which I don't want to install/grant on my personal laptop.


That's an interesting value system. Why is losing a work laptop more concerning than losing your personal laptop? Losing either of them isn't great, but the company has far more resources to protect and replace the laptop than you do.


My personal laptop makes me no money.

Part of your personal brand at work should be "not the guy who is always losing his work laptop".

Plus, my personal laptop is cheap, old and encrypted.


Yeah but it's yours. Replacing it is so much harder for you than it is for corporate IT which buys laptops by the pallet.

If you'd ever lost a work laptop, then yeah, I could see not wanting to lose another one, but the mere possibility of losing one isn't anywhere near being "the person who is always losing their work laptop".

Losing my personal laptop would be a big deal for me. Losing my work laptop, while still not ideal, is just what IT calls Tuesday.


We probably think about risk differently. And we may also have a different view on making personal sacrifices to further/protect a career.

For what it's worth, I've never been caught up in the 10+ rounds of layoffs which have taken place in my career across all the places I've worked.


Microsoft attempted Continuum [1] with some of their Windows Phones. And as of now, the PinePhone and Librem 5 support USB-C docks with HDMI output.

[1] https://www.microsoft.com/en-us/windows/Continuum


The entire vision behind Windows 10 cross-platform apps was inspirational. Code once, run across phone, tablet, desktop, and Xbox. And HoloLens... I guess.

The implementation was terrible and was saddled with the failing Mobile division.


I think Microsoft finally got it right around Windows 10, but the damage had already been done. Windows 8 and Windows Phone 7/8/10 had caused too much damage to the brand.

Now, Samsung DeX is doing what Microsoft wanted to do, but very few people know about it. That probably has to do with how uncommon USB-C docks are, and how many of those docks use DisplayLink instead of a normal standard for video. It's a real shame, because a modern phone is more than powerful enough for most people's computer work.


Indeed. I have a Nokia 950 and in many ways WinPho was a really nice phone OS; Microsoft's endless jerking about of developers as it went 6.5 -> 7.x -> 8 -> 10 did it no favours.


Absolutely, as soon as developers and customers found a nice place, they would pull the rug from under them. I got into WinPho around Win 8/8.1 and it was pretty cool, and the update to Win 10 showed so much promise that vanished the instant they stopped working on the hardware. To be fair, if I were in Nadella's position, even though I loved the Lumia line, I would have done the same.

I have said that in technology, one step ahead is an innovator, two steps ahead is a martyr. Windows Phone was a martyr.


Phones are powerful enough these days, but computers are more powerful. I carry a work laptop between home and office, but I couldn't do work on a phone. Computers are also cheap; I am using a Raspberry Pi 4 as a spare desktop.

The main use case I see is traveling. But where are you going to connect the phone while traveling? I guess some hotels still have a business center. I guess you could use the TV in the hotel room, but then you need a keyboard. I think phones should have external display support for playing on hotel TVs.

Instead of a phone that plugs into a monitor, what you should be looking into is a tablet with a keyboard that connects to a monitor. It can be used as a tablet or small laptop while traveling. A tablet OS and apps will work better on a larger screen than a phone's. I don't know of any OS that does tablet, small laptop, and big displays well; iPadOS does the first two and Windows the second two. Apple are missing an opportunity with iPads.


What doesn't Samsung Dex do well for your three form factors?


> given the power inside mobile devices

Lots of power, very little cooling. It's designed to be used in bursts. If you've ever played a high-end game on your phone, you'll notice it gets very hot and rapidly drains the battery.

While desktops are essentially dead outside the enthusiast space, I'm really happy that everyone still owns a laptop. It's more or less the ideal computing device, reasonably portable with its own screen and a real keyboard.


Desktops definitely aren't "dead outside the enthusiast space". About 2/3 of Steam users use desktops, for example. That's not just enthusiasts; that's a large share of young people. Just thinking about people I know, everyone has a desktop in the household, even if they don't use it all the time. And of course every serious software developer uses a desktop - laptops can't sustain compiling anything for too long without getting hot. Hell, my laptop fans started going full blast yesterday because I opened the 'Stylus' add-on in Firefox and it decided to use 100% of my CPU to render a text editing panel, and I have a good laptop. Laptops can barely handle a bit of JavaScript.

Desktops are not as popular as they once were but they aren't anywhere near dead.


> Just thinking about people I know, everyone has a desktop in the household, even if they don't use it all the time.

Thinking of the people I know, about a third of them don't personally have any kind of real "PC" kind of computing device (laptop/desktop). Many get by with just their phone and a tablet, not even a laptop. They might have a work laptop, but that's usually only for work-related tasks; they don't have any permanent desk setup in mind for computing. This percentage of people is growing, not shrinking.

And to think, I've got that much exposure to people with that kind of computing lifestyle, and yet nearly half of my friends are people in PC gaming culture and go to things like massive LAN parties and watch game tournaments on Twitch.

I'd say people who have gaming PCs are enthusiasts of sorts. Step outside of professional contexts and gaming PCs and a lot of people don't even bother with laptops these days. Having a dedicated space in your home for computing seems to be more and more rare.


> Lots of power, very little cooling. It's designed to be used in bursts. If you've ever played a high-end game on your phone, you'll notice it gets very hot and rapidly drains the battery.

"Gaming" phones exist that try to address this issue. But they're expensive, you're paying flagship prices for a brick-like form factor and a very limited OS update lifcycle compared to actual flagships.


You could maybe fix this with a cooled dock. I'm thinking of all kinds of wacky designs involving exposed heat pipes/metal on the outside of the phone that makes contact with something on the dock.


Asus makes a cooling attachment for their “gaming phones”: https://rog.asus.com/us/power-protection-gadgets/docks-dongl...


> I'm really happy that everyone still owns a laptop

I hate to break it to you, but outside of tech, a lot of people have no desktop, no laptop, and just a smartphone these days.


In most areas where HN is popular, it's still normal to have a laptop for school, work, or at least odd things like tax returns and document processing. You could use a tablet or phone, but people tend to buy a laptop.

In areas where you do very much see phone and no laptop being common or the norm, the trend is towards increasing laptop ownership, not declining laptop ownership. It’s not a matter of phones displacing laptops so much as phones being more important and accessible.


> There was, for a little while, the idea that a mobile device could seamlessly plug into an external display, with external input devices, and swap between a "desktop" and mobile mode. <...> But it went nowhere.

It didn't go nowhere. This is exactly how my Librem 5 works, running a full desktop GNU/Linux.


I thought about this quite a bit when my phone and tablet started supporting Samsung DeX. It's a great system and I could literally do every single thing I need to on my phone.

In the end I couldn't justify it though. It's really not a big deal to take a laptop, and it's several orders of magnitude more powerful, doesn't drain the battery on my phone, etc.

I'm sort of rethinking it again as VR/AR type devices become available and wondering if it would be great to be able to have a phone in my pocket and a set of AR glasses to work on a terminal etc. But it comes back to the same thing: that poor puny device in my pocket is just so weak and already so battery challenged that I don't really want it to do more. In the end it just really isn't that bad to carry a laptop.


> I'm sort of rethinking it again as VR/AR type devices become available and wondering if it would be great to be able to have a phone in my pocket and a set of AR glasses to work on a terminal etc.

> But it comes back to the same thing: that poor puny device in my pocket is just so weak and already so battery challenged that I don't really want it to do more.

Does it though? Combined with the VR/AR type devices plus a VM/host in the cloud, your phone can run VSCode snappily with all the heavy lifting being done on the remote host. I tried this approach for a specific use case and really liked it - hardly any lag, the full power of Linux, it doesn't matter where I connect from, and unlike a physical host, I don't need to take care of the VM. Of course, this is a developer-specific use case and the phone would indeed struggle in other scenarios requiring a powerful machine locally.


Apple has a better solution with Continuity: if you are near your Mac, there is an icon for the activity you are doing on the iPhone that you can click to continue it on the Mac. So you can just finish an email using the keyboard if you want.

That’s what Apple built and released and it’s easy to use and quite useful every once in a while.

You can be sure they also built and tried what you are proposing, but they found it just isn’t a great experience. Sounds great, doesn’t work.

Samsung actually released it, pretty much no one uses it. You can try it for yourself and you will also probably find it’s not as great as you’d imagine.


That sort-of existed though. Windows Phone could do it (Continuum), and so could select Motorola Android phones (via Lapdock).

However, we'll just have to accept that most people don't want this. It's a niche use case.


But with the long tail of the Internet, the dozens of us in the niche are a valuable market to target.


OQO were one firm looking at this market. They folded in 2009:

<https://en.wikipedia.org/wiki/OQO>

(VA Linux's Larry Augustin was associated with this, probably through Azure Ventures, if I recall.)

With the advent of very-small-form-factor computers, mobile displays, bluetooth, battery packs, and solid-state storage, you'd think that such a thing might be reasonably viable, though it would all but certainly be fussy.

Neither a smartphone/PDA, nor a laptop, nor a desktop, nor a tablet. That tends to be a pretty ugly duckling.

That said, the thought of a cluster of devices which could pair / peer with larger and smaller interfaces and provide utility across a wide range of circumstances does seem to have some level of attraction. For now, laptops or ultra-notebooks are sufficiently portable to cover much of the need, and smartphones / tablets too constrained mostly by OS, power, and input limitations to provide a true desktop experience.

(This written by someone who has a large e-ink tablet with Termux installed which comes close to providing a very useful on-the-move computing platform.)

I've heard that there may be such devices forthcoming. I can only keep an eye out.



I want to like this setup, but the NexDock's input devices (especially the touchpad) are so bad. Like, near unusably bad. Which, for a device that's mostly I/O, is a shame.

I own two NexDocks despite this, for the record. My favorite lapdock though is the one meant for the HP Elite X3 phone. Only downside of that one is that the hinge isn't quite strong enough to hold up a phone with a magnetic mount.


You can do USB-C to an external display on Android and then do mouse and keyboard over Bluetooth. I've done it.

It's fine? Never stuck with it


Or a USB-C dock for a USB keyboard and mouse.

... and the UserLAnd app then gives a normal Linux terminal with all tools or Linux Desktop apps.


Samsung DeX came out in 2017, and it does just that.


Are there alternatives to Dex?



For those interested, Motorola's Atrix phone and Webtop platform were high on this idea: https://en.wikipedia.org/wiki/Motorola_Atrix_4G#Webtop


Perhaps it will, now that third-party app stores will be forced onto Apple. Granted, you still won't be able to jailbreak your iPhone by just typing in the root password, but you will probably be able to install compilers and interpreters onto an iPhone by 2024.


> ...but you will probably be able to install compilers and interpreters onto an iPhone by 2024.

If you live in the EU.

I look forward to Emacs on the iPad.


There's already a ymacs port in the App Store.


Compilers and interpreters have been allowed for a while now (with some caveats). Pythonista is an iOS Python IDE that lets you write apps on your iPhone.


HP did build a mobile phone with Windows 10 Mobile (the Elite X3) and something they called a Lapdock. It was like a slim notebook, but just the screen and the keyboard. You could connect it to the phone with a cable or even wirelessly. All the CPU and memory was in the phone; the Lapdock was really just a second screen and a real keyboard. I have no idea why no other company made something like that for Android phones. I mean, all the tech is already there and it works (DeX).


iPadOS is mostly delivering on what you describe. Plug into monitor, mouse, keyboard, window manager, multitasking. Only gap is the “open/unlocked” bit, but Apple is making slow progress there.

It shows the horsepower is there in the iPhone. I'd guess it's a UX decision to limit it to the iPad - iPad apps can scale up to monitor size, but I never want to use an app designed for a 6 inch screen on a 27 inch screen.


iPadOS has some really weird limitations when it comes to big screens. For example, the small iPads don't support Stage Manager despite being more than powerful enough to drive a measly 1080p display. Their weird window manager also makes for a suboptimal desktop experience, with the weird mouse blob making the entire thing look more like a non-touch tablet than a real desktop.

They could also just as easily implement this in their iPhones (I'm sure Apple's take on Continuum and Dex will be called "revolutionary" when it comes out) if they wanted to, but I suppose they don't want their phones to compete with their expensive tablets in terms of features.


I loved this idea too, but I think we’ve come to realise it in a different way, more or less, purely by syncing everything to the cloud. I feel that the transition between my phone, iPad and PC is pretty seamless with continuity of all the data I care about, and different views and interaction models based on the capabilities of the devices.


I just don't see the appeal of this. I still need a screen and a keyboard to use it in desktop mode. Then I can just bring a laptop. My data is in the cloud and my settings sync - so the only way this could be "better" is that I don't have to buy a phone and a laptop I guess?


> Then I can just bring a laptop.

I guess what I was looking forward to was just that phones are smaller. Rather than taking a laptop, you have your phone with you all the time. Imagine getting to the office (where, as you say, there's a screen, keyboard, mouse etc.) and plugging the USB-C cable from the monitor (to which the keyboard/mouse/etc is connected) into your phone, and the iPhone changes to macOS and you can use it like a Mac.

That would indeed be no different to carrying your laptop around, but it would be a lot smaller and more convenient than a laptop.


But in the office, there could be a beefy workstation waiting for me. As long as we're not talking about floating workplace setups (which I'm no big fan of) or cutting costs, I still don't see the upside.


That's true, but I think different people have different needs and preferences.

Many companies give developers laptops (even though workstations would be faster), so evidently some people already trade power for portability; having your phone be your main computer would just be a further step in that direction.

For people who aren't developers, e.g. managers doing email, browsing, Excel etc., I'm sure a phone would be powerful enough.


I think there is a GNU/Linux distro which can do that. Not sure whether it was KDE Neon or some feature of Framework laptops or another one, but they had a video where they plugged the phone into a screen and were able to use both.



Isn't KDE Neon desktop only? For phones, postmarketOS and Mobian would be the ones I think of.


I think Valve’s Steam Deck does this too, quite successfully.


Does the Deck count? It's just a laptop with a weird keyboard; you don't exactly put it in your pocket. Attaching an external display to a device running Arch isn't all that unique.


I'm working for a smaller US company (hitting 100 people across the whole organisation soon), as a remote employee in the UK.

The arrangement is that a third party hires me as an employee in the UK, where I receive all the British employment rights (holidays, etc.), and I just work for the US company. This is a fairly new thing for them; previously they had only taken on non-US persons through contracting companies.

I would guess that if larger companies were going to hire abroad, they would either have a local setup to manage payroll or do something similar to where I am now. No idea how many are really offering something like that.


What is the benefit to this? Why don't you just make your own UK corp? Then you can write off a load of stuff against the company, get VAT back, and decide what dividend you want to get.


I've had my taskbar in Windows at the top of the screen since 2005. Admittedly, there was no particular reason to move it there, but having done so I'm now used to it. Having to re-learn how to use my computer because the taskbar is at the bottom now sounds extremely offputting to me.


Read a lot of content like it; it's really the same as any sort of writing.

Then practice.

You could additionally look to use a memorisation tool (something like Anki perhaps) to store new words and phrases that you like, so they are in your head, ready to be retrieved as you're hammering away at the keys.


Yes, it does. Though in Firefox it spawns 20 at a time and every time you click on the original tab it spawns another 20.


To perhaps go against the grain a little here, I appreciate a focus on customisation from the start.

I am beginning to use Vim more, mostly because I was introduced to the keybindings with Vim emulation from other apps, and I appreciate how it works. I had previously attempted to learn Vim through the normal ways, but it just was impossible; by default, Vim for me is barely readable and none of the included colour schemes ever made it any better. Plus, of course, some of the other things like size of tabs, line numbers, often even syntax highlighting. But finding out how to deal with plugins, themes, settings, etc. is difficult when you aren't already introduced to Vim, precisely because most people make the assumption that you've learned Vim "as is" already.

I think some concession has to be made to modernity. Vim is very old and "as is" it often just doesn't function very well for everyday use. Grinding through that until you know it well enough to be given the knowledge of how to make it nicer to use isn't worth it to most people. It certainly wasn't to me.


I think the article has missed something about what Rockstar could consider piracy here: the revenue they get from selling GTA:O microtransactions. If people are opting out of the GTA:O experience for a different online one, those are users Rockstar can no longer potentially sell microtransactions to.

If Rockstar's plan is to focus on GTA:O as a revenue source, then it makes sense to shut down any potential competition, especially if that competition is in their own game.

Of course, one could argue that Rockstar may sell more copies of the game for the sake of playing alternative online modes, but I don't think this will be true. If the potential gains from new 'boxed' copies of the game were greater than the potential gains from microtransactions, I think we'd see Rockstar bringing out more content similar to the two expansions they released for GTA4, rather than the online content they are focusing on now.


As someone who casually browses 4chan (at least some of the less major boards, certainly not /b/) I've got to say that I've never really seen any particularly awful content. Either it's not as ubiquitous as people think or the janitors are really, really good at their jobs.

It's all very silly, of course. But the anonymous thing is fun and I've had some interesting debates where I can just test ideas and say whatever for the hell of it without worrying about any kind of repercussion in terms of my reputation. It's easier to go in guns blazing being wrong and learn something without feeling that it's personal when no one has a name.


Most people think 4chan is /b/, because that's where all the worst stuff happens. Imagine if reddit had an "anything legal goes" subreddit - it would be pretty messed up. Other boards can be pretty tame.


No, people think that because it's bigger than everything else put together, the fountainhead of the site's culture, and is the source of virtually everything notable that's ever happened there.


>everything notable

cough gamergate started on /v/ cough

But I get what you're saying. /b/ really is the largest board. In a way it's meant to be the containment board, or the moat around the castle. It's where all the shit goes, so it doesn't pollute the rest of the site. And oh boy, is the internet full of shit.


* raises hand *

What's /b/? Because I don't know 4chan and have been assiduously avoiding learning anything about it.


4chan is divided into boards (like subreddits). For example, /pol/ is for politics.

/b/ is the "random" board (or "anything goes") and the biggest one on 4chan. Some pretty awful stuff happens on there (as well as some pretty cool stuff). Basically it's a grab-bag of Internet.


Reddit actually does have a number of these. I'm not sure I want to link any, though.


Reddit has subreddits dedicated to the foulest stuff that you'll ever occasionally see on 4chan. That has historically included borderline illegal content, with literally illegal content mixed in.

Reddit's reputation of being better than 4chan is entirely undeserved.


>Reddit's reputation of being better than 4chan is entirely undeserved.

As someone with extensive experience with both... it is entirely deserved.


Why are people comparing reddit and 4chan anyway? They are two completely different sites.


I don't think comparing means what you think it means.

No use comparing things that are the same.


Both are interest-separated discussion boards. Reddit bills itself as more news-oriented, but quite often it serves as a regular threaded forum where the first post sets the context.

After reddit got big, we started seeing (well, started speculating that we're seeing) a lot of people come to 4chan after initially cutting their internet discussion teeth on reddit. There were speculations about a secret IRC cabal of redditors who collude to drive threads offtopic and in general mould board culture to be more reddit-like. To this day I'm not sure if those allegations were unsubstantiated. We did have several prominent spammers who commandeered fleets of hundreds of proxies and spammed threads they didn't like with randomly generated posts. These were obviously not redditors, but given there are people who care enough and have enough free time... In short, 4chan is a magical place.


I used it as an apparently flawed comparison, assuming HN was more familiar with Reddit than 4chan.


I'm getting this issue as well. Using Firefox 39 on OS X.


Will investigate! Sorry about this.

Edit: It might have to do with JSBin. I reported it here: https://github.com/jsbin/jsbin/issues/2464

