kevin818's comments

Someone should make a list of all the little tricks you can use on Google, or any other site for that matter.


Yes, but then I realized that wouldn't make much sense so I forced myself to read each letter :)


Are these same kinds of techniques applicable to mobile devices, or is the hardware just not there yet? I've always wondered whether console development and mobile development have converged yet.


Mobile devices have long since surpassed the GameCube generation of consoles, and in some ways are past the Xbox 360/PS3 generation. Mobile devices have architecturally modern GPUs and lag only in core count and clock speed (given their lower TDP than consoles).


And bandwidth.


Well done!

Just out of curiosity, if "Pixeltrek is a private project with a non-commercial and non-profitable background", then why isn't it open source? I'm sure the community would love to play with it! :)


> I'm sure the community would love to play with it! :)

I get the feeling that this is exactly the reason why the project is not open-sourced yet. It's an art project. Open-sourcing it now carries the risk that "the community" could take his work and develop it in a different direction than the author imagined or intended.


Please define best, because best could mean different things to different people :)


What's with the downvotes? As a big fan of Biggie I'd like to know what it means to say he's the best rapper. Lyric-wise? Entertainment-wise? Intellectual-wise?


I don't know if best is the right word, but he is definitely one of the best. Some of the things I think really make him stand out are his storytelling, his flow, and his charisma (that he looks unconventional also probably plays into this). Also Puff Daddy was instrumental as a hype man in getting him to platforms like MTV etc. And his untimely death definitely plays into the modern mythology. But I think if there is anything that truly encompasses his talent it's this video right here: https://www.youtube.com/watch?v=4hbwdAOogBw and this demo: https://www.youtube.com/watch?v=rEaPDNgUPLE&feature=kp


adjective: best

    1. of the most excellent, effective, or desirable type or quality.

"Best" for the most part won't mean different things to different people, but which rapper people think is the best is certainly subjective.


Didn't know that. Thanks!


I think the fact that you were able to build a game with just 4 hours of learning a new language speaks volumes about Swift's low barrier to entry. Well done!


I think it speaks more to the author's prowess, not to the fantastic magic powers of Apple's latest walled garden.


Any up-to-date C# developer already knows Swift.


Any up-to-date Java developer already knows Swift :)


Any up-to-date Swift developer already knows Swift :-p


But the code was just a port/conversion; it seems relatively non-trivial to do that in 4 hours.


I suppose "relatively" is the key word here. To me, that'd be a great feat. To others, not that special.


Question: for those of you who said Windows, is it by choice? And if so, why?


By choice.

I was a diehard Linux nazi until I got my first internship, where everyone used Windows. Initially I was appalled; I had brainwashed myself with the spirit that "configurability is power" to the point that no one, not even I, could get any work done on any system I touched. This philosophy carried over when I started using the work computer, which I plastered with "productivity software" in an attempt to morph the system into something more like my personal laptop. I became part of the "Cult of AutoHotkey." In retrospect it is absolutely absurd that there exists a subculture which revolves around changing the settings of computer software.

The first week of the job my boss had to do something on my computer, and within thirty seconds asked me, "What have you done to this poor computer?" He was a Smart Guy; he is highly educated, has been working with computers for longer than I have been alive, adapted to modern standards, (successfully) administered company server farms, directed software projects (embedded, web, and everything in between), and most importantly had a life outside of science and technology. It was at this point that I woke up. Obviously there was a problem with me. Now I leave every conceivable default untouched unless I absolutely cannot avoid it. I use Windows 8, IE (Firefox with no addons when that breaks), Visual Studio, Windows Photo Gallery, etc. Most of the less common software I must use is run out of a folder in my home directory. Now I and everyone else can be productive on my system. The only thing I have lost since switching is the smug self-righteousness of Linux veganism and Lifehacker articles.

There is no default on Linux, so there can be no expected behavior. That is why I use Windows.


Thank you!

I also refuse to hobble myself with custom desktop toys. If my machine fails I want to be productive again as soon as I get another machine on my desktop. Forget all the imaginary gains of diddling with exotic editors etc. Truth is, you can get used to anything. So get used to the defaults, and leave the whole discussion behind.


I find it weird that you don't believe you can have both at the same time. I have my (important) dot-files and etc-files checked into a git repository, critical subdirectories of ~ (Photos and Documents, to name a couple) synced to an online backup service, and I have a cron job that takes a daily snapshot of the list of packages installed on my laptop with "dpkg -l" (which gets dumped in ~/Documents, which is synced to my online backup). If (and when: this has happened) I lose my main machine, I can be up and running, virtually exactly where I left off, in a matter of a few hours. The only stuff that's really missing that I need to get work done is ~/src/, but all of that is on various git servers, and can be re-cloned as needed (it's very rare that I finish some coding and fail to check it in and push it somewhere safer).
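
Concretely, the package-snapshot piece is tiny. Here's a minimal sketch of the idea, assuming a Debian/Ubuntu box with dpkg; the paths and filename scheme are illustrative, not my exact setup:

    #!/usr/bin/env python3
    # Daily snapshot of installed packages, dumped somewhere that gets backed up.
    # Sketch only: the directory and filename scheme are hypothetical.
    import subprocess
    from datetime import date
    from pathlib import Path

    snapshot_dir = Path.home() / "Documents" / "pkg-snapshots"  # synced to online backup
    snapshot_dir.mkdir(parents=True, exist_ok=True)

    # Capture the Debian/Ubuntu package list, exactly as "dpkg -l" prints it.
    listing = subprocess.run(["dpkg", "-l"], capture_output=True,
                             text=True, check=True).stdout

    (snapshot_dir / f"dpkg-{date.today().isoformat()}.txt").write_text(listing)

    # Example crontab entry to run it daily at 03:00:
    #   0 3 * * * /usr/bin/python3 "$HOME/bin/pkg_snapshot.py"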

I just could not live with a stock Mac or Windows setup (or Linux, though, to be fair, there's really no such thing as a "stock" Linux setup) and still feel productive.

Then again, regarding Windows specifically, I just cannot feel productive on Windows anymore, period. In the 90s I would have considered myself a Windows power user, but switching to UNIX-y systems blew away anything I could ever do on Windows. Perhaps things have gotten better, but I have zero motivation to go back when I already have the real thing right in front of me.

In the end, though, everyone is used to what they're used to, and switching costs can be pretty damn high.


What you feel is interesting, but not the point. What you spend in engineering time setting up this machine is a cost to your employer. Not doing it is savings.

Balanced against a feeling of productivity?


Well, much of Apple's philosophy is that they pick great defaults for you. How come you haven't tried OS X?


* Most people don't use OS X

* I find their paradigm very bizarre. The nail(s) in the coffin for me were the inconsistent and incomplete patchwork of shortcuts and modifier keys. Even with universal keyboard access enabled, there is no way to do some very basic things via keyboard out of the box (e.g. move, maximize, minimize windows). Every time I use OS X I feel like I'm fighting the system. The only way I can get anything done on it is via the CLI, which defeats the purpose of switching (at that point, why not just ssh into it from Windows?)

If I had started out on OS X it would probably be great, but as it is, the learning curve and opportunity cost are too steep for me.


You don't realize how miserably composed your writing is until it's uneditable.


For me, yes. I've used Windows, OS X, and Linux, but I prefer Windows the most because of the huge amount of software available for it, and because it's the most familiar (I've been using Windows since 3.11); I've done a lot of work with the Win32 API. My environment has been set up with everything the way I like. It'd be perfect if the default command-line shell were more *nix-like.


Given the time interval you're talking about (man, over 20 years now!), that's more than enough time to get up to expert level on any OS. I started on the Microsoft train back at DOS 5.something, and started using Windows at v3.0 (and wrote my share of shitty VB and MFC apps on 3.11, 95, 98, and 2000), but I left the MS world around '00/'01 and will never go back.

(Oh god, remember Win32s?)


Is it about the huge number of crap shareware options you can download for Windows, or about getting a job done?

It's been years since I've ever been unable to do any task on OS X, and do it with the best tools available.

There's so much useful, quality, well picked stuff built into OS X that the "quantity" of software available is irrelevant.


- By choice (Win 7)

- I simply prefer the experience. I feel like I can navigate around the environment more quickly. Also: not being tied down to Mac hardware is freeing. I've found the HP ProBook to be a decent Windows laptop. It shamelessly mimics a lot of the MacBook Pro's form; in fact, the chiclet keyboard can use an MBP silicone keyboard cover.


By choice. It runs my older hardware better than anything I have found, and with 8.1.1 Pro I can run Ubuntu Server in Hyper-V, which is a Type 1 hypervisor. That combo gives me the best of both worlds. I would also consider purchasing a MacBook Pro to run this setup if I could get a guarantee from Apple not to "sabotage" the setup in some way, to still honor the warranty for the hardware, and to let me boot to Windows without using Boot Camp. As the rest of the world catches up to the MacBook Pro's screen quality, if Apple officially supported different OSes I think they could steal a huge number of users from the OEMs.


Windows 7 is a solid OS. Also, I built my own rig and own a Windows copy of Adobe CS4. If Adobe started making a Linux version I'd switch to Ubuntu.


Yes, by choice. Because I know it well and it just works.


I'm running an 8-year-old self-built computer with Windows 8.1, and it still works wonderfully even with Photoshop, Visual Studio, and phpStorm. The only slowdown is the hard drive, which could stand an SSD upgrade someday.

Macs just seem like hardware that's too low-end for the price, and an 8-year-old Mac wouldn't run this well. I also couldn't get over the lack of a right-click on the mouse. Linux is fine for the server side, but after years of MythTV I got tired of getting things to work.


I choose Windows because it runs the software I value the most (Foobar, Photoshop, FlashDevelop). I prefer the Linux ecosystem (package managers, better terminal, etc.) but the software just isn't as good for my needs.


Macs cost too much, and Ubuntu just doesn't support enough /stuff/.


Software. At the end of the day, operating systems exist to run software, and Windows is the only OS that runs all the software I want/need to run.


By Bread!


Anyone else worried that using virotherapy may result in those viruses building up resistance, similar to what's happening now with antibiotics and superbugs?


Unlike bacteria, cancer isn't spread from person to person, so it's impossible for the treatment of person A's cancer cells to impact the treatment of person B's cancer cells. However, the cancer cells within a single person can become resistant, which is why this treatment is a one-time shot.


Resistance to what? There's no particular selective pressure on the virus.

If you mean the cancers becoming resistant, one of the appeals of using viruses in therapeutic settings is they can evolve right alongside their targets, helping mitigate some of those resistance problems.


Apologies, I was referring to the cancer becoming immune.


You've got a good concern, just backwards: the virus is cleared from the patient, but the body becomes immune to the virus. Thus, it can only be used one time, which may limit the therapy's effectiveness.


Yes, this is what I was curious about. As happy and excited as I am about the breakthrough, I'm also nervous about its repercussions.


This is big. Although, to be honest, I'm still trying to understand how a company that sells quality stuff is buying the headphones that most audiophiles think are meh.

Is all this really just for the name?



1. Isn't it likely that any streaming rights are cancelled in case of acquisition? That seems like a no-brainer for the music execs to put in a contract.

2. And aren't streaming rights only good for a few years anyway?


Not to mention, Rdio destroys Spotify and Beats Music on all levels.


This confused me too, but the quality of the headphones that come with Apple's products is not reputed to be great either. (I can't say whether this is true. Hearsay at best.)


The old Apple headphones were average (for their price), but overall the newer EarPods are much, much better.


EarPods are quite decent and the design is very ingenious.

However, the frequency response is pretty neutral. I'd imagine the Beats acquisition is probably at least in part about making better headphones. I don't know if Beats' streaming licenses are transferable.


Audiophiles already think iPods sound meh. Which they in fact do; it is a fact that the DACs aren't great. Neither are the headphones that ship in the box. Apple products are not marketed on their audio fidelity.


Not quite true - Mac desktops are marketed as audio processing units. They even come with the software preinstalled. They took pains to make the mic-speaker loop low-latency so you could listen to yourself as you record.

Since other products are all about the audio experience, it does seem strange that the iPod, the device which most of humanity uses to experience music, does such a poor job. And Apple doesn't care. And neither do the people listening. Strange.


The inbuilt microphones do work very well. The whole loop is, as you say, heavily adjusted to compensate for the resonances of the unit. I wouldn't be surprised if there was some quite fancy DSP powering that.

I thought that was for making FaceTime and Skype etc. work well, though? Musicians are not using the built-in audio interface; regardless of quality considerations, it lacks the necessary functionality.


I don't buy that.

Everyone I know has a sub $100 USB interface hanging off their machine and swears constantly at the latency and wishes they'd just stuck with ASIO on Windows...

Personally I still have a workstation...

As for the iPods, they sound horrible. The latest nano line has noticeable noise on the outputs.


Since Windows Vista there has been 100 ms of latency added to the audio loop. So your experience was XP? Keep that machine around.

USB interfaces are a terrible idea - they are polled and add latency every time.


That's basically rubbish and misinformation.

You're complaining about WDM audio. There are built-in WaveRT and external ASIO stacks as well. The latter has become the standard on Windows for professional audio and is direct hardware access.

USB audio devices both are and aren't polled. The cheap-ass speakers and such are, but anything $20+ uses buffered raw device access.

I can get 8 ms out of a Windows 7 laptop with ASIO over USB that isn't even tuned. My Korg Triton Studio manages about 4 ms, and that's seriously high-end kit.


Right, but ASIO isn't Windows; it's third-party, there to work around the Win7 issues.

WaveRT is what WDM uses, right? With the associated delays of crossing between user and kernel mode. How does that help?

And USB devices are polled, at three levels as I recall: 'control', 'interrupt' (including 'isochronous'), and 'bulk transfer'. The frame interval depends on the USB speed, and I see they are sub-millisecond, so very adequate for audio sampling. I used to use it for serial protocols, and that frame rate was entirely inadequate for signalling, so I formed my low opinion of USB at that time.
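
To put rough numbers on the millisecond figures in this thread: per-buffer latency is just buffer size divided by sample rate, with the USB polling interval sitting on top. A quick sketch follows; the buffer sizes and sample rate are illustrative assumptions, not anyone's actual measurements:

    # Back-of-the-envelope audio-latency arithmetic (illustrative numbers only).
    sample_rate = 44_100                 # samples per second (CD-quality audio)
    buffer_sizes = [64, 128, 256, 512]   # common ASIO buffer sizes, in samples

    for n in buffer_sizes:
        latency_ms = n / sample_rate * 1000
        print(f"{n:>4} samples -> {latency_ms:5.2f} ms per buffer")

    # USB polling adds on top of buffering: full-speed USB frames are 1 ms,
    # high-speed microframes are 125 us, so the polling interval itself is
    # sub-millisecond, consistent with the comment above.
    for speed, frame_ms in {"full-speed": 1.0, "high-speed": 0.125}.items():
        print(f"USB {speed}: {frame_ms} ms frame interval")

At 44.1 kHz, a 256-sample buffer works out to about 5.8 ms, which is in the same ballpark as the 8 ms ASIO-over-USB figure quoted above.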


Apple's iPod division was always more fashion-based than the rest of the company. Now that iPod sales are declining, maybe the fashion aspect will live on in the headphones?


Never looked at it that way. Great point!

