Oh, this research is very much in the interest of the businesses — because staying at home also means spending less money. The idea is that if the taboo around "having fun" — aka spending money — alone is broken, then well… profit! This article of course contributes to these interests.
Personally, I think time spent at home can be as good as time spent outside the home. The article in question does not do any comparative study — it is just assumed that staying at home means not having fun.
Ya, I agree with you. I have to call bullshit on the part that says, "In the fifth, Ratner and Hamilton put the preferences to the test by gauging whether people actually enjoyed visiting an art gallery more when they were in the company of others, compared to when they were alone."
I used to have a museum membership, so I could go every day for free if I wanted to. I often went every other day just to get coffee, or to use the free wifi for work. It was subjectively a very different experience than being with someone.
I am pretty sure if you measured something like endorphins or blood pressure, you'd see there was more enjoyment with someone than without. There are tons of studies that already do this, e.g. the ones written up by UCSF profs in "A General Theory of Love."
Everybody wants to be Elon Musk but the way post-industrial unregulated capitalism works leads to a handful of Musks and their corporate entities controlling, well, pretty much everything, and fierce competition among the masses for scarce resources. AKA feudalism.
The era of the middle class was a blip on the historical radar. Feudalism ruled supreme for thousands of years, and it couldn't be clearer that we are currently backsliding into an era of "digital feudalism".
It's not like Google, to name the obvious example, is really hiding their intentions. Larry Page said just a few weeks ago that he thinks government is becoming more irrelevant and "outdated," and nobody even blinked. Well, who's going to replace the outdated dinosaur government? The writing is on the wall. Anyone who seriously believes in the pipe dream of a libertarian utopia is a deluded fool.

The innocently named "Internet of Things" is bringing the big SV players into every corner of every home, and it's not about controlling your toaster with a smartphone, but it most certainly is about control. How exactly things will play out is impossible to predict - a 2010 podcast of academics and students discussing Facebook, Twitter, and MySpace already sounds quaint and old-fashioned - but history shows what happens when power is concentrated in a few hands. And the USSR was supposed to be a utopia too...
Personally I quite enjoy living in tumultuous times but it's worth remembering the Chinese saying "may you live in interesting times" was used as a curse ;-) Don't be evil, now!
I graduated from Notre Dame with a Computer Science and Design degree. While the core classes are heavily theoretical and force you to think about fundamental CS concepts (Theory of Computing, Programming Paradigms, Data Structures, Algorithms, etc.), there are electives tailored to learning how to program. I took electives in Mobile Application Development, Building Web Apps, JavaScript, Database Concepts, Data Mining, Human Computer Interaction, and Healthcare Analytics. We even have electives in Cloud Computing that allow people to learn how to use MapReduce and other cloud frameworks.

If there was something I wanted to learn that wasn't taught in a course or elective, I either learned it through research (Data Mining & Machine Learning, along with Distributed Systems) or on my own in my free time (Ruby on Rails and EmberJS). People can't expect a Computer Science degree to teach them how to be a Rails developer; they should take the initiative to teach themselves after they get the fundamentals.
The theory, I think, is nice for a small subset of who we call programmers now. The article highlights that there is a huge demand for what you might call blue-collar programmers who don't need to care about theory, because there are plenty of simple, non-groundbreaking jobs to be done. Your basic CRUD apps.
Those blue-collar programmers need not only more of a vocational education, but tools to match. Higher-level programming languages help. In mobile app development that's what's driven demand for PhoneGap and newer tools like Glide, again for those common, boring apps. https://www.kickstarter.com/projects/1783091318/glide-beauti...
But isn't that why there are coding bootcamps, to allow those blue-collar programming positions to be filled? Like the Iron Yard, General Assembly, Hack Reactor, etc. I think there needs to be more advertising of these alternatives to a Computer Science degree if that's what somebody really wants to do.
In my opinion, if you want to learn how to be a software engineer, you should take a degree in Software Engineering (a natural descendant of Electrical Engineering). If you want to be a computer scientist, then do a degree in Computer Science (a natural descendant of mathematics). The problem is that the courses in many universities are misleadingly named.
But they're still not going to let you skimp on the calculus classes. I do think 4-year degree programs should offer people more opportunities for the practical aspects. These don't necessarily have to be part of the curriculum either.
I did two summers in the College of Wooster's Applied Mathematics Research Experience (AMRE) program, where they paid us a small stipend and provided on-campus housing for the summer. It was run as a little faculty-advised math/comp-sci/econ consultancy to help local businesses, and it was possibly the most valuable thing I got out of college.
All without having to skimp on actual computer science and math in the curriculum. These things are supplemental, not exclusive. And of course not everyone needs to go do a full 4-year degree if they just want to get out and make software.
But those people shouldn't be in 4-year computer science degree programs. And we should stop talking about this issue as if it's computer science that needs to change. It's about making people understand their options and what they will and will not get from each. For most quality CS programs, you have to pursue the practical outside the curriculum. For a code school, you might have to do some self-exploration about common problems encountered in computer science.
Everyone needs to find their own correct balance for their personal abilities and goals.
This! Looking back at my career I didn't really learn to program properly until I was in an industry job staring down the barrel of a massive codebase collaborating with colleagues across different teams. It was great.
But time and time again I find my strong theoretical grounding in computer science coming in handy. The last time I hit the front page of HN was with a blog post about how I spotted an NP-complete problem in a multimillion-dollar enterprise system for insurance companies. It would have shipped to clients with unresponsive, brick-like performance if they had upped a couple of parameters, because my colleagues didn't understand the Curse of Dimensionality and why it was important.
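For anyone who hasn't run into the term: a tiny hypothetical sketch (the system and numbers here are made up, not the one from the post) of why "upping a couple of parameters" can be catastrophic. Any naive exhaustive search over parameter combinations grows exponentially with the number of parameters.

```python
# Hypothetical illustration of the Curse of Dimensionality:
# a brute-force search over parameter combinations does work
# that grows exponentially with the parameter count.

def combinations(values_per_param: int, num_params: int) -> int:
    """Number of settings an exhaustive search must examine."""
    return values_per_param ** num_params

# With 10 candidate values per parameter, each extra parameter
# multiplies the search space by 10:
for num_params in (2, 4, 8):
    print(num_params, combinations(10, num_params))
# 2 -> 100, 4 -> 10000, 8 -> 100000000
```

Going from 2 to 8 parameters turns an instant computation into a hundred-million-case search, which is exactly the kind of "brick-like performance" cliff described above.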
I'll admit that college professors can live in their own world of academic navel gazing, but there is an actual point to theoretical computer science. And the best time to learn is when you have the time and freedom from short term goals and arbitrary client deadlines. If only there were a few years we could devote to such endeavours before entering the workplace. Hmmmmm...
Right, I get that. But if you look at job listings and see one for "computer scientist" and one for "software engineer," what sorts of things would you expect to see under each listing's "required skills" and "duties/responsibilities"?
I wouldn't expect to see a job listing for a Computer Scientist at all. Except maybe at Google? 99% of the market is writing code.
Maybe that's broken - more shops could sure use a good designer who understands the complexity of problems. But I don't see designers very often, and I've worked at dozens of places.
Amazing! I believe everybody has a strength, it's just not every strength is a celebrated achievement. I could've been a nationally ranked hopscotch player at 9, but alas...
Just like vinyl, and even more so, there are sound, non-"hipster" reasons one might choose to shoot film.
The major one that keeps many artists coming back is medium/large format. It's much much cheaper to get an extremely high resolution photograph on film. Medium format is something like 100 megapixels, and it costs about a dollar per shot after initial expenses. The higher resolution might not matter on monitors, but it makes a huge difference in size limitations and sharpness when printed, and prints are generally the goal for artists.
True large format like 4x5 costs something like $10 a shot depending on what film you use (I've heard it can cost a lot less if you shoot cheap medical b&w), but it has insane resolution, measured in gigapixels. You can print it wall-sized, no problem. On top of that, only a large format field camera gives you the full range of movements, such as tilt and shift, and the ability to correct for some types of perspective distortion.
This stuff does not matter for photojournalism, weddings, or sports, but many professional artists still choose film. They never really stopped. This is in contrast to DJs, the largest supporters of vinyl through the '90s and '00s, who seem to have mostly stopped spinning vinyl unless they're scratching.
As a hobbyist, I appreciate that film makes me think more about each shot. I hate the immediate feedback of digital. I love film's tactile nature. I love turning off the screen and hitting the darkroom. But for me, I agree it's definitely a lifestyle choice.
You are comparing vinyl to film, but in doing so you are implicitly comparing the experience of the consumer of the audio with that of the producer of the image.
It is interesting though, that the "analog vs. digital" takes place both in photography and music in both the production and consumption stages. You can record analog or digital and listen to analog or digital sources of the recording. Likewise with photography, you can use a digital or film camera and then you can view the image on a print from a darkroom or on your computer monitor.
It seemed as though you were using the term "hipster" to imply vinyl was more about style and trend. If that is the case, I wouldn't characterize the sonic differences between analog and digital recordings as simply "hipster" differences. There is a quantifiable difference between an analog and digital wave. Not saying one is better than the other but they are different.
Your argument about the differences between film and digital seemed to boil down to the economics of the two mediums, not any aesthetic difference. That is interesting, because off the top of my head I can't think of any scenario in music recording where it becomes cheaper to go analog. I believe, in general, analog recording is more expensive.
> You are comparing vinyl to film, but in doing so you are implicitly comparing the experience of the consumer of the audio with that of the producer of the image.
> Your argument about the differences between film and digital seemed to boil down to the economics of the two mediums, not any aesthetic difference. That is interesting, because off the top of my head I can't think of any scenario in music recording where it becomes cheaper to go analog. I believe, in general, analog recording is more expensive.
You are correct, my apologies. From the producer standpoint, analog recording techniques offer few benefits compared to digital.
I didn't argue from an aesthetic viewpoint because I don't think the aesthetic viewpoint is worth arguing about, in that it's generally a non-productive conversation that ends up in "well I prefer x because it feels better than y". Although, I will argue one particular point: I find that vinyl creates an "equalizing" factor when listening to older music alongside newer music, whereas the increased clarity and lower noise floor of digital makes 50s/60s/earlier recordings sound considerably worse than contemporary recordings. A result of this is that when listening on vinyl, I am better able to look past poor recording quality and make decisions based on artistic quality. This, however, is merely a personal preference.
> It seemed as though you were using the term "hipster" to imply vinyl was more about style and trend. If that is the case, I wouldn't characterize the sonic differences between analog and digital recordings as simply "hipster" differences. There is a quantifiable difference between an analog and digital wave. Not saying one is better than the other but they are different.
I will direct you to this very enlightening page: http://wiki.hydrogenaud.io/index.php?title=Myths_%28Vinyl%29 -- in my view, the only quantifiable differences in audio between vinyl and digital are that vinyl has a worse noise floor, a generally smaller "usable" frequency spectrum (the highs deteriorate pretty quickly), and added surface noise, hum, rumble, etc.
I think the revival is hipster. I collect vinyl because it's often the only place to find certain genres of music (such as western swing and classic honky-tonk country), but I am generally hesitant to buy a pressing of a contemporary recording. I will do it, though, because I like having the physical product, but that is a stylistic decision more than one based on necessity/actual audio differences.
There is little I miss about shooting large format. The huge print thing really doesn't work; an enormous high-quality inkjet print from a sufficiently large sensor (a 60-80MP medium format back, or a 200MP multi-shot back if the subject is stationary) will usually look better subjectively. (Sensors are flat. So are glass plates. Film seldom is.)
There are two ways in which shooting film can give objectively better results than shooting digitally. The first is that a Zone System practitioner can wring an exacting exposure from deepest shadows to highest highlights in a single shot. That's especially true when using sheet film (with roll film, you're pretty much stuck with one development for the roll unless you're quick with scissors and can do development by inspection). There are no alignment problems, no interpolation, and no de-ghosting to perform, just a hell of a lot of dodging and burning, note-taking and test prints. Combining Zone System shooting on film with good scans and digital manipulation and printing is, in a sense, getting the best of all worlds for enlargements. And if you shoot colour, it's really the only practical way to use the Zone System, since reciprocity failure between channels meant that wild dodging and burning was always a bit of a science experiment with filters, etc.
The second is contact printing. We are a long, long way from being able to produce digital prints that are even in the same ballpark. Yes, they're tiny and jewel-like (unless you're shooting really large formats like 11x14 or 16x20), but they repay a close look with astonishing detail and depth. Not quite as much as a high quality direct positive (a good Daguerreotype is almost unbelievable, even if you forget that it's probably on the order of 150 years old and was made with a lens that is absolute garbage by modern standards), but more than a little impressive nonetheless. A contact print (assuming the picture has artistic merit at all) can still suck me in for an extended stay in a way that no enlargement, dye sub, or giclée can. Who knows? We might even have gotten there digitally, except that our printers became literally good enough for most purposes a few years back; only a fanatical devotion to ecstatic experiences with small prints, by someone in a position to build a printer, is ever going to change that.
I guess I'm a little spoiled by being at a university with a very expensive Hasselblad scanner that can actually pull that DPI without issues. I imagine it's much more challenging to achieve that resolution on a flatbed. However, if you have access to a facility that has a nice scanner, then it's probably much cheaper to simply scan your large format film than to purchase a digital back.
I would love to see some large format contact prints someday. It sounds incredible. An artist in the area was doing tintype portraits and I got to observe. I wish I remembered more of how it looked, but after a 6 second ("manually timed") exposure, the result was beautiful. I've always loved making small enlargement prints of my 35mm negatives, but being able to contact print sounds so valuable.
I agree with your points, but it might be worth noting that the resolutions you're talking about are not so easy to achieve in practice. To get 100MP out of a medium format frame you need to scan it at around 4000dpi, which isn't really achievable unless you send it to a professional lab (which can be quite expensive). Of course, there are cheap flatbeds now which claim to have an optical resolution of 4000dpi or higher, but you never get that resolution out of them in practice.
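As a sanity check on those numbers (the frame size here is an assumption: a nominal 6x7 medium format frame with a 56 mm x 67 mm image area), the pixel count of a scan is just the frame dimensions in inches times the scanner's dpi:

```python
# Back-of-the-envelope scan resolution: pixels per side equal
# frame size in inches multiplied by scanner dpi.
MM_PER_INCH = 25.4

def scan_megapixels(width_mm: float, height_mm: float, dpi: int) -> float:
    """Megapixels produced by scanning a film frame at a given dpi."""
    pixels_wide = width_mm / MM_PER_INCH * dpi
    pixels_high = height_mm / MM_PER_INCH * dpi
    return pixels_wide * pixels_high / 1_000_000

# An assumed 6x7 frame (56 mm x 67 mm) at 4000 dpi comes out to
# roughly 93 megapixels, i.e. the "about 100 MP" ballpark above.
print(scan_megapixels(56, 67, 4000))
```

So the "~100 megapixels at ~4000 dpi" figure checks out arithmetically; the practical catch, as noted, is that cheap flatbeds rarely deliver their claimed optical resolution.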
This looks like a feature that LinkedIn could incorporate if it ends up being useful. From the candidate's standpoint, it's a lot of duplicative work because much of this info is already on LI.
Also, for the tech industry, isn't it always assumed that a tech candidate is open to a better job, whether or not it's obvious on the LI account? So I don't know how useful it is to keep your own employer from looking. If your employer believed that at any given time 80% of its workforce was vulnerable to turnover, I think it would probably treat its workforce better. Would Zynga have treated its programmers better if it knew that many of them were ready to bolt?
This feature is what we are starting with. We would eventually evolve into a much different network than LinkedIn.
Regarding keeping the profile hidden: it's to guarantee that your current job doesn't get affected if your employer finds out that you are passively sourcing opportunities. This is a concern many of our users have, hence the privacy policies in place.
The biggest obstacle is the assumption that most beggars use cash for cigarettes, liquor and drugs. Until this assumption is overcome, there will be reluctance to give cash.