Kodak resurrects Super 8 (kodak.com)
278 points by tarp on Jan 5, 2016 | 198 comments


I watched the analog-to-digital-to-analog trend happen in professional audio (I was going to school for audio as ProTools was beginning to be a thing; we worked on analog tape machines...but the year after I finished my degree, they brought in digital ADAT machines for the small labs, and eventually went digital in the 24 track room, as well). It's amusing how superstitious people can be, especially in industries that are mostly subjective but happen to bump up against a lot of technology. Audio, photography, video, and now film have all been through this.

The final product will be delivered digitally for 99.9% of consumers. Why fight it? Why spend so much money, time, and effort, to work with inferior media? I dunno. I worked on analog tape machines (I was even a hold out, for a while, having a 1" 16 track machine, as big as a mini fridge, in my house for several years after digital multitracks were the smart choice), but there really is no good argument for it today.

There was a brief window where the best digital equipment was inferior to the very best analog equipment, but it didn't last long. Maybe five years. We may still be in that window for film when comparing 70mm film to the best digital equipment...but, on the low end? Hell no. This janky little camera from Kodak will be a joke compared to digital equipment in the same price range. And, the film/processing costs will be outrageous comparatively speaking, limiting one's options when shooting to a significant degree.

In short: This is just hipster bullshit. Just like analog audio is hipster bullshit.


Is it really an issue of image quality? Because while the qualities are different, analog media has its own character. You have to do quite a bit of image manipulation in digital to get that "super 8" look, and it isn't always that convincing. Same with black/white photography vs. digital -- the way a digital chip reacts to light is very different than the way chemicals in film react, and some film stocks have a range and tone that is very hard to emulate with digital images. I don't think it is a question of which is "better" but rather that aesthetic that an individual wants. You have to learn lighting techniques in a new way when you switch from celluloid to digital, and a good DP's intimate understanding of how a particular film stock will respond in the shadows and highlights no longer applies when dealing with a very different medium.


At first it might seem like a fad, but "character" is essential when making art. You are trying to reach another human being, not simply deliver the best image possible, so her past experiences, memories, the fact that she grew up watching grainy movies at the local theater, will all interact and elicit a certain emotional or instinctual response. It's not that analog is better, it's that your audience reacts uniquely to analog artifacts, even when delivered digitally.

This, and not superstition, is the reason we still have tube amps, 24 fps polyester film ("celluloid"), vinyl records and the rest. That's not to say superstition is not rampant in the professional fields. We've all seen it: gold-plated wires that deliver no measurable improvement, creators who refuse to touch the same application on a (much cheaper and faster) Windows PC as opposed to the "pro" Mac version, "magic" equipment brands that "all the pros use" and so on.

It's essentially a cargo cult: we try to emulate successful creators and get fixated on the appearances. If we get success, often by sheer luck, we attribute it to brand X or Y and spread magical thinking to others.


Heh, I'd say tube amps are a bit more out there than everything else you listed (unless you mean tube headphone amps or something). It's INCREDIBLY difficult to accurately emulate tubes on normal computers in real-time due to their non-linear behavior. And there is at least some kind of science behind tubes "sounding better" since second-order harmonics are supposed to be more enjoyable for a listener, and are generated naturally by a single-ended tube amp design. I wouldn't say musicians using tube amps are trying to elicit memories of the past; rather, that's just the best way they can make their instrument sound good.
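
To make the second-order-harmonic point concrete, here's a toy numpy sketch (not a model of any real amp): a symmetric clipper on a pure sine adds only odd harmonics, while an asymmetric transfer curve - a crude stand-in for a single-ended stage - adds a strong second harmonic as well.

    import numpy as np

    fs = 48000
    t = np.arange(fs) / fs
    x = np.sin(2 * np.pi * 440 * t)    # pure 440 Hz sine, exactly one second

    # symmetric clipper: odd harmonics only
    sym = np.tanh(3 * x)
    # asymmetric curve: even harmonics too (crude stand-in, not a real tube model)
    asym = np.tanh(3 * (x + 0.3)) - np.tanh(0.9)

    def harmonic_db(y, k, f0=440):
        # level of the k-th harmonic relative to the fundamental, in dB
        spec = np.abs(np.fft.rfft(y * np.hanning(len(y))))
        return 20 * np.log10(spec[k * f0] / spec[f0])

    print("2nd harmonic, symmetric clip:", round(harmonic_db(sym, 2), 1), "dB")
    print("2nd harmonic, asymmetric:    ", round(harmonic_db(asym, 2), 1), "dB")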


Sure, on a normal computer that may be the case. But, products like the Kemper Profiling Amp (https://www.youtube.com/watch?v=h0SmSl1aS1w) are in the same price range as high-end amps and have been able to model _any_ amp with incredible precision for years already. Still, it's easier to find a wide selection of analog amps in recording studios than one of those. That said, I think it has a lot to do with the guitar player "fetish" of recording on a boutique valve head with a pair of 4x12 cabinets.


> creators who refuse to touch the same application on a (much cheaper and faster) Windows PC as opposed to the "pro" Mac version

You're ignoring the fact that the application must run within the confines of an operating system.

I'm not using a single application at a time. "The same application" you refer to might be nearly identical on both platforms, but I need to switch between applications. On OS X I have Mission Control (with trackpad gestures!) that makes context switching incredibly efficient. On Windows I have to click taskbar buttons, or press alt+tab hundreds (thousands?) of times a day, or take a break to hit windows+tab to have a laugh at the incredibly useless task switcher (I've never understood why windows+tab was allowed to ship). How people multitask on Windows is beyond me, with no friendly built-in solution and third-party applications that all have problems.

How about copy/pasting? Using ctrl+c and ctrl+v on Windows with a PC keyboard is frustrating compared to the cmd+c and cmd+v finger positioning on an Apple keyboard with OS X. Microsoft requires an awkward readjustment to reach for the control key, while Apple uses your thumb that is already resting on the spacebar to hit the cmd key that is right there. When you use copy/paste hundreds of times in a typical day (ex: programmer), this minor annoyance adds up.

How about finding per-application settings? On OS X this is always accessible via the cmd+, shortcut and the menu entry is always within the app's primary menu. On Windows every application has its own shortcut (if any), and the menu entry might be found under any of File, Edit, View, or Window. It's always a hunt just to open application preferences. I personally find this to be a frustrating experience and a waste of my time.

Finally, I simply prefer the visuals and widgets/controls of OS X compared to Windows. OS X is flat and simple whereas Windows tries way too hard to look "cool" (ie: designed by children for children). Also, OS X developers (other than for game clients) don't think it's cool to throw away the default window border and window controls (minimize, close, etc.). Whereas far too many Windows developers think it's cool to customize their fucking window style. Leave the system components alone.


Even photography experts can't tell the difference between a digital photo that has had simulated grain and tone manipulation added in post, and a picture shot on film and scanned to digital.

I suspect it's the same thing in cinematography.

Double-blind tests wreak havoc on all kinds of fetishism, from wine tasting to high-end audio.


You are confusing the experience of the audience with the creative process of the artist.


That is a good point. It is a different experience to shoot with film than to shoot digital. However, I haven't heard people argue for film from a strictly process point of view. It is, after all, a strength of digital that everything is just software--easy to manipulate, reset, copy, distribute.

Having said that, you are still probably right. There is always going to be someone that prefers an analog process--maybe BECAUSE of how inconvenient it is--to a digital process. Some people also like working within constraints as it stimulates their creativity.


With photography, film and audio alike, it's easy to prove that digital has orders of magnitude more precision and accuracy. That's great, but it's not the whole picture.

Mixing desks are said to colour the sound in certain ways, and in doing so, introduce harmonics and other sorts of imprecision that would be anathema to digital components, for which THD and SNR are paramount and utter precision is the ultimate goal. Then, in the end, people listen to the source material with their own ears, which have totally different frequency responses anyway; the eardrum is basically an analogue device that will colour the sound regardless :D

But then you start getting into things like guitar sounds. It's not that digital guitars sound better or worse - all you can really say is that they sound different. The real issue is that there's no objective sound that it should have - the electronics, whether digital or analogue, are effectively part of the instrument. And in the same way that analogue amps and effects colour the signal in specific ways that a guitarist or engineer might prefer, film imparts its own aesthetic qualities that a director or cinematographer might prefer.


I don't think anyone disputes that. The real question is, can a digital filter provide a close enough approximation of that colouring that going through the analogue path is no longer necessary? Are Instagram and Photoshop an adequate substitute for real life film grain?

I say yes, but I'm a software guy. And I've had this argument with a serious pro audio engineer who has a Pro Tools workstation built around a Neve 8816— even though every track is digital, and dozens of digital filters are applied, the tracks each pass through their own DAC connected to this old-school analogue summing mixer, and then are re-recorded on the other side. It seems totally foreign to me, but there are definitely recording professionals still spending a lot of money to set up and maintain these workflows.


> the tracks each pass through their own DAC connected to this old-school analogue summing mixer, and then are re-recorded on the other side. It seems totally foreign to me

There was, at one point, good reason for doing this. Digital summing is really easy to get very wrong. ProTools used to introduce some really nasty artifacts back in the 16-bit days if you had non-linear effects.

24-bit chains have made a lot of this moot. And using 32-bit float as intermediate calculation steps makes even more of it moot. And, increased computer power ... etc.
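
For anyone curious what "getting summing wrong" looks like numerically, here's a toy numpy sketch (made-up signals, nothing to do with how ProTools actually summed): quantizing every gained track back to 16 bits before it hits the bus accumulates error across 24 tracks, while keeping the bus in float and quantizing once at the end leaves only a single rounding step.

    import numpy as np

    rng = np.random.default_rng(1)
    n, n_tracks = 48000, 24
    tracks = [rng.uniform(-0.3, 0.3, n) for _ in range(n_tracks)]
    gains = rng.uniform(0.05, 0.5, n_tracks)

    def q16(x):
        # round to the nearest 16-bit step (the precision of an int16 bus)
        return np.round(x * 32767) / 32767

    ref = sum(g * trk for g, trk in zip(gains, tracks))          # double-precision reference

    bus16 = sum(q16(g * trk) for g, trk in zip(gains, tracks))   # every track rounded before the sum
    bus32 = q16(ref.astype(np.float32))                          # float bus, rounded once at the end

    print("peak error, 16-bit bus:", np.max(np.abs(bus16 - ref)))
    print("peak error, float bus :", np.max(np.abs(bus32 - ref)))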

Then there is the always present "Magic Box X adds tonal coloring that I like." You can't actually argue against this very well unless the box really doesn't do anything at all.


I think there are some answers to be found in technological histories, both creative tools and otherwise. The main point to consider is that new technologies augment the possibilities - outright replacement is actually the unusual case, because there's always some odd reason to have the old stuff around.

Artistic workflows definitely rank among those reasons; they do tend towards Taylorist efficiency when the artist doesn't want a restriction, but desire to be a maximally efficient artist in every aspect can lead to a deep probing of your motivations for doing any art to begin with, instead of just jotting down what the piece is supposed to be. So most artists will take on at least one technical restriction of some kind, consciously or no. That restriction subsequently guides and shapes everything else they're doing.

So for people who want their focus to actually be built on the limitations of analog, they need analog. Period. And that isn't most people, since there are many other ways to go about setting limitations, and what is usually needed when it's called for is an indication of analog, not its every nuance and imperfection. But like a car enthusiast who can enjoy old and inefficient designs, someone who really cares about analog audio will want to have the experience of the real stuff, despite the painstaking elements of doing so.

edit: And I welcome digital imitations. The newest VA synths really capture the imperfections of old stuff in a way I haven't heard from previous attempts, and I want to use them for that. But I don't want to have an analog box to tote around and maintain.


> can a digital filter provide a close enough approximation of that colouring that going through the analogue path is no longer necessary?

In principle yes, but it's surprisingly tricky, to the extent that even some very simple analog audio things, like '70s-era voltage-controlled-oscillator synths, are only recently able to be emulated reasonably well. The research field is called "virtual analog synthesis". I'm not sure if there's a canonical overview, but some googling turns up this 2014 PhD thesis: http://lib.tkk.fi/Diss/2014/isbn9789526055862/isbn9789526055...
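
A concrete example of why even a "simple" sawtooth VCO is non-trivial digitally: the naive digital ramp aliases badly on high notes, so virtual analog oscillators use band-limiting tricks like the textbook polyBLEP correction. A rough numpy sketch (generic polyBLEP, not anything from the thesis above):

    import numpy as np

    fs, f0, n = 44100, 1244.5, 44100          # a high note makes the aliasing obvious
    dt = f0 / fs
    phase = (f0 * np.arange(n) / fs) % 1.0

    naive = 2.0 * phase - 1.0                 # trivial for an analog VCO; digitally it aliases

    def polyblep(ph, dt):
        # polynomial correction that smears the wrap-around discontinuity over ~1 sample
        out = np.zeros_like(ph)
        m = ph < dt
        t = ph[m] / dt
        out[m] = t + t - t * t - 1.0
        m = ph > 1.0 - dt
        t = (ph[m] - 1.0) / dt
        out[m] = t * t + t + t + 1.0
        return out

    blep = naive - polyblep(phase, dt)        # band-limited-ish saw, far less foldover

    # Plotting np.abs(np.fft.rfft(...)) for the two shows the inharmonic junk in
    # `naive` between the true harmonics of f0, and how much of it the correction removes.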


I've got a bunch of friends who have worked on this at some point, and I know some of the people cited in that PhD thesis. I can still hear the difference, even with state-of-the-art current virtual analog emulations, like Diva.

Have to say the gap is narrowing though.

One approach I haven't seen used yet is using the latest AI/neural networks to learn and model the inherent nonlinearities in analog audio.
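
Rough sketch of the idea in plain numpy: fit a tiny one-hidden-layer net to a made-up asymmetric "device" curve from input/output samples. (Toy only: a static curve, whereas real gear also has memory, which something this simple can't capture.)

    import numpy as np

    rng = np.random.default_rng(0)

    # pretend measurements: sweep the gear, record what comes out
    # (the "device" here is faked with an asymmetric tanh curve)
    x = rng.uniform(-1, 1, (4096, 1))
    y = np.tanh(2.5 * x + 0.4) - np.tanh(0.4)

    w1 = rng.normal(0, 0.5, (1, 16)); b1 = np.zeros(16)
    w2 = rng.normal(0, 0.5, (16, 1)); b2 = np.zeros(1)
    lr = 0.1

    for step in range(5000):
        h = np.tanh(x @ w1 + b1)              # hidden layer
        pred = h @ w2 + b2
        err = pred - y
        g_w2 = h.T @ err / len(x); g_b2 = err.mean(0)
        g_h = err @ w2.T * (1 - h ** 2)       # backprop through tanh
        g_w1 = x.T @ g_h / len(x); g_b1 = g_h.mean(0)
        w1 -= lr * g_w1; b1 -= lr * g_b1
        w2 -= lr * g_w2; b2 -= lr * g_b2

    print("RMS error of the learned curve:", float(np.sqrt((err ** 2).mean())))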


I wish I had picked that for my master's dissertation - ML + DSP seems like a really interesting area!


I don't think the issue is with the system's ability to replicate the effect. Rather, with a digital system you're now faced with the problem of explicitly representing the effect, where the analog system had the effect built-in. If you can't quantify the effect then you will be unable to replicate it, and if you can then you have to do the work to replicate it. If these effects are a signature of your work, I can see how it would feel like a step backward to suddenly have to quantify them, and it would be very tempting to switch back to the equipment that already applies the effect.


For audio, yes, digital can replicate analog to 100% accuracy. You cannot differentiate the output signal using an oscilloscope. The signals are identical. People using workflows you describe are a cult, IMO.


I think this is true, but definitely not in real-time. Go observe how fast a complicated analog SPICE emulation is. Typically it's about 1000x slower or worse than real-time, though in the end the wave form might exactly match the actual circuit.


> Are Instagram and Photoshop an adequate substitute for real life film grain?

I thought you were making a good point about how filters were shitty needless imitations and then you threw it all out most unexpectedly.


I once had an audiophile coworker describe how he listens to his favorite records. In his words, the way it was intended to be listened to is probably (but not always) the closest thing to the artist's desired vision. A 2016 blues album recorded in an old shack in Tennessee and printed to vinyl is intended to be heard on a record player. A 2016 pop album released in stunning digital clarity, but also released on vinyl, is probably best listened to on a good digital setup. All communication and art is about authorial intent. What did they want to listen to?


It's the same thing as using Polaroid cameras with Impossible film, or vinyl records. It's to find some tangibility in an era of intangibles. The low fidelity is the point.


Absolutely - the lack of anything tangible in the modern digital world is a huge problem, and slowly people are realising this.

Humans aren't designed for a 'virtual' existence - the technologies may be very useful and empowering in many ways, but they shouldn't take over life and reality.


It's a creative tool. And it's not comparable to music production, because film has a huge influence on the end result, whereas in music you must have a very well trained ear to appreciate the difference.

I sometimes shoot medium format film and the artistic results are often better, albeit with lower fidelity. The reasons, as far as I can tell, are the technical constraints that force you to be more mindful of what you are doing, the unique way light is interpreted by film, and mechanically superior vintage optics.

Could I achieve the same results in post processing? Yeah, if I calibrate exactly to colour rendering of my sensor, figure out the correct white balance, colour space, and the necessary adjustments. Although, given that even DxO Optics with their FilmPack addon doesn't give me the exact results I get from say Fujichrome Provia 100F film, I doubt that it's that easy.
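
For what it's worth, one way to start on the first of those steps - matching the sensor's colour rendering to a film target - is a plain least-squares fit. Toy numpy sketch with made-up numbers (a real workflow would use measured chart patches from both the sensor and the target stock):

    import numpy as np

    rng = np.random.default_rng(2)

    # pretend paired measurements: the same 24-patch chart as linear RGB,
    # once from the digital sensor and once from the film you want to match
    sensor = rng.uniform(0.05, 0.95, (24, 3))
    film = sensor @ np.array([[ 1.10, -0.05, -0.05],
                              [-0.08,  1.05,  0.03],
                              [ 0.02, -0.10,  1.08]]).T
    film += rng.normal(0, 0.01, film.shape)            # measurement noise

    # 3x3 matrix that best maps sensor colours onto the film's colours
    M, *_ = np.linalg.lstsq(sensor, film, rcond=None)

    def toward_film(img):
        # apply the fitted matrix to a linear-RGB image (H x W x 3)
        return np.clip(img @ M, 0.0, 1.0)

A linear matrix only gets you partway, of course; the nonlinear per-channel response and the grain are the genuinely hard parts, which is presumably why the canned emulations never quite match.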

Telling everyone you're vegan is hipster bullshit. This is just an artistic tool.


Sometimes it's the imperfections that we are attracted to. I started with digital audio a long time ago, but have been buying almost solely analog equipment (synths, effects, reel2reel) for the past 5 years. Often it just sounds more alive. I still use digital too, a lot, but it's nice to have another type of texture in the tool box.

Not sure I'd film on Super 8 though; it ends up being pretty costly and a hassle, and you can get a pretty good match to those colours (if that's what you want) using flat or RAW profiles and film stock LUTs on digital cinema cameras.


I think analog synths are in a different category because they generate sound vs. capturing it. When it comes to instruments like that you can't make an argument that digital does it better, because that is subjective. Similarly, I don't think a synth with a sampled-guitar mode can replace the guitar, because it would be impossible (or very hard) to capture all the quirks/permutations of strings together. You could make a case that capturing guitar sound digitally is better than on analog, which is more in line with what Kodak is doing.


As far as art, it's all subjective, there is no "better" :) Accurate - now that's a different story.

And I agree with you, analog nonlinearities are more apparent when you generate the sound.

Having said that, as a practical example, there is a big difference in the sound of drums (loud, fast transients) recorded to tape vs digitally. I really like the sound of tape for certain things. For some things, it doesn't make that much of a difference. The amount of mojo depends on your tape recorder, tape formulation and amount of overdrive.


It's not about things sounding/looking better, but about them sounding/looking different.


And most important of all, showing off spending money.


Where there are limits, there is art.


I compose music on a Gameboy and a Commodore 64. You don't have to tell me. But...well, this is just spending unnecessarily large amounts of money for mediocre output.

I feel like there's a difference between working with limited means as a tool for creativity, and something like this.

But, I guess, more importantly: I hate the delusion, under which so many people operate, that analog provides higher quality than digital. If your argument is that this is a great idea because it kinda sucks and working with kinda sucky equipment makes you feel more creative, then I can't argue (again, I enjoy writing music on a Gameboy, which is truly sucky), but the moment you make the argument that it looks/sounds better than modern digital equipment is the moment I dismiss the opinion as hopeless superstition.


Sometimes it is.

Go look at great pixel art from the 80's and 90's era on an analog CRT.

Overall, modern displays do more and are better in every way. But, I find watching SD media is best on analog SD media.

Or I see a movie in ultra high resolution. Looks like a set. I can see it. But my copy of "How The West Was Won" reissued on Blu Ray is one of the very best I've seen.

This stuff has subtleties that people appreciate.

I find it much easier to go looking for and appreciate art where I see it, and largely ignore the more objective pursuit of perfection. It's a good thing, but I don't always care.


> Go look at great pixel art from the 80's and 90's era on an analog CRT.

> Overall, modern displays do more and are better in every way. But, I find watching SD media is best on analog SD media.

Well, this is because the pixel art was specifically designed (sometimes intentionally--sometimes just because that's what the artist used to make it) to the limitations of the CRTs it was going to be used on. Consequently, it looks worse when viewed on something that doesn't have the particular blurring, filtering, circularizing characteristics for which it was designed.


TL;DR version = new tech is superior, not always better in the artistic sense and not always worth it in the trust and make, do, build sense.


Indeed, though I do recommend watching that old western on BluRay. Someone who knows how to nail it with film actually does capture way more than they even expected back then.

There is a fine art there to be appreciated. This is why Spielberg still shoots film. He has a mastery that continues to have value. He may find all of that blunted in the digital realm. Can't blame him.

I think we will eventually find the art in high resolution digital comes down to things other than perfection. Sometimes too much of a good thing is too much.

On another thread, I mentioned Pink Floyd, "The Wall" and how it has been reissued on CD and a remastered gold version that is insanely good.

Actually, it is too good. The average person may well appreciate the production values from the earlier work more, despite its considerable distance from perfection.

The trouble in all these discussions centers on what people like as opposed to perfection. Those two are not always the same, and "better" always has that subjective component to it.

Many digital movie productions I see have subtle aspects to them one can find distracting or that break the immersion a movie is supposed to deliver. Not breaking that is Cinematography. I remain unconvinced everyone really understands that.

That's where the art is.

This is not to say advances in tech are bad or to be discouraged. Neither is true.

However, value perception of said tech may vary considerably from expectations.

On your last point, we actually lose the pixel art, and often the discussion goes to other things. Fine, but it's not always bad to see the pixels.

With gaming, extreme realism, or just extreme quality, can break some of the escape and fantasy, abstraction inherent in the entertainment form. There is definitely room for both and an active retro culture and indie scene taking liberally from retro in order to see that art continue.

There are also some economies with analog means. I'm on a chip project right now that can offer up a great analog display. (Truth is, it will do 4K on analog, no sweat, if one wants to do that.)

Some of that is lost on digital devices. The thing is, generating the analog is at least an order of magnitude leaner, while being able to offer comparable quality. Barrier to entry is low. That is also high value.

Raising that bar is good in many cases, but not all. So I find myself dealing with many subtle timing matters and artifacts of A-to-D conversion, all of which are a non-issue on actual analog displays. I will end up with a lean device that can do HDTV signals nicely, while still maxing out an old TV, which does way more than most expect, and do so without an awful lot of hassle.

As a "do it myself" kind of person, I do not always see the benefit of complex, resource intensive signals, compression, etc... as a good, or the better thing. And there are IP concerns too, all near completely absent from analog means and methods.

So then, "better" takes on some new depth when one is making or building from first principles. In the end, the system will deliver a great display for a fraction of the effort required to employ a fully digital path.

That's tech I know down to the core, can trust and control in any way desired. Does exactly what I want. High value as far as I am concerned. Timeless too. Works on anything ever made.

I just got a media player that refuses to interface with my other goodies. HDCP in play. It's hooked up via analog component, and delivers the great experience it is supposed to. Funny how that mess can all work.


Me too, and I like to work with small and old computers too, and for similar reasons.

The thing is, if a tune is really good, that it was done on SID won't matter. It's good art.

Not better in an objective sense, just something we crave.


99% of this is not people creating "better" (whatever that is) art. It is, if my experience with photography is any guide, people churning out the same crap everyone else does, while affecting a position of being somehow superior because they "aren't obsessed with the latest gear" - while being, in fact, vastly more obsessed with what equipment to use.


Their skill may be mapped to specifics. It's a kind of technical debt.

Remapping those onto new gear may cost them more than they feel it's worth, or they don't see the value somehow.

Whether that makes objective sense varies a lot.


The main driver is efficiency. Smaller, faster, cheaper. More accessible. More convenient. That's basically the heart of all technology advancements.

However, the end result may or may not be better/higher quality. Transportation is moving from mostly hydraulic/mechanical to electrical/computer-controlled; comms from analog to digital; food from local whole foods to distant mega-farms and processing; etc.

Old technology is far less efficient, but can be much more robust. For example, knowledge recorded in a book cannot become corrupt due to a failing hard disk controller, or lost due to electrical surge from a nearby lightning strike. And, one can still send a message over radio waves using Morse Code to distant places even when cell phones fail due to network outages or cyber attacks.

So while we are smaller, faster, cheaper and much more accessible today (in almost every way... thanks to technology), our systems are much more fragile and dependent on other components. In many cases, this is why older technology is used.


Friend, analog audio is not hipster bullshit. Running through vacuum tubes and transformers, printing to tape, dumping it back into a DAW is not hipster bullshit. It's using the best tools for their portions of the job.


Did you miss the part where I went to school for audio? I don't want to appeal merely to authority, but, well, at some point I become frustrated at the level of superstition that backs these kinds of claims.

Look, I own multiple all-tube amps for my guitars. I understand the appeal (though it's much more complicated than "tube is better"). I even understand the science for why tubes sound "good" and why solid state amps sound "less good". (To be clear: It is not merely the presence of tubes.)

There are so many confounding variables, however, that people end up believing that a shitty so-called "tube" preamp that costs $20 to manufacture and runs on a 5V power supply is somehow superior to a high quality solid state preamp just because it has a tube. The most sought-after old Neve consoles and channel strips that people love so much? Solid state; not a tube in sight. They're loved because they were extremely high quality, and have a subtle distinctive sound that is what our favorite records sound like (at least, our favorite records from a certain era of well-funded studio dilettantes).

And, while we're at it, the best way to capture that particular color is digital. The difference between Dark Side of the Moon and a digital recording today that doesn't sound as amazing as Dark Side of the Moon has less than nothing to do with DSOTM being recorded analog vs. digital and everything to do with the kind of budget, time, and skill Pink Floyd had in the studio.

I'm not arguing there isn't a difference in quality to be discerned between different pieces of equipment. There absolutely is. But, Super 8 is not, and never was, a high-quality way to get images onto a screen. It was a compromise based almost entirely on cost. It just so happened that for many years it was a compromise that was necessary for filmmakers on a tight budget...video took a long while to catch up, and it probably took the switch to extremely high definition digital video to really put the nail into film's coffin.


Get off your high horse. I went to school for audio too. I own a ton of the same stuff you do. I spent 15 years on the road as a musician and 8 of those in the studio as an engineer. I've soldered custom Trident snake heads onto new snakes so we could build our studio with that board as the centerpiece. I used to use Pro Tools before I discovered (as you apparently did as well) that software is a more comfortable living. I have audio credentials.

If you're telling me that there is no sonic reason to put stuff to tape then I'm sorry, you have no authority to me. BTW, the reason old Neve boards sound that way is because of the transformers and the analog components in the path. You're so convinced of your own opinion that you didn't even read what I wrote.


Come on man, everyone knows solid state amps > shitty 1950s tube amps! ;-)

Sarcasm aside, the modeling amp I'm looking at getting has a tube in it (Vox VT40X, 12AX7). I love my JVM410H but it's nice to not share with the entire neighborhood sometimes.


At first I was going to disagree with you on the premise that the noise added by the analog system may be quite desirable, but then I read some comments that had the prices in there. Holy crap this is bullshit.


Interesting to see history repeating itself a bit: "When you purchase film you will be buying the film, processing and digital transfer". Kodak was pursued by the DoJ in the 1950s and ultimately ruled against in an antitrust suit for doing the exact same thing with Kodachrome, the market leader in color (still) photography at the time. https://en.wikipedia.org/wiki/Kodachrome#Prepaid_processing


Looks like the consent decrees were terminated in 1995. http://caselaw.findlaw.com/us-2nd-circuit/1300513.html


Interesting.


Kodachrome was slide film. Very few labs would process it, though the colors were bright and some people really liked it. E6 was the process that most slide film used (even some of Kodak's).

Slide film was always a challenge to expose just right; its dynamic range is pretty narrow compared to negative film. It's what you needed to use for movies though.

I can't see this new revival being more than a niche market. Sending film away and waiting a week might not cut it in today's market when phones shoot HD and can edit.

There are a bunch of these. The "Impossible Project" [1] bringing back Polaroid film being another. Polaroid though has the advantage of being instant.

[1]https://www.the-impossible-project.com/


Polaroid instant film is awesome and is not like Super 8 at all (except they're both old). There is nothing digital that works as well as Polaroid (or Fuji) instant film. No, carrying around a digital printer doesn't work as well, yet. Integrate the digital printer into the camera so that you don't notice the printer at all, then it will moot the instant film. This Super 8 thing just seems crazy. It's conspicuously added inconvenience where instant film is just the opposite, an instant photo.


I'd say the Fuji instax SP1 works pretty damn well.


Home movies used a positive process, but TV and cinema would typically be shot using a negative process. Kodak has discontinued all of their reversal films except one (and it's black & white).


Thing is, in 2016 it's hard to argue that doing prepaid processing is to lock out the competition; at this point, it's a requirement to make Super 8 "viable" again. Where else will you get it processed?


Amusingly, it's also a video camera, since the viewfinder is an LCD display. Not clear if the manual aperture and shutter time settings affect the viewfinder. You can feed video into the display (why?) but it's not clear if you can get video out of the camera. The film costs $50 to $75 per cartridge, for a running time of 3.5 minutes. Market: wannabe hipsters and old guys in the movie industry.

Kodak makes movie film only because the major studios, at the urging of some older directors, pay them to do so.[1] (Pro movie film sales were down 96%) The studios have to pay for a certain amount of film whether they take it or not. This leaves Kodak with a paid-for, underutilized film production plant and film development facilities. That's probably why Kodak is doing this.

[1] http://www.wsj.com/articles/kodak-to-continue-making-movie-f...


This is probably what's called a 'video assist' in the film industry: when the film is being pulled through the gate, it's shielded from the light by a spinning mirror. In a fully-analog camera, that mirror reflects the light up to a ground glass target in the eyepiece, so the cinematographer sees the light that isn't recorded onto the film.

With a video assist, that light is simply captured by a sensor (yes, exactly like in a video camera) and presented to the user on a video display. Some use a beam-splitter to deliver the analog view to the cameraperson while still sending some photons to the video tap.

Note, in both cases, the light that hits the film itself is never visible to anyone until after development -- the camera crew gets to see the photons that are rejected, which can do weird things when you go to unusual shutter speeds (the image gets darker in the viewfinder as you increase the exposure)... though I believe there were some 16mm Bolexes that simply used a beamsplitter, so the viewer saw the same scene as the film stock -- but don't quote me on that.


Amusing idea: conceal a flash chip in the film cartridge and record video to it from the camera used for the viewfinder. Put in a sound generator to make clicky film advance mechanism noises when taking pictures. The film in the cartridge is just a dummy and is not exposed. When the film cartridge is mailed in for processing, download the video, run it through Filmlook to give it grain and jitter, then upload to the cloud server. If the user orders the "return processed film" option, print the video to film stock at the processing plant.


No hipster will even notice. :) If done right, it will be even hard to prove there was cheating in the process.


> You can feed video into the display (why?)

I'd guess it's cheaper/easier than using glass especially to support swivel. Unclear if there's a prism in the light path though or if they're getting the image to the sensor in some other way. It would seem to take away from the whole retro vibe though.


Also, the "viewfinder" is one of those (relatively speaking) huge panels that fold out from the side. Try doing that with a 100% analogue camera! I reckon it's a clever hybrid.

(Plus it has SD, USB, etc... but there's no documentation as to what they do. Maybe you can store lo-fi digital versions for rushes and offline pre-editing.)


The viewfinder is digital because people are used to a digital viewfinder and not an optical one anymore.


Um.

1. DSLR users are not.

2. The whole point of this exercise is to make a product for people who want a retro (and largely impractical) film movie camera. What people are used to is [EDIT: 100% digital cameras].


You may not have noticed it, but the benefits of film have been going extinct this decade. Check out how many feature films are shot digitally versus on film. Unless you are Spielberg, you are shooting digitally.

Currently, there are small digital cameras like the Blackmagic Pocket Cinema Camera (BMPCC), under $1k, which can shoot images so similar to 16mm film that the average consumer couldn't tell.

The bottom-line today: if you want the 8mm vibe, you oversample your image when shooting (16mm or 35mm digital) and then degrade the image in post-production to 8mm.
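
For the curious, that degrade step is mostly mundane image plumbing. A crude numpy sketch (made-up parameters, nothing like a production pipeline): throw away resolution by block-averaging, add coarse grain, and blow the image back up.

    import numpy as np

    def eightmm_ify(frame, scale=6, grain=0.06, seed=0):
        # crude "8mm-ish" degrade: drop resolution, add luminance grain, upscale back
        rng = np.random.default_rng(seed)
        h, w, _ = frame.shape
        h, w = h - h % scale, w - w % scale
        f = frame[:h, :w].astype(np.float64)
        small = f.reshape(h // scale, scale, w // scale, scale, 3).mean(axis=(1, 3))
        noise = grain * rng.normal(size=small.shape[:2])[..., None]
        small = np.clip(small + noise, 0.0, 1.0)
        return np.repeat(np.repeat(small, scale, axis=0), scale, axis=1)

    # usage on a fake oversampled frame; a real pass would also warp colours,
    # soften focus, and add gate weave, dust and vignetting
    frame = np.random.default_rng(1).uniform(0.0, 1.0, (1080, 1920, 3))
    out = eightmm_ify(frame)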


This is pretty indisputable. Yes, recent and popular movies are still being shot on film, which very well reflects the fact that old habits die hard, and that creatives, producers and investors easily let their own biases and "shibboleths" get in the way of hard data.

I suppose one argument might be that the additional hurdles of the traditional workflow can be a constraint that boosts creativity.


Or unless you are Quentin Tarantino; see Inglourious Basterds (2009) to see what rich colors film can provide.


> Sean Mattini .... digital colorist assist

Edited and color graded on digital intermediate


The image retains characteristics of film even though it goes through DI, even color quality. (Quality in the sense of 'different', not strictly 'better'.)

I think it's the principle of "you can't work with what's not there" - film (especially talking several years ago vs. today) will capture a different and usually wider dynamic range, and lend a different starting point that might not be possible or easy to simulate.


Film is beautiful and has a great 'by-default' aesthetic - but many digital imaging systems have been able to capture more dynamic range (and far more resolution) for a few years now, and it isn't that hard to add grain and emulate the kind of image response for popular stocks in post...

These days it's really more of a tactile choice - 'do you want to work with film' - than aesthetic.


Well I'll admit my acquaintance with the equipment is dated by a few years since it turned out programming was more fun and paid more :P I imagine it still gets you a different quality and a different set of possibilities, even if digital has taken the rational and technical upper hand in many situations.

More on-topic, it also occurs to me that the Kodak cameras mentioned here will produce an image that the average film student or hobbyist (I guess that's the market?) with a Canon or RED wouldn't likely be able to reproduce convincingly, so that's interesting.

Anyway what I'd really like to see resurrected is 3-strip technicolor! That'd be retro


Or better yet, The Hateful Eight, in which he takes film to a new level, shooting on Ultra Panavision 70 (which perhaps counter-intuitively is 65 mm film).

Other people are going digital, but Quentin Tarantino seems to be busy taking film to the next level instead :)


The "70" in Ultra Panavision 70 mostly refers to the projection system. As projected in the the theater, it uses 70mm film. During the filming process a 65mm stock is used. The extra 5mm was for the optical soundtrack.


Or JJ Abrams. Or Christopher Nolan. Two of the most profitable and prolific directors of today.

http://www.hollywoodreporter.com/behind-screen/help-star-war...


Pity it looks like shit if you don't live on the first-run circuit, but I guess us peasants who don't happen to live in NY and LA only deserve scratched-up seconds.


Honestly, I thought the opening shots of the snow-covered landscape and a few of the longer interior shots didn't look that good. However the close-ups looked amazing.


You may not have noticed it, but a quite popular movie was recently shot on film. (The Force Awakens)


Mad Men (TV series) was shot on film, it looks very nice.

But I suppose it could also have been shot digitally and nobody would have spotted the difference. Maybe.


From season 5 they switched to ARRI Alexa cameras and added grain in post to emulate the film look. So you proved yourself right here: nobody really spotted the difference (and every season of Mad Men looked fantastic).


The death of film has more to do with distribution than acquisition; the pervasive switch to digital projection completely killed the last bastion of demand for film stock. (A typical film might require 100s to 1000s of times as much stock for delivery to theaters as it needed for initial capture.)

That said, you can't get the entire Super8 look with digital filters. There are optical properties (have to use the same lens and sensor size), and the way it handles highlights vs lowlights is different than digital sensors. (The 'rolloff' in the highlights, rather than clipping at saturation, is very desirable.)
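
That highlight behaviour is easy to picture with two toy response curves (the "film" one below is just the generic Reinhard curve, a stand-in rather than any real stock): past the clip point the hard-clipped response just stops, while the shoulder keeps compressing extra exposure into slightly brighter values.

    import numpy as np

    def hard_clip(x):
        # idealized digital sensor: linear response, then a hard ceiling
        return np.minimum(x, 1.0)

    def shoulder(x):
        # soft film-like shoulder (Reinhard curve as a stand-in, not a real stock)
        return x / (1.0 + x)

    exposure = np.linspace(0.0, 4.0, 9)      # 0..4x the nominal clip level
    for e in exposure:
        print("exposure %.1f  clip %.2f  shoulder %.2f" % (e, hard_clip(e), shoulder(e)))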

And it's true that film isn't entirely dead in Hollywood. Quentin Tarantino shot his most recent film on 70mm stock. But that's nearly 100x the resolution of 8mm film, so they're not really comparable. OTOH, I recently saw Wes Anderson's "Moonrise Kingdom," which was shot on 16mm reversal (color, not negative) film stock for the look.

In each case, the choice of film stock definitely affected the look of the film. It also affected the act of shooting the film; even if you can mimic a filmic look digitally (through digital acquisition and post processing), shooting digitally is very different than shooting film. I happen to prefer digital, but horses for courses.

But yeah, for consumers, it's pretty much a hipster affectation. Good for Kodak, though! Also of note in the annals of hipster retro photography is 'The Impossible Project,' which revived Polaroid film:

https://www.the-impossible-project.com/

Also, 'lomography.'


With modern film stocks and scanners Super8 can look remarkably good. It'll be interesting to see how Kodak's scanning service compares to scans like these, both of which were shot with another new Super 8 camera, the Logmar Digicanical:

https://vimeo.com/129700087

https://vimeo.com/groups/super8/videos/87243287


I wish they would have resurrected 9.5 mm instead. 9.5 mm was an amazing format in that the emulsion went from edge to edge, with no sprocket holes along the sides. Instead it used a single sprocket hole between the frames. It was still very cheap film but with a more than 50% larger emulsion area than Super8.

https://en.wikipedia.org/wiki/9.5_mm_film


Unlike 9.5, they never stopped commercial manufacture and processing of Super8, so all of the commercial infrastructure is still there - it's less of a revival and more an introduction of a new Super8 camera.


Good point, but 9.5 is still a kinda amazing format, if only.


What does it take to make film? The camera can't be that difficult. All those Nikon and Canon lenses for small sensors would be ideal.


Black and white film? It's very hard, but doable. Probably comparable difficulty to building your own computer out of logic ICs. You'll buy cellulose acetate stock, a bunch of chemicals, and build a coating machine yourself. You'll also need some chemical engineering skills. Your film might be comparable to that from the 1950s.

Alternatively, you can buy a film factory when it gets shut down, but these tend to produce film in large batches, and it might be difficult to repair the machines. People have done this before.

Color film? Basically impossible. Probably comparable difficulty to building your own ICs. Only a handful of companies were ever successful at it.


I have lots of footage from my grandfather (and my mother) shot on 9.5mm; I still have the cameras and projector.

The problem with 9.5mm is that when the projector jams, which happens rather frequently, it tears the film apart right through the middle or, in milder cases, enlarges each hole between frames. It's horrible and nerve-wracking.

8mm projectors don't jam as frequently, and when they do, each frame is usually somewhat salvageable because only the track of holes on the side gets damaged.

Anyway, as most comments already said, there is no point in shooting analog in 2015 if you're not Tarantino (and even if you are).


I see it as a good low-budget transfer format rather than a projection format.


Interesting. But it doesn't solve the worst problem I've ever had with analog film: The developer lost the rolls that had our honeymoon pictures on them. (As compensation, they were generous enough to offer us... blank replacement rolls. We were not impressed.)

That's the advantage digital has - you don't mail the pictures anywhere. Nobody can lose them for you. (Yeah, you can still lose them yourself...)


Tip from the pros: if the pictures are irreplaceable, then divide them in half. Send half to the developer. If it gets ruined, you've only ruined half.

This is how wedding photographers who shot on film avoided getting sued into oblivion by angry brides (and yes, there are many, many photographers who have been sued into oblivion by angry brides, contract or no contract).


Yeah. I figured that out from losing my honeymoon pictures...

:-<


Or alternately learn to do manual colour positive processing. It's really not hard, and with a big tank you can do half a dozen 120 reels at a time.


YMMV, the machines are very consistent, it takes some serious discipline to match that consistency at home, and you can get color casts accidentally. The C-41 chemicals are also a bit more unfriendly and more difficult to dispose of properly (not sure about positive). So these days I only do non-chromogenic film at home, but I might have made a different choice in the 1990s.

Speaking of which, Kodak doesn't even make E-6 film any more.


Indeed. I've a freezer stuffed with fuji e6 120 and 5x4, and chems for processing.

C41 is a pain in the ass and I only bothered a few times - but with a little practice you can develop positive film perfectly at home - all about being able to work blind, and having a series of tubs at the right temperatures. Admittedly my first few tries ended up with lomo quality from a rollei!


I'm struggling to figure out who exactly this thing would be for.

Amateurs for home movies? Nope, digital will always be cheaper, and faster/more convenient to work with to boot.

Aspiring filmmakers? Nope, if you want to shoot on film professionally you'll want at least 16mm to avoid the magnification/graininess Super 8 brings with it.

People nostalgic for the blurriness of old home movies? Do any of these actually exist?


> avoid the magnification/graininess Super 8 brings with it.

But how else will your viewers know that you were hip enough to shoot on film?


Please tell me their CMS got misconfigured and thought it was April 1. Who knows? Maybe there is an absolutely brilliant marketing person at Kodak but I'm having trouble seeing it.

I mean. "There are some moments that digital just can't deliver, because it doesn't have the incomparable depth and beauty of film." Um, we're not talking 70mm Panavision here. We're talking Super 8. Has whoever wrote this ever seen a Super 8 movie?


The same people who buy impossible project instant film.

Sure you could shoot a perfectly in-focus colour corrected image and filter it, but there's something fun about not knowing the result instantly. It's the reverse of when digital cameras came out; back then, instant was exciting.


"but there's something fun about not knowing the result instantly."

Let me guess, you have never ever used a super 8 camera. I have, and FUN is the last word that will come to my mind about those machines.

Using a new cartridge and not being sure about light exposure in complex scenes, only finding out after the cartridge had been sent away and returned. Same with motion response, color and lots of little things that we now get feedback about in seconds but that, at the time, took weeks.

I mean, after all that pain you then need to set up the projector and switch the lights off, only to discover that your film is ruined because you made the wrong decisions or the lab developed it badly. Frustration, anger, disappointment, anything but fun.

This happened several times to my father. It was an expensive process to learn, only fun if you did not pay for it.

It was a pain in the ass.


I can actually kind of see the instant film thing in that it lets you hand people a rather unique physical artifact at a social activity or whatever. It's a break from "The pics will be up on Facebook." I'm obviously not the target market but this just seems weird. I remember Super 8. We used it because we didn't have an alternative. It was low quality and expensive.


No, I think instant film is fun because you always get a physical artifact that you can give to your friend instantly.


It's interesting, original Polaroid film was probably one of the most color stable (and accurate) films around - I have Polaroids that were stored in less than ideal conditions and nearly 40 years on have much better color stability than their print contemporaries.


I could see someone trying to shoot a professional movie that wanted that effect for part of it, because that part was supposed to be someone playing a Super-8 movie. But I suspect that there's a Photoshop filter for that...


Gus van Sant did exactly this in Paranoid Park.

The skateboard scenes in the movie were shot on Super 8, as inspired by the common style for home made skate films.

Here's a short clip: https://vimeo.com/7170912


That use of filters to simulate things like that leaves me cold. Especially bad is when they attempt to fake a CRT television display.


Well it sounds like bad simulations leave you cold. I don't think making a realistic Super 8 effect would be too difficult for a professional VFX artist.


But would it be cheaper than just shooting Super 8?


> People nostalgic for the blurriness of old home movies? Do any of these actually exist?

Look at all the people buying vinyl records and ask yourself that again.


The vinyl version, even of the same album, is often mastered with less of the Loudness War treatment.


As someone who went to school in Rochester, NY (RIT), I love to see Kodak making a move that could possibly return them to a relevant position in the film industry. Rochester was once a proud city that has been beaten down by missing the innovation train. I hope this and other initiatives help to return it to some of its former luster.


Is there an actual double-blind study that proves that digital still cannot beat analog in certain realms? Because I'm having a hard time believing that this "analogue renaissance" isn't just pure marketing hokum.


If nothing else, it's a stylistic choice. Choosing how a film should look, the colours and tonality etc, is a creative process and has aesthetic and therefore subjective goals. You can't really say which one is 'better', only that they're different.


Everything you described is at least theoretically digitizable.


This is the best thing I've read on the whole film vs digital debate: http://www.fototazo.com/2015/04/the-meaning-of-films-decline... # TLDR: most film advocates claim all the wrong reasons for shooting film, and I say this as someone who is primarily a film shooter.

The most compelling reason to shoot film is because it gives you access to a range of camera and lens technology that is impossible to replicate with digital without significant compromise in quality and/or shooting style.

Nolan shoots on IMAX cameras with modified Zeiss lenses that give an extremely shallow depth of field (witness some of the scenes in The Dark Knight Rises).

Tarantino shooting on Super Panavision 70 could not be replicated on digital without extreme cropping of a wide angle digital source, which would change the depth/perspective, or stitching (difficult if not impossible in motion picture shooting). See also panoramic cameras: Fuji 6x17, Hasselblad X-Pan.

Large format and even medium format, because the available digital backs have not yet reached the size of a full 6x6 negative. The digital back manufacturers claim they are "full frame" but when used on actual full frame 6x6 cameras they are anything but (this is not a resolution/quality argument, it is a "oh, my 100mm lens is now actually cropped" argument).


Are you claiming that that range of camera and lens technology will never be duplicated in the digital realm?


Not at all, and I suspect that as sensor technology improves and prices become more affordable, digital formats larger than 35mm will rise in popularity.


Hearing the name "Kodak" just makes me sad these days.

From the wikipedia entry, "From the $90 range in 1997, Kodak shares closed at 76 cents on January 3, 2012".


As sad as that is, it's a pretty long life for a company that was killed by mismanagement: they invented the digital camera but chose not to cannibalize their incumbent analog business. To combat demise, they chose to fire people and become an IP shell. The last try at survival was to sell off most/all IP and license the name. The rest is history.


We weren't in the company. It can be hard to tell. You could open a beauty therapy shop whose revenue increases every year, and suddenly, after the introduction of cheap laser treatments and race-to-the-bottom competitors, in one year you could face a reduction of 70% of your revenue.[1]

[1] Anecdote.


Kodak certainly did a lot of things wrong--especially with the benefit of 20-20 hindsight. Part of their problem with digital was that they were arguably ahead of the curve (and not quite on the right curve) with things like PhotoCD and providing equipment to photo shops to print customer photos.

However, the bottom line is that it would probably have been very difficult for even the most brilliant management to replace the film, photo paper, and chemicals consumables business. That revenue basically doesn't exist in the digital world unless maybe you count inkjet ink--though that's trending down too.

Fujifilm did end up doing OK by, among other things, applying their film making expertise to other industries like medical. But they had a tough run too. [1] The film business fell off a cliff that made CD sales look like a gradual decline.

[1] http://www.economist.com/blogs/schumpeter/2012/01/how-fujifi...


I worked at Kodak as a summer intern in '85. Was the era of the disk camera. Was also my first programming job. Lotus 1-2-3.

Most people today can't comprehend the scale of American manufacturing as it still was at that time. The Elmgrove plant where I worked (one of a dozen facilities in the Rochester area) had over 14 thousand employees. Our start and end times were staggered in 7 minute increments to manage traffic flow.

That none of that would exist 20 years later was inconceivable at the time. The word "disruption" wasn't in business vocabulary. Nor was the phrase "made in China". Some senior technical managers saw the "digital" writing on the wall. But what could they do? What could anyone do? There was no way to turn that aircraft carrier on a dime.

There was no business model in digital cameras that would employ 100 thousand engineers, managers, factory workers, technicians, and staff.


What people don't get a lot of the time when they're opining about what a business should do or should have done because the market is collapsing/collapsed for a particular product category is that you have to run the numbers. Maybe the business executes brilliantly on creating a new $1B business (which is hard). But if that replaces a $10B business, things are still going to get ugly. I don't have the exact numbers at my fingertips but, as I recall, film revenue fell something like 90% in under 10 years.

(That said, Fujifilm provides an existence proof that Kodak could have, however painfully, probably navigated this with better management making better choices.)


This bit about replacement of the business is exactly true; most people don't realize Kodak was making digital cameras the whole time film was being replaced. As a purely off-the-cuff guess, I'd bet the entire digital camera industry is smaller than Kodak's film processing work at its peak.

I live in Rochester - Kodak was literally one end of the entire city; the scale of the operation was huge.

What I regret was that it was a good place to work, though I was never there. Car dealerships would plan annual sales around bonus time at Kodak, and retirees had health care for the rest of their life, for instance.


It's one of the best examples of what disruption really means. To me it means that no matter how well you prepare yourself, none of your best plans will be close enough to the reality on the ground to be of much use, and the size of your company actually becomes a hindrance rather than an asset.


> and the size of your company actually becomes a hindrance rather than an asset.

I'd say less size than diversity. Conglomerates aren't subject to this and still preserve a lot of the benefits of size (although as GE Capital shows... that can go badly too if not risk-managed).


Although I'd argue that, if anything, a conglomerate is even more likely to just walk away from a business that's in sharp decline rather than taking heroic measures to try to fix things the best they can. Maybe maintain it as a small cash cow business if appropriate, but you probably end up with a lot of the same factories closed and workers laid off.

The company as a whole probably makes it through OK, and that's probably a net positive given that HQ staff and so forth will be more likely to keep their jobs and there's less disruption than a bankruptcy, but much of the net effect is still the same.


> Fujifilm did end up doing OK by, among other things, applying their film making expertise to other industries like medical.

Another thing they did was carve out a very nice niche for themselves with their X-mount cameras. They've done quite well in part because the in-camera processing engine does a good job of emulating a lot of the old Fuji film stocks (as well as making very nice cameras that offer something different from the Canon/Nikon mill).


I have an X-E1. Now there's retro I can buy into. Especially with the pancake lens, it really feels like an old-fashioned rangefinder in a lot of ways.

I don't know how much money they make off that line but they've done a nice job of it to the point where I don't use my DSLR very much unless I'm shooting action or need either ultrawide or telephoto lenses.


I've gone to Olympus, which has much the same appeal for me, and my Canon equipment is very much unused these days.

Interestingly, Olympus is one of the few traditional camera vendors whose camera division is moving up in profitability, so it seems that "doing something different" can work quite well.


They really did have trouble letting go of the per-imprint pricing model in the end.

I believe that photo-chemical prints are better and longer-lasting than any inkjet (anything really, other than dye transfer and offset printing), but they didn't get digital cameras out there fast enough, and couldn't let go of the film-based model.


What I understand is that they had developed advanced holographic film technology, had a nascent business in forgery-resistant IDs, and declined to pursue that business.


Of course. Hindsight 20/20, etc.

But I feel like if your business is a mature process based around a singular need, having a future disruption department to keep tabs on the market and feed back into high level management should be de rigueur. Innovating amazing and critical components of digital cameras and then spending $5B to buy a drug company in 1988 deserves to get you some future flak on your corporate leadership's decisions.


I think you're being more than a little unfair.

Part of the problem is that there simply isn't as much money to be made in digital as there was in analog. A digital camera doesn't need film, but film (and film processing) accounted for much of Kodak's profits for many years. So it's not just a simple matter of "make digital cameras instead of analog ones".

Also, remember that Kodak was very big into digital imaging, and was considered the leader in the field of high-end digital cameras. That is, until the Nikon D1 came along around 1999. Even then, Kodak sensors were common among the next few generations of DSLRs.


Didn't they also have a stint as a patent troll?


I had to read that Wikipedia article just to discover that Kodak is still in business, albeit as a shrunken shadow of the former colossus, today a niche player.

I grew up with Kodak products -- had a darkroom as a kid and every chemical, every paper, every film was made by Big Yellow. My camera was a Kodak Instamatic.

It is sad to see these organizations evaporate from disruption. Progress and all that, but still sad.


> There are some moments that digital just can't deliver

Isn't this the mindset that drove them into the ground in the first place?


Came across a tweet about Sony MiniDisc recorders/players going for hundreds and even thousands of dollars on eBay. Nostalgia apparently sells.

http://www.ebay.com/sch/i.html?_from=R40&_trksid=p2050601.m5...


That's amazing. I have one that works absolutely flawlessly; I still need to transfer what I have on those disks before I try to sell it, but then...

Why would people pay that kind of money for this?? Do they need it to play their old disks? Wouldn't it be cheaper to have them transferred to another media by a lab?


My brother has a bunch of MiniDisc players and recorders. I do not understand why. He refuses to get an MP3 player or use his phone but will instead record audio in REALTIME from CDs etc. to MiniDisc. He says the battery life is really good. The psychoacoustic modelling on the audio is not, however!!

He even bought a multitrack minidisc desk. Apparently it was going cheap. I understand why - with 4 tracks of audio, it'll record 15 minutes. No good for long jams.

I do not understand the fascination with it at all. Even an old Tascam portastudio or the modern equivalents that record to SD card would be better.


> The psychoacoustic modelling on the audio is not, however!!

There were quite a few improvements between the older and the latest models though, and the latter could also use uncompressed and/or lossless schemes.

> the fascination with it at all.

One thing I really liked about it is the handling of discs. I don't know why, but I just liked the feeling of opening the player, ejecting a disc and inserting another one with a satisfying 'click'.


A bit like using an old cassette walkman then!

Wobbling CDs into top-loading CD players was never as satisfying (and you were likely to scratch the underside of it as you slid it into the caddy).

Loading MP3s isn't satisfying at all.


It was a very popular platform for audio recording by journalists. Good form factor, good battery life, decent audio quality (compared to cassettes/microcassettes), flexible controls, and either decent microphones or at least connectors for them.

I'm surprised it's still this popular, however; I'd have thought modern flash-based platforms would have killed MiniDisc by now. It was different back in the early 2000s.


WTF, I had no idea... I have multiple models, including the MZ-RH1 and MZ-RH10, lying around in mint condition. I also have a bunch of Nintendo Game & Watch units. From a quick look on eBay I could sell all of that for well over $1,000. Look ma, I'm gonna be rich :]


I grew up in upstate NY and visited Kodak's campus in Rochester in their heyday. It was impressive as hell to a ten year-old. I'm sure the idea that the massive works and the business they represented could almost completely evaporate never occurred to the people working there. It's a little melancholy in some ways. I wish I could see this as more than a desperate attempt to rekindle that dead business, but I can't. How much do the benefits of using analog processes to capture the light really matter when the vast majority of people will access the content downstream through digital delivery platforms? Somewhere along the line the information is going to get sampled and aliased. Tarantino getting all nostalgic for analog content that will be shown to viewers via $100,000 digital projectors is one thing, but most photographs are viewed on phones, and in web browsers.


I feel like we're out of ideas, so we revert to nostalgia. I just saw that Technics reintroduced the SL-1200, and now Kodak is reintroducing the Super 8. All so that we can feel the analog warmth. I think this is just the effect of software taking over the world. We can't have these tangible things anymore that have their own character, quirks, and defects (that, when new, infuriate users, only to be remembered fondly 20 years later). We recently bought my 9-year-old an instant camera. Each pack only has 10 pictures. She blew through that in about 30 minutes because she was so used to the "infinite" digital pictures she could take. To add insult to injury, she kept taking the film pack out and exposing the pictures, and couldn't figure out why they were blank.


This is good for digital in ways I think people won't notice quite yet.

As an amateur photographer I tend to see film users a lot. I shoot digital but I'd like to shoot film too. The reality is that there are some artifacts that film gives you that you can't replicate in digital yet. In art, if the effect or feeling that you want is given by a certain tool, then that is the tool for the job. In photography, if this is film, then film is the right tool for the job. There's also the large factor of workflows. People get used to a workflow that influences their style, and it's important to them to maintain that workflow. What is measurably better in technical terms is not important.

What I see happening is that market trends like this will push the digital players toward perfectly re-creating classic film stocks. The difficulty with re-creating film with filters or presets is that it's notoriously hard to do and usually not perfect. Fujifilm is a perfect example of this with their film simulation modes on their X-mount camera lineup. I suspect that the film trend is going to push the photography giants into creating more accurate emulation of film baked into their workflows and devices, rather than the current trend of generic hipster Instagram filters or playing in Lightroom for a few hours (and still not getting the effect you want).
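For a sense of what "emulation baked into the workflow" involves under the hood, here is a minimal, purely illustrative Python/NumPy sketch: a soft tone curve plus midtone-weighted grain. It's a toy stand-in for the general idea, not Fujifilm's actual simulation or anyone's production pipeline:

    # Toy "film look": gentle S-shaped tone curve plus midtone-weighted grain.
    # Purely illustrative -- not any vendor's actual film simulation.
    import numpy as np

    def film_look(img, grain_strength=0.04, seed=0):
        """img: float array in [0, 1], shape (H, W) or (H, W, 3)."""
        rng = np.random.default_rng(seed)
        # S-curve: boosts contrast through the midtones, rolls off shadows/highlights.
        curved = 0.5 + 0.5 * np.tanh(2.5 * (img - 0.5)) / np.tanh(1.25)
        # Grain is strongest in the midtones, weaker in deep shadows and highlights.
        midtone_weight = 4 * curved * (1 - curved)
        noise = rng.normal(0.0, grain_strength, size=img.shape)
        return np.clip(curved + midtone_weight * noise, 0.0, 1.0)

    # Usage on a synthetic gradient standing in for a real frame:
    frame = np.tile(np.linspace(0.0, 1.0, 256), (64, 1))
    graded = film_look(frame)
    print(graded.shape, float(graded.min()), float(graded.max()))

Real simulations go much further (per-stock color response, halation, exposure-dependent grain), which is part of why generic presets rarely nail it.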

Whether you're a professional or a 16-year-old with a K1000 and mismatched '80s leg warmers, film still has a place, and this will in turn affect the development of digital processing.


Cool, I'm switching to coding with punchcards.


Might be a startup idea to sell hole punches then eh?


Interesting. Today I've come across quite a few stories on analogue tech: this, plus the new Technics SL-1200 models, plus new vinyl pressing machines being produced as vinyl demand continues to grow, plus record players and instant film cameras being incredibly popular on Amazon this Christmas. I wonder if we'll see this happen with other analogue tech too?


I call this the "letterpress phenomenon".

Once upon a time the only way to get some text printed was to hire somebody to arrange lead type letter-by-letter and print your thing on a six-ton iron press. This required significant amounts of training to do well and still resulted in artifacts of the process, in this case slight debossing of the paper as the fibers were crushed between the press and the type.

Then came photolithography and xerox and laser printers, and nobody saw the point of all that labor and machinery.

Then came inkjet printers and Microsoft Word and email, and suddenly a textual message doesn't seem to have the gravitas or expertise that it once did. At which point, having something letterpress printed is very noticeable and neat, even if you can't put your finger on what's different, and a person who does letterpress printing has necessarily invested enough time to correlate with passion and a keen typographic eye.

Once the old thing takes off, people really want those artifacts of the antique process, to the point where contemporary letterpress printers are goaded into ramping up the pressure on the press until the paper is crushed to oblivion and your print is ridiculously three-dimensional, more so than would have been acceptable back in old times.

Similarly, people who are shooting on Super 8 these days are really looking for grain and weird color temperature. And it also explains why, like photolithography, 16mm or analog video are neither distinctive nor expensive enough to be as interesting.


See also: Instagram


There will always be a big market for items of nostalgia, whether it's taking photos, playing music, enjoying ancient video games, or banging out letters on a typewriter. All good fun as long as people don't spout pseudoscience about its inability to be emulated with modern equipment.


It isn't just nostalgia.

Digital music can't simulate having a physical album cover, an e-ink screen won't ever be similar to a printed page, no music encoding will physically prevent loudness-wars mastering the way a vinyl record will (the needle would just jump out of the track), no printer will ever be able to fool someone into thinking a document was written on a typewriter.

You can tell the difference if a piece of mail was signed by a human with a pen instead of a printer, and it means something.

The limitations of analog media are very often their strengths, especially in corner cases. The limitations of a medium are often a significant driver for the creative process and losing them or approximating them makes lots of things worse.

Normally people (morally) opposed to analog media spout just as much pseudoscience in defense of their position. (normally people arguing about such things on the Internet are idiots anyway)


What gives those things value IS the nostalgia though. A computer printed page has clear crisp letters with advanced fonts and typography compared to a typewriter with smudged, fixed-width fonts. E-ink books allow for notes, search, bookmarks and much more. The new is with few exceptions objectively better.


But it's not just about nostalgia. There's the fact that there is more human work and thought involved in doing something by hand. It's a social signal, and some of the objectively "worse" ways of doing things will stick around for a while because of it.


I still like real wood... It's never "perfect"; the processing can never be as precise as with metal, plastic, or other composite materials. For that matter, I like glass, which oddly will often show some material imperfections, such as in a table top.

As for books, I simply learn better from a physical book... every time I try e-books I wind up in a blur. It can sometimes happen with very long online articles too. There's no context or position to it all.

Don't get me wrong, I'd rather never again have to lug around an 80# monitor, or use a typewriter again... for that matter, I don't much see the point of film over digital capture.

All of that said, there's something to like about natural flaws.


"You can tell the difference if a piece of mail was signed by a human with a pen instead of a printer, and it means something."

Not any more. There's an app for that now.[1]

[1] https://vimeo.com/101932145


Looks like it doesn't do ballpoint, so it probably can't duplicate the varied indentations into the paper that a ballpoint pen will make. I've been checking mail from my Congresscritters to see which ones are using autopens [0] or similar technology, such as the link you provided. (NB the Wikipedia article for the autopen [0] mentions a model that functions on the x and y axes, but makes no mention of a z axis)

[0] https://en.wikipedia.org/wiki/Autopen


Until the nostalgic die. Things are going to be rough for Rickenbacker when Beatlemania breathes its final gasp ca. 2030.


I see USB and... SD card slot on the back? HDMI and... 3.5mm audio?

One of the most characteristic features of Super 8, at least to me, is the complete lack of audio (at least, on most Super 8 works). So if you went to a theater to see something on Super 8 there might be a live band playing the soundtrack.

I wonder if Kodak is doing something like putting audio on the SD card and then storing digital synchronization marks on the film somehow.

Edit: To be clear, I know you can already put audio on Super 8. It's just that most Super 8 films I've seen in the theater have had no audio or live audio. And yes, I looked at the specs. The specs don't mention anything at all, but the product rendering appears to show jacks for audio and data, and I'm wondering how that's incorporated.


Super 8 was designed for magnetic sound from the start.

I suspect this would allow you to record digitally at the same time you were exposing film perhaps, or even use it as a digital video camera without film.

I'm excited, I want one.


Growing up, my Dad shot on Super 8. And it had no audio - the camera had C-sized cells that only ran the film motor. I see from Wikipedia that cameras with audio capability were added in 1973, so it may be that Dad's camera predated that.

At 24 fps, with each frame being 4.01 mm tall, that works out to 96.24 mm (about 3.8 inches) of film per second passing over the magnetic stripes (there are two, presumably for left and right channels). This is roughly twice the speed that cassette tapes used (1-7/8" per second). Their announcement doesn't say whether any audio compression/encoding is used, but I used to use a dbx compression/expansion box to lower the noise floor on cassettes with very good results.
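(A quick sanity check of that arithmetic, in Python:)

    # Linear film speed past the sound head at 24 fps, vs. cassette tape speed.
    frame_pitch_mm = 4.01          # height of one Super 8 frame
    fps = 24

    film_speed_mm = frame_pitch_mm * fps      # 96.24 mm/s
    film_speed_in = film_speed_mm / 25.4      # ~3.79 in/s

    cassette_speed_in = 1 + 7 / 8             # 1.875 in/s

    print(f"Super 8 at 24 fps: {film_speed_mm:.2f} mm/s ({film_speed_in:.2f} in/s)")
    print(f"Ratio to cassette: {film_speed_in / cassette_speed_in:.2f}x")  # ~2.0x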

One of the nice things about Super 8 was how easy it was to edit. You used a cutting station that had pegs to ensure your cuts were between frames, and glued them together after scraping off a little bit of emulsion. Adding audio complicates matters, as the audio track is offset from the matching frame by about 3 inches.


Super 8 film could have an oxide strip for magnetic sound.


Right, but most Super 8 works don't have sound.


Never mind audio, this is, sadly, proprietary bullshit, and has been since the 60's. "Super 8 film cartridge" - what? I'd be far more interested if they were just doing reeled 8mm with development services - I have a perfectly serviceable 1930's 8mm that I haven't been able to find film for (well, at any sensible price) for twenty years. I'd love to be able to shoot with it again.


What kind of 8mm film does it take? I don't think I've seen anything but Super 8 for ages.

I know some 8mm cameras can take standard 16mm film that's been cut in half.


Standard 8 - they made it from '32 to '92. The camera is a Kodak Model 20 from '32, in the family since it was new. It is beautifully engineered.


Kodak once made some really nice cameras. My dad has a Kodak Retina IIIC which he used for many years before giving it to me, and I used it until the late 70s when some of its mechanisms finally just wore out.


You're just imagining what the product could be like instead of reading the actual product/specs page? That's an interesting approach, but you should probably label your fantasies accordingly.

http://www.kodak.com/ek/US/en/consumer/Product/Product_Specs...

"FILM GAUGE: SUPER 8 ( EXTENDED MAX-8 GATE )

FILM LOAD: KODAK CARTRIDGES WITH 50 FT (15 M)

SPEED: VARIABLE SPEEDS (9, 12, 18, 24, 25 FPS) ALL WITH CRYSTAL SYNC

LENS MOUNT: C-MOUNT

FOCAL LENGTH: FIXED / 6 MM, 1:1.2 – RICOH LENS (OPTIONAL ZOOM 6-48 MM LENS )

FOCUS / APERTURE: MANUAL FOCUS & IRIS"

etc.


I did read the specs, and they don't answer any of my questions. This comment fails to give me any new information and instead just makes me feel bad, so, congratulations, if that was your goal.


Description of crystal sync here: http://www.tobincinemasystems.com/index_files/Page269.htm

"a camera's running speed is locked to the digitally divided oscillations of a quartz crystal, with an accuracy of a few parts per million. The function of the pilot cable was to feed a representation of the camera's speed to the recorder. Since with crystal control that speed is precisely known, the cable can be replaced by a similarly accurate crystal generator mounted in the audio recorder. This eliminates any connection between camera and recorder, but permits them to stay in sync with each other. A pilot signal is recorded as above, but it comes from the built-in crystal instead of the camera. The resulting tape is resolved to mag film just the same as a pilot tape. You still need a clapper board for a start mark."

So, it doesn't actually record the sound, but it is capable of being sync'd to the recording of the sound.
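A hedged sketch of why that's enough in practice: with both devices crystal-locked to precisely known rates, one shared start mark (the clapper) lets you map audio samples to film frames by arithmetic alone. The rates and names below are hypothetical, just to illustrate:

    # Camera and recorder each run at a crystal-locked, precisely known rate,
    # so no cable is needed -- only a common start mark (the clapper).
    FPS = 24             # crystal-locked camera speed, frames/s (hypothetical)
    SAMPLE_RATE = 48000  # crystal-locked recorder rate, samples/s (hypothetical)

    def frame_for_sample(sample_index, clap_sample, clap_frame):
        """Map an audio sample index to the film frame exposed at that instant."""
        seconds_from_clap = (sample_index - clap_sample) / SAMPLE_RATE
        return clap_frame + int(round(seconds_from_clap * FPS))

    # Example: the clapper closes at audio sample 96_000 and film frame 120.
    # Ten seconds of audio later we should be 240 frames further on:
    print(frame_for_sample(96_000 + 10 * SAMPLE_RATE, clap_sample=96_000,
                           clap_frame=120))  # -> 360

Drift only creeps in to the extent the two crystals disagree, i.e. a few parts per million, which is negligible over a 50 ft cartridge.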


It'd be nice to see films shot in color again instead of blue and orange.

http://priceonomics.com/why-every-movie-looks-sort-of-orange...


Go see Carol (shot on 16mm)


Dumb question: as someone who shot plenty of Super 8 about 35 years ago, what I don't see here is the projector. That was the part that sort of sucked (not to mention editing). You still need a projector to consume this media, right? Or is it just teleported through some hipster USB device now...


They include processing and digital transfer.

It's teleported to some hipster cloud storage now.


It's color negative film anyway. Probably gives better quality scans and is a bit cheaper.

[Edit: And actually I don't believe Kodak makes a reversal Super 8 film any longer, although others do.]


Yeah, Kodachrome processing stopped about 5 years ago, since the chemicals were too nasty. I think Kodak still sells B&W reversal though.


This is amazing news; Super 8 footage has a lot of character.


Well, this will surely at least be a collectible, some day.


Here's a nice solution in search of a problem.


I speculate that there is latent demand for it among older people who miss it and younger people who are tired of pure digital and want something more organic and different.

Personally though, I would rather have something similar to analog that doesn't require the cost and time/complexity. I wonder how many folks will buy these to go with them: http://nofilmschool.com/2013/12/nolab-digital-super-8-cartri...


>younger people that are tired of pure digital and want something more organic and different

Which I actually sort of appreciate (in my less cynical moments). Though, if it's just that they've never shot anything on film, I'm sure they have friends or co-workers with old still cameras gathering dust. Borrow one, shoot a roll of film, send it off to be processed, and the urge will probably have gone away by the time you get your prints back. You can get those prints scanned too if you like.


> Here's a nice solution in search of a problem.

An old boss of mine used to say something similar: "That's a cure for no known disease".


That's crazy amazingly awesome. Need to know how much it costs though, both camera and film/processing/cloud.


I'd expect the camera to cost $200-800 and the processing/film to be about 30 bucks a cartridge. Too bad they don't make a color reversal film anymore, so you could project it too.


It makes a lot of sense for filmmakers who want to shoot without forking out for a RED camera, just paying per minute of film instead.

If you live in Hollywood, though, it's probably cheaper to rent a 4K camera for your next blockbuster.


These are all meaningless words without a sample video


If Kodak is scanning all of the photos and putting them into the cloud themselves, does that mean they have rights to said images?


No.

Photographs belong to the person who pressed the shutter release. The actual format or medium of the photograph (digital, print, negative, whatever) is irrelevant to the copyright, which is created during the act of taking a photograph.

So remember: Next time you hand someone your camera to take a picture of you, that person effectively owns the copyright on that image. Including a monkey[0].

[0] https://en.wikipedia.org/wiki/Monkey_selfie#Copyright_issues

PS - Although I'm sure you have to give them a license to your copyrighted photo so that they can display it back to you on their website. It is standard industry practice. Doesn't mean they have redistribution rights or own the copyright however.


Is there theoretically anything preventing them from putting a line like this from insta's TOS in there?

"Instagram does not claim ownership of any Content that you post on or through the Service. Instead, you hereby grant to Instagram a non-exclusive, fully paid and royalty-free, transferable, sub-licensable, worldwide license to use the Content that you post on or through the Service, subject to the Service's Privacy Policy..."


Why does that text concern you?

If you want Instagram to display your photograph back to you (or other users at your request) then you have to grant them a license like this.

Where terms get concerning is when they want a redistribution license, or they want to allow unnamed third party "partners" to be able to utilise your photograph royalty free. Neither of which appear in the snippet above.


> Instagram a non-exclusive, fully paid and royalty-free, transferable, sub-licensable, worldwide license to use the Content

Does "sub-licensable" not allow the "partners" you speak of?


Yeah, I probably could have found a better quote, but it's a random TOS and I'm at work, haha. At the end of the day, I don't see a reason to believe Kodak wouldn't make a play to control the images they scan/upload as much as possible. Or a reason to believe the TOS wouldn't change to make that so once adoption of the new Super 8 reaches a level acceptable to Kodak.


The sub-licenseable, and transferable parts.

That's pretty much "Screw you we can do what we want now".


What amazes me is the lawyers haven't been able to come up with some legalese that allows companies to transfer your data around their internal systems for your own use without stating "we can do whatever we want with your content".


AFAIK this is a solved problem in B2B. e.g.

AWS: "We do not access or use customer content for any purpose other than as legally required and for maintaining the AWS services and providing them to our customers and their end users. We never use customer content or derive information from it for marketing or advertising."

SoftLayer: "As a Processor, SoftLayer will not access the Customer Content for any purpose beyond providing You with support as described above, and will not disclose it to any person or entity."


Copyright says you get to define that however you like. It's why we have so many options for code licensing. The problem here is that companies would much rather have the "we can do whatever we want with your content" option because it benefits them much more, and not enough users care enough to make them change it.


This is going to fail HARD.


C-mount? I mean "C-mount"!!!!! Yes!


Wow--surprised! Never thought I would see another film camera again. I didn't want to get into the film/digital debate, but I have no life--so here goes:

I can still tell the difference between film and digital. It could be I'm too used to film? It could be I'm partially color blind? Whatever the reason, I like the look of film.

Recent example: I watched Dumb and Dumber To. Yes--it was bad on a lot of levels, but what really surprised me was the look of the movie. It just looked cheap. Say what you want about a Farrelly brothers movie, they always looked great. I then looked into it, and one of the Farrelly brothers was given the choice between film and digital. The producers brought him to a digital lab, where the techs applied a film "look" program to make the digital footage look like film. Farrelly couldn't tell the difference. I sure could. They went with digital. If this is their last digital film, they will have settled the controversy--in my little world. I just have a feeling their next movie will be on film.

As to digital photography: when digital finally hit the practical point, I went digital. For me it was the Canon 20D; I bought the camera and three very expensive lenses. Yes, it took great pictures. Great pictures compared to what? I wasn't a photographer before digital, so I didn't know better. Actually, as a kid, I was a photographer. I used a Pentax K1000, and a Canon 350D. A few years ago I found a stash of negatives. I had them blown up and asked a couple of family members to pick the best pictures. These were nature pictures. All of them picked the ones I'd taken as a kid. Maybe I was a better photographer as a kid? I don't know.

If I were going to "gear up" again, I think I would stay with film. Not because I think it's that much better, but because the used lenses are so cheap right now. A used Canon 300mm f/2.8 film-era lens is under a grand; the digital-compatible Canon 300mm f/2.8 is a minimum of $3,500 used (usually beat up).

To anyone who honestly wants to get into photography but whose funds are tight: look into film. Hell, I wouldn't even bother with color. I would set up a bathroom darkroom and set up shop.

I guess it's easy for me to throw around this advice. I'm not going back to chemicals in the sink; I have bought the digital equipment and probably won't go back to film. Oh yeah, whatever you do, stick with prime lenses. That was my biggest mistake. Buy whatever camera body bare, and buy the lenses ("glass," if you want to sound like one of those guys--I never wanted to be in that club) separately. Try not to keep your digital camera in fully-auto mode all the time, but then again, I sometimes wonder why.

My ex-girlfriend is a professional photographer--she takes studio pictures of old stuff. She loved to brag about it: "I'm a professional photographer. Did you know I take pictures for a living?" Yes--it was worse than going to the dentist, but maybe I was being too critical? She has never used manual mode, Tv, or even Av mode. She doesn't know about f-stops or exposure times. I think she got lucky, though. I told her the pictures were spectacular, but I really thought they were too Photoshopped. As to her job--well, it's a lot about who you know at big corporations.

Good luck--


"I used a Pentax K1000"

I got one of those as a gift, probably so I'd stop borrowing his Spotmatic (the K1000 was basically an 80s version of the 60s Spotmatic, with an annoyingly different lens mount).

I think a large aspect of picture quality when we were kids is that rich grandpa can give you triple digits' worth of camera at Christmas, but I paid my own way on consumables, and once all the costs of analog were accounted for I had to push a broom at the food store for something like ten minutes per picture. Large sheets of photo paper for enlargements were not cheap, either.

Something often overlooked is the analog era was extremely expensive.


I mostly shoot digital, but bought a Pentax Spotmatic with a Takumar 50mm f/1.4 lens a couple of years ago for like £60. It's fun and there is definitely a lot of vibe in the pics, but it ends up costing about £15-20 to develop and scan a roll of film. For someone who shoots a lot, that adds up.



