Writing device drivers in Linux: A brief tutorial (2006) (freesoftwaremagazine.com)
186 points by seangarita on April 21, 2013 | hide | past | favorite | 33 comments


If you're like me and read comments before you read the post, then I urge you to take a look at this how-to. Not only is it well written and informative but it makes fairly complicated concepts easy to understand. If you like the idea of writing drivers and want that kind of control over hardware but don't have a clue where to start, this is pretty much it.


Very exciting and empowering article. Makes me want to learn C, or at least believe that C is possible, straightforward, easy, and useful.


While not straightforward or easy (for an average chap), it's quite possible, and very useful. I wouldn't call it most people's programming language of choice, but knowing how C pushes memory management onto user code, and being able to grapple with that, will definitely change the way you think about programming.

TL;DR: Pick up any of the hundreds of books on it and start learning! If you're having trouble motivating yourself, try solving competitive programming problems in it - it's usually a good way to learn a new language.


K&R was the first and only book about C that actually worked for me. Everything else was verbose, confusing, and left me with no more than the ability to do basic character apps. I read dozens of the "hundreds of books on it", and they all left me incompetent at C. K&R allowed me to become useful in the C language, including hacking on kernel modules.

I'm sure there are other good books on C. But, I never found one. They were all too long, and hid understanding behind verbosity.

K&R also made C fun for me, revealing its nature as a well-crafted tool. Reading a lot of real C code also helped... I finally really learned C soon after migrating to Linux, where code for everything was available.


I'd go even further: I would say that K&R is the best written book in all of computer science.

It is so brief, yet it covers everything about C, introduces you to programming and to UNIX, and teaches you style. It's written by no less than the guy who coined the term "UNIX" (the K in K&R) and the inventor of C (the R). The original K&R--not including the reference manual at the end--was only 177 pages. The updated ANSI C version, 189 pages.

The opposite is any book about Java. The usual university-level Java books are a sickening 1000+ pages.

I've often wondered how anyone who's trying to learn something new can prefer a thick book over a thin one. I deliberately look for thin books -- but most people must fall for the thick==better notion.


Someone needs to write "c for ruby programmers"


Not exactly what you're looking for, but "Learn C The Hard Way"[0] isn't a bad tutorial...

[0] http://c.learncodethehardway.org/book/


> "c for ruby programmers"

Most, if not all, of this book would be "How To Use Pointers" with large sections on how to structure your code around correctly allocating and freeing memory.

A (fairly brief) appendix would be "Weird Stuff C Programmers Actually Do", for the 'dark corners' stuff that's actually used outside obfuscated code competitions. It might be hard for someone who's actually a C programmer to write this; things stop looking weird after a while.


Does anyone have any suggestions for books/learning materials related to the pre requisites mentioned in the article, specifically Microprocessor programming? I've tried to find some in the past but having no prior EE experience I find even some of the basics challenging.

I have experience with C, but don't really know where to start with the lower level stuff.

I'm thinking I should start with a simple book like Electrical Engineering 101 (http://www.amazon.com/Electrical-Engineering-101-Third-Schoo...). Once I have a grasp on some EE basics I might be able to step into the Microprocessor programming a bit better, knowing a bit of what's happening behind the scenes.

Any thoughts/suggestions?


The "lower level stuff" you're asking about here isn't exactly EE stuff, it's more about microprocessor interfacing. The kernel driver is getting data from userland, but now it needs to be massaged and placed into appropriate registers in the processor to get it to do something, or push the data out to a helper chip where it can do something (like sending and receiving mouse coordinates over USB to the little chip inside the mouse, for example).

So on the one hand you need to understand your host processor (and its constellation of helper chips) inside and out. Some modern SoC systems like the ones in smartphones have everything built into the same chip, so you wind up combing through 5,700 page Technical Reference Manuals like this one for the Freescale i.MX6:

http://cache.freescale.com/files/32bit/doc/ref_manual/IMX6DQ...

Or, in the case of more generic micro-based systems like an Arduino or something, you're reading datasheets for other little chips and figuring out how to interface them to your host's kernel.

But yeah, knowing how to wire up a transistor or LED to a processor without cooking it (or your power supply) is a good thing. You can learn a lot from taking apart other people's projects and seeing how they do it. Common patterns start showing up.


In addition to reading a book on basic EE, buy an AVR-based Arduino. From the perspective of a professional EE/embedded developer the Arduino may seem limited, but there are loads of information sources on "how to connect X to Arduino".

If you are especially interested in embedded programming, try to start using avr-gcc directly - you'll be forced to learn a lot of the low level stuff which is hidden by the Arduino IDE.


I'm glad you suggested what you did because that's exactly my current setup.

I have an ISP and Arduino. I yanked the AVR chip off the Arduino and stuck it on a breadboard. I just found it difficult to do anything due to my lack of EE knowledge. I could understand the programming basics due to past experience, but had no clue what was going on under the hood.

I think I'll read that EE book and then continue on the path I'm on.


If you get the basics of assembly programming, and read any peripheral's datasheet (get a simple one), you'll notice that there is a protocol for accessing the peripheral. If you can understand that protocol and know how to write C code that conforms to it, you know "microprocessor programming".

If not, well, you may need to learn assembly, C, or something else entirely (but probably not EE), and the experiment will help you discover what exactly you need.


You can probably do just fine with what you know.

One project that might help out is to re-write the 8150 USB network adapter project. The devices that have this chipset are easy and cheap to find and the data sheet is easily available. There's a PCI version of this chip, the 8139, too. The USB project would be easier since you're passing URBs back and forth. The LDD3 book and the data sheet is more or less all you need.


I'm trying to understand what you mean by "Microprocessor programming." Do you mean something like Assembly language? Or do you mean Microprocessor design?

I'm wondering if I might be able to help out, but I'm not sure I understand the question. I'm an EE with a lot of digital design background and some software.


I really recommend a book called Linux Device Drivers (Third Edition) [1]. It's free, and split according to the major kernel subsystems. I used it many times as a reference when I had to recall some obscure API.

Another great tool while developing for the kernel is the LXR [2], which is a browser-based indexer of the kernel source, for each of the kernel versions. Again, great for checking out how to interface with a subsystem or how different calls are used.

[1] http://lwn.net/Kernel/LDD3/ [2] http://lxr.linux.no/


Why wasn't I aware of this guide last week when I had to write a kernel driver for my OS class?!

One thing that's surprised me with Linux development is how nice the APIs are to work with. Things like the file_operations struct, and the linked list stuff are very well thought out and easy to use (as easy as they could be for C programming).

My only complaint is that a lot of the written documentation is out of date about a number of topics. As the kernel has evolved it's gained and lost a number of APIs and depending on when resources were published they may say a number of conflicting things about how to carry out a task (registering a character device and getting it in /dev is the big one that comes to mind).


This is why kernel hackers tend to say "Use the Source, Luke!" instead of pointing you at documentation. The one exception is documentation that talks about the high-level design of a feature (e.g. Documentation/pi-futex.txt), which is usually updated whenever there's a complete rewrite or major design change.


I worked as a linux kernel programmer for a time. All I really did was work slowly on bugs. It seems that there are always neat features that need implementing but the number of engineers with the now specialized skills required is relatively small compared to the general population of embedded engineers. It seems rare to find someone willing to cultivate an embedded engineer into a kernel developer. They're not hard and fast separations, but still, this division remains.

So, if you can, try to connect with actual kernel engineers, do an interesting project in school for a professor doing hardware work, etc.

One project that might help out is to re-write the 8150 USB network adapter project. The devices that have this chipset are easy and cheap to find and the data sheet is easily available. There's a PCI version of this chip, the 8139, too.


Wonderful - showing how little magic there really is in kernel programming. I recommend that everyone who does Linux programming go through this tutorial, even if just for fun - and for the sake of seeing how different the world on the other side of the syscall is. When writing a kernel-mode driver, a single bug usually means a reboot: either because you screwed up some refcount, so the module cannot be unloaded anymore - or because you clobbered something badly and the kernel is toast.

Having a driver-writing background, I silently laugh at the recent unit testing hype - I simply know from experience there are areas where you need much more care, guts and skill than def test_something(). And besides, you just can't unit-test a DMA transfer.


Speaking of 2006, is it still a relevant article? I'm sure the internal Linux structure hasn't changed that much?


If you give this a try, take note: the drivers include the <linux/config.h> header file which was removed in 2.6.19. You should be able to remove it from the source with no ill effects (a quick test of the "memory" driver worked)...
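If you want the same source to build on kernels from both sides of that change, one option (a sketch; the version macros come from <linux/version.h>) is to guard the include:

```c
#include <linux/version.h>

/* <linux/config.h> was removed in 2.6.19; the build system now
 * pulls the config macros in for you automatically. */
#if LINUX_VERSION_CODE < KERNEL_VERSION(2, 6, 19)
#include <linux/config.h>
#endif
```

For a quick experiment, though, just deleting the include as described above is simpler.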


They left out the bit about begging hardware manufacturers to release specifications.


I haven't bothered to check the docs, but isn't that a race condition in memory_init, registering the device before allocating the buffer?

Or are drivers loaded while holding a global lock? Even though driver (un)loading probably is fairly rare, that seems a bit heavy to me.

Worse, the 'goto fail' path, if it is ever hit, seems to leak a register_chrdev call.


I have zero experience with microprocessor programming , any recommendations in terms of resources and hardware to get for absolute beginners?


The way it's written almost makes me believe that kernel modules are that easy to write.


Lol. They ARE that easy to write.

Debugging, on the other hand...
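For the record, a complete, loadable "hello world" module really is about this short (this matches the 2.6-era API the article uses; build it with a standard kbuild Makefile):

```c
#include <linux/init.h>
#include <linux/module.h>
#include <linux/kernel.h>

MODULE_LICENSE("GPL");

static int __init hello_init(void)
{
    printk(KERN_INFO "hello: loaded\n");
    return 0;                 /* nonzero would abort the insmod */
}

static void __exit hello_exit(void)
{
    printk(KERN_INFO "hello: unloaded\n");
}

module_init(hello_init);
module_exit(hello_exit);
```

insmod it, check dmesg, rmmod it. The ten lines are easy; it's the next ten thousand that hurt.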


This is really a good introduction to device driver programming. Thanks for sharing this.


really good


I remember, this was my tutorial when my lead told me to practice writing some dummy device drivers, as our team was supposed to merge into the kernel team. Well, the merge didn't happen and this was my first and last device driver.

I liked the clarity and ease it offered a first timer. I was somewhat miffed about moving to the kernel team from Java (app dev), but after this tutorial and some more articles I was disappointed when, in the end, I didn't. This was one of the articles that changed my leaning towards learning C in a positive way. C is now my interview language :-).


Article is from 2006.


That doesn't make it any less relevant. People post Wikipedia pages, irrelevant news stories meant to make you "awww", etc.; at least this not only provides technical insight but is also relevant to the normal content on HN.


Yes, but the publication year should be indicated in the title if it's an older article. When I posted my comment it was not.



