Wait a moment, they have a Falsehoods series on all sorts of subjects. The Fake News fact checkers are going to try globbing on to every niche they can...
A real man might take responsibility, but if that ever becomes too visible you will be made the dunce if not scapegoat. Maybe only the successful people had the clout to pull it off. :/
Admitting mistakes to move on, failing fast, should prevent prolonged arguments though (if you can pull it off).
Are you sure there wasn't some sort of recruitment effort with at least some of those guys... Though you do see some tech pages out there, maybe because of Hangouts? shrug
I didn't know Tim O'Reilly was on G+ but ALL these recent posts are pop-politics and ugly at that! I thought you were referring to all Google services being tied in.
I don't know. Good point. I expect some, e.g. O'Reilly, were actively solicited.
I don't spend too much time in Plus. Brin has had a lot of political stuff, recently, although he's a bit mixed and individual in his opinions and they do tend to tie back to points modeled in his sci-fi writing.
O'Reilly I see less frequently. I suppose he's getting more "elder statesman", too.
Torvalds actually addresses some tech stuff. Though he has personal stuff, too -- he does not use it as a "tech blog".
Reminds me some of Tim Bray's "Ongoing" blog -- self-hosted -- that I haven't been to in a while. A personal take, on both tech and other stuff in his life. Whatever the topic, informed, intelligent, and thoughtful. Creative, too.
I don't think Plus is in any kind of a steady state. Even what I wrote about may well be in decline. Maybe it was better a few years ago.
Regardless, this year's politics have really shocked a lot of people, and this is spilling over into erstwhile non-political venues.
Reasonable means the simplest and most minimal set of instructions to implement a particular behavior. These are your Primitives for the intermediate language. If you find something that you can't do in terms of the existing instructions, then the required operation should be added to your set of primitives. A good way to figure some of this out is to read through a minimal FORTH, say something like Jones FORTH (https://github.com/phf/forth/tree/master/x86 here's at least one port). The core set of primitives is in the .S assembly file and everything else is implemented in terms of those 'words'. If you want a really simple one, look up something like https://esolangs.org/wiki/Brainfuck and work your way up into a more reasonable (i.e. actual numbers!) ISA.
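To get a concrete feel for just how small a primitive set can be, here's a sketch of a Brainfuck interpreter in Python. All eight primitives fit in a few dozen lines; everything else is "implemented in terms of those words" by the programmer. (This is my own illustrative sketch, not code from any of the projects linked above.)

```python
def run_bf(program, input_bytes=b""):
    """Interpret Brainfuck: 8 primitives are enough for general computation."""
    tape = [0] * 30000          # the data tape
    dp = 0                      # data pointer
    out = []                    # collected output bytes
    inp = iter(input_bytes)
    # Precompute matching bracket positions for '[' / ']' jumps.
    jumps, stack = {}, []
    for i, c in enumerate(program):
        if c == '[':
            stack.append(i)
        elif c == ']':
            j = stack.pop()
            jumps[i], jumps[j] = j, i
    pc = 0
    while pc < len(program):
        c = program[pc]
        if c == '>':   dp += 1
        elif c == '<': dp -= 1
        elif c == '+': tape[dp] = (tape[dp] + 1) % 256
        elif c == '-': tape[dp] = (tape[dp] - 1) % 256
        elif c == '.': out.append(tape[dp])
        elif c == ',': tape[dp] = next(inp, 0)
        elif c == '[' and tape[dp] == 0: pc = jumps[pc]   # skip loop body
        elif c == ']' and tape[dp] != 0: pc = jumps[pc]   # repeat loop body
        pc += 1
    return bytes(out)
```

Even multiplication has to be built up from loops over increments, which is exactly the "everything in terms of the primitives" exercise.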
You are describing "generalization". Lookup tables (LUTs) are still a valid technique if the problem space is small enough.
If you use a table for trigonometry functions then interpolation is generally used... So it doesn't have to be all-or-nothing.
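A sketch of that table-plus-interpolation approach in Python (the table size and function name here are just for illustration): the table gets you close, and linear interpolation between adjacent entries recovers most of the lost accuracy.

```python
import math

TABLE_SIZE = 256
# One full period of sine, sampled at TABLE_SIZE evenly spaced points.
SINE_TABLE = [math.sin(2 * math.pi * i / TABLE_SIZE) for i in range(TABLE_SIZE)]

def sin_lut(x):
    """Approximate sin(x) by linear interpolation between table entries."""
    pos = (x / (2 * math.pi)) * TABLE_SIZE   # map radians onto index space
    i = math.floor(pos)
    frac = pos - i                           # fractional distance to next entry
    a = SINE_TABLE[i % TABLE_SIZE]
    b = SINE_TABLE[(i + 1) % TABLE_SIZE]
    return a + (b - a) * frac
```

With 256 entries the raw table alone is only good to a couple of decimal places; the interpolated version is good to roughly 1e-4 over the whole period.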
Compression generally works by removing redundancy by some means or another according to some scheme... But compressed data can take up more space if the input goes against the grain (e.g. RLE when the repeated byte is the same as the escape byte). Random number generators with simple seeds (not a hidden entropy pool) can be thought of as disrupting simple patterns in a sequence.
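To illustrate the "against the grain" point, here's a sketch of the simplest possible RLE, count-prefixed rather than the escape-byte variant mentioned above: long runs shrink dramatically, but input with no runs at all doubles in size.

```python
def rle_encode(data: bytes) -> bytes:
    """Naive RLE: each run becomes a (count, byte) pair, count capped at 255."""
    out = bytearray()
    i = 0
    while i < len(data):
        j = i
        while j < len(data) and data[j] == data[i] and j - i < 255:
            j += 1
        out += bytes([j - i, data[i]])
        i = j
    return bytes(out)

def rle_decode(data: bytes) -> bytes:
    """Invert rle_encode: expand each (count, byte) pair."""
    out = bytearray()
    for k in range(0, len(data), 2):
        out += bytes([data[k + 1]]) * data[k]
    return bytes(out)
```

A run of 100 identical bytes encodes to 2 bytes; 6 distinct bytes encode to 12. Same scheme, opposite outcomes, depending entirely on how the input matches the redundancy the scheme targets.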
Sine and cosine are the only smooth continuous functions that are solutions to that system of equations. There are also numerous geometric ways to define them. That's what's meant by "understanding". If all you had was the LUT and no other way to compute the value of sine, then you don't really understand what sine is.
Table lookup is a common solution when a FPU isn't available and speed is a higher priority than accuracy.
I've actually computed trig functions from formulas before with IEEE double floating-point, adding terms until the accuracy stopped improving.
(I think it was a continued fraction or some other technique I didn't understand; I hadn't linked -lm or something for the builtin functions and worked around it lol...)
It was very slow! Makes your computer's fan spin faster, even changes the smell in the room while grinding away!
Those LUTs have implied indexing internally; I suspect something akin to indexing is essential.
Logic functions can be composed entirely from NAND, or entirely from NOR (XOR alone isn't functionally complete, and IMPLY needs a constant false), or from larger units like full adders etc.
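A quick Python sketch of NAND's universality (the gate names are mine): every other gate falls out of NAND alone.

```python
def nand(a: int, b: int) -> int:
    """The single primitive: 0 only when both inputs are 1."""
    return 0 if (a and b) else 1

# Everything else built purely from NAND:
def not_(a):    return nand(a, a)
def and_(a, b): return not_(nand(a, b))
def or_(a, b):  return nand(not_(a), not_(b))
def xor_(a, b):
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))
```

Trying the same exercise with XOR as the only primitive fails: XOR circuits are affine, so no composition of them ever yields AND.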
All memory depends on some form of persistence and/or refreshing.
Sometimes this is a physical phenomenon like magnetism, electrical capacitance, or the afterglow in a Williams tube. Short-lived effects need to be refreshed.
Sometimes there is a feedback loop that continuously sustains latching, such as flip-flops.
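A sketch of that feedback idea: a cross-coupled NOR SR latch simulated in Python, iterated until the loop settles. The stored bit lives in the feedback loop itself, not in any single gate.

```python
def nor(a, b):
    return 0 if (a or b) else 1

def sr_latch(s, r, q=0, qbar=1):
    """Cross-coupled NOR pair: iterate the feedback loop to a fixed point.
    s=set, r=reset; (q, qbar) is the previous latched state."""
    for _ in range(4):                 # a few passes suffice to stabilize
        q_new = nor(r, qbar)
        qbar_new = nor(s, q)
        if (q_new, qbar_new) == (q, qbar):
            break                      # loop has latched
        q, qbar = q_new, qbar_new
    return q, qbar
```

With s=r=0 the latch just holds whatever state it was left in, which is exactly the "feedback loop that continuously sustains latching" above. (This is an idealized zero-delay model; real latches also have the forbidden s=r=1 case and metastability to worry about.)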
Magnetic core memory could be less transient than, say, a mercury delay line, but needed to be rewritten after being read.
But regarding your question, I think so, if it is possible to implement one with the other (though memory usually has internal indexing).
Yeah. But for a function with parameters there needs to be a way to associate values with parameters. There are different ways to do that, but even a static global would be accessed by address.
I suspect it would be difficult (impossible?) without relative-position addressing modes and/or other tricks. A conventional stack needs some way to access an offset from the base, but perhaps you could use self-modifying code. Or perhaps addressing relative to the current position, like brainfk.
All general computation needs either a direct or convoluted means of achieving conditions too. A tricky formula exploiting abs(x) could be used to produce {0, 1} but...
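One example of the kind of abs() trick meant here (my own formulation, valid for nonzero integers): turn a sign into a {0, 1} flag arithmetically, then use the flag to select between values with no branch instruction at all.

```python
def is_positive(x: int) -> int:
    """Branch-free {0, 1}: 1 if x > 0 else 0. Only valid for nonzero x,
    since x appears in the denominator."""
    return (x + abs(x)) // (2 * x)

def select(cond01: int, a: int, b: int) -> int:
    """Arithmetic 'if': cond01 must be exactly 0 or 1."""
    return cond01 * a + (1 - cond01) * b
```

This only gets you conditional *values*, though; to loop or halt conditionally you still need some control-flow mechanism, direct or convoluted.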
Right, my comment regarding truth tables was only in reference to combinational logic. That said, pure circuit models of computation exist, and are interesting from a theoretical standpoint at least. They don't translate to hardware, though. As you noted, doing arbitrary computation with (finite) circuits requires some type of feedback/sequential logic.