> but for some reason decided not to make integer types available to programmers.
Can you expand upon this? All of the research I've done suggests that not only was it possible to use integer math in BASIC for the Apple II, but there are versions of BASIC that only support integers.
Wozniak's original BASIC for the Apple II, Integer BASIC, only supported integers; when Apple decided they needed floating point and Woz refused to spend time on it, they decided to license it from Microsoft, producing Applesoft BASIC. Applesoft was slower than Woz's BASIC because it performed all arithmetic in floating point.
"In the Apple II ROMs, I even stuck in my own floating point routine. It wasn't incorporated into the BASIC, but I just didn't want the world thinking I couldn't write floating point routines." -- Steve Wozniak [0]
"All operations were done in floating point. On the GE-225 and GE-235, this produced a precision of about 30 bits (roughly ten digits) with a base-2 exponent range of -256 to +255.[49]"
Good find. Dartmouth BASIC was the original, built for their mainframe timesharing system; Apple and the other micro variants came later.
Speaking of, John G. Kemeny's book "Man and the Computer" is a fantastic read, introducing what computers are, how time sharing works, and the thinking behind the design of BASIC.
Classic BASIC does have typing; it just shoves it into the variable name. E.g. X$ is a string, X% is a 16-bit signed integer, and X# is a double-precision floating point number.
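For example, in a Microsoft-style dialect such as GW-BASIC (a sketch; exactly which suffixes exist, and whether ' works as a comment marker, varies by dialect):

    10 A$ = "HELLO"   ' $ suffix: string
    20 N% = 32767     ' % suffix: 16-bit signed integer (this is the max value)
    30 S! = 1! / 3!   ' ! suffix: single-precision float (the default type)
    40 D# = 1# / 3#   ' # suffix: double-precision float
    50 PRINT A$, N%, S!, D#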
This started with $ for strings in Dartmouth BASIC (when it introduced strings; the first edition didn't have them), and then other BASIC implementations gradually added new suffixes. I'm not sure when % and # showed up specifically, but it was already there in Altair BASIC, and thence spread to its descendants, so it was well-established by the 1980s.
Interesting. I wonder if this association of $ with strings is related to the use of $ (rather than NUL) as the string terminator for DOS output routines?
IIRC Python had similar reasoning when version 3 made / true division, so 1/2 yields 0.5 instead of the 0 that integer division gave in versions 1 and 2. R always treated bare numeric literals as doubles, while Julia still assumes integer, which occasionally trips me up when switching languages.