A few years ago I had the pleasure of working with Lee Felsenstein on an electronics design.
One night over burrito bowls we got to talking about the Osborne 1 and its odd-sized display: specifically, why 52×24 instead of 40×24.
As the story goes, there were huge concerns the computer would come in over budget. At the time, chip companies would bundle their CPUs and RAM, loss-leading with one but charging more than market rate for the other. On paper, the architectures meant these chips were not cross-compatible. Adam Osborne set a hard ceiling on the price, required Intel compatibility, and wanted a 40×24 native display resolution (so as to be compatible with other "business" applications). Facing these constraints, Lee couldn't make it work unless he got creative. Through some arcane use of the dark arts (read: layout and capacitor placement) he was able to get the timing working between the low-cost Z80 processor and the low-cost 4116 RAM.
When he presented the news it went something along the lines of: “Adam, I’ve got some good news for you and some bad news. The good news is I made the cheaper ram work with the Z80 and we’re going to come in under budget. The bad news is I can’t get you 40x24, but I can do you one better: 52x24!”
As I understand it, this has something to do with timing weirdness when integrating with the Z80's built-in refresh logic, which Motorola devs had to implement in software. My understanding is they rolled with it, but Adam went on to flush the savings windfall this hack earned them down the drain by announcing the Osborne 2, at a lower price and with more features, before the Osborne 1 shipped.
It wasn't uncommon at the time for small computer systems to make the memory reads performed by video output pull double duty as a memory refresh. The Apple II relied upon this, for example, and ended up with an oddly fragmented video memory layout as a result.
I remember having a brief conversation with Felsenstein at some trade show. I asked, "what kind of bus does the Osborne use?" to which he replied only: "Analog!"
So bosses have just always driven their engineers to insanity with over commitments!
No matter how outdated and irrelevant those computers are today, their look is nostalgic to me like nothing else. Other things that disappeared from our lives, such as rotary phones, vinyl records (though not completely lost), and the beautiful stereo systems of the 1980s, can never compare to the feeling of seeing and touching a computer for the first time. The magic of technology is lost, at least for my generation; I don't know about the others.
It depends. Technology became magic beyond our wildest dreams too, in many respects, when we step back and look at first principles and how far we've come. Part of that magic was age and personal disposition, in addition to the revolutionary object itself. That combination doesn't come around often.
But imagine a true sci-fi-esque VR experience a decade or two from now, in the company of your children or grandchildren on a Christmas evening; that might make you feel you're in magic-land all over again.
Progress doesn't work like that. Once you've picked the obvious low-hanging fruit it stops being a research problem and starts being an engineering one. Which means tradeoffs and no free lunch and no more magic.
We're never gonna have an airliner that is 10 times faster than the ones we have now. The same principle applies to computing.
"The punch card remained a keystone of data processing until the 1970s, and its impact still remains."
I worked for an aerospace contractor in the mid-80s, and the corporate computing center was an IBM mainframe shop (originally 360s/370s, later a mix of 370s and 3033s, later the 308x multiprocessors series).
Even when I left that world in 1986, punch cards were still used in several contexts, such as the starter decks for all production jobs. But the greatest reliance on punch cards was that every employee's time card was a punch card. Recurring data was pre-punched, employees would write billing hours and projects on printed fields, then that data was punched in by the keypunch department at the end of the week. The timecards then were written to tape and fed into accounting, payroll, and shop order control.
Yet despite this primitive appearance, every employee got their paycheck by the following Thursday, every week.
For background, the "sonic delay line" appears to refer to this:
> A later version of the delay line used metal wires as the storage medium. Transducers were built by applying the magnetostrictive effect; small pieces of a magnetostrictive material, typically nickel, were attached to either side of the end of the wire, inside an electromagnet. When bits from the computer entered the magnets the nickel would contract or expand (based on the polarity) and twist the end of the wire. The resulting torsional wave would then move down the wire just as the sound wave did down the mercury column. In most cases the entire wire was made of the same material.
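A toy model may help make the recirculating idea concrete: a delay line behaves like a fixed-length shift register whose contents must be constantly re-read, re-amplified, and fed back in at the other end. This is only an illustrative sketch (the class and method names are made up, and the physics is reduced to a queue of bits):

```python
from collections import deque

class DelayLineMemory:
    """Toy model of a delay-line store: bits circulate through a
    fixed-length line and must be regenerated on every pass."""

    def __init__(self, n_bits):
        self.line = deque([0] * n_bits)

    def step(self):
        # One "tick": the bit arrives at the far end of the wire...
        bit = self.line.popleft()
        # ...and is re-amplified and fed back in at the near end.
        self.line.append(bit)
        return bit

    def write(self, bits):
        # Writing means replacing bits as the old ones drain out.
        for b in bits:
            self.line.popleft()
            self.line.append(b)

    def read(self, n):
        # Reading means waiting for the bits to circulate past.
        return [self.step() for _ in range(n)]

mem = DelayLineMemory(8)
mem.write([1, 0, 1, 1, 0, 0, 1, 0])
print(mem.read(8))  # -> [1, 0, 1, 1, 0, 0, 1, 0]
```

The key property the model captures is that access is serial: you cannot grab an arbitrary bit, you wait for it to come around, which is why delay-line machines organized memory as circulating words.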
wow. this is literally a sound wave. i thought it was some kind of euphemism.
as i was until now only familiar with the electric variety.
> literally a sound wave
Well, not literally. A sound wave is a compression wave (vs. shear or torsion).
Gives flipping a bit a new meaning.
There's a great video that goes over an old calculator that used delay lines: https://www.youtube.com/watch?v=2BIx2x-Q2fE
Nice video indeed (I only wish the guy pointed the camera more to the objects of interest rather than his own face), but what is shown is clearly a magnetostrictive delay line (rather than an acoustic one) which is similar to one shown here: http://www.vintagecalculators.com/html/calculator_memory_tec....
A kind of memory that gets corrupted by loud noises... Love that.
Don't forget the famous discovery by Sun that shouting at disk drives messes them up: https://www.theregister.co.uk/2009/01/05/shouty_sun_engineer...
Fus ro DATA LOSS!
This is a very IBM-centric narrative. That doesn't mean it's wrong, but at the time there really was an IBM world and an everyone-else world. IBM carefully tended its walled garden, to the point of getting people fired who might consider looking outside of it. The result was people who knew nothing about IBM, and people who knew nothing but.
Which essentially means that there are going to be different stories about 80x24/5 from that time - probably all of them correct in context.
(Oh, and the console monitors on our Burroughs 6700 mainframe had delay lines, while the user block terminals, TD830s, had DRAM and microcontrollers. There's another non-IBM/non-DEC view of essentially the same technology.)
This is still true to this day. I have spent some time playing around with MVS (precursor to z/OS, IBM's mainframe operating system) on an emulator, and it's been a very educational experience.
The first thing you notice is how everything is different. While the rest of the world was standardising on what a "file" was (a sequence of (not necessarily 8-bit) bytes) as early as the 60's, the IBM world has "datasets" which are rigidly formatted sequences of records.
The differences don't end there, and extend as far as terminology: booting is called IPL, and disk drives are DASD.
When reading documents and tutorials written by people who have seemingly worked with IBM systems for a long time, it also seems that while they know everything about how to operate z/OS, they are very ignorant of even basic features of "traditional" operating systems (which these days means Unix and Windows).
I was reading a discussion about Revedit, an editor that is popular on MVS, and it was being praised as incredibly advanced. I've used it, and compared to the other one I have access to (whose name escapes me), it sure is better.
But, the editor lacks a lot of functionality even the simplest of editors on Unix had (regexp, for example) and the fact that no one in the mainframe world seems to miss it suggests to me that people who work in the mainframe world simply don't think about computing in the same way as people who were raised on Unix.
I'm not suggesting that one approach is better than the other. I think there are things that each side could learn from the other. But right now I only see the mainframe side learning from Unix, and very little going the other way. I probably will never work with mainframes for real, but learning how they work has been very helpful. It would probably help the industry if more people tried that.
Well yes to this - in the 80s you were either System V or whatever crap IBM was selling back then! :)
PS: I always assumed 80x24 was a teletype (e.g. tty) standard, or at least a two-by 80x12 punch-card layout. I also wonder if the serial-cable throughput to the terminal played a role.
I think RSX, RT-11 and VMS would like a word.
And all the other mainframe operating systems... Most of the other big mainframe companies tried to emulate IBM's walled garden: each had their own OS, their own (incompatible) networking system, their own compilers, etc.
Unix was great mostly because it was simple and you could port it to new hardware. Prior to that, it was in no hardware manufacturer's interest to release their OS into the wild; that didn't sell mainframes. Unix also released a generation of frustrated systems people who were not allowed to write kernel code unless they worked for a mainframe company.
And don't piss off Burroughs' MCP.
I always imagined an OS had to be really user-hostile to have a movie villain named after it. IIRC, Alan Kay, Bonnie MacBird's husband, worked for them and probably made some of his opinions clear to his wife.
It IS the most user-hostile thing I've ever seen running on a computer.
Sorry, I was being intentionally controversial. I had both Unix and VMS accounts as a student, and I guess VMS never really resonated with me...
Don't worry. I never liked it either. When I found Unix, I was sold. I think it was the pipe (in the shell) that did it.
For the longest time I had my systems cat syslog to lpr; anytime someone did something, I'd hear a bzzzzzt. Usually about 5 times a day, maybe a dozen. Then one morning I came into work, and there was a whole box of paper on the floor and the printer was still going bzzt, bzzt, bzzt... It was a hacker who had decided to follow some instructions he found on the web and hack my system. Every time he did something he'd erase his steps... but he never checked to see that all his movements were being printed!
> AT&T introduced the Teletype Model 40 in 1973, a CRT terminal with an 80×24 display.
The Model 40 was a nice machine. I remember it because the printer unit reused the type pawls from the Model 28 in a rubber belt that ran the full width of the paper (80 columns, of course). So it was a chain printer without the chain, which meant it was somewhat quiet when printing.
I can still hear the sound that an ASR-33 would make as it was typing out a listing. I was fortunate to attend a high school in 1975 or so that had a DEC PDP-8 with an attached ASR-33 with paper tape reader and punch. I remember that it took something like 45 minutes to load the BASIC interpreter from paper tape. Good times.
I love the cadenced hum of the ASR-33 when it idles. Wish I could have one in my office.
Our school used DECWriter II (LA36) terminals attached, initially, to a PDP-11 and then to a PR1ME.
Non-mobile link: http://www.righto.com/2019/11/ibm-sonic-delay-lines-and-hist...
The ‘recent blog post’ that Mr Shirriff mentions was discussed on HN at https://news.ycombinator.com/item?id=21340548
There is a mistake somewhere: the 3270 is said to have been introduced in 1977, but to dominate the market by 1974. Could it have been introduced in 1971 instead?
You are right; I've fixed the typo.
"The new 32- and 43-line sizes didn't really catch on"
But it made it to the PC and its MDA video, IIRC. I used to use that mode all the time.
It's a real shame the font used in the MDA/CGA/EGA/VGA is not the 3270 one.
I don't recall 32- or 43-line MDA text modes (and online sources on MDA I can find don't list such modes), but EGA and later adapters supported 43-line text modes, which I do remember using.
IIRC. :) Definitely, then, I used the EGA mode. I had a Hercules-compatible adapter that supported EGA mono modes.
From the article, about why 80x25 became standard:-
>The biggest problem with this theory is the VT100's display was 80×24, not 80×25
Having spent quite a few years banging away on VT100s and clones, wasn't the display actually 80x25, with the 25th line used to show status? So the electronics were actually set up for 25 lines.
No. The VT100 was 80×24 (or 132×14 or 132×24) and had no status line. You can verify this on Wikipedia or in the manual. The status line was added in the VT320.
Interesting article with some nice pictures and links.
I have a very simplistic question. The article asks, "Given the historical popularity of 80x24 terminals, why do so many modern systems use 80x25 windows?" My question is: Do any modern systems use 80x25 windows, other than the console of the IBM-compatible PC?
The Wyse terminals we used with UNIX in the late 80s had 25 lines -- 24 regular lines, plus a bottom status line.
Almost half of that line could be written to using escape sequences -- in UNIX, a simple "echo" command with the properly encoded escape sequence did the trick.
In my organization, we had to install and run multiple generations of our products, and it was confusing to keep track which "world" your current environment variables pointed to.
So, I, being a tinkerer by nature, wrote separate environment files for each "world" that included an echo command to echo the appropriate escape sequence so that the current environment setting could be seen at a glance in the bottom status line.
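To sketch what such an environment file's echo boiled down to: the sequence below assumes, from memory, the Wyse 50 convention of ESC F <text> CR for "display message in the status line" (check your terminal's manual, as other models use different sequences, and the helper name here is made up):

```python
import sys

def set_status_line(text):
    """Write `text` to the terminal's writable status line.

    ESC F <text> CR is assumed here to be the Wyse 50 'display
    message in status line' sequence (from memory, not a spec);
    other terminals use entirely different escape sequences.
    """
    sys.stdout.write('\x1bF' + text + '\r')
    sys.stdout.flush()

# e.g. show which "world" the current environment points at:
set_status_line('WORLD: prod-v2')
```

In practice the same bytes were emitted by a plain `echo` in the shell; the point is simply that the status line was addressable like any other escape-sequence target.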
I believe 80x25 is the default for iTerm, qemu, as well as cmd/command on Windows, minicom, kermit, and CentOS console. Anyone know of other current users of 80x25?
MATE Terminal defaults to 80x25 as well.
All the terminal emulators I have at hand (xterm, urxvt, tmux) default to 80×24, not ×25.
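For what it's worth, the 80-by-24-ish geometry also survives as the standard programmatic fallback: Python's `shutil.get_terminal_size`, for instance, documents a default fallback of (80, 24) when no real terminal can be queried. A quick sketch:

```python
import shutil

# Ask for the terminal's size: the COLUMNS/LINES environment
# variables win if set, otherwise the tty is queried, otherwise
# the classic (80, 24) fallback is returned.
size = shutil.get_terminal_size(fallback=(80, 24))
print(f"{size.columns}x{size.lines}")
```

So even programs that never see a terminal (output piped to a file, say) still end up formatting for the dimensions this whole thread is about.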
I use a tiling WM, so all my terminals just expand to fill the screen or some integer fraction thereof.