Sorry for calling you old.
On reflection, the thing I miss about web dev in the '90s is the low user expectations.
No React, no flexbox, no REST API, no microservices, no Docker, no Kubernetes... but the reason it didn't need any of those things is that it had a terrible UI, didn't do all that much, and only needed to support trivial numbers of users, whose modems were probably the main bottleneck anyway.
Trying to make modern users happy with '90s era tech would be impossible and deeply painful.
But it was nice, just for a while, to have a world where people were thrilled that even a super-basic web application was a thing that existed.
> Trying to make modern users happy with '90s era tech would be impossible and deeply painful.
In my experience, users can be delighted in 2019 by a well-designed and lightweight UI built with just CSS and vanilla JS, instead of an overweight, buggy SPA (usually built by engineers more interested in the code itself than in solving the business problem in the leanest way). Source: I made that my business and I couldn’t be happier.
You're correct, but so is OP.
You're exceeding their expectations by providing less than they expected. That's easy to do when the user expects to be accosted by a slow-loading page that arrives on their screen janky because of deferred stylesheets, four different Google fonts, and 1.6 MB worth of ads.
SPAs are often buggy, but you can’t honestly blame React/Angular for that. I’ve built and actively maintain a number of SPAs in React. With some care, JS size and speed are very manageable, even on mobile devices.
The thing that consistently kills site performance is ads and trackers. Of the sites I work on, GTM, Optimizely, FullStory, and other similar crap routinely are responsible for >75% of the JS weight and >90% of the HTTP requests. It doesn’t matter how much I prune our npm dependencies and tweak React to make sure it’s rendering fast—marketing is going to make the page suck and I can’t do anything about it besides generate page speed reports that the client will ignore.
Yes, it was a simpler time :) There was a job position known as "webmaster" who was basically a god. User expectations were low because users didn't know what to expect. Everything was so new. Putting formerly locked-away resources like job databases and digital libraries online blew people's minds. And we imagined that we were building a utopia where everyone could publish, and information was free...
I still make websites this way because no one told me to stop. They can be made responsive to things like smartphone browsers with incredibly minimal changes. No complaints so far.
What kind of sites do you make? I’d imagine that avoiding things like react is fine for a lot of sites that just need a bit of form input here and there and some nice styling, but that anything relatively big with interactive pieces would fairly quickly devolve into a difficult to maintain mess without components. I’ve sort of been trained to think that, though, as I was just starting development right before libraries like React started to get big. I remember reading cs books and learning about how important encapsulation and modularity was and then feeling like I wasn’t really able to apply those concepts adequately in my really early web projects. React came along and seemed like the best answer. Do you do something different to organize code related to more complex frontend user interaction than forms/animations, or do you find that you just don’t really need complex frontend user interaction for the websites you make?
Most of the things I've made are not public-facing. I just try to keep things simple. My sites generally require page loads more than the average single page application, but they have the advantage of loading faster and working in nearly any browser, even super old ones.
Link or not real.
No, please continue.
What is old is new again.
Brutalism for President 2020.
If there were no JS, how would anyone make billions of dollars of ad revenue?
That was... really interesting to read with an early-2000s mindset (seeing the date of the article).
Pinboard is also a good example of this.
> the simplicity of it all
> No React
True, but when I remember how we used 1x1px Java applet to allow reloading data without refreshing the page I would not call that simpler... And don't even get me started on Flash.
> no REST API
True, but when I remember fumbling about with SOAP I would hardly call that simpler...
> no microservices
True, but what we had instead -- monolithic PHP "applications" with insane hacks to integrate with other monolithic services, Microsoft's and Adobe's attempts to abstract the backend/frontend separation away, and so on -- I would not call that simpler...
> no Docker, no Kubernetes
True, but when I remember the deployment mechanisms (upload via FTP), the problems with scaling once our server was no longer enough (usual approach: set everything up on a separate machine and then have downtime until it was back up), and so on, I would certainly not call that simpler...
There are always trade-offs, and you continuously have to learn new technologies and unlearn others...
Interesting. There were two primary triggers for me creating https://alchemist.camp. One was that the Elixir screencasting site I'd used had very out-dated tutorials that often didn't compile or run on post-1.0 versions of the language. The second was that I loathed the huge card-based design and the SPA the author moved to.
I just wanted information density and scannability, not hamburger buttons and pages and pages of responsive cards.
Similarly, I'm no fan of Reddit's redesign or the huge lag when loading IH and yet more and more content sites are going the same direction.
The Scrum bullshit didn't exist.
XP was a sane and useful thing before the sc(r)?umbags came along and shat in the Agile pool.
You could ctrl-u any web page, understand it, and do all the cool things you actually cared about yourself.
Nobody played Matryoshka dolls with VMs and containers; they did real work that mattered instead.
Scheme/Lisp lambdas just worked decades before these half-arsed modern languages came up with dozens of borked, half-baked versions of them.
GUIs were an almighty colossal pain in the proverbial to write, so almost nobody wrote them, and as a result developers were about 20 times more productive than they are today.
Threading implementations were borked, so nobody used them. They used processes, and everything was much better. If you have any sense you will _still_ be using processes and not threads, and your life will still be much better.
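For what it's worth, the processes-not-threads style translates directly to modern tooling. Here's a minimal sketch in Python using the standard library's `multiprocessing` module (the function names are mine, purely for illustration): each worker gets its own memory space, so there's no shared mutable state to guard with locks.

```python
# Sketch of "use processes, not threads": each worker is a separate OS
# process with its own memory, so there are no data races to lock around.
from multiprocessing import Pool

def square(n):
    # CPU-bound work; running it in a process also sidesteps the GIL.
    return n * n

def parallel_squares(numbers, workers=4):
    # A crash or leak in one worker cannot corrupt the others' memory.
    with Pool(processes=workers) as pool:
        return pool.map(square, numbers)

if __name__ == "__main__":
    print(parallel_squares([1, 2, 3, 4]))  # prints [1, 4, 9, 16]
```

The `if __name__ == "__main__"` guard matters: on platforms that spawn rather than fork, the child processes re-import this module, and the guard keeps them from recursively creating their own pools.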
And, oh yes, pen plotters were fun to watch. Laser printers are absolutely wonderful... but it's like the difference between a wood fire and a panel heater... Which would you rather sit with, whisky in hand, and just watch?
I might be an old curmudgeon, but I do have a very long list of things that are much better than they were in the Bad Old Days, if anyone cares.
Now GET OFF MY LAWN!
I got to operate IBM 3800 printers in the 1980s. The sight of a laser printer running at printing-press speed hour after hour, going through a 2' tall box of continuous-form paper in less than 20 minutes, was just nuts.
Laser printers are absolutely wonderful.
You get the point: there were no virtual machines, no Docker, etc. There were no VM snapshots; when you screwed something up, there was no turning back, and backups were mostly on tape cassettes... but I certainly do not miss that... I'm happy those days are gone, haha.
From the 80s I miss the variety of computer systems at the consumer level. Apple ][, C64, Tandy Model 3/4 and the CoCo, OS/9 and more. When it became clear everything would converge to x86, that made me sad. The world is just that much more boring without variety.
I miss programming in assembly. Slapping together frameworks with some sample code from Stack Overflow just doesn't provide the same mental satisfaction.
I miss interviewing in the 90s. No whiteboard algorithm puzzles or such nonsense. Just a conversation to see if your interests matched the role, and you were hired.
(I guess I don't miss the low salaries which were on par with any office workers in non-tech fields, nothing like today. But the good side was that everyone was in tech for the sheer love of it, not to make a buck.)
But what I truly miss (nostalgia aside) is when Silicon Valley used to be about technology. The big names in Silicon Valley were all actual technology companies: Sun, SGI, HP (the good original HP, not the ink maker). The product was the technology and the culture reflected that. Engineers were in charge and the goal was to make better technology.
Today the product is advertising and the goal is just to drive more eyeballs. Engineers have become commoditized and micromanaged (agile) by PMs who are driven by advertising goals. (PMs as we know them today did not exist.) Except for Apple, none of the big names today (FAANG) are tech companies anymore.
> Today the product is advertising and the goal is just to drive more eyeballs. Engineers have become commoditized and micromanaged (agile) by PMs who are driven by advertising goals. (PMs as we know them today did not exist.) Except for Apple, none of the big names today (FAANG) are tech companies anymore.
Does anyone know any tech companies or industries that aren't like this in the US?
I've been considering saying goodbye to the American market and going to Europe for a few years to see how things are there first-hand. The one thing that would make me stay is the discovery of an industry or large company that doesn't micromanage and commoditize programmers.
Developers being far less critical of other developers.
We were much more positive in the olden days. Back in the 90s you could post an idea in a developer-oriented Usenet group or discussion forum and get pretty decent feedback. If you do that today there's a good chance you're going to either get nothing back or get flamed for using an "anti-pattern". People are far too quick to dismiss things now.
It's probably a function of how everything gets marked with a score now and every post is social proof, but it's quite annoying regardless.
You've got a very different memory of Usenet than I do.
Onion2k's experience mirrors mine.
There was a genteel, supportive quality of having a participation base comprised mostly of sincere, interested readers and posters. There were trolls even then, for sure, but they were few and were generally embarrassed into an inert mumbling.
I still remember fondly the debates in comp.lang.c about the first ANSI C standard proposals. Even Henry Spencer's rants against NOALIAS were entertaining.
And where user participation got unproductively unruly, moderated groups served well, like rec.humor.funny.
There was also an open, transparent process for forking groups, generally along natural fracture lines, and for creating new subgroups.
What ultimately ruined Usenet (IMHO) was a falling signal-to-noise ratio, and not just because of spam. The opening of the AOL gateway and the resulting Eternal September proved to be mortal wounds.
The perl groups were nice!
Speaking as someone who cut my teeth in the 80s and early 90s: I miss the elitism and exclusivity of it. Nowadays the internet has made it so any question can be answered within seconds, but back then you had to scrupulously acquire your information from magazines, Usenet discussions, and just plain experimenting with the code. As things currently stand, programming has become a commodity skill that anyone with a room temperature IQ can learn within a reasonably small time frame. Then, you get on StackExchange and start cobbling together your app from copypasta provided by other programmers.
Things have been dumbed down considerably.
Problem is, you can find dozens of mutually exclusive answers to any given question and hundreds of copies of each served by content farms.
any question can be answered within seconds
In the Usenet days, you'd find a FAQ with answers curated by the combined reviews of dozens of participants over time.
Yeah. In the 90s I met almost no computer scientists. We all came from different backgrounds and got into programming because we were interested and liked it. Now we have a lot of people who chose it as a well-paying career. Nothing wrong with that but it's less fun.
I'm an applied mathematician that worked in those days with chemists, microbiologists, hydrologists, civil engineers, electronic engineers, physicists and almost every darn flavour of critter _except_ computer scientists.
Learnt a lot of interesting stuff.
Most of their code was utter shit to read and maintain... but then so is that from most recent comp sc graduates...
Well, you can always pick something, that is still complicated. Like distributed systems.
Kudos for honesty, I suppose.
No doubt about it.
That you could buy a book, read it, and then know literally everything about a particular system. And often that book came with the system.
Turbo Pascal came with reference manuals, but also with instructional books that taught you OO programming and Windows UI programming from scratch.
They must have been well-written; I didn't really have Internet access, so there was nowhere else to go if I got stuck. Though I was probably a much more determined learner back then.
Borland C++ came in a fucking chest of something like 40 800-page books. Books on OWL, books on the Win32 API, books on the Win16 API, lots of books.
So much this. It ties in with my other comment, to some degree.
Someone here mentioned StackOverflow, but really, a handful of manuals, maybe a good book or two, and some persistence, /usually/ gave you all you needed to figure out a problem, and do something magical.
Now, those resources (if they're available) will likely only help you understand a very small subsystem.
I appreciated that the behavior of that system was fixed for a certain period of time. That word processor would work the same for all eternity (warts and all) if you didn’t intentionally get a version upgrade.
The Mac documentation was great. I remember the UI guidelines as a book and in the end I really knew what's going on.
IBM mainframe documentation was amazing. I encountered cases of strange edge conditions that would be carefully detailed in some manual even though any given installation would be unlikely to encounter it. (The trick would be finding that given manual, so having your library properly organized helped.)
Worked in SW since early 90s. Back then you didn't have to think much about security and you could push things to the limit.
Also things were not as "professional". When I started out my boss told me to figure something out and I would report back weeks or months later. I had the time to make mistakes and learn from them. Today the young devs are often very micromanaged and have no freedom.
On the other hand we can do a lot of cool stuff today and it's amazing how much info is out there. But I think the 90s were more creative. Stuff like StackOverflow is great. Documentation was MUCH better in the 90s.
Yes, back in the day we used to consider thinking about security as sand in the gears of progress. A slog to get through with little payoff.
Sooooo.... like today?
Despite this being an obvious honeypot to out the old people on HN, I'll bite. There's not much to miss about development practices of yore, except for the way better emphasis on documentation. I find most software developers today can't write worth a damn in their development practice. Requirements, specs, code comments, end user documentation, you name it. Really terrible bar. Perversely enough though, software developers today who blog can write amazing documentation on how to code. Go figure. OTOH, to view development in the 70's to the 00's from an emotional lens, what I miss the most was really living through the hockey stick of Moore's law. Everything was new and exciting until it got replaced 18 months later with a newer and more exciting thing. And back then, you were a sucker to bet against Moore's law. All it took to get a front row seat to this action was to code.
I hope I don't qualify as "old", but I guess I've been programming since the early 00s...I miss the days when LAMP (+ XAMPP for local development) was more-or-less state-of-the-art for a website. PHP, despite its numerous deep flaws, really is a very pleasant way to write frontend code. I think what the web dev world needs is a modernized version of LAMP which:
* Is trivial to bootstrap
* Is totally serverless/scalable
* Can be run effectively for free with low levels of traffic, like an S3 static website
* Uses a non-broken PHP-like templating language (maybe jinja?)
* Uses Postgres instead of MySQL (ideally without infrastructure to manage, a la Aurora Serverless)
* Somehow intelligently figures out how to split between server-side and client-side rendering - i.e., you write templates like PHP and it seamlessly figures out which bits can be AJAX-ified to avoid full page loads
If someone could just, like, make that, web development could be as fun as it was in the days of yore!
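The server-rendered-template half of that wishlist is the easy part to sketch. Here's a toy version in Python, using the standard library's `string.Template` as a stand-in for a real engine like Jinja (the page markup and function names here are all hypothetical, just to show the PHP-like shape):

```python
# Toy server-side rendering in the '90s/PHP style: fill a template on
# the server, ship plain HTML to the browser. string.Template stands in
# for a real engine like Jinja.
from string import Template

PAGE = Template("""\
<html><body>
<h1>Hello, $name!</h1>
<p>You have $count new messages.</p>
</body></html>""")

def render_page(name, count):
    # Substitute values server-side and return finished HTML, exactly
    # like a CGI script or PHP page would.
    return PAGE.substitute(name=name, count=count)

if __name__ == "__main__":
    print(render_page("Ada", 3))
```

The hard part of the wishlist, of course, is the last bullet: deciding automatically which fragments of a template like this can be re-rendered client-side without a full page load.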
Django + Stimulus
I was tempted to write "computers were so slow and lacking in resources that you were forced to write efficient code". Because efficient code is good, right?
But I lived through that era and being forced to write efficient code is a god-damn nightmare. There were so many ideas flying around that you simply couldn't do because the computing power just didn't exist.
Today people revere things like "vi" but when you were forced to use such basic utilities because your human/machine interface was a 300 baud modem, or even a paper teletype, life wasn't so good.
Nostalgia isn't what it used to be.
Turbo Pascal. Lightning fast compiler, snappy IDE, included graphic library, simple and beautiful language.
Aw man, I'm with you on that one. Turbo Pascal was incredible. Delphi 3, 5, and 7 were incredible too!
And every damn licence since then has been a frustration and tragic disappointment.
I was pro-closed source in Turbo Pascal days.... once the licences turned to complete shit I went pure opensource.
Yes!!! You could build nice software in DOS with this.
Main thing I miss is the lesser degree of abstraction in doing anything. If you wanted to build a simple app which just took one input and did a calculation and returned a result, there were multiple languages, but you didn't need a huge number of libraries, infrastructure, etc. to deploy it -- you just built the code, compiled it, and distributed a binary (and source, with a makefile). There was complexity cross-platform (even across unixes), but there was a period of time in the mid-90s/early 00s where "some form of linux or a BSD" was all I had to care about, and it was pretty easy.
Also, being able to trivially modify almost anything I used (because it was largely open source, and simple).
Lots of things were worse, but less abstraction was pretty nice.
Elegant, tight, and efficient code mattered back in the days when a machine with a 20 MHz processor and 8 MB of RAM was considered a high-end workstation. You really had to understand what your hardware, your OS, and your compiler were doing in order to produce decent software.
Nowadays, such esoterica is relegated to the tiny niches of systems and embedded development.
My side project right now is flight control/readout software written in assembler for the C64 which will control a virtual spacecraft inside of my lua programmable 3d/physics libretro emulator front end.
I am really enjoying working inside of Turbo Macro Pro. It is an interesting challenge to make the assembly code efficient and easy to understand.
Late 90's here. There are two things I miss: 1. Going to the book store and looking through the (mostly) O'Reilly books to see what there is to learn or what new technologies are coming out. 2. Having a huge pile of 3.5" floppies (or even CDs) you are using to load the latest version of Visual Studio or whatever and just imagining how much knowledge is contained in those. I think my copy of VC++ was something like 21 floppies.
Just the sense that we were on the cusp of some magical, new, civilization changing thing.
I don't get the same sense that stringing lines of code together can change the world, largely because, where it could, it mostly already has (I am not a believer in the latest wave of AI hype).
What we did used to be magic and was met with gasps and now, it's mostly just expected and complained about if it breaks :/
I miss the wonder and awe and naivety that all of this programming is leading to something better socially/politically or for lifestyle. In many ways it did. It also may have created more problems than it solved.
Also, I miss the joy of learning all the deep features and idiosyncrasies of a computer - its instruction set, I/O routines, timing and interrupt tricks, etc. These days I am focused more on Big Hairy Goals and complex systems. Fun of a different sort.
Are the Good Old days before computers became those things the secretary didn't quite know how to use.... it was downhill all the way from there.
I miss building my own computer out of TTL chips.
I miss "the front panel" where I could single-step my program and read the octal / hex off the front panel lights.
I miss debugging my program with a plastic block and a wire to hand-punch patches into the binary paper tape.
I miss the "user manual" that had circuit diagrams as part of the documentation.
I miss doing "machine vision" on boxes of punched-card images.
I miss sitting "at the console" of an auditorium-sized "machine room" with a sea of DASD, watching the PSW flicker as the program counter changed.
I miss analog computers.
I miss coordinating 24 IBM Selectric consoles all "typing out" classical music (each one tapped out an orchestra instrument). Beethoven's fifth on selectrics....
I miss playing music by holding my radio next to the mainframe while my program was running, adjusting the program so it played a song.
I miss hand-designing a 16x16 multiply chip in MOS.
I miss programming plated-wire memory in binary switches to drive a Unimate robot.
I miss "scoring a complete copy" of the listing of Lisp 1.5 and reading the source code.
I miss running "the Hadoop algorithm" on a room full of punched card equipment (Google didn't invent it).
I miss "real programmers" who could solder. And could replace a failing memory address chip on your core memory board.
But now I'm proving a computer algebra system correct. So it's all still good.
The TTL chip computer was fun. The hardest part was fixing bugs in the wire-wrapping. You could have a wire buried 3 wraps deep that needed to be changed... oh, wait. You probably don't know about wire-wrapping. The memory wire-wrap was symmetric and pretty but the CPU/ALU was a nightmare.
> I miss building my own computer out of TTL chips.
That sounds like an interesting project! Can you expand on it? When was this? How many TTL chips did it take? What other components did you use? What did you do with the end result?
You should check out Ben Eater's 8-bit computer from scratch.
There's even a kit for you to build one.
Check out: https://eater.net/8bit/
There's an entire series on YT of him building it, very informative!
> Sorry for calling you old.
Whippersnappers these days are so gosh darn polite.
Writing C code on a SPARCstation and Green Hills compiler/debugger as an undergrad in 1989, I got used to step-forward, and step-back, in a GUI environment.
I had no idea that I'd never see that magic "step-back" button again, in C, C++, MFC, Java, Scala, or a litany of scripting languages and development environments.
Other people have touched on it, but you could be proficient in literally all the programming languages back then. There were fewer than 20. The later explosion of languages and techniques in common use can be overwhelming, but has stretched the spectrum available to our craft.
Oh, also: personal computers in the 80s booted into a BASIC prompt. It was a terrible language to cut your teeth on, but it was there, right in front of you, and there were magazines and books (at the library and the local book store, on dead trees) that taught fun things (like how to animate sprites, or set half your screen to graphics mode, or how to poke the MIDI controller into croaking out something beepy).
Or "poke" into the interpreter's RAM region to do crazy things like shifting the string store to overlay the graphics RAM, clearing the screen faster than any other technique by creating new blank strings, and then poking back the original values you'd peek'd out of it.
Or spending afternoons typing in some game that was printed on the back of a magazine in assembly, and then poke 32768 and poof, game. Magic when you're 9 years old.
SYS 32768, not POKE 32768
As an aside, reading and understanding the entire disassembled "kernal" and BASIC rom for the C-64. All of it taking up less space than just about anything on just about any site nowadays - in that respect we took a wrong turn somewhere IMnsHO. Yes, storage is cheap and bandwidth abundant but the same is true for clean water yet nobody drinks 50 litres of the stuff per day 'just because it is cheap and abundant'...
I got started with programming as a teen in the 00’s. Tried to start with C++, but couldn’t make any sense of it. But then I tried BASIC running on a micro-controller doing simple stuff like blinking an LED. I think it was the perfect intro to programming—no OS to care about, simple language running on bare hardware running at 20MHz / 2KiB RAM. Sure it’s not a great language to actually do stuff in. But for learning the basics, it’s perfect.
Oh, also: I made fun of my friend who got a 1200 baud modem.
I mean, obviously, why do you need more than 300? You can't read text faster than 300 baud.
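The quip roughly checks out arithmetically. Assuming 8-N-1 framing (the usual modem setting), each character costs 10 bits on the wire, so a quick back-of-the-envelope calculation (function names mine) puts 300 baud right around the upper end of human reading speed:

```python
# Back-of-the-envelope check of "you can't read faster than 300 baud",
# assuming 8-N-1 framing: 8 data bits + 1 start + 1 stop = 10 bits/char.
def chars_per_second(baud, bits_per_char=10):
    return baud / bits_per_char

def words_per_minute(baud, chars_per_word=5):
    # 5 characters per "word" is the usual typing-speed convention.
    return chars_per_second(baud) * 60 / chars_per_word

if __name__ == "__main__":
    print(words_per_minute(300))   # 360.0 -- near the limit of fast reading
    print(words_per_minute(1200))  # 1440.0 -- far beyond it
```

At ~360 words per minute, a 300 baud link really does saturate a fast reader; the 1200 baud upgrade only pays off once you're skimming or downloading.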
Oh, the day I realized there was more interesting stuff appearing on FidoNet faster than I could ever read it. That I could read all day and still get further behind...
Now I'm not so sure... sometimes I think the rate of interesting new stuff to _read_ is slowing down again and I may yet catch up.
An office/room with only 1 or 2 developers (versus "open office spaces" now.)
My experience is that it was much better for concentration, creativity and productivity...
I miss job interviews (from both sides!) where a few people read the candidates' resumes, asked them deep and intelligent questions about what they had worked on in the past, and then made a hiring decision, without further quizzes, take-home projects, or probationary work-to-hire contracts.
All my recent job interviews have been like that. Although they've all been for software and data analysis jobs at non-software engineering firms. In my experience companies used to interviewing not-software engineers have much more sane interviewing practices since they know that a chemical or structural engineer with 10 years of experience won't put up with that sort of thing.
I agree. My first interviews were like "You have no clue about C but you are good at FORTRAN and you are smart. No problem. You'll pick it up in a second. Read this book before you start.".
Ah, the good old "read this pile of books, call me when you're done" style of onboarding.
For my second programming job my manager handed me a pile of books and AFAICT that was the last thing he did before he resigned.
I miss dump reading on reams of green-bar paper.
The dump reading was done with a cardboard pamphlet that listed the machine instructions, arguments, address offsets.
You'd find the hex abend on the greenbar, mark it up with a pencil, and use the machine instructions to figure out what went wrong (and the record that caused it).
In my business (banking) this was done sometime around 2 a.m. after you got called back to work by the night operators.
Ah, the good old days.
My systems were core to business operations (manufacturing shop-order control) and always ran between 2-4 AM.
I quickly hardened the shit out of our interfaces so that we (a) wouldn't break and (b) had automatic workarounds for when feeding systems blew up. Oh, how my project leader loved me.
Want nostalgia? Run this happy command...
enscript --landscape --line-numbers --media=A4 --font=Courier7 --highlight-bars --verbose --highlight=c myFileName.c
Yes! Abend, lol.
The ability to know a lot about a lot. And I'm not that old yet - I started my journey in the early 90s.
I hear you. There was a time when I could pretty much describe everything that happened from the time you pressed a key to the time a file was displayed on your screen: exception/interrupt handlers, keyboard driver, network stack, storage stack, context switches, page faults, etc. And that was a pretty good chunk of what I needed to know in order to do many useful things. Now some of those pieces aren't even relevant any more, the others have all become far more complex, and new ones keep being added at an ever increasing pace. Nobody but nobody can know such a high percentage of what's going on as many of us once did. I'm not saying it was better, but I do miss it.
Like, because there are so many more unknowable levels of abstraction now? Hyper-specialization?
Among other things, yes.
For example: On an IBM PC, running DOS. You could know all there was to know about the BIOS, hardware ISRs, DOS ISRs, DMA, how a file system works, and maybe a bit about the graphics system, and you could make the machine do anything you wanted.
Today, if my kid wanted to learn all those things, I wouldn't even know where to tell him to start. You simply can't get that low (easily) anymore, or know that much about numerous subsystems. And the higher level stuff, like you say, is buried in layers of abstraction.
Of course, now we have an unprecedented access to cheap, powerful microcontrollers and SBCs, which make up for it to some degree - but it's still harder to translate that knowledge into something of perceived usefulness on a 'proper' computer.
It's all still down there somewhere....
"We build our computers the way we build our cities -- over time, without a plan, on top of ruins." Ellen Ullman
Yeah, I see what you mean.
You know, it’s pretty interesting that IBM was painted as the devil, but that they made this standard, interoperable computer system that you could touch at such a low level. Now it’s all Windows 10 and MacBooks and Linux userland, and even if it were possible, good luck finding documentation for any of that.
—Sent from my iPad.
I know it's kind of wrong, but I miss being part of a smaller community. I wasn't even one of the really early pioneers, but when I started computing was still small enough that I could find common ground with just about any other programmer I met. Sure we had our specialization and even our disagreements, but there was probably some overlap between what we each knew. We also got the chance to work on some genuinely new things. The early days of SMP or clustering were pretty cool. Nobody really knew all the answers or thought they did. We were all experimenting. Now there are so many programmers doing so many different things that we might have no shared knowledge, and so much of what most people do is reimplementing or papering over the flaws in some previous piece of software, layer upon layer like a giant midden. I'm exaggerating a bit, but it really does feel different in a way that makes me sad.
Since this is phrased as "old guys" I'll also add that I miss working with more women. Remember when I mentioned the early days of SMP? In those days we were still working closely with the people who were figuring out how to physically connect the pieces, keep caches coherent, what kinds of lock instructions (let alone higher-level patterns) would be useful, etc. Hard core stuff, and maybe a third of the people I was working with were women. That's still not a majority, but it's down to less than half that in most teams I've been on for the past decade. Things have gotten worse, and that really makes me sad. In so many ways, I feel this industry has been going backward.
There was a lot less stuff, in general. Now you have libraries, protocols, umpteen different languages, paradigms, design patterns, IDEs, runtimes, platform after platform, etc.
I remember all I used to need was a C compiler and a task and basically alone I would be able to complete it with the standard stuff that came with the compiler.
No DRM. No walled gardens. You could write code on your own computer without needing Apple's permission to code sign it.
It wasn't always so great across the board. Back in the late 80s or 90s I could write and distribute my own code for free. OTOH, for my day job developing network code on larger machines, we had to pay hefty licensing fees to get a decent compiler. Or debugger. Or libraries to do the most basic kinds of things. Corporate control hadn't come down to the smaller machines, but open source hadn't grown up to the larger ones either.
I miss being able to buy a complete 3rd party component for $400 that came with great docs and tech support.
Yeah, there are some excellent, well-documented OSS libraries out there, but they are the minority, and there's no one who feels obligated to help you if you need a small improvement or a moderate bug fixed, because no one took money from you.
Imho, the trade-off (free, but self-serve support) isn't worth it for professional development.
I miss building everything from scratch, in assembly, because I could do it better than the libraries that shipped with the system.
I can still do it better today, but the stack is so deep now, I just don’t have the time.
Would I go back if I could? Nah.
Hacking is a blast, but never forget it is to achieve a greater purpose. A greater purpose for our software’s users, for ourselves, and for our subroutines.
Old hackers never die— we merely gosub without return.
There were very few ready-made frameworks and libraries to get things done, so it was fun writing those in-house libraries that ran your business. Of course, purely from a business point of view, this is not ideal and is extremely wasteful. At the same time, it really helped me understand the systems that we use. Maybe that's why people survived without Stack Overflow :)
In comparison, modern software development is almost all about composing or stitching together a bunch of different services. There are other complexities around orchestration and the breadth of the systems involved, but one very rarely goes deep into a particular technical problem.
After a really long time, I'm getting a kick out of working on a highly technical problem: an open source search engine from scratch (https://github.com/typesense/typesense). It has been deeply rewarding and definitely something that's missing in a typical web oriented software development career today.
I miss developing in the 00s when I wasn’t considered old yet.
Agree. I’m around students often. The way they treat me is pretty alienating. I liked the Sir Davos quote from GoT “the young treat us old people with respect because we’re an uncomfortable reminder of the inevitable.”
Moore's Law was pretty amazing while it was ruling the landscape.
The transition from monochrome monitors to color.
The explosion of new peripherals that did new things, not just the same thing slightly better.
The first (consumer) hard disks.
For that matter, a computer that a person could own, rather than a corporation, was an absolutely amazing shift. If you had an idea, you could try to make it a reality - not a reality for a company, but a reality that might become your company. You could do that with just a bit of money, in your spare time (it would eat all your spare time, though, and some more besides.)
I miss the 68000 series of chips. They were great to work on. (Others have said that the 32000 were even better, but I never worked on them.)
I miss SGI workstations. For their day, they were awesome.
> Others have said that the 32000 were even better, but I never worked on them.
I worked on both. The NS32K had a more complete instruction set, but I wouldn't say that made it more fun. Then again, maybe I'm a bit weird. The chip I most enjoyed working on was the MC88K, which everyone else hated because of its exposed pipeline. Part of me would like to tinker around with a 6809 or Z80 some day, not because I'd expect anything useful to come out of it but just because fitting within those constraints is a fun kind of puzzle for me.
A real, end-to-end development environment, with the ability to do the database, logic, forms and reports just one second after install. No package managers required!
My current dream is to build a spiritual successor (http://tablam.org), and in doing so I've started to appreciate how hard and how awesome it was. Also, the current me can't shake off the new ways (so it has already diverged from Fox in a lot of ways), and I have to rely on imprecise memories of the past. But still, I think a dBase-like environment is surely missed...
Perhaps the largest change over the past forty years was the transition from software that shipped once every six months and had to last for years in production, to something that can ship every few minutes with a lifetime of hours to a week.
During the era of shipped software (long cycle times) a hardware / software co-design model worked well to work out the customer requirements, dependencies, documentation, training materials, marketing, sales sheets, software, installation tools, and all manner of support modifications. Some of that cross-team communications takes more time and coordination than may be available in a two week sprint.
There were also the constraints of the release schedule burnt into everyones brain. So many things depended upon your shipping date that you dared not miss it. Sure, agile / scrum / whatever enables you to ship something a bit later, but there was nothing like the pressure cooker of everyone working as a team to make their delivery date. Requirements, in my rosy recollection, became very clear - things were in or out, not maybe.
Moving from large shipped software to web applications in 1999 offered any number of advantages, but it was quite difficult to keep the non-coders up to date with features and capabilities. Understanding doesn't always move at the same speed for all people.
What I'm going to say next may not be everyone's experience, but the two week sprint has started to seem like a convenient way to set and miss objectives; i.e., we can put it into the next sprint. Sure, I understand that planning is a black art in some cases, but always missing your commitments and making the same statement indicates, to me, that there's a problem with the absorption of the agile development model.
Product development and execution is a very hard process for all involved. Brooks had it right: there are no silver bullets, yet the web seems to perpetuate the belief that such things actually do exist.
I miss the TOPS-20 COMND JSYS sometimes. I don't miss VAX/VMS. SunOS was pretty darned scrappy at the beginning. Usenet prior to the 1994 Apocalypse On Line (AOL). I miss Fujitsu mechanical keyboards (so much noisy awesomeness). I miss having an office with two or three people; we had so much fun and worked so darned hard together. We remain terrific friends 30+ years later.
The simplicity of things. If you needed to write a 20-line shell script to solve a problem, that was fine. Today we argue about whether tools are production-ready. We take the simplest of problems and magnify it 20x. It will need a framework with 100% code coverage and 5,000 GitHub stars, blah, blah, blah. All of a sudden you end up with 10k lines of code, CI, CD, and a team of 10 people for something one person could have rolled out in a day with a shell script.
From the early '90s.
I miss the simplicity of a 320x200x256 screen. Want to access the pixel at (0, 0)? $A000:0000.
I miss FIDONet.
I miss the big GOPHER network.
I miss the variety of monthly ANSi packs.
I miss ACiD.
I miss the wonder and amazement of the demo scene.
I miss the community feel.
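For anyone who never touched mode 13h, the $A000:0000 pixel trick mentioned above is pure index arithmetic: the screen is a flat 64,000-byte array, and pixel (x, y) lives at offset y*320 + x. A minimal sketch in C, modeling the framebuffer as an ordinary array so the math can be checked anywhere (names like `vga_offset` are mine, not any API):

```c
#include <assert.h>
#include <stddef.h>

/* In VGA mode 13h (320x200, 256 colors) the framebuffer is a flat
   64,000-byte region at segment 0xA000. Plotting a pixel is just an
   index computation: no driver, no API. This array stands in for
   the real memory at 0xA000:0000. */
static unsigned char framebuffer[320 * 200];

/* Byte offset of pixel (x, y) within the framebuffer. */
size_t vga_offset(int x, int y) {
    return (size_t)y * 320 + (size_t)x;
}

void put_pixel(int x, int y, unsigned char color) {
    framebuffer[vga_offset(x, y)] = color;
}
```

On real DOS hardware the only difference was that the array's base address was the hardware framebuffer itself.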
While not exactly as it was, of course, FTN-style networks including FidoNet are still going and active, Gopher is still a thing, ANSI packs are still being released very regularly (the ACiD folks are mostly Blocktronics at this point), demo scenes are still active, and so on.
Hop on a Telnet/SSH BBS and check it out!
Learning web development via view-source
I'm voting this one up because I still hit view source out of habit knowing it will show me nothing. Gone are even the cute messages like "Hey, if you're looking here, we'd like to hire you!"
In the 1990s nobody thought I was crazy for writing CGI scripts in C.
(I still do it, but now people think I'm crazy.)
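A CGI program in C really is this small: read `QUERY_STRING` from the environment, parse out a key, print headers and a body to stdout. A sketch under stated assumptions (the helper `get_param` and its signature are mine; there is no standard CGI parsing library in C, which was half the fun):

```c
#include <assert.h>
#include <stdio.h>
#include <string.h>

/* Copy the value of `key` from a query string like "name=world&lang=c"
   into out (truncating at outsize-1 bytes). Returns 1 if found, 0 if not.
   URL-decoding (%20 and friends) is deliberately omitted to keep the
   sketch short. */
int get_param(const char *query, const char *key, char *out, size_t outsize) {
    size_t klen = strlen(key);
    const char *p = query;
    while (p && *p) {
        if (strncmp(p, key, klen) == 0 && p[klen] == '=') {
            const char *v = p + klen + 1;
            size_t i = 0;
            while (v[i] && v[i] != '&' && i + 1 < outsize) {
                out[i] = v[i];
                i++;
            }
            out[i] = '\0';
            return 1;
        }
        p = strchr(p, '&');   /* skip to the next key=value pair */
        if (p) p++;
    }
    return 0;
}

/* A complete response: CGI is just headers, a blank line, then the body. */
void respond(const char *query) {
    char name[64];
    printf("Content-Type: text/plain\r\n\r\n");
    if (get_param(query, "name", name, sizeof name))
        printf("Hello, %s!\n", name);
    else
        printf("Hello, world!\n");
}
```

A real program would call `respond(getenv("QUERY_STRING"))` from `main` and be dropped into `cgi-bin/`; the web server handles the rest.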
Or writing CGI scripts in Perl that did amazing things, before I realized I had to maintain the aforementioned Perl scripts.
Turned to shit pretty rapidly, but for a day or three I was wondrously happy.
You’re not crazy, they’re all crazy!
70's and 80's here. Big fast printers. I know, who does massive code listings or memory dumps anymore? I feel that if I had one, I'd be using it all of the time, though.
When I started in mainframe programming, our organization had a license for a core dump formatting package that would do a lot of cross referencing for you and even break down COBOL into assembler on a line by line basis for you. Well thought out little things like that made a big difference.
> massive code listings or memory dumps anymore?
I did not expect this one! How fast was "fast"? 55ppm seems... reasonable? for ~$1000 CAD https://store.hp.com/CanadaStore/Merch/Product.aspx?id=K0Q15...
Did a course on a newfangled laser printer that printed _all_ the payslips for _all_ government employees of the time/place.
It took a fair number of pages to get up to speed and a largish chunk of a box of paper just to stop.
Code bases that were small enough that printing it out was a realistic option.
I guess the 2000s count, especially since I was working with 90s era technology when I learned to program. I miss QBasic. I don't think programming has been so much fun since. One of our projects was to make an animated Christmas card, with carols playing over the built-in sound. Later on, a friend and I worked on making a really stuttery, buggy version of Slime Volleyball. Everything you needed was built in, and the in-IDE documentation was great.
As for web development in the 90s I'd be remiss to not mention live code editing in production ...
Just kidding, of course. It's a terribly bad practice. Don't do it.
However, there was something about the simplicity that came with it (in case everything went smoothly, that is) as well as the ability to make minor changes without having to go through a deployment pipeline.
For nostalgia purposes only: http://cowboyprogramming.com/2010/06/03/1995-programming-on-...
Things were much simpler back then. But simple didn't mean easy. It meant you had to do every little thing manually, from scratch.
Goofing off while waiting for a compile to finish.
Getting analog lines in my office and/or the lab so I could rlogin around and fire off builds and tests from home so I could get an extra iteration or two in with minimal lag time.
Solving problems together instead of finding the answer on Stack Overflow.
I mean, yes. But I would have killed for Stack Overflow in the 80s.
Actually had an office with a door you could close and a window you could open to actually think in.
And then walk down the corridor to the terminal room to turn thoughts into programs and/or ask the other guys or gals.
Strangely enough there were more ladies around in those days.
Microsoft ran a radio recruitment ad campaign in SV in the late 80s with that exact theme! The narrator exclaimed how hirees would get "... an OFFICE! With a DOOR! That you can CLOSE! So you can THINK!"
Late 90s, early 2000s... CSS Zen Garden, the amazing community of true forums/messageboards, IRC, learning from O'Reilly/Sams 24 hour/etc. books... so many memories...
Zen Garden still exists!
I miss some of the old graphics solutions. Drawing something on the screen took a lot more work but it felt more real to me.
The prevalence of applications over webpages. Don't get me wrong the web has massive benefits in terms of standardization, distribution, and security but I miss the power and flexibility of traditional applications. And yes, I know they still exist but I'd say they're not the primary delivery mechanism of most user-facing software today.
I miss how simple programs were in the 80s and 90s. I mean, they were a lot of work because there were no powerful libraries and you had to do almost everything from scratch. And then wait 30 minutes for the compiler each time you did a build. And it was way too easy to lose work before there were usable version-control systems (and even with the early ones, like CVS, it was easy to shoot yourself in the foot).
Stuff that was a major challenge back then, like rendering a polygon or keeping the state of an HTML form, is trivial to do today. I wish I would be paid to write early 90s programs with today's tools :)
There was little or no evangelism for a particular language/framework/database/stack. People just trusted you to pick the right tools to get the project done, without feeling the need to insist you look at the latest flavour of the month thing out there.
But mostly, I miss the fact that 20+ years ago, programming was still considered a professional, white collar role that engendered respect from the entire organisation. Nowadays, I feel that most coders are lumped together with "support people" and are considered nothing more than replaceable factory line blue collar workers.
That’s a very different experience than I remember. Macho culture dominated: assembler was the king, then C, then C++. LISP or Smalltalk or COBOL people hung out on their own and didn’t talk to each other. Java was a joke usurper at first. Perl, python, ruby were toys. Etc.
IOW language advocacy and flavor of the month were rampant in the late 80s onwards from my POV.
These days at least there is enough community for all of the major languages and even the minor ones.
Also I find programmers get a lot more respect now than they did in the 80s and 90s. Programmers can make $300k+ USD at a top Bay Area company today, and starting salaries are well over $100k. Often they talk directly to users - no handlers or analysts in between to shield people from the “strange programmers”. Attitudes have shifted. Though given your experience, I guess not uniformly.
I started in the 90's and I realized early on that non-tech people in the company viewed us as cogs.
Edit: Well, to be precise, not XSLT per se -- but I'm missing a similar generic transformation DSL for structured data, ideally able to deal with any form of structured data like JSON, XML, CSV etc...
Not having so many plug-n-play abstractions that you’re so strongly encouraged to use to “not re-invent the wheel”. The problem in my mind is that the abstractions leak all over the place, but we stop caring.
Pushing compute off to the browser because it's cheaper, never mind the cost to non-powerhouse phones.
Avoiding putting any thought into optimization because compute and memory are cheap. Never mind that you'll have to do a complete rewrite when a few dozen 600+ ms microservice responses result in 20-second page loads.
Store all the things! You might need them someday, after all. GDPR what?
I work with developers who think that DB migrations are “of the devil”. As a result, they have pivoted to use the RDBMS as a rudimentary k/v store and create the relational data structures in memory. All they need to do is pull a few dozen GB of rows from the DB with every container restart.
So, yeah. I miss having fewer abstractions; having more constraints. The software seems somehow nicer through the CRT-colored lenses.
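The microservice arithmetic above is easy to sanity-check: a few dozen dependent 600 ms calls issued one after another really do add up to a 20-second page, while the same calls fanned out concurrently cost roughly one round trip. A toy model (the numbers and function names are illustrative only):

```c
#include <assert.h>

/* n dependent calls issued sequentially pay n * latency. */
int sequential_ms(int n_calls, int latency_ms) {
    return n_calls * latency_ms;
}

/* Issued concurrently, they pay roughly one latency in the ideal
   case (ignoring fan-in/aggregation cost and server contention). */
int parallel_ms(int n_calls, int latency_ms) {
    (void)n_calls;
    return latency_ms;
}
```

34 sequential calls at 600 ms each is 20.4 seconds; the same 34 calls overlapped would be about 0.6 seconds, which is why call graphs, not individual endpoints, decide page speed.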
> Not having so many plug-n-play abstractions that you’re so strongly encouraged to use to “not re-invent the wheel”.
Inventing the wheel, rather than re-inventing it. There was a lot more "doing something for the first time" and less "doing something for the Nth time, slightly differently".
(Or so it felt to me. But there was still a bunch of "doing what the mainframe people did a decade ago, but doing it badly". Still, it felt different.)
I miss working in my own office
Burning and erasing UV EPROMs.
Wow, do I ever not miss that!
Not much, so much more is possible. But if Stack Overflow goes out of business, I quit. I do not miss trying to find obscure stupid things myself. The Internet is very good at making my time more valuable.
working in an office with a door
I miss the lack of distractions like FB, TW, WhatsApp, et al. I miss the time when people used to think instead of just searching for the exact answer ;) Don't misunderstand me, I'm not against it. I'm a 40-year-old PHP/JS scripter who enjoys copy-pasting almost everything, but when I'm not able to find the straightest answer, I start the thinking machine. I'll almost never ask that question on Facebook, Hacker News, Twitter, or Stack Overflow; I'd rather think than wait for anyone else to answer my question. So, I miss the think-first era.
I don't miss the assumption that all developers are "guys".
6502 machine language on the Apple II+ really helped me get a mental picture of how computers work. Call -151 at the prompt, then you're in. A manual listed what most ROM addresses did: move the audio speaker cone once by reading or writing anything to C030 (I think, or was it C080?); in memory, of course, that's 30 C0 byte order. 85 to store the accumulator? Geez, it's been a long time. A 1 MHz 8-bit CPU, X and Y registers, an accumulator, only add and subtract instructions apart from shifting bytes left and right. No audio routines.
It was all very inefficient but what a way to learn!
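The entire programming model the parent describes fits in a few lines. Here is a toy sketch in C of just two real 6502 encodings (LDA immediate is 0xA9, STA zero-page is 0x85; the interpreter itself is mine, modeling only enough to show why "85 stores the accumulator" sticks in memory decades later):

```c
#include <assert.h>
#include <stdint.h>

static uint8_t mem[256];   /* zero page: the 6502's fast scratch memory */
static uint8_t acc;        /* the accumulator, the only arithmetic register */

/* Execute a byte stream containing only two opcodes:
   0xA9 nn : LDA #nn  (load immediate value into the accumulator)
   0x85 zp : STA zp   (store the accumulator into zero-page address zp)
   Anything else halts, like falling off the end of a monitor listing. */
void run(const uint8_t *prog, int len) {
    int pc = 0;
    while (pc + 1 < len) {
        switch (prog[pc]) {
        case 0xA9: acc = prog[pc + 1];        pc += 2; break;
        case 0x85: mem[prog[pc + 1]] = acc;   pc += 2; break;
        default:   return;
        }
    }
}
```

Typing `A9 42 85 10` into the Apple monitor did exactly what this does: load 0x42, store it at zero-page address 0x10. That directness is what made the machine knowable.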
It was probably like the 1930s and 1940s with radio: people really knew how it worked at the nitty-gritty level and could friggin' make an impromptu radio out of debris from a crashed army jeep (probably on YouTube now).
That’s civilization: layer upon layer upon layer of knowledge and advancements until we all roast the planet from our greed, driven by the 4% (the sociopaths).
Just because I did development in the 2000s does not make me old.
I miss when default username/password combos like admin/admin and root/“” would let you log in to a solid 5% of internet and Telenet (RIP) connected hosts.
And most .rhosts or /etc/hosts.equiv just had a single lonely '+' in them and nothing went wrong...
I had great times with .asp and ms access db ^^
No leetcode puzzles in coding interviews!
Hi, I posted this about mid 90’s development in another thread:
""" Before about, 1996-- (I'm estimating)-- before network connectivity was the norm-- I understood how every processor in every system I dealt with worked. Two things happened in the mid to late 90s.
First: an explosion of silicon. The days of understanding how the system worked, in its entirety, were over. I had a copy of the "80386 systems designers guide", and that, Michael Abrash's guide to VGA video cards (I can't remember the exact name), and PC Interrupts were all that were needed to master computer architecture. If you were super fancy you understood the Pentium math extensions (whose names I cannot recall), which let you do a crossbar in a few cycles, and you understood how common chips like the 16550 worked, which is an addendum to PC Interrupts if I recall.
Second, and this is the key thing, technology started working AGAINST us. As complexity exploded, so did network connectivity. We had this era in which Operating Systems complexity exploded (win95), silicon complexity exploded, and connectivity exploded. The thing about connectivity is this-- that's when our computers went from isolated things to these things that are always online-- they started working against us. They could now talk to other computers, other people, and that changed computing fundamentally.
Computers went from being our helpmates to being our masters. This is what I lament the most. I don't miss bit-banging or assembly programming (well, a little). I fudging love Ruby/Python, as opposed to C++ being considered "high level." I love that I can buy a fantastic computer for $35 (an RPi). RPis are so cheap I employ half a dozen just to run my 3D printers (yes, I have a problem). But I so do not miss being scared of my computer, or being scared of the network. I miss the sense of wonder at what a computer could be or do. You have to understand: technology as it exists now is beyond my wildest dreams. I watched Star Trek TNG as a child, and the devices we have now have legitimately exceeded my wildest dreams. The simplest cell phone now has as much computational power as every computer combined at the time I graduated high school.
But, I do miss simplicity. So much so, I've been considering writing an NES or Sega game; I'm precisely 40 years old, and I've finally come to understand that art and constraints are intimately connected. That they press on each other, and neither is possible without each other.
I have so few restraints on modern systems that I... am constrained. I miss the constraints of my earlier years that were, in fact, my freedom. I miss software that shipped and worked on the first day. I, like everyone, miss my childhood. Because the universe is an explosion of complexity-- and when we look back, we will always feel like things were simpler-- because they god damned actually were.
Also, I am old.
I'd love to have a cubicle, honestly.
Some do, some don’t. It’s the same with every period and every group born in it. I grew up in the 1940s and can think of some things from that time that I miss, but I wouldn’t go back for quids. Let’s face it, we’ve never had it so good. These are the good old days.
A famous Australian (well, he was born in England but spent his life here) named Paddy Pallin nutshelled it. “The best place to be is here. The best time to be here is now.”
Everything was shittier than it is now, but also much simpler.