18-year-old personal website, built with Frontpage and still updated (fmboschetto.it)
520 points by fbn79 on Feb 14, 2020 | 472 comments



These simple sites show us something profound: if you want something to last, don't base it on something that won't last. There are some technologies that will never allow somebody to build a site and leave it unchanged for 20 or 25 years. ColdFusion comes to mind; almost nobody hosts it anymore, for one. Can you imagine running the same WordPress version for 25 years? The version of PHP it runs on will be EOL long before.

I guess what I'm saying is that if you want to build a site to last 25 years without numerous redesigns, build a static HTML page.

Looks like Web 1.0 got something right after all :)


While this website still works fine, the actual HTML that Frontpage generated isn't exactly easy to maintain if Frontpage stops working for whatever reason.

The author of this website is basically stuck using whatever version of Frontpage supports the markup of his website. And I bet there have been plenty of people who used <some other WYSIWYG webpage editor> who are no longer able to maintain their website because their editor no longer runs on their system.


Microsoft has been fairly good at allowing older binaries to run on newer systems.

Apple is pretty annoying in this regard. There’s a lot of software that stops working on OS versions maybe only 5 years newer.

A lot of software doesn’t need to change, to be honest. Microsoft Word, for example. Word processing: you sit down and type stuff, maybe change the font once or twice. I guess the collaborative features are nice, being able to edit the same document with others.

It would be fun to use an older machine and see how productive you can be with the old software too!


Been hit with that: I needed to run Chromium v49 to be able to remote-debug some TVs with old Opera TV SDKs. The version I had stopped working, and several versions that I tried crashed when using the Chromium devtools. I ended up having to use a Windows virtual machine.


I wish I could find the video on YouTube again: a demonstration of a collaborative text editor from 1960-ish. I've been looking for it a number of times just this year.


From the Mother of All Demos (1968): https://www.youtube.com/watch?v=yJDv-zdhzMY&feature=youtu.be...

I suppose you mean this video?



Thank you, that was nicely indexed even!


Please post back if you find it, I'd be curious to see.


As posted by Fice above:

https://www.dougengelbart.org/content/view/374/

It's a very interesting show: how they solved the displays, with commercial cameras filming the CRTs in the lab and sending the picture to the user's screen; how the mouse worked; the five-button keyset; and so on.


You haven’t tried to run 16 bit software on 64 bit Windows have you?


Not to be snarky, but if there is a real need to do this, it's pretty trivial with VirtualBox or DOSBox.

Those applications from 20 years ago running in emulators will work far better in 20 more years than apps from today that stop working due to remote service dependencies designed to force vendor lock-in.

It is endlessly amusing to me that the more tightly integrated the cloud services get to conventional computing tasks, the more likely we will end up with Vernor Vinge style programmer archaeologists from A Deepness in the Sky...


When I worked for a NASA contractor doing sounding rocket telemetry, the main telemetry stack programming software was a Turbo C program from 1987-1990 (TDP502.exe, on the odd chance that any of the maybe 50 other people on the planet who have ever used it see this). It works just fine in DOSBox, at least to create files. We still needed an actual older PC with an ISA slot to handle the hardware that TDP knew how to control, but for configuration tasks, Windows + DOSBox + a USB 3.5" floppy drive = I could do things on an actual modern system.

So yeah, you're right, emulation saves the day in many cases. And I felt like a programmer-archaeologist using DOS to launch something into space in the 2010s...


I can't upvote this enough. Vinge really captured this. As a programmer you can clearly see it happening right now.

I shudder to think of the massive house of cards we will have in 50 years.


Can you imagine how massive the field of software archeology will actually need to be to capture an understanding of the human experience of software development in the “distant past”?

Take any given product or system: what percentage of the software involved in implementing the system was written x years in the past? What’s gonna happen to that distribution in 10 or 100 years?

I wonder how inevitable it is that this percentage of ancient software in virtually every system will just keep growing and growing over time, until the systems of the future are tiny layers built on top of 1000-year-old, impenetrable old-growth forests which might as well have been written by aliens, for all their understandability...


So if emulation or a VM is your go-to, how is that any different from what you can do with older versions of MacOS?


Virtualizing Windows isn't very hard, even back to something like Windows 95.

On the other hand, only OSX 10.7+ is really easy to run in a VM; 10.5 and 10.6 only work for the server editions, and anything before 10.5 isn't really compatible with virtualization at all. That's 2007, so OSX lets you virtualize back about 13 years, while with Windows you can go back almost 30 years. People even have Win 3.1 running in VMware.

This is probably because there is no PowerPC virtualization software, but the upshot is that if you need to run OSX software from before 2007, you're basically out of luck.

You can also virtualize Windows from just about any OS you can imagine (Mac, Linux, Windows, etc.), while OSX virtualization has a hard requirement of running on Mac hardware.


There seems to be a misconception that you can only run 10.5 and later in a VM, but you can actually run OSX 10.4 Tiger fairly easily, and the non-server version at that. [1]

I was able to import almost everything from my old PPC computers. It's not completely virtualized, because it uses Rosetta and cannot run Classic OS apps, but it is still extremely useful, and way faster than my PPC computers ever were.

[1]: https://github.com/ranma42/TigerOnVBox


>OSX virtualization has a hard requirement for running on Mac hardware.

If you aren't a stickler for Apple's terms of service (if you're doing this for business purposes, I suggest you should be), you can use a tool called macOS Unlocker to patch VMware Workstation to run macOS VMs. Runs great, though all VMware products can only render display output for macOS in software mode.


Run a shady binary with no identifiable author or website, as administrator, so it can modify VMware's binaries? A rather... curious approach, but for some reason common on Windows among e.g. gamers.

I've run MacOS in VirtualBox, IIRC, without shady patches, though it probably was on Linux.


I have literally never heard of a "gamer" running shady binaries with administrator privilege in my entire life. Maybe you're thinking of the hacker culture of the 80s, but gamers today use launchers to manage downloading, installation and setup of software. Maybe you're thinking of software pirates using scene software as keygens or DRM-defeaters. I suppose that's common among kids who don't buy things (but I don't believe those tools run as admin).

It may be more common in Windows, but I would challenge that: since Windows is basically free and runs on anything from a Raspberry Pi up, the vast majority of "hacky" stuff happens in Windows and Linux. Mac users buy very, very expensive hardware to do very specific tasks, and "hacking around" is often not a good enough justification for the most expensive personal computers money can buy.

I would also suggest that it in the Linux world where running random binaries as root is most common. Found some random repo that claims it's a fork of a good one with a bug fix? Build it and run it!


A quick Google search for running PPC Mac software under emulation turns up:

https://www.thefreecountry.com/emulators/macintosh.shtml#pow...

For the most part, yes: if you want to run Mac software, you need to own a Mac.

As far as going back 30 years: now you’re in the Classic Mac era. There are plenty of cross platform emulators that run Mac software that old.

If you want to go back 40 years, Apple // emulators are a dime a dozen.


> There are plenty of cross platform emulators that run Mac software that old.

What, Basilisk II and SheepShaver? PCE and Mini vMac if you want a Mac Plus. For a large array of apps, only one of these options will actually work.


Thanks for the flashback! I remember when PearPC was announced!

I’d been wanting a Mac for several years, but had little understanding of how it worked. PearPC let me try it out and get a glimpse of Mac OS X.


That's fair, but this was about virtualization, not emulation. Similar but different, though that's certainly a solution too.


If the current version of OS X were backwards compatible with 10.0-10.4, it would still need both a PPC emulator and a 68K emulator, since OS 9 still had 68K code.


10.4 x86, the origins of Hackintosh, will be the first version that's actually practical to virtualise.


PearPC can run MacOS X 10.1.


You act as if no x86 OS X software that was written before 10.7 will run on 10.7.


You're drawing an equivalence between 5 and 25 year old software?

Microsoft Windows 10 is able to run software that predates all of Apple's supported platforms.


There's always a person who is happy to explain how Apple bests any competitor you could mention at any metric you could imagine.


So if Apple kept “25 years” of backwards compatibility, should they have been better off bundling a 68K and PPC emulator? Why stop there? Should they have kept compatibility with the Apple //e and also bundled a 6502 emulator?

Someone else was complaining that they didn’t keep FireWire. Should modern Macs come with ADB ports?


Obviously not, but that doesn't prove that there isn't value to having backwards compatibility. Sometimes you just want something to run and not have to touch or change it for a long time.

A 20-year old machine that's critical to a factory can run off a serial cable plugged in to an expansion card running software written in the 90's that will still run on Windows 10. Nobody in their right mind would decide to write that same software on a Mac.


Well, given where all the PC manufacturers that were around in 1990 are now, compared to the revenue and profit of just the Mac division, it doesn’t seem like Apple made a bad business decision in not prioritizing backwards compatibility.

If you compare where Apple is and where Microsoft is also, it doesn’t seem like chasing enterprise PC sales was as good of a long term bet as going after the consumer market....


> So if Apple kept “25 years” of backwards compatibility, should they have been better off bundling a 68K and PPC emulator? Why stop there? Should they have kept compatibility with the Apple //e and also bundled a 6502 emulator?

I don't think it's unreasonable that Apple hasn't done so, but neither do I think doing so would be unreasonable. Archive.org can emulate Apple II's in your browser, I'm sure Apple could add an equivalent feature to MacOS if that were something they cared to do. They obviously don't, and that's their prerogative.


I have a Windows 10 PC with a PCI (not express) slot that I installed a Firewire card in last year to use 15 year old software still available from Sony's website to rip a stack of Digital8 home movies.


And Apple has sold a Thunderbolt to FireWire adapter for years.

https://www.apple.com/shop/product/MD464LL/A/apple-thunderbo...


I tackled that project about two years ago. I asked around and a friend had an old laptop with a FireWire port, so I installed Ubuntu on it and copied all my old Video8 and Digital8 tapes.


That is fair, though, because it is equally likely to find someone that will never give Apple credit for a single thing.


In the face of what Apple does for privacy, comparatively, nobody else is doing a damn thing. Privacy is by a very significant margin the most important metric.


True enough. Though to be fair, the last new version of a Win16 OS shipped 26 years ago, and Win32 became the standard API in consumer products 24 years ago. There are degrees of worry here. Software of the vintage you're talking about was contemporary with System 7, when the closest ancestor to current OS X was called "NeXTSTEP 3.3".

The point upthread was that genuinely useful stuff gets retired just a few years after release in the Apple world, and I think that's broadly true. It's true with hardware too -- professional audio people are stuck with truckloads of firewire hardware that they can't use with their new laptops, for example.


Apple shipped the last 32-bit Mac in 2006, over 10 years before 32-bit software stopped being supported. There were plenty of FireWire to Thunderbolt adapters.

No, the closest ancestor to MacOS X is System 7. There were Carbon APIs until last year. A poster upthread said they could use an emulator; there are 68K Mac emulators available too.

AppleScript, for instance, is a System 7 technology, not a NextStep technology.


> the closest ancestor to MacOS X is System 7.

How do you figure?

System 7 was part of the Classic Mac OS line; the last of that line was Mac OS 9. This was a proprietary kernel developed by Apple.

Mac OS X is a Unix based OS derived from technologies they acquired from NeXT.

To call System 7 an ancestor of MacOS X seems completely nonsensical.


No, MacOS X when it was originally released had parts from NextStep and parts ported from Classic MacOS, including QuickDraw, AppleScript, QuickTime, some audio frameworks, etc.

The entire Carbon API was a port of classic MacOS APIs to make porting from classic MacOS to OS X easier.

MacOS X was a combination of both. That was the whole brouhaha over why Apple ported the Carbon APIs to OS X: major developers like Adobe and Microsoft insisted on it.

That’s not to mention that the first 5 versions of MacOS X had an entire OS 9 emulator built in.

To take the analogy to the extreme: MacOS X had two parents, Classic MacOS and NextStep.


I would disagree; most of what was brought over from Classic OS was ported and adapted out of necessity, and short lived. OSX was an entirely new operating system that ported some frameworks and software but wasn't backward compatible. Were it so, they wouldn't have needed to provide an emulator.

I think you're just supporting the original assertion that Apple does not support things for very long. Does software written for OS X v10.1 run on Catalina today without using 3rd party tools or emulators? Software written for Windows 95 still runs on Windows 10.


You call the Carbon API, which existed from 2001-2018, “short lived”? The entire Carbon API was used to port software like Photoshop and Office.

Carbon was a port of enough of the Classic API to port major important programs.

AppleScript is still built into the current version of OS X. It was introduced in 1993-94

And seeing that 10.1 was PPC only, do you expect them to keep a PPC emulator around?

Can you run PPC based Windows NT software today on an x86 PC?


Sounds to me more like the ported programs were the short-lived part, and IMO they're not entirely wrong about that.

Sure, Carbon and Rosetta certainly were no mean feat, and the drastic PPC/x86 break is something Microsoft never really had to deal with (heh, the biggest problem trying to run a PPC/MIPS/Alpha based NT application today is actually finding one :) ).

But Apple never went to the same lengths as Microsoft regarding backwards compatibility, and while Carbon and Rosetta immensely eased the transition, the continuity definitely wasn't comparable and it was never transparent to the developers (and in Apple's defense, this was never their intention and they always were quite open about it.)

For one, Rosetta (and thus PPC compatibility) was dropped with Lion in 2011, so no amount of Carbon would help 10.1 applications after that.

And even with Rosetta, each release, especially after Tiger, came with quite a list of API changes and deprecations (with the whole of Carbon declared obsolete in 2012), and an increasingly long list of high-profile software that would not run anymore and required an update or upgrade. And while Microsoft did a lot even to prevent and/or work around issues with notorious software (hello Adobe! :) ), Apple was far less willing to do so.

I mean, just as an example: I can run Photoshop 6.0 (from 2000) on Windows 10 (certainly no thanks to Adobe), but there's no chance for PS 7.0 even on Leopard...


Carbon was declared obsolete in 2012 but wasn’t discontinued until 2019.

Porting from PPC to x86 was relatively easy. But you’re also forgetting about the first transition - from 68K to PPC.

Can you run the PPC version of any Windows NT apps?


PPC to x86 was possibly the smoothest transition I've seen in my lifetime; for most it was just a recompile, and I'm convinced it was only as smooth as it was because of the shit-show transition to OS X.

Apple announced its plans to move to OS X in 1997, saying they'd ship an emulator, Blue Box, to run classic apps. That was met with a resounding "no" from the community.

Carbon was never supposed to exist: the Classic APIs were not memory safe, didn't support threads, and had a lot of other issues. Apple wanted a clean break in the form of Cocoa, but the community said no. So Apple came up with Carbon, which was sort of a port of the Classic APIs to OS X, but because the two operating systems were so different it wasn't anywhere close to a 1:1 copy and required developers to port to it.

Since its inception, Apple wanted Carbon dead; it required them to rewrite core parts of OpenStep in C and to maintain them alongside their Obj-C equivalents. It took them 12 years to get to the point where they felt comfortable killing it off and almost 20 years before they actually could.

> Can you run the PPC version of any Windows NT apps?

Developing for PPC on NT was much like targeting x86 and PPC on OS X: it was mostly a recompile unless the app used assembly. You can't run the PPC version of an NT app on modern hardware, just as you can't run the PPC version of an OSX app on MacOS.

The difference, though, is that PPC on NT never took off, so there's something like 4 or 5 apps for NT versus the thousands or hundreds of thousands for OSX.


I haven't forgotten anything, I just fail to see the relevance to this discussion. (68k? Really? That one's been dead for 14 years. And what is with you and NT on PPC? You really want to start comparing a 25 year old, short-lived, ultra-niche side version no one bought or even wrote software for with the "mainline"?)

I think you missed the entire point of my posting, i.e. that even outside the architecture changes, long-term compatibility was never even near the same level (and a different arch was often not even the culprit). Carbon being available doesn't help you when old software still doesn't work.


If you are complaining that you can’t run 25 year old Mac software on an x86 Mac, the only option is for Apple to ship MacOS with a 68K emulator and a PPC emulator. The first version of MacOS that ran natively on x86 came out in 2006.

Yes I realize that PPC Macs came out in 1994. But they required a 68K emulator because even parts of MacOS were 68K.


>If you are complaining that you can’t run 25 year old Mac software on an x86 Mac

But I ain't. I'm arguing that for vast stretches of Mac OS/OS X/macOS history, even 5 year old software has been a gamble.


There were three major breaking changes in MacOS history.

- If you bought the x86 version of software in 2006, it would potentially work until 2019, when Apple dropped 32-bit support.

- If you bought the first version of OS X PPC software in 2001, it could potentially run until July 2011, with the release of 10.7.

- If you bought a classic MacOS app, it could potentially run from, pessimistically, 1992 with the release of System 7 to 2006 with the introduction of the first x86 Macs.


Yes, we already talked about this. The keyword here is "potentially", which I'd swap with "theoretically".


https://en.wikipedia.org/wiki/Carbon_(API)

"Carbon was an important part of Apple's strategy for bringing Mac OS X to market, offering a path for quick porting of existing software applications, as well as a means of shipping applications that would run on either Mac OS X or the classic Mac OS. As the market has increasingly moved to the Cocoa-based frameworks, especially after the release of iOS, the need for a porting library was diluted. Apple did not create a 64-bit version of Carbon while updating their other frameworks in the 2007 time-frame, and eventually deprecated the entire API in OS X 10.8 Mountain Lion, which was released on July 24, 2012. Carbon was officially discontinued and removed entirely with the release of macOS 10.15 Catalina."

I think you are confusing "supported" with EOL. Adobe was pissed because there was originally talk of a 64-bit Carbon; Apple never shipped it, so they had to move their entire app over.

The main point is that Windows would never stop that API from "existing" in some manner, unlike Apple.

This is just a difference in how both companies view themselves. While Apple claims "it just works", that isn't quite true in some of the cases we have seen. Microsoft has actually done a far better job of this.

I know someone who worked on the Visual Studio team. They literally had 100-200 servers that would run overnight with each build, guaranteeing that the software would install and run on every single permutation of Windows on an array of hardware.


So, what exactly did you say that refuted anything I said?

The Carbon API was 32 bit only and was supported until the latest release of MacOS.

Do you realize how many deprecated end of life frameworks that Microsoft has been lugging around for decades?

So should Apple have kept support for 68K software in 2019?

Also, do you realize that for all intents and purposes the entire .Net Framework is deprecated and EOL except for minor compatibility updates?

There are plenty of “pissed” .Net Framework developers who feel abandoned by MS.


I've only heard complaints from Silverlight and Windows Phone/Mobile developers anecdotally.

From a web perspective (and my experience), .NET Framework 2/4 -> Core is actually not a big changeover outside of the views (probably better if you switched to MVC).

The Windows Phone apps I built are dead now, but that isn't a matter of APIs no longer being supported, but an entire platform going under.

As a macOS user, I had one operating system update kill external GPUs with Nvidia cards (that sucked) and another update kill 32-bit apps (that one isn't a big deal for me personally). All on the same computer.


The entire ASP.Net Core and Entity Framework architecture was changed and is not compatible. Not to mention all of the legacy .Net Framework-only third-party packages that don’t work.

Microsoft also completely abandoned Windows CE/Compact Framework while there were plenty of companies that had deployed thousands of $1200-$2000 ruggedized devices for field services work.


> The entire ASP.Net Core and Entity Framework architecture was changed and is not compatible.

There's been a lot of confusion, due in no small part to Microsoft's branding and communication, but what you said is not at all accurate if not intentionally misleading.

What's been known as .NET for the last 20 years is now called ".NET Framework"; this is not unlike how OS X is now retroactively called macOS. ".NET Core" is an entirely new framework that just happened to be compatible with ".NET Framework", but as time goes on the two have diverged.

> Not to mention all of the legacy third party .Net Framework only third party packages that don’t work.

".NET Framework" and ".NET Core" are similar to Cocoa and Cocoa Touch in the sense that you can write code that will compile under both AND you can write code for either that will be incompatible with the other. In fact I maintain a half dozen packages that are compatible with both.

> Microsoft also completely abandoned Windows CE/Compact Framework while there were plenty of companies that had deployed thousands of $1200-$2000 ruggedized devices for field services work.

Microsoft didn't "abandon" Windows CE; it stopped development for it 6 years ago as it was largely dead, and Microsoft offers many pathways off of Windows CE. The CF actually runs on platforms other than CE intentionally, such that any apps written for the CF will just work elsewhere. AND they still support CE and CF to this day; they just don't maintain or develop new versions of them.


> What's been known as .NET for the last 20 years is now called ".NET Framework"; this is not unlike how OS X is now retroactively called macOS. ".NET Core" is an entirely new framework that just happened to be compatible with ".NET Framework", but as time goes on the two have diverged.

The two weren’t initially slated to diverge at all. .Net Framework and .Net Core were supposed to be separate implementations of “.Net Standard”. In fact, you could originally create ASP.Net Core and EF Core apps that ran on top of .Net Framework.

> ".NET Framework" and ".NET Core" are similar to Cocoa and Cocoa Touch in the sense that you can write code that will compile under both AND you can write code for either that will be incompatible with the other. In fact I maintain a half dozen packages that are compatible with both.

Which will not be the case for long since MS has stated that no new features will come to .Net Framework.

> Microsoft didn't "abandon" Windows CE; it stopped development for it 6 years ago as it was largely dead, and Microsoft offers many pathways off of Windows CE. The CF actually runs on platforms other than CE intentionally, such that any apps written for the CF will just work elsewhere. AND they still support CE and CF to this day; they just don't maintain or develop new versions of them.

Which is also not true. The last version of Visual Studio that supported the Compact Framework was VS 2008. It was far from dead in the enterprise by 2010 or even 2012. Companies were still relying on CF to run on their $1200-$2000 ruggedized field service devices. They had deployed literally thousands of devices in the field. I know; I was developing in VS 2008 until 2011 just to support them.

I mean devices like these, which cost $1300 each. I deployed software for a few companies that had thousands of Intermec and ruggedized Motorola devices.

https://3er1viui9wo30pkxh1v2nh4w-wpengine.netdna-ssl.com/wp-...


> The two weren’t initially slated to diverge at all. .Net Framework and .Net Core were supposed to be separate implementations of “.Net Standard”.

Uh... no. Hard fucking no. .NET Standard is the commonalities between Core and Framework. Core and Framework were NEVER the same or intended to be the same.

Framework is all of the legacy Windows specific Libraries for things like the File System, Active Directory, etc.

Core is intended to be platform agnostic and cross platform.

Read this, specifically Figure 5:

https://docs.microsoft.com/en-us/archive/msdn-magazine/2017/...

> The last version of Visual Studio that supported the Compact Framework was VS 2008.

Windows Embedded Compact 2013 shipped with CF 3.9 in 2012.


And yet you can still run .NET 1.0 apps on Win10, and this isn't changing in the foreseeable future.

Hell, you can run VB6 apps on Win10 - it even ships the runtime! - and the remaining hold-outs in that developer community have been complaining about abandonment for two whole decades now.


https://docs.microsoft.com/en-us/dotnet/framework/install/ru...

The .NET Framework 1.1 is not supported on the Windows 8, Windows 8.1, Windows Server 2012, Windows Server 2012 R2, or the Windows 10 operating systems. In some cases, the .NET Framework 1.1 is specifically identified as required for an app to run. In those cases, you should contact your independent software vendor (ISV) to have the app upgraded to run on the .NET Framework 3.5 SP1 or later version. For additional information, see Migrating from the .NET Framework 1.1.


You can't install .NET 1.x itself on Win10. But you can install .NET 3.5, and it can run .NET 1.x apps.


From the article “In some cases, the .NET Framework 1.1 is specifically identified as required for an app to run”

By “specifically identified” it means that some applications actually hard-coded a check for 1.1.


If app developers explicitly prevent their code from running on future platforms, this is hardly the fault of the platform.


Which would be no different from a macOS app hard-coding a check for 10.3 and not working if you have anything newer. Neither says that the app _couldn't_ run, just that a badly-thought-out gate prevents it.


Whether the app could run or not is irrelevant if the app doesn’t run. There must be enough apps that don’t run that MS thought to call it out.


> There must be enough apps that don’t run that MS thought to call it out.

The callout exists because Microsoft takes a different approach to support from Apple. Microsoft provides support material for all of its legacy and deprecated software, as well as the ability to download and install it. So it's important to identify and track incompatibilities between them.

When Apple moves on, the past is whitewashed over, and when support stops, they forget it ever happened.


And so the mystery of why a 32-bit version of Windows 10 still exists is solved.

What's mildly annoying is that much of the early 32-bit Windows software came packaged in 16-bit installers. Office 97 would be such a breeze on modern hardware.


Office 97 can be installed on 64-bit Windows 10 with the original installer. I did it just last month and it runs without any problems... and it is fast.


There are special workarounds: many of the old installers run a small piece of 16-bit code which doesn’t work in 64-bit Windows, but because it’s so common, Windows just runs a replacement version.


It failed the last time I tried, which must have been on 7. It will be a strong example of Microsoft dedicating resources to compatibility if they added it for 10 or in a patch update (both their resources and the users': there's a crazy amount of checking for necessary compatibility hacks going on whenever an executable is started).


I had some selection rendering issues in Excel, and Word is reporting some kind of registry corruption on startup. Otherwise, works fine.



There are some third-party implementations of NTVDM that allow running 16-bit DOS and Win16 apps directly on Win64, although DOSBox is still the easiest route, and "good enough" in practice.


> The author of this website is basically stuck using whatever version of Frontpage supports the markup of his website.

But at least getting it done largely depends only on them, and it's not too hard. I have friends who swear by ProTracker and still use it, even though it's thirty years old and the platform it's running on has been dead for more than twenty. They don't have an Amiga but it's trivial to get it running in an emulator today.

You can run Windows 98 in a browser, and your web editor in it. It's certainly less complicated than hosting a WebObjects application today.


I know a person who is maintaining a few sites she built around 2005 with a version of Dreamweaver a little older than that, so she never dares to upgrade her Dreamweaver install.

The whole thing is terrifying and horrific to me, but they keep paying her to do the work so she's fine with it.


I actually just finished redesigning my site with static HTML using Dreamweaver 2004 on an iBook G4. Why? Why not? My little brother passed away a couple years ago and I inherited his iBook, and I have decided it's going to be my personal laptop from here on out, even if all I use it for is VNC to one of my other computers. Plus, as mentioned above, it can still run all that delicious old Mac stuff from System 7 through OSX 10.4.x, and it's all "abandonware" now, yet in many cases still VERY usable.


I cut my teeth using Dreamweaver for tripod and geocities sites way back in the last century and have fond memories of it. It was great for templating headers and footers before I discovered PHP, which I have less fond memories of, but that's another story.

With all the churn in certain areas of tech, it's easy to forget how much stays the same.

I was using Macromedia Fireworks MX from the early 2000s right through to about 2013 to do graphics for sites I was building for people on the side.

I used it while it got several version updates, Adobe took it over and updated it for five or six years, and then discontinued it.

Meanwhile I was still using the old version to make beer money.

I only quit using it because I finally admitted I kind of suck at graphic design. Besides, there doesn't seem to be much need for graphic design in much of the modern mobile-first, material design world anyway.

I bailed on the whole thing and have been sticking to back end at the day job these days. Things move at a slightly less hectic pace back here for me most of the time.


You would just edit the HTML pages, lol. I still use Dreamweaver for the visual editor if I need to copy and paste from a PDF and want perfect HTML. No one has made anything like it. No current editor has a quick SFTP that lets you connect, edit, and move on.


Early versions of Dreamweaver were pretty slick. I used it as my primary IDE for developing ASP pages in early 2000.


Coda does.


As does HTMLPad 2020, the closest I've found to a Coda clone on Windows. (It's not as good as Coda but it gets the job done.)

That said, neither app has the WYSIWYG editing that DreamWeaver had. But I've always preferred to hand-code my HTML anyway.


I mean ... it's just a static HTML editor at that point (maybe it does some includes/builds to simplify things). If you're just pushing out static content, you don't have to worry too much about outdated libraries and security issues, as long as the web server it's served from is maintained and up to date.


"Dreamweaver Templates" was basically an early static site generator that made it really easy to design and include site-wide or section-wide elements.

Yeah you could always edit the individual files that it outputted, but in some cases people were using this system to manage sites with hundreds or thousands of pages. As recently as a couple years ago it was how the natural history museum in DC managed their site content.
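If you never saw one, a template was just an HTML file with comment markers fencing off the editable regions, roughly like this (from memory, so the exact marker names may be off):

    <!-- TemplateBeginEditable name="main" -->
      <p>Per-page content went here.</p>
    <!-- TemplateEndEditable -->

Everything outside the markers was locked, and a change to the shared parts got re-stamped into every page built from the template, which is how one edit could propagate to thousands of pages.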


Dreamweaver 2003?

I know a website that is still maintained regularly and built with Dreamweaver 2003.


Can't you just keep editing the HTML in a text editor? FrontPage's generated HTML isn't that unreadable.


Presumably the author used a WYSIWYG editor in the first place because he is not a technical person, so for him/her to now not only learn enough HTML/CSS/JavaScript to turn to hand-editing but also to understand FrontPage's noisy output would probably take enough effort that they might rather decide to shut down the site if they're not able to continue using FrontPage. Hiring a dev to redo the site is another option, but that presumes they have enough money to invest in a hobby site...


Nah, I don't think this is such a big deal. Adding a row to a table is much easier than creating a table from scratch. And Frontpage's output isn't that noisy -- I had to go through that experience myself. That said, my old Frontpage from 2005 (which I copied from Win XP probably) still works fine except for a warning it throws at start about not finding some registry value. I wouldn't want to use it any more (it doesn't understand CSS and screws it up), but if I wanted, I could.


Ironically, I bet the author has learned more technical skills by maintaining a system that can continue to run their version of FrontPage than they would have if they had just taught themselves HTML from the start.


I ended up learning more HTML by having FrontPage, because it would regularly fuck everything up and I'd have to go fix it by hand.


I properly own several versions of Office all the way back to 95, so I can say this, as I am covered :)

Years ago I found a "Portable Frontpage", which of course I downloaded and still have zipped somewhere. I know that MS wouldn't like this much, but life is life, and Portable Frontpage exists. So as long as there is Windows, Frontpage will work!


OTOH, I guess that maintaining a Windows VM for use with FrontPage would be a lot simpler and safer than maintaining an old software stack server-side.


(OTOH = On the other hand)


Looking at the page source it looks dead simple to modify. I know it isn't WYSIWYG but it's just HTML.


Actually, the browser makers are shouldering the burden of supporting the dreck output by FrontPage. Remember that the intention was to make it work in Internet Explorer and crash in Netscape.


Also note that a Java Applet is included in there, which has likely not worked since 2015.


VMs solve this problem.


> If you want something to last, don't base it on something that won't last.

and

> I guess what I'm saying is that if you want to build a site to last 25 years without numerous redesigns, build a static HTML page.

While simplicity is a great way to future proof things, I'm not convinced that this argument in general would work nearly as well without the benefit of hindsight. One could be forgiven for confusing it with "guess the future correctly". Plenty of relatively safe bets from 10, 20, 30 years ago haven't panned out that well. It's an interesting line of thinking though: exactly what properties of HTML make it so long lived?


My text files still work. I have MUD design documents from when I was in high school (mid-to-late 90s). Org mode and Markdown are kind of eternal formats: even if all the tooling dies, they still look decent. Basic HTML still works well enough as well. You can write a parser for XML pretty easily, and HTML can also be processed and rendered trivially. I think we could collectively find some other technologies that are likely to be around in another 20 years. The simpler the file format, the more likely it is to be around :)

edit: A few more popped into my head: CSV, and SQL schemas + data dumps (text format). The common theme to everything here is plain text. SQLite, although binary, is probably close to eternal. Git is eternal enough (a recent HN post showed even POSIX shell is good enough to write a basic git client). JSON is easy to write a parser for as well. YAML too.


I think you can generalise the advice: remove as many processing steps as you can.

It's not so much that you needed to guess that HTML was going to be as long-lived as it is; it's that HTML is the final product that actually loads on the user's computer, and those tend to stick around for a long time (or at least be emulated). The code that lives on a backend server somewhere, not so much.

For what it's worth, I don't think this example is necessarily bulletproof: it requires a working copy of Frontpage. If Microsoft behaved more like Apple it might have been deprecated away long ago!


Google controls the major web engine. I don't trust Google not to deprecate parts of HTML over time because the new shiny is "better". I would rather maintain Markdown generators, which I can update to emit whatever markup the latest Google insists needs to work, instead of rewriting all my documents.

I'm currently tasked with writing a UI for a machine that has a 25 year expected lifespan before wear means it is replaced. This is a real concern - think about where computers were 25 years ago and try to find something you are sure will work and look nice.


Google wields too much power. To an extent, they can dictate to website owners what HTML is allowed and not allowed thanks to their dominance in search. This is compounded by the fact that their browser marketshare via Chrome and now Microsoft Edge basically allows them to do what they want with HTML.

Matters are even worse. Last year, the W3C became the "yes-man" of Google. They decided to stop developing the HTML standards and just start rubber stamping whatever WHATWG produces. WHATWG is run by Apple, Google, Microsoft, and Mozilla. And who has the most power in that relationship? Yep, Google.


Even if Google did do that (which they’ve shown no signs of, and they are still far from a browser monopoly when you look at the iPhone etc.), it wouldn’t stop HTML from being read. Translating from HTML -> GoogleHTML wouldn’t be meaningfully different from translating it from Markdown.


It could be argued that AMP was that attempt, and the only reason AMP gained traction was that Google started using it in the carousel of their SERPs.

While Safari, when mobile is included, has ~17% of the market, that's not enough when you combine Google's browser share along with their search engine share.


Is the machine connected to the network/internet? Are you planning on any software updates? I'm curious how you plan on handling https root certificate updates.


HTTPS is something I haven't figured out. If anybody has a good answer to this, please let me know.

The only thing I can come up with is HTTP (no S!) and firewall rules that limit connections to 192.168.1.xxx, or otherwise disallow connections from outside the local subnet. I don't like it, but I don't have a better plan.


I sure hope the UI is buttons and not screens :)


This is one reason why the static site generator I use for my personal website uses HTML rather than something like Markdown.

I don't think Markdown is going anywhere, incidentally, or that it would be hard to process on my own if I needed to. But the HTML I use is simple enough, and Markdown only decreases the probability the site will last a long time.


Markdown and/or markdown processors are known to change.

Since there's no single Markdown spec, determining just how a page will render, or what will break, is a bit of a crapshoot. And since Markdown treats nonparsable markup as ... plain text, you don't even get errors or other indicators of failure. You've got to view and validate the output manually or by some other means.

With formal tag-based markup languages (HTML, SGML, LaTeX, DocBook, etc.) you've at least got 1) an actual markup spec and 2) something that will or won't validate (though whether or not the processor actually gives a damn about that is another question, hello, HTML, I'm looking at your "The Web is an error condition": https://deirdre.net/programming-sucks-why-i-quit/)

I can't find the post at the moment, but someone recently wrote a cogent rant on the fact that a change in their hosting provider (GitHub via a static site generator IIRC) had swapped out markdown processors, with changed behaviours, rendering (literally) all their previously-authored content broken.

Which is indeed a pain.

I personally like Markdown, and find it hugely convenient. For major projects though, I suspect what I'll end up doing is starting in Markdown, and eventually switching to a more stable markup format, which probably means LaTeX (HTML has ... proved less robustly stable over the 25+ years I've worked with it).

Though for simple-to-modestly-complex documents, Markdown is generally satisfactory, stable, and close enough to unadorned ASCII that fixing what breaks is not a horribly complicated task.

Up to modest levels of scale, at least.


I appreciate your reply. Seems Markdown is more complex than I recognized and this just makes me want to avoid it more. If you do find the rant you mentioned, let me know.

> HTML has ... proved less robustly stable over the 25+ years I've worked with it

The first website I made in 2002 still views fine in a modern browser. I didn't do anything fancy, though. I would be interested in what has been unstable as it might give me ideas on what to avoid in HTML.

I don't find HTML to be that much harder than plain text or Markdown so I think I'll keep using it for smaller projects. LaTeX is worth considering as well, particularly given that I will have math on some of my webpages. One issue is that the stability of LaTeX depends strongly on which packages you use. I need to take a closer look at the health of every package I use. I think avoiding external dependencies is easier with HTML.


My sense is that Markdown is probably pretty safe for most uses, particularly if you control the processing. If not, then yes, it can bite. For me that means pandoc to generate endpoints such as HTML, PDF, etc. I'm fairly confident that most of that toolchain should continue to work (provided computers and electricity exist) for another 2-4 decades.

For certain more complex formatting, Markdown has limitations and features are more likely to change. But I've used Markdown to format novel-length works (from ASCII sources, for my own use) with very modest formatting needs (chapters, some italic or bold text, possibly blockquotes or lists), and it excels at that.

For HTML, it's a combination of factors:

- Previous features which have been dropped, most to thunderous applause. (<blink>, <marquee>, etc.)

- Previous conventions which have largely been superseded: table layouts most especially. CSS really has been ... in some respects ... a blessing.

- Nagging omissions. The fact that there's no HTML-native footnoting / endnoting convention ... bothers me. You can tool that into a page. But you can't simply do something like:

    <p>Lorem ipsum dolor sit amet.
        <note>Consectetur adipiscing elit</note> 
        Nulla malesuada, mauris ac tincidunt faucibus</p>
... and have the contents of <note> then appear by some mechanism in the rendered text. A numbered note, a typographical mark ( * † ‡ ...), a sidenote, a callout, a hovercard, say.

In Markdown you accomplish this by:

    Lorem ipsum dolor sit amet.[^consectetur] Nulla malesuada, mauris ac tincidunt faucibus

    [^consectetur]: Consectetur adipiscing elit.
Which then generates the HTML to create a superscript reference, and a numbered note (when generating HTML). Or footnotes according to other conventions (e.g., LaTeX / PDF) for other document formats.

- Similarly, no native equation support.

Maybe I'm just overly fond of footnotes and equations....

But HTML and WWW originated, literally, from the world's leading particle physics laboratory. You'd think it might include such capabilities.

- Scripting and preprocessors. I remember server-side includes; there's PHP, and JS. Some browsers supported other languages -- I believe Tcl and Lua are among those that have been used. Interactivity and dependency on other moving parts reduce reliability.

The expression "complexity is the enemy of reliability" dates to an Economist article in 1958. It remains very, very true.

HTML is for me more fiddly than Markdown (though I've coded massive amounts of both by hand), so on balance, I prefer writing Markdown (it's become very nearly completely natural to me). OTOH, LaTeX isn't much more complex than HTML, and in many cases (simple paragraphs) far simpler, so if I had to make a switch, that's the direction I'd more likely go.


I agree with you entirely on the abandoning of conventions with HTML. I haven't paid much attention to multi-column layouts in CSS over the years but my impression is that it's gone from tables to CSS floats to whatever CSS does now that I'm not familiar with. Browsers are typically backwards compatible so this isn't that big of a deal to me. But I have no idea if what's regarded as the best practice today will be seen as primitive in 15 years.

> The fact that there's no HTML-native footnoting / endnoting convention ... bothers me.

I've seen people use the HTML5 <aside> element for sidenotes, styled with CSS. Some even make them responsive, folding neatly into the text as the viewport shrinks. I'm not sure if this is the intended use for <aside>, but the result is reasonable and I intend to do the same. If you're set on footnotes, though, yes, I don't know of a native implementation.
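Something like this sketch is what I have in mind (untested; the .sidenote class, widths, and breakpoint are invented for illustration):

    <p>Lorem ipsum dolor sit amet.</p>
    <aside class="sidenote">Consectetur adipiscing elit.</aside>

    <style>
      /* push the note into the right margin on wide viewports */
      .sidenote { float: right; clear: right; width: 12em;
                  margin-right: -14em; font-size: smaller; }
      /* fold it back into the text column on narrow viewports */
      @media (max-width: 60em) {
        .sidenote { float: none; width: auto; margin: 1em 0;
                    border-left: 2px solid #999; padding-left: 1em; }
      }
    </style>

(The negative margin assumes the text column leaves at least that much free space on the right, Tufte-style.)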

Equation support with MathML is okay in principle but not in practice. I'd like to have equations without external dependencies (MathJax's JS alone is like 750 kB!), but that's not possible until Chrome decides to catch up with Firefox and Safari on MathML. I've been thinking about just using MathML as-is (no external math renderer), and if Chrome users complain, I'll tell them to get a better browser. ;-) Maybe that'll help some Chrome users understand why they should test their websites in other browsers.
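For reference, raw MathML is verbose but not unreadable. A hand-written fraction, as a sketch (not any tool's output):

    <math xmlns="http://www.w3.org/1998/Math/MathML">
      <mfrac>
        <mi>a</mi>
        <mrow><mi>b</mi><mo>+</mo><mi>c</mi></mrow>
      </mfrac>
    </math>

Firefox and Safari render that as a proper stacked fraction; Chrome, with no MathML support, just runs the identifiers together as inline text.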


Semi-relatedly, I think even the linear form of UnicodeMath [1] is very readable (an example below), and it would be great if there were more support for building it up into nicer presentation forms in the browser wild (MathJax has had it on the backlog since at least 2015, for instance), as that seems to me a better "fallback" situation than raw MathML, given its readability when not built up.

[1] http://www.unicode.org/notes/tn28/UTN28-PlainTextMath-v3.pdf
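To give a flavor (written from memory of the note, so treat the details as approximate), in the linear form:

    (a + b)/(c + d)    builds up to a stacked fraction
    x^2 + y_1          builds up to a superscript and a subscript

Both lines stay perfectly readable even if nothing ever builds them up.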

> I haven't paid much attention to multi-column layouts in CSS over the years but my impression is that it's gone from tables to CSS floats to whatever CSS does now that I'm not familiar with.

CSS Grid [2] is the happiest path today. It's a really happy path: I want these columns, this wide, done (see the sketch after the links below). CSS Flexbox [3] is a bit older and nearly as happy a path. Some really powerful things can be done with the combination of both, especially in responsive design (a dense two-dimensional grid on large widescreen displays collapsing to a simple flexbox "one-dimensional" flow, for example).

Flexbox may be seen as primitive in a few years, but Grid finally seems exactly where things should have always been (and what people were trying to accomplish way back when with tables or worse framesets). Even then, Flexbox may be mostly seen as primitive from the sense of "simple lego/duplo tool" compared to Grid's more precise/powerful/capable tools.

[2] https://caniuse.com/#feat=css-grid

[3] https://caniuse.com/#feat=flexbox
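A minimal sketch of that happy path (the .page class and breakpoint are arbitrary):

    .page {
      display: grid;
      grid-template-columns: 14em 1fr;  /* fixed sidebar, fluid main column */
      gap: 1em;
    }
    /* collapse to a single column on narrow screens */
    @media (max-width: 40em) {
      .page { grid-template-columns: 1fr; }
    }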


Thanks for mentioning UnicodeMath. That does seems like a better fallback solution than raw MathML. It appears there's a newer version of the document you linked to that was posted on HN, by the way: https://news.ycombinator.com/item?id=14687936

I'll also look more closely at CSS Grid.


Thanks for mentioning grid, as that's a tool I've not looked at myself.

CSS columns and Grid are not entirely substitutable, though they share some properties.

I see Columns as a way of flowing text within some bounding box, whilst Grid is preferred for arranging textual components on a page, more akin to paste-up in Aldus PageMaker (am I dating myself?), though on the rubber sheet of the HTML viewport rather than on fixed paper sizes.


Yeah, they are very different things. One is for text/inline flow and the other block flow. As a fan of CSS Columns (multicol) I hope that the interaction between Columns and Grid gets better standardized. (In my case I wanted better support for embedding Grids in columns; my tests worked in everything but Firefox. So it is interesting to me that Firefox seems the most interested in pushing multicol forward as a standard [1], since it stopped being a Trident/Spartan priority when Windows 8.1/10 abandoned multicol as a key UX principle of Windows 8 apps.)

[1] https://hacks.mozilla.org/2019/11/multiple-column-layout-and...


CSS columns are actually ... mostly ... pretty useful:

https://developer.mozilla.org/en-US/docs/Web/CSS/CSS_Columns...

My preference is to use them with @media queries to create more or fewer columns within auxiliary elements (headers, footers, asides), usually to pretty good effect.
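E.g., roughly this (element and breakpoints arbitrary):

    footer { column-gap: 2em; }
    @media (min-width: 40em) { footer { column-count: 2; } }
    @media (min-width: 70em) { footer { column-count: 3; } }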

Multi-column body text is largely an abomination.

For images, I'm still largely sticking to floats.

I've done some sidenote styling that I ... think I like. I don't remember how responsive this CodePen is or isn't though I've created some pretty responsive layouts based on it:

https://codepen.io/dredmorbius/full/OVmKZX

I consider equation support a lost cause.


I feel like the difference with Markdown is that it's not meant to be a hidden source format. It's meant to take an existing WYSIWYG styled-text format—the one people use when trying to style text in plaintext e-mail or IM systems—and to give it a secondary rendering semantics corresponding to what people conventionally think their ASCII-art styling "means."

If a Markdown parser breaks down, it's quite correct for it to just spit out the raw source document—because the raw document is already a readable document with clear (cultural/conventional) semantics. All a Markdown parser does is make a Markdown-styled text prettier; it was already a readable final document.


Whether or not it's intended to be a hidden source format, the fact remains that if it does not render reliably and repeatably, it's failing to do its job.

Markdown's job is to be a human-readable, lightweight, unobtrusive way of communicating to software how to structure and format a document.

It's one thing for a freshly-entered document to fail -- errors in markup occur and need to be corrected. It's another to change the behaviour and output of an unchanged text, which is what Markdown implementations have done.

(I've run into this myself on Ello where, For Mysterious and Diverse Reasons, posts and comments which I'd previously entered change their rendering even when I've not touched the content myself. This is compounded by an idiotic editor which literally won't stop mucking with plain ASCII text as entered, and insists on inserting or creating hidden characters or control codes. Among the reasons for my eventual disenchantment with what would otherwise be an excellent long-form text-publishing platform.)


> Markdown’s job is... communicating to software

No, that’s a misunderstanding. Markdown is, as I said, a formalization of existing practice. Nobody’s supposed to be “writing Markdown” (except computers that generate it.) You’re supposed to be writing plaintext styled text the same way you always have been in plaintext text inputs. Markdown is supposed to come along and pick up the pieces and turn them into rich text to the best of its ability. Where it fails, it leaves behind the original styled text, which retains the same communication semantics to other humans that the post-transformation rich text would.

The ideal Markdown parser isn’t a grammar/ruleset, but an ML system that understands, learns, and evolves over time with how humans use ASCII art to style plaintext. It’s an autoencoder between streams of ASCII-art text and the production of an AST. (In training such a system, it’d probably also learn—whether you’d like it to or not—to encode ASCII-art smilies as emoji; to encode entirely-parenthetical paragraphs as floating sidebars; to generate tables of contents; etc. These are all “in scope” for the concept of Markdown.)

In short: you aren’t supposed to learn Markdown; Markdown is supposed to learn you (the general “you”, i.e. humans who write in plaintext) and your way of expressing styles.

If there’s any required syntax in Markdown that a human unversed in Markdown wouldn’t understand at first glance as part of a plaintext e-mail, then Markdown as a project has failed. (This is partly why Markdown doesn’t cover every potential kind of formatting: some rich-text formatting tags just don’t have any ASCII-art-styled plaintext conventions that people will recognize, so Markdown cannot include them. That’s where Markdown expects you to just write HTML instead, because at that point you’ve left the domain of the “things non-computer people reading will understand”, so you may as well use a powerful explicit formal language, rather than a conventional one.)


Interesting viewpoint, though not one that persuades me.

At least not today ;-)

Human expression is ultimately ambiguous. In creating some typographic output, you've got to ultimately resolve or remove that ambiguity. Preferably in some consistent fashion.

There's an inherent tension there. And either you live with the ambiguity or you resolve it. I lean on the "deambiguate" side. Maybe that means using Markdown as a starting point and translating it ultimately to some less-ambiguous (but also less convenient) format, as I've noted.

But that means that the "authoritative source" (Markdown manuscript) is not authoritative, at least as regards formatting guidelines. Whether or not this is actually a more accurate reflection of the status quo ante in previous, print-based, typographic practice, in which an author submits a text but a typesetter translates that into a typographic projection, making interpretations where necessary to resolve ambiguities or approximate initial intent, I don't know.

Interesting from a philosophical intent/instantiation perspective though.


It is exactly the "guess the future" problem that static sites avoid.

The vast bulk of software goes unsupported in less than 25 years. If you want to depend on something that long, you can guess which package will survive that long, or you can store your data in formats that the widest array of tooling supports.

If you drop into a coma after uploading your static HTML and wake up in 25 years, you might have to use whatever fills the text-manipulation-scripting niche then to beat it into the right shape to import into whatever kids these days are using.

If you used Wordpress, well, maybe it takes over the world, maybe it ends up a Wikipedia entry. (Putting aside, of course, that your site began hosting cryptominers a week after you slipped into that coma because you missed an update.)


> It's an interesting line of thinking though: exactly what properties of HTML make it so long lived?

I've thought about this on and off for a few years. Here's what I've come up with:

1. Popularity. You can't really display anything in a web browser without it, blank pages with one AJAX script notwithstanding.

2. Ease of use. Open a text editor, type some markup, save the file with .html, and open it in a browser. When you're done, transfer it to a server to show the world. That's a pretty straightforward process (see the sketch after this list).

3. Well-defined, open standard. Every important piece of the web is defined, from the markup to the protocol to transfer it. I think that reasonably bug-free implementations of those standards help.
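
As a sketch of #2, the entire toolchain can be a text editor and a browser (the file name and content here are just an example):

    <!DOCTYPE html>
    <html lang="en">
    <head>
      <meta charset="utf-8">
      <title>Hello</title>
    </head>
    <body>
      <h1>Hello, world</h1>
      <p>Saved as index.html, this already renders in any browser.</p>
    </body>
    </html>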


I'd argue your #3 is wide of the mark.

It's not that there's a well-defined open standard.

It's that browsers will eat any old crap that's thrown at them and turn it into something plausible, if not precisely what the author intended or reader really wants.

Yes, there's a standard, and yes it's open. It's observed far more in the breach, as a few minutes with a validator on well-known sites will demonstrate.

Your comment alone (prior to my response to it) returns:

    Tidy found 21 warnings and 0 errors!


Even worse, the browsers could not handle standard HTML.

HTML was based on SGML and it had all the nice SGML features. Something like <title/Hello World/ was valid HTML, AFAIR.

But then the browsers never implemented it properly, so HTML5 just describes the behaviour of the browsers.


>It's that browsers will eat any old crap that's thrown at them and turn it into something plausible, if not precisely what the author intended or reader really wants.

Reminds me of the fairly prescient "In Praise of Evolvable Systems" essay from 1996: https://web.archive.org/web/20190409041249/http://www.shirky...


I think that the right way to rephrase that is "use the right tool for the right job".

How many blogs are powered by wordpress? How many of them can be replaced with a static gen?

HTML has a lot of garbage, but at least it's very hard to break it.


Your comment merges well with one slightly above from TheFlyingFish: it's that browsers are pretty good at displaying stuff even when the HTML is not to spec.

And really, it's that everyone uses browsers that still display text and such on the screen even if it's broken in several places.

This could change if Google decided to stop showing pages with broken HTML - like the way they killed Flash, big cuts at a time.

I have turned several WordPress-based sites into static HTML with one of the static-HTML-generating plugins - and that turned those tools into the right ones for those jobs. I think most WP sites can be converted and be just fine; most people don't add new posts to them regularly, from what I've seen.


> exactly what properties of HTML make it so long lived?

I think we're thinking about this backwards. It's not anything inherent to HTML that makes it long lived; it's that the code to parse static HTML is simple, more or less standardized, and has stuck around for a long time.


Back in 2001 I redid the UWA computer club website (https://ucc.asn.au/) using XSLT with a custom doctype ('grahame').

In the early 2000s XML was the cool shiny thing. They're still using it, in fact I found out recently that someone wrote a Markdown to 'doctype grahame' converter to 'modernise' the site.

I guess what I actually built back then was an early static site generator, but it's still kind of cool they're using it 19 years later, hacky as it was / is :)


I still use my old XML doctype with xslt to produce some websites I maintain. Whenever, if ever, xslt is removed from browsers, converting it to a static site generator will be easy.

I regret nothing. Editing simple xml using Emacs is a breeze.


Last time I worked with XSLT was in 2014, redesigning a major Brazilian airline reservation and ticketing system. At the time their passenger service system (Navitaire New Skies [1]) had already switched their white label front-end app from a home grown XSLT web framework to ASP .NET MVC 5, but the company I was working for wasn't particularly interested in paying the (higher) fee for using the "new" front-end framework.

[1] https://www.navitaire.com/new-skies-reservation-system


There is an obscure search engine called wiby.me that only indexes pages like what is posted here. I used to design websites in the late 90's, and very much miss the simple HTML pages of yore.


Thanks for that suggestion. I'm glad it's a three-day weekend!


I don't think this is really unique to Web 1.0; certainly something that still works from the Web 1.0 days seems "impressive" just because of the passage of time, but there's probably some element of survivorship bias there. You mention ColdFusion as an example, but this guy's site is made using FrontPage. He didn't know in 2001 that he'd still be able to run FrontPage in 2020. He made a bet, and it paid off. Other people made similar bets, on other technologies, and unfortunately got it wrong.

My personal website uses Jekyll, and while there's always the possibility it could become abandoned and stop working (I've definitely found some upgrades to be a pain, and Ruby tooling in general doesn't help either), I'll always have the simple, readable Markdown files the site is based on. While this wouldn't be an option for a non-technical website author, if I really had to, I'm sure I could write a simple Markdown-to-HTML renderer over a weekend (or a converter to transform it into the future format-du-jour).


I've been using ColdFusion for over 20 years. The HTML it creates can be as simple or complex as the developer intends. The output can last decades without updating; I don't understand your comparison.


I recently changed jobs to a shop with 15-20 year old ColdFusion+SQL instances that were originally HP 3000 IMAGE databases and COBOL screens. At first I laughed; now, after a couple of years, I agree ColdFusion is pretty robust and its HTML isn't bad at all. It's easy to do fairly complex forms, file operations, email generation, document generation, database in-out things, and it just keeps running and running. It's easy to read and understand, and even though we're using MX-era code, the server still installs and runs on recent (Ubuntu 16.04 LTS) Linux with no issues.


The issue isn't the HTML it renders, it's finding a hosting provider that supports running it server-side.

Of course you could run it yourself, but maintaining a server for a basic blog or personal site arguably exits the realm of "simple."


For a blog or personal site (that didn't have any functionality that really needed a back end) I suppose you could just scrape the generated pages and push them up to any host, but CF seems like a fairly awkward static site generator compared to the usual suspects like Jekyll, Hugo, etc.


They weren't talking about the output. They were talking about availability of hosting.


> Looks like Web 1.0 got something right after all :)

The secret is creating a standard early on that thousands of different pieces of software depend on, so that changing it would be expensive and require a phenomenal amount of decentralized coordination.

Don't worry about making it good -- just make it good enough that people won't want to tear their hair out and unanimously agree to never touch it again. Make the short term cost of applying hacks on top of it low, and the cost of throwing everything out high.


It also helps if you self-host, I have personally self-hosted a subversion repository with my own projects using Trac [1] for 10 years now.

[1] https://trac.edgewall.org/


Trac has been end of life for some time. It doesn't run on python 3. There are open bug tickets about it that have been stale for years.

Maybe it will be upgraded now that python 2 is officially dead, but given it wasn't so far and there was no effort in that direction, I wouldn't bet on it.


Is "EOL" the right term? They released a new version 2 days ago:

https://trac.edgewall.org/wiki/TracChangeLog


I find it odd too that they did some minor releases, yet python 3 was not on the radar.

End of life is correct. It is end of life since it doesn't run on current platforms.

I am not sure if the latest distributions (Ubuntu, Debian, Red Hat) have all removed Python 2 packages. If not, it will be gone with the next major release. You're going to be in trouble running software with no available interpreter, plus all the libraries in use are effectively abandoned.


> I am not sure if the latest distributions (Ubuntu, Debian, RedHat) have all removed python 2 packages. If not, it will be gone with the next major release

Red Hat has not. Ubuntu has not in its most recent stable release. Debian "unstable" is still using Python 2, so I don't think your statement holds up.

https://distrowatch.com/table.php?distribution=redhat

https://distrowatch.com/table.php?distribution=ubuntu

https://distrowatch.com/table.php?distribution=debian

Also, Trac developers have been making progress on Python 3 as recently as 8 days ago:

https://trac.edgewall.org/ticket/12130


The tables show RHEL 8 and Debian 10 are set up with Python 3 out of the box.


Most probably it is EOL, but it still works for me, I only need it to browse the code from time to time and to look at some past commits.


    Good Practice: Use the least powerful language suitable for expressing information, constraints or programs on the World Wide Web.[0]

[0]: https://www.w3.org/2001/tag/doc/leastPower.html


I think I should have said "don't base it on something that CAN'T last". This requires no future knowledge. We know that a WordPress version and its supported PHP will be obsolete.


I've been running a WordPress blog since early 2006 with very little maintenance. I'm sure it's all outdated again, but still appears to work.


If you haven't been keeping the WP back end up to date, it's not functionality that's the problem, it's security. Unpatched WordPress installs account for a huge portion of malware distribution. There are a number of exploits that allow attackers to upload files to your server. So they upload malicious payloads, which exploits then download onto infected systems.


> If you haven't been keeping the WP back end up to date, it's not functionality that's the problem, it's security.

I'm aware. It got hacked in like 2007 but (I think?) never since. I run some other stuff on that box and sometimes look at the resource usage etc.


Most of those exploits are from plugins. If they aren't using those, they can also change the default login URL. Also, WordPress lets you export and reimport to current versions without coding. I think it's one of the best future-proof platforms; most of the web still runs on it.


Many WordPress exploits are in plug-ins, but there are still plenty in the base install (across multiple versions).

Also suggesting that "most of the web" runs on WordPress is a bit absurd. WordPress accounts for a huge portion of spam-y SEO blogs and other outright noise on the web. It's popular no doubt but definitely not most of the web.

Its popularity and porous security are a big problem, as it's such a huge malware delivery vector. Everything from worm payloads to JavaScript crypto miners is served up from millions of exploited WordPress installs.


Oh man, don't I know it. I work for a small business whose long-neglected Wordpress site (nothing e-commerce-ey, almost no plugins, just a glorified billboard/contact-info type site for a non-tech company that no-one had updated in literally years) had been exploited in uncountable ways. It had probably been owned long before I was even hired a year ago. A few months ago it just broke, it was too riddled with problems to salvage.

I was able to convince the bosses to let me take on the fixing-the-site project solo, even though my job has little do with IT. I replaced it all with a static site generator I wrote in Go. No logins, no PHP, no database, nothing to exploit in the first place. Anyone in the office can update it by copying images into arbitrary subfolders in the generator's images folders, and double-clicking the update executable. It builds and uploads a fresh site in a couple of minutes with nice gallery carousels. And as a bonus it loads basically instantly on even the bargain-basement shared hosting we're on.

I do wish that IE compatibility wasn't one of the bosses' firm requirements, due to a lot of our clients not being tech people and still using IE on decade-old computers. Life would be so much simpler if I could just use CSS grids for layout. I f'ing love grids.


"WordPress is used by 62.5% of all the websites whose content management system we know. This is 35.8% of all websites." - https://w3techs.com/technologies/details/cm-wordpress

It's okay if you hate it, but these are the stats


Wow, TIL that "36%" is most. Your own quote tells you that their measurement covers only the sites they scan and whose CMS they can determine. As I said, WordPress is extremely popular in the SEO spam community and powers thousands of dead blogs, but it's a far cry from powering "most of the web".

None of that is material to the original point that thousands upon thousands of unpatched WordPress sites might work but also deliver tons of malware. WordPress' popularity is problematic because it has had, and will keep having, serious security problems. WordPress exploits are entirely automated and performed constantly by zombie networks.


Use static HTML and JavaScript; I don't think they'll break JS compatibility for a long, long time. ES3 is still well supported, and it came out in 1999.


To be fair, keeping around an old runtime of FrontPage isn't much different than keeping around an old runtime of PHP.


> If you want something to last, don't base it on something that won't last.

Sounds similar to the Lindy Effect.[0]

[0] - https://en.wikipedia.org/wiki/Lindy_effect


I've been using my Wordpress site for 12 years now. Sure I upgrade versions from time to time, but the original post is still there and works perfectly.

It's had around 8 million page views in that time.


> build a static HTML page.

No way in hell today’s HTML will survive 25 years now that google owns it, browsers will literally crash due to lack of user tracking. Best just host a static txt file.


Nah, the next iteration would be called GHTML, it will include fact checking by Google AI and Google Analytics by default, all for free. Everyone will use it otherwise Chrome will give you strange security errors and after all you wouldn't want to use the web that is full of fake news and other content that can be offensive to someone. /s


This is the internet as I remember it.


Alas, we're not spring chickens anymore... But isn't nostalgia a great feeling? (Frontpage was my very first introduction to a WYSIWYG editor.)


:)


I have been running an older WordPress that I hacked up nicely. If you keep PHP below 5.5, there are no problems. Even the old mysql extension (vs. mysqli) still works great.


What you said made me glad that I've just developed DocxManager (https://docxmanager.com) - its concept is like WordPress (document-focused editor, themes and templates), but it generates standard HTML/CSS/JS and uses Word as the document editor.


> If you want something to last, don't base it on something that won't last.

The ancient Egyptians knew something about this.


Fortunately there are plenty of static site generators. Frontpage is out of support and likely the currently popular generators will meet their end one day as well. But even then you can still run them in the future, and their output should not have any major issues (unlike a CMS which might get hacked if it's not kept up to date).


Or use free/open software, preferably popular free/open software that runs on a popular OS.

Even if the OS and software effectively die, you still would be able to run the latest version in a VM or in emulation.


this is why perl, txt, and static html are the top-level languages and formats for my project.

with some tweaking, it can be viewed in just about any browser, down to ie3, lynx, mosaic, netscape, and opera3.


You realize that you’re talking about a FrontPage site. The very definition of building something on a technology that won’t last.


But look at the code... are modern websites really better? I think that FrontPage did an awesome job there.


There were all sorts of FrontPage server side extensions to IIS that you could use to support your site if I am not mistaken.


FPSE was mostly about publishing; think a proprietary version of WebDAV. So it was used by FrontPage itself, but not by the HTML produced by it.


Static HTML is also very secure. My personal sites are always static, it keeps server costs low and everything gets cached.


My olde PHP sites are still running just fine.


I have a couple of sites I built in 2005 with plain procedural PHP, and they are still running absolutely fine today on PHP 7.4 with no major rewrites.


Hmm, I wonder if certain javascript based site generators will still be around in 20 years..


> Can you imagine running the same WordPress version for 25 years?

If you keep active on your Wordpress install, the regular updates will be no issue for you and will (almost) never break your website. Not sure why you would expect a regular Wordpress user to run the initial install without recommended/mandatory upgrades over a long period of time.


Followed to its logical if not very practical conclusion, we all wind up on gopher...


My 23 year old web site: https://jgc.org/ It's still updated from a Perl script that generates static HTML.


So I wanted to link my 23 year old website, but precisely because it's still updated, it looks like this: https://www.stavros.io/

It's gone through many renames and redesigns, but, in true Japanese style, it's still the same website. I do have an old snapshot, though: https://anonymoussoftware.stavros.io/


>> Greek. Amateur F1 driver. Technology enthusiast. Single parent. Liar.

that's quite a conversation starter...


Yes, everyone always has lots of fun trying to spot the lies.

Hint: "Liar" is the lie.


Long time listener, first time caller: You're into auto racing?! Link some stuff?


No, it's a joke! Because it says "liar" afterwards. And the previous post was also a joke, i.e. that I was lying about being a liar, which is a paradox.

I am a below average driver, alas. I am Greek, though, and some would say that's more exciting!


Oii, what the fuck, hahahaha....

From what I've heard, you might as well smash your side mirrors off at the dealer when you buy a car, haha... so, yeah, checks out.


(In Greece, to be clear)


That is a vile and noxious rumor. My mirrors have never been touched in the twenty years I've been driving.


Good to know! Will scratch that from my list of stereotypes, haha - be well, man.


You too! It's interesting, though, my friend who now lives in the US always remarks on how, in the US, you can just drive mindlessly on autopilot, whereas here you need to be paying 100% attention at all times due to people always swerving in and out of stuff.

I hate driving here.


Similarly, I feel a continuity in how old my website is, but in Ship of Theseus fashion lots of little parts changed over the years (and pieces were lost to storms, etc). I was really excited at one point to find an old time capsule of a snapshot from a particular redesign I recall being fond of around 1999: http://worldmaker.net/wmo99/

Amusing to myself and contributing to the overall Ship of Theseus analogy, the current design is a responsive, flexbox-based recreation of sorts of the original goals for that 1999 site. I'd like to think the 1999 version of myself would very much appreciate it (especially after all the work of making corner GIFs versus the magic of CSS border-radius, and fighting TABLEs for layouts).


Imagine stumbling upon an old-fashioned website and reading that it's maintained by the CTO of Cloudflare, of all people.


I think from time to time that I should completely change it and update it. But I can't really be bothered; Cloudflare takes a lot of my time.


I shouldn't bother, mate. It is fast and the content is easily accessible. I've just spent a happy half hour browsing your blog. It looks like you'll be "needing" a 3D printer soon to really waste time on building IoT stuff.

I have five different models of ESP8266 and ESP32s scattered across my desk, along with Dupont wires, assorted sensors, a soldering iron, breadboards, etc. It's a great way of taking your mind off the daily grind - my job title is MD.


https://imgur.com/a/DCum3Ur here is how it renders on my 14" screen :/


Haha that's pretty awesome. I have dedicated this year to simplicity and building my company's new website in Hugo with plain vanilla javascript.


I guess you have some sort of "classic car" feelings towards it? How often does the perl script get updated?


Whenever I need to update the site.


My oldest content is from 96 or 97, but the page only exists offline. Hand written, of course, in pico - back then I didn't really know the web existed; we just had www/ directories on our Uni's Unix accounts. Around '97 I updated it to have frames, then '98 I think was SSI.


Ahh pico! Good times, Good times.


I'm going to SSG after playing enough with WordPress and other CMSes...


My personal site was posted on September 12, 1999, is still updated, and has no problems. It's a static site that mostly uses straight HTML/CSS. There are a few scripts that generate pages, but generating HTML/CSS is pretty easy. https://dwheeler.com.
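
That kind of hand-rolled generator really can be tiny. A minimal sketch in Node (all file and directory names here are hypothetical, not dwheeler.com's actual setup):

    // build.js: wrap each content/ file in a template, write it to site/
    const fs = require('fs');
    const template = fs.readFileSync('template.html', 'utf8');
    for (const name of fs.readdirSync('content')) {
      const body = fs.readFileSync('content/' + name, 'utf8');
      fs.writeFileSync('site/' + name, template.replace('{{body}}', body));
    }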

Geocrasher said:

> I guess what I'm saying is that if you want to build a site to last 25 years without numerous redesigns, build a static HTML page.

Yes. I don't get paid to maintain my personal site, so simplicity and longevity are most important. If I have to rewrite things because of incompatible changes in the infrastructure components (e.g., Python2 to Python3), or because proprietary company C has decided to stop supporting product P that I depend on, then I have to spend time that doesn't actually provide any new value. Keeping things simple, and minimizing dependencies, can be useful. Like everything else, there's a trade-off.


You might say your website was designed to last[1]

1. https://jeffhuang.com/designed_to_last/


No kidding. Already, I've found some interesting articles to read.

I'm taking a look at https://dwheeler.com/essays/easy-cross-platform-gui.html, which has references to XULRunner etc. which since 2009 have fallen out of favor.

Would you still recommend wxWidgets (for the 80% of use cases) to those wanting to invest in FLOSS cross-platform GUI apps? BoaConstructor et al. look interesting.

Thanks for taking the time to look at this comment. If it helps give you some context, I'll throw in that I currently am most familiar with WinForms .NET apps or very small Win32 native applications, and have avoided JS successfully so far.


A lot of that stuff is overtaken by events, but I clearly say that the essay was written in 2009. Nevertheless, if you wanted to see what I wrote in 2009, there it is. It hasn't disappeared from The Ether; there's a disturbingly large amount of information that was written only a few years ago and has totally disappeared. One of the reasons so much information has disappeared is that the websites could no longer stay running. If your website is designed to last, then the information is more likely to stay available. Yes, I know it's more complicated than that. But it's a start.


wxWidgets is a poor platform abstraction that just results in the lowest common denominator of UI.


I haven't seen any cross-platform widget APIs that allow you to build a macOS toolbar, for example, at least among the popular APIs. You mostly can only specify that the titlebar and the toolbar should be merged. Qt can draw something, but it will look like a Qt toolbar, not a Mac one.

So I don't see how WxWidgets is an outlier here.


It's fine for small apps and in-company utilities, and isn't hard to use at all if you're a C++ house.


Boa Constructor is the best RAD IDE ever! Why does no one understand this?


You could use a static website generator such as Jekyll or Hugo. Then, if the tools stop working for any reason, you always have the generated HTML that you can update.


Hugo has a single static binary. So the site will always generate the same way with the same binary.


... with the same execution environment


They already are, hand-rolled: "There are a few scripts that generate pages"


Or you could not do that. Did he not just show that his way works just fine?


Pbatengf, lbh'ir qrpbqrq zl frperg zrffntr. Fbeel, ab cevmrf. ?


Answering your question will unfortunately defeat the purpose of that text being what it is.


spoiler alert

rot13


This. Notice how simplicity is also the most important thing for users. Typography and graphics aside, it's remarkable to see how modern your layout looks.



Once worked on an enormous, very popular site built by hand in Frontpage.

It had millions of pageviews, made over six figures a month in AdSense, and had been updated so often and for so long that the owner didn't actually know how many pages there were. Had to hire someone just to index it.

Not bad for plain old html and css.


Ah, the old days of the web, when it was possible to make money via AdSense. Users would actually bookmark sites in those days, so there was no need to throw an email signup popup in their face when the page loaded. The comments would have real people conversing, not spambots pushing fake Guccis and Air Jordans.


Comments went into a downward spiral right about when Facebook got popular. It cannibalised both the regular comments and the comments from other creatives/bloggers.


As I spent half a day trying to wrangle my way through some sass grunt compiler frontend bullshit just trying to update the colour of some links on a client website, I find myself nodding sagely again. In the early days you could view source, see what was going on, copy and recreate someone else’s site, learn a whole bunch of new stuff and actually get shit done. Now, it’s all JavaScript bullshit and 100k lines of css. It’ll last about a month before it’s out of date and replaced by the Next Big Thing. HTML, css, a sprinkle of JavaScript. That’s what’s proven to last.


I'm not sure a FrontPage site would be much better. It's also a big mess of generated markup you'd have to go through manually, if you didn't have the proper FP version.


My favorite like this site is http://www.burger.com - this dude has a hilarious array of hobbies and awesome beveled button links.


He is a member of the Cherokee Nation with interests ranging from model railroading to fireflies. And he puts it all out there on his personal webpage, updated regularly since 1996. There was a time when it was normal to stumble upon pages like this one.


Have to wonder how much that domain name is worth...?


Probably expensive, but I doubt it's actually worth anything. I reckon it's much harder to build a brand when you start with a generic word.


Burger.com would be the brand


I can see it work for a delivery service type thing. For a few years, a large platform here just used "pizza.de".


You're probably right, but Burger Records is a moderately successful independent record label.



They mean USD $25,000 and not $25.


Knowing GoDaddy, $25 is closer to what they'd offer the domain owner for it. $25k would be the re-sale price.


Much of the world uses . as the thousands separator.

https://docs.oracle.com/cd/E19455-01/806-0169/overview-9/ind...


I sorta knew that. But it still feels weird for dollar amounts, as 25.000 looks a lot like 25.00 ($25, 0 cents).


From that document:

> and some countries separate thousands groups with a thin space

Thin space (U+2009) is also how you are supposed to do it when using the SI unit system, according to the SI spec. For the decimal separator, the SI standard is to use whichever of '.' or ',' is customary.


However, they should use the non-breaking space separator ' ', as it is less error-prone.


Doesn't make it smart.


"Webmaster" - I miss that term..


Ah, the time when you could actually "master" all the web technologies to keep a website up and running...


There's no reason why you can't slap some CSS (flexbox, cssgrid!) with some ES6 JavaScript linked on an index.html on a Netlify server...

Loads super fast.


You still can. We sure make things more complicated than they have to be sometimes.


I went from a page that said:

> My charge for typical business or civil work is $450.00 per hour.

To one that said:

> I am a relative newcomer to the world of turtles.

...in two clicks. I love this site.

Also learned a recipe for a quick and easy blackberry cobbler[0].

[0]: http://www.burger.com/bcobbler.htm


To this day I still visit www.scaruffi.com to read his opinions about music - the layout hasn’t changed since the 90s or even the 80s for some pages.


> or even the 80s

Probably not ;)

> Initial release 1993; 27 years ago

https://en.wikipedia.org/wiki/HTML


24 years of weekly quotes is actually really impressive.


And everything loads just about instantly haha


Ah papyrus oh how I missed you from high school.


that is the best last name in the history of the internet


But is it still running on CERN/3.0, installed circa 1993? Ours is:

  $ nc www0.cs.ucl.ac.uk 80
  HEAD /staff/m.handley/ HTTP/1.0                    

  HTTP/1.0 200 Document follows
  MIME-Version: 1.0
  Server: CERN/3.0
  Date: Fri, 14 Feb 2020 17:02:59 GMT
  Content-Type: text/html
  Content-Length: 9185
  Last-Modified: Sun, 16 Jun 2019 15:27:37 GMT
It's running on Sun Sparc hardware from the same era, and has been in active use for all of those 27 years.


I used to have a bunch of Sparcs circa the early 2000s. Sold some, recycled some; wish I'd kept at least one. The price of a working SparcstationLX is silly nowadays.


https://developers.google.com/speed/pagespeed/insights/?url=... 100 Points. Mobile First. Better than React Native.


100 points on pagespeed is not that hard with static sites.

- drop 99% of the JS (PWA, lazy-loading, infinite scroll, jQuery: you don't need any of them for a webpage), convert the remaining 1% to vanilla JS, and use it as progressive enhancement.

- use EM or % as layout width/height

- inline css, js, and svg (see the sketch after this list)

EDIT

- no webfonts!
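
For the inlining item, everything can ship in the single HTML response, so no extra requests at all (a minimal sketch):

    <!-- styles, icon, and script all inline: zero additional round trips -->
    <style>body { font: 1em/1.5 sans-serif; max-width: 40em; margin: 0 auto; }</style>
    <svg width="16" height="16"><circle cx="8" cy="8" r="8"/></svg>
    <script>/* the remaining 1%, as progressive enhancement */</script>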

The only thing that'll remain an issue are tables wider than the viewport on mobile.

My site: https://developers.google.com/speed/pagespeed/insights/?url=...


Don't know about PWA. Service workers are pretty lightweight and help with caching.

But yeah, lazy-loading, infinite scroll, etc., are all designed to cover up design flaws that impact performance. I think lazy-loading can be potentially done right, but almost none of us do anything right.

> use EM or % as layout width/height

Why? EM/REM is good for handling font sizes, but for anything else it may not make sense, and a custom font setting in the browser can break layouts if the size of boxes is based on font sizes. PX is perfectly adequate for layout, and is actually a relative unit (PX !== hardware pixel). Same for borders, padding, margin, etc. Even REM is better than EM for most cases. People who adjust the font size in their browser don't necessarily want their layout to change and potentially degrade as a result.

> inline css, js, and svg

Can be a good idea, especially if you can somehow identify the CSS used on page load and discard anything nonessential. Though maybe HTTP/2 makes inlining obsolete. IDK

> no webfonts!

Thank you! Web-safe system fonts are perfectly sufficient in 99% of cases.


> Why? EM/REM is good for handling font sizes, but for anything else it may not make sense, and a custom font setting in the browser can break layouts if the size of boxes is based on font sizes. PX is perfectly adequate for layout, and is actually a relative unit (PX !== hardware pixel). Same for borders, padding, margin, etc. Even REM is better than EM for most cases. People who adjust the font size in their browser don't necessarily want their layout to change and potentially degrade as a result.

I've had a lot of bad experiences with px, but it is true that for borders it's the only reasonable choice.

REM is not that well supported, especially in awkward browsers (Dillo, for example).

IMO EM is nicer for padding/margin; it keeps the text/layout ratio even if the font is resized, unlike px.

But point taken, it cannot be used as the only unit.
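
The mixed approach might look like this in practice (class name hypothetical):

    .card {
      border: 1px solid #ccc; /* hairline borders: px is the sane unit */
      padding: 1em;           /* tracks the font if the reader resizes text */
      max-width: 36em;        /* line length stays proportional to the type */
    }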


Gee, maybe that's sort of an indication of how far backwards the user experience on the web has fallen.


This is hilarious! Turns out 18 year old websites were mobile friendly after all.


Have you actually tried the site on a mobile device? It's impossible to read the text and navigation is hell. Wouldn't classify that as "mobile friendly".


What do you mean? It has a reasonable column width so you can zoom in and out as necessary, often with a double tap. That is, it leaves it up to the client to adjust as necessary -- in contrast with the typical mobile site, which:

1) Forces a particular size/resolution, locking out zoom capabilities

2) Has a floating header with a constant size relative to your device screen, blotting out the same real estate no matter how much you zoom. And, of course, using the same header pixel height for portrait vs landscape, making the latter practically unusable.

Yes, this site is better than 99% of mobile sites out there.

Edit: Some further comments: It's generally better to have a site that obeys the standards and thus plays nice with any client, than one that locks you into the hip designer's meth-addled decision. This site in particular works well with my extensions like VimFX for clicking links from the keyboard.


> so you can zoom in and out as necessary

That's the thing - it shouldn't be necessary. I don't do that on desktop browsers; why should mobile devices be any different? If the site is not legible, I set the desired zoom once and I'm done. There's no need to go back and forth.

Now, I can't do that if the elements stay mismatched on mobile. I have to zoom in and then back out when I want to interact with small elements - every time. It gets old when I have to click tiny links or upvote arrows more than a few times. It's jarring and not a good user experience.

> And, of course, using the same header pixel height for portrait vs landscape, making the latter practically unusable.

The floating header issue notwithstanding (I don't like them either), this is exactly why the web designers should tailor their elements to different viewports.

> It's generally better to have a site that obeys the standards

Yes, and that also includes accessibility guidelines. Compatibility with keyboard navigation is one of them, but so is the size and spacing of the controls, links, and buttons.


True. Zooming in shouldn’t be necessary. And mobile sites shouldn’t use floating headers or footers. And they shouldn’t hijack interface modes. And they should pick sizes that don’t feel like a straitjacket or make it seem like you’re peering at it through blinds. And they should be accessible and standards compliant.

But I wasn’t comparing to the three or four mobile-optimized sites that satisfy all that. I was referring to the more typical millions that don’t.

And yes, given how bad they are, I’d much rather have suboptimal design that I can recover from with a pinch or a double tap than one I can’t.

What would you cite as an example of a mobile site that is more usable than this one?


> What would you cite as an example of a mobile site that is more usable than this one?

This one being the Italian one from the submission, or Hacker News itself? I'm assuming the Italian one. Off the top of my head, sites with a good mobile version and similar structure (lots of text and links) are Wikipedia and The Guardian. One tiny nitpick: tables with multiple columns in Wikipedia articles. This is a common struggle and I haven't found an elegant way of showing data-dense tables on mobile screens.

But other than that, all the elements on both of these sites are big enough, I don't have to zoom in, I don't have to scroll sideways, there are no floating elements, and they have distinctive style. They even work well with a dark mode add-on on Mobile Firefox.


Yes, Wikipedia is one of the better ones! But even so, I still find myself switching to the desktop version whenever I need to link a specific section, or get more of the text in view, or view the article in a different language, or not lose my section when I come to a page via the back button.

I checked out Guardian.co.uk (assuming that's what you meant), and yes, it is a pretty well designed site that works great and doesn't even seem to distinguish mobile vs desktop users. But still, this is about the only mobile site I couldn't find anything wrong with. This isn't the typical 99% site.


What do you mean? The text isn't much smaller than on HN on my 5.5" phone. It's entirely readable.


HN is hardly a good example. It's terrible on mobile.


I find HN to be one of the best mobile websites. It's fast and text isn't massively oversized. You can actually fit information on screen. Many mobile websites I've seen are horrible to use, because they have very poor information density and the sites seem to be designed for 4" or smaller screens.


I tried to upvote you about HNs great mobile design, but the buttons are so small I accidentally downvoted ;)


Then zoom in, which most "mobile optimized" sites prohibit.

My further comments: https://news.ycombinator.com/item?id=22328505


It's not so bad if your mobile browser supports pinch to zoom.


Most mobile browsers support it out of the box. If the site designer set "user-scalable=no" in the meta viewport property, then that will prevent zooming. It's an accessibility issue and should be avoided.
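
For reference, the difference is a single attribute in the viewport meta tag:

    <!-- zooming stays available (the accessible default) -->
    <meta name="viewport" content="width=device-width, initial-scale=1">

    <!-- suppresses pinch-to-zoom in browsers that still honour it -->
    <meta name="viewport" content="width=device-width, initial-scale=1, user-scalable=no">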


iOS has ignored that meta tag for a while now.


I just tried, it works well (Chrome/Android on a 5.5"-something screen).


Just tried on iPad, the site works great. Did you mean ‘phone’ and not ‘mobile device’?


Would you also consider a laptop to be a "device" that is "mobile"?


You say that as if it’s weird to call an iPad a mobile device. Would you say that a tablet is not a mobile device? What do you define as mobile device, and what devices would you use to determine if a web site is “mobile friendly”?

The common definition of mobile device is phone or tablet. The common definition of laptop is computer. These despite the fact that phones and tablets are computers and despite the fact that laptops and even desktops can be moved. It’s pretty easy to find lots of examples of the common definitions. Here’s a good one: https://en.wikipedia.org/wiki/Mobile_device


Does that definition matter if tablets are probably less than 1% of all mobile devices?


Why Google something for two seconds when you can just speculate wildly? Tablets are almost 10% of mobile sales, per my first search hit: https://www.zdnet.com/article/smartphone-market-a-mess-but-a...

The question of whether definitions matter when one sub-category or subset is a minority or majority... I'm not sure how to answer that. Why would a definition stop mattering just because something different is a small subset? I must assume that a categorical term includes everything in the category. If you don't mean everything in the category, then don't use the term that refers to the category. If you mean phone, then say phone, right? I'm confused why you would argue anything else.


Depends on your audience. One of my healthcare web sites is almost 60% iPad, because doctors love them.


A landscape iPad has the same 4:3 aspect ratio as most PC monitors of that time (800×600 or 1024×768).


ohhh so pedantic. Love it.


I am sorry, you're totally right; it is pretty unreasonably nitpicky to differentiate between 10 inch screens and 3 inch screens when it comes to web UX. Smartphones do count as all mobile devices and are the only thing that matters when determining if a site is mobile friendly. And of course it's usually a good idea to broaden the argument categorically to something larger than your personal experience, to make the stronger point that most people would agree with you and the other person is obviously up in the night.


Back when Netscape 0.9 was new I had daily arguments with some of the "web designers" who insisted on using HTML targeting browser bugs and other invalid HTML tricks to optimize the aesthetics of their sites.

All you needed to do then, and today, is make sure your HTML is valid and that you don't break things on purpose ("this site optimized for MSIE1.0" type of stuff) and your site will forever be mobile and any-other-html-rendering device friendly.


Almost all of the performance-unfriendliness has been designed in. Especially for monetisation.


Just because mobile phones got the same (or better) screen resolution as 1990s/2000s desktop PCs.


Pre-frames, pre-table-layout designs are pretty good, because they are dead simple HTML - and thus they allow reflow, being responsive as a result.

This is not one of those.


The first WYSIWYG editors (FrontPage, GoLive, Dreamweaver, etc.) were heavy abusers of tables.


In 2002, 1024×768 and 800×600 were the most used resolutions for computer screens, so yes, those were mobile friendly...


More like 18 year old websites are so rare that page rating services don't even work for them.


Sure they work! It's very easy to get 100/100 when you don't use any blocking CSS or JavaScript and all images are optimized for 56k modems.


> Unable to process request. Please wait a while and try again.


I'm surprised to see <marquee> still exists and works in modern browsers. And saddened to see it updates at ~20fps, at least on Safari.

Time for a smooth, GPU accelerated 60+fps marquee implementation?
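
A CSS-only replacement can already run smoothly, since transform animations can be handled by the compositor instead of triggering layout. A rough sketch (markup and class names hypothetical):

    <div class="marquee"><span>scrolling text here</span></div>
    <style>
      .marquee { overflow: hidden; white-space: nowrap; }
      .marquee span {
        display: inline-block;
        padding-left: 100%;  /* start just past the right edge */
        animation: slide 12s linear infinite;
      }
      @keyframes slide { to { transform: translateX(-100%); } }
    </style>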


While <marquee> support is near universal, I was sad to find out that <blink> has not fared so well.

I'm sure a lot of you already know this easter egg, but if you search "blink tag" in Google, Google makes all the blink tags actually work (using JS, of course, but still).

https://www.google.com/search?q=blink+tag


I simulate the <blink> tag on a 404 page I maintain.

But I use CSS, not JS.
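
Presumably something along these lines, a common CSS-only approach (selector name hypothetical, not necessarily that page's actual rules):

    /* animating visibility in two steps gives the hard on/off of the
       original tag, rather than a fade */
    .blink { animation: blink 1s steps(2, start) infinite; }
    @keyframes blink { to { visibility: hidden; } }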


Wait, how do the rest of you test if a field accepts or rejects html?


What field accepts or rejects html? What are you talking about?


If, say, an admin text field is supposed to support and display HTML on an end user's machine.


Here is a life-saver maintained by a 77 year young lawyer for a lot of public good: http://www.drtsolutions.com/. Case law against SARFAESI, an Indian law that expedites bank recovery of non-performing assets. He updates it manually in FrontPage even today!


Wow, that page is amazing!

I hadn't seen the old Google logo in years: http://www.drtsolutions.com/drtqueries.htm The search widget doesn't even use an <iframe>. Just a plain <form>.


The page colors are ugly as shit, but so readable - modern web devs and people who make "modern" stuff, you suck compared to the sites in this thread.

Sorry, not sorry.


One of the most prolific and well-known music reviewers - Piero Scaruffi - has a website built in 1995 with a design not updated much, or at all, since: https://www.scaruffi.com


Similar to Robert Christgau! https://robertchristgau.com/


He even reviewed one of my favorite albums released recently: https://www.scaruffi.com/vol8/bentknee.html


I've come across this site many times when looking up different bands...the breadth and diversity of bands and genres covered here is truly amazing!


That's a great website, thank you.


> This website does NOT use cookies. Period.

Love it!


I sympathize with the author. I built my maths site just 2 years after this one, when I was in high school. Ever since, I've only been adding material, and occasionally moving old stuff into subdirectories; other than that, it's the same old Geocities website made with FPE, except it's now hosted on a university server, carries my academic title and office, and has no more colored background. Oh, and I now edit it with Notepad++ and track it with git.

I've had plans to rebuild it for the last 8 years or so, to make it better and slicker and easier to navigate (as it stands, my new papers are mixed together with my scribe notes from undergrad). But I never figured out how to achieve this without also requiring JavaScript or relying on tools that may not survive the next decade and that I cannot tweak to my needs without learning a new programming language (hello Jekyll, hi Hugo). Nor did I ever find the Right Way to structure it; move one thing to the front and something else gets harder to find. I guess it will survive me.

Makes me a lot less judgmental when I see another academic website that can trace its lineage back to geocities and angelfire.


My favorite "Retro awful": Site:

https://www.lingscars.com/


I remember how she was ridiculed on Dragon's Den, yet she's the one employing a bunch of people and running a successful business. I remember reading that she even hired someone to do some maintenance in the town, because the local council couldn't afford it anymore.


This one looks very retro, but the code is actually quite modern. It has custom fonts, CSS animations, gradients, etc., and no tables.


Also, beneath the intentional craziness, it's very clever marketing.


It's like the visual equivalent of chiptunes.


It's impressive the amount of content inside! There are countless pages about literature, religion and physics. It's a good reminder of the original goal of WWW: share information.


> It's a good reminder of the original goal of WWW: share information.

That is what I miss the most about the old web. We wanted to build something better by sharing knowledge. And for a while we did. Then mainstream came and corporates took over.


Well, to be honest the corporations were always there. It's just when the marketers discovered "cyberspace" that everything went to hell.


But that's so not Web 2.0/3.0. You gotta add more padding so we can create smooth scrolling. /s. I am a culprit of this too now, btw.


In my ~17 year career as a professional web developer and consultant, I'm not sure that any technology has made me more frustrated and miserable than the days when I had to help people who insisted on using Frontpage to build their websites.


Remember the Front Page license.

Originally, Front Page had a four page license. It specified that if you use Front Page to create a web site, you cannot disparage Microsoft, Expedia and a list of several other Microsoft owned properties.

So with a license like that, I can't assume that any site created with Front Page is unbiased when it comes to a list of various Microsoft owned properties.

After the slashdot effect (long ago) Microsoft removed this from the license.


There is a search engine dedicated to finding "classic" websites:

https://wiby.me/

Click the "surprise me..." link to see a random one.


This is a fantastic site. I'm a little worried about using the "surprise me" feature at work, but I will return to this in my free time. I wonder what website attributes it looks for when it indexes.


This is brilliant! The first "surprise me" made me smile :)

http://toastytech.com/evil/index.html


That could eat up some hours. I landed on a model airplane site and started getting sucked in before realizing I have work to do.


With the "surprise me" button I found the Berkshire Hathaway website.

https://www.berkshirehathaway.com/

This is a really cool search engine :D


There are so many old Italian sites with this design.

My favorite in high school was http://ripmat.it

That site is the only reason I managed to learn maths in school.


Those backgrounds are fantastic. Just right arrowing through them... is a trip. http://ripmat.it/mate/a/ac/ac5.html


This site is very fast.


My personal web page is from 1992 and updated occasionally. This page is preserved as it was in 1994: http://timonoko.github.io/alaska . It started as a Gopher page in 1992 and I just moved the associated pictures into it, without truly understanding formatting and all that shit. Some dudes on Usenet told me about <p> and <img> tags.


The biggest drawback of sites from this era is that they don't reflow on mobile screens. On a desktop they still work as well as they ever did. I'm still searching for a good WYSIWYG HTML composer that can generate clean, responsive pages. It seems to be a problem the big tech companies have insufficient incentive to tackle, and the only software that comes close is BlueGriffon.


> I'm still searching for a good WYSIWYG HTML composer that

You need a dev team implementing Agile for react-native-ux with CI/CD capabilities and devops.


That's a client side problem, not a server side problem. The rendering and presentation of a webpage are entirely up to the client, absolutely nothing dictates that a page should look a certain way on a certain client.


There are a ton of attributes in the HTML that dictate how the website should render - it has a bgcolor and a margin on the body tag, a width and height on the main table, center tags all over the place, etc. Suggesting that a browser should ignore the HTML spec and do something else would completely destroy the web.
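
That is, markup in this style, where the presentation is baked into the document itself (a generic period example, not this particular site's source):

    <body bgcolor="#FFFFFF" topmargin="0" leftmargin="0">
      <table width="760" height="100%" border="0" align="center">
        <tr><td><center><font face="Verdana" size="2">content</font></center></td></tr>
      </table>
    </body>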


On the other hand, reader mode suggests that sometimes ignoring the way the site wants to be presented is a good thing.


That's a user-derived choice. I'm not saying users shouldn't be able to change the way a site is laid out if they want to. I'm saying that by default it should follow the HTML spec.


User agents already can ignore CSS and support user-defined stylesheets and even JavaScript. None of this is new, and it's a failure of our profession that this is abnormal and produces spectacularly poor results for users.


Users should be able to override the default behaviour of their browser if they want to, but the default behaviour should still be defined by the HTML spec rather than the browser vendor.

It's really frustrating when two browsers implement important parts of the spec differently and push website developers to work around the behaviour of one browser or the other with browser-specific code. It doesn't lead to sites being rendered differently as developers embrace the variety of user agents. It leads to people adding a "This site is best viewed in Netscape Navigator" gif or "This application requires Chrome" on login pages. Those are bad things.


Maybe the solution is to let go of this notion that every device needs to render exactly the way the designers want and embrace that difference.

I've been on the internet since the mid 90s. I'm well aware of what was, and it was that way because people then, as now, wanted things to look and act exactly as they wanted, and didn't embrace that not everyone wants or needs your carefully crafted graphical design.


I don't have any notion that sites should be exactly the same in every browser, but they should be approximately the same. Having two browsers render the same HTML in completely different ways would be very odd.


>Having two browsers render the same HTML in completely different ways would be very odd

Or completely normal? Why shouldn't I bump up my minimum font size to help read text and reduce eyestrain? Ditto for fixing low-contrast text.


You should, and the browser should let you. But if the text is set to 14px in the HTML or CSS and your browser is set to display it at the defined size, it should display at 14px. Browser vendors shouldn't decide to ignore the spec's defined size and do their own thing instead. That's what was being suggested.


Then again, the whole page is laid out in a table element, so I could see why mobile browsers won't reflow it perfectly.


Well, I'm not so sure -- have you tried viewing that site via a WAP proxy? That's the 2002 way to solve the problem.


My 83 year old dad has updated his website since the mid 90s. It's actually pretty interesting to look back at it in the Wayback Machine; the design is exactly the same in '97. He'd really get a kick out of it if someone commented on one of his articles. http://aoi.com.au/


Check out my Dad's from 2002. He's still using it as an e-commerce site, regularly getting orders and directing customers to it.

Deleted URL thanks to friendly advice


hmm, you might not want people to know about your dad's e-commerce site built in 2002...

EDIT: checked the website; "add to cart" sends you directly to PayPal and there doesn't seem to be any account system.

Still, you should be careful about which communities you share this kind of information with. Hint: think of the H in HN.


Good point. Appreciate it


No problem, I know how it is to just want to talk about a personal story without thinking about the information you're letting out for the "public" to see :D


My father still updates his website with FrontPage (he recently had a 6 month outage because he inadvertently deleted the Windows XP vmdk on his system, but I recovered that for him). He's 75, and isn't interested in converting or learning anything new at this stage.

The funny thing is for years his home-made site was the top google hit if you searched for “hill's criteria” (See Hill's criteria of causation). His site is http://drabruzzi.com/


Hmm, how do you manage to acquire a legal copy of FrontPage (even Express) these days for your dad?


The later versions "Sharepoint Designer" are free from Microsoft and still available on their downloads page.

FrontPage may be old enough to consider it 'abandonware'; a quick Google for "free frontpage" turns up a lot of downloads, including one from Kean.edu with an embedded key.


He still has the install disk! Though my repair was to pull the vmdk off his old, broken laptop to get him working again.


Was FrontPage ever sold digitally? I wouldn't be surprised if my parents still had their disc copies of Office 95-XP (some even legal thanks to PC bundle deals).


Most don't (shouldn't?) care about the legality of 20+ year old software that's no longer being sold.


My website is still as it was in 1999, though it received some design updates (and a blog section) two years ago. However, there's still some original content, some even older than the website itself. E.g., see this 1998 demo for what we may now call a single page app, entirely rendered in JS from central data files (but using frames – well, it was the 1990s):

https://www.masswerk.at/demospace/relayWeb_en/welcome.htm

Slogan: "Microsoft keeps talking about Active Server Pages – We're offering Active Client Pages"

Mind the charts section, rendering graphs by outputting tables with tiny images using `document.write()`, since the canvas element wasn't even dreamt of. (Displaying charts was a tricky business then. Usually they were rendered server-side as GIFs, where they caused heavy load. The alternative was Java applets, which had an enormous effect on client load and delayed page display quite considerably while the JRE was starting up. Enter JS to the rescue…) Also, note the period design, including marquee tickers, custom fonts from GIFs, etc…
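
For anyone who never saw the trick: a bar chart in that style boils down to stretching a one-pixel image per data point, roughly like this (a reconstruction, not the site's actual code; dot.gif stands in for a 1x1 GIF):

    <script>
    var data = [120, 340, 210]; // hypothetical values
    document.write('<table border="0" cellspacing="2">');
    for (var i = 0; i < data.length; i++) {
      // the GIF's width attribute becomes the bar length
      document.write('<tr><td><img src="dot.gif" height="12" width="'
        + data[i] + '"></td><td>' + data[i] + '</td></tr>');
    }
    document.write('</table>');
    </script>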


My personal/hobby business web site (https://www.rlvision.com) is based on code from 22 years ago. It's built with tables, because that's how you did things back then. The age shows, but I haven't found a reason to rebuild it yet. Simply put, it works. It may not be mobile friendly, but the goal is to make my Windows software available, so my aim is desktop users.


It's funny to me that one of the reasons people gave for abandoning tables was file size. Now we have everyone download almost a megabyte (or more) of JavaScript to render a few KB of HTML.


I don't recall page size being a reason. I recall TABLE layouts being a lot smaller than most of their alternatives at the time (FRAMESETs in particular come to mind, because we knew HTTP connection overhead was a thing even back then, and needing separate files for each individual website "part" felt like a huge bandwidth waste).

The big problem was always Accessibility-related semantics. Websites laid out in TABLEs were often quite confusing to screen readers, as TABLE has a lot of supposedly important semantics in how it should be read/engaged with and using a TABLE for layout follows none of them. (What does a table header mean in a layout? Most layouts wouldn't have good headers. How do you describe what a table column is supposed to be for without a column header?) It's a shame that narrative was never clear enough that Accessibility was always the big reason TABLEs were considered a Bad Idea for layout.

(Speaking of downloading a megabyte of data, I recall how long I felt that a 1.44 MB floppy was the right size restriction for an entire website. If it was bigger than a floppy, you were probably doing something wrong. I stopped counting floppies a long time ago; that past self might be ashamed at how many floppies a typical website downloads these days.)


This website looks extremely familiar. I think I visited it about 15 years ago and, while I'm not completely sure, didn't you have a nice drawing and photo editing program there? I can't remember its name, but it was the best ever, until it was disappointingly discontinued for lack of paying customers :(


Yes, ArtGem (https://rlvision.com/artgem_about.php). We were a couple of guys who tried making shareware, but sadly failed. I continued running the site to host my own utilities, some of which eventually turned into shareware. I don't earn much money from it, but I enjoy making software, and it makes me happy when other people find it useful as well!


So nice to hear from the developer of some software I very much liked and used as a kid. I especially liked the smudge feature. You definitely made me happy and I thank you for that.


Replace Genius looks great - I will be trying it out soon!


happy user of RLVision Artgem - thank you !


Holy, that was it - ArtGem! Perfect middle ground between Photoshop and Paint. Great features, pleasant feel to use.


Only 520 lines of HTML. And readable! view-source:http://www.fmboschetto.it/


This guy has a Frontpage-generated site too, and it's full of useful Win32 programming tips; he replies if you email him too!

http://flounder.com

Nothing wrong with readable content, regardless of the generator. In fact, I like reading his site precisely because it is speedy to load and render, and because it has content (unlike, for example, Apple's developer documentation).


I'm surprised no one mentioned it here: before Front Page was a Microsoft product, it was created by an independent company, Vermeer. But as many have pointed out, it produced horrible code.

My favorite editor of the day was a "hand coder" called HomeSite:

https://en.m.wikipedia.org/wiki/Macromedia_HomeSite


My personal favorite, in a similar vein (an Italian, although the site is in English; Frontpage; still updated), is https://www.luigicases.com/. He makes leather straps and cases for cameras. Famous in the classic camera community. It's the only active site I can think of that still uses frames.


Those pics tell a glamorous story worthy of a movie.


I have a 'personal' website which is also about 18 years old. It was the first website I built, while in elementary school, also in Frontpage. I uploaded it to Geocities or something like that – I think it was Yahoo-related hosting, I can't remember exactly, but it was free.

Some 'fancy' JS effects no longer work on the page, but it is still up. I forgot about it, then remembered it a few years ago and checked to find it still online. But I can't remember where I could log in to see the files, or what the credentials are, so it makes me giggle that it will stay up for who knows how much longer as a small part of my past :)

http://dzigi.itgo.com

http://dzigi.itgo.com/o_autoru.htm – the "about the author" page, with a bio and pic, haha


Geocities got taken over by Yahoo (I think there was an intermediate step), and they added a JS widget to free pages; there was a hack to hide it. IIRC they started allowing PHP, and not just SSI, around that stage.


My first site is still live. It went live in 1997, hand-coded (using tables). It went through a few redesigns, but the 2000 version has been left online as a fixed digital artifact of the time: http://pwp.detritus.net/


Seriously very nice design for 2000.


I made a lot of sites in Frontpage in high school.

My favorite bit was the rollover buttons, which used a Java applet to work.


My personal website is also around 18 years old. I actually do not remember exactly how old it is. I did not have a domain at first and hosted it on AOL or something.

Here it was: https://web.archive.org/web/20030908174016/http://www.benibe...

But in 2005, I made a complete redesign: http://www.benibela.de/index_en.html

The backend went through a few reimplementations: individually made HTML files (with FrontPage Express or something), a template tool written in Delphi, another template tool written in Java, and a complete XQuery interpreter written in FreePascal.


Here’s an archive link in case this gets hugged to death:

https://web.archive.org/web/20200214134509/http://www.fmbosc...


It's actually quite hard to hug static HTML pages to death, unless it's done on purpose with things like Slowloris.


One of my favorite websites is clocking in at 25 years now: Marathon's Story [0], a website dedicated to the lore of Bungie's Marathon series.

[0] http://marathon.bungie.org/story/


This site is best viewed with Netscape Navigator


I was looking for the web ring links.


2251 Mb size!

"Qui c'è una applet Java. Mi spiace che il tuo browser non le supporti" = "Here's a Java applet. I'm sorry your browser doesn't support them."

The marquee still gets animated :O

"Questa pagina è ottimizzata per un formato 1024 x 768 pixel a 16,8 milioni di colori, carattere medio" = "This page is optimized for a 1024x768 pixel format, 16.8 million colors, medium-sized font."

Still, apart from the ancient tooling, it's an example of the kind of personal wiki that periodically comes up on HN. There's a lot of content!


I just wrote to the gentleman pointing him to this discussion :)


Another mention of this fantastic cycling website: https://www.sheldonbrown.com/harris/


As an Italian, reading this site is absolute bliss. There's just so much to discover. I really suggest non-Italian speakers automatically translate it with Google and dive in.


I can't believe how well Google translates this. Is there something about Italian that lends itself to English translation, or is Translate getting this good with other languages too, I wonder?


I visited this on mobile expecting it to be a laugh, but was surprised to find that it's actually amazing!

You can see the whole page in a single column, and just pinch-zoom to the bit you're interested in to read/interact. Scrolling downwards and sideways to pan around works fine; super intuitive. The UX of this is so great, it feels just like that original iPhone demo [1].

...why don't we do this again?

[1] https://youtu.be/vN4U5FqrOdQ?t=2530


Anyone else still sad over the demise of FrontPage Express? It did everything I needed at the time, it was free, and really easy to use. The HTML wasn't as bad as FrontPage's, either.


I got my start with web dev using Netscape Composer, which was a similar enough tool. SeaMonkey, the successor to the Netscape and Mozilla suites, still includes it to this day, and it works well!

https://www.seamonkey-project.org/


Dude's gonna wonder why he had a sudden 6500% jump in traffic.


My personal site, http://don.dream-in-color.net has been at that URL (and with this design) for over 20 years. The reading list (http://don.dream-in-color.net/books/ ) dates back to a page that was originally served over FTP and will turn 25 years old in May.


If you like this, check this one: https://www.gratiz.nl/ Updated every day :-)


My page (https://www.towardssoftware.com) isn't quite that old, but it gets the job done with an incredibly simple content manager: plain HTML/CSS editing.

There is almost no JavaScript, and I have to say, it's worked wonderfully so far. I have tried Hugo and a couple of other tools to manage stuff, but for a lot of blog-type stuff, plain HTML just works!


I had this perception that those spinning GIFs and moving text made pages crazy hard to parse, but I was pleasantly surprised that this site seemed simpler and easier to parse than half the sites today, with their pop-ups, notifications, and blocking modals. Is it just me, or are the notifications and modals that have become so prevalent really degrading the web browsing experience today?


It used to be fun to load the Hamster Dance page on different machines and see how much it would slow each one. And that was just from a bunch of animated .GIFs.

Taboola, Facebook thumbs, Twitter counters, and the like are the Hamster Dances of the 21st century.


This is pretty cool. I had to look Frontpage up! Logic dictates that there must have been a point in time where the prevailing opinion shifted from "uses ancient UI" to "has a cool retro feel". Probably all technologies go through this? Like vinyl becoming cool a few years back. It has happened with Flash games recently. Is there a name for this?


"Nostalgia".


I think they meant a name for the process of something becoming the matter of nostalgia.


time...


I started with FrontPage 98 in '99. Moved to a self-written PHP CMS, then WordPress, then back to static HTML.

Had I stayed with FrontPage, my life might have been simpler - porting 20 years of content is not simple - but I would have missed out on learning a lot of HTML, CSS, PHP, Python, MySQL, character set conversion, MySQL vs UTF-8, etc.


I really have a soft spot for these kinds of sites. People often have pretty interesting and unique content on them. I always try to help by keeping things very simple to maintain but just a bit better, like turning the menu into a PHP include and then replacing the top part of every page once.


I learned Frontpage in 2000 in a King's Cross (Sydney) internet cafe where you paid 2 AUD for unlimited time...but you couldn't leave, not even to go to the toilet.

That html went straight to Geocities.

The feeling of power, of being part of the minority of people who could actually publish something on the net, was amazing.


A company in Germany called Arcor still had my band's Frontpage website from 2001, which used Frontpage server-side extensions, online until about 4 years ago. I couldn't find the FTP password to download the source code, so it died when they finally pulled the plug.


I generated my website at server.giessmann.net just last year using Lotus FastSite and the Geocities GIF archive. I included my fax number, ICQ ID, and a PO box. Of course there are 9/11 conspiracy theories and a banner to download the latest Netscape browser.


http://explorermag.com/

My 21-year-old site focuses on Windows NT and the then-upcoming Windows 2000 release. Back when MSFT focused on operating systems. Billg still in charge!


This really takes me back, in a good/nostalgic way. That 3 column layout with a header on top was the go-to layout for content heavy sites.


We are talking about a God-given website. Of course it's old and still ongoing :P

Jokes aside, the first thing I read was the sentence "here there is a Java applet, sorry your browser doesn't support it" :D Which is funny, after all.


Not to forget the geocitiesizer:

https://www.wonder-tonic.com/geocitiesizer/

"Make Any Webpage Look Like It Was Made By A 13 Year-Old In 1996"


Mine, in plain HTML, is almost 25 years old. I have to admit that I did change the layout a little through the years, but it has been rather constant, because updating 974 HTML files is not something that is easily done.


It's pretty easily done? Strip out everything but the body, make a wrapper to include the pages? Any static bits you can search-and-replace; that's what I used to do before discovering server-side includes.


I am talking about 974 HTML files of which probably more than 95% is static content. Doing this manually would take me about a year (considering the time I have available). So maybe I should develop a script/parser to process the contents and generate them as server-side includes? But what is the benefit? That I can change the layout more easily, while first having to learn how to work efficiently with server-side includes? (Right now I just upload my HTML files through FTP to my very cheap hosting provider.) And why would I? I am not interested in the layout, just the contents. I am not writing the website for a broad public; it is mainly because I like to record events (big and small) in my life. I guess I myself am the most important user. Sometimes I can surprise people by telling them the exact date that I did/experienced something.
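For what it's worth, a script like that doesn't have to be big. Here's a minimal Node.js sketch of the kind of one-off conversion being discussed (the file names and the {{BODY}} placeholder are hypothetical, and the regex approach is fragile – a real HTML parser would be safer):

    // One-off conversion sketch: pull the <body> out of each legacy page
    // and re-wrap it in a shared template, writing results to a new dir.
    const fs = require('fs');
    const path = require('path');

    const template = fs.readFileSync('template.html', 'utf8'); // contains {{BODY}}
    fs.mkdirSync('converted', { recursive: true });

    for (const name of fs.readdirSync('.')) {
      if (!name.endsWith('.html')) continue;
      const page = fs.readFileSync(name, 'utf8');
      const m = page.match(/<body[^>]*>([\s\S]*)<\/body>/i);
      if (!m) continue; // skip files without a recognizable <body>
      fs.writeFileSync(path.join('converted', name),
                       template.replace('{{BODY}}', m[1]));
    }

Whether the result is worth it is a separate question, of course; for a contents-first site updated over FTP, leaving it alone is a perfectly defensible choice.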


One of my favorite pieces of software, which I still use to this day, is a 20-year-old version of SpaceMonger.

https://i.imgur.com/XMwNRR3.png


I miss tools like FrontPage/DreamWaver and gorgeous Flash websites.


Agreed. The modern-day version of those would be Scratch (https://scratch.mit.edu/) which kids use to start exploring coding, but it's not the same.


You're right, it's not the same. Scratch is too Computer Science-y for something that can be as simple as a website. Text, images and video, plus hyperlinks that link one page to another. That's enough for the majority of people to share information.

I really hope something comes along that reinvigorates the public's interest in creating content that resides on their own website, rather than a walled-garden social media account.


Never missed FrontPage/DreamWaver..


DreamWaver is an appropriate name...


Another classic from an electronic music pioneer, author of the original TRON score: http://www.wendycarlos.com/


Also of note, this website is still running despite the HN hug of death.


Its.So.Quick.

I love the speed of the page! And seemingly nobody is eavesdropping on me.



One made with Word and still updated: http://villemin.gerard.free.fr/


I love that you kept the style too. This brings back good memories. I learned HTML using Frontpage, and GIFs were a must in my Geocities-hosted websites.




I made my first web site with Frontpage, and the big leap for me was learning about nesting tables within tables. Game changer.


Check out Butkus.org - my dad has had it since before 2000 and updates it regularly, with an ancient copy of frontpage.



Here's a real gem: https://bible.ca/


I love the old style of (personal) website like this. It seems both nostalgic and maybe a bit more authentic.


The content is super wholesome as well.


Frontpage, wow, that was my go-to back in the day. Then I saw the markup it was producing – yikes!!


This is dedication in content creation and maintenance. We can all learn something here!


The source is refreshingly sparse and tidy which was kind of jolting for a second.


You're telling me their site still works in 2020 without being served as a server-side rendered React app, with data provided by several Node.js microservices, containerized and deployed to a Kubernetes cluster, and accessed through a GraphQL interface? IMPOSSIBLE!


Only someone with unwavering faith would maintain a Frontpage website in 2020 :)


I just love the footer "optimized for 1024x768 @ 16M colors" :)


and it loads at light speed


Hosted in Italy too!


Which makes the speed even more surprising!!


The mass of the Alps cause a dilation in space time


<bgsound src="ue.mid" loop="-1"> is priceless


Wow, that loaded quickly! I wonder what he's doing to optimize it.



Anything not text-based is problematic. But beyond that it is hard.


And it loads faster than any of the crap SPAs that are en vogue today


"UFO's don't exist" - untrustworthy site.


It loads so fast!


What is the oldest website still standing?



Now that's a responsive layout.



This one is mobile friendly as well.


CERN lab site.


Frontpage: the original no-code software.


Does anyone know what year the <marquee> tag became deprecated? I'm surprised my iPhone renders it.


Where can I find more sites like this?


Try here: https://wiby.me/ and hit the 'surprise me' link.


NeoCities has a collection of sites with that old-school Geocities styling.


I had forgotten all about Frontpage!


Ahahah, from Lonate Pozzolo!! I am from Bellinzago and this is just mind-blowing.

Great website!


I love it


SLAPP LIKE NOW!


I recently did a history of my old websites:

https://battlepenguin.com/tech/a-history-of-personal-and-pro...

Most of the content is still there, but it's been shifted between static pages, Rails, Wordpress and now Jekyll.

It's neat to see one of these gems still out there: a picture of the 90s web that's still functional and being used. Too many of these sites are lost, available only in the Internet Archive.


I took a similar journey over the years from static pages to custom static generators to PHP to Drupal to a custom Django-based blog engine to Jekyll/static pages.

It's interesting, because I'm sometimes sad I lost the code for some of those old versions. Those early PHP and custom static-generator codebases would be interesting to revisit with today's ideas, even if just to laugh about. (But also because I know there's probably not-great blog content lost with them.) One of the "custom static generators" I recall was actually a really early not-quite-SPA JS app. I remember it ran really slowly in browsers at the time and, worse, got slower with each new piece of content added, but these days I wonder if it would seem fine on modern JS engines. (I've got a feeling the only thing I'd need to change would be to swap `document.write(stuff)` for `element.innerHTML = stuff`, and it'd perform quite well today.)
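Roughly the change in question, for anyone curious – just a sketch, assuming the page has a container element to receive the markup (the id and data are placeholders):

    // The old way: stream markup into the document while it loads.
    //   document.write('<ul><li>' + items.join('</li><li>') + '</li></ul>');

    // The swap: build the string once and assign it in a single step,
    // assuming a <div id="content"> exists in the page.
    var items = ['first post', 'second post', 'third post']; // placeholder data
    var html = '<ul><li>' + items.join('</li><li>') + '</li></ul>';
    document.getElementById('content').innerHTML = html;

On a modern engine the string building is cheap, and a single innerHTML assignment avoids the repeated document reflows that made the original approach crawl.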


The spinning e-mail symbol GIF evokes pure nostalgia, bro. Came here to laugh, not to feel.


I’m looking for the little dude with a shovel “in construction”.


..."under construction"


Personal websites... are something that screams mental illness, as does writing YouTube comments...



