
I have a Prime G2 and the RPN mode is just bad. It feels like it was an afterthought, and some things just don't seem to work properly in that mode.

My 48G+ is a much, much better RPN calculator.

The Prime G2 is also missing the equation library from the 48G+, which I found weird. Maybe they expect you to download what you need instead of using space in Flash for it.


That's my take on the RPN mode too, but I like the algebraic mode, and I agree the 48 series is way better for RPN.

(My favorite HP calc was the 28, which I bought when the 48 was available. I discovered soon afterwards that a powerful spring will eventually break the battery door and it seemed to be a problem that no engineering student managed to solve.)


I use my 32S (lives on my monitor pedestal next to my 8087 chip) weekly. When I was doing floating point/integer bit twiddling the 'base' key was my deeply appreciated friend. But that was 36 years ago. RPN (if trained into the brain) really is faster and I love how the register transfers work.
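For anyone who hasn't tried it: evaluating (3 + 4) × 5 needs no parentheses and no equals key, just

    3 ENTER 4 + 5 ×

and the intermediate 7 sits on the stack until the multiply consumes it.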

Nowadays when I'm not feeling especially tactile I tend to reach for the Free42 Android app, which lives in a stack with the 32S. If I ever get bored, I'm going to relearn how to program them both. Next time around on the wheel, likely, if I have hands.

I've replaced the battery on the 32S twice. Suck on that, modernity.


I never used an Atari computer, nor did I know anyone who did, but I always wonder what the world would be like if Windows and macOS didn't "win".

If Atari and Amiga had won instead, what would the world look like?

What would the server world look like? Would there be some weird "Amiga Server Enterprise Edition"? Would servers just be Linux without any meaningful competition?

Would Atari have shaken the world by introducing a new CPU that resulted in amazing battery life compared to the Amiga competition? Would some of us be using AtariPhones? Would Android be a thing?

Would retrocomputing folks talk about their Windows 3.1 boxes the way that Ataris and Amigas are currently talked about?


I'd assume it would have been pretty similar, but we'd be running on Motorola CPUs (or a descendant of them).

The platforms would have needed to be opened up to clone builders to reach critical mass.

Amiga was a lot like DOS/Classic Mac OS, single user, unprotected memory... we would have seen it extended incrementally, like Windows 3.x/9x, until a rewritten version with the right stuff took over (like Windows NT/2000/XP did).

Someone like Linus would have still likely written a UNIX clone and open-sourced it.

The minicomputers and UNIX servers/workstations would have still hung around for a while. The real trick is that the Amiga CPU and the rest of the hardware would have needed to keep improving: catch up in speed, reach 64 bits, gain SMP...


> Amiga was a lot like DOS/Classic Mac OS, single user, unprotected memory

AmigaOS actually implemented real preemptive multitasking and was much more "modern" than MacOS and Windows from the same era, and a lot of things were actually Unix-like! Comparing it to DOS is like an insult :) But yes, memory was unprotected, because most Amiga CPUs didn't have an MMU (but if you did, you could use this: https://www.nightvzn.net/portfolio/web/amiga_monitor/archive... ).


Amiga engineers were quite fond of UNIX; much of the design came out of their search for better ways to build graphical systems.

The initial game focus was the outcome of various factors during Amiga's transition from idea into a shippable product.

There are some vintage Q&A sessions that go through this.


> The platforms would have needed to be opened up to clone builders to reach critical mass.

Not that much. More standardization on the Unix side and we'd all be happy.


It would have been a very different place.

In the IBM compatible world, the clones drove down the price then drove forward progress. It is doubtful that much of a clone market would develop in the Amiga/Atari world since the parent companies were already competing against IBM compatibles on price. Without clones to break free (as happened in the IBM compatible world), there wouldn't be clones to drive forward progress. I'm not even sure cloning the Amiga would be practical. Apparently Commodore had enough trouble "cloning" the Amiga (i.e. developing higher performance, yet compatible machines). Without the clones driving progress, companies like SGI and Sun would likely still be in the picture.

If Amiga/Atari domination somehow did happen, I suspect the CPU situation would be flipped around, with Motorola having both the incentive and finances to continue on with a fully compatible 680x0 line of processors and Intel chasing after its own equivalent of an 680x0-to-PPC transition.

As for the retrocomputing thing: DOS/Windows 3.x nostalgia is very much a thing in today's world. In that alternate reality, they would likely be higher profile (as Amiga/Atari are today).


If nothing else, it would have been nice to see Digital Research and Gary Kildall get the last laugh via Atari computers taking off.


What a terrible loss it was when he died. He was the Bill Gates we needed.


I think if Atari and Amiga had won, the world would have a lot more focus on the media side of things; the playful, graphical, musical expression would be more evident. I don't think we'd have spent as long in the beige-box era, and maybe we'd still have little colorful iMacs running something Haiku-esque, with Enlightenment on top and some breakcore tracker music on bootup.

This is my fantasy, you're welcome to enjoy it while you're here and remember, no shoes on the couch.


> the playful, graphical, musical expression would be more evident.

This was also the Mac's distinguishing feature at the time. It still is, to some extent. A lot of what drove mass adoption of home computers was that everyone wanted to bring the same computing environment, i.e. OS, they used at work or at school home as well. This was DOS/Windows or System 7.


One reason was being able to bring home "free" software from work.


This was also true of OS/2, but that one didn't work out so well.


> Would there be some weird "Amiga Server Enterprise Edition"?

I suspect Commodore's SVR4 port would have played a role: https://en.wikipedia.org/wiki/Amiga_Unix

Likewise, Atari had a Unix port on the TT workstation: https://en.wikipedia.org/wiki/Atari_TT030


We also had (well, have) a Unix-like extension for the shipped TOS/GEMDOS/GEM OS in the form of MiNT/FreeMiNT, which went on to form the foundation of the official MultiTOS, which unfortunately died along with the ST when Atari Corp folded.

It had/has a POSIX(ish) API, device files, mountable filesystems, pre-emptive multitasking, TCP/IP, etc. while still being able to run classic TOS binaries.

You can still run this in an emulator or on real hardware today, and it still gets active development under the GPL.

e.g. https://www.youtube.com/watch?v=GOkDuLmgWFo


That looks like it would be a ton of fun to try. I wonder how that would run on my MiST!


Quite well, though it benefits from a faster machine with an 020 or 030 and more memory.


Atari HATED to share anything with people outside the company. They couldn't even help developers build software for their machines, let alone let someone copy & commoditize their hardware. The Apple II was incredibly open and extensible, and successful. Macs were not, and were never more than a minor player until computers shifted to a mobile, general consumer product and Apple out-executed everyone and leveraged their single ecosystem.


> If Atari and Amiga had won instead, what would the world look like?

If the PC didn't establish itself, Commodore might have opted to release the 900 as a Unix workstation. With any luck, they'd port Coherent to the new Amiga and it'd be a Unix machine from day 1.


From watching YouTube, Ataris may have been the best computer to be exposed to in one's teenage years, for so many reasons. It was very capable and efficient for its CPU. It still might be, for learning to build within constraints.


Windows became big because it was the successor of MS-DOS, the OS for the IBM PC, whose architecture became popular because of cheap clones.

This wouldn’t happen with Atari or Amiga.


> This wouldn’t happen with Atari or Amiga.

Atari machines were cheap. Amiga not so much, but, eventually, they would catch up.


The server would still be UNIX, but the big iron UNIXes, not GNU/Linux with some BSDs and Windows in the mix; there would be no reason for Linux.

ARM was introduced by Acorn, even if there was some Apple money.

As for the rest, we would keep having nice graphical desktop environments, with interesting paradigms instead of trying to fit GUIs into UNIX clones.

And vertical integration, plenty of it, as it has become the norm again nowadays.


> The server would still be UNIX, but the big iron UNIXes, not GNU/Linux with some BSDs and Windows in the mix; there would be no reason for Linux.

Linus would still get frustrated by the AIXes and Solarises of the world, AT&T would still try to get rid of any BSD being publicly offered, and Linux would still be invented under the GPL and get the GNU userland for free.

The biggest difference is that it would be currently used on more architectures than the two it's mostly used on these days.


Without the PC as it was, most likely Linus would never have had Minix to play with, or been frustrated with his MS-DOS compatible PC, so the genesis wouldn't have taken place.


Without a free and open Unix, Linus would be frustrated on whatever platform he would be using.


> And vertical integration, plenty of it, as it has become the norm again nowadays.

Except, not really. If you work at a startup or business that has to deal with "vertical integration" at cost, your first goal is to get rid of it. Fly.io, Datadog, managed K8s - all of this stuff is the first to go when scaling a business that needs to be profitable. Business-conscious users know they're being fucked over whether it's Apple, Microsoft or Oracle - you can't market it as "integration" to people that see the monthly bill.

And in the EU, vertical integration from large companies that can EEE their competitors is under extreme scrutiny now. Many execs have exploited it to gain themselves an unfair advantage and deserve the same scrutiny Microsoft got for their "integration" case.

If American governance shows the same competence, "vertical integration" will be the most feared business model of the 21st century and innovation will be put back on the table. For everyone.


The only PCs left that aren't subject to vertical integration are gamer PCs, and even those are losing to consoles, as people focus on other devices for their daily computing activities.

The large majority of the population is using unupgradable laptops, where at most the memory sticks, hard drive, and battery can be changed.

I haven't touched a desktop at work since 2006.

Some even make do with a tablet for their computing needs.

Likewise, for those that have servers in-house, those are no longer PC towers under a desk, but rather pizza-box slices in server racks.


I am a huge Amiga fan, but the Amiga was going nowhere and was never going to win. The OS is just as terrible as classic Windows and MacOS from a reliability standpoint; yes, not using a message pump for timeslicing was a really nice property, but in most ways the design was _worse_ in terms of any hope of eventually getting memory protection in place.

I love the Amiga - it represented a unique point in time that coalesced a lot of interesting technologies and people together trying to do something interesting - but it was as far from a technology that had long term potential as you could get, pretty much in every way.


Ironically, the Atari ST's OS -- much maligned as 'primitive' by Amiga users -- was not like this. It had a proper syscall mechanism through TRAPs -- so proper 68000-architecture memory protection was entirely possible, with user/supervisor separation etc. -- and an event loop with message passing (tho rarely used). Later extensions to add unix-like multitasking (MiNT -> FreeMiNT) actually ended up fairly elegant, and memory protection is a possibility for some things.

My understanding is that AmigaOS syscalls were basically JSRs?

The original shipped OS was basically a fork of CP/M and PC-DOS-ish, but GEM on top of it showed more attention to cleaner architectural concerns, though it was never really used to its full intent.
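To make the contrast concrete, here's a sketch from memory (so treat the exact numbers loosely) of the two call mechanisms:

    * ST/GEMDOS: Pterm0() enters the OS via a TRAP, i.e. a
    * supervisor-mode transition that protection could hang off:
            clr.w   -(sp)           ; GEMDOS function 0 = Pterm0
            trap    #1

    * AmigaOS: a library call is an ordinary JSR through a jump
    * table below the library base - no privilege transition at all:
            jsr     -552(a6)        ; a6 = exec.library base, OpenLibrary LVO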


I was a die hard Amiga fan and I agree. The way to memory protection on some “what could have been” Amiga would have been running a copy of the OS for every application and changing the IPC if you wanted two programs to talk to each other.


I'm very happy with Restic backing up to BackBlaze B2.

I have a "config file", which is really just a shell script to set up the environment (repository location, etc), run the desired backups, and execute the appropriate prune command to implement my desired retention schedule.

I've been using this setup for years with great success. I've never had to do a full restore, but my experience restoring individual files and directories has been fine.

Do you have any links related to the index corruption issue? I've never encountered it, but obviously a sample size of one isn't very useful.


Does anyone have an up-to-date comparison of Borg vs Restic? Or a compelling reason to switch from Restic to Borg?

I've previously used Borg, but the inability to use anything other than local files or ssh as a backend became a problem for me. I switched to Restic around the time it gained compression support. So for my use-case of backing up various servers to an S3-compatible storage provider, Restic and Borg now seem to be equivalent.

Obviously I don't want to fix what isn't broken, but I'd also like to know what I'm missing out on by using Restic instead of Borg.


I prefer restic simply because I find it easier to understand and use. This means backups actually happen. It also feels less like it is constantly changing. A constant stream of new features isn't a thing I've ever desired in a backup solution.


+1, I'm in a similar situation and would be curious too about an up-to-date comparison.


Comparisons might be interesting, but one needs to be aware that they would be a bit apples to oranges:

- unreleased code that is still in heavy development (borg2, especially the new repository code inside borg2).

- released code (restic) that has had practically proven "cloud support" for quite a while.

borg2 is using rclone for the cloud backend, so that part is at least quite proven, but the layers above that in borg2 are all quite fresh and not much optimized / debugged yet.


This could be enforced in the schema via triggers and/or security permissions. Cooperation from the client is not required.

EDIT: Oracle has append-only tables, and can also use "blockchain" to verify integrity. See the IMMUTABLE option on CREATE TABLE[0]. PostgreSQL doesn't appear to have append-only tables, so using security and/or triggers seems to be the only option there.
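For the PostgreSQL route, a minimal sketch of the trigger approach (the table and role names are made up):

    -- Append-only table: INSERTs pass, UPDATE/DELETE raise an error.
    CREATE TABLE events (
        id      bigserial   PRIMARY KEY,
        payload text        NOT NULL,
        created timestamptz NOT NULL DEFAULT now()
    );

    CREATE FUNCTION forbid_mutation() RETURNS trigger AS $$
    BEGIN
        RAISE EXCEPTION 'events is append-only';
    END;
    $$ LANGUAGE plpgsql;

    CREATE TRIGGER events_append_only
        BEFORE UPDATE OR DELETE ON events
        FOR EACH ROW EXECUTE FUNCTION forbid_mutation();

    -- Belt and braces: withhold the privileges as well.
    REVOKE UPDATE, DELETE ON events FROM app_role;

Note the table owner can still drop the trigger, so this is a guard rail rather than a guarantee.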

[0] https://docs.oracle.com/en/database/oracle/oracle-database/2...


More to the point, Oracle has flashback queries:

    SELECT ... AS OF <timestamp or logical clock time>
So you can query the database as of any point in the past, making manual work to implement the same feature with custom columns redundant.

Oracle can also show you a record of transactions made and what SQL to run to undo them, including dependency tracking between transactions.
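If I remember the flashback machinery right, the undo side is exposed as a queryable view, something like:

    -- Compensating SQL for recent changes to one table; the exact
    -- privileges and undo-retention setup required will vary.
    SELECT undo_sql
    FROM   flashback_transaction_query
    WHERE  table_name = 'ACCOUNTS';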


This sounds like proprietary syntax for SQL:2011 temporal queries. Apparently supported by e.g. MS SQL Server and MariaDB.
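For comparison, the standard flavour as MariaDB implements it looks roughly like this (the table is made up):

    -- System-versioned table: the server keeps row history automatically.
    CREATE TABLE accounts (
        id      INT PRIMARY KEY,
        balance DECIMAL(12,2)
    ) WITH SYSTEM VERSIONING;

    -- Query the table as it looked at a given moment.
    SELECT * FROM accounts
    FOR SYSTEM_TIME AS OF TIMESTAMP '2024-01-01 00:00:00';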


You're correct, I overlooked triggers. That may be a bridge too far for some folks, though; triggers are only really comfortable for people who are deep into RDBMSes. For lots of app developers the ORM is the limit of the world.


ORMs could offer a unique advantage here by letting the user describe an append-only table and then generating the required triggers (or the appropriate CREATE TABLE options). They'd also be able to include helpers to make working with the table easier - like defaulting to selecting current rows only, or an easy way to specify that you want rows as-of a certain point in time.

I'm not sure if any ORMs actually support this though.
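SQLAlchemy at least lets you wire it up by hand, by attaching trigger DDL to table creation. A rough sketch, assuming a forbid_mutation() trigger function like the one upthread already exists in the database:

    from sqlalchemy import Column, Integer, MetaData, Table, Text, event
    from sqlalchemy.schema import DDL

    metadata = MetaData()

    events = Table(
        "events",
        metadata,
        Column("id", Integer, primary_key=True),
        Column("payload", Text),
    )

    # Emit the append-only trigger right after CREATE TABLE runs.
    event.listen(
        events,
        "after_create",
        DDL(
            "CREATE TRIGGER events_append_only "
            "BEFORE UPDATE OR DELETE ON events "
            "FOR EACH ROW EXECUTE FUNCTION forbid_mutation()"
        ),
    )

The current-rows-only and as-of-a-timestamp query helpers would still be on you, though.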


I suspect there have been a great many examples and attempts at the ORM level - this one with "bi-temporal chaining" springs to mind: https://github.com/goldmansachs/reladomo

And without the ORM layer there's this extension for Postgres: https://github.com/hettie-d/pg_bitemporal


My "ORM" supports event sourcing, or at least easily plugging it in at the model level.

https://github.com/codr7/hostr/blob/532295b40dcad6c54082e1d3...


> Python code used by Excel runs on the Microsoft Cloud with enterprise-level security as a compliant Microsoft 365 connected experience, just like OneDrive. The Python code runs in its own hypervisor isolated container using Azure Container Instances and secure, source-built packages from Anaconda through a secure software supply chain. Python in Excel keeps your data private by preventing the Python code from knowing who you are, and opening workbooks from the internet in further isolation within their own separate containers. Data from your workbooks can only be sent via the built-in xl() Python function, and the output of the Python code can only be returned as the result of the =PY() Excel function. The containers stay online as long as the workbook is open or until a timeout occurs. Your data does not persist in the Microsoft Cloud.

This is disappointing. A much easier way to 'keep your data private' would be to run it locally. Surely a bundled Python interpreter run inside a sandbox that prevents network access would be just as secure, and cheaper for Microsoft since they don't have to run any Azure resources to support it.


> just as secure

no way, it'd be more secure... they wouldn't be able to peek into your spreadsheet


But would it have enterprise-level security?


I've literally added Pandas to some of my projects just so I could use a DataFrame to print a nicely formatted table instead of writing code like this.

Surely there's a library out there to do this job, it seems like such a common use-case. I'm surprised it's not in the standard library to be honest!


There is a library: I use tabulate ([1]) semi-regularly for this purpose. There are probably more.

I agree it would be very nice to have this in the standard library.
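For reference, basic usage is about as small as it gets:

    from tabulate import tabulate

    rows = [["alice", 42, 0.91], ["bob", 7, 0.33]]
    print(tabulate(rows, headers=["user", "count", "ratio"]))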

[1] https://pypi.org/project/tabulate/, https://github.com/astanin/python-tabulate


Thank you - this looks like exactly what I was hoping for!


I recently used the rich library to print tables in the terminal.
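Roughly like this, if memory serves:

    from rich.console import Console
    from rich.table import Table

    table = Table(title="Results")
    table.add_column("user")
    table.add_column("count", justify="right")
    table.add_row("alice", "42")
    table.add_row("bob", "7")

    Console().print(table)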

https://github.com/Textualize/rich


PrettyTable is another option, actively maintained and much lighter than Pandas


Alberta just announced that this is happening starting in the fall: https://www.cbc.ca/news/canada/calgary/alberta-classroom-cel...



It's already been a rule in most schools anyway. This just formalizes it.


My kids’ high school required kids to have a phone. They strongly encouraged the kids use the calendar to track assignments, they used the camera for capturing homework instructions from a white board, they sometimes made movies for class assignments, and they had some kind of Twitter-like app for the teachers to broadcast to the kids and the kids to contact teachers (this didn’t seem to be used often). They also used the browser a lot for looking stuff up.

When I went to the fall parent-teacher orientation night and they explained all this, it didn’t sound like phone use was a huge problem, but maybe they just didn’t want to talk about the problems. I did like that they had structure around phone use that included appropriate and inappropriate uses of the phone.

I totally get banning phones with young kids, but high schoolers should be treated more like adults as they progress through the grades. Some of the kids were entering the workforce or joining the military after their senior year. They should be ready for that.


> I totally get banning phones with young kids, but high schoolers should be treated more like adults as they progress through the grades. Some of the kids were entering the workforce or joining the military after their senior year. They should be ready for that.

High school is from the age of 13 to 18; big difference between those two.


Definitely. That’s why I talked about a progression. It makes no sense for 13 year olds to have the same phone rules as 18 year olds.


Starts July 1 in BC!


> life improvements like, in order to clear some registers all one needs to do is write 1 instead of the whole =& ~(REG | 0x8) (or something like that, it's been a while since I wrote embedded C)

This is a really interesting feature, since it addresses a shortcoming of the C language (no concise, atomic way to express a bit set/clear) by introducing a new hardware feature.
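A rough sketch of the difference, using the register names from the newer parts' headers (megaAVR 0-series / AVR Dx; the pin numbers are made up):

    #include <avr/io.h>

    int main(void)
    {
        /* Classic AVRs (e.g. ATmega328) need a read-modify-write:
           PORTB &= ~(1 << 3);                                      */

        /* Newer parts have dedicated set/clear/toggle registers, so
           one plain write suffices and can hit several bits at once: */
        PORTB.OUTCLR = (1 << 3) | (1 << 5);  /* clear pins 3 and 5 */
        PORTB.OUTSET = (1 << 0);             /* set pin 0          */
        PORTB.OUTTGL = (1 << 7);             /* toggle pin 7       */

        for (;;) {}
    }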


SBI/CBI only allowed changing one bit at a time; the new registers allow changing multiple bits at once, and toggling them too.


> Ban all the SQL control characters in usernames/passwords and 400 any request that contains them. Hash and base64 both the username and password so they get converted to something where SQL injections are impossible because of the character set used.

The problem that this tries to solve has been solved by every SQL database for a long time. Bind-parameters in queries are your friend, never build a query using string concatenation.
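A minimal sketch with Python's sqlite3 driver (the table and values are made up). The hostile input is handed to the engine as data, separate from the SQL text, so it is never parsed as SQL:

    import sqlite3

    conn = sqlite3.connect(":memory:")
    conn.execute("CREATE TABLE users (name TEXT, password_hash TEXT)")

    # A classic injection payload, passed as a bind parameter.
    name = "robert'); DROP TABLE users;--"
    row = conn.execute(
        "SELECT password_hash FROM users WHERE name = ?",
        (name,),
    ).fetchone()

    print(row)  # None - no such user, and the table is still intact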


Yeah, and I use them, but I still get paranoid. Maybe that's due to my lack of understanding of them; my mental model is that it still resembles string concatenation on the backend. Now that I type it out, that sounds wrong, so I probably need to take a look at that.


You definitely need to spend a little time with them; they are safe and don't require all the crazy workarounds you've detailed to solve a non-issue.

