Yes (mostly) is the answer. You can use Arrow as a backend, and I think with v3 (recently released) it's the default.
The harder thing to overcome is that pandas has historically had a pretty "say yes to things" culture. That's probably a huge part of its success, but it means there are now about 5 ways to add a column to a dataframe.
Adding Arrow support is a really big achievement, but shrinking an oversized API is even more ambitious.
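To illustrate the "several ways to add a column" point, here's a quick sketch (the frame and column names are made up, and this isn't exhaustive):

```python
import pandas as pd

df = pd.DataFrame({"a": [1, 2, 3]})

# Several of the coexisting ways to add a column:
df["b"] = df["a"] * 2             # plain item assignment
df = df.assign(c=df["a"] + 1)     # assign() returns a new frame
df.insert(0, "d", [0, 0, 0])      # insert at a specific position
df.loc[:, "e"] = "x"              # assignment through .loc
```

All four end up in the same place, which is exactly the kind of API surface that's hard to shrink once people depend on each variant.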
That's fine, but we're an industry defined by fashions. OOP was the thing for decades, then it wasn't; whether or not you bought into it, you had to play along. If the trend now is how well you can juggle agents to produce a result, it doesn't matter whether that's suboptimal; all that matters is how well you can do it, even if you're not on board with the emerging pattern.
Everyone seems to be missing an important piece here. Ollama is/was a one-click solution for a non-technical person to launch a local model. It doesn't need a lot of configuration: it detects an Nvidia GPU and starts model inference with a single command.
The core principle being that your grandmother should be able to launch a local AI model without needing to install 100 dependencies.
For fun, this is how an actual "non-technical" individual would hear/read your comment:
> Exactly. I can be in a non-technical team, and put the blah inside blah. The blah is to install blah and use it to blah and blah. The same blah can point at blah when blah there. Using blah at the time I wrote that it wasn't as straightforward.
I think when people say "non-technical", they're really talking about "people who work in tech startups but aren't developers" rather than people who aren't technical one bit: the ones who don't know the difference between a "desktop" and a "browser", for example, who, when you tell them to press any key, reply with "What key is that?"
> Ollama is/was a one click solution for non technical person to launch a local model
Maybe it is today, but initially Ollama was only a CLI, so obviously not for "non-technical people" who would have no idea how to even use a terminal. If you hang out in the Ollama Discord (unlikely, as the mods are very ban-happy), you'd constantly see people asking for very trivial help, like how to enter commands in the terminal, and the community stringing them along instead of just directing them to LM Desktop or something that would be a much better fit for that type of user.
>If you ever need a CMS in your Django project I strongly recommend Wagtail, it came after the initially most popular django-cms and learned a lot of lessons - feeling much more like a part of Django.
Nope. I would choose plain Django 100% of the time, especially with LLMs. Wagtail is an antipattern.
If you've ever had to use Django-CMS you'd understand.
I went from a Wagtail project to a Django-CMS project (at a company that was upstreaming bits to Django-CMS) and there were so many places where it used the database badly (the usual N+1 antipattern of running a query per item inside a loop).
It's easy to structure queries in Django in an optimised way as long as you architect around .filter() and not .get().
TAOCP is trash. I wish I'd grown up in the era where you could just hit up zlib for an accessible book on any topic instead of highly rated and hardly read "classics" like TAOCP.
I wish more people would criticise Svelte but most people just don't care because it's irrelevant. It's like complaining about Backbone or something, not worth the effort.