
Writing this in Mojo would have been so much easier

It's barely gaining adoption though. The lack of buzz is a chicken-and-egg problem for Mojo. I fiddled with it briefly (mainly to get some of my Python scripts working), and it was surprisingly easy. It'll take off one day for sure if Lattner doesn't give up on it early.

Isn't the compiler still closed source? I and many other ML devs have no interest in a closed-source compiler. We have enough proprietary things from NVIDIA.

Yeah, the Mojo pitch is so good, but I don't think anyone has an appetite for the potential fuckery that comes with a closed-source platform.

Yes, but Lattner said multiple times it's closed until it matures (he apparently did this with LLVM and Swift too). So not unusual. His open-source target is end of 2026. In all fairness, I have zero doubt that he'll deliver.

Given Swift for TensorFlow, let's see how this one goes.

That one did get open sourced but nobody ended up wanting to use it

Why would anyone want to pair a subpar language with a subpar ML framework?

That is the thing: what lessons were learned from it, and how will Mojo tackle them?

I feel like it's in AMD/Intel/G's interest to pile a load of effort into (an open source) Mojo.

Mojo is not open source and would not get close to the performance of cuTile.

I'm tired of people shilling things they don't understand.


It's all over this thread (and every single other HN thread about GPU/ML compilers): people quoting random buzzword/clickbait takes.

Use cases like this are why Mojo isn't used in production, ever. What does Nvidia gain from switching to a proprietary frontend for a compiler backend they're already using? It's a legal headache.

Second-rate libraries like OpenCL had industry buy-in because they were open. They went through standards committees and cooperated with the rest of the industry (even Nvidia) to hear out everyone's needs. Lattner gave up on appealing to that crowd the moment he told Khronos to pound sand. Nobody should be wondering why Apple or Nvidia won't touch Mojo with a thirty-nine-and-a-half-foot pole.


The kernels now written in Mojo were all hand-written in MLIR, like in this repo. They made a full language because that's not scalable; a sane language is totally worth it. Nvidia will probably end up buying them in a few years.

NVidia is perfectly fine with C++ and Python JIT.

CUDA Tile was designed exactly to give Python parity in writing CUDA kernels, acknowledging the relevance of Python while offering a path where researchers don't need to mess with C++.

It was announced at this year's GTC.

NVidia has no reason to use Mojo.


I don't think Nvidia would acquire Mojo when the Triton compiler is open source, optimized for Nvidia hardware, and considered an industry standard.

Nobody is writing MLIR by hand; what are you on about? There are so many MLIR frontends.

How does Mojo with MAX optimize the process?

What about a forty-foot pole? Would it be viable?

I really want Mojo to take off. Maybe in a few years. The lack of a stdlib holds it back more than they think, and since their focus is narrow atm, it's not useful for the vast majority of work.

It would help if they were not so much macOS and Linux focused.

Julia and Python GPU JITs work great on Windows, and many people only get Windows systems as the default at work.


Approximately nobody writing high performance code for AI training is using Windows. Why should they target it?

As a desktop, and sometimes that is the only thing available.

When is the Year of NPUs on Linux?


This targets Blackwell GPUs, so I'm not sure what you are talking about.

The same: hardware available to Windows users as work devices at several companies, used by researchers who work at said companies:

https://www.pcspecialist.de/kundenspezifische-laptops/nvidia...

Which, as usual, kind of works in GNU/Linux, but not really.


I've commissioned a board of MENSA members to devise a workaround for this issue; they've identified two potential solutions.

1) Install Linux

2) Summon Chris Lattner to play you a sad song on the world's smallest violin in honor of the Windows devs that refuse to install WSL.


My bet: customers keep using CUDA with Python and Julia and ignore that Chris Lattner's company exists, while Mojo repeats the Swift for TensorFlow story.

What about that outcome?


https://termcast.app

A Raycast port for the terminal. It lets you run Raycast extensions as TUI apps. Powered by opentui.


https://playwriter.dev

A browser automation Chrome extension and MCP. It consumes less context than the Playwright MCP and is more capable: it uses the Playwright API directly, and the Chrome extension acts as a CDP proxy over WebSockets.

I use it for automating workflows in development, but also for filing taxes and other boring tasks.


This Chrome extension allows you to control your own browser via MCP.

It bridges CDP from the MCP to the browser, meaning you can do everything Playwright can.

The MCP exposes a single tool: execute. It runs Playwright code to control the browser, so context usage is small compared to the Playwright MCP and the capabilities are more extensive.
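
For illustration, here is a hypothetical payload of the kind an agent might pass to `execute`. Whether the sandbox pre-binds a `page` object (and which tab it points at) is an assumption on my part, not documented behavior:

    // Hypothetical payload for the single `execute` tool: plain Playwright code.
    // Assumes the sandbox pre-binds `page` to the active extension-enabled tab;
    // the real sandbox globals may differ.
    await page.goto('https://news.ycombinator.com');
    const titles = await page.locator('.titleline > a').allTextContents();
    return titles.slice(0, 5); // hand the first five story titles back to the agent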


I made something similar that lets you run TUI applications inside other terminals via ghostty-vt, to implement things like tmux in opentui.

It will be used to render ANSI inside opencode.

https://github.com/remorses/ghostty-opentui


The advantage of space is that you have infinite scale. Maybe data centers in space do not work at small scale, but you have to think of them at a much larger scale.

Elon Musk considered data centers in space simply because more solar power is available in space than on Earth.


Super cool. I just created a similar project that runs Ghostty in the terminal, meaning you can build something like tmux from scratch.

I will use it to add support for colored bash tool output in opencode.

https://github.com/remorses/opentui-ansi-vt.git


This project is similar to Playwright MCP, but with some important differences and benefits.

---

1. Key differences from Playwright MCP

   1. Uses your existing Chrome tabs via an installable Chrome extension instead of launching a new browser window.
   2. Exposes only one MCP tool: `execute`, which runs arbitrary Playwright code in a sandbox.
   3. The extension communicates with the MCP via Chrome DevTools Protocol (CDP) over WebSocket.
2. Benefits

   1. Less context bloat: the LLM only needs to reason about a single tool instead of many.
   2. More capable: the agent can write any Playwright code to control the browser.
   3. Fewer resources: reuses your existing Chrome browser instead of spawning a new instance per MCP project.
   4. Human–machine collaboration: you can temporarily disable the extension to:

      * Solve CAPTCHAs
      * Unstick the LLM
      * Intervene manually when needed
3. Using it in your own scripts

   1. You can connect to the extension from your own Playwright scripts.
   2. This lets you automate tasks in your existing Chrome tabs instead of running a separate browser instance (see the sketch after this list).
4. Non-disruptive browser control

   1. The extension does not force-focus Chrome windows during interactions.
   2. You can run automation in non-headless mode without Chrome constantly stealing focus and interrupting your work.
5. Security model

   1. The extension connects to a localhost WebSocket server started by the MCP.
   2. This server proxies MCP commands to the extension.
   3. Only tabs where the extension is enabled are accessible to the MCP.
   4. Other tabs remain isolated and inaccessible to random agents.
6. Safety considerations

   1. The MCP can still cause damage if run unsupervised.
   2. You should:

      * Run it with permissions enabled, and/or
      * Keep it under active human supervision.
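
As a minimal sketch of point 3: one plausible way to attach your own Playwright script, assuming the MCP's localhost server exposes a standard CDP WebSocket endpoint. The port and URL below are placeholders, not the project's documented API; check the playwriter docs for the real connection details.

    // Sketch only: assumes a standard CDP WebSocket endpoint on localhost.
    import { chromium } from 'playwright';

    const browser = await chromium.connectOverCDP('ws://localhost:9222'); // placeholder endpoint

    // Reuse the existing browser context and the tabs already open in Chrome.
    const context = browser.contexts()[0];
    const page = context.pages()[0] ?? (await context.newPage());

    await page.goto('https://example.com');
    console.log(await page.title());

    // Disconnect; this does not kill your running Chrome.
    await browser.close();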


Control opencode agents inside Discord.

Each opencode project is a Discord channel.

Start sessions by creating threads.

Supports voice channels via Google Realtime Voice API.

Like Jarvis but for coding agents
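
A minimal sketch of how the channel-to-project and thread-to-session mapping could be wired up with discord.js; the `startOpencodeSession` bridge is a hypothetical placeholder, not the project's actual code.

    import { Client, Events, GatewayIntentBits } from 'discord.js';

    const client = new Client({
      intents: [GatewayIntentBits.Guilds, GatewayIntentBits.GuildMessages],
    });

    // One agent session per thread: the parent channel names the project.
    client.on(Events.ThreadCreate, async (thread) => {
      const project = thread.parent?.name; // channel == opencode project
      const sessionId = thread.id;         // thread == session
      await thread.send(`Starting opencode session for ${project} (${sessionId})`);
      // startOpencodeSession(project, sessionId); // hypothetical bridge into opencode
    });

    client.login(process.env.DISCORD_TOKEN);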


I want to invest. Send bank account

