Hacker News

Tested it using prompts from Ideogram (login-walled), which has great prompt adherence. Flux generated very, very good images. I have been playing with Ideogram, but I don't want their filters and want a similarly powerful system running locally.

If this runs locally, it is very close to that in terms of both image quality and prompt adherence.

It did fail at rendering text clearly when the text was a bit complicated. This Ideogram image's prompt, for example: https://ideogram.ai/g/GUw6Vo-tQ8eRWp9x2HONdA/0

> A captivating and artistic illustration of four distinct creative quarters, each representing a unique aspect of creativity. In the top left, a writer with a quill and inkpot is depicted, showcasing their struggle with the text "THE STRUGGLE IS NOT REAL 1: WRITER". The scene is comically portrayed, highlighting the writer's creative challenges. In the top right, a figure labeled "THE STRUGGLE IS NOT REAL 2: COPY ||PASTER" is accompanied by a humorous comic drawing that satirically demonstrates their approach. In the bottom left, "THE STRUGGLE IS NOT REAL 3: THE RETRIER" features a character retrieving items, complete with an entertaining comic illustration. Lastly, in the bottom right, a remixer, identified as "THE STRUGGLE IS NOT REAL 4: THE REMI

Otherwise, the quality is great. I stopped using Stable Diffusion a long time ago; the tools and tech around it became very messy, and it's not fun anymore. I've been using Ideogram for fun, but I want something like it that I can run locally without any filters. This is looking perfect so far.

This is not Ideogram, but it's very, very good.



Ideogram handles text really well, but I don't want to be on some weird social network.

If this thing can mint memes with captions in them on a single node, I guess that's the weekend gone.

Thanks for the useful review.


Flux is actually amazing. See my other comment, where I verified a prompt on their fastest model. Check the linked Reddit thread too.

https://news.ycombinator.com/item?id=41132515


If you have the hardware (24GB of VRAM plus 32GB of system RAM), the easiest way to run it locally seems to be SwarmUI.

See: https://www.reddit.com/r/StableSwarmUI/comments/1ei86ar/flux... (SwarmUI is cross-platform and runs on Macs and Linux.)
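For what it's worth, getting SwarmUI up is roughly this on Linux (a sketch assuming the mcmonkeyprojects/SwarmUI GitHub repo and its bundled launch script; the exact script name, default port, and model-download steps may differ by version):

```shell
# Clone SwarmUI and start it with its bundled launcher.
# On first run the launcher pulls in its .NET and Python dependencies.
git clone https://github.com/mcmonkeyprojects/SwarmUI
cd SwarmUI
./launch-linux.sh

# Then open the web UI (http://localhost:7801 by default) and fetch
# the Flux weights through the model downloader in the UI.
```

The Reddit thread above walks through the Flux-specific model setup in more detail.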


You can run it locally in ComfyUI. I was able to run it with 12GB of VRAM, and reportedly even 8GB is doable, albeit very slowly.
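Roughly, the low-VRAM setup looks like this (a sketch assuming the standard comfyanonymous/ComfyUI repo; where exactly the Flux weights go under models/ depends on the ComfyUI version and workflow you use):

```shell
# Clone ComfyUI and install its Python dependencies.
git clone https://github.com/comfyanonymous/ComfyUI
cd ComfyUI
pip install -r requirements.txt

# Drop the Flux weights into the appropriate folders under models/,
# then launch with aggressive offloading for 8-12GB cards.
python main.py --lowvram
```

The `--lowvram` flag trades speed for memory by offloading model parts to system RAM, which is what makes the 8GB reports plausible.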



