A long time ago, I worked on Renderfarm.fi, which was like what you're proposing but free. I was just an extremely junior developer working on the front end. The volunteer computing system itself was based on BOINC. I recall it was quite the effort to keep it running. If I remember right we rendered everything multiple times on different nodes to make sure that jobs weren't being faked.
There's no point to this post other than a trip down memory lane. I wish you the best with this project!
It's not exactly a nice move to link to the download folder before the main website has been updated, because, as you might notice, this has resulted in a lot of confusion among regular HN users who don't follow the Blender mailing lists/developer forums closely. The release is in fact today, the 30th of July, but possibly several hours from now, as many developers are currently at SIGGRAPH in California.
Even if the link is changed to https://www.blender.org/download/releases/2-80/ (which it probably should be), that page hasn't been updated since RC3, which will possibly just confuse more people.
It would probably be better for the Blender community to not jump the gun and wait until the website is updated.
From my understanding (SOMEBODY CORRECT ME IF I'M WRONG), Eevee is designed with realtime rendering in mind - it uses tricks similar to video games to get high-quality renders without raycasting everything.
Yes, Eevee is a fast PBR renderer and can give very similar results to Cycles with no effort, though there's some setup needed for indirect lighting and reflections, and some tweaking needed for good lamp shadows in some cases - but the payoff is huge. I have a high-resolution render that takes 12 minutes in Cycles and 30 seconds in Eevee with no grain. Some features are not there, like shader bevel, but otherwise it's very similar, and there are workarounds for that.
This is (a release candidate for) a huge milestone in Blender's history. So much has been improved that some, myself included, have wondered why they didn't choose to make it a 3.0 release.
Aside from the fantastic new features geared towards existing users, which are described in detail & with pictures in the OP, this release also makes Blender a lot more user-friendly for those who haven't used it before. 3D software almost necessarily has a steep learning curve for new users, as you have to learn not only how to use a new program, but also how 3D content creation itself works. But in this release the developers and designers have made an effort to get rid of the biggest "gotchas" that many new users complained about when using previous versions of Blender.
If you've wanted to get into 3D content creation before, there's never been a better time!
Another thing to note is that the future of Blender development never looked as good as it does now.
The recently introduced 'Blender Development Fund' already receives €37,245 every month in donations, which goes directly toward hiring more Blender developers.
As impressive as 2.80 is, I'm really eager to see what 2.81+ brings. One area that looks set for a major improvement is sculpting, where a newcomer (Pablo Dobarro) has been making waves with a lot of interesting development in a separate branch - there have even been hints from the Blender Foundation that he'll be hired soon.
Pablo Dobarro's work on improving sculpting is really incredible and, imho, makes Blender competitive as a ZBrush alternative (at least as a solid casual alternative). If you want to see this project succeed, you can donate to his Patreon:
https://www.patreon.com/pablodp606/posts
In release notes[0] for Cycles there are a lot of mentions of CUDA. Also many mentions of OpenCL, with the ominous note that it’s been “disabled on macOS platform”.
I’m wondering how complex can animations be, with reasonable frame render times, on macOS with Radeon Pro Vega 16? I know it’s a very open-ended question but I’m curious for any take.
(For some context, I’m completely unfamiliar with the pipeline/ecosystem, but wanted to hobby around with 3D for a while. Lacking a suitable GPU, now I’m considering how viable would this be on latest MBP’s graphics. If not so much, I might go for a cheaper GPU option & postpone my 3D experiments until I can have a fixed workstation with fast GPU in addition to laptop I use for work.)
I used to use Blender a lot for a variety of things. However, with 2.8, the UI became so slow on my Mac Mini that I decided to buy an external eGPU. I hadn't read the "OpenCL disabled on macOS platform" note, though, so the eGPU didn't really help. For now I'm still on Blender 2.7, where the UI is much faster. For reference, I have Blender running on a 5K display, so there are a lot of pixels to move around. Either way, buying an eGPU won't help you much with Blender 2.8, as the internal GPU is too slow for the UI - at least at a reasonably high resolution.
I was briefly pondering buying a second Linux box just to use Blender, but that also sounds insane. So until Apple patches their broken Nvidia relationship up, or Blender supports something like MoltenVK, there's no good way of running 2.8 on most macOS devices.
Edit: I didn't test the RC yet. So maybe the performance is better now. Also, I never tried on a smaller display. It might work just fine on a 1920x1280 screen.
I can't imagine OpenCL in any way being involved with the rendering of the UI. Maybe you're confusing it with OpenGL?
If you're having problems making use of an eGPU (that's supported by apple, which rules out nvidia!), you should report that. eGPUs will probably be a common use case.
That probably has something to do with Apple's abysmal OpenGL support. They have even deprecated it in favor of their proprietary Metal API, and from Blender's perspective, supporting a proprietary API is not worth it.
If you're just learning, the new Eevee renderer in 2.80 is great for 90% of things and is pretty much real-time. For those final shots you can use Cycles with your CPU; it's going to take longer, but lack of a GPU shouldn't stop you as a hobbyist.
edit: after some more reading I'm not sure if Eevee works on CPU, but I don't have blender 2.8 on a cpu-only machine to test this.
The new 2.8 release comes with the Eevee "real time" 3d engine, which should work well with your current system.
If you need to use Cycles, it'll still work fine with your CPU for now - and you can always either use an online render farm (there are many!) for more complex stuff, or buy a separate rig if you end up using it enough.
Yes, it is unfortunate that Cycles no longer supports OpenCL on macOS. There has been talk in some of the Blender groups about porting it to Metal (Cycles was designed from the ground up to support multiple platforms like OpenCL and CUDA), and they were speculating that it could be done by a skilled developer in 3–6 months [0]. Hopefully there are enough Blender users on Mac to justify this effort. Anyone here have ideas about organizing / funding this?
In the meantime, check out AMD ProRender [1]. It appears to be a viable alternative to Cycles for most things and can run on Metal on macOS.
At any rate, for more substantial renders, I strongly recommend cloud farms. You can make your own using spot instances to save money, and fire up more servers to get your render done more quickly. Getting an overnight render done in less than an hour (without tying up your workstations) is super helpful since it gives you more freedom to iterate. This kind of task (where you need a huge amount of processing power periodically for specific jobs) is where cloud computing really shines.
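The core of a DIY farm like this is just splitting the animation's frame range across nodes. Here's a minimal, hypothetical Python sketch (not taken from any of the tools mentioned in this thread) that divides a frame range into contiguous chunks and emits the corresponding headless render commands; the `-b`, `-s`, `-e`, and `-a` flags are Blender's real command-line options (background mode, start frame, end frame, render animation), while the scene filename and node count are made up for the example:

```python
# Sketch: split an animation's frame range across N render nodes.
# The chunking logic is generic; the emitted commands use Blender's
# headless CLI flags, but "scene.blend" and the node count are
# placeholder values for illustration.

def split_frames(start, end, nodes):
    """Divide the inclusive frame range [start, end] into up to
    `nodes` contiguous chunks of near-equal size."""
    total = end - start + 1
    base, extra = divmod(total, nodes)
    chunks = []
    cursor = start
    for i in range(nodes):
        size = base + (1 if i < extra else 0)  # spread the remainder
        if size == 0:
            break  # more nodes than frames
        chunks.append((cursor, cursor + size - 1))
        cursor += size
    return chunks

def render_commands(blend_file, start, end, nodes):
    """One headless Blender invocation per chunk."""
    return [
        f"blender -b {blend_file} -s {s} -e {e} -a"
        for s, e in split_frames(start, end, nodes)
    ]

for cmd in render_commands("scene.blend", 1, 250, 4):
    print(cmd)
```

Each command could then be handed to a separate spot instance; because chunks are independent, adding more servers shortens wall-clock time roughly linearly until you run out of chunks.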
And there is also Eevee, which is not a Cycles replacement, but I believe it is fully supported on macOS.
Thank you, had no idea about ProRender and haven’t thought of offloading the renders to EC2. Looks like using a spot P2 instance could be really cost-effective (if prices keep at reasonable levels), definitely worth trying first.
There is a useful tool for this called brenda. The original repo by creator James Yonan hasn't been updated in years, so it uses an old version of Blender by default, doesn’t allow you to choose availability zone (which affects pricing), and doesn't support GPU rendering out of the box. I forked it [0] to address these issues for my own use, and updated the documentation to try and make it easier for people to get started.
I actually saw brenda come up a few times while researching readbeard’s suggestions, as you said the original seemed not super up-to-date. Thank you for mentioning your fork! Going to try in the next couple of days.
Even if my experiments won’t justify spinning up multiple instances, this should greatly reduce setup overhead.
> to try and make it easier for people to get started
What do you think about putting a simple GUI in front of this toolset—for those not proficient with CLI (I imagine many 3D artists using Blender may fall into that category)? I’ve been doing something similar as part of a consulting job recently, so couldn’t help thinking along those lines… Would be happy to help make it more accessible, should be an interesting exercise.
I am interested in jumping in. I’m concerned that new features, even if better, would make it hard to find up to date documentation and tutorials. Any recommendations?
'Too hard to learn' was a legitimate concern in the early days... even though the payoff is huge. Perhaps it still is somewhat, but no more than with most other 3D suites. The interface is so much more discoverable now, and it has switchable key maps so that it's similar to other 3D apps.
I got started with a tutorial (on Udemy I think) that used a much older version and while it definitely took me longer I think I learned a lot more. I got to poke around and try things and make mistakes that I probably wouldn’t have tried otherwise.
> If you've wanted to get into 3D content creation before, there's never been a better time!
I've heard that several times as this version has been in the works. Will this make most of the tutorials for Blender harder to work through until they are updated? I've heard that the UI will be pretty different.
This is the most frustrating downside - the side effect of all major software overhauls: all of a sudden the wealth of tutorials and guides is out of date!
This is probably a reference to the integrated motion tracking support[1]. Sure, you can use any 3D application to generate content for AR, but it's a lot easier to overlay 3D object into an existing scene when the 3D application can process a video and automatically adjust the camera parameters to match.
Sounds like the best time is "in the coming days." :) But seriously: I'm not sure I want to cut my teeth on a version that hasn't been fully tested for bugs.
Anyone can read in the developer blog's latest entry (2 clicks away from Blender's homepage) that the final 2.8 version was scheduled for just 5 days from now, but OP seemingly wanted the karma points. Otherwise there isn't any real reason to share an RC of a project you aren't working on without checking the release plan.
> Our 3D rendering engine currently uses a Chrome-only technology called Native Client to power Earth Studio. However, we’re closely tracking the evolution of WebAssembly (especially threading). Stay tuned!
Sure, but realistically, that makes no difference when (A) only one browser ever implemented it, and (B) its continued use by its originator company _despite_ deprecation last year (after its team was destaffed two years ago) still stinks of an attempt at lock-in, open source or not.
There are still things you can't do with the web platform alone. WebAssembly is still in its infancy, and not everything is really figured out yet. In this case, it looks like they're held back by the lack of proper threading:
>However, we’re closely tracking the evolution of WebAssembly (especially threading). Stay tuned!
It seems reasonable to get this program out and running today and move it over to WebAssembly in the future. One of the alternatives would've been to deliver a native app to every platform, but Chromium is already a native app that runs on many platforms and that you can compile yourself - and then you get a security sandbox for free. Seems better to me, frankly, especially in a world where many "desktop" apps just ship with Chromium anyway.
You've made a bit of a logical leap in the way you've interpreted my post.
So, because I think Google shouldn't use a technology that they themselves marked as deprecated, leaving it only supported in their own browser using their own technology that only they implemented, it shouldn't exist in any way shape or form — despite the fact that wasm is already supported in Chrome and other web browsers?
Obviously, they should have just waited until they'd finished porting to wasm. The world would have survived without the NaCl version of a tool it never had before; blood wouldn't run in the streets if we'd had to wait a little longer for the only version of the tool that should have been released.
> It's like saying because an app using the Mac touch bar is Mac only is a terrible thing
It's really not. The touch bar doesn't purport to be a standards-compliant technology available for every platform to access in an equitable way; apart from the fact it's only on Macs, nothing about it was ever advertised as being explicitly designed for cross-platform use.
The analogy to Chrome, a browser that is advertised as standards-compliant and even advancing standards, to the extent that it already includes wasm support, utterly falls apart.
> It's how platforms work
Chrome is not a platform, and I find myself shocked when anybody thinks so. It's as though the lessons of the past (IE, ActiveX, plugin lock-in) remain unlearnt; a new generation of people who don't know how good they've had it for so long.
Chrome is __not__ a platform; it is merely the window through which we consume the real platform: the open web.
When Google tries to push Chrome as a platform, we all need to push back. We've had one browser as a platform before, and it was not good: it was unhealthy for developers and for the advancement of web technologies, and I, along with anybody who still has to support IE5 and IE6, do not want to see that again.
There are some differences between Chrome and previous browser incumbents:
- Most of Chrome is open source. This would surely run in Chromium.
- It's cross platform.
- It supports web standards. WebAssembly for example.
Have things really come full circle? The situation could be better, but the number of things that work seamlessly across multiple browsers has never been larger. Very advanced web apps are now assumed to work across at least Chrome and Firefox, and probably Edge, with only a handful of exceptions, whereas in the past most things beyond basic pages simply didn't work, or required Java or Flash binary blobs. I don't think anyone in the browser space is intentionally making things worse; things have mostly just gotten better. NaCl may be a notable exception, but prior to WebAssembly it seemed like a good idea, and frankly the technology itself still seems pretty useful. NaCl also still has a lot of things WebAssembly doesn't yet.
I see no bad intentions. Just people making the best out of a rapidly moving platform. Someone else in this thread mentioned a WebAssembly port is coming eventually, no reason to doubt it given NaCl is deprecated.
Also, P.S.: The NaCl situation is unideal, but frankly I'd rather boot up Chrome to run something than, for example, installing and using an NPAPI plugin that has full privileges and code I can't inspect. (I am, actually, a Firefox user at home.)
(Disclaimer: Google employee, but nowhere near the Chrome team.)
It doesn't take bad intentions to end up destroying an open ecosystem, just a lack of effort to create standard solutions while being the majority market share gorilla.
Microsoft employees likely thought they were doing good by adding non-standard features to IE as well given that "standards couldn't keep up" for them either.
Sorry, I just don't see that pattern repeating here. Nothing like JScript or MSJVM, no ActiveX or BHO or COM.
NaCl solved real problems people had. NaCl allowed secure native code inside Chrome extensions and webapps, before Asm.js and before WebAssembly. It helped with removing insecure NPAPI usages (in the past, a Chrome extension could contain an NPAPI plugin![1])
Nowadays, NaCl is a lot less necessary. Many apps can run just fine in Asm.js and WebAssembly. This is definitely due to cross collaboration from the browser vendors, improving JavaScript performance and solidifying the idea of a cross-platform, safe, portable binary format that could be implemented by all browser vendors. That seems like a fantastic outcome to me, as an end user.
Other browsers could've implemented NaCl if they pleased. I recall Mozilla pushing for pure JS and later asm.js instead. In the end, the solution we really ended up with is a lot like a mix of NaCl and Asm.js. You could argue we would've gotten here without NaCl, but I think the end user is certainly better off without things like NPAPI plugins.
> Most of Chrome is open source. This would surely run in Chromium.
That's neither here nor there. For the purposes of this discussion, Chrome and Chromium are the same browser, and the design and development of the Chromium platform is very much dominated by Google, sufficient to call them the owners of the platform.
Finland and Denmark have mandatory military service for males. Sweden got rid of mandatory military service in 2010, then reinstated it in 2016. Norway currently has "weak conscription" in that recruits are not forced to serve. Latvia was a part of the Soviet Union a generation ago, and had conscription until 2005.
So what all those countries have in common is that a large portion of the male population has actual military training.
Interestingly, while in English we now say "business card", it seems that at least Swedish and Finnish, and probably other languages too, retained some form of "visiting card". In Swedish "visitkort" and in Finnish "käyntikortti" (literally, visit card) are still the terms for business card in use today.
They were distinct concepts - my great grandfather had both [1] - so maybe in those other languages the two didn't overlap in use the way they did in America. As for timing, I found them in the address book he brought to France in WW1.
"Biglietto da visita" in Italian. It was also used to accompany a gift. Most people used to have two sets: a proper "visit card" with nothing but the title and the name (the title was usually struck through with a pen to show familiarity), and a more proper "business card" with the postal address and (once available) phone number.
Let me add Norwegian to the list as well, only with a double t in visitt (double because the vowel in front should have shorter pronunciation I think).