
The tools have run very well on Linux for at least the last five years, Xilinx as well as Altera.

I've never heard such a ridiculous claim as that ISE with WebTalk enabled sends every piece of code back to Xilinx. And I was an intern at Xilinx! You can always see in the WebTalk report what is leaving your computer. You yourself also proposed a usable solution for avoiding WebTalk.

The problem is that FPGAs will stay a niche product and never become mainstream. What will you do as a hobbyist with some UltraScale+ device? What can't you do with a $50 SoC that you can do with an FPGA in the same price range? SDR is cool, but for everything else the development cycles are just too long or complex peripherals are needed. How many hobbyists can properly route a 64-bit-wide DDR3 interface?



> What can't you do with a $50 SoC that you can do with an FPGA in the same price range?

Among other things...

Timing-accurate reproduction of vintage platforms, for maximum compatibility with original software and peripherals [1] [2] [3] [4]. Modern full-power desktop processors can do some of the earlier/slower ones, but virtually no SoC-type chips, even the ones in flagship smartphones, have the performance to keep up. The software architecture is like splitting the difference between an HDL simulator and a traditional computer/console emulator, so you can probably guess how slow it is.
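
To make the "how slow" concrete, here is a minimal cycle-stepped loop in Python (purely illustrative: the names are made up, not taken from any of the projects above). The point is that every modeled chip advances on every emulated clock edge, so host work scales with emulated clock rate times chip count, no matter how little actually changed that cycle:

```python
# Illustrative sketch of a cycle-stepped emulator core.
# All names here are hypothetical, not from any real emulator.

class Chip:
    """One modeled device, advanced one clock edge at a time."""
    def __init__(self, name):
        self.name = name
        self.cycles = 0

    def step(self):
        # A real model would evaluate bus activity, registers, video
        # counters, etc. here, every single emulated clock edge.
        self.cycles += 1

def run(chips, n_cycles):
    # Unlike an instruction-level emulator, nothing can be skipped:
    # every chip is stepped on every emulated clock edge.
    for _ in range(n_cycles):
        for chip in chips:
            chip.step()
    return sum(c.cycles for c in chips)

total = run([Chip("cpu"), Chip("video"), Chip("audio")], 1_000)
print(total)  # 3 chips x 1000 cycles = 3000 steps
```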

Conversely, modern peripherals compatible with the original hardware ([5] [6]) can also benefit a great deal from FPGA, since many modern capabilities and densities are only available in chips with more complex interfaces than the flat system bus typically used by old-school expansion slots/ports. Because the host interfaces are parallel (and often directly driven by the timing of reads/writes on the system bus rather than dedicated handshake signals), the required timing is hard to consistently achieve by bit-banging the bus with any kind of microcontroller or SoC (the PRUs in a Sitara might be able to keep up with the raw interface, but are very limited in how much data they can fetch that fast).
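
A back-of-envelope budget shows why (the 8 MHz bus and 1 GHz host figures below are illustrative assumptions, not measurements of any specific board):

```python
# Rough timing budget for bit-banging a vintage parallel bus from software.
# Both clock figures are illustrative assumptions.

bus_hz = 8_000_000           # e.g. a classic 68000-era system bus
host_hz = 1_000_000_000      # a typical SoC application core

bus_cycle_ns = 1e9 / bus_hz
host_cycles_per_bus_cycle = host_hz // bus_hz

print(f"{bus_cycle_ns:.0f} ns per bus cycle")
print(f"{host_cycles_per_bus_cycle} host cycles to sample the bus, decode "
      f"the access and drive a response -- before interrupts, cache misses "
      f"or DRAM refresh eat into the budget")
```

An FPGA, by contrast, meets the bus's setup and hold times by construction, cycle after cycle, with no jitter from the rest of the system.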

[1] http://kevtris.org/Projects/console/index.html

[2] https://github.com/MiSTer-devel/Main_MiSTer/wiki

[3] http://www.fpgaarcade.com/core-status/

[4] http://c64upgra.de/c-one/

[5] https://github.com/mntmn/amiga2000-gfxcard

[6] https://sd2snes.de/blog/


>Never heard such ridiculous claim, that ISE with WebTalk enabled sends every piece of code back to Xilinx.

Well then buddy, jesus christ, email your former colleagues and get them to reword the installer for ISE. That's absolutely how it reads. I didn't make that shit up out of thin air, that's how I interpreted the license agreement on the ISE installer.


WebTalk sends usage statistics and nothing more (e.g. device, resources, number of LUTs, runtime, etc.). ISE has been deprecated since 2012, as Vivado has replaced it.


Both tools are monstrosities. Xilinx should once and for all decide if they are in the business of selling hardware or selling software.

If the latter they're terrible at it and it's painfully obvious that the culture of the shop is not software-centric.

[edit] the hobbyist argument doesn't hold water. Software and computer manufacturers made the exact same argument in the '70s (why would anyone ever need a computer at home?).

If Xilinx open-sourced their toolchain, or, much better, open-sourced their protocols, my bet is their chip sale volume would at least double in a year. In particular because folks would choose their H/W to avoid vendor lock-in.

My bet is they're so ashamed of the pile of spaghetti code they're peddling to their customers that they just can't open-source it for fear of losing face and reputation forever.


> they're so ashamed of the pile of spaghetti code they're peddling to their customers that they just can't open-source it for fear of losing face and reputation forever.

LOL no. Everyone knows how this particular sausage is made. It's obvious we are talking about an organically growing, decades-old codebase. It won't surprise anyone.

To be more on topic: ever since I learned of the PicoEVB a few months ago, I've been giddy about the possibilities. It's amazing we finally got a mobile, PCIe-connected FPGA. I wasn't even looking for one because I hadn't thought it possible. We truly live in the future.


> It's obvious we are talking about an organically growing, decades-old codebase. It won't surprise anyone.

Indeed, I'd be surprised if the current teams even know how to actually set up a working build environment, as opposed to something like "Here's a pile of VM templates, each one from a former subsystem lead who left the company 5-10 years ago, and a how-to document that assumes you're running VMWare Workstation 4.0 on Windows XP. Talk to Joe if you need licenses for anything.".


Do you use Vivado or ISE? They are like night and day. Vivado is amazing, and constantly innovating in making hardware design more tractable and robust. Check out a video of the Vivado IP integrator and tell me you're not impressed.

I've no idea what the code looks like, but it's clear that whatever code was there for ISE was replaced by a ground-up rewrite when Vivado came along.

Open-sourcing FPGA tools won't make them better. Look at any of the gEDA offerings. Compare free Verilog simulators like Icarus with commercial ones like VCS: even the SystemVerilog standard first released in 2005 is still not supported in Icarus. If open source worked for EDA, why is there such a chasm here?


I've never used a "real" commercial simulator to compare with (I'm assuming the student/locked ModelSim doesn't count), but Verilator seems to work pretty darn well.


Don't get me wrong, Verilator and Icarus are very impressive in their own right, but neither supports the full SystemVerilog-2005 standard: each chooses to support either most of (pre-SV) Verilog or only the synthesizable subset. Unfortunately this is a far cry from where the commercial tools are these days, and the standard has moved on since.

Similarly for FPGA tools: one FPGA family seems to have been reverse-engineered (the Lattice iCE40), so there is the Yosys/IceStorm toolchain for it, and again, while very impressive for what it does, it's a long way from any commercial offering in terms of usability and capability. So while everyone calls for open-source EDA to solve all the 'vendor issues', we must ask why, in the cases where it does exist, it has not beaten the commercial offerings.


The reason Xilinx won't open source their toolchain, I'm guessing, is that it would reveal to their competitors the optimizations they are using to get more speed or density.

Benchmarks and business are won on speed and density (how much I can fit on a given part, and how fast it runs).

I can't think of many software projects that had active business competitors (taking sales away from them) that went open source.

I don't think code quality has anything to do with it.


There are tons of PhD theses on the topic available in open access.

Here things are a mirror image of what you have in the "physical IP" world: TSMC, Intel and co. all publish tons of research in open access, but the real-world commercial value of such work is near zero.



