theblackcat1002's comments | Hacker News

“About US$2 million was earmarked for the protest movement in Hong Kong, but has now been frozen as part of a general overhaul and restructuring by a new agency boss.”

So the funding doesn't actually get paid out?


As someone who has worked with all three NLP toolkits (Hugging Face, OpenNMT-py, and fairseq), I always have trouble wading through the heavy abstractions of OpenNMT-py.

For example, in OpenNMT-py you need to write fields, readers, and raw datasets [1] before you can even load data into their complex dataset class. Each item is heavily abstracted through several layers of classes. I understand this improves code reuse, but it introduces a steep learning curve for newcomers.

The Hugging Face approach, on the other hand, is slightly more "messy" [2], but it is easier to understand and to add your own tweaks to.

[1] https://github.com/OpenNMT/OpenNMT-py/blob/master/onmt/bin/p...

[2] https://github.com/huggingface/transformers/tree/master/src/...
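
To make the contrast concrete, here is roughly what the Hugging Face flow looks like. This is only an illustrative sketch, assuming a recent transformers version; the model name and the translation prompt are just examples, not anything specific to the linked code:

    # Rough sketch of the flatter Hugging Face flow: no Field/Reader/Dataset
    # class hierarchy, just raw text -> tokenizer -> tensors -> model.
    from transformers import T5Tokenizer, T5ForConditionalGeneration

    tokenizer = T5Tokenizer.from_pretrained("t5-small")
    model = T5ForConditionalGeneration.from_pretrained("t5-small")

    # Tokenize raw strings into plain tensors and pass them straight to the model.
    batch = tokenizer(["translate English to German: Hello world"], return_tensors="pt")
    outputs = model.generate(**batch)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))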


Frechet Inception Distance computed with a language model is one solution (commonly used for text GANs). I think the core idea is that you should never rely on a single metric for evaluation, but rather on a mix of them (even simple statistical ones: unique word count, word distribution similarity, etc.).
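
Roughly, the Frechet distance fits a Gaussian to the embeddings of real text and of generated text and measures the distance between the two distributions. A minimal sketch with numpy/scipy, assuming you already have sentence embeddings from some language model (function and variable names are my own):

    # Frechet distance between two sets of embeddings of shape (n_samples, dim).
    import numpy as np
    from scipy.linalg import sqrtm

    def frechet_distance(real_emb, gen_emb):
        mu_r, mu_g = real_emb.mean(axis=0), gen_emb.mean(axis=0)
        cov_r = np.cov(real_emb, rowvar=False)
        cov_g = np.cov(gen_emb, rowvar=False)
        # Matrix square root of the covariance product; numerical error can
        # introduce a tiny imaginary component, so keep only the real part.
        covmean = sqrtm(cov_r @ cov_g)
        if np.iscomplexobj(covmean):
            covmean = covmean.real
        diff = mu_r - mu_g
        return float(diff @ diff + np.trace(cov_r + cov_g - 2.0 * covmean))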


"You have a model evaluation problem. You decide to use Frechet Inception Distance. Now you have two model evaluation problems."


"There was a problem talking to Instagram. Please try again in a moment." blocked by Instagram?


Instagram used to have an open API, but that has been closed down. The app currently uses some private-ish endpoints, but they are rate-limited, so I need to add caching. More people have started using my app recently, and I haven't had time to add it yet.
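
For what it's worth, the caching I have in mind is just a small in-memory TTL cache in front of the rate-limited calls. A rough sketch (the endpoint URL and names below are placeholders, not Instagram's real API):

    # Minimal in-memory TTL cache around a rate-limited endpoint.
    import time
    import requests

    _cache = {}          # username -> (timestamp, response data)
    TTL_SECONDS = 300    # serve cached responses for 5 minutes

    def fetch_profile(username):
        now = time.time()
        cached = _cache.get(username)
        if cached and now - cached[0] < TTL_SECONDS:
            return cached[1]  # still fresh, skip the network call
        resp = requests.get(f"https://example.com/api/users/{username}")  # placeholder URL
        resp.raise_for_status()
        data = resp.json()
        _cache[username] = (now, data)
        return data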


I got the same error message using my own account for Twitter.


This would be quite useful if ported to databases.


It is / will be, in various shapes. Check out Andy Pavlo's (CMU) work on self-driving databases: https://www.cs.cmu.edu/~pavlo/



It seems to be a rewrite of ngxtop

https://github.com/lebinh/ngxtop


It does seem to be that, except it doesn't "tail log files" like the original (see the limitations at the bottom of the readme).


On the other hand, an AMD APU code name was found in the latest macOS release, which hints at the other side of the story.

https://www.tomshardware.com/news/apple-may-start-selling-ma...


I would much prefer they switch over to AMD if cost were the concern, rather than outright dumping the x86 codebase.


I doubt that cost is the concern. Both Apple and Intel are big players; they can find a fair price between them, and Apple has always had the threat of switching to ARM to get better prices.

I'm pretty sure this move is for power consumption and maybe so all Apple products are on the same architecture.


The A13X costs $30, compared to $200+ for the cheapest Intel chip used in the MacBook Air. I think it is quite a difference. That means consumers are paying $300+ for x86 compatibility.


> The A13X costs $30, compared to $200+ for the cheapest Intel chip used in the MacBook Air.

You aren't comparing costs fairly here. The A13X costs $30 each plus $XXX million to develop. With Intel, the development costs are part of the SKU. If Apple launches a series of desktop CPUs, the cost to develop those chips is going to be substantial. Some of that cost will be shared with the iPad/iPhone, but a good chunk will be unique to their new CPUs. Since Apple ships far fewer Macs than iPhones, the development cost per unit will be significantly higher.


The iPad Pro is already beating some MacBooks in CPU benchmarks. Apple might just reuse the same CPUs.


> The iPad Pro is already beating some MacBooks in CPU benchmarks. Apple might just reuse the same CPUs.

Maybe some. Just considering the size of the devices, I'd expect the 16" MacBook Pro would have beefier CPU options than the iPad Pro.


Sure, but I don't think the design cost is going to be that high -- maybe even less than the design cost of having separate iPad and iPhone CPUs.


Maybe. But they will likely have at least 3-4 different CPUs for the various Macs and different clock speeds for those different designs (though clock speeds and core count will likely be handled primarily through binning). Development cost for each additional CPU will be spread over fewer and fewer units.

- MacBook Air

- High performance MacBook

- iMac / Mac Mini

- iMac Pro/ Mac Pro

If next-gen Macs are going to support some kind of x86 emulation/compatibility layer, performance isn't just going to have to be comparable with Intel; it's going to have to be 2-3 times faster, so I'm expecting something quite a bit beefier than what the iPad Pro ships with.


Yes, that is why I also wrote in another reply [1] that it doesn't make much sense financially. And I don't quite see how it makes any sense technically either. Even if Apple refuses to use AMD CPUs for whatever reason, Intel's investor roadmap (which tends to be more accurate than what they share with consumers) shows they are finally back on track. (It will still take a year or two to catch up, though.)

Software is expensive: writing, testing, QA.

On the other hand, they are spending billions on stupid Apple TV dramas, so I guess they might as well make their own CPUs for high-end Macs.

[1] https://news.ycombinator.com/item?id=23465728


> it doesn't make much sense financially.

This I disagree with. The Intel premium here is likely somewhere in the ballpark of $100-200 per CPU. Spread across 16-20 million Macs sold per year, we're looking at conservatively $2 billion/year they can invest in CPU design.
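
Back-of-the-envelope, using the (assumed) numbers above:

    # Assumed Intel premium per CPU and annual Mac volume, from the estimate above.
    premium_low, premium_high = 100, 200            # USD per CPU
    macs_low, macs_high = 16_000_000, 20_000_000    # Macs sold per year
    print(premium_low * macs_low, premium_high * macs_high)
    # => 1600000000 4000000000, i.e. roughly $1.6-4 billion per year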

More important, Apple will control what features get added to their CPUs and can integrate other functionality into the CPU the way they have with the A-series chips.


Yes, if you look at it from the perspective of the whole Mac line and sell at the same price (which I hope they don't). But per unit, it would be MacBooks funding the development of higher-TDP CPUs, from 50W up to 250W. Those are low volume, require a new node tuned for higher power, and possibly some design changes. If they follow the same chiplet design as AMD, that could be a $500M budget. If they are making a monolithic die, that could go up to $1B+.

And this is a recurring long term investment.


Source on that pricing?


Apple designs their own chips. They have a single fixed cost for the design work, which gets amortized over the massive volume of device sales. The only variable cost is the cost of third-party fabrication. AMD can’t compete with that.


I think it’s about time we relegated x86-only codebases to VMs.


That would actually make sense. I was kind of surprised to see many Hackintosh builds with Ryzen CPUs and reference motherboards working pretty much out of the box...


Yes, if you look up Geekbench, nearly all the top scores are from AMD Hackintoshes.

https://browser.geekbench.com/v5/cpu/singlecore


I'm not familiar with geekbench - is it expected that a phone is at the top of those rankings? That seems a bit sketchy to me.


The top results are nonsensical.


Is a CPU with both ARM and AMD-sourced x86-64 cores possible?


It seems like it should be -- especially if Apple licensed AMD's Infinity Fabric. Apple could buy discounted dual or quad-core chiplets and add them onto their system. x86 performance would decrease to encourage shifting architectures, but it would allow a couple years of transition time.

All the talk about x86 emulation doesn't seem feasible. x86 is crufty enough when implemented in silicon, and it would be much, much worse implemented by a team that hasn't spent their entire life learning all the weird little performance tricks of the architecture. Even if they somehow succeeded, Intel has deep pockets and lots of lobbyists too, and would probably push for (and get) an injunction while in court. Even if Intel lost, the injunction would hurt Apple severely during the transition period. Apple would need x86_64, SSE x.x, AVX, virtualization instructions, etc., which are all still patented. In addition, if Oracle v. Google is decided in Oracle's favor, that would open yet another attack avenue.

Throwing in a couple hardware cores shouldn't cost a ton and would stop those legal concerns in their tracks.


Physically? Yes. The trick is finding someone willing to license the X86 IP to make it.


AMD already shipped one ARM server chip. At this point, I think they're more interested in their patents covering non-x86 parts of the chip that make it possible to pipeline data into the CPU.

If Apple is transitioning regardless, it's either lose out on potential profits completely or take what they can get for a few years. Making a deal would hurt Intel and get them money. AMD could probably hold out for a guarantee that Apple would buy their chips for the next 3-5 years too (at least on desktop).


This is actually what I did, but with web applications and services. I tried app development in one of my first few projects but eventually gave it up for web apps, because they allow faster trial and error (mainly, I can push UI and UX changes faster to a wider audience). My takeaway is to stay away from "hot" topics (news, game apps) unless you have a solid background, because you need a really polished product to compete.


What metrics do you use for web applications and services? I mean, with mobile apps you have downloads and ratings, but for web apps you have no such thing.


I'm not a web developer, but I am aware of Alexa rankings. Those can be used to see how popular a website is worldwide or in a particular country.


Can you give an example of a web app/service that you created? Are these SaaS or content sites monetized by ads/affiliate etc?


The first thing I thought of when I saw the name was the Malay word "lagu". But no, you don't see a word like this in the real world, though you can certainly add them together.

