
> I don’t use RAG, and have no doubt the infrastructure for integrating AI into a large codebase has improved

It really hasn't.

The problem is that a GenAI system needs to understand not only the large codebase but also the latest stable version of every transitive dependency it pulls in, which is typically on the order of hundreds or thousands of packages.
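
For a sense of scale, here's a minimal sketch (Python standard library only; "requests" is just a placeholder root package) that walks locally installed metadata to count one package's transitive dependency closure. Point it at a real project's top-level dependencies and the numbers climb fast:

    import re
    from importlib.metadata import requires, PackageNotFoundError

    def transitive_deps(root, seen=None):
        # Collect the dependency closure of `root` from installed metadata.
        seen = set() if seen is None else seen
        try:
            reqs = requires(root) or []
        except PackageNotFoundError:
            return seen  # not installed locally, so we can't recurse further
        for req in reqs:
            # Bare distribution name: first token before any version
            # specifier, extra, or environment marker.
            name = re.split(r"[ ;<>=!~\[(]", req, maxsplit=1)[0]
            if name and name not in seen:
                seen.add(name)
                transitive_deps(name, seen)
        return seen

    print(len(transitive_deps("requests")))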

Having it build a component against 10-year-old, deprecated, CVE-riddled libraries is of limited use, especially since libraries tend to be upgraded in interconnected waves, so that component will likely not even work anyway.

I was assured that MCP was going to solve all of this but nope.



How did you think MCP was going to solve the issue of a large number of outdated dependencies?


That large number of outdated dependencies is baked into the LLM's "index", which can't be rapidly refreshed because of the training costs.

MCP would instead let it fetch this information at run time from language servers, package registries, etc. But it hasn't proven to be effective.
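
Roughly what such a tool reduces to, as a hedged sketch: the run-time lookup an MCP server could wrap, so the model asks the registry instead of trusting its training snapshot. The PyPI JSON endpoint is real; everything else here is illustrative, not any particular MCP server's implementation:

    import json
    from urllib.request import urlopen

    def latest_version(package: str) -> str:
        # Ask PyPI, right now, what the latest release is.
        with urlopen(f"https://pypi.org/pypi/{package}/json") as resp:
            return json.load(resp)["info"]["version"]

    # The weights may "know" whatever version existed at the training
    # cut-off; the registry answers with what is current today.
    print(latest_version("requests"))

The hard part isn't the lookup itself; it's getting the model to actually consult it for every one of those hundreds of dependencies instead of emitting the pinned versions it memorized.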



