
The Fear Index by Robert Harris, 2011



I don't use web Reddit. I tried old and new, and it's horrible on both sides!

But I do use Relay Pro. Some pros:

- Gallery view
- List view
- Better comments section


Huge computation: train on red, blue, and yellow colors and generate shapes. I wonder what would come out with 100B-parameter training data, and then the two steps require a lot of computation, which isn't efficient (if it would work at all?).


1. Consistency models are a new type of generative model designed specifically for efficient one-step or few-step generation. They achieve high sample quality without adversarial training.

2. Consistency models can be trained in two ways: (1) consistency distillation, where a pretrained diffusion model is distilled into a consistency model, which yields high-quality one-step generation; or (2) as a standalone generative model without relying on a pretrained diffusion model, which still achieves strong one-step performance and outperforms other non-adversarial single-step generative models.

3. Consistency models allow trading off compute for sample quality by using multistep generation, similar to diffusion models (a rough sketch of the sampling loop follows below). They also enable zero-shot image editing applications, like diffusion models.

4. Empirically, consistency distillation outperforms existing distillation techniques for diffusion models such as progressive distillation, achieving state-of-the-art FID scores on CIFAR-10, ImageNet 64x64, and LSUN 256x256 for one-step and multi-step generation.

5. As standalone generative models, consistency models outperform other single-step, non-adversarial generative models on CIFAR-10, ImageNet 64x64, and LSUN 256x256, though not GANs.

6. Consistency models share similarities with techniques in deep Q-learning and momentum-based contrastive learning, indicating potential for cross-pollination of ideas.

7. Some limitations and future work include: evaluating consistency models on other modalities like audio and video; exploring the connections to deep Q-learning and contrastive learning in more depth; developing more sophisticated training methods for consistency models; and improving the efficiency and stability of the multistep sampling procedure.
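On point 3, here is a minimal sketch of what that compute-for-quality trade could look like, assuming a hypothetical trained consistency model `f(x, sigma)` that maps a noisy sample at noise level sigma straight to a clean estimate. The function name, the schedule values, and the simplified re-noising step are illustrative, not the paper's exact algorithm:

    import torch

    def multistep_consistency_sample(f, sigmas, shape, device="cpu"):
        # Start from pure noise at the largest noise level in the schedule.
        x = sigmas[0] * torch.randn(shape, device=device)
        # One-step generation: a single call to the consistency model.
        x0 = f(x, sigmas[0])
        # Each extra step trades compute for sample quality: re-noise the
        # current estimate to a smaller noise level, then map it back to a
        # clean estimate again.
        for sigma in sigmas[1:]:
            x = x0 + sigma * torch.randn(shape, device=device)
            x0 = f(x, sigma)
        return x0

    # Hypothetical usage: 3-step sampling of four 64x64 RGB images.
    # samples = multistep_consistency_sample(model, [80.0, 24.0, 5.0], (4, 3, 64, 64))

A single entry in `sigmas` recovers plain one-step generation; more entries spend more compute for (usually) better samples.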


it's like a prison with extra steps.


idk


Generative AI is digitizing skillsets, making them programmable and upgradeable. As a result, a new class of Generalists will dominate the era of generative AI. As expertise and experience are no longer needed to perform with proficiency in new fields, those with a breadth of experience and passionate curiosity will rise to the top. With AI, individual creators can become armies of one.

This will flip the corporate world on its head and is already disrupting training and education. Socratic learning with ChatGPT and performing surgery in VR are just the beginning. The combination of AI and extended reality will bring about the return of the apprenticeship, where our teachers are machines.

Even our relationships will change. In a time when our behaviors are guided by algorithms and humans become more machine-like, machines are becoming more human. AI companions provoke emotions and elicit feelings of romance, while children are less concerned with whether their friends are real or synthetic.

This isn't the future. This is happening today. We’ll explore what these trends mean now and for the decades to come.

