
| "[…] was a friend telling me his LaTeX thesis took 90 seconds to compile towards the end"

Sure, but to iterate you don't have to compile the whole document: if you structure it with \include, you can recompile just the chapter you're working on.
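For illustration, a minimal main file (chapter filenames here are made up) where \includeonly restricts compilation to one chapter while the others keep their page numbers and cross-references from their .aux files:

```latex
% main.tex -- chapter filenames are illustrative
\documentclass{book}
% While drafting chapter 3, compile only it; comment this out for the full build.
\includeonly{chapter3}
\begin{document}
\include{chapter1}
\include{chapter2}
\include{chapter3}
\end{document}
```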




Would you mind sharing how it turned out useful?


Curious too. It feels like the maths parts of programming are the hobby (proof checkers, Haskell) and the cranking out Go/JS etc is the paid bit. I studied maths but never used more than high school level at work.


Is there any good article/paper that describes how it actually works or is implemented, not just in high-level, hand-wavy terms?



Yeah, Hacker Factor's multi-post critiques are where I first saw it analyzed. For reference, they run the popular fotoforensics.com image analysis site.

They also have a scathing critique (e.g. [1]) of the Adobe-led C2PA digital provenance signing scheme, having themselves been part of various groups seeking solutions to the provenance problem.

[1] https://www.hackerfactor.com/blog/index.php?/archives/1013-C...


thanks!


There tends to be more information under the search term "perceptual hashing".


The secrecy of the inner tech is intentional.


great project!


Check out sioyek, it’s great and can open epubs like normal pdfs:

https://sioyek.info/ https://github.com/ahrm/sioyek


Sioyek uses the MuPDF engine, which supports EPUB: https://mupdf.com/


I stumbled over it but didn't install it because the website doesn't mention the ebook functionality:

“Sioyek is a PDF viewer with a focus on technical books and research papers”


Nightmarish with a tablet.


This was a great read, thanks a lot! On a side note, does anyone have a good guess what tool/software they used to create the visualisations of the matrix multiplications and the memory layout?


excalidraw <3


Summarised by https://xkcd.com/2494


"[...] modern neural network (NN) architectures have complex designs with many components [...]"

I find the Transformer architecture actually very simple compared to previous models like LSTMs or other recurrent models. You could argue that their vision counterparts like ViT are conceptually maybe even simpler than ConvNets?

Also, can someone explain why they are so keen to remove the skip connections? At least when it comes to coding, nothing is simpler than adding a skip connection and computationally the effect should be marginal?


A skip connection increases the live range of one intermediate result across the whole part of the network it skips: the tensor at the start of the skip connection must be kept in memory while unrelated computation happens, which increases pressure on the memory hierarchy (either the L2 cache or scratchpad memory).

This is especially true for inference with vision transformers, where it decreases the batch size you can use before hitting the L2 capacity wall.


Okay, I see that for inference. But for training it shouldn't matter, because I need to hold on to all my activations for the backward pass anyway? But yeah, fair point!


Also, removing skip connections leads to a rougher loss landscape, hence it should be harder to find the optimal weights.


Yes, there are very good theoretical reasons for skip connections. If your initial matrix M is noise centered at 0, then 1+M is a noisy identity operation, while 0+M is a noisy deletion. It's better to do nothing if you don't know what to do, and to avoid destroying information.

I appreciate the sibling comment's point that memory pressure is a problem, but that can be mitigated by using fewer, longer skip connections across blocks of layers.
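The "noisy identity" intuition has a standard gradient-flow reading (textbook residual-network analysis, not something claimed in this thread): with a skip connection,

```latex
x_{l+1} = x_l + F(x_l)
\quad\Rightarrow\quad
\frac{\partial \mathcal{L}}{\partial x_l}
  = \frac{\partial \mathcal{L}}{\partial x_{l+1}}
    \left( I + \frac{\partial F}{\partial x_l} \right)
```

The identity term gives every layer a direct gradient path even while F is still noise at initialization; drop the skip and x_{l+1} = F(x_l), so a badly scaled F can attenuate both the activations and the gradient.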


hey, off topic, but can you explain or link a post that explains the benefits of the alias -> function definition over just defining the function directly? Thanks!


I am puzzled as well, why not define the function and call it wiki?


I'm just used to keeping all my aliases together so they're easy to locate with the alias command, and I like having the ones that take arguments alongside the others.

Aside from that, no benefit really that I can think of.

So yes, to be clear to anyone else you can just put:

  wiki(){ w3m -F "https://en.wikipedia.org/w/index.php?search=${1}&title=Special:Search&ns0=1"; }
in your .bashrc - and if you're me, you'll just forget how you defined it and have to cat it every once in a while :)
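(General bash background, not specific to the parent's setup: an alias is plain text substitution and never sees $1, so any command that needs the argument in the middle of a string, like the URL above, has to be a function. The names below are made up.)

```shell
# Alias: the expansion is pasted in verbatim; "$1" stays unset inside
# the single quotes, and the argument lands at the end of the line.
alias greet_a='echo "hello, $1, welcome"'

# Function: gets real positional parameters.
greet_f() { echo "hello, $1, welcome"; }

greet_f world    # prints: hello, world, welcome
```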

Oh, and you might be entertained by this silly alias adapted from an IOCC entry...

  $ cstdin
  printf("Hello World\n");
  Hello World

  alias cstdin='gcc -pedantic -ansi -Wall -o ~/.stdin ~/.stdin.c -lm && ~/.stdin'


  $ cat ~/.stdin.c
  #include <stdio.h>
  <snip many headers>
  int main(int argc, char **argv)
  {
  #include </dev/tty>
  return 0;
  }
I'm sure it would also make way more sense as a dedicated script. I have a C++ one in there too.


ah I see, cool thanks!

