Humanloop is looking for engineers to join the founding team as we develop our machine learning training platform. Our aim is to make programming computers as natural as teaching a colleague, so that anyone can collaborate with AI to achieve their goals.
We have spun out of UCL's AI Centre and are backed by some of the world's best investors (to be announced!).
We're looking for full-stack and frontend engineers who will feel confident owning product, can grow into leadership positions, and care deeply about creating real-world value for our customers. You can see the full specs of the types of roles at careers.humanloop.com
Our current stack: PyTorch, FastAPI, PostgreSQL, React (Next.js), Tailwind CSS
We're operating a hybrid remote-first workplace. It's best that you're within a timezone UTC±5 for working hours overlap and so that you can easily attend our frequent off-sites.
Any questions, drop me an email jordan@humanloop.com (cofounder) or apply at careers.humanloop.com
Right now we support document-level classification and span tagging within text documents. These can also be combined (as in the landing page screenshot), so that for a given input you're learning multiple tasks at once as you annotate.
The core of this platform should generally be independent of the data input type and the output labels, so we're building out other annotation options for our business customers. If there's a use case you would like it to support, it would be great to chat jordan[at]humanloop.com :)
At the moment we don't support the ability to render the HTML, but it's something that has come up before. One of the teams we're speaking to wants to classify blog posts and would like to preserve their formatting. If this is something that's important to you, we'd consider adding it, so maybe drop me an email at raza[at]humanloop.com and we can discuss?
Datasaur are great. I hope Ivan would think it's fair that I'd describe their current product as a modern, cloud-hosted Brat (https://brat.nlplab.org/ – this remains very popular!) with the features to make that work for teams. As you point out, we're focusing on the tight integration of annotation and training, enabling you to move faster and iterate on NLP ideas... essentially trying to move from a waterfall ML lifecycle to an agile one.
Fine-tuning on BERT is the way to go. It's what we do, and that already reduces the data annotation requirements by an order of magnitude. Some people still want to do that offline in a notebook (you can use our tool just as the annotation platform and download the data, and you'll still get the efficiency benefit through active learning), but integrating or deploying that model is still a time-suck. Having the model deployed in the cloud immediately has a load of supplementary benefits too, we hope (easy to update, can always use the latest models, etc.).
Firstly, congrats on the launch! Active learning is a super interesting space.
You say it's possible to download the data and use Humanloop for annotation only while still benefitting from active learning. I'm curious about your experience with how much active learning depends on the model. Are the examples that the online model selects for labelling generally also the most useful ones for a different model trained offline?
Cheers. It's a good thing to be wary of. Poor use of active learning will end up biasing the data towards the model it's trained with – so that data won't be the best X samples for training a different model. Most of this issue comes from bad active learning selection methods. If you have well-calibrated uncertainty estimates and also sample for diversity and representativeness, it's far less of a concern.
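To make the selection-method point concrete, here's a toy uncertainty-sampling sketch (all names are illustrative, not Humanloop's actual API). This shows only the uncertainty half; a production selector would, as above, also score for diversity and representativeness to avoid biasing the dataset towards one model's blind spots:

```python
import math

def entropy(probs):
    # predictive entropy: higher means the model is less certain
    return -sum(p * math.log(p) for p in probs if p > 0)

def select_batch(pool, predict_proba, k):
    # rank unlabelled examples by model uncertainty, keep the top k for annotation
    return sorted(pool, key=lambda x: entropy(predict_proba(x)), reverse=True)[:k]

# toy "model": three unlabelled examples with made-up class probabilities
probs = {"a": [0.5, 0.5], "b": [0.9, 0.1], "c": [0.99, 0.01]}
print(select_batch(list(probs), probs.get, k=1))  # ['a'] -- the least certain example
```

Because ranking depends only on the current model's probability estimates, a badly calibrated model will confidently skip exactly the examples it gets wrong, which is where the bias creeps in.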
(I feel the book must have been inspired by this site somehow, particularly when you read the various 'notes' that pop up after selecting a font from the dropdown.)
Moru, thanks for that! The website isn't maintained anymore, but it doesn't distribute any malware. Maybe that's why the number of views is slowly dropping every year.
Added a false positive report https://forums.malwarebytes.com/topic/254447-fontsforwebcom-...
I really like this idea of a Chrome extension that makes every load on Twitter/Facebook/Hacker News/[your vice goes here] just take something like 0.5s longer. If no one else builds this, I think I will at some point.
Planting a tree is as easy as planting a tiny seed in the ground. What all these headlines omit is that the majority of things planted won't survive. My back yard has a few thousand tree seedlings growing in it naturally.
When we hear of tree planting we picture a mayor or rich person planting a four year old tree in a park. The reality is that trees grow from seed.
Even in the case of commercial forestry, not all the trees planted will make it.
A better vision of tree planting is Johnny Appleseed.
Edit: the tweet from the Ethiopian minister says tree seedlings.
> When we hear of tree planting we picture a mayor or rich person planting a four year old tree in a park. The reality is that trees grow from seed.
Yeah but trees are hardline r-strategists and produce millions of disposable seeds with a very low survival rate. Planting a seed is in no way equivalent to planting a germinated, sprouted, and developed seedling.
The article gives no details, but I'd look at how many seedlings are planted from seed in a nursery each day to at least get an indication of the number that can be "planted" in a given day.
Many trees could be said to have been planted by someone scattering apple seeds out of their pockets.
The article says something about 40 trees planted by each citizen... so I'm guessing this 350M batch was distributed equally among everyone they could reach.
It does seem like quite a lot. furiouspete talked about working as a student in reforestation in Canada (his shrooms video), and apparently if you're quick you can get through 4k seedlings per person in a 12h shift.
Even 200 seeds per hour would require ~145,000 participants to plant all 350M seeds in a 12 hr time frame. I’m literally just imagining a few guys with a leaf blower and a bag of seeds for how they got 350M in 12 hours haha.
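The back-of-envelope arithmetic above checks out (using the thread's assumed rates, not any official figures):

```python
total_seedlings = 350_000_000   # reported planting total
rate_per_hour = 200             # conservative per-person rate from above
shift_hours = 12

per_person = rate_per_hour * shift_hours        # 2,400 seedlings per 12h shift
participants = total_seedlings / per_person
print(f"~{participants:,.0f} participants needed")  # → ~145,833
```

At the 4k-per-shift rate mentioned for experienced Canadian planters, the required headcount drops to roughly 87,500, which is still a lot of people but plausible for a national campaign.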
Not the worst way to teach kids a bit about how to nurture their environment. The absolute worst case here is a bunch of kids learning stuff and applying it later in their lives. The best case is that plus some meaningful percentage of these trees actually surviving and helping restore a bit of land. Sounds like an interesting example to follow elsewhere. There's no shortage of recently aridified land across the world.
I think a lot of countries are finding out that they are not passive participants in the curve balls nature is throwing them, and that they can influence what happens with simple, low-tech solutions: planting trees, taking care of them, and maybe a few simple changes in behaviour, like keeping sheep and cattle away from vulnerable areas.
Why poor Americans? It's purely a matter of taste.
Apple is culturally significant, and blackcurrants are banned in the US (due to their carrying a disease that would devastate one of the forests). So one is more important than lime, and the other is totally unfamiliar.