Hacker News | Pattio's comments

That's fair, but I think it is nice to have some alternatives that are free and open. Furthermore, some people might have sensitive data that they wouldn't feel comfortable uploading to the cloud.

Also, I wanted to ask: were you using Google Vision? When I was doing my research, it seemed that they do not allow you to export the model.


Yes, once you start the search, a directory corresponding to the date of your run will be created inside the saves directory. All your models and progress will be saved there automatically (you can later resume the search by changing the save_folder value inside the configuration file). Once the search is done, you will find a best_topology file, which will contain the best topology with its weights. Furthermore, inside the deepswarm.log file you will see all the previously evaluated models with their loss/accuracy values.
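To make that concrete, here is a small sketch of how one might locate the most recent run directory to resume from. The saves/ layout and the save_folder key are assumptions based on the description above, not the library's documented API:

```python
import os

def latest_run_dir(saves_dir="saves"):
    """Return the most recent date-stamped run directory under saves_dir.

    Sketch only: assumes each run creates a sortable, date-named
    subdirectory inside saves/, as described above.
    """
    runs = sorted(d for d in os.listdir(saves_dir)
                  if os.path.isdir(os.path.join(saves_dir, d)))
    return os.path.join(saves_dir, runs[-1]) if runs else None

# To resume, you would point save_folder in the configuration file at the
# returned directory name (key name assumed), e.g.:
#   save_folder: 2019-04-11-10-15-31
```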


The paper is not published yet, as this work was done for my dissertation; however, we should publish it in the coming few weeks. As for the results, they are not as good as state-of-the-art methods, but they seem to be pretty competitive when compared to other open-source libraries. The current limitation is that complex nodes like add and skip nodes are not implemented yet, so the search can only generate sequential structures. Once the paper is published I will update the GitHub README file.


What about compute resources needed?


Runtime on the CIFAR-10 dataset for different ant counts: https://edvinasbyla.com/assets/images/deepswarm-runtime.pdf

Runtime compared to genetic architecture search (using similar settings): https://edvinasbyla.com/assets/images/devol-deepswarm-runtim...

The error rate on CIFAR-10, before the final training (meaning that topologies weren't fully trained and no augmentation was used): https://edvinasbyla.com/assets/images/ant-before-train.pdf

The error rate on CIFAR-10, before the final training compared to genetic architecture search (using similar settings): https://edvinasbyla.com/assets/images/devol-deepswarm-cifar....

The two main factors that contribute to the faster search are: (1) ants search for architectures progressively, meaning that early architectures can be evaluated really quickly; (2) ants can reuse weights, as the weights are associated with the nodes of the graph.
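The weight-reuse idea can be sketched like this (a toy model of the mechanism described above, not DeepSwarm's actual code; the Node class and train function are hypothetical):

```python
import random

class Node:
    """A graph node that keeps its weights after being trained once."""
    def __init__(self, layer_type):
        self.layer_type = layer_type
        self.weights = None  # set on first training, reused afterwards

def train(path):
    """Train only the nodes that don't yet have weights; reuse the rest.

    Returns (trained, reused) counts to illustrate the saving.
    """
    trained, reused = 0, 0
    for node in path:
        if node.weights is None:
            node.weights = [random.random()]  # stand-in for real training
            trained += 1
        else:
            reused += 1
    return trained, reused

# Two ants whose paths share a prefix: the second ant reuses the trained
# weights of the shared nodes instead of training them again.
a, b, c = Node("conv"), Node("pool"), Node("dense")
print(train([a, b]))     # first ant trains both nodes
print(train([a, b, c]))  # second ant trains only the new node
```

Because early ants also build short (progressively grown) architectures, both factors compound: short paths are cheap to evaluate, and their trained prefixes are reused by later, longer paths.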

All tests were done using Google Colab. Even though the results might not seem that impressive, I am still really excited to see what will happen when ants are allowed to search for more complex architectures that use multi-branching.


I just looked up the ant colony algorithm, and intuitively the idea of pheromones and path reinforcement makes a lot of sense. Are you the first one to try it for NN search?


To the best of my knowledge, there are no published papers that use ACO for CNN neural architecture search. However, I found three published papers that used ACO for other kinds of NAS, but they used static graphs and no heuristics.
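For readers unfamiliar with the pheromone idea: in its generic textbook form (this is a minimal sketch, not DeepSwarm's implementation), ants pick edges with probability proportional to pheromone, all edges evaporate slightly each iteration, and edges on good paths get reinforced:

```python
import random

def choose(options, pheromone, rng=random):
    """Pick an option with probability proportional to its pheromone level."""
    total = sum(pheromone[o] for o in options)
    r = rng.uniform(0, total)
    acc = 0.0
    for o in options:
        acc += pheromone[o]
        if r <= acc:
            return o
    return options[-1]

def update(pheromone, best_path, evaporation=0.1, deposit=1.0):
    """Evaporate pheromone everywhere, then reinforce the best path's edges."""
    for edge in pheromone:
        pheromone[edge] *= (1.0 - evaporation)
    for edge in best_path:
        pheromone[edge] += deposit
```

Over many iterations, edges that repeatedly appear in good paths accumulate pheromone while unused ones decay, which biases later ants toward promising regions of the search space.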

