Deep Learning Internship
BrainCreators / Amsterdam (NL)
Internship: Practical Alternatives to Neural Architecture Search for Object Detection
At BrainCreators, we're at the forefront of applied AI, with many years of successful research internship projects that combine cutting-edge science with the challenges of applying AI in the real world. The focus of this year's AI research internship projects will be on the technical challenges at the heart of our machine learning platform, BrainMatter.
What we expect from you
- A full-time commitment to the research internship project.
- A solid background in the theoretical subjects relevant to your particular project, and ML coding skills in PyTorch.
- Good communication and presentation skills, and a willingness to learn as much as possible in this exciting year.
- Your project will have a scientific component on which you are encouraged to work towards a publishable paper at the end of the year.
- Your project will also have an applied component, the result of which is a functional and documented piece of cutting-edge software that can be integrated into BrainMatter.
- Bachelor’s degree in Artificial Intelligence or related field.
What we can offer you
- The opportunity to work in our research team as a full-time member.
- A workplace in our Prinsengracht HQ with access to our compute cluster if required.
- Support and supervision, including a weekly personal supervision meeting and research team group meeting as well as support for integration into our software stack when needed.
- Internal weekly workshops about scientific and industrial progress.
- Membership of a vibrant team of AI realists who know how to get things done.
- Our best interns will be offered a full-time position after graduation.
Neural architecture search (NAS) methods have been shown to be effective at synthesizing neural network architectures that surpass the performance of hand-crafted networks on a variety of deep learning tasks. However, NAS methods are notoriously resource intensive, often requiring days if not weeks to traverse the large search space of candidate architectures. Furthermore, NAS methods have mainly focused on simpler tasks such as image classification, because the search space explodes combinatorially for tasks like object detection, where many more hyperparameters are involved.
Meanwhile, machine learning practitioners in industry, who are often tasked with solving complex problems for customers on a variety of input datasets in short timeframes, have to ask themselves: how do I choose an optimal network architecture for this particular task? NAS, for the reasons above, is not always a practical choice. So what is?
In this research internship, you will explore the problem of designing a novel machine learning system specifically for object detection that, given an input dataset and a small set of base requirements (i.e. a desired level of accuracy, inference latency, and memory consumption), outputs a neural network architecture or family of architectures (think Faster R-CNN, YOLO, EfficientDet) that meets the desired requirements in a consistent and justifiable way.
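To make the intended input/output contract concrete, here is a minimal sketch of such a requirements-driven selector. It is purely illustrative: the `Requirements` fields, the candidate table, and all of its numbers are placeholder assumptions, not benchmarks, and a real system would profile models on the actual dataset and hardware rather than use a fixed lookup.

```python
from dataclasses import dataclass

@dataclass
class Requirements:
    min_map: float         # desired accuracy floor (e.g. rough COCO mAP)
    max_latency_ms: float  # per-image inference latency budget
    max_memory_mb: float   # model memory footprint budget

# Illustrative profile table: (family, approx. mAP, latency ms, memory MB).
# All numbers are placeholders for the sake of the example.
CANDIDATES = [
    ("YOLO",         0.37, 10.0, 250.0),
    ("EfficientDet", 0.43, 35.0, 180.0),
    ("Faster R-CNN", 0.42, 80.0, 550.0),
]

def select_architectures(req: Requirements):
    """Return every candidate family that satisfies all requirements,
    sorted by accuracy (best first) so the choice is justifiable."""
    feasible = [
        c for c in CANDIDATES
        if c[1] >= req.min_map
        and c[2] <= req.max_latency_ms
        and c[3] <= req.max_memory_mb
    ]
    return sorted(feasible, key=lambda c: c[1], reverse=True)

# Example: a real-time, memory-constrained deployment.
req = Requirements(min_map=0.35, max_latency_ms=20.0, max_memory_mb=300.0)
print([name for name, *_ in select_architectures(req)])
```

A research-grade version of this idea would replace the hand-written table with measured profiles and return a ranked family of architectures with the evidence behind each ranking.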
Read the full description and find out how to apply here.