Community Talk on Neural Search: demos included!
The past four weeks were super-packed with the Search with Machine Learning course by the brilliant Grant Ingersoll and Daniel Tunkelang. We went from search fundamentals, like configuring OpenSearch, indexing data, and implementing a Flask UI app, to Learning to Rank, and then to Content and Query Understanding. This gave a holistic, all-around view of the important sub-problems of a search engine system. I’m happy to have taken this course: it is well organized, with a very supportive and caring teaching crew in Grant and Daniel. Both bring unique perspectives, an engineer’s and a data scientist’s, to the topic of search, showing how multifaceted it has become over the years. The atmosphere on the course has been awesome: everyone is supportive and shares ideas and practical hints so the whole group can advance together. The next student cohort starts in June 2022, so you still have time to join and take your search engine mastery to the next level.
I was also fortunate to give a Community Talk on Neural Search as part of the course (thanks to Judy Zhu for inviting me!). You can find the talk recording in the Vector Podcast:
Knowing why and when to use neural search is something the search engine industry is still figuring out. Is it the promise of semantic similarity that attracts us? Or the fact that we can search images with textual queries (multimodality)? And beyond this decision: what goes into introducing neural search into our existing product architectures? What options do we have to implement it? Surprisingly, you can leverage neural search without disrupting your existing pipeline too much, even while staying within the comfortable bounds of your sparse search engine.
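One minimal way to picture this, sketched below under my own assumptions (the talk itself may use a different setup): keep the sparse engine as the first-stage retriever, then rerank its top hits by blending the BM25-style score with cosine similarity between precomputed dense embeddings. The document IDs, scores, vectors, and the `alpha` blend weight are all toy values for illustration.

```python
# Hypothetical sketch: rerank sparse (BM25-style) results with dense
# embeddings, without replacing the existing search engine.
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    na = math.sqrt(sum(x * x for x in a))
    nb = math.sqrt(sum(x * x for x in b))
    return dot / (na * nb)

# Step 1: top-k hits from the existing sparse engine (doc_id, bm25_score).
sparse_hits = [("doc1", 12.4), ("doc2", 11.9), ("doc3", 9.7)]

# Step 2: dense embeddings, e.g. precomputed by a sentence encoder
# at indexing time (toy 3-dimensional vectors here).
doc_vectors = {
    "doc1": [0.1, 0.9, 0.2],
    "doc2": [0.8, 0.1, 0.3],
    "doc3": [0.7, 0.2, 0.4],
}
query_vector = [0.75, 0.15, 0.35]

# Step 3: blend the normalized sparse score with dense cosine similarity.
max_bm25 = max(score for _, score in sparse_hits)
alpha = 0.5  # sparse-vs-dense weight; an assumption to tune per use case

def blended_score(hit):
    doc_id, bm25 = hit
    return alpha * (bm25 / max_bm25) + (1 - alpha) * cosine(
        query_vector, doc_vectors[doc_id]
    )

reranked = sorted(sparse_hits, key=blended_score, reverse=True)
print([doc_id for doc_id, _ in reranked])  # → ['doc2', 'doc3', 'doc1']
```

Note how the dense signal reorders the sparse ranking here: `doc1` wins on the keyword score alone, but its embedding is far from the query vector, so it drops to last. The existing pipeline stays intact; only a small reranking step is added on top.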
In this talk, the following systems and use cases were demoed:
- Multilingual search with @muves_io
- Image search with CLIP by @laion_ai
- Question answering with @weaviate_io
As usual, this episode has been annotated with time codes and research papers so you can make the most of it.