Evripidis (or Evri) is a post-doctoral Research Associate at the University of Edinburgh, School of Informatics, and the University of Groningen, Faculty of Science and Engineering.
He is interested in bio-accurate artificial intelligence and information theory. He currently studies the effectiveness of different forms of working memory constrained by insect neuroscience and nanotechnology hardware, aiming to build an anatomically accurate polarised-light compass circuit.
PhD, Bio-inspired Robotics & Autonomous Systems, 2023
University of Edinburgh
MSc, Artificial Intelligence, 2016
University of Edinburgh
BSc, Computer Science, 2013
Aristotle University of Thessaloniki
I explore the effectiveness of different forms of working memory constrained by insect biology and nanotechnology hardware, and I build an anatomically accurate polarised-light compass circuit.
Advisors: Prof Barbara Webb, Prof Subramanian Ramamoorthy.
Thesis examiners: Prof Thomas Nowotny (external), Prof J. Douglas Armstrong (internal).
Project: Insect neuroethology of reinforcement learning, Funding: RAS CDT.
I investigated how insects form associative memories from reinforcements in the environment, and how this shapes their behaviour, framed in terms of reinforcement learning.
Advisors: Dr Michael Mangan (Sheffield), Prof Barbara Webb (Edinburgh).
Project: Invisible Cues, Funding: Engineering and Physical Sciences Research Council.
I was responsible for investigating the information content of polarised light in relation to animal navigation, then using the outcomes to develop a technical specification and design for the manufacture of a novel robot sensor.
I focus on imitating the learning mechanism of the larval Drosophila, which forms associations between odours and tastes. The goal is to model this mechanism at the neural level and implement it on a robot platform. The robot will then try to locate the gustatory source by following the gradient of the associated odour.
My main task was to implement a toolbox, using C# and the WPF subsystem, for analysing and comparing human gestures tracked with different capture devices, i.e., Microsoft Kinect, Vicon, and WIMUs. I also implemented an extension of it that was compatible with Unity3D.
I have been tutoring, demonstrating and marking for a variety of courses including:
Introductory Applied Machine Learning: tutor, demonstrator, marker
Instructors: Dr Amos Storkey & Prof Chris Williams
Reinforcement Learning: tutor
Instructors: Dr Pavlos Andreadis & Dr Stefano Albrecht
System Design Project: Machine Vision and Quantitative Analysis expert
Instructors: Dr Steve Tonneau, James Garforth & Prof Barbara Webb
Heterogeneous Parallel Programming: responsible for answering students' questions on the course material and assignments
Instructor: Prof Wen-mei W. Hwu