Virtual fly test tube

In an ongoing collaboration with Dr Frank Hirth's lab, a couple of months ago I was invited to write the code for a simulated insect navigating a virtual two-dimensional world. The aim was to test whether a recently developed neural model of the insect central complex (see related work here and here) could be exploited to guide purposeful navigation. [UPDATE: the paper has been accepted and will soon be published. You can find the abstract here: http://journal.frontiersin.org/article/10.3389/fnbeh.2017.00142/abstract]

The first step was to create the graphics for both the virtual arena and the simulated insect. To test the system, I first implemented a sort of test tube in which actions are not driven by a neural system but selected at random. This served to test the movements of the agent (e.g. whether obstacles were effective or ignored), the simulated perception of cues in the arena (e.g. whether the visual field was properly coded) and, in general, all the graphical features of both environment and agent.
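The "test tube" stage can be sketched as a random walk inside a bounded arena. This is only a minimal illustration in Python/NumPy, not the original code: the names (`step`, `run`, the arena constants) are mine, and here the arena walls stand in for obstacles.

```python
import numpy as np

ARENA_SIZE = 100.0    # side of the square arena (arbitrary units, my assumption)
STEP_LEN = 1.0        # distance travelled per time step
MAX_TURN = np.pi / 6  # maximum random turn per step

rng = np.random.default_rng(0)

def step(pos, heading):
    """One randomly selected action: turn a little, then move forward.

    If the move would cross the arena wall (the simplest obstacle),
    the agent stays put and turns around instead.
    """
    heading += rng.uniform(-MAX_TURN, MAX_TURN)
    new_pos = pos + STEP_LEN * np.array([np.cos(heading), np.sin(heading)])
    if np.any(new_pos < 0) or np.any(new_pos > ARENA_SIZE):
        return pos, heading + np.pi  # obstacle blocks the move
    return new_pos, heading

def run(n_steps=1000):
    """Simulate the agent and return its trajectory as an (n+1, 2) array."""
    pos = np.array([ARENA_SIZE / 2, ARENA_SIZE / 2])
    heading = 0.0
    trajectory = [pos.copy()]
    for _ in range(n_steps):
        pos, heading = step(pos, heading)
        trajectory.append(pos.copy())
    return np.array(trajectory)

traj = run()
```

Checking that `traj` never leaves the `[0, ARENA_SIZE]` square is exactly the kind of sanity test this stage was for: if an obstacle were ignored, the trajectory would cross it.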

The second step was to record the graphical output, so as to generate short videos of the behaviour of the simulated agent. There are several examples online; in this case, the video handling is essentially defined in three blocks of code: "video initialization", "frame capture" and "video recording" (see the commented text below).

I share this preliminary code as a relatively simple example of how to produce a live stream of graphical output and record it for later viewing. Again, I want to stress that the agent in this example does not rely on a neural system to produce its behaviour. The actual study based on the neural system is currently under review.

Video example of random navigation in the environment:

Source for the main file (the zip archive with the entire code is at the end).

To download the archive with all the files for the entire code: fly_game
