In this study we present the results of evaluating the sonification protocol of a new assistive product that aims to help visually impaired people move more safely by means of sounds organized into different cognitive profiles. The evaluation was carried out with 17 sighted and 11 visually impaired participants. The experiment spanned both virtual and real environments and was divided into three virtual-reality-based tests. Finally, four participants received longer and more in-depth training to become expert users; they took part in a real-life test and, at the end of the tests, in a focus group. Both quantitative and qualitative results were collected, showing that the proposed system helps users understand their surroundings through sound. However, important limitations were found: several demographic characteristics of the sample are strongly correlated, which limits segregated analysis; the most complex profile showed usability problems; and the completely blind participants faced greater difficulties than the sighted and low-vision participants.