pulse2percept 0.9.0.dev0 documentation¶
Retinal degenerative diseases such as retinitis pigmentosa and macular degeneration result in profound visual impairment in more than 10 million people worldwide, and a variety of sight restoration technologies are being developed to target these diseases.
Retinal prostheses, now implanted in over 500 patients worldwide, electrically stimulate surviving cells in order to evoke neuronal responses that are interpreted by the brain as visual percepts (‘phosphenes’). However, interactions between the device electronics and the retinal neurophysiology result in perceptual distortions that may severely limit the quality of the generated visual experience:
(left: input stimulus, right: predicted percept)
Built on the NumPy and SciPy stacks, pulse2percept provides an open-source implementation of a number of computational models for state-of-the-art visual prostheses (also known as the ‘bionic eye’), such as Argus II, BVA24, and PRIMA, offering insight into the visual experience these devices provide.
Simulations such as the one above are likely to be critical for generating realistic estimates of prosthetic vision, thereby giving regulatory bodies guidance on which visual tests are appropriate for evaluating prosthetic performance, and informing the design of current and future devices.
If you use pulse2percept in a scholarly publication, please cite as:
M Beyeler, GM Boynton, I Fine, A Rokem (2017). pulse2percept: A Python-based simulation framework for bionic vision. Proceedings of the 16th Python in Science Conference (SciPy), p.81-88, doi:10.25080/shinma-7f4c6e7-00c.
Once you have Python 3 and pip, the stable release of pulse2percept can be installed with pip:
pip install pulse2percept
The bleeding-edge version of pulse2percept can be installed via:
pip install git+https://github.com/pulse2percept/pulse2percept
When installing the bleeding-edge version on Windows, note that you will have to install your own C compiler first. Detailed instructions for different platforms can be found in our Installation Guide.
You can also skip installation and run pulse2percept in a Jupyter Notebook on Google Colab. Simply make the first cell in the notebook !pip install pulse2percept for the stable version, or !pip install git+https://github.com/pulse2percept/pulse2percept.git for the latest version.
Where to go from here¶
- Have a look at some code examples from our Example Gallery.
- Familiarize yourself with visual prostheses, electrical stimuli, and our computational models.
- See if your question has already been addressed in the FAQ.
- Request features or report bugs in our Issue Tracker on GitHub.