Commit 7805ec2: "Upload files" (initial commit, 0 parents)

14 files changed: 1364 additions & 0 deletions

.gitignore

Lines changed: 4 additions & 0 deletions
.ipynb_checkpoints
AidNet/.ipynb_checkpoints
src/.ipynb_checkpoints
src/__pycache__

AidNet/AidNet_framework.png

1.11 MB

AidNet/config.yaml

Lines changed: 7 additions & 0 deletions
#### AidNet model parameters
context_size_left: 4096 # samples - context added to the left side of the audio input to avoid convolutional edge effects
context_size_right: 4096 # samples - context added to the right side of the audio input to avoid convolutional edge effects
fs_audio: 24414.0625 # Hz - sampling rate for the audio input and output of the model
p0: 0.00002 # dB SPL reference of 2e-5 Pascal
audio_normalisation: 0.04 # 1/25 normalisation factor applied to the audio inputs
n_encoder_layers: 9 # number of encoder layers
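The configuration above is flat `key: value # comment` YAML. A minimal stdlib-only sketch of reading it into a Python dict (PyYAML's `yaml.safe_load` would work equally well, but PyYAML is not listed in `requirements.txt`, so this parser is an illustrative stand-in, not the repository's loader):

```python
# Stdlib-only parser for the flat key/value format of AidNet/config.yaml.
# Field names below match the file above; the function name is illustrative.

def load_config(text: str) -> dict:
    """Parse `key: value  # comment` lines into a dict of numbers."""
    config = {}
    for line in text.splitlines():
        line = line.split("#", 1)[0].strip()  # drop comments and whitespace
        if not line:
            continue
        key, _, value = line.partition(":")
        value = value.strip()
        # store integer-looking values as int, everything else as float
        config[key.strip()] = int(value) if value.isdigit() else float(value)
    return config

# The config.yaml contents, reproduced inline so the sketch is self-contained
CONFIG_TEXT = """
#### AidNet model parameters
context_size_left: 4096 # samples
context_size_right: 4096 # samples
fs_audio: 24414.0625 # Hz
p0: 0.00002 # dB SPL reference of 2e-5 Pascal
audio_normalisation: 0.04 # 1/25
n_encoder_layers: 9 # number of encoder layers
"""

config = load_config(CONFIG_TEXT)
```

Reading the file itself would be `load_config(open("AidNet/config.yaml").read())`.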

AidNet/hearing_thresholds.csv

Lines changed: 8 additions & 0 deletions
freqs,NH-mean,NH-std,HI,gain
250,58.717861311763,7.43838327796419,57.8045464772208,0.0000001
500,45.554198046307,8.4240673714432,47.1734890343861,1.61929098807916
1000,32.3408008899154,5.97274893313995,53.4739727965252,21.1331719066098
2000,28.0800127385574,4.41847043362963,62.9286521044867,34.8486393659293
4000,30.6058869996123,6.05867393704345,63.3045044472408,32.6986174476285
8000,47.2197347254779,4.513238889114,71.7891590508677,24.5694243253898
16000,43.6211909043233,6.44764545462357,68.1540186736578,24.5328277693345
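In this table the `gain` column appears to equal the threshold shift `HI - NH-mean`, floored near zero at 250 Hz where the impaired threshold falls below the normal-hearing mean. That derivation is an observation about the numbers, not something documented in the repository; a short stdlib check:

```python
import csv
import io

# Rows reproduced from AidNet/hearing_thresholds.csv so the check is
# self-contained; normally you would open the file instead.
CSV_TEXT = """freqs,NH-mean,NH-std,HI,gain
250,58.717861311763,7.43838327796419,57.8045464772208,0.0000001
500,45.554198046307,8.4240673714432,47.1734890343861,1.61929098807916
1000,32.3408008899154,5.97274893313995,53.4739727965252,21.1331719066098
2000,28.0800127385574,4.41847043362963,62.9286521044867,34.8486393659293
4000,30.6058869996123,6.05867393704345,63.3045044472408,32.6986174476285
8000,47.2197347254779,4.513238889114,71.7891590508677,24.5694243253898
16000,43.6211909043233,6.44764545462357,68.1540186736578,24.5328277693345
"""

rows = list(csv.DictReader(io.StringIO(CSV_TEXT)))
shifts = {float(r["freqs"]): float(r["HI"]) - float(r["NH-mean"]) for r in rows}
gains = {float(r["freqs"]): float(r["gain"]) for r in rows}

# Above 250 Hz, the stored gain matches the HI-minus-NH threshold shift
for f in (500.0, 1000.0, 2000.0, 4000.0, 8000.0, 16000.0):
    assert abs(shifts[f] - gains[f]) < 1e-6
```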

AidNet/model_weights.hdf5

34.8 MB
Binary file not shown.

AidNet_example.ipynb

Lines changed: 502 additions & 0 deletions
Large diffs are not rendered by default.

LICENSE

Lines changed: 53 additions & 0 deletions
UCL Academic License: Terms and conditions for use and reproduction

1. Definitions.

"License" shall mean the terms and conditions for use and reproduction of the Work as defined by Sections 1 through 8 of this document.

"Licensor" shall mean University College London, which is authorized by the Authors to grant the License.

"Licensee" (or "Licensees") shall mean an individual or legal entity exercising permissions granted by this License.

"Source" form shall mean the preferred form for making modifications, including but not limited to software source code, documentation source, and configuration files.

"Object" form shall mean any form resulting from mechanical transformation or translation of a Source form, including but not limited to compiled object code, generated documentation, and conversions to other media types.

"Work" shall mean "Modeling neural coding in the auditory brain with high resolution and accuracy" made available under the License, as indicated by a notice that is included in or attached to the work.

"Derivative Works" shall mean any work, whether in Source or Object form, that is based on (or derived from) the Work and for which the editorial revisions, annotations, elaborations, or other modifications represent, as a whole, an original work of authorship. For the purposes of this License, Derivative Works shall not include works that remain separable from the Work and Derivative Works thereof.

2. Grant of License on Work.

2.1. Licensee acknowledges that Licensor controls all right, title and interest (including intellectual property rights) in the Work.

2.2. Subject to the terms and conditions of this License, the Licensor hereby grants to the Licensee a free of charge, non-exclusive, non-transferable copyright license to use the Work in Object form for any non-commercial purpose only, including teaching and research at educational institutions, research at not-for-profit research institutions and for personal not-for-profit use. Non-commercial use expressly excludes any profit-making or commercial activities, including without limitation sale, license, manufacture or development of commercial products, use in commercially-sponsored research, provision of consulting service, use for or on behalf of any commercial entity, and use in research where a commercial party obtains rights to research results or any other benefit. Any use of the Work and Derivative Works of the Work for any purpose other than non-commercial research shall automatically terminate this License.

2.3. No other rights, even implied, are granted herein, other than those mentioned in clause 2.2; in particular the Licensee is not allowed to:

a) modify, adapt, translate or create Derived Works based on the Work other than for academic research purposes, or knowingly permit any third party to engage in the foregoing.

b) use the Work for commercial purposes or knowingly permit any third party to engage in the foregoing.

c) sublicense, distribute, disclose, market, rent, lease, or transfer the Work to third parties without the prior written authorization of the Licensor.

d) use the Work for any diagnostic or treatment purposes.

3. Grant Back of License on Derived Works.

Each Licensee hereby grants to Licensor a perpetual, worldwide, non-exclusive, royalty-free, irrevocable license to reproduce, prepare, publicly display, publicly perform, sublicense, and distribute Derivative Works for any purposes in Source or Object form.

4. Distribution of Derived Works.

Licensee may reproduce and distribute copies of the Derivative Works for non-commercial academic research purposes in any medium and in Source or Object form, provided that Licensee meets the following conditions:

a) Prior to any distribution of Derived Works, the Licensee shall notify the Licensor of said distribution, pointing out the method and format of said distribution (specifying online / offline resource and intended audience).

b) The distribution of the Derived Works shall be subject to the terms and conditions for use, reproduction, and distribution as defined by Sections 1 through 7 of this document.

c) Licensee must cause any modified files to carry prominent notices stating that Licensee changed the files; and

d) Licensee must retain, in the Source form of any Derivative Works that Licensee distributes, the following copyright and attribution notice: © copyright 2024 University College London, all rights reserved; this Derivative Work is made available for non-commercial academic research purposes and subject to a UCL Academic License (url to license)

e) If the Work includes a "NOTICE" text file as part of its distribution, then any Derivative Works that Licensee distributes must include a readable copy of the attribution notices contained within such NOTICE file, in at least one of the following places: (a) within a NOTICE text file distributed as part of the Derivative Works; (b) within the Source form or documentation, if provided along with the Derivative Works; or, (c) within a display generated by the Derivative Works, if and wherever such third-party notices normally appear. The contents of the NOTICE file are for informational purposes only and do not modify the License. Licensee may add Licensee's own attribution notices within Derivative Works that Licensee distributes, alongside or as an addendum to the NOTICE text from the Work, provided that such additional attribution notices cannot be construed as modifying the License. Licensee may add Licensee's own copyright statement to Licensee's modifications.

5. Publications.

The originators of the Work shall be acknowledged in all written publications by referring to the following publication: "Optimal hearing aid design through restoration of the neural code" by Fotios Drakopoulos, Lloyd Pellatt, Shievanie Sabesan, Yiqing Xia, Tingchen Gong, Andreas Fragner, Nicholas A Lesica.

6. Trademarks.

This License does not grant permission to use the UCL trademarks, except as required for reasonable and customary use in describing the origin of the Work and reproducing the content of the NOTICE file.

7. Disclaimer of Warranty.

Unless required by applicable law or agreed to in writing, Licensor provides the Work (and each Licensee provides the Derivative Works) on an "AS IS" BASIS, WITHOUT WARRANTIES OR CONDITIONS OF ANY KIND, either express or implied, including, without limitation, any warranties or conditions of TITLE, NON-INFRINGEMENT, MERCHANTABILITY, or FITNESS FOR A PARTICULAR PURPOSE. Licensee is solely responsible for determining the appropriateness of using the Work and assumes any risks associated with Licensee's exercise of permissions under this License.

8. Limitation of Liability.

In no event shall Licensor be liable to Licensee for damages, including any direct, indirect, special, incidental, or consequential damages of any character arising as a result of this License or out of the use or inability to use the Work (including but not limited to damages for loss of goodwill, work stoppage, computer failure or malfunction, or any and all other commercial damages or losses), even if Licensor has been advised of the possibility of such damages.

END OF TERMS AND CONDITIONS

README.MD

Lines changed: 41 additions & 0 deletions
### AidNet: A DNN-based hearing aid for optimal restoration of central neural coding

This repository contains the source code of AidNet, a deep-learning-based model developed to restore impaired neural activity in the inferior colliculus (IC). The example AidNet model was trained to compensate for the hearing impairment of an example noise-exposed animal with a symmetric, mild, high-frequency hearing loss. An example Jupyter notebook `AidNet_example.ipynb` is included that can be used to process any sound using the provided AidNet model. A Google Colab version of the notebook can also be found [here](https://colab.research.google.com/drive/1iyJmhtX9c1OJ8LvsNw-KfrRf2L92BdY6?usp=sharing) and can be executed without a local Python installation. The supporting work can be cited as follows:

> Drakopoulos, F., Pellatt, L., Sabesan, S., Xia, Y., Gong, T., Fragner, A. & Lesica, N. A. (2026). Optimal hearing aid design through restoration of the neural code.

The *AidNet* folder contains the deep-neural-network (DNN) architecture, weights and configuration parameters of the pre-trained AidNet model. The *src* folder contains all the supplementary Python functions to execute AidNet. This repository also contains an example sound file `scribe_male_talker.wav` from the [UCL SCRIBE dataset](http://www.phon.ucl.ac.uk/resource/scribe), a pre-compiled Python version of the provided Jupyter notebook `AidNet_example.py`, this `README.MD` document and a license file. The repository can be used together with the [ICNet repository](https://github.com/fotisdr/ICNet) to simulate neural activity in the IC.

<p align="left">
<img src="AidNet/AidNet_framework.png" alt="AidNet Training Framework" height="100%" width="100%">
</p>

## How to use the AidNet model

To run the example Jupyter notebook and the AidNet model, install the Python packages listed in `requirements.txt`. We recommend installing [conda](https://www.anaconda.com/download) and executing the following commands:

```
conda create --name AidNet python=3.11
conda activate AidNet
pip install -r requirements.txt
```

The first two commands create (and activate) a new conda environment named `AidNet` with Python v3.11. If using only Python (without conda), running the last command is sufficient to install all necessary packages.

The example Jupyter notebook can then be executed in JupyterLab by running `jupyter lab &`, selecting the `AidNet_example.ipynb` notebook and running all cells.

----

## Citation

If you use this code, please cite the corresponding paper or this repository:

> Drakopoulos, F., Pellatt, L., Fragner, A. & Lesica, N. A. (2024). AidNet: A DNN-based hearing aid for optimal restoration of the neural code (v1.0). Zenodo.

## Contact

For questions, please reach out to one of the corresponding authors:

* Fotios Drakopoulos: f.drakopoulos@ucl.ac.uk
* Nicholas A Lesica: n.lesica@ucl.ac.uk

> This work was supported by UK EPSRC EP/W004275/1, BBSRC BB/Y008758/1 and MRC MR/W019787/1.
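The context and normalisation parameters in `AidNet/config.yaml` imply a simple input-preparation step before inference. A hedged sketch of that step, assuming zero-padding is how the notebook supplies the left/right context; the exact model API lives in `AidNet_example.ipynb`, and only NumPy (which is in `requirements.txt`) is used here:

```python
import numpy as np

# Parameters taken from AidNet/config.yaml
FS_AUDIO = 24414.0625   # Hz - model sampling rate
CONTEXT_LEFT = 4096     # samples of left context
CONTEXT_RIGHT = 4096    # samples of right context
AUDIO_NORM = 0.04       # 1/25 normalisation factor for audio inputs

def prepare_input(audio: np.ndarray) -> np.ndarray:
    """Normalise a mono waveform (already at FS_AUDIO) and zero-pad the
    context windows that avoid convolutional edge effects.
    The padding scheme is an assumption for illustration."""
    scaled = audio.astype(np.float32) * AUDIO_NORM
    return np.pad(scaled, (CONTEXT_LEFT, CONTEXT_RIGHT))

# Example: one second of silence at the model sampling rate
x = np.zeros(int(round(FS_AUDIO)), dtype=np.float32)
y = prepare_input(x)
# y has len(x) + 8192 samples; after inference the context would be trimmed
# so the model output matches the original input length.
```

Audio at other sampling rates would first be resampled to `FS_AUDIO` (e.g. with `librosa.resample`, since `librosa` is in `requirements.txt`).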

requirements.txt

Lines changed: 11 additions & 0 deletions
# This file may be used to create an environment using:
# $ pip install -r <this file>
# platform: linux-64
tensorflow==2.13.1
matplotlib==3.10.6
scipy==1.15.3
numpy==1.24.3
librosa==0.11.0
jupyterlab==4.4.7
typing_extensions==4.15.0
acoustics==0.2.6

scribe_male_talker.wav

1.34 MB
Binary file not shown.
