Digital Twin to Model the Effect of Spaceflight on the Retina

Dear all -

We will meet today to discuss progress made towards building a digital twin model of the retina. For the past few weeks, we have focused on assembling datasets and building a robust imputation pipeline. @vaishnavi.nagesh has made good progress on that front. We will continue the discussions on this and support her work.

Some changes:

  1. We are making rapid progress, and so we will switch to a weekly meeting format starting next week (every Thursday 10 am PST). We aim to push this work into a publication quickly, and we can use all the help we can get from you. Some tasks in mind are: (a) literature review, (b) coding, (c) making figures/diagrams, (d) building an interactive dashboard & web interface.

  2. We will organize a bi-weekly working session focused on paired/group coding, using VSCode’s Live Share extension. This meeting will be every other Friday 11 am PST. This meeting is intended to get members up to speed with accessing and manipulating data, so that all of us can feel comfortable using developer tools. Come to code with us and ask questions.

  3. Within the project, we will self-organize into two cohorts: a coding cohort and a data cohort. The coding cohort's job is to test new features & models. The data cohort's focus is on assembling more relevant datasets and putting everything together so that the data can be used by the coding cohort. You are welcome to join either or both. Our goal is to build this out into a foundational model of the retina. We will have some in-depth discussions on foundational models. I can tell you that this is really exciting!

Hope you join us on this journey together!

Best,
Jian.

3 Likes

Hello! I am a bit busy. I will listen to the recording.

1 Like

Hi Jian,
Hope the meeting went well and I am terribly sorry for not being able to make it today. I’ll listen to the recording.

No drastic updates from my end, very minor ones:

  1. Imputation code has been merged, and I have started documenting methods and results from imputation. I'll upload the doc to Google Drive.
  2. Started comparing our imputation approach to the reference publication on GANs that was posted on this forum.
    In general, I have started reading about GANs for next steps, but I am happy to pivot to anything that needs more attention.

Joining today's meeting and tomorrow's coding session is a little difficult since I need to tend to a health care issue.

2 Likes

Hi all -

We had a short meeting today and I did not end up recording it but will write an update here instead (faster this way).

  1. Evaluation of MICE. We looked over the mechanism of MICE and reasoned through the conditions under which it should be most effective. This exploration is summarized in these PowerPoint slides. Please take a look.

  2. Moving forward, I think we need to focus on evaluating the imputation, i.e., more cross-validation tests. One test that @jakubm suggested in our last meeting is to artificially remove some of the observed data (even though the data are very limited; we need to be highly selective in doing this) and compare the imputed values with the real, artificially removed values. There may also be other statistical measures. We need a good literature review and a table of all of these measures (need help!).

  3. Think about sequential imputation: we evaluated MICE and know that it currently uses a "random" imputation approach, filling data rows in random order. I think we can improve this with a non-random, sequential approach: first evaluate the existing information content at each row, then fill data first at the rows that already have the most information. However, this is a hypothesis; we can test it by making a direct comparison. @vaishnavi.nagesh, does the library currently support this?
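A minimal sketch of the masking-based evaluation from point 2, using scikit-learn's IterativeImputer as a stand-in for our MICE pipeline (the toy data, missingness rates, and variable names here are illustrative, not our actual dataset):

```python
# Hide a small, randomly chosen set of observed entries, run a MICE-style
# imputer, and compare the imputed values against the held-out truth.
import numpy as np
from sklearn.experimental import enable_iterative_imputer  # noqa: F401
from sklearn.impute import IterativeImputer

rng = np.random.default_rng(0)
X_true = rng.normal(size=(200, 10))
X_true[:, 1] += 0.8 * X_true[:, 0]          # give the imputer some signal

X = X_true.copy()
X[rng.random(X.shape) < 0.1] = np.nan       # pre-existing missingness

# Artificially mask 5% of the *observed* entries so we know the truth.
observed = ~np.isnan(X)
mask = observed & (rng.random(X.shape) < 0.05)
X_masked = X.copy()
X_masked[mask] = np.nan

X_imputed = IterativeImputer(random_state=0).fit_transform(X_masked)
rmse = np.sqrt(np.mean((X_imputed[mask] - X_true[mask]) ** 2))
print(f"held-out imputation RMSE: {rmse:.3f}")
```

As a side note on point 3: IterativeImputer exposes an `imputation_order` parameter ('ascending' fills the features with the fewest missing values first), which is the column-wise analogue of the row-ordering idea; ordering by per-row information content would need custom code on top of the library.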

See you all tomorrow at the coding session. I will go over the github repo and we can talk more about the dataset & the dashboard that is being developed.

Best,
Jian.

1 Like

Dear all -

In preparation for the group coding session today, I want to make sure you have:

  1. VS Code installed on your local computer.

  2. A GitHub account. If you don't already have one, it might be a good idea to get one now. It takes a few steps and a few minutes to set up (involving authentication with your phone). Then, log into your GitHub account from within VS Code.

  3. Install the Live Share extension within VS Code. This is easy: one click of a button.

Here is a test link if you need to test the connection: Visual Studio Code for the Web

I haven't tested this with a lot of people, and if we hit a bottleneck, we will switch to something else :slight_smile:.

See you all soon,

Jian.

1 Like

Hi @jgong and all: I will be able to join at 11:30 today.

1 Like

Hello @jgong! This is what I saw when I tried opening the link to work on the code.

Here is the publication behind the ARCHS4 dataset we were discussing in the meeting:

https://www.nature.com/articles/s41467-018-03751-6

It will give some insight into how it is made.

Jian.

2 Likes

Downloaded the most relevant data with respect to the genes for different eye conditions here.

1 Like

Hi @vaishnavi.nagesh and all, I am still working on my actions from last meeting, which I summarize below. I have completed Steps 1 and 2, so here is the link to a text file with the list of genes that come from the gene sets that are related to the phenotypes. V, you could go ahead and impute these genes and see how it performs!

Step 1: Identify a list of gene sets most closely related to each phenotype. See list of gene sets here.

Step 2. Combine all the genes from the phenotype-related gene sets in a non-redundant manner.

Step 3. Use linear regression to identify which genes are most predictive of each phenotypic measurement.

Step 4. Identify how much overlap there is between the regression genes and the gene-set genes. If there's a lot of overlap (e.g., > 70%), use that list. If there isn't much overlap (e.g., < 50%), then impute both gene lists and see which imputation performs better.
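The steps above can be sketched as follows; the gene names, toy expression matrix, and the top-k cutoff are placeholders, not our real data or thresholds:

```python
# Steps 2-4: non-redundant union of phenotype-related gene sets, regression
# ranking of predictive genes, and the overlap comparison between the two.
import numpy as np
from sklearn.linear_model import LinearRegression

# Step 2: non-redundant union of the phenotype-related gene sets.
gene_sets = {
    "set_a": ["GENE1", "GENE2", "GENE3"],
    "set_b": ["GENE2", "GENE4"],
}
set_genes = sorted({g for genes in gene_sets.values() for g in genes})

# Step 3: rank genes by absolute regression coefficient against a phenotype.
rng = np.random.default_rng(0)
all_genes = [f"GENE{i}" for i in range(1, 11)]
expr = rng.normal(size=(50, len(all_genes)))           # samples x genes
phenotype = expr[:, 0] * 2 + rng.normal(scale=0.1, size=50)

coefs = LinearRegression().fit(expr, phenotype).coef_
top_k = 4
regression_genes = [all_genes[i] for i in np.argsort(-np.abs(coefs))[:top_k]]

# Step 4: overlap fraction, relative to the smaller of the two lists.
overlap = set(set_genes) & set(regression_genes)
frac = len(overlap) / min(len(set_genes), len(regression_genes))
print(f"overlap: {frac:.0%}")  # >70% -> use merged list; <50% -> test both
```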

5 Likes

Thanks so much Lauren! I’ll get on this!

1 Like

Hi @vaishnavi.nagesh and all, I solved the issue of mapping human-to-mouse ENSEMBL gene IDs that I mentioned this morning (I used the approach described here).

As a result, the text file now has a list of mouse ENSEMBL gene IDs that you can use for imputation, derived from the gene sets most closely related to each phenotype.

I’m now working on steps 3-4 above. Will share code once I’m finished.
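For reference, a minimal sketch of what such an ID mapping can look like, assuming an orthology table exported (e.g., from Ensembl BioMart) as a two-column TSV of human and mouse IDs; the file contents and ID pairs below are a small illustrative stand-in, not the actual mapping file:

```python
# Build a human -> mouse ENSEMBL ID dictionary from an orthology TSV and
# translate a gene list, dropping genes without a mapped ortholog.
import csv
import io

# Stand-in for an orthologs.tsv file downloaded from a BioMart-style export.
tsv = io.StringIO(
    "human_ensembl_id\tmouse_ensembl_id\n"
    "ENSG00000139618\tENSMUSG00000041147\n"
    "ENSG00000141510\tENSMUSG00000059552\n"
)
human_to_mouse = {
    row["human_ensembl_id"]: row["mouse_ensembl_id"]
    for row in csv.DictReader(tsv, delimiter="\t")
}

human_genes = ["ENSG00000139618", "ENSG00000141510", "ENSG00000000003"]
mouse_genes = [human_to_mouse[g] for g in human_genes if g in human_to_mouse]
print(mouse_genes)  # genes without a mapped ortholog are silently dropped
```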

4 Likes

Thanks so much Lauren! That medium link is so cool!

1 Like

Thanks for sharing Lauren!

Hi @jgong and team, here’s an interesting idea: using images of the eye to predict/identify diseases in other parts of the body. Could we have an aspect of our digital twin that provides predictions for other health effects of spaceflight?

Dr. David Rhew, Microsoft’s Global Chief Medical Officer & VP of Healthcare, speaks briefly about this idea at the Gilbert Beebe Symposium on AI: https://www.youtube.com/live/318i4lFrhHQ?si=KgkspHgV5FYLbSyT&t=1079

Here are some of the citations from his slide:

Chopra R, Chander A, Jacob JJ. The eye as a window to rare endocrine disorders. Indian J Endocrinol Metab. 2012 May;16(3):331-8. doi: 10.4103/2230-8210.95659. PMID: 22629495; PMCID: PMC3354836.

The Eyes Offer a Window into Alzheimer's Disease. UC San Francisco.

Yuan TH, Yue ZS, Zhang GH, Wang L, Dou GR. Beyond the Liver: Liver-Eye Communication in Clinical and Experimental Aspects. Front Mol Biosci. 2021 Dec 24;8:823277. doi: 10.3389/fmolb.2021.823277. PMID: 35004861; PMCID: PMC8740136.

Flammer J, Konieczka K, Bruno RM, Virdis A, Flammer AJ, Taddei S. The eye and the heart. Eur Heart J. 2013 May;34(17):1270-8. doi: 10.1093/eurheartj/eht023. Epub 2013 Feb 10. PMID: 23401492; PMCID: PMC3640200.

Mendez I, Kim M, Lundeen EA, Loustalot F, Fang J, Saaddine J. Cardiovascular Disease Risk Factors in US Adults With Vision Impairment. Prev Chronic Dis. 2022 Jul 21;19:E43. doi: 10.5888/pcd19.220027. PMID: 35862513; PMCID: PMC9336192.

Glover K, Mishra D, Singh TRR. Epidemiology of Ocular Manifestations in Autoimmune Disease. Front Immunol. 2021 Nov 2;12:744396. doi: 10.3389/fimmu.2021.744396. PMID: 34795665; PMCID: PMC8593335.

20 Surprising Health Problems an Eye Exam Can Catch. American Academy of Ophthalmology.

5 Likes

Loving this idea, Lauren! I will dig into this; it feels like a powerful application for the digital twin. Eye behavior is indeed predictive of a whole suite of other potential issues in the human body.

Even redness in the eye tells us a person is tired or didn't sleep well; we make this inference all the time. Vision is one of our most powerful sensors of the world, but conversely, the eye is also a window into our own internal, holistic health.

Jian.

3 Likes

@dchander here is the Retinal group that was mentioned this morning.

This collaboration is a subgroup within the @AIMLawg (the meeting this morning was the @ALSDAawg).

Great idea!

1 Like

:yellow_heart: Interesting! Examination of the eye is a separate section in any medical case assessment. Ranging from iron-deficiency anemia and jaundice to Kayser–Fleischer (KF) rings in Wilson's disease and anisocoria in CVAs, the eye really is a window to the body. During our internal medicine training, if we skipped examination of the eye by chance, there was no doubt that the examiner would fail us. I still remember diagnoses being made by just a one-minute eye exam!

2 Likes

This is beautiful!

It is reminiscent of the centuries-old adage (taught in school training as well) that the "eyes are a window to the soul" (and, reading between the lines, to health), from 'The Prince of Medicine', Galen, and 'The Father of Optics', Hasan Ibn al-Haytham.

Sweet, connecting this to space :slight_smile:

3 Likes