Using SNP profiles of over 10k individuals from the Alzheimer's Disease Sequencing Project (ADSP), we have developed a computational framework for predicting the likelihood that someone will develop Alzheimer's Disease (AD). A key feature of this framework is a neural network that has been trained, through machine learning, to classify individuals as AD patients or non-AD controls with high accuracy. Importantly, these predictions were made on individuals never seen by the classifier, suggesting that high-accuracy diagnoses could transfer to the general population. In fact, only a few hundred genomic loci, identified by the learning algorithms, are needed. The neural net outputs a ‘confidence’ level for each prediction; for individuals registering high-confidence predictions, the classifier is over 90% accurate. Since the network weights have already been trained, and only a relatively small number of key variant loci are needed, this system could aid in clinical diagnostics; and as new genomes and clinical statuses are added, its performance will continue to improve over time.
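As a minimal sketch of the idea (the ADSP data and the trained network are not reproduced here), the snippet below trains a small one-hidden-layer network on synthetic 0/1/2 minor-allele counts, then treats the sigmoid output as a confidence score so that accuracy can be reported separately for high-confidence calls. All sizes, effect weights, and thresholds are illustrative assumptions, not the actual model's parameters.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic stand-in for SNP genotype data: 0/1/2 minor-allele counts at a
# few hundred loci, with hypothetical per-locus effect sizes.
n_samples, n_loci = 2000, 300
X = rng.integers(0, 3, size=(n_samples, n_loci)).astype(float)
true_w = rng.normal(0.0, 0.3, n_loci)
logit = X @ true_w
logit -= logit.mean()
y = (rng.random(n_samples) < 1.0 / (1.0 + np.exp(-logit))).astype(float)

# Standardize features and hold out a test set: predictions are evaluated
# on individuals the classifier has never seen during training.
X = (X - X.mean(axis=0)) / X.std(axis=0)
X_tr, X_te, y_tr, y_te = X[:1500], X[1500:], y[:1500], y[1500:]

# One-hidden-layer network trained by full-batch gradient descent
# on the cross-entropy loss.
H = 32
W1 = rng.normal(0.0, 0.1, (n_loci, H)); b1 = np.zeros(H)
W2 = rng.normal(0.0, 0.1, H);           b2 = 0.0

def forward(X):
    h = np.tanh(X @ W1 + b1)
    return h, 1.0 / (1.0 + np.exp(-(h @ W2 + b2)))

lr = 0.5
for _ in range(2000):
    h, p = forward(X_tr)
    err = p - y_tr                              # dLoss/dlogit for cross-entropy
    dz1 = np.outer(err, W2) * (1.0 - h**2)      # backprop through tanh
    W2 -= lr * (h.T @ err) / len(y_tr);    b2 -= lr * err.mean()
    W1 -= lr * (X_tr.T @ dz1) / len(y_tr); b1 -= lr * dz1.mean(axis=0)

# The sigmoid output doubles as a confidence score: report accuracy
# separately for predictions far from the 0.5 decision boundary.
_, p_te = forward(X_te)
pred = (p_te > 0.5).astype(float)
confident = np.abs(p_te - 0.5) > 0.4            # i.e. p < 0.1 or p > 0.9
acc_all = float((pred == y_te).mean())
acc_conf = float((pred[confident] == y_te[confident]).mean()) if confident.any() else float("nan")
print(f"overall accuracy: {acc_all:.2f}; high-confidence accuracy: {acc_conf:.2f}")
```

High-confidence accuracy should exceed overall accuracy, since filtering on the sigmoid output discards exactly the individuals near the decision boundary — the same mechanism behind the >90% figure above, though the numbers here come from synthetic data.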
Here I provide a step-by-step walkthrough of the analysis behind a machine-learning-based platform for Alzheimer's Disease diagnosis. Here are some entry pages: Intro, More Neural Nets, PCA, t-SNE.
The study of actin dynamics is centrally important to understanding synaptic plasticity. Fortunately, actin research has produced a vast pool of experimental studies, as well as several quantitative models that provide excellent characterizations of actin polymerization kinetics. To simulate filament scaffolding in a dendritic model, I developed a stochastic 3D model of actin dynamics based on parameters from previously established steady-state, Monte Carlo, and stochastic models. The ability to simulate the evolution of actin networks in 3D makes this model unique.
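To give a flavor of the stochastic approach, here is a minimal Gillespie-style sketch of a single filament's barbed end — not the 3D network model itself. The rate constants are ballpark published values for barbed-end assembly (roughly 11.6 µM⁻¹s⁻¹ on, 1.4 s⁻¹ off, critical concentration near 0.12 µM) and should be treated as illustrative.

```python
import numpy as np

rng = np.random.default_rng(1)

# Gillespie simulation of a single actin filament's barbed end.
# Rate constants are ballpark literature values, not the model's exact
# parameters: K_ON in µM⁻¹ s⁻¹, K_OFF in s⁻¹.
K_ON, K_OFF = 11.6, 1.4
conc = 1.0                        # free G-actin concentration, µM

def simulate(t_max, n0=50):
    t, n = 0.0, n0                # time (s), filament length (subunits)
    times, lengths = [0.0], [n]
    while t < t_max:
        r_on = K_ON * conc                       # monomer-addition rate
        r_off = K_OFF if n > 0 else 0.0          # dissociation rate
        r_tot = r_on + r_off
        t += rng.exponential(1.0 / r_tot)        # waiting time to next event
        n += 1 if rng.random() < r_on / r_tot else -1
        times.append(t); lengths.append(n)
    return np.array(times), np.array(lengths)

times, lengths = simulate(t_max=60.0)
rate = (lengths[-1] - lengths[0]) / times[-1]    # net subunits per second
print(f"observed elongation rate: {rate:.1f} subunits/s")
```

With these numbers the net elongation rate should hover near k_on·C − k_off ≈ 10 subunits/s; dropping the free-monomer concentration below the critical concentration flips the filament into net depolymerization, which is the kind of regime switching the full 3D model has to capture at every filament end.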
It is now generally accepted that many forms of adaptive behavior, including learning and memory, engender lasting physiological changes in the brain; reciprocally, neural plasticity among the brain’s synaptic connections provides the capacity for learning and memory. Whenever I have to summarize my primary research focus in just a few words, those words always include "synaptic plasticity". Indeed, I feel that the key to fully understanding cognitive processes like memory formation is through studying neural dynamics at the cellular-network, synaptic, and molecular levels.
I have developed a machine learning tutorial focused on supervised learning, though it also touches on techniques like t-SNE. It makes heavy use of Tensorflow Playground to visualize what happens in multilayer neural networks during training, and it gives learners an opportunity to solve classification problems live, right in the web app.
Molecular-level synaptic plasticity is among my primary interests. I've studied and quantified the membrane diffusion properties of excitatory and inhibitory receptors, and have developed models of how these particles swarm to potentiate synapses. I find that stochastic particle diffusion is intertwined with the first principles of statistics and probability. Given that synaptic potentiation depends on marshalling receptors undergoing stochastic diffusion, it seems that neurons have evolved into innate statistical computers. The result of 100 billion of these statistical computers making 100 trillion connections is the human brain. Here are some of my notes and code for simulating membrane diffusion.
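A minimal sketch of the kind of simulation involved: pure Brownian motion of receptors in the membrane plane, where each timestep adds a Gaussian displacement with variance 2·D·dt per axis, and the standard 2D relation MSD(t) = 4·D·t is used to recover the diffusion coefficient from the trajectories. The value of D here is an assumed, illustrative figure for a mobile surface receptor, not a measured one.

```python
import numpy as np

rng = np.random.default_rng(2)

# Brownian diffusion of receptors in the membrane plane.
D = 0.1          # diffusion coefficient, µm²/s (assumed, illustrative)
dt = 0.01        # timestep, s
n_steps, n_particles = 1000, 500

# Each step is an independent Gaussian displacement per axis.
steps = rng.normal(0.0, np.sqrt(2.0 * D * dt), size=(n_steps, n_particles, 2))
paths = np.cumsum(steps, axis=0)                 # positions relative to origin

# Mean squared displacement: for free 2D diffusion, MSD(t) = 4*D*t,
# so the slope of MSD vs. t divided by 4 recovers D.
t = np.arange(1, n_steps + 1) * dt
msd = (paths ** 2).sum(axis=2).mean(axis=1)
D_est = np.polyfit(t, msd, 1)[0] / 4.0
print(f"estimated D: {D_est:.3f} µm²/s (input was {D})")
```

Averaging over a few hundred particles is what makes the estimate stable — exactly the statistical marshalling argument above: any single receptor's trajectory is wildly noisy, but the ensemble behaves lawfully.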
You've found my wiki. This is where I hoard random information, with every intention of linking it all together someday. If you are so inclined, recent additions to this wiki can be found in the box on the right. For a non-curated glimpse of my activity, you can check out the latest wiki updates. Older wiki content can be accessed using the [search box] or by perusing all pages. If you would like to contact me, you can find that info on my home page.