Main Page
{| class="wikitable" align=right
{{SmallBox|float=left|clear=both|margin=0px 1px 8px 1px|padding=10px 1px 10px 1px|width=95%|font-size=16px|Genomics and Machine Learning|txt-size=12px|pad=6px 12px 2px 12px|
!colspan=2| About Me
----
|----
!colspan=2|[http://onesci.com/User:Bradley_Monakhos/Vita_Curriculum Vita]
|----
!colspan=2| Quick Links
|----
!colspan=2|[[MediaWiki:Sidebar|Sidebar]]<br>[[Wikipedia:Cheatsheet|Editing Guide]]<br>[[MediaWiki:Common.css|CSS]]<br>[[Friends]]<br>[http://toolserver.org/~dapete/ImageMapEdit/ImageMapEdit.html?en Image Mapper]<br>[[Hikes]]
|----
|}


{|


When I'm not fumbling around with [http://onesci.com OneSci] I can be found <s>working</s> having fun at the [http://sdsu.edu SDSU] Center for Behavioral Neurobiology. However, I just finished my master's thesis and plan to work as a lab tech (<small>in New York</small>) for a while, then try my luck applying to PhD programs in neuroscience.
Using SNP profiles we have developed a computational framework for making diagnostic predictions about the likelihood that someone will develop dementia. A key feature of this framework is a neural network algorithm that, through machine learning, has been trained to classify patients and controls with high accuracy. Importantly, these predictions have proven to generalize well to held-out genomes from independent sequencing projects, suggesting the classifier may perform well across samples of the general population. The bp status of just ~1k genomic loci was sufficient to achieve 80% prediction accuracy. Furthermore, the neural net outputs a ‘confidence’ score for each prediction; on high-confidence predictions the classifier is over 90% accurate (''confidence'' is not quantified ''post hoc'', it is divined ''a priori'' by deep neural nets). Since the neural network weights have already been trained, and because only a relatively small number of genomic targets are needed, we hope this system can be further developed into a clinical diagnostic tool. That said, such a tool is still far off; many independent test genomes will be required to validate it. In the meantime, we hope to continue to improve the classifier's performance using novel data and methods.
}}


|}
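The confidence-score idea above can be illustrated with a toy sketch: keep only predictions far from the decision boundary and report accuracy on that subset. The probabilities and labels below are made-up toy data, not outputs from the actual model, and `accuracy_at_confidence` is a hypothetical helper, not part of the framework itself.

```python
# Hypothetical sketch: using a classifier's per-prediction confidence to
# report accuracy on high-confidence calls only. Toy data, not real results.

def accuracy_at_confidence(probs, labels, threshold=0.0):
    """Accuracy over predictions whose confidence exceeds `threshold`.

    `probs` are predicted probabilities of the positive class (patient);
    confidence here is the distance from the 0.5 decision boundary.
    """
    kept = [(p, y) for p, y in zip(probs, labels) if abs(p - 0.5) >= threshold]
    if not kept:
        return None, 0
    correct = sum(1 for p, y in kept if (p >= 0.5) == bool(y))
    return correct / len(kept), len(kept)

# Toy predictions: confident ones are mostly right, borderline ones are noisy.
probs  = [0.95, 0.91, 0.08, 0.12, 0.55, 0.48, 0.52, 0.45, 0.88, 0.10]
labels = [1,    1,    0,    0,    0,    1,    0,    1,    1,    1   ]

overall, n_all  = accuracy_at_confidence(probs, labels, threshold=0.0)
high,    n_high = accuracy_at_confidence(probs, labels, threshold=0.3)
print(f"all {n_all} predictions: {overall:.0%}")                # 50%
print(f"{n_high} high-confidence predictions: {high:.0%}")      # 83%
```

On this toy set, filtering to the six high-confidence predictions lifts accuracy from 50% to 83%, mirroring the 80%-overall / 90%-high-confidence pattern described above.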


{{Box|width=45%|min-width=300px|float=left|font-size=14px|[[Actin|Actin Modeling]]|
The study of actin dynamics is centrally important to understanding synaptic plasticity. Fortunately, actin research has provided a vast pool of experimental studies, and several quantitative models that provide excellent characterizations of actin polymerization kinetics. To simulate filament scaffolding in a dendritic model, I developed a stochastic 3D model of actin dynamics based on parameters from previously established steady-state, Monte Carlo, and stochastic models. The ability to simulate the evolution of actin networks in 3D makes this model unique.
<br><br>
[[File:Actin modeling.png|right|600px]]


===DEEP-ISH THOUGHTS, RANTS, and MUSINGS===
}}
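The core of a stochastic polymerization model like the one above can be sketched with a minimal Gillespie-style simulation of a single filament. The rate constants below are commonly cited barbed-end values used purely for illustration; they are assumptions, not the parameters of the 3D model itself.

```python
import random

# Illustrative Gillespie-style stochastic simulation of single-filament actin
# polymerization. Rate constants are placeholder barbed-end values, not the
# parameters used in the 3D model described above.

def simulate_filament(k_on=11.6, monomer_uM=1.0, k_off=1.4,
                      n0=100, t_end=10.0, seed=42):
    """Return filament length (in subunits) after t_end seconds.

    k_on: association rate (per uM per s); k_off: dissociation rate (per s).
    """
    rng = random.Random(seed)
    n, t = n0, 0.0
    while True:
        rate_on = k_on * monomer_uM          # propensity of adding a subunit
        rate_off = k_off                     # propensity of losing a subunit
        total = rate_on + rate_off
        t += rng.expovariate(total)          # waiting time to the next event
        if t >= t_end:
            break
        if rng.random() < rate_on / total:   # pick which event fires
            n += 1
        elif n > 0:
            n -= 1
    return n

print(simulate_filament())   # net growth: ~10 subunits/s at 1 uM monomer
```

A full 3D network model adds filament positions, branching, and capping on top of this event loop, but the exponential waiting times and propensity-weighted event choice are the same.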


{{Box|width=45%|min-width=310px|float=right|font-size=14px|[[:Category:Synaptic Plasticity|Synaptic Plasticity]]|[[File:Synapses web.jpg|center|500px|link=Synaptic Plasticity]]{{Clear}}
It is now generally accepted that many forms of adaptive behavior, including learning and memory, engender lasting physiological changes in the brain; reciprocally, neural plasticity among the brain’s synaptic connections provides the capacity for learning and memory. Whenever I have to summarize my primary research focus using just a few words, they always include: "'''''synaptic plasticity'''''". Indeed, I feel that the key to fully understanding cognitive processes like memory formation is through studying neural dynamics at the cellular-network, synaptic, and molecular levels.
}}


==Search Engine Optimization (SEO)==
<!-- ####################################################### -->
''Here's a letter I wrote to a friend about his new website. I thought I'd share it in case anyone is interested in my take on building a Google-friendly website.''
{{Clear}}
<!-- ####################################################### -->


{{Box|width=45%|min-width=310px|float=left|font-size=14px|[[Neural Nets|Machine Learning Tutorial]]|
[[File:Neural-net-01.png|500px|link=Neural Nets]]{{Clear}} <br><br>


I don’t have anything bad to say about WordPress; it’s probably the best blog software out there, and it can definitely work as a homepage. However, I just think that optimizing it for search engines would be quite difficult. Let me explain. Google has automated software (referred to as web crawlers, spiders, or bots) that crawls from page to page on the internet and takes a copy of all the text on each page it visits (if you want to know how your website looks to a search engine spider, go to your homepage using Firefox and press Ctrl+U [or Apple+U on a Mac]). These spiders get around the internet by following every link on every new page they find. This is why some webmasters put a link to a sitemap somewhere on their homepage: it ensures that the Google spider will index every page of their website. If there are subpages on your website that someone cannot eventually reach simply by clicking their way there, then the spider will not find them. For example, if the Google spider follows a link from someone else's webpage to your Simple Machines Forum, the spider will not be making a stop at your homepage or other blog articles, because as of right now there are no links from your forum to your homepage (at least none that I can find).

The spider then sends this cached text version of your website to Google for search engine ranking analysis. The text on your homepage (and any other pages that were cached) at the time it was crawled dictates how Google will rank your pages for various search strings. Thus, someone trying to get high search rankings for a particular set of keywords will have a much easier time if nearly all the textual content on their homepage isn’t constantly changing. Furthermore, if you eventually get into the top ten for a good search string, you don’t want to lose that ranking.
For instance, someone might Google [anarchist news and discussions] and your website makes it into the top 10; then next week the Google spider comes along, indexes the new content on your homepage, and you fall out of the rankings again. Just something to think about.
I have developed a [[Neural Nets|machine learning tutorial]] focusing on supervised learning, though it also touches on techniques like t-SNE. It makes heavy use of TensorFlow Playground to visualize what happens in multilayer neural networks during training, and it gives learners an opportunity to solve classification problems live, right in the web app.


Getting people to visit, revisit, and participate should be the primary concern of every webmaster. High search engine rankings for relevant keywords are the MOST important thing for attracting new visitors; interesting and provocative content is the key to getting return visitors. I can help you with search engine optimization; getting repeat visitors is totally on you ;-)
<br><br>
}}
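The crawling behavior described in the letter (follow every link, never find orphaned pages) can be sketched in a few lines. The three-page "site" below is a made-up in-memory stand-in for real HTTP fetches; note that the forum page is never indexed because nothing links to it, exactly the problem described above.

```python
from html.parser import HTMLParser

# Illustrative sketch of how a search-engine spider discovers pages by
# following links. The in-memory SITE dict is a hypothetical stand-in for
# fetching real pages over HTTP.

class LinkParser(HTMLParser):
    """Collect the href of every <a> tag on a page."""
    def __init__(self):
        super().__init__()
        self.links = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            for name, value in attrs:
                if name == "href":
                    self.links.append(value)

SITE = {
    "/home":    '<a href="/blog">blog</a> <a href="/sitemap">sitemap</a>',
    "/blog":    '<a href="/home">home</a>',
    "/sitemap": '<a href="/home">home</a> <a href="/blog">blog</a>',
    "/forum":   '<a href="/forum">forum</a>',   # orphan: no inbound links
}

def crawl(start):
    """Breadth-first crawl: visit a page, harvest its links, repeat."""
    seen, queue = set(), [start]
    while queue:
        url = queue.pop(0)
        if url in seen or url not in SITE:
            continue
        seen.add(url)
        parser = LinkParser()
        parser.feed(SITE[url])
        queue.extend(parser.links)
    return seen

print(sorted(crawl("/home")))   # → ['/blog', '/home', '/sitemap']
```

Starting from the homepage, the spider reaches the blog and the sitemap but never `/forum`, so that page would stay invisible to search until someone links to it.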


Let me know if you have any more questions.
{{Box|width=45%|min-width=310px|float=right|font-size=14px|[[Brownian Motion]]|
Molecular-level synaptic plasticity is among my primary interests. I've studied and quantified membrane [[:Category:Diffusion|diffusion]] properties of excitatory and inhibitory receptors, and have developed models of how these particles swarm to potentiate synapses. I find that stochastic particle diffusion is intertwined with the first principles of [[:Category:Statistics|statistics and probability]]. Given that synaptic potentiation depends on marshalling receptors undergoing stochastic diffusion, it seems that neurons have evolved into innate statistical computers. The result of 100 billion of these statistical computers making 100 trillion connections is the human brain. Here are some of my [[:Category:Diffusion|notes and code for simulating membrane diffusion.]]
----
[[File:Brownian-Diffusion.gif|350px|center]]
}}
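A minimal version of a membrane-diffusion simulation is a 2D random walk whose mean squared displacement (MSD) grows as 4Dt, from which D can be recovered. The values of D and dt below are arbitrary illustrative choices, not measured receptor parameters.

```python
import random

# Rough sketch: simulate receptor membrane diffusion as a 2D random walk and
# recover the diffusion coefficient from the mean squared displacement (MSD).
# D and dt are illustrative placeholders, not experimental values.

def simulate_msd(D=0.1, dt=0.01, n_steps=500, n_particles=500, seed=1):
    """Estimate D by inverting MSD = 4*D*t for 2D Brownian motion."""
    rng = random.Random(seed)
    sigma = (2 * D * dt) ** 0.5            # per-axis Gaussian step size
    msd_total = 0.0
    for _ in range(n_particles):
        x = y = 0.0
        for _ in range(n_steps):
            x += rng.gauss(0, sigma)
            y += rng.gauss(0, sigma)
        msd_total += x * x + y * y         # squared displacement at t_end
    msd = msd_total / n_particles
    return msd / (4 * n_steps * dt)        # invert MSD = 4*D*t

print(f"estimated D ~ {simulate_msd():.3f}")   # should land near the true 0.1
```

Averaging over many particles is what makes the estimate stable; a single receptor trajectory is far too noisy, which is part of why the statistics-and-probability framing above is apt.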


-Brad




== Looking at Labs ==
{{Box|width=45%|min-width=340px|float=left|[[Hello]] internet person!|
You've found [[User:Monakhos|my wiki]]. This is where I hoard random information. I have every intention of linking it all together someday. If you are so inclined, recent additions to this wiki can be found in the box on the right. For a non-curated glimpse of my activity you can check out the [[Special:RecentChanges|latest wiki updates]]. Older wiki [[content]] can be accessed using the <nowiki>[search box]</nowiki> or by perusing [[Special:AllPages| all pages]]. If you would like to contact me, my contact info is on [http://bradleymonk.com my home page]. You can find a list of my [[publications]] here.
}}


[http://neuroscience.columbia.edu/department/index.php?ID=27&bio=189 Good Stuff]
{{Box|width=45%|min-width=310px|float=right|Popular Pages and Categories|
{{SmallBox|float=left|clear=none|margin=1px 1px 1px 10px|padding=1px|width=45%|font-size=12px|border-style=none|text-align=left|
*[[:Category:Synaptic Plasticity{{!}}Category:Plasticity]]
*[[APOE]]
*[[:Category:Journals]]
*[[:Category:Math]]
*[[:Category:Neuroscience Methods]]
}}
{{SmallBox|float=left|clear=none|margin=1px 1px 1px 10px|padding=1px|width=45%|font-size=12px|border-style=none|text-align=left|
*[[Genomics Terminology]]
*[[:Category:Diffusion]]
*[[:Category:Qual]]
*[[:Category:Neurobiology]]
*[[:Category:Malinow]]
}}


== My First Computer ==
}}
Well, not exactly, but it's the <big>[http://onesci.com/BM/computer first computer I built myself]</big>.




<!-- ####################################################### -->
{{Clear}}
<!-- ####################################################### -->








__NOTOC____NOEDITSECTION__
 
 
[[Category:Neurobiology]]

Latest revision as of 21:38, 11 March 2024
