Self Study - Neuroinformatics

Goal: to standardize and streamline the process of data collection and to produce more timely results.

As fulfillment for two classes, my project of choice was to investigate a hypothetical design for a research-oriented system revolving around the neural connections of the brain. An abstract is provided below.

Lessons Learned

  • Relational modeling and logical design

  • SQL: implementing, populating, and querying databases

  • Organizational and managerial decisions in database development

Rationalizing Scale

Three main problems prevailed in processing the data:

  • Lack of an anatomically organized collection process within the industry for placing data into an organizational schema

  • Need for a sizable storage volume to handle the heavy data load (10.5 TB)

  • Configuring both the schema and the network in human-readable form

Techniques

Hybrid relational, entity-attribute-value (EAV) database model

  • Benefits

    • "Attribute"-titled entities consisting of "attribute" classes to reduce complexity

    • Comparable to "fan" pattern

    • Interoperability, adaptability, and scalability

  • Drawbacks

    • Weak enforcement of integrity constraints and awkward handling of null values

    • Extensive upfront work
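The hybrid relational/EAV model above can be sketched with a small example. This is a minimal illustration, assuming hypothetical table and column names (`neuron`, `neuron_attribute`, and the sample attributes are my own illustration, not the project's actual schema): stable, well-known fields live in a conventional relational table, while sparse or evolving measurements are stored as entity-attribute-value triples.

```python
import sqlite3

conn = sqlite3.connect(":memory:")
cur = conn.cursor()

cur.executescript("""
-- Conventional relational table for stable, always-present fields.
CREATE TABLE neuron (
    neuron_id   INTEGER PRIMARY KEY,
    soma_region TEXT NOT NULL
);
-- EAV table: one row per (entity, attribute, value) triple instead of
-- one column per attribute, so new measurement types need no schema change.
CREATE TABLE neuron_attribute (
    neuron_id INTEGER REFERENCES neuron(neuron_id),
    attribute TEXT NOT NULL,
    value     TEXT NOT NULL
);
""")

cur.execute("INSERT INTO neuron VALUES (1, 'cortex')")
cur.executemany(
    "INSERT INTO neuron_attribute VALUES (?, ?, ?)",
    [(1, "axon_length_um", "12000"), (1, "dendrite_length_um", "450")],
)

# Querying requires pivoting triples back into rows -- part of the
# extensive upfront work listed under the drawbacks.
row = cur.execute("""
    SELECT n.soma_region,
           MAX(CASE WHEN a.attribute = 'axon_length_um' THEN a.value END)
    FROM neuron n JOIN neuron_attribute a ON a.neuron_id = n.neuron_id
    GROUP BY n.neuron_id
""").fetchone()
print(row)  # ('cortex', '12000')
```

The trade-off is visible even at this scale: adding an attribute costs only an INSERT, but the database itself no longer type-checks values or guarantees that required attributes exist.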

Use Case 1: Data Analysis

Relevant Models

Abstract

The human brain is one of the many wonders of the universe. Its complexity has consumed scientific researchers for centuries. It relies on over 100 billion nerve cells, which in turn rely on millions of other cells and neurons. However, to this day, the brain remains remarkably mysterious, even to its possessor. In the past century, researchers have proposed to map the white matter tracts in the brain by decomposing each individual cell body and reconstructing that cell. This process is referred to as connectomics, with a “connectome” denoting the sum total of connections between the neurons in a nervous system. Like its close relative, the “genome,” the term implies completeness; your connectome contains over one million times more connections than your genome has letters. Connectomes are so intriguing because, unlike genes, they have the capacity to evolve or “reweight” themselves throughout your lifetime; consider a new habit or a new language.

The task alone is daunting: the first problem is collecting the data, and the second is making use of it. To put it in perspective, a roundworm has a total of 302 neurons linked by 7,500 synapses. In comparison, the human cerebral cortex contains 10^11 neurons linked by 10^14 synaptic connections, spanning a range of six to nine orders of magnitude. With the typical neuron composed of three parts (dendrite, axon, and soma), every synapse is distinct in nature and magnitude. While axons can reach lengths of between 10 μm (micrometers) and 10^6 μm, their counterpart, the dendrite, may only fall between 10 and 1,000 μm, with less certainty. To match and validate a synapse, massive data collection is required; anatomical pairings are completed at resolutions of 2 nm or better. Composing, annotating, and processing such data is time-consuming as well as costly; the image data alone requires 10.5 TB of storage. Mapping the human brain at the neural level is a far more formidable task than for any other living animal that we know, and organizing, abstracting, and curating the data into human-readable form for analysis is an enormous undertaking. Computer-aided software and processing can alleviate such procedures.
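A quick back-of-envelope calculation makes the scale gap above concrete, using only the counts quoted in this abstract (roundworm vs. human cerebral cortex):

```python
import math

# Counts quoted in the abstract above.
worm_neurons, worm_synapses = 302, 7_500
human_neurons, human_synapses = 10**11, 10**14

# Average synapses per neuron in each nervous system.
print(worm_synapses / worm_neurons)    # ~25 synapses per neuron
print(human_synapses // human_neurons) # 1000 synapses per neuron

# Orders of magnitude separating the two connectomes by synapse count.
print(math.log10(human_synapses / worm_synapses))  # ~10.1
```

In other words, the human connectome is roughly ten orders of magnitude larger than the roundworm's, which is why the collection and storage problems dominate the project.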

The knowledge accumulated from the resulting data has the potential to be life-altering: it could give researchers new insight and potentially help cure unresolved diseases.


Full Report