Almost every living organism carries deoxyribonucleic acid, commonly known as DNA. This molecule is responsible for storing a cell’s genetic information and hereditary characteristics. Some viruses also use DNA to carry their genetic information.
The genetic information that DNA carries is used in processes such as growth, development, function, and reproduction. DNA is composed of distinct parts, including genes, the regions of DNA that control or influence physical characteristics, development, and growth.
Given the importance of DNA in carrying genetic information, it comes as no surprise that one of the most common questions surrounding DNA is whether or not it can help to predict an individual’s future. This article seeks to answer this question using current evidence on DNA testing for prediction purposes.
1. DNA and Future Health & Well-being
Most research on DNA and prediction of the future has focused on whether scientists can use DNA tests to predict diseases, with the goal of providing customized preventative care and treatment.
The extent to which DNA predicts a disease varies by disease. Some conditions, such as sickle cell anemia, Tay-Sachs disease, Down syndrome, and Huntington’s disease, are purely genetic and are therefore described as 100% penetrant: an individual carrying the causal genetic change will develop the condition.
On the other hand, some diseases are more complex, in that behavioral factors significantly influence the chances of developing them. These include conditions such as type 2 diabetes and rheumatoid arthritis. Such complex diseases often have several genetic and behavioral components. In the case of type 2 diabetes, research suggests that more than 12 known genes contribute to the condition. In the case of breast cancer, scientists have identified 93 genes in which mutations can contribute to the disease. Findings like these suggest that DNA can say something about one’s future health.
However, there is broad scientific consensus that while genetic testing plays a role in predicting future health, it does so with limited accuracy. This is because disease development often depends on how an individual’s genetic makeup interacts with environmental influences, including behavior, diet, chemical exposure, medication, and stress.
In fact, many researchers agree that these environmental and behavioral factors often influence disease risk more than genes do. For example, obesity, diet, and a sedentary lifestyle significantly affect the risk of type 2 diabetes. Similarly, an individual’s age at first childbirth and body weight significantly affect the risk of breast cancer.
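The interplay described above can be sketched as a toy model. In the sketch below, every weight and variable is hypothetical, chosen purely to illustrate how a fixed genetic score and modifiable lifestyle factors might combine in a logistic risk estimate; real risk models are fit to large cohort data.

```python
import math

def risk_probability(genetic_score, bmi, sedentary):
    """Toy risk estimate combining genetics and lifestyle (all weights hypothetical)."""
    # Log-odds: baseline plus genetic and lifestyle contributions.
    log_odds = -4.0 + 0.8 * genetic_score + 0.1 * (bmi - 25) + 0.6 * sedentary
    return 1 / (1 + math.exp(-log_odds))

# Same genetic score, two different lifestyles:
low = risk_probability(genetic_score=1.0, bmi=23, sedentary=0)
high = risk_probability(genetic_score=1.0, bmi=32, sedentary=1)
print(round(low, 3), round(high, 3))
```

The point of the sketch is simply that two people with identical genetics can end up with quite different risk estimates once environment is factored in.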
According to a study published in the journal Science Translational Medicine, most people receive negative results for most diseases. Although such results signify a low risk of developing the disease, the study suggests that these low-risk readings may end up being less useful than results signifying high risk.
High-risk results can inspire an individual to implement lifestyle changes, thereby reducing the risk of disease. The study also notes that individuals who receive negative results can still go on to develop the disease.
Another source of inaccuracy when using DNA to predict future disease is that very few definitive genetic correlates of disease are currently known, and most of them occur rarely.
Nonetheless, there is agreement that testing DNA for genes that predispose one to a certain disease can have real benefits, particularly in preventative care. For example, a DNA test may reveal a BRCA1 or BRCA2 mutation, both of which indicate an elevated risk of future breast cancer. Such results may prompt an individual to get regular mammograms and implement lifestyle changes that minimize the risk of disease.
It is for this same reason (encouraging preventative care) that the Food and Drug Administration has recently approved DNA testing for genes that predispose people to conditions including:
- Parkinson’s disease
- Late-onset Alzheimer’s disease
- Factor XI deficiency, a blood-clotting disorder
- Hereditary thrombophilia
- Gaucher disease type 1
- Celiac disease (an immune reaction to eating gluten)
- Early-onset primary dystonia, a movement disorder that causes uncontrolled movements and involuntary muscle contractions
Based on the evidence above, it is clear that while scientists can use DNA to predict one’s predisposition to certain diseases, they can do so with only limited accuracy. However, people can use DNA tests as a starting point for preventative care, helping to safeguard future health and well-being.
2. DNA and the Prediction of Intelligence and Academic Excellence
Academic performance is well known to depend on both inherent and external factors, such as quality of teaching and social background. An emerging body of research suggests that scientists can also use DNA to predict intelligence and potential future academic excellence.
Research conducted in 2010 proposed that high intelligence is familial and heritable. Further, the researchers found that the same genetic and environmental factors responsible for the normal distribution of intelligence also underlie high intelligence. In another study published in 2013, researchers affirmed that childhood intelligence is heritable and highly polygenic, meaning it is shaped by many genes, each with a small effect.
In a more comprehensive study at King’s College London, researchers identified 74 genetic variants that influence an individual’s academic achievement in subjects such as Math, English, Science, Humanities, and even second languages. These 74 variants were selected from approximately 10 million previously identified genetic variants known as single nucleotide polymorphisms. The researchers then used polygenic scoring to predict and examine differences in children’s academic achievement.
According to the findings, children with the highest polygenic scores were more likely to obtain A and B grades in their GCSE exams, while those with lower polygenic scores were more likely to receive lower grades.
Further, the study established that up to 65% of those with high polygenic scores proceeded to do their A levels; at the same time, only 35% of those with lower polygenic scores pursued A levels. The study concluded that it was possible to use polygenic scoring to predict up to 10% of the differences between the academic achievements of children.
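Polygenic scoring itself is conceptually simple: each scored variant contributes its estimated effect size multiplied by how many trait-associated alleles the person carries (0, 1, or 2), and the contributions are summed. The sketch below illustrates the idea with made-up variant names and effect sizes; the actual study drew on 74 variants selected from roughly 10 million SNPs.

```python
# Hypothetical effect sizes for a handful of variants (illustrative only).
effect_sizes = {"rs0001": 0.12, "rs0002": -0.05, "rs0003": 0.08}

def polygenic_score(allele_counts):
    """Weighted sum: effect size x number of trait-associated alleles (0, 1, or 2)."""
    return sum(effect_sizes[snp] * count
               for snp, count in allele_counts.items())

child_a = polygenic_score({"rs0001": 2, "rs0002": 0, "rs0003": 1})
child_b = polygenic_score({"rs0001": 0, "rs0002": 2, "rs0003": 0})
print(round(child_a, 2), round(child_b, 2))
```

Because each individual variant explains very little, it is only the aggregate score that carries any predictive signal, and even then it accounts for at most about 10% of the differences between children.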
The findings from the study at King’s College London indicate that DNA testing is potentially a significant predictor of future academic excellence and achievement.
3. DNA and Prediction of Physical Attributes and Appearances
As noted earlier, DNA contains the genetic information that controls and influences physical attributes such as appearance.
As recent developments show, scientists can use a technique known as DNA phenotyping to predict physical appearance and characteristics with notable accuracy. In fact, police departments are increasingly using DNA phenotyping to predict what suspects look like. The technique often relies on DNA clues left at a crime scene, e.g., a drop of blood.
In a study published in PLOS Genetics, researchers combined genetic data with facial-map data, enabling them to link specific DNA markers to facial features and build a predictive algorithm. DNA phenotyping has been used to predict genetic ancestry, freckling, and eye, hair, and skin color, among other traits. The technique can also predict facial shape.
Some studies have also found that DNA phenotyping can predict extreme height with an accuracy of 0.75, and eye and hair color with an accuracy of about 0.9, on a scale where 0.5 represents random guessing and 1.0 a perfect predictor.
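The 0.5-is-random, 1.0-is-perfect scale is how AUC (area under the ROC curve) is read: it is the probability that a randomly chosen positive case receives a higher predicted score than a randomly chosen negative case. The sketch below computes it directly from that definition, using toy scores invented for illustration.

```python
def auc(scores_pos, scores_neg):
    """Probability a positive case outscores a negative case (ties count half)."""
    wins = sum((p > n) + 0.5 * (p == n)
               for p in scores_pos for n in scores_neg)
    return wins / (len(scores_pos) * len(scores_neg))

# Hypothetical predicted scores for "tall" people vs. people of average height:
tall = [0.9, 0.8, 0.6, 0.55]
not_tall = [0.7, 0.5, 0.4, 0.3]
print(auc(tall, not_tall))  # falls between 0.5 (chance) and 1.0 (perfect)
```

A predictor with AUC 0.75, like the reported height predictor, ranks a random positive above a random negative three times out of four.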
When used in forensics, DNA phenotyping is meant to improve investigations and facilitate the arrest of suspected criminals in the future. This has led to some experts likening the technique to receiving evidence from a “genetic eyewitness.”
Some studies have also explored the role of genetic variation in predicting the risk of injury and an individual’s future athletic prowess and performance. Researchers have identified at least 200 genes that may be positively associated with athletic performance.
For example, some suggest that genes can help predict a child’s future athletic performance, especially before the age of 9. One such DNA test uses a saliva sample to identify which of three possible combinations of ACTN3 gene variants a child carries: each child inherits one copy of the gene from the mother and one from the father.
Research suggests that children who inherit two copies of the X variant do not make the protein alpha-actinin-3, which implies that such children may perform best in endurance sports such as long-distance running, swimming, and cross-country skiing.
Conversely, children with one copy of the X variant and one copy of the R variant produce some alpha-actinin-3 and may therefore do better in power sports such as soccer and cycling. Children with two copies of the R variant produce the most alpha-actinin-3, making it likely that they will excel in power sports such as weightlifting, sprinting, and football.
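The three-genotype logic above reduces to a simple lookup: two inherited copies, each X or R, give the genotypes XX, RX, or RR. The sketch below encodes that mapping, with the profile strings paraphrased from the claims above rather than taken from any real test’s output.

```python
def actn3_profile(from_mother, from_father):
    """Map two inherited ACTN3 variants ('X' or 'R') to a simplified profile."""
    # Sort so that ("X", "R") and ("R", "X") give the same genotype label.
    genotype = "".join(sorted((from_mother, from_father)))  # "RR", "RX", or "XX"
    profiles = {
        "XX": "no alpha-actinin-3; may favour endurance sports",
        "RX": "some alpha-actinin-3; may favour power sports",
        "RR": "most alpha-actinin-3; may favour power sports",
    }
    return genotype, profiles[genotype]

print(actn3_profile("X", "R"))
```

Even framed this simply, the mapping is only a statistical tendency, not a guarantee of sporting ability, as the caveats below make clear.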
Unfortunately, research on the ability of DNA to predict physical appearance and performance is still relatively inconclusive. While DNA phenotyping shows a lot of promise in predicting physical appearance, researchers have not successfully used DNA to predict athletic success. It is therefore not possible to reach a firm verdict on the use of DNA to predict future athletic performance.
4. DNA and Prediction of Death
Death is a subject that many people would rather avoid altogether. Nonetheless, scientists suggest that there is a potential link between DNA and life expectancy. Human DNA comprises 46 chromosomes, each of which is capped by a stretch of DNA known as a telomere. Telomeres protect the chromosome; some experts have likened them to the plastic tips found on shoelaces.
Telomeres are typically long at birth and shorten throughout life as cells divide. Eventually, a telomere becomes too short to protect the chromosome, and the cell becomes inactive or, in some cases, dies. This knowledge has prompted scientists to believe that telomere length may help predict cellular health as well as a person’s longevity. In fact, studies indicate that elderly people often have shorter telomeres than younger people.
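The division-by-division attrition described above can be pictured with a toy simulation. Every number in the sketch below is hypothetical (a starting length, a fixed loss per division, and a senescence threshold); real telomere dynamics are far more variable.

```python
def divisions_until_senescence(length_bp=10_000, loss_per_division=70,
                               threshold_bp=4_000):
    """Count cell divisions before the telomere drops below a protective threshold."""
    divisions = 0
    while length_bp - loss_per_division >= threshold_bp:
        length_bp -= loss_per_division  # each division erodes the telomere cap
        divisions += 1
    return divisions

print(divisions_until_senescence())
```

The simulation makes the core idea concrete: a cell's remaining divisions are bounded, so shorter telomeres at any age imply less protective runway left.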
In addition, researchers have discovered that certain life experiences affect the rate at which telomeres shorten. For example, in an analysis of DNA samples taken from children at ages five and ten, researchers established that children who had experienced multiple types of violence, such as bullying and physical abuse, had shorter telomeres than children who had experienced only one form of violence or none at all.
The shortening of telomeres gradually increases the risk of certain conditions, particularly age-related diseases. On the same note, studies suggest that people who are physically active in their 40s and 50s tend to have telomeres only about 10% shorter than those of people in their 20s, whereas individuals who led a sedentary lifestyle in their 40s and 50s had telomeres up to 40% shorter.
Some researchers have also shown that it is possible to use other clues in DNA to predict death, which they attribute to a “biological clock” found in people’s DNA. By analyzing DNA methylation, the chemical modifications that accumulate on DNA over time, scientists can estimate an individual’s biological age.
According to these studies, individuals whose biological clocks indicated an age “older” than their actual age were likely to die much sooner. More specifically, those with a DNA methylation age five years higher than their chronological age had a 21% higher chance of mortality. Furthermore, this relationship held true even when other factors, such as heart disease and lifestyle factors such as smoking, were taken into account.
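To put the 21% figure in per-year terms, one can back out the compounded increase implied by five years of “age acceleration.” This is a purely illustrative calculation from the number quoted above, not a clinical model.

```python
# If 5 years of methylation-age acceleration corresponds to a 21% higher
# mortality risk, the implied compounded increase per year is the fifth root.
five_year_increase = 1.21           # 21% higher risk over 5 years of acceleration
per_year = five_year_increase ** (1 / 5)
print(f"about {(per_year - 1) * 100:.1f}% extra risk per year of age acceleration")
```

That works out to roughly 4% of additional risk per year of methylation-age acceleration, under the simplifying assumption that the effect compounds evenly.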
As the research suggests, it may be possible to use DNA to estimate a person’s chances of dying in the near future. However, as with DNA’s other predictive uses, research in this area remains largely inconclusive.