
Qualitative vs Quantitative Research Methods & Data Analysis

Saul McLeod, PhD

Editor-in-Chief for Simply Psychology

BSc (Hons) Psychology, MRes, PhD, University of Manchester

Saul McLeod, PhD, is a qualified psychology teacher with over 18 years of experience in further and higher education. He has been published in peer-reviewed journals, including the Journal of Clinical Psychology.

Learn about our Editorial Process

Olivia Guy-Evans, MSc

Associate Editor for Simply Psychology

BSc (Hons) Psychology, MSc Psychology of Education

Olivia Guy-Evans is a writer and associate editor for Simply Psychology. She has previously worked in healthcare and educational sectors.

The main difference between quantitative and qualitative research is the type of data they collect and analyze.

Quantitative data is information about quantities, and therefore numbers; qualitative data is descriptive and concerns phenomena that can be observed but not measured, such as language.
  • Quantitative research collects numerical data and analyzes it using statistical methods. The aim is to produce objective, empirical data that can be measured and expressed numerically. Quantitative research is often used to test hypotheses, identify patterns, and make predictions.
  • Qualitative research gathers non-numerical data (words, images, sounds) to explore subjective experiences and attitudes, often via observation and interviews. It aims to produce detailed descriptions and uncover new insights about the studied phenomenon.

On This Page:

What Is Qualitative Research?

Qualitative research is the process of collecting, analyzing, and interpreting non-numerical data, such as language. Qualitative research can be used to understand how an individual subjectively perceives and gives meaning to their social reality.

Qualitative data is non-numerical data, such as text, video, photographs, or audio recordings. This type of data can be collected using diary accounts or in-depth interviews and analyzed using grounded theory or thematic analysis.

Qualitative research is multimethod in focus, involving an interpretive, naturalistic approach to its subject matter. This means that qualitative researchers study things in their natural settings, attempting to make sense of, or interpret, phenomena in terms of the meanings people bring to them. Denzin and Lincoln (1994, p. 2)

Interest in qualitative data came about as the result of the dissatisfaction of some psychologists (e.g., Carl Rogers) with the scientific approach of psychologists such as the behaviorists (e.g., Skinner).

Since psychologists study people, the traditional approach to science is not seen as an appropriate way of carrying out research, since it fails to capture the totality of human experience and the essence of being human. Exploring participants’ experiences is known as a phenomenological approach (see Humanism).

Qualitative research is primarily concerned with meaning, subjectivity, and lived experience. The goal is to understand the quality and texture of people’s experiences, how they make sense of them, and the implications for their lives.

Qualitative research aims to understand the social reality of individuals, groups, and cultures as nearly as possible as participants feel or live it. Thus, people and groups are studied in their natural setting.

Examples of qualitative research questions include what an experience feels like, how people talk about something, how they make sense of an experience, and how events unfold for people.

Research following a qualitative approach is exploratory and seeks to explain ‘how’ and ‘why’ a particular phenomenon, or behavior, operates as it does in a particular context. It can be used to generate hypotheses and theories from the data.

Qualitative Methods

There are different types of qualitative research methods, including diary accounts, in-depth interviews, documents, focus groups, case study research, and ethnography.

The results of qualitative methods provide a deep understanding of how people perceive their social realities and in consequence, how they act within the social world.

The researcher has several methods for collecting empirical materials, ranging from the interview to direct observation, to the analysis of artifacts, documents, and cultural records, to the use of visual materials or personal experience. Denzin and Lincoln (1994, p. 14)

Here are some examples of qualitative data:

Interview transcripts : Verbatim records of what participants said during an interview or focus group. They allow researchers to identify common themes and patterns, and draw conclusions based on the data. Interview transcripts can also be useful in providing direct quotes and examples to support research findings.

Observations : The researcher typically takes detailed notes on what they observe, including any contextual information, nonverbal cues, or other relevant details. The resulting observational data can be analyzed to gain insights into social phenomena, such as human behavior, social interactions, and cultural practices.

Unstructured interviews : Generate qualitative data through the use of open questions. This allows the respondent to talk in some depth, choosing their own words. This helps the researcher develop a real sense of a person’s understanding of a situation.

Diaries or journals : Written accounts of personal experiences or reflections.

Notice that qualitative data could be much more than just words or text. Photographs, videos, sound recordings, and so on, can be considered qualitative data. Visual data can be used to understand behaviors, environments, and social interactions.

Qualitative Data Analysis

Qualitative research is endlessly creative and interpretive. The researcher does not just leave the field with mountains of empirical data and then easily write up his or her findings.

Qualitative interpretations are constructed, and various techniques can be used to make sense of the data, such as content analysis, grounded theory (Glaser & Strauss, 1967), thematic analysis (Braun & Clarke, 2006), or discourse analysis .

For example, thematic analysis is a qualitative approach that involves identifying implicit or explicit ideas within the data. Themes will often emerge once the data has been coded.
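
As a toy illustration of what happens after coding, the sketch below tallies how often each code appears across a set of hypothetical coded interview segments (the participants, segments, and code names are all invented). Such counts are only a starting point for grouping related codes into candidate themes; real thematic analysis is interpretive, not merely a frequency exercise.

```python
from collections import Counter

# Hypothetical coded interview excerpts: each segment has been assigned
# one or more codes by the researcher during qualitative coding.
coded_segments = [
    {"participant": "P1", "codes": ["isolation", "coping"]},
    {"participant": "P1", "codes": ["coping"]},
    {"participant": "P2", "codes": ["isolation", "family support"]},
    {"participant": "P3", "codes": ["family support", "coping"]},
]

# Tally how often each code appears across the data set -- a starting
# point for grouping related codes into candidate themes.
code_counts = Counter(
    code for segment in coded_segments for code in segment["codes"]
)

for code, count in code_counts.most_common():
    print(f"{code}: {count}")
```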


Key Features

  • Events can be understood adequately only if they are seen in context. Therefore, a qualitative researcher immerses her/himself in the field, in natural surroundings. The contexts of inquiry are not contrived; they are natural. Nothing is predefined or taken for granted.
  • Qualitative researchers want those who are studied to speak for themselves, to provide their perspectives in words and other actions. Therefore, qualitative research is an interactive process in which the persons studied teach the researcher about their lives.
  • The qualitative researcher is an integral part of the data; without the active participation of the researcher, no data exists.
  • The study’s design evolves during the research and can be adjusted or changed as it progresses. For the qualitative researcher, there is no single reality. It is subjective and exists only in reference to the observer.
  • The theory is data-driven and emerges as part of the research process, evolving from the data as they are collected.

Limitations of Qualitative Research

  • Because of the time and costs involved, qualitative designs do not generally draw samples from large-scale data sets.
  • The problem of adequate validity or reliability is a major criticism. Because of the subjective nature of qualitative data and its origin in single contexts, it is difficult to apply conventional standards of reliability and validity. For example, because of the central role played by the researcher in the generation of data, it is not possible to replicate qualitative studies.
  • Also, contexts, situations, events, conditions, and interactions cannot be replicated to any extent, nor can generalizations be made to a wider context than the one studied with confidence.
  • The time required for data collection, analysis, and interpretation is lengthy. Analysis of qualitative data is difficult, and expert knowledge of the area is necessary to interpret qualitative data. Great care must be taken when doing so, for example, when looking for symptoms of mental illness.

Advantages of Qualitative Research

  • Because of close researcher involvement, the researcher gains an insider’s view of the field. This allows the researcher to find issues that are often missed (such as subtleties and complexities) by the scientific, more positivistic inquiries.
  • Qualitative descriptions can be important in suggesting possible relationships, causes, effects, and dynamic processes.
  • Qualitative analysis allows for ambiguities/contradictions in the data, which reflect social reality (Denscombe, 2010).
  • Qualitative research uses a descriptive, narrative style; this research might be of particular benefit to the practitioner as she or he could turn to qualitative reports to examine forms of knowledge that might otherwise be unavailable, thereby gaining new insight.

What Is Quantitative Research?

Quantitative research involves the process of objectively collecting and analyzing numerical data to describe, predict, or control variables of interest.

The goals of quantitative research are to test causal relationships between variables, make predictions, and generalize results to wider populations.

Quantitative researchers aim to establish general laws of behavior and phenomena across different settings/contexts. Research is used to test a theory and ultimately support or reject it.

Quantitative Methods

Experiments typically yield quantitative data, as they are concerned with measuring things. However, other research methods, such as controlled observations and questionnaires, can produce both quantitative and qualitative information.

For example, a rating scale or closed questions on a questionnaire would generate quantitative data as these produce either numerical data or data that can be put into categories (e.g., “yes,” “no” answers).
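
A minimal sketch of this kind of coding, using invented questionnaire responses: closed "yes"/"no" answers become category counts, while rating-scale answers are already numerical and can be averaged directly.

```python
# Hypothetical questionnaire responses: one closed question ("yes"/"no")
# and one 1-5 rating-scale item per respondent.
responses = [
    {"exercises": "yes", "mood_rating": 4},
    {"exercises": "no",  "mood_rating": 2},
    {"exercises": "yes", "mood_rating": 5},
    {"exercises": "yes", "mood_rating": 3},
]

# Closed "yes"/"no" answers become category counts ...
yes_count = sum(1 for r in responses if r["exercises"] == "yes")

# ... and rating-scale answers are already numerical, so they can be averaged.
mean_mood = sum(r["mood_rating"] for r in responses) / len(responses)

print(f"'yes' answers: {yes_count}/{len(responses)}")
print(f"mean mood rating: {mean_mood}")
```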

Experimental methods limit the possible ways in which a research participant can react to and express appropriate social behavior.

Findings are, therefore, likely to be context-bound and simply a reflection of the assumptions that the researcher brings to the investigation.

There are numerous examples of quantitative data in psychological research, including in mental health research. Here are a few examples:

One example is the Experience in Close Relationships Scale (ECR), a self-report questionnaire widely used to assess adult attachment styles.

The ECR provides quantitative data that can be used to assess attachment styles and predict relationship outcomes.

Neuroimaging data : Neuroimaging techniques, such as MRI and fMRI, provide quantitative data on brain structure and function.

This data can be analyzed to identify brain regions involved in specific mental processes or disorders.

For example, the Beck Depression Inventory (BDI) is a self-report questionnaire widely used to assess the severity of depressive symptoms in individuals.

The BDI consists of 21 questions, each scored on a scale of 0 to 3, with higher scores indicating more severe depressive symptoms. 
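
A minimal sketch of this scoring scheme: 21 items, each answered 0–3, summed to a total between 0 and 63. The answer values below are invented; real BDI administration and interpretation require the published manual.

```python
# Hypothetical BDI-style scoring: 21 items, each answered 0-3, summed
# to a total severity score (the answer values here are invented).
def bdi_total(item_scores):
    if len(item_scores) != 21:
        raise ValueError("the BDI has 21 items")
    if any(s not in (0, 1, 2, 3) for s in item_scores):
        raise ValueError("each item is scored 0-3")
    return sum(item_scores)

answers = [1, 0, 2, 1, 0, 0, 1, 2, 0, 1, 0, 1, 1, 0, 2, 0, 1, 0, 0, 1, 1]
print(bdi_total(answers))  # total lies between 0 (minimum) and 63 (maximum)
```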

Quantitative Data Analysis

Statistics help us turn quantitative data into useful information to help with decision-making. We can use statistics to summarize our data, describing patterns, relationships, and connections. Statistics can be descriptive or inferential.

Descriptive statistics help us to summarize our data. In contrast, inferential statistics are used to identify statistically significant differences between groups of data (such as intervention and control groups in a randomized controlled study).
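
A minimal sketch of this distinction, using invented outcome scores and only Python's standard library: descriptive statistics summarize each group, while an independent-samples t statistic (pooled variance) asks whether the difference between group means exceeds what chance variation would predict.

```python
import statistics
from math import sqrt

# Hypothetical outcome scores for an intervention and a control group.
intervention = [12, 15, 14, 10, 13, 16, 14, 12]
control = [9, 11, 10, 8, 12, 9, 10, 11]

# Descriptive statistics: summarize each group.
for name, group in [("intervention", intervention), ("control", control)]:
    print(name, "mean:", statistics.mean(group),
          "sd:", round(statistics.stdev(group), 2))

# Inferential statistics: a pooled-variance independent-samples t statistic.
def t_statistic(a, b):
    na, nb = len(a), len(b)
    pooled_var = ((na - 1) * statistics.variance(a)
                  + (nb - 1) * statistics.variance(b)) / (na + nb - 2)
    return (statistics.mean(a) - statistics.mean(b)) / sqrt(
        pooled_var * (1 / na + 1 / nb))

print("t =", round(t_statistic(intervention, control), 2))
```

In practice the t value would be compared against a t distribution (or handed to a statistics package) to obtain a p value.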

  • Quantitative researchers try to control extraneous variables by conducting their studies in the lab.
  • The research aims for objectivity (i.e., without bias) and is separated from the data.
  • The design of the study is determined before it begins.
  • For the quantitative researcher, the reality is objective , exists separately from the researcher, and can be seen by anyone.
  • Research is used to test a theory and ultimately support or reject it.

Limitations of Quantitative Research

  • Context : Quantitative experiments do not take place in natural settings. In addition, they do not allow participants to explain their choices or the meaning the questions may have for them (Carr, 1994).
  • Researcher expertise : Poor knowledge of the application of statistical analysis may negatively affect analysis and subsequent interpretation (Black, 1999).
  • Variability of data quantity : Large sample sizes are needed for more accurate analysis. Small-scale quantitative studies may be less reliable because of the low quantity of data (Denscombe, 2010). This also affects the ability to generalize study findings to wider populations.
  • Confirmation bias : The researcher might fail to observe important phenomena because of a focus on theory or hypothesis testing rather than on theory or hypothesis generation.

Advantages of Quantitative Research

  • Scientific objectivity : Quantitative data can be interpreted with statistical analysis, and since statistics are based on the principles of mathematics, the quantitative approach is viewed as scientifically objective and rational (Carr, 1994; Denscombe, 2010).
  • Useful for testing and validating already constructed theories.
  • Rapid analysis : Sophisticated software removes much of the need for prolonged data analysis, especially with large volumes of data involved (Antonius, 2003).
  • Replication : Quantitative data is based on measured values and can be checked by others because numerical data is less open to ambiguities of interpretation.
  • Hypotheses can also be tested because of statistical analysis (Antonius, 2003).

Antonius, R. (2003). Interpreting quantitative data with SPSS . Sage.

Black, T. R. (1999). Doing quantitative research in the social sciences: An integrated approach to research design, measurement and statistics . Sage.

Braun, V. & Clarke, V. (2006). Using thematic analysis in psychology . Qualitative Research in Psychology , 3, 77–101.

Carr, L. T. (1994). The strengths and weaknesses of quantitative and qualitative research: What method for nursing? Journal of Advanced Nursing, 20(4), 716–721.

Denscombe, M. (2010). The Good Research Guide: for small-scale social research. McGraw Hill.

Denzin, N., & Lincoln. Y. (1994). Handbook of Qualitative Research. Thousand Oaks, CA, US: Sage Publications Inc.

Glaser, B. G., & Strauss, A. L. (1967). The discovery of grounded theory: Strategies for qualitative research. Aldine.

Minichiello, V. (1990). In-Depth Interviewing: Researching People. Longman Cheshire.

Punch, K. (1998). Introduction to Social Research: Quantitative and Qualitative Approaches. London: Sage

Further Information

  • Mixed methods research
  • Designing qualitative research
  • Methods of data collection and analysis
  • Introduction to quantitative and qualitative research
  • Checklists for improving rigour in qualitative research: a case of the tail wagging the dog?
  • Qualitative research in health care: Analysing qualitative data
  • Qualitative data analysis: the framework approach
  • Using the framework method for the analysis of
  • Qualitative data in multi-disciplinary health research
  • Content Analysis
  • Grounded Theory
  • Thematic Analysis


Back to the Future of Quantitative Psychology and Measurement: Psychometrics in the Twenty-First Century

Pietro Cipresso and Jason C. Immekus


Edited and reviewed by: Axel Cleeremans, Free University of Brussels, Belgium

*Correspondence: Pietro Cipresso [email protected]

This article was submitted to Quantitative Psychology and Measurement, a section of the journal Frontiers in Psychology

Received 2017 Aug 15; Accepted 2017 Nov 17; Collection date 2017.

Keywords: quantitative psychology, measurement, psychometrics, computational psychometrics, mathematical psychology

This is an open-access article distributed under the terms of the Creative Commons Attribution License (CC BY). The use, distribution or reproduction in other forums is permitted, provided the original author(s) or licensor are credited and that the original publication in this journal is cited, in accordance with accepted academic practice. No use, distribution or reproduction is permitted which does not comply with these terms.

Measurements in psychology have always been a significant challenge. Research in quantitative psychology has developed several methods and techniques to improve our understanding of humans. Over the last few decades, the rapid advancement of technology has led to more extensive study of human cognition, including both its emotional and behavioral aspects. Psychometric methods have integrated very advanced mathematical and statistical techniques into their analyses, and in our Frontiers Specialty (Quantitative Psychology and Measurement) we have stressed the methodological dimension of best practice in psychology. The long tradition of using self-reported questionnaires is still of high interest, but it is not enough in the twenty-first century.

We stress the use of innovative methods and technologies as psychometric tools. One of the most significant challenges in quantitative psychology and measurement concerns the integration of technologies and computational techniques into current standards.

In the following, our aim is to show how data collection can involve human behavior, internal states, and the manipulation of experimental settings. In particular, we define typical psychophysiological measures for a deeper understanding of internal states, analyzing the central and peripheral nervous systems, hormonal factors in the endocrine system, and the fascinating field of gene transcription in human neuroscience. These factors represent the measurement of the “internal” sphere, which is becoming of great interest across all fields of psychology, including social and affective science, not only the cognitive sciences. The idea of reading internal states has always been central in clinical and experimental psychology, but it is now becoming even more widespread, thanks to improvements in technology and lower costs.

Next, we highlight the measurement of exhibited behavior patterns, representing the “external” sphere of human thinking through expressed behavior. Again, technology is a critical aspect shedding new light on the field. The use of low-cost and high-end technologies for understanding verbal and nonverbal patterns is helping to identify innovative ways to measure the psychological factors leading to a behavior. These can be considered a new challenge for behavioral science, e.g., the use of commercial devices (such as the Kinect) in motor and cognitive neurorehabilitation. Linked to psychophysiology and exhibited behavior patterns, virtual reality is becoming a cutting-edge tool for experimental manipulation, allowing personalized experimental settings to be built while remaining within a laboratory.

We define and highlight the use of virtual reality in psychology as a remarkably low-cost tool for collecting data and creating realistic situations that can be used in clinical, experimental, and social settings, among others, and it is therefore of keen interest in several fields of psychology.

In conclusion, we present new methods and techniques already used in other fields but now expanding rapidly into psychology and psychometrics. Computational science, complex networks, and simulations are highlighted as promising new methods for the convergence of psychological science and technology. These have the ability to create innovative tools for better comprehension and quantitative measurement in psychology.

Psychophysiology: nervous system, endocrine system, and gene transcription

The use of biosensors in human research has become a reliable method for the quantitative and objective measurement of participants at the psychological, behavioral, and physiological levels. The use of biosensors, specifically in psychophysiology, is not an alternative to self-reports; rather, they can be considered a great asset in our effort to integrate additional information and enhance our understanding of specific patterns.

The advantage of psychophysiology is the possibility of recording internal states during an experience (Mauri et al., 2010; Blascovich, 2014; Kreibig et al., 2015; McGaugh, 2016). This means that the researcher can evaluate the impact of a specific experience without interrupting it to ask the user her/his opinion.

In the valence-arousal model (Russell, 1979; Lang, 1995), researchers are interested in identifying the affective states of subjects during experimental sessions. Two “activation” dimensions are mainly investigated, namely physiological arousal and emotional valence. The arousal-pleasantness plane is drawn in psychophysiology as a robust measurement of affective states. Physiological arousal is a measure of the sympathetic branch of the autonomic nervous system (ANS). Sympathetic activation increases the activity of the sudoriparous (sweat) glands, which can be measured through electrodermal activity (EDA), computed as galvanic skin response (GSR) or as skin conductance response (SCR). EDA is a direct measure of sympathetic activation with no intervention of the parasympathetic branch. Other good measures of physiological arousal, affected by both branches of the ANS, are skin temperature, heart rate, respiration rate, and pupil dilation. Measuring emotional valence is more complex, and researchers have generally used facial expression to identify this dimension. At the psychophysiological level, facial expressions have been tested through surface electromyography (sEMG), a noninvasive way to quantify muscle activation. In particular, activations of the zygomatic major and corrugator supercilii facial muscles are known as the best indicators of emotional valence: increased activity of the zygomatic major muscle indicates positive valence, and higher activation of the corrugator supercilii indicates negative valence (Blumenthal et al., 2005). Since emotional valence only identifies a positive or negative direction, we may also be interested in a more direct measure of emotional intensity, and the best index for this purpose is pupil dilation (Mauri et al., 2010).
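
As a toy sketch of this model, the function below places a measurement in a quadrant of the valence-arousal plane from an arousal index (standardized EDA) and two sEMG amplitudes. All thresholds, signal values, and the comparison rule are illustrative, not calibrated procedures.

```python
# A toy sketch of the valence-arousal model: all thresholds and signal
# values here are illustrative, not calibrated units.

def affect_quadrant(eda_zscore, zygomatic_emg, corrugator_emg):
    """Place a measurement in a quadrant of the valence-arousal plane.

    eda_zscore     -- standardized electrodermal activity (arousal index)
    zygomatic_emg  -- zygomatic major sEMG amplitude (positive valence)
    corrugator_emg -- corrugator supercilii sEMG amplitude (negative valence)
    """
    arousal = "high" if eda_zscore > 0 else "low"
    valence = "positive" if zygomatic_emg > corrugator_emg else "negative"
    return f"{arousal} arousal / {valence} valence"

print(affect_quadrant(1.2, 0.8, 0.3))   # e.g. an excitement-like state
print(affect_quadrant(-0.5, 0.2, 0.9))  # e.g. a boredom/sadness-like state
```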

Respiratory activity can be recorded to identify both voluntary and autonomic influences of respiration on cardiovascular activity, which can also be recorded. In particular, respiration (RSP) can be recorded through respiratory inductance plethysmography (RIP) with thoracic and abdominal strips. Cardiac activity, on the other hand, can be recorded through an electrocardiogram (ECG), in particular by identifying the R peak in the ECG waveform (with the conventional PQRSTU peaks). The oscillations in R-to-R intervals in the ECG waveform (also known as NN intervals, to emphasize normal beats) provide rich information on sympathovagal activation and can be used to compute heart rate variability (HRV) indices in the temporal and spectral domains. According to the guidelines of the Task Force of the European Society of Cardiology and the North American Society of Pacing and Electrophysiology, temporal, spectral, and nonlinear HRV indices can be considered a robust way to identify responses of the ANS (Malik, 1996).
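
As a minimal sketch of the temporal domain, the function below computes two standard time-domain HRV indices, SDNN (standard deviation of all NN intervals) and RMSSD (root mean square of successive differences), from a list of NN intervals; the interval values are invented for illustration.

```python
import statistics
from math import sqrt

def hrv_time_domain(nn_intervals_ms):
    """Two standard time-domain HRV indices from NN (R-to-R) intervals in ms:
    SDNN  -- standard deviation of all NN intervals
    RMSSD -- root mean square of successive NN differences
    """
    sdnn = statistics.stdev(nn_intervals_ms)
    diffs = [b - a for a, b in zip(nn_intervals_ms, nn_intervals_ms[1:])]
    rmssd = sqrt(sum(d * d for d in diffs) / len(diffs))
    return sdnn, rmssd

# Hypothetical NN intervals (milliseconds) extracted from an ECG recording.
nn = [812, 790, 830, 805, 795, 820, 810, 800]
sdnn, rmssd = hrv_time_domain(nn)
print(f"SDNN = {sdnn:.1f} ms, RMSSD = {rmssd:.1f} ms")
```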

Electroencephalographic (EEG) analyses of both the classic spectral bands (e.g., alpha, beta, delta, gamma) and evoked event-related potentials (ERPs) have also been used extensively in psychophysiological research (Thompson, 2016). More generally, data collection related to the brain is one of the largest fields in neuroscience and psychology. Neuroimaging techniques in particular have received a great deal of attention in psychological science and can provide a wide spectrum of information on human thinking, covering cognitive, affective, and relational aspects (Cipresso et al., 2012b). Moreover, improvements in methods for automating the analysis of brain imaging results have provided new access to important information about brain structure and its connections. This has been made possible by using deep learning and computational techniques, but also by integrating different kinds of methods, such as EEG in the scanner (addressing both spatial and temporal precision) and developments in PET and fMRI, including spectroscopy, diffusion, the connectome, and the other capabilities that neuroimaging and neuroscience methods can provide.
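
To make the spectral-band idea concrete, here is a deliberately naive sketch (a direct DFT, illustrative only; real pipelines use windowed FFTs and artifact rejection) that estimates power in the alpha and beta bands from a sampled signal; the sampling rate and the synthetic "EEG" are assumptions.

```python
from math import cos, sin, pi

# A minimal, naive-DFT sketch of estimating EEG power in classic spectral
# bands. Illustrative only; the sampling rate below is an assumption.
FS = 128  # sampling rate in Hz

def band_power(signal, lo_hz, hi_hz):
    """Sum spectral power over DFT bins whose frequency lies in [lo_hz, hi_hz)."""
    n = len(signal)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * FS / n
        if lo_hz <= freq < hi_hz:
            re = sum(signal[t] * cos(2 * pi * k * t / n) for t in range(n))
            im = -sum(signal[t] * sin(2 * pi * k * t / n) for t in range(n))
            power += (re * re + im * im) / n
    return power

# Synthetic one-second "EEG" dominated by a 10 Hz (alpha-band) oscillation.
signal = [sin(2 * pi * 10 * t / FS) for t in range(FS)]

alpha = band_power(signal, 8, 13)
beta = band_power(signal, 13, 30)
print(alpha > beta)  # the alpha band carries most of the power here
```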

By using biosensors to record peripheral and central nervous activity, we can obtain an enormous amount of data that can be analyzed for a deeper understanding of internal states during experimental tasks in psychological studies. There are, however, several problems with the use of biosensors; in particular, their obtrusiveness must be considered. Even if the goal is not to interfere with the experiment, this cannot be avoided entirely: one way or another, we end up affecting behavior by measuring it. In most behavioral experiments, i.e., research related to emotions, affective states, and cognitive assessment, such conditioning of the participants can threaten the validity and reliability of a study.

This is a well-known problem in all fields of research, and there are different ways to deal with it. First, more precise, less intrusive biosensors have been developed. There have also been significant advances in integrated biosensors, with an emphasis on avoiding discomfort by building wearable biosensors without patches or cables.

Human experiments necessarily involve technologies used to read the impact of the experience under study. However, the use of these technologies is not transparent to the participants: the more we want to know, the more we must investigate with sensors that are evident to the subject. We need to ask seriously whether there is a way to build contactless biosensors, able to collect data while remaining invisible to the data generator, that is, the human being. In physics, scientists face the same problem, well known as the “observer effect,” whereby measuring a physical system is possible only by influencing the system itself. In the biobehavioral sciences, the observer effect is even worse (Lewandowski et al., 2016). In consciousness research, the observer effect can be circumvented through “no-report” paradigms, in which the idea is precisely to avoid asking people to report on their experiences.

Interestingly, a foundational part of psychology and psychometrics consists not only of observing one's experiences, but also of thinking about them and reporting them. This is a key question on which we, as researchers in the field, are called to provide answers. Pervasive and ubiquitous computing could provide some solutions, using biosensors integrated into technology so personal as to be invisible to individuals.

Considering smartphones and the data collected by their integrated sensors (such as connected heart-rate watches), we can see that data collection can be transparent to everyone. In a big-data world, it is probably easier to infer information from existing devices than to collect new data. On the other hand, this requires a change in knowledge and methods, i.e., a shift from laboratory, experiment-driven designs to field, data-driven analyses. Even where this scenario is possible, it is not the solution to every measurement problem, but it remains a useful complement and a direction worth pursuing for better understanding.

Psychophysiology addresses both the activities of the nervous system and the endocrine system, i.e., the collection of glands in people that secrete hormones directly into the circulatory system to reach distant target organs. Concentrations of hormones, patterns of released hormones, and the numbers and locations of the receptors of the hormones are related significantly with human behavior at the cognitive, emotional, and relational levels. Moreover, the efficiency of hormone receptors is involved in gene transcription and vice versa. In particular, hormones influence human behavior, which in turn can influence hormones, and the cycle continues. Thus, the endocrine system also must be investigated as an important measurement in psychology. For example, several studies about psychological stress demonstrated that prolonged stress causes the release of glucocorticoid (Lazzarino et al., 2013 ; Cattaneo and Riva, 2016 ). This release is controlled by the hypothalamus (such as the sympathetic nervous system, which is activated by acute, time-limited stressors), but, in this case, the control is endocrinal. In fact, the hypothalamus, with a release factor [corticotropin-releasing hormone (CRH)], induces the pituitary gland to release adrenocorticotropic hormone (ACTH), also known as corticotropin, that targets the adrenal glands (also known as suprarenal glands) (Popoli et al., 2012 ; Fries et al., 2016 ). This process, referred to as the hypothalamic–pituitary–adrenal axis (HPA axis or HTPA axis), is a major neuroendocrine system that manages stress reaction regulating several body functions (among which, digestion, emotions, and sexuality) (Dickerson and Kemeny, 2004 ). The most important glucocorticoid is cortisol, which is indeed considered to provide an objective measure of chronic stress. The metabolic effect of cortisol is slower than the effect of adrenaline, but it lasts longer (Singh et al., 1999 ). 
The release of cortisol has several effects on an organism, such as increasing serum glucose through gluconeogenesis, increasing fat metabolism, and suppressing the immune system (to save energy). In addition, it has negative effects on some cognitive processes, such as memory and attention (McEwen and Sapolsky, 1995), by affecting the hippocampus, which mediates the cortisol-induced feedback inhibition of the HPA axis. Cortisol also can cause the death of neural cells in the frontal lobe, as well as having detrimental consequences for the cardiovascular system (Ockenfels et al., 1995; Miller et al., 2007).
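The negative-feedback loop just described (stress drives CRH, CRH drives ACTH, ACTH drives cortisol, and cortisol inhibits the hypothalamus) can be sketched as a toy discrete-time simulation. All rate constants below are illustrative assumptions, not physiological estimates:

```python
# Toy discrete-time sketch of the HPA-axis negative feedback loop.
# Rate constants are invented for illustration only.

def simulate_hpa(stress, steps=200, dt=0.1):
    """Return the cortisol trajectory under a constant stress input."""
    crh, acth, cortisol = 0.0, 0.0, 0.0
    trajectory = []
    for _ in range(steps):
        # Hypothalamus: stress drives CRH; cortisol inhibits it (feedback).
        d_crh = stress / (1.0 + cortisol) - 0.5 * crh
        # Pituitary: CRH induces ACTH release.
        d_acth = 1.0 * crh - 0.5 * acth
        # Adrenal cortex: ACTH drives cortisol secretion.
        d_cort = 1.0 * acth - 0.3 * cortisol
        crh += dt * d_crh
        acth += dt * d_acth
        cortisol += dt * d_cort
        trajectory.append(cortisol)
    return trajectory

low = simulate_hpa(stress=1.0)
high = simulate_hpa(stress=4.0)
```

Even this crude sketch reproduces the qualitative point of the text: sustained stress yields a sustained, elevated (but bounded, thanks to feedback) cortisol level.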

Analysis of exhibited behavior patterns

Gomez-Marin and colleagues (Gomez-Marin et al., 2014 ; Gomez-Marin and Mainen, 2016 ) defined animal behavior as “the macroscopic expression of neural activity, implemented by muscular and glandular contractions acting on the body, and resulting in egocentric and allocentric changes in an organized temporal sequence” (p. 1456). Behavior is relational, dynamic and multi-dimensional, and to measure it, in any analysis, we must consider all of these aspects.

Normally, psychological researchers are interested in self-reported behaviors during experiments, but this raises an important question: Do people behave consistently with what they self-report? This problem is probably even more significant than measurement itself, since it can affect the validity of all of our research. Technologies can be great instruments in psychological investigations, reducing the gap between individuals' behaviors and their opinions of their behaviors. For example, a test subject could make every effort not to be stressed, yet actually be more stressed than he or she thinks and/or more stressed than the general population (i.e., a normative sample). If psychophysiology can be used to understand internal states, behavioral patterns can be used to understand exhibited behaviors (Cipresso, 2015; Krakauer et al., 2017). Exhibited behaviors can also differ from what we expect. For example, in the famous Nisbett and Wilson experiment of 1977 (Nisbett and Wilson, 1977), a group of participants who heard a continuous, unsettling noise while watching a movie declared that they had enjoyed the experience less than a group without that distraction; however, Nisbett and Wilson showed that the expressed pleasantness levels were in fact the same for the group with the noise and the group without it. This kind of dissociation occurs surprisingly often in psychological experiments and needs to be considered when self-reported measures are quantified. The lesson learned is that exhibited behavior is not only the expressed behavior, and this needs to be taken into account in behavioral research.
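The kind of check behind the Nisbett and Wilson result can be sketched as a simple comparison of the two groups' self-reported enjoyment ratings. The ratings below are invented for illustration; the point is only the mechanics of the comparison:

```python
# Minimal sketch: comparing self-reported ratings of a "noise" group and a
# "no-noise" group with Welch's t statistic. Data are hypothetical.
from statistics import mean, stdev
from math import sqrt

def welch_t(a, b):
    """Welch's t statistic for two independent samples."""
    na, nb = len(a), len(b)
    va, vb = stdev(a) ** 2, stdev(b) ** 2
    return (mean(a) - mean(b)) / sqrt(va / na + vb / nb)

noise_group    = [6, 5, 7, 6, 5, 6, 7, 5]   # hypothetical 1-9 ratings
no_noise_group = [6, 6, 5, 7, 5, 6, 6, 5]

t = welch_t(noise_group, no_noise_group)
# A |t| near zero is consistent with "expressed pleasantness levels were
# the same" despite what participants declared about the noise.
```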

In particular, activity-related behavior reflects an action regulation that is continuous and observable and can be identified in expressions, contours, and other qualities, including vocal tonality. Moreover, gesture and posture are important cues of human communication, part of the non-verbal behavior that represents internal states, even if they remain “unsaid.” Verbal behavior, on the other hand, can be interpreted as indicating the “said” elements (Nisbett and Wilson, 1977; Giakoumis et al., 2012; Gomez-Marin et al., 2014).

Technologies to detect exhibited behavior are now also available in low-cost devices, such as the Microsoft Kinect, which was built for gaming but has been used extensively in behavioral research. High-end technologies also are used for the analysis of behaviors; for example, they are used for path analysis with neurological patients and in motion-capture systems. Other technologies that have been used extensively in behavioral research are based on body movements (such as the Kinect or Nintendo Wii) and eye movements (using commercial and high-end eye-trackers) (Cipresso et al., 2013). More recently, accelerometers and gyroscopes have been used as laboratory devices or in the mobile devices we use every day (such as smartphones). Smartphones have become an important tool for researchers interested in understanding behavior during daily activities and in the field (outside the laboratory) (Cipresso et al., 2012a; Gaggioli et al., 2013). The array of sensors included in current smartphones allows us to know an individual's position (via GPS), phone calls made (and received), physical activity, and many other exhibited behaviors. Smartphones also make it convenient for people to self-report their activities in specific daily contexts.
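As a concrete example of how accelerometer data can index physical activity, one common first step is to measure how far the acceleration magnitude deviates from gravity. The samples and threshold below are assumptions for illustration, not a standard pipeline:

```python
# Sketch: a simple activity index from raw smartphone accelerometer
# samples (x, y, z in units of g). Data values are invented.
from math import sqrt

def activity_index(samples):
    """Mean deviation of acceleration magnitude from 1 g (gravity at rest)."""
    mags = [sqrt(x * x + y * y + z * z) for x, y, z in samples]
    return sum(abs(m - 1.0) for m in mags) / len(mags)

resting = [(0.0, 0.0, 1.0), (0.01, 0.0, 0.99), (0.0, 0.02, 1.01)]
walking = [(0.3, 0.1, 1.2), (-0.4, 0.2, 0.7), (0.2, -0.3, 1.4)]
```

At rest the magnitude stays near 1 g, so the index is near zero; movement pushes it up, which is the basis of many activity-recognition features.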

Other classic technologies used to capture people's exhibited behaviors are audio and video devices, which are used for both qualitative studies and quantitative analysis. Some of these devices can be very sophisticated, providing speech analysis and video analysis for information retrieval based on artificial intelligence (Camastra and Vinciarelli, 2008).

Virtual reality

One of the best ways to experience situations in a controlled environment, such as a laboratory, is certainly virtual reality (VR). Thanks to VR, researchers are able to understand and measure several cognitive and emotional states and traits, including personality traits (Heim, 1993; Ryan, 2001; Sherman and Craig, 2002; Cipresso and Serino, 2014; Villani et al., 2016; Kane and Parsons, 2017). From a technological perspective, VR requires a standard commercial PC capable of 3D visualization, a head-mounted display (HMD) endowed with position trackers, and a game controller, such as a joypad (Riva et al., 2015). The image changes in real time as the tracker records the position and orientation of the HMD on the user's head and relays this information to the computer.

Psychologists usually describe VR as “an advanced form of human-computer interface that allows the user to interact with and become immersed in a computer-generated environment in a naturalistic fashion” (Schultheis and Rizzo, 2001). In general, the sense of presence, i.e., the feeling of being “inside” the simulated world, is the key element of VR as a communication device (Riva et al., 2014, 2015). Just as individuals are consciously “being there” in the real world, the feeling of presence in a technologically mediated environment provides a very similar experience; i.e., subjects are not “outside” the synthetic experience, but “inside” it. In other words, VR provides an experience that can be lived as if in a real place, while at the same time allowing experimental control in a lab setting with a totally manipulated environment (Riva et al., 2015).

In the past, the use of VR was limited by the high cost of hardware and software licenses. Over the last few years, the large market of head-mounted displays (HMDs), such as Oculus, HTC VIVE, OSVR, Gear, and others, has made it possible to buy a bundled PC and VR system, including HMD, joypad, and other input devices, for less than 3,000 Euro (Brooks, 1999; Riva et al., 2003; Riva and Waterworth, 2014). Unfortunately, the cost of the software is still problematic, not because of licenses, but because of the personnel costs of building an integrated VR system, which requires coding skills that psychologists are rarely inclined to learn. To address this, Riva and colleagues have spent the last decades creating open-access, free solutions for building virtual environments without any code (Riva et al., 2011; Cipresso and Riva, 2016).

In the ‘90s, cybertherapy and virtual rehabilitation were considered interesting fields of research, with several challenges and problems to solve (Lamson, 1994; North et al., 1996; Rothbaum et al., 1997). However, several controlled clinical trials demonstrated the efficacy of cybertherapy (Riva, 2003, 2005; Holden, 2005; Malloy and Milling, 2010; McCann et al., 2014). The concurrent reduction in HMD costs has led to a new era of cybertherapy and virtual rehabilitation, making assessment and quantitative measurement a field of interest in the clinical domain as well.

Since VR is a computer-based program, it can track everything. This is a great benefit for quantitative psychology and measurement, since the researcher can obtain precise, per-millisecond data for each event that occurs during the experience. Thus, VR can elicit behavior in a replicable setting and simultaneously record data and compute indexes while maintaining experimental control. VR allows the building of complex settings that researchers can manipulate and replicate in order to test realistic situations for behavior which, by definition, is relational, dynamic, and multi-dimensional (Cipresso, 2015). VR also can be connected to external devices; within virtual environments, it is possible to integrate devices such as biosensors, making it possible to measure the experience quantitatively during navigation by using interconnected biosensors and internal logs that record each event (Cipresso, 2015).

By fusing data from biosensors and devices interconnected within the VR environments, it is possible to synchronize all of these signals with the log of the VR events that the researcher has set to identify experimental conditions, as well as unexpected occurrences, incidental findings, and any other behaviors one may wish to analyze. In this sense, VR can be considered a great way to collect quantitative data on people's actual behaviors during realistic situations in simulated environments.
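The fusion step described above reduces, in its simplest form, to aligning two timestamped streams: for each logged VR event, find the physiological sample closest in time. The event names, timestamps, and heart-rate values below are invented for illustration:

```python
# Sketch: fusing a VR event log with a biosensor stream by nearest
# timestamp. Timestamps are in milliseconds; data are hypothetical.
from bisect import bisect_left

def nearest_sample(timestamps, values, t):
    """Return the sample value whose timestamp is closest to t."""
    i = bisect_left(timestamps, t)
    if i == 0:
        return values[0]
    if i == len(timestamps):
        return values[-1]
    before, after = timestamps[i - 1], timestamps[i]
    return values[i] if after - t < t - before else values[i - 1]

hr_t = [0, 1000, 2000, 3000, 4000]            # sensor timestamps (ms)
hr_v = [72, 74, 90, 88, 75]                   # heart-rate samples (bpm)
events = [("enter_room", 950), ("loud_noise", 2100), ("exit", 3900)]

fused = {name: nearest_sample(hr_t, hr_v, t) for name, t in events}
```

In practice one would interpolate or average over a window rather than pick a single nearest sample, but the alignment logic is the same.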

Computational science, complex networks, and simulations

One of the most pervasive scientific paradigms of the twenty-first century has been complexity. From the hard sciences to the social sciences, this paradigm has affected the evolution of how we think about given phenomena. The idea that very simple interacting elements are able to produce a complex structure is not only fascinating but also useful, since it allows us to explain complex phenomena as emerging from the interactions of simple elements that can be understood and analyzed (Miller and Page, 2009).
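The "simple interacting elements produce complex structure" idea can be made concrete with an elementary cellular automaton: each cell follows one fixed local rule, yet the global pattern that unfolds is complex. This is a generic textbook illustration (Rule 110 shown), not an example from the cited works:

```python
# One-dimensional binary cellular automaton with wrap-around edges.
# Each cell's next state depends only on itself and its two neighbors.
def step(cells, rule=110):
    """Apply one update of the elementary automaton given by `rule`."""
    n = len(cells)
    out = []
    for i in range(n):
        # Encode the (left, center, right) neighborhood as a 3-bit number.
        neighborhood = (cells[(i - 1) % n] << 2) | (cells[i] << 1) | cells[(i + 1) % n]
        # The rule number's bits give the output for each neighborhood.
        out.append((rule >> neighborhood) & 1)
    return out

row = [0] * 20 + [1] + [0] * 20   # start from a single "on" cell
history = [row]
for _ in range(10):
    row = step(row)
    history.append(row)
```

A single active cell under this one-line local rule spawns a growing, intricate pattern, which is the bottom-up logic the paragraph describes.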

The bottom-up approach exploits complexity, opens new ways to study psychological constructs, and provides new tools to answer old questions. Indeed, psychology historically experienced and contributed to the diffusion of complexity, with superb contributions in artificial intelligence, complex networks, and psychophysics, and, in general, through models, methods, and concepts, at both the theoretical and pragmatic levels, that still are part of complexity science (Myung, 2000; Friedenberg, 2009; Guastello et al., 2009).

The increased computational capacity that is currently available provides a new approach to quantitative psychology, and to measurement more generally, that goes well beyond merely a new way to analyze data. For example, network analysis is used extensively to study relational data, but it has also inspired new ways of thinking in psychology, such as the network theory of mental disorders (Fried and Cramer, 2016; Jones et al., 2017). In addition, computational technologies are providing new ways to create psychological platforms for the assessment of patients in clinical settings, as well as for their rehabilitation. The use of mobile applications, virtual reality, and psychophysiology in psychological science is also becoming ever more computationally oriented, integrating classification, automatic recognition, and machine-learning algorithms for measurement, as well as new ways to treat mental disorders (Villani et al., 2011, 2016; Michalski et al., 2013; Serino et al., 2013; Cipresso et al., 2017).
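The network view of mental disorders treats symptoms as nodes and their mutual influences as edges, with centrality indices flagging symptoms that may sustain the disorder. The sketch below uses invented symptom names and edges purely to show the mechanics of the simplest such index, degree centrality:

```python
# Sketch of a symptom network: nodes are symptoms, edges are hypothesized
# mutual influences. All names and edges are invented for illustration.
edges = [
    ("insomnia", "fatigue"),
    ("fatigue", "concentration"),
    ("insomnia", "concentration"),
    ("concentration", "worry"),
    ("worry", "insomnia"),
]

def degree_centrality(edges):
    """Number of connections per node in an undirected network."""
    degree = {}
    for a, b in edges:
        degree[a] = degree.get(a, 0) + 1
        degree[b] = degree.get(b, 0) + 1
    return degree

deg = degree_centrality(edges)
central = max(deg, key=deg.get)   # a most-connected symptom
```

Research in this area uses richer estimators (e.g., regularized partial correlations) to infer the edges from data, but the structural idea is the same.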

From this perspective, pervasiveness and unobtrusiveness are the keys for the integration of computational technologies and methods in psychological tools.

Toward the challenges of the twenty-second century

Considering the current development of psychometric tools for quantitative psychology and measurement, we posit that the first two decades of the twenty-first century have pointed to a future in which human-computer confluence is possible at both the methodological and practical levels (Figure 1). It seems clear that neuropsychological assessment and psychological evaluation will be based on technologies and computational methods, but we can expect even more for the future. The pervasiveness of low-cost and high-end technologies is exploding, and we can expect that, in the next few decades, they will be integrated further into our daily lives and into objects (e.g., the Internet of Things, IoT). They will be so unobtrusive as to be invisible to their users, e.g., contactless biosensors that can record physiological signals without any patches or sensors on the body (acting from a distance or through conductive objects, such as a chair that records the patient's ECG from his or her back, or a mouse that measures the conductance level of human skin) (Cipresso et al., 2013; Cipresso, 2015).

Figure 1

Data fusion and computation from the available sources.

Further, we can expect the use of many other technologies and methods. For example, we cannot exclude a “personalized psychology,” analogous to the well-known “personalized medicine,” that uses genetic information for the understanding of functional and dysfunctional behavior.

In any case, the use of new technologies and new methods can only be driven by new psychologists, in particular new psychometricians, who rely on the current knowledge of psychological science but can also build new ways of thinking about psychological settings, experiments, studies, and, above all, interventions. These capabilities will provide a deeper understanding of human behavior and lead to improvements in the well-being of humankind.

Author contributions

PC and JI conceived the idea. PC wrote the manuscript. PC and JI revised the manuscript and approved the final version.

Conflict of interest statement

The authors declare that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

  • Blascovich J. (2014). Using physiological indexes in social psychological research, in Handbook of Research Methods in Social and Personality Psychology, eds Reis H. T., Judd C. M. (Cambridge, UK: Cambridge University Press; ), 101–122. [ Google Scholar ]
  • Blumenthal T. D., Cuthbert B. N., Filion D. L., Hackley S., Lipp O. V., van Boxtel A. (2005). Committee report: guidelines for human startle eyeblink electromyographic studies. Psychophysiology 42, 1–15. 10.1111/j.1469-8986.2005.00271.x [ DOI ] [ PubMed ] [ Google Scholar ]
  • Brooks F. P., Jr. (1999). What's real about virtual reality? Comput. Graphics Appl. IEEE 19, 16–27. 10.1109/38.799723 [ DOI ] [ Google Scholar ]
  • Camastra F., Vinciarelli A. (eds.). (2008). Machine learning for audio, image and video analysis, in Advanced Information and Knowledge Processing (Berlin; Heidelberg: Springer; ), 83–89. [ Google Scholar ]
  • Cattaneo A., Riva M. (2016). Stress-induced mechanisms in mental illness: a role for glucocorticoid signalling. J. Steroid Biochem. Mol. Biol. 160, 169–174. 10.1016/j.jsbmb.2015.07.021 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Cipresso P. (2015). Modeling behavior dynamics using computational psychometrics within virtual worlds. Front. Psychol. 6:1725. 10.3389/fpsyg.2015.01725 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Cipresso P., Gaggioli A., Serino S., Pallavicini F., Raspelli S., Grassi A., et al. (2012b). EEG alpha asymmetry in virtual environments for the assessment of stress-related disorders. Stud. Health Technol. Inform. 173, 102–104. 10.3233/978-1-61499-022-2-102 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Cipresso P., Matic A., Lopez G., Serino S. (2017). Computational paradigms for mental health. Comput. Math. Methods Med. 2017:5607631. 10.1155/2017/5607631 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Cipresso P., Riva G. (2016). Personality assessment in ecological settings by means of virtual reality, in The Wiley Handbook of Personality Assessment, ed Kumar U. (Hoboken, NJ: John Wiley & Sons; ), 240–248. [ Google Scholar ]
  • Cipresso P., Serino S. (2014). Virtual Reality: Technologies, Medical Applications and Challenges. Hauppauge, NY: Nova Science Publishers. [ Google Scholar ]
  • Cipresso P., Serino S., Gaggioli A., Albani G., Riva G. (2013). Contactless bio-behavioral technologies for virtual reality. Ann. Rev. Cyberther. Telemed. 191, 149–153. 10.3233/978-1-61499-282-0-149 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Cipresso P., Serino S., Villani D., Repetto C., Sellitti L., Albani G., et al. (2012a). Is your phone so smart to affect your state? an exploratory study based on psychophysiological measures. Neurocomputing 84, 23–30. 10.1016/j.neucom.2011.12.027 [ DOI ] [ Google Scholar ]
  • Dickerson S. S., Kemeny M. E. (2004). Acute stressors and cortisol responses: a theoretical integration and synthesis of laboratory research. Psychol. Bull. 130, 355–391. 10.1037/0033-2909.130.3.355 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Fried E. I., Cramer A. O. (2016). Moving forward: challenges and directions for psychopathological network theory and methodology. Perspect. Psychol. Sci. 12, 999–1020. 10.1177/1745691617705892 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Friedenberg J. (2009). Dynamical Psychology: Complexity, Self-Organization and Mind. Marblehead, MA: ISCE Publishing. [ Google Scholar ]
  • Gaggioli A., Pioggia G., Tartarisco G., Baldus G., Corda D., Cipresso P., et al. (2013). A mobile data collection platform for mental health research. Pers. Ubiquitous Comput. 17, 241–251. 10.1007/s00779-011-0465-2 [ DOI ] [ Google Scholar ]
  • Giakoumis D., Drosou A., Cipresso P., Tzovaras D., Hassapis G., Gaggioli A., et al. (2012). Using activity-related behavioural features towards more effective automatic stress detection. PLoS ONE 7:e43571. 10.1371/journal.pone.0043571 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Gomez-Marin A., Mainen Z. F. (2016). Expanding perspectives on cognition in humans, animals, and machines. Curr. Opin. Neurobiol. 37, 85–91. 10.1016/j.conb.2016.01.011 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Gomez-Marin A., Paton J. J., Kampff A. R., Costa R. M., Mainen Z. F. (2014). Big behavioral data: psychology, ethology and the foundations of neuroscience. Nat. Neurosci. 17, 1455–1462. 10.1038/nn.3812 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Guastello S. J., Koopmans M., Pincus D. (2009). Chaos and Complexity in Psychology. Cambridge: Cambridge University. [ Google Scholar ]
  • Heim M. (1993). The Metaphysics of Virtual Reality. Oxford, UK: Oxford University Press. [ Google Scholar ]
  • Holden M. K. (2005). Virtual environments for motor rehabilitation: review. Cyberpsychol. Behav. 8, 187–211. 10.1089/cpb.2005.8.187 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Jones P. J., Heeren A., McNally R. J. (2017). Commentary: a network theory of mental disorders. Front. Psychol. 8:1305. 10.3389/fpsyg.2017.01305 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Kane R. L., Parsons T. D. (2017). The Role of Technology in Clinical Neuropsychology. Oxford, UK: Oxford University Press. [ Google Scholar ]
  • Krakauer J. W., Ghazanfar A. A., Gomez-Marin A., MacIver M. A., Poeppel D. (2017). Neuroscience needs behavior: correcting a reductionist bias. Neuron 93, 480–490. 10.1016/j.neuron.2016.12.041 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Kreibig S. D., Samson A. C., Gross J. J. (2015). The psychophysiology of mixed emotional states: internal and external replicability analysis of a direct replication study. Psychophysiology 52, 873–886. 10.1111/psyp.12425 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Lamson R. (1994). Virtual therapy of anxiety disorders. CyberEdge J. 4, 1–28. [ Google Scholar ]
  • Lang P. J. (1995). The emotion probe - studies of motivation and attention. Am. Psychol. 50, 372–385. 10.1037/0003-066X.50.5.372 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Lazzarino A. I., Hamer M., Gaze D., Collinson P., Steptoe A. (2013). The association between cortisol response to mental stress and high-sensitivity cardiac troponin T plasma concentration in healthy adults. J. Am. Coll. Cardiol. 62, 1694–1701. 10.1016/j.jacc.2013.05.070 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Lewandowski A., Baker W. J., Sewick B., Knippa J., Axelrod B., McCaffrey R. J. (2016). Policy statement of the American board of professional neuropsychology regarding third party observation and the recording of psychological test administration in neuropsychological evaluations. Appl. Neuropsychol. Adult. 23, 391–398. 10.1080/23279095.2016.1176366 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Malik M. (1996). Task force of the European society of cardiology and the north American society of pacing and electrophysiology. Heart rate variability. standards of measurement, physiological interpretation, and clinical use. Eur. Heart. J. 17, 354–381. 10.1093/oxfordjournals.eurheartj.a014868 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Malloy K. M., Milling L. S. (2010). The effectiveness of virtual reality distraction for pain reduction: a systematic review. Clin. Psychol. Rev. 30, 1011–1018. 10.1016/j.cpr.2010.07.001 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Mauri M., Magagnin V., Cipresso P., Mainardi L., Brown E. N., Cerutti S., et al. (2010). Psychophysiological signals associated with affective states, Paper presented at the Engineering in Medicine and Biology Society (EMBC), 2010 Annual International Conference of the IEEE (Buenos Aires: ). [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • McCann R. A., Armstrong C. M., Skopp N. A., Edwards-Stewart A., Smolenski D. J., June J. D., et al. (2014). Virtual reality exposure therapy for the treatment of anxiety disorders: an evaluation of research quality. J. Anxiety Disord. 28, 625–631. 10.1016/j.janxdis.2014.05.010 [ DOI ] [ PubMed ] [ Google Scholar ]
  • McEwen B. S., Sapolsky R. M. (1995). Stress and cognitive function. Curr. Opin. Neurobiol. 5, 205–216. 10.1016/0959-4388(95)80028-X [ DOI ] [ PubMed ] [ Google Scholar ]
  • McGaugh J. L. (2016). Emotions and Bodily Responses: A Psychophysiological Approach. Cambridge, MA: Academic Press. [ Google Scholar ]
  • Michalski R. S., Carbonell J. G., Mitchell T. M. (2013). Machine Learning: An Artificial Intelligence Approach. Berlin; Heidelberg: Springer Science & Business Media. [ Google Scholar ]
  • Miller G. E., Chen E., Zhou E. S. (2007). If it goes up, must it come down? chronic stress and the hypothalamic-pituitary-adrenocortical axis in humans. Psychol. Bull. 133, 25–45. 10.1037/0033-2909.133.1.25 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Miller J. H., Page S. E. (2009). Complex Adaptive Systems: An Introduction to Computational Models of Social Life. Princeton, NJ: Princeton University Press. [ Google Scholar ]
  • Myung I. J. (2000). The importance of complexity in model selection. J. Math. Psychol. 44, 190–204. 10.1006/jmps.1999.1283 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Nisbett R. E., Wilson T. D. (1977). Telling more than we can know: verbal reports on mental processes. Psychol. Rev. 84:231 10.1037/0033-295X.84.3.231 [ DOI ] [ Google Scholar ]
  • North M. M., North S., Coble J. (1996). Virtual Reality Therapy. Ann Arbor, MI: IPI Press. [ Google Scholar ]
  • Ockenfels M. C., Porter L., Smyth J., Kirschbaum C., Hellhammer D. H., Stone A. A. (1995). Effect of chronic stress associated with unemployment on salivary cortisol: overall cortisol levels, diurnal rhythm, and acute stress reactivity. Psychosom. Med. 57, 460–467. 10.1097/00006842-199509000-00008 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Popoli M., Yan Z., McEwen B. S., Sanacora G. (2012). The stressed synapse: the impact of stress and glucocorticoids on glutamate transmission. Nat. Rev. Neurosci. 13, 22–37. 10.1038/nrn3138 [ DOI ] [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Fries G. R., Gassen N. C., Schmidt U., Rein T. (2016). The FKBP51-glucocorticoid receptor balance in stress-related mental disorders. Curr. Mol. Pharmacol. 9, 126–140. 10.2174/1874467208666150519114435 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Riva G. (2003). Applications of virtual environments in medicine. Methods Inf. Med. 42, 524–534. [ PubMed ] [ Google Scholar ]
  • Riva G. (2005). Virtual reality in psychotherapy: review. Cyberpsychol. Behav. 8, 220–230. 10.1089/cpb.2005.8.220 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Riva G., Davide F., IJsselsteijn W. A. (2003). Being There: Concepts, Effects and Measurements of User Presence in Synthetic Environments. Amsterdam: IOS Press. [ Google Scholar ]
  • Riva G., Gaggioli A., Grassi A., Raspelli S., Cipresso P., Pallavicini F., et al. (2011). NeuroVR 2–a free virtual reality platform for the assessment and treatment in behavioral health care. Stud. Health Technol. Inform. 163, 493–495. 10.3233/978-1-60750-706-2-493 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Riva G., Mantovani F., Waterworth E. L., Waterworth J. A. (2015). Intention, Action, Self and Other: An Evolutionary Model of Presence Immersed in Media. Berlin; Heidelberg: Springer. [ Google Scholar ]
  • Riva G., Waterworth J. (2014). Being present in a virtual world, in Oxford Handb. Virtual, ed Grimshaw M. (Oxford: Oxford University Press; ), 205. [ Google Scholar ]
  • Riva G., Waterworth J., Murray D. (2014). Interacting with Presence: HCI and the Sense of Presence in Computer-Mediated Environments. Berlin: Walter de Gruyter GmbH. [ Google Scholar ]
  • Rothbaum B. O., Hodges L., Kooper R. (1997). Virtual reality exposure therapy. J. Psychother. Pract. Res. 6, 219–226. [ PMC free article ] [ PubMed ] [ Google Scholar ]
  • Russell J. A. (1979). Affective space is bipolar. J. Pers. Soc. Psychol. 37, 345–356. 10.1037/0022-3514.37.3.345 [ DOI ] [ Google Scholar ]
  • Ryan M.-L. (2001). Narrative as Virtual Reality. Londres: Parallax. [ Google Scholar ]
  • Schultheis M. T., Rizzo A. A. (2001). The application of virtual reality technology in rehabilitation. Rehabil. Psychol. 46:296 10.1037/0090-5550.46.3.296 [ DOI ] [ Google Scholar ]
  • Serino S., Cipresso P., Gaggioli A., Riva G. (2013). The Potential of Pervasive Sensors and Computing for Positive Technology: the Interreality Paradigm Pervasive and Mobile Sensing and Computing for Healthcare. Berlin; Heidelberg: Springer. [ Google Scholar ]
  • Sherman W. R., Craig A. B. (2002). Understanding Virtual Reality: Interface, Application, and Design. Amsterdam: Elsevier. [ Google Scholar ]
  • Singh A., Petrides J. S., Gold P. W., Chrousos G. P., Deuster P. A. (1999). Differential hypothalamic-pituitary-adrenal axis reactivity to psychological and physical stress 1. J. Clin. Endocrinol. Metab. 84, 1944–1948. 10.1210/jc.84.6.1944 [ DOI ] [ PubMed ] [ Google Scholar ]
  • Thompson T. P. (2016). EEG in Depth: The Intersection of Electroencephalography and Depth Psychology. Carpinteria, CA: Pacifica Graduate Institute. [ Google Scholar ]
  • Villani D., Cipresso P., Gaggioli A., Riva G. (Eds). (2016). Integrating Technology in Positive Psychology Practice. Hershey, PA: IGI Global. [ Google Scholar ]
  • Villani D., Grassi A., Cognetta C., Cipresso P., Toniolo D., Riva G. (2011). The effects of a mobile stress management protocol on nurses working with cancer patients: a preliminary controlled study. Stud. Health Technol. Inform. 173, 524–528. 10.3233/978-1-61499-022-2-524 [ DOI ] [ PubMed ] [ Google Scholar ]

HYPOTHESIS AND THEORY article

Quantitative methods in psychology: inevitable and useless.


Aaro Toomela*

  • Institute of Psychology, Tallinn University, Tallinn, Estonia

Science begins with the question, what do I want to know? Science becomes science, however, only when this question is justified and the appropriate methodology is chosen for answering the research question. The research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology has accepted method as primary; research questions are adjusted to the methods. For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: the structural-systemic, based on Aristotelian thinking, and the associative-quantitative, based on Cartesian–Humean thinking. The first aims at understanding the structure that underlies the studied processes; the second looks for cause–effect relationships between events, with no possible access to an understanding of the structures that underlie the processes. Quantitative methodology in particular, and mathematical psychology in general, is useless for answering questions about the structures and processes that underlie observed behaviors. Nevertheless, quantitative science is almost inevitable in a situation where the systemic-structural basis of behavior is not well understood; all sorts of applied decisions can be made on the basis of quantitative studies. In order to proceed, psychology should study structures; methodologically, constructive experiments should be added to observations and analytic experiments.

Science begins with questions. Everybody can have questions, and even answers to them. What makes science special is its method of answering questions. Therefore, a scientist must ask questions both about the phenomenon to be understood and about the method. There are actually not one or two but four principal questions that should be asked by every scientist when conducting studies (Toomela, 2010b):

1. What do I want to know, what is my research question?

2. Why do I want to have an answer to this question?

3. With what specific research procedures (methodology in the strict sense of the term) can I answer my question?

4. Are the answers to the first three questions complementary; do they make a coherent, theoretically justified whole?

First, there should be a question about some phenomenon that needs an answer. Next, the need for an answer should be justified – in science it is quite possible to ask “wrong” questions, whose answers do not help in understanding the studied phenomena. Vygotsky (1982a), in his colorful language, gave an ironic example of answering scientifically wrong questions:

One can multiply the number of citizens of Paraguay by the number of versts [an obsolete Russian unit of length] from the Earth to the Sun and divide the result by the average length of life of an elephant, and conduct this whole operation without a flaw in even one number; and yet the number found in the operation can confuse anybody who would like to know the national income of that country

(p. 326; my translation).

It can be said that modern psychology is more advanced than the science of Vygotsky’s time; perhaps the questions asked in modern science are meaningful. This opinion, however, may be wrong. One source of wrong questions about the studied phenomena is an unsatisfactory answer to the last question – when the answers to the first three questions do not agree with one another. In this paper I am going to suggest that psychology asks “wrong” questions far too often. The problem is related to a mismatch in the answers to the first and third questions. Specifically, the quantitative methodology that dominates psychology today is not appropriate for achieving understanding of mental phenomena, the psyche.

The number of substantial problems with quantitative methods brought out by scholars is increasing every year. Already one observation should make scientists cautious. The questions provided above are in a certain order – first we should have a question about the phenomenon, and only then should the appropriate method for finding an answer be sought. A substantial part of modern psychology follows the opposite order of decisions – first it is decided to use quantitative methods, and the question about the phenomenon is then formulated in the language of data analysis. Between 1940 and 1955, statistical data analysis became the indispensable tool for hypothesis testing; with this change of scientific methodology, statistical methods began to turn into theories of mind. Instead of looking for a theory that could perhaps be elaborated with the help of statistical tools, statistical tools began to determine the shape of theories (Gigerenzer, 1991, 1993).

For instance, a researcher may ask how many factors emerge in the analysis of personality or intelligence test results. But why look for a number of factors if it is personality or intelligence that is being studied? We may guess that the original question was something like: is it possible to identify distinguishable components in the structure of personality or intelligence? However, the decision to use factor analysis for that purpose must be justified before this method is chosen. This justification seems to be missing; it is only a hypothesis – an ungrounded hypothesis – that factor analysis is an appropriate tool for identifying distinct mental processes that underlie behavioral data (filling in a questionnaire is behavior). Problems emerge already with the determination of the number of factors to retain. There are formal and substantial criteria for that. Formal decisions are based on Kaiser’s criterion, Cattell’s scree test, Velicer’s Minimum Average Partial test, or Horn’s parallel analysis. There is no evidence that any of these criteria is actually suitable for distinguishing the number of distinct processes that underlie behavior. Researchers also decide the number of factors on the basis of comprehensibility – the solution that generates the most comprehensible factor structure is chosen. But this substantial criterion is always applied after the formal criteria; nobody starts from the possibility that all, say, 248 items of an inventory correspond to 248 distinct mental processes. The number of factors usually retained – from two to six or seven – seems to correspond to the processing limitations of the researcher’s working memory rather than to the true structure of the mind.
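To make the formal criteria concrete, here is a minimal sketch of two of them – Kaiser’s eigenvalue-greater-than-one rule and Horn’s parallel analysis – applied to synthetic data with a built-in two-factor structure. The simulation setup, sample size, and noise level are illustrative assumptions, not part of the original argument; note that both criteria recover only the number of factors planted in the data, not anything about the processes that generated it.

```python
import numpy as np

rng = np.random.default_rng(0)

# Synthetic data: 6 observed variables driven by 2 latent factors plus noise.
n = 500
f1, f2 = rng.normal(size=(2, n))
noise = 0.3 * rng.normal(size=(6, n))
data = np.vstack([f1, f1, f1, f2, f2, f2]) + noise  # shape (6, n)

# Eigenvalues of the correlation matrix of the observed variables, descending.
eigvals = np.sort(np.linalg.eigvalsh(np.corrcoef(data)))[::-1]

# Kaiser's criterion: retain factors with eigenvalue > 1.
kaiser_k = int(np.sum(eigvals > 1.0))

# Horn's parallel analysis: retain factors whose eigenvalues exceed the
# mean eigenvalues obtained from correlation matrices of pure-noise data.
sims = np.array([
    np.sort(np.linalg.eigvalsh(np.corrcoef(rng.normal(size=(6, n)))))[::-1]
    for _ in range(100)
])
parallel_k = int(np.sum(eigvals > sims.mean(axis=0)))

print(kaiser_k, parallel_k)  # both criteria retain 2 factors here
```

With the strong two-block structure above, both rules agree; with weaker or mixed loadings they routinely disagree, which is exactly the arbitrariness the text points to.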

In this paper, the epistemological issues that underlie the quantitative methods used in psychology are discussed. I suggest that, regardless of the research area in psychology, mathematical procedures of any kind cannot answer questions about the structure of mind. The discussion focuses primarily on statistical methodology as used in psychology today; yet there are fundamental problems inherent to other kinds of mathematical approaches as well. My intention is not to suggest that scientific studies of mind should reject mathematical approaches. Rather, it should be made clear which questions can be answered with the help of mathematical methods and which cannot.

Which Questions Can and Which Cannot be Answered by Statistical Data Analysis Procedures?

We should look for the reasons to use statistical data analysis in the works of those who introduced quantitative methodology into the sciences in general and into psychology in particular. Today, as a rule, users and developers of statistical data analysis procedures no longer ask which questions can and which cannot be answered with the help of those procedures. The scholars who introduced mathematical procedures, however, made it clear what kinds of answers they were looking for. We will see that these scholars would reject the questions answered by statistical procedures today, for reasons that are largely ignored by modern researchers without any scientific justification. One of the most influential figures in introducing factor analysis into psychology was Thurstone. There are several ideas in his fundamental work The vectors of mind that are worthy of attention ( Thurstone, 1935 ). These ideas, for the most part, underlie not only the use of factor analysis but the use of all forms of covariation-based data analysis procedures.

What are the (Statistical) Causes of Relationships Between Variables?

Thurstone suggested that the object of factor analysis is to discover the mental faculties. It is interesting that for him factor analysis alone would never have been sufficient for proving that a new faculty had been discovered – he held the position that the results of factor analysis must be supported by experimental observations. In another work he found that in some fields of study tests are used that are not tests at all – “They are only questionnaires in which the subject controls the answers completely. It would probably be very fruitful to explore the domain of temperament with experimental tests instead of questionnaires.” ( Thurstone, 1948 , p. 406). So, Thurstone would very likely reject the modern practice of studying many psychological phenomena – personality, values, attitudes, mental states, etc. – with questionnaires alone. There seems to be no theory that would justify studying the structure of mind by questionnaires only. Without a theory that links subjectively controlled patterns of answers to the objective structure of mind, the results of all such studies are not grounded. A thorough analysis of this issue is beyond the scope of the current paper.

We can ask what general question Thurstone aimed to answer with the help of factor analysis. Thurstone was not asking how mental faculties operate; he was looking for the identification of what he called abilities , i.e., traits (which are attributes of individuals) that are defined by what an individual can do ( Thurstone, 1935 , p. 48).

The same questions about the identification of “abilities” underlie the use not only of factor analysis but of other covariation-based statistical data-analysis procedures as well. Here it is worthwhile to go deeper into the roots of the introduction of quantitative data analysis into the sciences. Methods for calculating correlation coefficients entered the sciences around the middle of the 19th century but became popular with the works of Pearson (cf., 1896) . He formulated the tasks of statistical data analysis in the following way:

One of the most frequent tasks of the statistician, the physicist, and the engineer, is to represent a series of observations or measurements by a concise and suitable formula. Such a formula may either express a physical hypothesis, or on the other hand be merely empirical, i.e., it may enable us to represent by a few well selected constants a wide range of experimental or observational data. In the latter case it serves not only for purposes of interpolation, but frequently suggests new physical concepts or statistical constants.

( Pearson, 1902 , p. 266).

It is especially noteworthy that the formula being searched for represents observations or measurements – i.e., variables . This fact is so obvious that the consequences that follow from it are usually not thought through. The main problem related to the use of observations and measurements is that they do not necessarily reflect reality objectively; they are subjective interpretations of the world by the researcher. This is especially true in the situation where the observation – of external behavior, in psychology – is supposed to reflect the operation of a construct hidden from direct observation, a mental faculty. Thurstone acknowledged that externally similar behaviors can be based on internally different mechanisms; and he was only interested in finding formulas that express regularities in external behavior. Pearson essentially did the same; he assumed that with the help of correlations it is possible to get closer to the identification of the different causes of external regularities. Pearson also did not aim at describing how these causes operate. He, similarly to Thurstone, was looking to identify regularities (faculties, in Thurstone’s terms) in the observable cause → effect chains without claiming that a unique cause, hidden from direct observation, is necessarily identified. This limitation on the aim of statistical analyses can be found in many of his works, as in the following passage, for instance:

We shall now assume that the sizes of this complex of organs are determined by a great variety of independent contributory causes, for example, magnitudes of other organs not in the complex, variations in environment, climate, nourishment, physical training, various ancestral influences, and innumerable other causes, which cannot be individually observed or their effects measured.

( Pearson, 1896 , p. 262).

When Pearson correlated the sizes of organs he was, thus, aware that mathematical formulas that reflect certain commonalities in the variation of two variables do not reflect the unique roles of individual contributory causes; these causes determine the measured sizes in ways that are not known.

The general form of the question Pearson answered with statistical analyses can be formulated as follows: what is the value of a certain variable when we know the value of another variable that is correlated with the first? An example of this kind of use was provided by Pearson when he reconstructed “the parts of an extinct race from a knowledge of a size of a few organs or bones, when complete measurements have been or can be made for an allied and still extant race.” ( Pearson, 1899 , p. 170).
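In modern terms, Pearson’s question is answered by simple linear regression: fit a line on the fully measured “extant” sample, then read off the missing value. The sketch below uses invented numbers (the bone names, units, and the exact linear relation are illustrative assumptions, not Pearson’s data) purely to show the form of the procedure.

```python
import numpy as np

# Hypothetical paired measurements (arbitrary units) for an "extant" sample;
# the relation femur = 2*humerus + 3 is built in for illustration.
humerus = np.array([10.0, 12.0, 14.0, 16.0, 18.0])
femur = 2.0 * humerus + 3.0

# Least-squares fit: femur ≈ slope * humerus + intercept.
slope, intercept = np.polyfit(humerus, femur, 1)

# Predict the femur length of a specimen for which only the humerus survives.
predicted = slope * 15.0 + intercept
print(round(predicted, 2))  # → 33.0
```

Note what the procedure delivers: a value of one variable given another, and nothing about why the two covary – which is precisely the limitation discussed below.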

Correlation, in this case, can be understood as a representation of some abstract cause which “makes” the variables covary. Thus, the same question can be reformulated: is it possible to discover an abstract cause-like commonality of different variables that is expressed as covariation? I think Pearson was very clear in understanding that correlation reflects covariations of appearances; the true underlying causes of covariation, the mechanisms that determine how the covariation emerges, cannot be known through statistical procedures – there are many independent causal agents operating, “which cannot be individually observed or their effects measured.” At the same time, he could interpret covariations between variables in non-mathematical terms; he interpreted them as reflecting a common cause. For instance, he concluded on the basis of statistical analyses that fertility and fecundity are inherited characteristics ( Pearson et al., 1899 ) – he thus suggested that some non-mathematical factor, inheritance, underlies the correlations he discovered.

Thurstone went a step further and suggested that it is possible to find formulas for expressing patterns of covariations; factor analysis identifies “faculties” or “abilities” that underlie the correlations among several variables simultaneously. He was also clear that factor analysis expresses relationships between appearances; possible differences in the internal mechanisms that may underlie externally similar behaviors are not reflected in the results of factor analysis.

Limits on the Questions that Can be Answered with the Help of Statistical Data Analysis Procedures

Statistical theories reflect regularities only in appearances.

Pearson was fully aware of the limits of statistical theories. A theory that looks for mechanisms should be clearly distinguished from pure descriptions of regularities in superficial observations; statistical laws… “have nothing whatever to do with any physiological hypothesis” ( Pearson, 1904 , p. 55). According to him, “the statistical view of inheritance is not at basis a theory, but a description of observed facts, with which any physiological theory must be in accord” ( Pearson, 1903–1904 , p. 509). We know from the modern biological theory of inheritance how correct he was: the statistical laws he discovered really had nothing to do with the discovery of the structure of DNA, even though they may have directed biologists to look for a possible substrate of inheritance. It is also noteworthy that, contrary to Pearson, after the structure of DNA was discovered and the biological mechanisms of inheritance were explained, there was no need at all to check whether this structural theory accorded with Pearson’s laws; those laws became irrelevant for the theory.

Thurstone sought to discover mental faculties with the help of factor analysis. Similarly to Pearson, he did not assume that the discovered faculties could be directly related to mental operations; he did not assume a one-to-one correspondence between observed behaviors and the mechanisms that underlie them:

The attitudes of people on a controversial social issue have been appraised by allocating each person to a point in a linear continuum as regards his favorable or unfavorable affect toward the psychological object. Some social scientists have objected because two individuals may have the same attitude score toward, say, pacifism, and yet be totally different in their backgrounds and in the causes of their similar social views. If such critics were consistent, they would object also to the statement that two men have identical incomes, for one of them earns when the other one steals. They should also object to the statement that two men are of the same height. The comparison should be held invalid because one of the men is fat and the other is thin.

( Thurstone, 1935 , p. 47)

It was shown above that Thurstone was not asking how mental faculties operate; he was looking merely for the identification of abilities. There can thus be an ability to make money; and this ability is treated the same regardless of whether the income is made by earning or by stealing. The statistical procedures used by Thurstone aim at discovering an ability to make income, for instance, but they provide no clue as to the mechanisms of the income-making. So, if there is a phenomenon like income in the world, perhaps factor analysis would be helpful for discovering it.

There are, however, reasons to partly disagree with Thurstone’s interpretation of this procedure. He suggested, for example, that if social scientists were fully consistent, they would have to consider incomes from different sources to be different; he disagreed with this idea. Essentially, Thurstone seems to assume that it is possible to isolate a phenomenon from the world and study it, after isolation, as a thing in itself. In the real world, however, a thing that existed completely isolated from the world would be unknowable in principle, because we know the world only by being in relation with it. Income is, by definition, the amount of money or its equivalent received during a period of time. If we analyze the phenomenon of money, we discover that it is a relational phenomenon. Money is a medium of exchange and a unit of account; money, thus, is a phenomenon that mediates certain economic relationships. Outside society, money ceases to be money; it becomes just a physical object. Societies determine relations toward money in ways far more complex than purely economic ones. For instance, in modern democratic societies it would be legally possible to confiscate money that turns out to be stolen; but there are no societies where money would be confiscated because it was earned. So the incomes of two men, one of whom earns and the other of whom steals, are indeed not the same.

Thurstone would likely – and fairly – reject this critique by replying that “Every scientific construct limits itself to specified variables without any pretense to cover those aspects of a class of phenomena about which it has said nothing” ( Thurstone, 1935 , p. 47). By saying that a factor represents some isolated characteristic of the studied phenomenon, Thurstone retains the consistency of his approach. And this is exactly where the weakness of statistical theories lies: these theories are about regularities in appearances with no necessary connection to the underlying mechanisms. Thurstone, similarly to Pearson, was fully aware of this limitation:

This volume is concerned with methods of discovering and identifying significant categories in psychology and in other social sciences. […] It is the faith of all science that an unlimited number of phenomena can be comprehended in terms of a limited number of concepts or ideal constructs. […] The constructs in terms of which natural phenomena are comprehended are man-made inventions. To discover a scientific law is merely to discover that a man-made scheme serves to unify, and thereby to simplify, comprehension of a certain class of natural phenomena. A scientific law is not to be thought of as having an independent existence which some scientist is fortunate to stumble upon. A scientific law is not a part of nature. It is only a way of comprehending nature. […] While the ideal constructs of science do not imply physical reality, they do not deny the possibility of some degree of correspondence with physical reality. But this is a philosophical problem that is quite outside the domain of science.

( Thurstone, 1935 , p. 44).

If biologists had accepted this view, there would be no modern science of inheritance, for example. Modern biological theories do not look for “some degree of correspondence” between theories and physical reality; these theories aim at full correspondence. 1 In other words, scientific theories are not assumed to be human-made generalizations based on covariations between appearances, with no necessary connection to the reality that underlies those covariations. On the contrary, the aim of the sciences has become to understand exactly what Thurstone, Pearson, and other statistical theorists did not aim at – to understand phenomena as they exist, not as they seem to us.

Statistical theories of mechanisms depend on postulates that are not grounded and on conditions that are not satisfied

Modern quantitative psychology may sometimes claim that its aims are similar to those of modern biology or physics – the discovery of the mechanisms that underlie the appearances, the observable behaviors. The founders of statistical theorizing denied that this is possible by means of quantitative data analysis based on the analysis of covariations between variables; perhaps they missed something fundamental that makes possible what they declared impossible? Perhaps it has become possible to discover, for instance by means of factor analysis, the structure of mind as it is, and not just a man-made law that reflects only superficial covariations between observed events?

There are reasons to suggest that quantitative tools are not appropriate for this aim. The statistical data analysis procedures used in modern quantitative psychology are based on postulates that do not contradict the aims of statistical theorizing as Thurstone, Pearson, and their followers understood them. The same postulates, however, are incompatible with the aims of those who look for the properties of mind as it is, and not only for generalizations that can be made about any kind of observations.

Postulate of quantitative measurement. Modern psychology must postulate that the variables entered into analyses can be interpreted in terms of underlying mechanisms. Otherwise the interpretation of the results of analyses in terms of those mechanisms would not be valid. Reasons to doubt whether this postulate is actually true emerge already from Pearson’s works. Namely, he extended statistical theorizing to characteristics that cannot be quantified ( Pearson and Lee, 1900 ). For Thurstone and Pearson, it was not a problem that measured variables represent events with essentially unknown underlying causes, because they did not aim at understanding those causes; they just looked for descriptions of statistical regularities in different observations. So for them even the question of whether a variable represents something that can be quantified was not an issue. But it must be one of the first problems to solve if the aim is to understand physical or psychological reality – what exactly is encoded in the variables?

The psychology of today can be called pathological – many hypotheses are accepted as true without attempts being made to test them; the hypothesis that psychological attributes are quantitative is not tested in the psychology of today ( Michell, 2000 ). Worse, there is every reason to suggest that the attributes that are “measured” in psychology cannot be measured, because they are not quantitative (e.g., Essex and Smythe, 1999 ; Michell, 2010 ). Therefore, covariations between variables have, in principle, no meaningful interpretation as to the underlying mechanisms, because different levels of a variable may denote qualitatively different phenomena. This alone would be sufficient for rejecting interpretations of quantitative analyses in terms of underlying mechanisms. But there is more – as Thurstone also pointed out – externally similar behaviors can rely on internally different mechanisms. Thus even the same level on some variable may represent qualitatively different phenomena in different cases. It follows that under such circumstances no quantitative procedure can distinguish the qualitatively different mechanisms that may underlie externally identical behavior – a variable that encodes behavior independently of differences in (psychic) mechanisms simply does not contain information about the mechanism ( Toomela, 2008 ).

If a researcher were interested in distinguishing the psychological mechanisms of behavior, other procedures would be needed. A researcher would invent different methods to reveal differences in externally similar behaviors. For instance, in many situations it would be possible simply to ask the person directly for a justification of his or her behavior. It is important that the methods that must be created for discovering potential differences in externally similar behaviors can only be qualitative because, as we saw, the variables entered into quantitative data analyses lack the necessary information.

Postulate of continuity. There is another postulate underlying quantitative data analysis procedures. Thurstone, for instance, postulated:

The standard scores of all individuals in an unlimited number of abilities can be expressed, in first approximation, as linear functions of their standard scores in a limited number of abilities

( Thurstone, 1935 , p. 50).

So, there is a postulate that linear functions characterize the relationships between abilities (i.e., mental faculties) and individual acts of behavior. The main question that should be answered here concerns not only the postulate of linearity – the same problem would arise with non-linear relationships between variables – but the postulate of continuity that is implicit in the postulate of linearity. If it turned out that some relationships between events are in essence qualitative, then no factor analysis, nor any other kind of quantitative data manipulation, could reveal those qualitative aspects of change.

Qualitative relationships often hold between events. The lack of one nucleotide in a gene may be related to qualitatively different processes of synthesis of the protein related to that gene. One extra chromosome does not just end up with more proteins; it ends up with qualitatively different pathologies, depending on the chromosome. It is also not meaningful to postulate a continuous quantitative series of events along the following continuum: one chromosome missing – the normal number of chromosomes – one extra chromosome in addition to the normal set.
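The statistical consequence of ignoring such qualitative breaks can be sketched numerically (a constructed example; the variables are invented for illustration, not taken from the text): a relationship that is in essence a qualitative threshold still yields a high linear correlation, so the coefficient alone cannot distinguish a genuinely continuous relation from a qualitative jump.

```python
import numpy as np

# x varies continuously; y is a purely qualitative (step) function of x.
x = np.linspace(-1.0, 1.0, 2001)
y = (x > 0).astype(float)

# Pearson correlation treats the step as if it were a strong linear trend.
r = np.corrcoef(x, y)[0, 1]
print(round(r, 3))  # ≈ 0.866: a "strong linear relationship", even though
                    # the underlying relation is a qualitative threshold
```

A researcher who sees only r ≈ 0.87 has no way of telling, from the coefficient itself, that the relation is a jump rather than a gradual trend.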

Postulate of correspondence between inter-individual and intra-individual levels of analysis. In modern psychology, it is often assumed that intra-individual faculties can be revealed by studying inter-individual differences. This is a major problem with all theories about individual attributes that are based on studies of differences between individuals: differences between individuals do not reflect distinctions inside individual minds (e.g., Lewin, 1935 ; Epstein, 1980 ; Toomela, in press-b ).

Several quantitative scholars have provided substantial reasons why inter-individual differences cannot ground interpretations at the intra-individual level. They propose that quantitative analyses should be conducted with variables that encode intra-individual variability (e.g., Molenaar, 2004 ; Hamaker et al., 2005 ; Molenaar and Valsiner, 2005 ; Nesselroade et al., 2007 ; Boker et al., 2009 ). This approach, however, still assumes continuity and quantifiability. Before analyzing intra-individual variability, it must be demonstrated that the attributes encoded as variables can be quantified at all. The variables used in intra-individual analyses, however, are usually based on the scores of the same tests and inventories as those used in inter-individual analyses. Therefore, conducting analyses at the intra-individual level still cannot ground interpretations about attributes of mind. Another problem with the intra-individual quantitative approach follows from its assumption that data collected over time reflect qualitatively the same processes. This assumption is in many cases wrong. A person answering the same question repeatedly does not necessarily rely on the same mental operations – already the second time the same question is asked, the person may answer in a certain way because he remembers answering the same question before. Data encoded as variables, again, do not reflect such qualitative changes in the mental operations that underlie externally similar answers (see also Toomela, 2008 , in press-b ).

Postulate of interpretability of covariations between variables. Modern quantitative psychology also assumes that the components of mental attributes can be discovered by analyzing covariations of variables. This postulate is questionable as well. As a rule, qualitatively different wholes emerge from the same elements in qualitatively different relationships. Quantitative data analysis, however, is not suited to taking the quality of relationships into account.

Human language, for instance, is based on units – words – that are composed of a limited number of sounds or letters in different relationships. We can take a series of events, words, and find perfect covariation between variables, the sounds, in those events. Let us take, for instance, a series of events – words – this-shit-hits-pool-loop-polo. We create the following data file from our observation of those six cases, so that the variables represent the presence or absence of letters in each event/word:

word   t  h  i  s  p  o  l
this   1  1  1  1  0  0  0
shit   1  1  1  1  0  0  0
hits   1  1  1  1  0  0  0
pool   0  0  0  0  1  1  1
loop   0  0  0  0  1  1  1
polo   0  0  0  0  1  1  1

We could run many different statistical analyses on these data and would not get any closer to understanding what is happening. Perhaps we would discover that all the variables are perfectly correlated; we would discover that this data set can be perfectly “explained” by one factor, etc. Statistically, such results would be a perfect dream for a quantitative scientist. And yet all this would have no meaning. The data in the table show where the problem is – the first three cases, qualitatively different words, become identical after quantification, as do the last three. Here we know that the cases are not identical; we do not know it when solving ordinary scientific problems. In any case, quantification of data into variables that ignores the possibility of qualitatively different relationships between variables ends up in nonsense whenever qualitatively different wholes emerge from the same attributes encoded as variables.
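The statistical properties claimed for this example can be checked directly. The sketch below (a constructed verification of the table above, using numpy) builds the binary letter-presence matrix and confirms that every pair of letter-variables is perfectly correlated and that a single component accounts for all the variance:

```python
import numpy as np

# Rows = words; columns = presence (1) or absence (0) of letters t,h,i,s,p,o,l.
words = ["this", "shit", "hits", "pool", "loop", "polo"]
letters = "thispol"
data = np.array([[1.0 if c in w else 0.0 for c in letters] for w in words])

# Every pair of letter-variables is perfectly correlated (+1 or -1) ...
corr = np.corrcoef(data, rowvar=False)
print(np.all(np.abs(corr) > 0.999))  # True

# ... so a single component "explains" all of the variance (eigenvalue 7
# out of 7 variables), yet the words themselves remain distinct wholes.
eigvals = np.linalg.eigvalsh(corr)
print(round(eigvals.max(), 6))
```

The one-factor solution is statistically perfect and semantically empty: the factor captures nothing about which word – which whole – each row actually is.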

Now it can be objected that such phenomena are perhaps not common. Nothing could be further from the truth – the world around us provides a massive number of examples where the same elements in different relationships “cause” the emergence of qualitatively different wholes. The structure of DNA and its relationship to protein synthesis in a cell is one example; all chemical substances that are composed of the same elements in different relationships would be examples; so would different tools that can be made from the same material, or different houses that can be built from the same stones; money that is earned and money that is stolen are also not the same, etc.

Conditions that are not satisfied. Over the last decade or two, an increasing number of substantial problems with statistical data analysis have been revealed. Some of them I have already mentioned above. But the list is by no means complete. For instance, there are fundamental problems in interpreting variables that encode behavioral data ( Toomela, 2008 ). Problems with interpretation emerge when (1) variables contain information about events at different levels of analysis; (2) a wrong attribute, among the many that characterize the observed event, is chosen for encoding into a variable; (3) the measurement tool is not sufficiently sensitive (i.e., certain behaviors, and the mental phenomena underlying them, exist but are not represented in the tools that are supposed to “measure” the mental phenomenon); (4) the studied phenomenon is absolutely necessary and therefore does not vary; (5) variables represent variability that emerges because of the properties of the test or questionnaire rather than because the phenomenon really varies; and (6) the variable does not encode variability in the causally relevant range.

The results of statistical data analyses cannot be interpreted in terms of the processes that underlie observed behaviors unless the meaning of the variables is clear. This condition is not satisfied in psychology. If the meaning of a variable is not clear, then statistical data analysis may end up demonstrating misleading dependencies or misleading independencies. Common textbooks of statistical data analysis all agree that the discovery of a dependency between variables cannot be interpreted causally; these textbooks usually do not mention that the absence of a dependency also cannot be unequivocally interpreted – statistical independence of variables does not demonstrate the absence of causal connections. If neither dependence nor independence can be unequivocally interpreted, the results of statistical data analyses cannot be taken as evidence for or against causal connections.
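A minimal constructed illustration of the last point (the variables are invented for the demonstration): below, y is fully and deterministically caused by x, yet the linear (Pearson) correlation between them is zero, so the absence of linear dependence says nothing about the absence of a causal connection.

```python
import numpy as np

# y is caused by x deterministically, but the relation is non-monotonic.
x = np.linspace(-1.0, 1.0, 201)
y = x ** 2

# Linear correlation is (numerically) zero despite complete causal dependence.
r = np.corrcoef(x, y)[0, 1]
print(abs(r) < 1e-10)  # True
```

Strictly, zero Pearson correlation is weaker than full statistical independence, but it is the operationalization of "no dependence" used in most of the textbook analyses the passage refers to, and it already fails as evidence against causation.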

Remark on Other Kinds of Questions – Questions that Cannot be Answered Statistically

Quantitative psychology asks questions about patterns of relationships between variables; the main question to be answered by such analyses is whether it is possible to identify some faculty, some ability, some cause that underlies observed behavior. In the discussion above I brought up, again and again, examples from biology and chemistry, where the format of the questions is different. In addition (not instead!) to asking whether a certain cause can be identified, questions are asked about the structure of the studied phenomena – what elements, in which particular relationships, underlie the emergence of the whole phenomenon one aims to understand. Quantitative data manipulations cannot reveal structure because, in structures, the qualities of the elements and the qualities of the relationships between the elements determine the whole.

Altogether, there is not one epistemology underlying science but two; one looks for the identification of cause → effect relationships and the other aims at a structural-systemic description of the phenomena under study (see more on these two epistemologies in, e.g., Toomela, 2009 , 2010a , in press-a ). These two epistemologies are rooted in philosophy. Next, a very short description of the philosophical roots of these epistemologies is provided. It turns out that modern quantitative psychology is based on Cartesian–Humean epistemology, whereas modern biology, chemistry, and several other sciences are based on Aristotelian epistemology. Furthermore, psychology pretends to be like the other sciences and superficially aims at understanding the reality that underlies appearances. This, however, is impossible. We will see that in psychology there is a fundamental mismatch between the questions asked and the methods used to answer them.

Two Epistemologies

The two epistemologies that underlie different views on science are first of all distinct in their understanding of what cause and causality are. The history of the notion of causality is complex; philosophers and scientists have formulated a wide variety of theories of causation, each substantively different from the others. A nice summary of the different definitions of causality can be found in Chambers’ Cyclopaedia ( Chambers, 1728a , b ). Under the entry “CAUSE” there is a First Cause and there are Second Causes, and many more. Under “Causes in the School Philosophy” there are: (1) Efficient causes; (2) Material causes; (3) Formal causes; (4) Final causes; (5) Exemplary causes. In another way, again, “Causes” are distinguished into Physical, Natural, and Moral. Or, in yet another way, “Causes” are considered as Universal or Particular; Principal or Instrumental; Total or Partial; Univocal or Equivocal, etc. Two prominent views on causes and causality are relevant in the context of this paper.

Aristotle suggested that to know causes means to explain, to know “why” (e.g., Aristotle, 1941c , p. 240, Bk.II, 194b). This knowledge of causes is not just knowledge, it is scientific knowledge: “We think we have scientific knowledge when we know the cause” ( Aristotle, 1941b , p. 170, Bk.II, 94a). So, we can say that the aim of the sciences is to understand what the causes of the studied phenomena are.

Aristotle distinguished four kinds of causes, describing them from different perspectives in different works. I suggest that the Aristotelian philosophy of causality is the root of the structural-systemic epistemology followed by many sciences today. In short, according to this epistemology, scientific understanding implies description of the distinguishable elements, their specific relationships, the qualities that characterize the novel whole that emerges in the synthesis of those elements, and the dynamic processes of the emergence of the whole ( Toomela, 2009 , 2010a ). The connection of this kind of epistemology to Aristotle becomes evident in the following quote:

All the causes now mentioned fall under four senses […] some are cause as the substratum (e.g., the parts ), others as the essence ( the whole, the synthesis , and the form). The semen, the physician, the adviser, and in general the agent, are all sources of change or of rest. The remainder are causes as the end […]

( Aristotle, 1941a , p. 753, Bk.V, 1013 b , my emphasis)

So, here we find the concepts of parts, relationships or synthesis, and whole or form. We also find another notion important for structural-systemic epistemology – emergence, or causes of change. By a tradition established long after Aristotle’s time, these four causes are called the material , formal , efficient , and final cause , respectively.

Descartes and Hume

Two thousand years after Aristotle, we find considerably more limited views on causality. Instead of four complementary kinds of causes, only one – efficient causality – remains.

Descartes and efficient causality

Descartes’ view on causality is fundamentally different from the Aristotelian one. First, he accepts only efficient causes; second, these efficient causes are very different from Aristotle’s. For Descartes, a cause is: independent, simple, universal, single, equal, similar, straight, etc.; an effect , in turn, is: relative, dependent, composite, particular, many, unequal, dissimilar, oblique, etc. (cf. Descartes, 1985c ). According to Descartes, effects can be deduced from causes in a series of steps. The cause–effect relationship, therefore, is unidirectional.

Another noteworthy idea in Cartesian epistemology was that “cause and effect are correlatives” ( Descartes, 1985c , p. 22). In most cases, cause–effect relationships are essentially correlations, just covariations of events; there is, however, the First Cause – God – on whose power all causal relationships depend (cf. Descartes, 1985a , b ). As God’s plans cannot be known by less perfect humans ( Descartes, 1985b ), humans can know only correlations between appearances.

The Cartesian description of cause contains terms and ideas that we also recognize in modern statistical data analysis. Here we find independent and dependent variables; the idea of linear (or at least continuous) relationships – correlations; the idea that effects can be understood by knowing (efficient) causes – dependent variables, or variability, are statistically “ explained ,” etc. Two more ideas are noteworthy. First, the notion of “relationship” has only one meaning, that between cause and effect; no other kind of relationship is important. Second, there is no suggestion that qualitatively novel wholes emerge from the synthesis of parts. This idea is also similar to quantitative thinking in modern psychology. The overlap between Cartesian philosophy and modern quantitative epistemology, I suggest, is not just a coincidence; it reflects a fundamental agreement between Cartesian causality and modern quantitative approaches to science.
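To make the parallel concrete, here is a minimal sketch (hypothetical data, plain Python, purely illustrative) of the vocabulary just mentioned: an “independent” variable, a “dependent” variable, and variance that is statistically “explained” (R²) by a linear relationship.

```python
# Illustrative sketch with hypothetical data: the Cartesian vocabulary
# survives almost unchanged in ordinary least-squares regression -- an
# "independent" variable x, a "dependent" variable y, and the proportion
# of y's variance that x statistically "explains" (R^2).
def ols_r_squared(x, y):
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    sxy = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sxx = sum((a - mx) ** 2 for a in x)
    slope = sxy / sxx
    intercept = my - slope * mx
    ss_res = sum((b - (intercept + slope * a)) ** 2 for a, b in zip(x, y))
    ss_tot = sum((b - my) ** 2 for b in y)
    return 1 - ss_res / ss_tot  # variance "explained" by the cause variable

# A perfectly linear relationship is fully "explained":
print(ols_r_squared([1, 2, 3, 4, 5], [2.0, 4.0, 6.0, 8.0, 10.0]))  # -> 1.0
```

Note that nothing in the computation refers to what x and y *are*; only their covariation enters, which is exactly the point made above.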

Hume and efficient causality

A slightly different approach to causality – though similar to the Cartesian one in looking for efficient causality only – was taken by Hume. According to him,

Similar objects are always conjoined with similar. Of this we have experience. Suitably to this experience, therefore, we may define a cause to be an object, followed by another, and where all the objects, similar to the first, are followed by objects similar to the second . Or in other words, where, if the first object had not been, the second never had existed

( Hume, 2000 , pp. 145–146).

So, a cause is an object whose appearance is related to the appearance of another object. Space limitations do not allow a detailed description of Hume’s ideas, so I only mention them together with references to the specific parts of his works where they are expressed. First, the relationship between causes and effects is characterized by contiguity ( Hume, 2000 , p. 54). Second, causes have priority in time relative to effects; the cause must precede the effect ( Hume, 2000 , p. 54). Third, the number of causes is smaller than the number of effects; therefore, many observations of effects can be reduced to a few identified causes ( Hume, 2000 , p. 185). Fourth, the relationship between cause and effect reflects only relationships between appearances; no conclusion can be made about the reality that necessarily underlies the connection ( Hume, 1999 , p. 136). Therefore, conclusions about relations between causes and effects concern only matters of fact; they concern only the existence of objects or of their qualities ( Hume, 2000 , p. 65). Finally, according to Hume, the relationship between causes and effects is only probable ( Hume, 1999 , p. 115). The more often we observe an effect following the cause, and the less often we observe the effect not following the cause, the stronger is the impression of causality between the observed events ( Hume, 2000 , p. 105).

Taken together, it turns out that Humean epistemology is practically identical with modern quantitative science. In both, the succession of contiguous events grounds impressions about cause–effect relationships that can be observed with some probability. In both, the impression of a causal connection is perceived as stronger when the proportion of observations that agree with one direction of events (from the supposed cause to the supposed effect) is higher than the proportion of observations that disagree with this assumed direction. In both, there can be no evidence that absolutely refutes a hypothetical causal relationship, because cause–effect relationships can be observed only in degrees and not in a necessary all-or-none relationship. And in both, it is assumed that a large number of observations can be “explained” by knowing a small number of causes.
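Hume’s comparison of proportions can be put in contingency-table form. The delta-P statistic from the contingency-learning literature captures it directly; the counts below are hypothetical and serve only as an illustration.

```python
# Hypothetical counts, for illustration only: on Hume's account the felt
# strength of a causal connection tracks the difference
# P(effect | cause) - P(effect | no cause), i.e., delta-P.
def delta_p(ce, c_only, nce, n_only):
    """ce: cause & effect; c_only: cause, no effect;
    nce: no cause, effect; n_only: no cause, no effect."""
    p_effect_given_cause = ce / (ce + c_only)
    p_effect_given_no_cause = nce / (nce + n_only)
    return p_effect_given_cause - p_effect_given_no_cause

# The effect follows the supposed cause in 80 of 100 trials, and appears
# without it in 20 of 100 trials:
print(round(delta_p(80, 20, 20, 80), 2))  # -> 0.6
```

As in Hume’s account, the result is a matter of degree: the impression of causality merely strengthens or weakens with the balance of observations, and nothing in the computation touches the “secret powers” behind them.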

There is one more interesting correspondence between Humean epistemology and modern quantitative psychology. According to Hume, the discovery of cause–effect relationships is based not on deduction or thinking but on “some instinct or mechanical tendency” ( Hume, 1999 , p. 130). The same can be said about quantitative science – the ways by which man-made causes (to use Thurstone’s words) are discovered are highly mechanical. There are algorithms that are strictly followed in the calculation of probabilities, effect sizes, and all other statistical descriptors of the variables in an analysis; there is no adjustment of each particular case of study to particular statistical calculations, for instance. In scientific inquiry, mechanization leads to a dead end because it puts constraints on what can be understood in principle.

There is yet one point where the scholars who introduced statistical data analysis into the sciences agreed with Hume but modern researchers tend (at least implicitly) to disagree. It was already discussed above that both Pearson and Thurstone were fully aware that statistical theories are about appearances, about relationships between observed events; no conclusion can be made about the essence of the reasons why statistical relationships between variables emerge. Hume had an identical understanding of the state of affairs – efficient causality is about appearances and not about what he called the “secret powers” that underlie the observed relationships:

It must certainly be allowed, that nature has kept us at a great distance from all her secrets, and has afforded us only the knowledge of a few superficial qualities of objects; while she conceals from us those powers and principles, on which the influence of these objects entirely depends. […] there is no known connection between the sensible qualities and the secret powers

( Hume, 1999 , pp. 113–114).

Here modern quantitative psychology seems to disagree – on the basis of different kinds of statistical data analyses, conclusions are often made about exactly those “secret powers.” Psychologists today often attribute the statistically “discovered” causes not just to man-made generalizations that leave an impression of causality but directly to “secret powers,” to mental attributes that are supposed to underlie the behaviors. It is ignored that behavior is not in one-to-one correspondence with the psychic reality that underlies it; externally identical behaviors may emerge from mentally qualitatively different operations, and vice versa. So, all quantitative theories are only about appearances and not about underlying mechanisms, because the quantification of data into variables already excludes the information that is necessary for discovering the mechanisms that underlie observed covariations.
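The claim that externally identical appearances can arise from qualitatively different underlying operations can be illustrated with a small simulation. The two mechanisms below are hypothetical stand-ins, not a claim about any psychological finding: a single-source process and a two-source synthesis produce observations that common quantitative summaries cannot tell apart.

```python
# Illustrative simulation: two qualitatively different generating
# structures yield data with the same quantitative summaries, because
# quantification into a variable discards the structural information.
import random

random.seed(1)

def single_source(n):
    # one-element structure: a single Gaussian source with variance 2
    return [random.gauss(0, 2 ** 0.5) for _ in range(n)]

def two_source_synthesis(n):
    # two-element structure: the sum of two independent unit-variance sources
    return [random.gauss(0, 1) + random.gauss(0, 1) for _ in range(n)]

def mean(xs):
    return sum(xs) / len(xs)

def variance(xs):
    m = mean(xs)
    return sum((x - m) ** 2 for x in xs) / len(xs)

a = single_source(100_000)
b = two_source_synthesis(100_000)
# Both mechanisms give mean ~ 0 and variance ~ 2; the summaries agree to
# within sampling error although the underlying structures differ.
print(round(mean(a), 2), round(variance(a), 2))
print(round(mean(b), 2), round(variance(b), 2))
```

No statistic computed from the samples alone can recover whether one source or a synthesis of two produced them; that information exists only at the level of structure.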

Why only efficient causality?

Aristotelian causality distinguished four complementary causes; Descartes and Hume, nevertheless, proposed only one. It is also important that neither Descartes nor Hume proposed entirely new concepts of causality; they took one Aristotelian cause out of his four. The reasons why they treated causality only in terms of efficient causes are relevant here.

Descartes. It was already discussed above that, according to Descartes, understanding of causality is about correlations between observed events; correlations do not imply necessity – in principle, every appearance can be correlated with every other appearance. For Aristotle, causes were essentially constraints – to make a statue, bronze is used, and there are many substances out of which it is not possible to make statues. If things have been made according to a plan, then the plan constrained the possible course of events; the result did not come out by accident or by chance but was constrained by the plan before the event took place. Descartes, in order to be coherent with his philosophy, could not accept any kind of cause as a constraint.

Descartes believed in God – and not just some God, but a God who is “infinite, eternal, immutable, omniscient, omnipotent […] all the perfections which I could observe to be in God” ( Descartes, 1985a , p. 128). Therefore, logically, there can be no causes that are constraints, because God has no constraints; God is omnipotent. God is the First Cause of everything that is. Effects follow from causes by necessity in principle, because effects follow from God’s omnipotence. Humans, however, cannot know the necessity that relates causes to effects; for them only knowledge about correlations is available:

When dealing with natural things we will, then, never derive any explanations from the purposes which God or nature may have had in view when creating them. For we should not be so arrogant as to suppose that we can share in God’s plans. We should, instead, consider him as the efficient cause of all things, and starting from the divine attributes which by God’s will we have some knowledge of, we shall see, with the aid of our God-given natural light, what conclusions should be drawn concerning those effects which are apparent to our senses

( Descartes, 1985b , p. 202).

Taken together, humans can know only what is given to them through the senses – appearances and correlations among them; being imperfect, they cannot know the reasons that connect causes to effects. Correlations do not allow going beyond observations of events; there is no way to know the reason for observed correlations but one – God’s will.

Hume. For Hume, too, efficient causality was not related to necessity: “’tis possible for all objects to become causes or effects to each other […]” ( Hume, 2000 , p. 116). If there is no necessary relationship between events, then it is not possible to know why the events are related, because it is actually not even possible to prove that the events are related essentially and not by accident or by errors of observation. But his reasons for this view were different from Descartes’. Hume suggested that God is not knowable in principle and therefore the idea of God should not be taken into account in philosophy.

Hume suggested, similarly to Descartes, that humans have no access to knowledge beyond appearances; they cannot know why observed causes are related to observed effects. Human knowledge is actually even more limited – it is also not possible to be sure of discovered laws; the laws of nature can change, and what we thought to be a cause may turn out to be the effect, or eventually no connection between the events may be discovered at all ( Hume, 1999 , p. 115). For Hume, the reasons for these human limitations in understanding the world lay in the limitations of the human (and animal) mind; the world is not knowable beyond appearances because the mind is unable to go beyond appearances. Here Hume’s psychology becomes central for understanding his views. According to him, the mind works only on the principles of association:

[…] principles of association […] To me, there appear to be only three principles of connection among ideas, namely, Resemblance, Contiguity in time or place, and Cause or Effect . […] But the most usual species of connection among the different events, which enter into any narrative composition, is that of cause and effect;

( Hume, 1999 , pp. 101–103).

So, the only operation available to the mind is to form associations between observed events as they appear to us. If this were the case, then the Humean rejection of the possibility of knowledge beyond the senses – his proposition that only efficient causes as they appear to us can be known – would be well grounded. Humean psychology, however, was acceptable in his time but is not any more. The inability of associationism to suffice for explaining the human mind was established and grounded in empirical studies almost a century ago. Not only humans but even apes were demonstrated to be able to think in ways that are not based on associations alone (see also Koffka, 1935 ; Köhler, 1925 ; Köhler, 1959 ; Vygotsky, 1982b ). The idea that the animal mind is based only on reflexes and the conditioned reflexes discovered by Pavlov (1927 , 1951 ) was actually rejected by scholars from his own laboratory ( Anokhin, 1975 ; Konstantinov et al., 1978 ).

Modern Quantitative Psychology – Mix of Two Incompatible Epistemologies

Psychology today often aims at understanding the structures that underlie observed behaviors. This aim is borrowed from the Aristotelian structural-systemic epistemology. The methods chosen for studies, however, are based on the Cartesian–Humean cause–effect epistemology. Both the philosophers who limited the understanding of causality to efficient causality – Descartes and Hume – and the scholars who introduced quantitative methods into the sciences – Pearson and Thurstone – agreed that the method of associating events by contiguity and covariation cannot ground interpretations in terms of underlying necessary reasons that connect observed causes to observed effects. They all also agreed that what is represented in observed associations between events or variables is subject to doubt. Interpretation of those associations can only be weaker or stronger, depending on the relative frequency of events that correspond to a certain idea of causality compared to the frequency of observations that contradict it. Laws discovered by such procedures are therefore not absolute but relative; laws cannot be refuted by observations that contradict them – in psychology, effect sizes of 1.0 are practically never observed; it is actually conveniently accepted that far-reaching conclusions can be made when 10–30% of data variability is statistically “explained.” It is ignored that in such situations a substantial number of cases disagrees completely with the conclusions of the study. After conducting some “meta-analysis” it often turns out that the laws of association discovered in different studies contradict one another, and a new law can be proposed to replace those from the analyzed studies. It would be no surprise if some meta-meta-analysis led to yet another generalization. Laws, in this epistemology, are not absolute; they can change without destroying the theory that is built from the collection of associative generalizations. Some philosophers would suggest that this kind of activity is not what science should do:

For it is an important postulate of scientific method that we should search for laws with an unlimited realm of validity. If we were to admit laws that are themselves subject to change, change could never be explained by laws. It would be the admission that change is simply miraculous. And it would be the end of scientific progress; for if unexpected observations were made, there would be no need to revise our theories: the ad hoc hypothesis that the laws have changed would “explain” everything

( Popper, 2002 , p. 95).
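The effect-size arithmetic mentioned above can be made concrete with a simulation on synthetic data: with a correlation of about 0.45 (r² ≈ 0.2, i.e., 20% of variance “explained”), roughly a third of individual case pairs still run against the group-level trend. The numbers are illustrative only, not drawn from any real study.

```python
# Illustrative simulation: how many individual case pairs contradict a
# group-level trend when ~20% of variance is statistically "explained"?
import random

random.seed(7)

rho = 0.45                      # correlation, so r^2 is about 0.2
n = 2000
xs, ys = [], []
for _ in range(n):
    x = random.gauss(0, 1)
    # y shares rho of its structure with x; the rest is independent noise
    y = rho * x + (1 - rho ** 2) ** 0.5 * random.gauss(0, 1)
    xs.append(x)
    ys.append(y)

# Sample random case pairs and count those where the case higher on x
# is LOWER on y, i.e., pairs that contradict the positive trend.
pairs, discordant = 20_000, 0
for _ in range(pairs):
    i, j = random.randrange(n), random.randrange(n)
    if (xs[i] - xs[j]) * (ys[i] - ys[j]) < 0:
        discordant += 1
frac = discordant / pairs
print(f"about {frac:.0%} of case pairs contradict the trend")
```

For a bivariate normal with correlation ρ the expected discordance rate is 1/2 − arcsin(ρ)/π, which for ρ = 0.45 is about 35% – a substantial share of cases disagreeing with the study’s conclusion.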

Statistical methods in psychology are useless

Taken together, there are reasons to suggest that quantitative methods are useless for psychology – IF the aim of psychology is to develop knowledge about mind, about the “secret powers” that underlie observed behaviors. Such understanding would require qualitative approaches that allow distinguishing between externally similar behaviors based on internally different mental processes, and between externally different behaviors that are based on similar mechanisms.

Modern quantitative psychology is based on an epistemology in which questions are asked about efficient causality; explanation is reduced to the identification of cause–effect relationships between events. Such an approach could be fully consistent if it were accepted – as Thurstone and Pearson accepted – that the discovery of such relationships cannot, in principle, be connected to underlying structures. Modern quantitative psychology, however, takes its methods from the Cartesian–Humean efficient-causality epistemology and its aims from the incompatible Aristotelian structural epistemology. Structural-systemic description of the studied phenomena cannot be based on quantitative methodology. The histories of biology and chemistry, which are based on structural-systemic epistemology, also show that the majority of discoveries in these sciences have been made without statistical methods.

Statistical methods in psychology are inevitable

The suggestion I made to reject quantitative methodology is conditional – IF the aims of studies corresponded to the methods, quantitative methodology would turn out to be extremely valuable, almost inevitable … for applied psychology. Now we need to turn the discussion upside down. Instead of asking what cannot be accomplished with quantitative methods, we ask what they can bring us. The world around us is constantly changing and always unique. How can one live in a world of unique events? It would be impossible – in order to live, all life-forms must be able to react to future changes of the environment before these changes actually take place ( Anokhin, 1978 ; Toomela, 2010a ); foresight must be based on generalization and abstraction.

Coherent systemic-structural theories, as modern applications of physics, chemistry, and biology amply demonstrate, are extremely practical. But how to behave if the theory about underlying processes has not been created yet? Here quantitative methods become valuable: it is possible to create useful generalizations without knowing the processes that underlie the events. This was exactly what Thurstone, for instance, aimed at:

It is the faith of all science that an unlimited number of phenomena can be comprehended in terms of a limited number of concepts or ideal constructs. Without this faith no science could ever have any motivation. To deny this faith is to affirm the primary chaos of nature and the consequent futility of scientific effort

Thurstone, as we saw above, aimed explicitly and only at discovering ways to comprehend nature by describing regularities among observed events; these discoveries would be just man-made schemes, and yet they would help to manage an otherwise unmanageable amount of information. A lot could be learned in this way – it would be possible to discover behaviors that should be avoided and behaviors that should be repeated in appropriate conditions – and all this without necessarily knowing why. Until systemic-structural theories replace associative quantitative theories, psychology can build an increasingly strong ground for its applied uses. Quantitative science is inevitable for applied purposes until a theory about the structures that underlie behavior is sufficiently developed to ground those applied uses. If, however, quantitative science continues to look for what it cannot find – the “secret powers” – then it ends up where Hume warned us not to go:

We are got into a fairy land, long ere we have reached the last steps of our theory […]

( Hume, 1999 , p. 142).

Some notes on mathematical psychology in general

Mathematical psychology is not based exclusively on statistical methods. Perhaps non-statistical mathematical psychology is better suited for discovering the structure of mind? Indeed, from a certain perspective, it seems that mathematical psychology is doing well – there are fields of study where mathematical psychology is prospering: foundational measurement theory, signal detection theory, decision theory, psychophysics, neural modeling, the information-processing approach, and learning theory ( Townsend, 2008 ). Sometimes it almost seems that the only true science is based on mathematics; thus Townsend suggests that psychology undergraduate training should change toward “solid-science” education and, in order to do that, “The only practical solution I can espy is for psychology departments to offer a true scientific psychology track , with mandatory courses in the sciences, mathematics and statistics ” ( Townsend, 2008 , p. 275, my emphasis).

A small problem may be that achievements of mathematics, such as axiomatic measurement theory and computer-based, non-metric model-fitting techniques, do not have the impact on psychology that these “revolutions” deserve ( Cliff, 1992 ). It might be that many problems will be solved by developments in mathematics which, for instance, would explicate the relationships between ways of describing randomness and ways of describing structure ( Narens and Luce, 1993 ; Luce, 1999 ). It might be, however, that mathematics as such is inappropriate for answering the questions psychology aims to answer. The most fundamental issue is not how mathematics should be applied in psychology but rather whether it can be applied for answering the core question of the science of psyche – what is mind? No development in any kind of measurement theory, for instance, will be helpful if psychological attributes cannot be measured in principle; there are strong reasons to suggest that indeed they cannot ( Valsiner, 2005 ; Trendler, 2009 ; Michell, 2010 ).

In order to proceed, a definition of mathematics is needed. According to Luce (1995 , p. 2):

Mathematics studies structures and patterns described by systems of propositions relating aspects of entities in question. Deriving logically true statements from sets of assumed statements (often called axioms), uncovering symmetries and patterns, and evolving and understanding general structures are the concerns of mathematicians.

It is noteworthy that the term “structure” does not apply directly to the things and phenomena studied by physics, biology, psychology, or any other science. Rather, mathematics studies descriptions of objects and phenomena – systems of propositions – and “structure” refers to the system of descriptions; in that sense mathematics is an abstract science ( Veblen and Young, 1910 ); it is a body of theorems deduced from a set of axioms ( Veblen and Whitehead, 1932 ).

It is important that, as an abstract science, mathematics is based on assumptions, its “starting point” is

a set of undefined elements and relations , and a set of unproven propositions involving them ; and from these all other propositions (theorems) are to be derived by the methods of formal logic

( Veblen and Young, 1910 , p. 1, my emphasis).

So, mathematics is a system of propositions that begins with a set of undefined assumptions, called axioms or postulates, together with rules of deduction, a system of logic. Thus, mathematics defines a priori certain principles which are not derived from studies of the world but attributed to it before those studies. Mathematical description of concrete real-world phenomena is successful only if the concrete system of things satisfies the fundamental assumptions of mathematics ( Veblen and Young, 1910 ). Even though axioms can be postulated on the basis of scientific studies of the world and added to the basic set of abstract axioms, the abstract basis of mathematics is nevertheless determined before the studies. Taken together, it can be said that mathematics does not study the world but rather searches for events where the world corresponds to abstract mathematical principles – principles that cannot be proven or even defined.

Mathematics studies only formal aspects of the world. For Poincare, “mathematics is the art of giving the same name to different things” ( Poincare, 1914 , p. 34; also: “Mathematics teaches us, in fact, to combine like with like,” Poincare, 1905 , p. 159). Similarities of things can be discovered by studying relationships:

Mathematicians do not study objects, but the relations between objects; to them it is a matter of indifference if these objects are replaced by others, provided that the relations do not change. Matter does not engage their attention, they are interested by form alone

( Poincare, 1905 , p. 20).

Thus, similarities of things discovered by mathematical studies are purely mathematical – things are similar if the mathematical relationships that describe them formally are similar. And the essence of these mathematical relationships, as we saw above, is defined a priori and not derived from scientific studies.

Here are the reasons why not only statistics but any mathematical approach – if mathematics is defined as described above – is unable to reveal the structure of the things and phenomena studied; mathematics cannot, in principle, answer the question structural-systemic science aims to answer – what is the studied thing or phenomenon? First, in the world, externally similar things and phenomena can be based on different underlying structures; for mathematics these structural differences do not exist. If, for instance, in a similar threatening situation one person reacts aggressively because he has made a conscious choice and the other reacts impulsively, then for mathematics these two reactions are the same even though in the first case the psychological structure included rational processes and in the other it did not. Mathematical prediction of the future in such cases cannot be very accurate, because a person who rationally chooses to react aggressively in certain situations is also able to control his reactions, whereas impulsive behavior is directed by the situation. This problem of mathematics has been recognized by mathematicians themselves; Poincare, for instance, suggested:

It is not enough that each elementary phenomenon should obey simple laws: all those that we have to combine must obey the same law ; then only is the intervention of mathematics of any use. […] It is therefore, thanks to the approximate homogeneity of the matter studied by physicists, that mathematical physics came into existence. In the natural sciences the following conditions are no longer to be found: – homogeneity, relative independence of remote parts, simplicity of the elementary fact; and that is why the student of natural science is compelled to have recourse to other modes of generalisation

( Poincare, 1905 , pp. 158–159, my emphasis).

In fact, Trendler (2009) proposed essentially the same reason why psychological attributes cannot be measured – in the case of psychological attributes there are too many sources of systematic error that cannot be controlled experimentally; in other words, psychological attributes are not independent but depend on each other. Therefore they cannot be measured.

Next, mathematics is a secondary science; the successful application of mathematics to the phenomena of the world depends on the experiments conducted in other sciences:

Experiment is the sole source of truth. It alone can teach us something new; it alone can give us certainty

( Poincare, 1905 , p. 140).

Mathematics can help to organize the results of experiments; it can direct generalization; but it does not provide any new knowledge ( Poincare, 1905 ). There are two different problems here. One problem is related to the essence of generalization. In mathematics, generalization is related to relationships between events, but in order to understand what a thing is , it is not sufficient to know with what else it can be related. In principle, there is no constraint on the number of other things with which any given thing in the world can be related; but what the thing is, is defined qualitatively and is fully constrained. A thing is what it is. Mathematical generalizations are not useful for discovering what things are. Another problem with mathematical generalizations can also be related to Poincare’s discussion of the role of mathematics in sciences. According to him,

It is clear that any fact can be generalised in an infinite number of ways, and it is a question of choice. The choice can only be guided by considerations of simplicity

( Poincare, 1905 , p. 146).

In the case of phenomena that can be – and in psychology are – externally similar yet based on qualitatively different psychic structures, the assumption of simplicity is not just wrong – it is fundamentally misleading. The assumption of simplicity forces scientists who rely on mathematics to ignore what should be studied and understood – complexity. Parenthetically, it should be mentioned here that mathematical models can be extremely complex; but they are fundamentally oversimplified if there is an assumption that externally similar events are all based on similar structures.

Third, mathematics can model only what is given by experiments and observations conducted in other sciences. It follows that mathematics is not able to provide any understanding of becoming, of the emergence of qualitatively novel things. If the other sciences have described the emergence of novel qualities, this emergence can be modeled mathematically in principle. But again, what is modeled is not the novel thing or phenomenon itself – that model would be a structural description of the thing or phenomenon – but relationships between events; i.e., what is modeled mathematically is always external to the thing itself. Here mathematicians could perhaps object and suggest that a structural theory is also a mathematical model. This suggestion, however, would require a fundamental redefinition of what mathematics is, because a structural model based on studies of things – such as a model of an atom, a gene, a wristwatch, or a mind – is not based on a set of a priori given assumptions-axioms but rather on studies of the world. As Poincare suggested, science can be based on different kinds of generalizations, and the mathematical generalization that fits so well into physics is considerably less useful in other sciences. I will briefly discuss the methods of scientific generalization in the next section.

Altogether, there are reasons to suggest that mathematics is not an appropriate tool if the aim of science is to understand what the studied thing or phenomenon is. For the mathematical psychologist, naturally, mathematics is among the most important tools of science:

Mathematical psychology has arguably accelerated the evolution of psychology and allied disciplines into rigorous sciences many times over their likely progress in its absence. Let’s nurture and strengthen it

( Townsend, 2008 , p. 279).

I suggest that mathematics has actually played the opposite role for psychology – it has oversimplified theories, blinded scientists, and directed their attention to the study of relationships between things and phenomena instead of guiding them to study what those things and phenomena are. Physics has been most successful not where something can be exactly calculated but where theory has defined what the things are – atoms, for instance. Yes, mathematics is a useful tool here and there – as it can be for psychology in grounding practical decisions – but no machine has ever been built on the basis of mathematical formulas alone, whereas many have been constructed entirely without the aid of mathematics. The same can be said about biology: it is a very powerful science not because of applications of mathematics in some peripheral matters of biology but because of theories about what cells, components of cells, organs, organisms, etc., are. Perhaps what should be “nurtured and strengthened” in psychology is not mathematical psychology but studies that aim to understand what the mind is.

If Not Mathematics, Then What?

So, mathematics is a useful tool for generalizations about relationships between events. The value of mathematics, though, is not the same in all sciences. Poincare, for instance, suggested that mathematics is a very useful tool for generalizing the results of experiments in physics:

It might be asked, why in physical science generalisation so readily takes the mathematical form. The reason is now easy to see. It is not only because we have to express numerical laws; it is because the observable phenomenon is due to the superposition of a large number of elementary phenomena which are all similar to each other; and in this way differential equations are quite naturally introduced

( Poincare, 1905 , p. 158).

He also suggested, as discussed above, that mathematical generalization is appropriate only when the matter studied is homogeneous, when its parts are relatively independent, and when the elementary facts are simple. These conditions are not met in psychology. Therefore another route to scientific generalization must be found: a special kind of experiment.
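
Poincare's condition can be illustrated with a simulation of my own (a sketch, not from the article): when a system consists of many similar, independent parts – say, particles that each decay with the same small probability per time step – the aggregate obeys a simple law, the discrete analogue of the differential equation dN/dt = -kN. The simplicity of the law depends entirely on the homogeneity and independence of the parts, which is precisely what psychology cannot assume.

```python
import random

random.seed(2)

# Many similar, independent elementary phenomena: each "particle"
# decays with the same small probability p per time step.
p = 0.05
n0 = 100_000

alive = n0
trajectory = [alive]
for _ in range(40):
    alive = sum(1 for _ in range(alive) if random.random() > p)
    trajectory.append(alive)

# Because the parts are homogeneous and independent, the aggregate
# follows the simple law N(t) = N0 * (1 - p)**t, the discrete analogue
# of dN/dt = -k * N.
predicted = n0 * (1 - p) ** 40
relative_error = abs(trajectory[40] - predicted) / predicted
```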

Usually it is thought that there is only one kind of experiment. This kind is Cartesian–Humean: the question answered by the experiment is whether a certain event is or is not an efficient cause of another event. To answer this question, an artificial situation is created in which, ideally, all conditions are kept equal except one that is manipulated or “controlled.” If the expected effect follows when the manipulated event is present and does not follow when it is absent, it is concluded that the cause of the event has been identified.
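
The logic of this Cartesian–Humean experiment can be sketched in a few lines (my own illustration, with a hypothetical data-generating process in which the manipulated event adds a fixed effect and everything else is held "equal" as shared random noise):

```python
import random

random.seed(1)

# Hypothetical data-generating process: the manipulated event adds a
# fixed effect; all other conditions enter as identical random noise.
TRUE_EFFECT = 1.0

def outcome(manipulated: bool) -> float:
    return (TRUE_EFFECT if manipulated else 0.0) + random.gauss(0.0, 1.0)

# The artificial situation: two groups identical in every respect
# except the one manipulated condition.
treated = [outcome(True) for _ in range(5_000)]
control = [outcome(False) for _ in range(5_000)]

observed_effect = sum(treated) / len(treated) - sum(control) / len(control)
# The effect appears when the manipulated event is present and vanishes
# when it is absent, so the event is declared the efficient cause -
# with no claim made about the structure that produces the effect.
```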

There is, however, another kind of experiment that, to the best of my knowledge, was first brought into the theory of scientific experimentation by Engels (even though a similar idea can in several respects be found already in Aristotle’s works; cf. Aristotle, 1941a , p. 690, Bk. I, 981a–981b). Engels discussed the role of induction in scientific discoveries and proposed that there is a much more powerful way to scientific proof than induction:

A striking example of how little induction can claim to be the sole or even the predominant form of scientific discovery occurs in thermodynamics: the steam-engine provided the most striking proof that one can impart heat and obtain mechanical motion. 100,000 steam-engines did not prove this more than one , but only more and more forced the physicists into the necessity of explaining it. […] The empiricism of observation alone can never adequately prove necessity. Post hoc but not propter hoc . […] But the proof of necessity lies in human activity, in experiment, in work: if I am able to make the post hoc , it becomes identical with the propter hoc .

( Engels, 1987 , pp. 509–510)

The kind of experiment that follows from the principles outlined by Engels I have called “constructive” ( Toomela, in press-a ). In a constructive experiment the attempt is made to create the thing or phenomenon that is studied. If the phenomenon or thing can be constructed on the basis of knowledge about its hypothetical elements and the specific relationships between them, the experiment has provided corroborating evidence for the theory. Here the result of the experiment – the constructed thing or phenomenon – follows from the theory. It is important that there is no logical necessity that a whole with certain emergent properties must emerge when theoretically defined elements are put into certain relationships. Instead of logical deduction, the necessity is proven in the actual construction of the phenomenon one is attempting to understand. Mathematics derives logically true statements from assumptions that cannot themselves be proven, and the truth of the logical derivations depends fully on the truth of those assumptions. If even one assumption cannot be proven, no proof is possible for the scientific theory as a whole either. In constructive experiments, on the other hand, the proof is obtained by the actual construction of the studied thing. Such construction contains no unprovable assumptions – the assumptions are proven by the result of the experiment.

Constructive experiments can be found in different fields of science. Atomic theory is well corroborated by the construction of nuclear bombs and reactors; chemical theories are corroborated by the synthesis of new molecules. Recently, biologists reported a major breakthrough in constructive experiments on cells: a bacterial cell with a chemically synthesized genome has been created ( Gibson et al., 2010 ). There are also examples of constructive experiments in psychology. Neuropsychological rehabilitation based on Luria’s theory is grounded in structural-systemic theory; numerous psychological functions have been artificially created with special educational programs designed on the basis of theories about the elements and relationships of the elementary psychological processes underlying those complex functions ( Luria, 1948 ; Tsvetkova, 1985 ).

Taken together, mathematics is not useful for discovering what things are. For such discoveries, observations and analytic experiments should be combined with constructive experiments.

Conclusions

Science begins with the question: what do I want to know? Science becomes science, however, only when this question is justified and an appropriate methodology is chosen for answering it. The research question should precede the other questions; methods should be chosen according to the research question and not vice versa. Modern quantitative psychology, though, has accepted method as primary, and research questions are adjusted to the methods. This would not be a problem if the methods fitted the questions about the studied phenomena; but they do not. The crucial question that needs to be asked is: do the answers to the questions of what, why, and how I want to know make a coherent, theoretically justified whole? All psychology that aims at understanding the structure of mind with any kind of mathematical tools has to admit that the methods do not correspond to the study questions.

For understanding thinking in modern quantitative psychology, two epistemologies should be distinguished: the structural-systemic, based on Aristotelian thinking; and the associative-quantitative, based on Cartesian–Humean thinking. The first aims at understanding the structure that underlies the processes leading to the events observed in the world; the second looks to identify apparent cause–effect relationships between events, with no claim made about the processes that underlie the appearances.

Quantitative methods are useful for generalizations about the relationships between things and events. What the studied things or phenomena are cannot be revealed by such methods. Structural-systemic science, which aims at understanding structures, relies on a qualitative methodology that includes, in addition to observations and analytic experiments, constructive experiments.

Conflict of Interest Statement

The author declares that the research was conducted in the absence of any commercial or financial relationships that could be construed as a potential conflict of interest.

  • ^ I am not going here into the philosophical question of whether such full correspondence can be known in principle. I agree that we can never be sure that our theories are correct (cf., e.g., Engels, 1996 ; Kant, 2007 ). I only suggest that theories can be in full correspondence with physical reality even though we cannot demonstrate it.

Anokhin, P. K. (1975). Ocherki po fiziologii funktsional’nykh sistem. Moscow: Medicina.

Anokhin, P. K. (1978). “Operezhajuscheje otrazhenije deistvitel’nosti,” (Anticipating reflection of actuality. In Russian. Originally published in 1962.) in P. K. Anokhin. Izbrannyje trudy. Filosofskije aspekty teorii funktsional’noi sistemy, eds F. V. Konstantinov, B. F. Lomov and V. B. Shvyrkov (Moscow: Nauka), pp. 7–26.

Aristotle. (1941a). “Metaphysics,” (Metaphysica.). in The Basic Works of Aristotle , ed. R. McKeon (New York: Random House), pp. 681–926.

Aristotle. (1941b). “Organon,” in The Basic Works of Aristotle , ed. R. McKeon (New York: Random House), pp. 7–212.

Aristotle. (1941c). “Physics,” (Physica.). in The Basic Works of Aristotle , ed. R. McKeon (New York: Random House), pp. 213–394.

Boker, S. M., Molenaar, P. C. M., and Nesselroade, J. R. (2009). Issues in intraindividual variability: individual differences in equilibria and dynamics over multiple time scales. Psychol. Aging 24, 858–862.

Chambers, E. (1728a). Cyclopædia, or, An universal dictionary of arts and sciences . In two volumes. Volume the first. London: Ephraim Chambers. Retrieved 21 July 2009 from http://digicoll.library.wisc.edu/HistSciTech/subcollections/CyclopaediaAbout.html

Chambers, E. (1728b). Cyclopædia, or, An universal dictionary of arts and sciences . In two volumes. Volume the second. London: Ephraim Chambers. Retrieved 21 July 2009 from http://digicoll.library.wisc.edu/HistSciTech/subcollections/CyclopaediaAbout.html

Cliff, N. (1992). Abstract measurement theory and the revolution that never happened. Psychol. Sci. , 3, 186–190.

Descartes, R. (1985a). “Discourse on the method of rightly conducting one’s reason and seeking the truth in the sciences,” (Originally published in 1637) in The Philosophical Writings of Descartes, Vol. 1 , eds J. Cottingham, R. Stoothoff and D. Murdoch (New York: Cambridge University Press), pp. 111–151.

Descartes, R. (1985b). “Principles of philosophy,” (Originally published in 1644 in Latin and in 1647 in French) in The Philosophical Writings of Descartes, Vol. 1 , eds J. Cottingham, R. Stoothoff and D. Murdoch (New York: Cambridge University Press), pp. 179–291.

Descartes, R. (1985c). “Rules for the direction of the mind,” (Originally written in about 1628, published in 1684) in The Philosophical Writings of Descartes, Vol. 1 , eds J. Cottingham, R. Stoothoff and D. Murdoch (New York: Cambridge University Press), pp. 9–78.

Engels, F. (1987). “Dialectics of nature,” in Collected Works, Vol. 25 , eds K. Marx and F. Engels (New York: International Publishers), pp. 313–590.

Engels, F. (1996). Ludwig Feuerbach and the End of Classical German Philosophy (Originally published in 1888). Beijing: Foreign Language Press.

Epstein, S. (1980). The stability of behavior. II. Implications for psychological research. Am. Psychol. 35, 790–806.

Essex, C., and Smythe, W. E. (1999). Between numbers and notions. A critique of psychological measurement. Theory Psychol. 9, 739–767.

Gibson, D. G., Glass, J. I., Lartigue, C., Noskov, V. N., Chuang, R.-Y., Algire, M. A., Benders, G. A., Montague, M. G., Ma, L., Moodie, M. M., Merryman, C., Vashee, S., Krishnakumar, R., Assad-Garcia, N., Andrews-Pfannkoch, C., Denisova, E. A., Young, L., Qi, Z.-Q., Segall-Shapiro, T. H., Calvey, C. H., Parmar, P. P., Hutchison, III, C. A., Smith, H. O., and Venter, J. C. (2010). Creation of a bacterial cell controlled by a chemically synthesized genome. Science 329, 52–56.

Gigerenzer, G. (1991). From tools to theories: a heuristic of discovery in cognitive psychology. Psychol. Rev. 98, 254–267.

Gigerenzer, G. (1993). “The superego, the ego, and the id in statistical reasoning,” in A Handbook for Data Analysis in the Behavioral Sciences: Methodological Issues , eds G. Keren and C. Lewis (Hillsdale, NJ: Lawrence Erlbaum Associates), pp. 311–339.

Hamaker, E. L., Dolan, C. V., and Molenaar, P. C. M. (2005). Statistical modeling of the individual: rationale and application of multivariate stationary time series analysis. Multivariate Behav. Res. 40, 207–233.

Hume, D. (1999). “An enquiry concerning human understanding,” (Originally published in 1748) in An Enquiry Concerning Human Understanding , ed. T. L. Beauchamp (Oxford: Oxford University Press), pp. 81–211.

Hume, D. (2000). “A treatise of human nature,” (Originally published in 1739–1740) in A Treatise of Human Nature , eds D. F. Norton and M. J. Norton (Oxford: Oxford University Press), pp. 1–418.

Kant, I. (2007). “Critique of pure reason,” (Originally published in 1787) in Critique of Pure Reason. Immanuel Kant. Revised 2nd Edn., ed. N. K. Smith (New York: Palgrave McMillan).

Koffka, K. (1935). Principles of Gestalt Psychology . London: Routledge & Kegan Paul.

Köhler, W. (1925). The Mentality of Apes . New York: Harcourt, Brace & Co.

Köhler, W. (1959). Gestalt Psychology. An Introduction to New Concepts in Modern Psychology . New York: Mentor Books.

Konstantinov, F. K., Lomov, B. F., and Shvyrkov, B. V. (eds). (1978). P. K. Anokhin. Izbrannyje trudy. Filosofskije aspekty teorii funktsional’noi sistemy . Moscow: Nauka.

Lewin, K. (1935). A Dynamic Theory of Personality. Selected Papers . New York: McGraw-Hill.

Luce, R. D. (1995). Four tensions concerning mathematical modeling in psychology. Annu. Rev. Psychol. 46, 1–26.

Luce, R. D. (1999). Where is mathematical modeling in psychology headed? Theory Psychol. 9, 723–737.

Luria, A. R. (1948). Vosstanovlenije funkcii mozga posle vojennoi travmy . (Restoration of Brain Functions After War Trauma. In Russian). Moscow: Izdatel’stvo Akademii Medicinskih Nauk SSSR.

Michell, J. (2000). Normal science, pathological science and psychometrics. Theory Psychol. 10, 639–667.

Michell, J. (2010). “The quantity/quality interchange: a blind spot on the highway of science,” in Methodological Thinking in Psychology: 60 Years Gone Astray? eds A. Toomela and J. Valsiner (Charlotte, NC: Information Age Publishing).

Molenaar, P. C. M. (2004). A Manifesto on psychology as idiographic science: bringing the person back into scientific psychology, this time forever. Measurement 2, 201–218.

Molenaar, P. C. M., and Valsiner, J. (2005). How generalization works through the single case: a simple idiographic process analysis of an individual psychotherapy. Int. J. Idiographic Sci. Article 1. Retrieved October 25 2005 from http://www.valsiner.com/articles/molenvals.htm

Narens, L., and Luce, R. D. (1993). Further comments on the “nonrevolution” arising from axiomatic measurement theory. Psychol. Sci. 4, 127–130.

Nesselroade, J. R., Gerstorf, D., Hardy, S. A., and Ram, N. (2007). Idiographic filters for psychological constructs. Measurement 5, 217–235.

Pavlov, I. P. (1927). Lekcii o rabote bol’shikh polusharii golovnogo mozga . (Lectures on the work of the hemispheres of the brain.). Moscow: Gosudarstvennoje Izdatel’stvo.

Pavlov, I. P. (1951). Dvatcatilet’nii opyt ob’jektivnogo izuchenija vyshei nervnoi dejatel’nosti (povedenija zhivotnykh) . Moscow: Medgiz.

Pearson, K. (1896). Mathematical contributions to the theory of evolution. III. Regression, heredity, and panmixia. Philos. Trans. R. Soc. Lond. A 187, 253–318.

Pearson, K. (1899). Mathematical contributions to the theory of evolution. V. On the reconstruction of the stature of prehistoric races. Philos. Trans. R. Soc. Lond. 192, 169–244.

Pearson, K. (1902). On the systematic fitting of curves to observations and measurements. Biometrika , 1, 265–303.

Pearson, K. (1903–1904). Mathematical contributions to the theory of evolution. XII. On a generalised theory of alternative inheritance, with special reference to Mendel’s laws. Proc. R. Soc. Lond. 72, 505–509.

Pearson, K. (1904). Mathematical contributions to the theory of evolution. XII. On a generalised theory of alternative inheritance, with special reference to Mendel’s laws. Philos. Trans. R. Soc. Lond. A 203, 53–86.

Pearson, K., and Lee, A. (1900). Mathematical contributions to the theory of evolution. VIII. On the inheritance of characters not capable of exact quantitative measurement. Philos. Trans. R. Soc. Lond. A 195, 79–150.

Pearson, K., Lee, A., and Bramley-Moore, L. (1899). Mathematical contributions to the theory of evolution. VI. Genetic (reproductive) selection: Inheritance of fertility in man, and of fecundity in thoroughbred racehorses. Philos. Trans. R. Soc. Lond. A 192, 257–330.

Poincare, H. (1905). Science and Hypothesis . London: Walter Scott Publishing.

Poincare, H. (1914). Science and Method . New York: Cosimo.

Popper, K. (2002). The Poverty of Historicism . (First English edition published in 1957). London: Routledge Classics.

Thurstone, L. L. (1935). The Vectors of Mind: Multiple-Factor Analysis for the Isolation of Primary Traits . Chicago: University of Chicago Press.

Thurstone, L. L. (1948). Psychological implications of factor analysis. Am. Psychol. 3, 402–408.

Toomela, A. (2008). Variables in psychology: a critique of quantitative psychology. Integr. Psychol. Behav. Sci. 42, 245–265.

Toomela, A. (2009). “How methodology became a toolbox - and how it escapes from that box,” in Dynamic Process Methodology in the Social and Developmental Sciences , eds J. Valsiner, P. Molenaar, M. Lyra and N. Chaudhary (New York: Springer), pp. 45–66.

Toomela, A. (2010a). Biological roots of foresight and mental time travel. Integr. Psychol. Behav. Sci. 44, 97–125.

Toomela, A. (2010b). “Modern mainstream psychology is the best? Noncumulative, historically blind, fragmented, atheoretical,” in Methodological Thinking in Psychology: 60 Years Gone Astray? eds A. Toomela and J. Valsiner (Charlotte, NC: Information Age Publishing), pp. 1–26.

Toomela, A. (in press-a). “Guesses on the future of cultural psychology: past, present, and past,” in Oxford Handbook of Culture and Psychology , ed. J. Valsiner (Oxford: Oxford University Press).

Toomela, A. (in press-b). “Methodology of idiographic science: limits of single-case studies and the role of typology,” in Yearbook of Idiographic Science, Vol. 2 , eds S. Salvatore, J. Valsiner, A. Gennaro, and J. B. Travers Simon (Rome: Firera & Liuzzo Group).

Townsend, J. T. (2008). Mathematical psychology: prospects for the 21st century: A guest editorial. J. Math. Psychol. 52, 269–280.

Trendler, G. (2009). Measurement theory, psychology and the revolution that cannot happen. Theory Psychol . 19, 579–599.

Tsvetkova, L. S. (1985). Neiropsikhologicheskaja reabilitatsija bol’nykh. Rech i intellektual’naja dejatel’nost . (Neuropsychological rehabilitation of a sick person. Speech and intellectual activity. In Russian.). Moscow: Izdatel’stvo Moskovskogo Universiteta.

Valsiner, J. (2005). Transformations and flexible forms: where qualitative psychology begins. Qual. Res. Psychol. 4, 39–57.

Veblen, O., and Whitehead, J. H. C. (1932). The Foundations of Differential Geometry. Cambridge Tracts in Mathematics and Mathematical Physics. Cambridge: Cambridge University Press, 29.

Veblen, O., and Young, J. W. (1910). Projective Geometry, Vol. I . New York: Ginn & Co.

Vygotsky, L. S. (1982a). “Istoricheski smysl psikhologicheskogo krizisa. Metodologicheskoje issledovanije,” (Historical meaning of the crisis in psychology. A methodological study. Originally written in 1927; First published in 1982) in Sobranije sochinenii. Tom 1. Voprosy teorii i istorii psikhologii , eds A. R. Luria and M. G. Jaroshevskii (Moscow: Pedagogika), pp. 291–436.

Vygotsky, L. S. (1982b). “Problema razvitija v strukturnoi psikhologii. Kriticheskoje issledovanije,” (Problem of development in structural psychology. A critical study. Originally published in 1934) in Sobranije sochinenii. Tom 1. Voprosy teorii i istorii psikhologii , eds A. R. Luria and M. G. Jaroshevskii (Moscow: Pedagogika), pp. 238–290.

Keywords: quantitative methodology, epistemology, causality, mathematics, constructive experiment

Citation: Toomela A (2010) Quantitative methods in psychology: inevitable and useless. Front. Psychology 1 :29. doi: 10.3389/fpsyg.2010.00029

Received: 23 February 2010; Paper pending published: 26 March 2010; Accepted: 25 June 2010; Published online: 30 July 2010

Copyright: © 2010 Toomela. This is an open-access article subject to an exclusive license agreement between the authors and the Frontiers Research Foundation, which permits unrestricted use, distribution, and reproduction in any medium, provided the original authors and source are credited.

*Correspondence: Aaro Toomela, Institute of Psychology, Tallinn University, Narva Road 25, Tallinn 10120, Estonia. e-mail: [email protected]

Disclaimer: All claims expressed in this article are solely those of the authors and do not necessarily represent those of their affiliated organizations, or those of the publisher, the editors and the reviewers. Any product that may be evaluated in this article or claim that may be made by its manufacturer is not guaranteed or endorsed by the publisher.

  • For Individuals
  • For Businesses
  • For Universities
  • For Governments
  • Online Degrees
  • Join for Free

American Psychological Association

Methods for Quantitative Research in Psychology

This course is part of Psychological Research Specialization

Mike Stadler, PhD

Instructor: Mike Stadler, PhD

Financial aid available

5,503 already enrolled

Coursera Plus

(127 reviews)

Recommended experience

Beginner level

High school degree or equivalent

Skills you'll gain

  • Research Methods
  • Research Design
  • General Statistics
  • Research And Design

Details to know

quantitative research on psychology

Add to your LinkedIn profile

21 assignments

See how employees at top companies are mastering in-demand skills

Placeholder

Build your subject-matter expertise

  • Learn new concepts from industry experts
  • Gain a foundational understanding of a subject or tool
  • Develop job-relevant skills with hands-on projects
  • Earn a shareable career certificate

Placeholder

Earn a career certificate

Add this credential to your LinkedIn profile, resume, or CV

Share it on social media and in your performance review

Placeholder

There are 9 modules in this course

This is primarily aimed at first- and second-year undergraduates interested in psychology, data analysis, and quantitative research methods along with high school students and professionals with similar interests.

Learn With PsycLearn Essentials

This module introduces you to your PsycLearn Essentials course. Find out what’s included in this course and how to navigate the modules and lessons. You’ll also learn valuable study tips for successful learning.

What's included

2 videos 7 readings

2 videos • Total 3 minutes

  • Get Started With PsycLearn Essentials! • 1 minute • Preview module
  • Metacognitive Checkpoints: Pause and Reflect on Your Learning • 1 minute

7 readings • Total 25 minutes

  • Welcome to PsycLearn Essentials • 3 minutes
  • Survey: How Did You Find Us? • 2 minutes
  • Requirements to Earn a Coursera Specialization Certificate • 2 minutes
  • What’s in Your Course • 5 minutes
  • Coursera Honor Code and Discussion Forum Policy • 3 minutes
  • Study Tips for Success in PsycLearn Essentials • 5 minutes
  • Additional Information • 5 minutes

Introduction to Methods for Quantitative Research in Psychology

1 video • total 2 minutes.

  • Welcome • 2 minutes • Preview module

Scientific Psychology

As the backbone of all science, the scientific method is a systematic approach to generating new knowledge by collecting data to answer a research question.

3 videos 7 readings 5 assignments

3 videos • Total 7 minutes

  • Eight Steps • 2 minutes • Preview module
  • The Scientific Literature • 2 minutes
  • Asking Research Questions • 1 minute

7 readings • Total 39 minutes

  • Using the Scientific Method • 7 minutes
  • The Nature of the Scientific Literature • 10 minutes
  • Reading the Scientific Literature • 8 minutes
  • Categories of Research Questions • 4 minutes
  • Key Takeaways: Scientific Psychology • 3 minutes
  • Key Terms: Scientific Psychology • 2 minutes
  • DOWNLOAD: Steps of the Scientific Method • 5 minutes

5 assignments • Total 32 minutes

  • Check Your Understanding: Steps in the Scientific Method • 5 minutes
  • Check Your Understanding: The Scientific Literature • 4 minutes
  • Check Your Understanding: Reading the Literature • 4 minutes
  • Check Your Understanding: Research Questions • 4 minutes
  • Mastering the Content: Scientific Psychology • 15 minutes

Variables and Measurement

Data are the currency of science, each data point a recording of a measurement on some scale, each point representing one value of a variable that uses that scale. This module explains how psychologists define and use variables and how we measure the values of those variables when we collect data.

3 videos 10 readings 5 assignments

  • Recognizing Variables • 2 minutes • Preview module
  • Measuring Variables • 2 minutes
  • Measurement Error • 2 minutes

10 readings • Total 34 minutes

  • Qualitative and Quantitative Variables • 4 minutes
  • Operational Definition of Variables • 4 minutes
  • Measuring Variables • 4 minutes
  • Replication • 2 minutes
  • Types of Replication • 2 minutes
  • Replication Crisis • 2 minutes
  • Key Takeaways: Variables and Measurement • 4 minutes
  • Key Vocabulary: Variables and Measurement • 2 minutes
  • DOWNLOAD: Operational Definitions • 5 minutes
  • DOWNLOAD: Reliability and Validity • 5 minutes

5 assignments • Total 33 minutes

  • Check Your Understanding: Qualitative and Quantitative Variables • 3 minutes
  • Check Your Understanding: Writing Operational Definitions • 8 minutes
  • Check Your Understanding: Recognizing and Measuring Variables • 4 minutes
  • Check Your Understanding: Variables and Measurement • 3 minutes
  • Mastering the Content: Variables and Measurement • 15 minutes

Research Designs

Different research questions and variables call for different research designs. This module describes the four basic research designs, how they work, and the kinds of data they yield.

5 videos 7 readings 6 assignments

5 videos • Total 10 minutes

  • Exploring Research Designs • 2 minutes • Preview module
  • Using Descriptive Research Methods • 1 minute
  • Using Correlational Research Methods • 2 minutes
  • Using Experimental Research Methods • 1 minute
  • Using Quasi-Experimental Research Methods • 2 minutes

7 readings • Total 31 minutes

  • Types of Research Designs • 2 minutes
  • The Nuts and Bolts of Descriptive Research • 3 minutes
  • The Nuts and Bolts of Correlational Research • 8 minutes
  • The Nuts and Bolts of Experimental Research • 8 minutes
  • The Nuts and Bolts of Quasi-Experimental Research • 3 minutes
  • Key Takeaways: Research Designs • 5 minutes
  • Key Vocabulary: Research Designs • 2 minutes

6 assignments • Total 31 minutes

  • Check Your Understanding: Types of Research Designs • 1 minute
  • Check Your Understanding: Descriptive Research • 3 minutes
  • Check Your Understanding: Correlational Research • 3 minutes
  • Check Your Understanding: Experimental Research • 6 minutes
  • Check Your Understanding: Quasi-Experimental Research • 3 minutes
  • Mastering the Content: Research Designs • 15 minutes

Interpretation: How Valid Are Our Conclusions?

Conclusions in science are always tentative—new evidence may force us to reconsider them, or a once valid-seeming proposition can be rendered invalid on a reconsideration of existing evidence. this module considers different forms of validity and how they are evaluated.

3 videos 5 readings 4 assignments

3 videos • Total 12 minutes

  • Construct Validity • 2 minutes • Preview module
  • Internal Validity • 8 minutes
  • External Validity • 1 minute

5 readings • Total 52 minutes

  • Constructs and Construct Validity • 10 minutes
  • Confounds and Internal Validity • 15 minutes
  • External Validity and Generalization • 15 minutes
  • Key Takeaways: Interpretation • 10 minutes
  • Key Vocabulary: Interpretation • 2 minutes

4 assignments • Total 45 minutes

  • Check Your Understanding: Construct Validity • 10 minutes
  • Check Your Understanding: Internal Validity • 10 minutes
  • Check Your Understanding: External Validity • 10 minutes
  • Mastering the Content: Interpretation • 15 minutes

3 readings • Total 28 minutes

  • Key Takeaways • 25 minutes
  • Key Vocabulary • 2 minutes
  • Course References • 1 minute

Course Assessment

1 video 1 assignment

1 video • Total 1 minute

  • Closing Remarks • 1 minute • Preview module

1 assignment • Total 25 minutes

  • Course Quiz: Methods for Quantitative Research in Psychology • 25 minutes

PsycLearn Essentials APA Student Resources

This module provides a variety of information and tools from the American Psychological Association (APA) that will help inspire you as you complete your coursework and plan your career goals. Get discounted access to Academic Writer, APA’s online tool for writing effectively, as well as valuable advice that will help you develop and strengthen your skillset for learning success and future employment. Additionally, explore resources on various psychological issues. This module also includes APA resources on scholarly research and writing; a list of sites providing valuable resources on diversity, equity, and inclusion in psychology education and in the professional community; resources on a career in psychology; and links to career opportunities at the APA. You can also view videos that offer tips on dealing with stress.

8 readings • Total 19 minutes

  • Introduction to APA Resources • 5 minutes
  • Student Resources • 2 minutes
  • APA Style®, Research, and Writing • 2 minutes
  • Students from Diverse Ethnic, Cultural, and Economic Backgrounds • 2 minutes
  • Psychology Help Center • 2 minutes
  • Psychology Careers • 2 minutes
  • Careers and Internships at APA • 2 minutes
  • Other APA Resources • 2 minutes

Instructor ratings

We asked all learners to give feedback on our instructors based on the quality of their teaching style.

quantitative research on psychology

APA is the leading scientific and professional organization representing psychology in the United States, with more than 146,000 researchers, educators, clinicians, consultants, and students as its members. Its mission is to promote the advancement, communication, and application of psychological science and knowledge to benefit society and improve lives.

Recommended if you're interested in Psychology

quantitative research on psychology

American Psychological Association

Statistics in Psychological Research

Psychological Research

Specialization

Ethics of Psychological Research

University of California, Davis

Quantitative Research

Why people choose Coursera for their career

Learner reviews

127 reviews

Showing 3 of 127

Reviewed on Sep 24, 2024

I'm a senior student in Vietnam and the course taught me more than my lecturer at school

Reviewed on Sep 25, 2024

I love how I was able to learn so much in such a simple format. This is great.

Reviewed on Jun 2, 2024

This is an easy-to-digest informative course. The contents are exactly what I learned in one semester at a university. I hope the financial aid and free upgrades are more accessible.

New to Psychology? Start here.

Open new doors with Coursera Plus

Unlimited access to 10,000+ world-class courses, hands-on projects, and job-ready certificate programs - all included in your subscription

Advance your career with an online degree

Earn a degree from world-class universities - 100% online

Join over 3,400 global companies that choose Coursera for Business

Upskill your employees to excel in the digital economy

Frequently asked questions

When will I have access to the lectures and assignments?

Access to lectures and assignments depends on your type of enrollment. If you take a course in audit mode, you will be able to see most course materials for free. To access graded assignments and to earn a Certificate, you will need to purchase the Certificate experience, during or after your audit. If you don't see the audit option:

The course may not offer an audit option. You can try a Free Trial instead, or apply for Financial Aid.

The course may offer 'Full Course, No Certificate' instead. This option lets you see all course materials, submit required assessments, and get a final grade. This also means that you will not be able to purchase a Certificate experience.

What will I get if I subscribe to this Specialization?

When you enroll in the course, you get access to all of the courses in the Specialization, and you earn a certificate when you complete the work. Your electronic Certificate will be added to your Accomplishments page - from there, you can print your Certificate or add it to your LinkedIn profile. If you only want to read and view the course content, you can audit the course for free.

What is the refund policy?

If you subscribed, you get a 7-day free trial during which you can cancel at no penalty. After that, we don’t give refunds, but you can cancel your subscription at any time. See our full refund policy.

Is financial aid available?

Yes. In select learning programs, you can apply for financial aid or a scholarship if you can’t afford the enrollment fee. If financial aid or a scholarship is available for your learning program selection, you’ll find a link to apply on the description page.

More questions
