Dynamic vs. Static Markers of Human Presence

The distinction between static and dynamic human identifiers is foundational for modern authentication. Static identifiers—such as fingerprints or iris scans—are reliable but vulnerable to replication. A photograph, a lifted fingerprint, or a synthetic voice sample can be enough to fool systems built on static features. In contrast, dynamic psychophysiological signals are tied to ongoing biological processes. They fluctuate moment to moment, driven by autonomic and somatic nervous system activity, making them both rich in information and resistant to simulation.

Psychophysiological signals are not only real-time but also inherently interactive. Heart rate accelerates and decelerates in response to cognitive load (Lang, 2009), facial muscles contract reflexively during emotional experiences (Cacioppo et al., 1986), and pupils dilate in response to both light and information processing demands (Beatty & Lucero-Wagoner, 2000). By analyzing these signals in combination, it becomes possible to validate that a digital subject is not only present but also alive and responding in real time.

Core Psychophysiological Signals

Heart Rate and Heart Rate Variability

Heart rate (HR) and heart rate variability (HRV) are foundational indicators of autonomic nervous system function. Research in media psychophysiology has shown that HR deceleration is a reliable marker of attention allocation to mediated messages (Fisher, Huskey, Keene, & Weber, 2018). HRV, measured as variation in interbeat intervals, reflects balance between sympathetic and parasympathetic systems and is sensitive to both stress and engagement (Keene, Bolls, Clayton, & Berke, 2017). Remote photoplethysmography (rPPG) now enables measurement of HR from standard webcams by detecting subtle color changes in the skin (Sun & Thakor, 2016).
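As a toy illustration of the rPPG idea, the sketch below estimates heart rate from a per-frame mean green-channel trace by picking the dominant frequency in the physiologically plausible band. This is a minimal sketch under simplifying assumptions; real pipelines add face tracking, detrending, and motion compensation, and the synthetic trace here stands in for actual webcam frames.

```python
import math

def estimate_hr_bpm(green_means, fps, lo_bpm=40, hi_bpm=180):
    """Estimate heart rate from a per-frame mean green-channel trace.

    Minimal rPPG sketch: remove the DC component, then pick the dominant
    frequency within the physiologically plausible band via a brute-force
    discrete Fourier transform evaluated at 1-bpm steps.
    """
    n = len(green_means)
    mean = sum(green_means) / n
    x = [v - mean for v in green_means]  # remove DC (average skin tone)
    best_bpm, best_power = lo_bpm, -1.0
    for bpm in range(lo_bpm, hi_bpm + 1):
        f = bpm / 60.0  # candidate frequency in Hz
        re = sum(x[k] * math.cos(2 * math.pi * f * k / fps) for k in range(n))
        im = sum(x[k] * math.sin(2 * math.pi * f * k / fps) for k in range(n))
        power = re * re + im * im
        if power > best_power:
            best_bpm, best_power = bpm, power
    return best_bpm

# Synthetic 10-second trace at 30 fps with a weak 72-bpm pulse component
# riding on a constant brightness level.
fps, true_bpm = 30, 72
trace = [100 + 0.5 * math.sin(2 * math.pi * (true_bpm / 60.0) * k / fps)
         for k in range(fps * 10)]
est = estimate_hr_bpm(trace, fps)
```

In practice the pulse component is orders of magnitude smaller than in this synthetic trace, which is why lighting and motion artifacts (discussed below) matter so much.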

Facial Electromyography and Microexpressions

Facial EMG measures electrical activity in facial muscles, capturing even subtle contractions associated with affective states. Laboratory studies demonstrate that EMG activity over the corrugator supercilii (brow) region indexes negative affect, while activity over the zygomaticus major (cheek) indexes positive affect (Cacioppo et al., 1986). Even when expressions are suppressed, these micro-level activations persist, offering a robust measure of underlying affect. Computer vision systems can approximate EMG readings by tracking fine-grained facial landmark movements, enabling scalable, non-contact detection.
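The landmark-based approximation can be made concrete with a sketch like the following. The landmark keys and distance-based proxies are hypothetical simplifications (production systems use dense landmark models and learned mappings); they are shown only to illustrate the corrugator/zygomaticus logic.

```python
import math

def _dist(a, b):
    return math.hypot(a[0] - b[0], a[1] - b[1])

def affect_proxies(landmarks, baseline):
    """Approximate corrugator/zygomaticus activity from facial landmarks.

    Each proxy is the relative shortening of a landmark distance against a
    neutral baseline frame. Landmark keys ('inner_brow_l', 'mouth_corner_l',
    etc.) are illustrative, not a real library's schema.
    """
    # Corrugator (brow) proxy: inner-brow points draw together when frowning.
    brow_now = _dist(landmarks['inner_brow_l'], landmarks['inner_brow_r'])
    brow_base = _dist(baseline['inner_brow_l'], baseline['inner_brow_r'])
    corrugator = max(0.0, (brow_base - brow_now) / brow_base)

    # Zygomaticus (cheek) proxy: mouth corner pulls toward the eye when smiling.
    cheek_now = _dist(landmarks['mouth_corner_l'], landmarks['eye_outer_l'])
    cheek_base = _dist(baseline['mouth_corner_l'], baseline['eye_outer_l'])
    zygomaticus = max(0.0, (cheek_base - cheek_now) / cheek_base)
    return corrugator, zygomaticus

neutral = {'inner_brow_l': (40, 30), 'inner_brow_r': (60, 30),
           'mouth_corner_l': (42, 70), 'eye_outer_l': (30, 32)}
smiling = {'inner_brow_l': (40, 30), 'inner_brow_r': (60, 30),
           'mouth_corner_l': (40, 64), 'eye_outer_l': (30, 32)}
corr, zygo = affect_proxies(smiling, neutral)
```

Normalizing against a per-user neutral baseline is the key design choice here: it is relative movement, not absolute geometry, that indexes muscle activation.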

Pupillometry and Ocular Dynamics

The pupil responds rapidly to both external and internal stimuli. Beyond simple light reflexes, pupil dilation correlates with cognitive load and emotional arousal (Beatty & Lucero-Wagoner, 2000). In addition, saccades and blink rates provide information about attentional processes and fatigue. Eye-tracking and webcam-based pupillometry now allow these measures to be incorporated into digital verification pipelines.
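A common pupillometry reduction, baseline-corrected dilation at stimulus onset, can be sketched as follows. Parameter names and the synthetic trace are illustrative; real analyses also handle blinks and luminance confounds.

```python
def pupil_response(diams, fps, onset_s, baseline_s=0.5, window_s=2.0):
    """Baseline-corrected pupil dilation at stimulus onset.

    Returns the mean pupil diameter in a post-onset window minus the mean
    over a short pre-onset baseline: a standard subtractive correction
    that isolates the stimulus-evoked response from slow drift.
    """
    onset = int(onset_s * fps)
    base = diams[onset - int(baseline_s * fps):onset]
    post = diams[onset:onset + int(window_s * fps)]
    return sum(post) / len(post) - sum(base) / len(base)

# Toy trace at 60 Hz: steady 3.0 mm pupil, then a 0.4 mm dilation
# ramping up over half a second after a stimulus at t = 1.0 s.
fps = 60
diams = [3.0] * fps + [3.0 + min(0.4, 0.4 * k / 30) for k in range(2 * fps)]
resp = pupil_response(diams, fps, onset_s=1.0)
```

A positive value indicates dilation relative to baseline, the signature associated with cognitive load and arousal in the literature cited above.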

Additional Signals: Multimodal Redundancy

Other psychophysiological signals, such as electrodermal activity (EDA, measured via skin conductance), respiration, and postural sway, can also be recorded under controlled conditions, but HR, facial EMG, and pupillometry are the most scalable and practical measures for remote, webcam-based systems. Together, they provide a robust triad of indicators for verifying human presence.

Evidence of Reliability and Validity

Each of these measures has a long history in laboratory research. Heart rate and HRV have been extensively validated as indicators of attentional engagement and stress (Allen, 2007; Fisher et al., 2018). Facial EMG has reliably distinguished positive from negative affect in experimental psychology for decades (Cacioppo et al., 1986). Pupillometry has been used to study both low-level sensory processes and high-level cognition, with robust evidence linking dilation to working memory load (Beatty & Lucero-Wagoner, 2000).

Reliability across studies is strong, though each signal has limitations. rPPG can be affected by lighting and movement; EMG-like detection via computer vision must contend with occlusion; and pupillometry can be sensitive to screen brightness. These limitations underscore the importance of multimodal integration, where one signal compensates for weaknesses in another.

Why These Signals Resist AI Spoofing

The resilience of psychophysiological signals lies in their dynamic, multi-timescale properties. A synthetic video may replicate the outward appearance of a heartbeat or a blink, but coordinating multiple signals at once—heartbeat frequency, microexpression timing, pupil dilation in response to stimulus onset—is an extraordinarily difficult task for generative systems. The signals not only have to look plausible individually but must also remain coherent with each other and with environmental stimuli.

For example, a deepfake might simulate a person smiling, but unless subtle corrugator and zygomaticus activations are synchronized properly, the smile will not carry the physiological authenticity of genuine affect. Similarly, a replayed video cannot synchronize pupil constriction with real-time changes in ambient lighting. These coherence constraints make psychophysiological signals exceptionally hard to forge convincingly.
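The lighting-coherence constraint can be sketched as a simple correlation test. This is an illustrative simplification, not an actual production detector: it checks whether pupil diameter tracks ambient luminance in the physiologically expected direction.

```python
import math

def pearson(x, y):
    """Pearson correlation coefficient between two equal-length sequences."""
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

def light_pupil_coherence(luminance, pupil_diam):
    """Live pupils constrict as ambient light rises, so luminance and pupil
    diameter should correlate negatively; return the negated Pearson r so
    that higher values mean more physiologically coherent."""
    return -pearson(luminance, pupil_diam)

# Live viewer: pupil shrinks when the screen brightens.
lum = [0.2, 0.2, 0.8, 0.8, 0.3, 0.3]
live = [4.0, 4.1, 3.1, 3.0, 3.8, 3.9]
replay = [3.5, 3.6, 3.6, 3.5, 3.6, 3.5]  # replayed clip ignores the lighting
live_score = light_pupil_coherence(lum, live)
replay_score = light_pupil_coherence(lum, replay)
```

The live trace scores near 1 because constriction mirrors the luminance changes; the replayed trace scores near 0 because its pupil dynamics are decoupled from the viewer's actual environment.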

The Moveris Measurement Architecture

Moveris leverages webcam-based technologies to measure HR, facial EMG-like signals, and pupillometry simultaneously. Computer vision algorithms track facial landmarks and subtle skin color variations. Temporal filtering stabilizes the raw signals against noise, while multimodal fusion algorithms integrate them into a unified liveness profile.

The system outputs a human likelihood score that incorporates both the presence of individual signals and their coherence across time and stimulus conditions. For example, a user’s HR deceleration may be analyzed in tandem with pupil dilation at stimulus onset, while microexpression activity provides additional confirmation of authentic affective engagement.
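One minimal way to sketch such a score, assuming per-signal presence confidences and pairwise coherence scores in [0, 1]. This is a hypothetical simplification of the idea described above, not the production fusion: the mean presence is gated by the weakest coherence link, so a single incoherent pairing drags the whole score down.

```python
def human_likelihood(presence, coherence):
    """Fuse per-signal presence confidences with cross-signal coherence
    scores (all in [0, 1]) into a single liveness score.

    Illustrative rule: average how strongly each signal is detected, then
    gate by the weakest cross-signal coherence, since a spoof that nails
    individual signals but not their coordination should still fail.
    """
    mean_presence = sum(presence.values()) / len(presence)
    return mean_presence * min(coherence.values())

genuine = human_likelihood(
    presence={'hr': 0.92, 'emg': 0.88, 'pupil': 0.90},
    coherence={'hr~pupil': 0.85, 'pupil~light': 0.90},
)
spoofed = human_likelihood(
    presence={'hr': 0.90, 'emg': 0.85, 'pupil': 0.88},
    coherence={'hr~pupil': 0.15, 'pupil~light': 0.10},  # signals don't line up
)
```

Note that the spoofed case has strong individual signals yet scores low, mirroring the argument above that coherence, not mere presence, is what resists forgery.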

Implications for Research and Industry

For researchers, multimodal psychophysiological measurement provides a scalable, non-invasive way to study attention, emotion, and engagement outside of the lab. For industry, it offers a solution to the crisis of digital trust by making liveness detection both robust and user-friendly.

In fintech, psychophysiology adds resilience to KYC/AML workflows without adding friction for end users. In media verification, it provides a basis for authenticating recorded and live content. In UX research, it creates richer insights into how real users engage with digital environments. By grounding trust in the living signals of the human body, psychophysiology bridges the gap between laboratory research and applied security.

Conclusion

Psychophysiological signals—heart rate, facial muscle activity, and pupillary responses—are dynamic signatures of human life. They resist spoofing not because they are impossible to imitate superficially, but because they are deeply embedded in living biological systems that respond coherently to environmental input. By integrating these signals into a multimodal framework, Moveris operationalizes decades of psychophysiological research into scalable, real-time verification. The result is a robust foundation for distinguishing authentic human presence from synthetic imitation in the digital age.

References (Selected for this Paper)

  • Allen, J. (2007). Photoplethysmography and its application in clinical physiological measurement. Physiological Measurement, 28(3), R1–R39.
  • Beatty, J., & Lucero-Wagoner, B. (2000). The pupillary system. In J. T. Cacioppo, L. G. Tassinary, & G. G. Berntson (Eds.), Handbook of psychophysiology (2nd ed., pp. 142–162). Cambridge University Press.
  • Cacioppo, J. T., Petty, R. E., Losch, M. E., & Kim, H. S. (1986). Electromyographic activity over facial muscle regions can differentiate the valence and intensity of affective reactions. Journal of Personality and Social Psychology, 50(2), 260–268.
  • Cacioppo, J. T., Tassinary, L. G., & Berntson, G. G. (Eds.). (2017). Handbook of psychophysiology (4th ed.). Cambridge University Press.
  • Fisher, J. T., Huskey, R., Keene, J. R., & Weber, R. (2018). The limited capacity model of motivated mediated message processing: Taking stock of the past. Annals of the International Communication Association, 42(4), 270–281.
  • Keene, J. R., Bolls, P. D., Clayton, R. B., & Berke, C. K. (2017). On the use of beats-per-minute and interbeat interval in the analysis of cardiac responses to mediated messages. Communication Research Reports, 34(3), 265–274.
  • Sun, Y., & Thakor, N. (2016). Photoplethysmography revisited: From contact to noncontact, from point to imaging. IEEE Transactions on Biomedical Engineering, 63(3), 463–477.
