Publications


Featured research published by Markus A. Wenzel.


Frontiers in Neuroscience | 2016

The Berlin Brain-Computer Interface: Progress Beyond Communication and Control

Benjamin Blankertz; Laura Acqualagna; Sven Dähne; Stefan Haufe; Matthias Schultze-Kraft; Irene Sturm; Marija Ušćumlić; Markus A. Wenzel; Gabriel Curio; Klaus-Robert Müller

The combined effect of fundamental results about neurocognitive processes and advancements in decoding mental states from ongoing brain signals has brought forth a whole range of potential neurotechnological applications. In this article, we review our developments in this area and put them into perspective. These examples cover a wide range of maturity levels with respect to their applicability. While we assume we are still a long way away from integrating Brain-Computer Interface (BCI) technology in general interaction with computers, or from implementing neurotechnological measures in safety-critical workplaces, results have already been obtained involving a BCI as a research tool. We also discuss the reasons why, in some of the prospective application domains, considerable effort is still required to make the systems ready to deal with the full complexity of the real world.


Frontiers in Neuroscience | 2016

Classification of Eye Fixation Related Potentials for Variable Stimulus Saliency

Markus A. Wenzel; Jan-Eike Golenia; Benjamin Blankertz

Objective: Electroencephalography (EEG) and eye tracking can possibly provide information about which items displayed on the screen are relevant for a person. Exploiting this implicit information promises to enhance various software applications. The specific problem addressed by the present study is that items shown in real applications are typically diverse. Accordingly, the saliency of information, which makes it possible to discriminate between relevant and irrelevant items, varies. As a consequence, recognition can happen in foveal or in peripheral vision, i.e., either before or after the saccade to the item. Hence, neural processes related to recognition are expected to occur with a variable latency with respect to the eye movements. The aim was to investigate whether relevance estimation based on EEG and eye tracking data is possible despite the aforementioned variability. Approach: Sixteen subjects performed a search task in which the target saliency was varied while the EEG was recorded and the unrestrained eye movements were tracked. Based on the acquired data, it was estimated which of the displayed items were targets and which were distractors in the search task. Results: Target prediction was possible even when the stimulus saliencies were mixed. Information contained in EEG and eye tracking data was found to be complementary, and neural signals were captured despite the unrestricted eye movements. The classification algorithm was able to cope with the experimentally induced variable timing of neural activity related to target recognition. Significance: It was demonstrated how EEG and eye tracking data can provide implicit information about the relevance of items on the screen for potential use in online applications.
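
The classification pipeline described here can be illustrated with a minimal sketch. The following is not the authors' code: synthetic fixation-locked EEG epochs and a hypothetical fixation-duration feature are concatenated (early fusion) and fed to a shrinkage-regularised linear classifier, a common choice in ERP-based BCI work. All dimensions, signal strengths, and variable names are made up for illustration.

```python
# Minimal sketch, not the study's code: target/distractor classification from
# fixation-locked EEG epochs fused with an eye-tracking feature. All data are
# synthetic and all dimensions hypothetical.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.model_selection import cross_val_score

rng = np.random.default_rng(0)
n_fix, n_channels, n_samples = 300, 32, 100      # hypothetical dimensions
labels = rng.integers(0, 2, n_fix)               # 1 = target, 0 = distractor

# Fixation-locked EEG epochs (fixations x channels x samples) with a crude
# P300-like deflection added to target epochs
eeg_epochs = rng.standard_normal((n_fix, n_channels, n_samples))
eeg_epochs[labels == 1, :, 40:70] += 0.3

# Eye-tracking feature: fixation duration, longer on targets (illustrative)
fix_duration = 150 + 80 * labels[:, None] + rng.gamma(2.0, 40.0, (n_fix, 1))

# Spatio-temporal ERP features: mean amplitude per channel in five windows
windows = np.array_split(np.arange(n_samples), 5)
erp_features = np.concatenate(
    [eeg_epochs[:, :, w].mean(axis=2) for w in windows], axis=1
)

# Early fusion: concatenate both modalities and classify with shrinkage LDA
X = np.hstack([erp_features, fix_duration])
clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
print("AUC:", cross_val_score(clf, X, labels, cv=5, scoring="roc_auc").mean())
```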


International Workshop on Symbiotic Interaction | 2015

Live Demonstrator of EEG and Eye-Tracking Input for Disambiguation of Image Search Results

Jan-Eike Golenia; Markus A. Wenzel; Benjamin Blankertz

When searching images on the web, users are often confronted with irrelevant results due to ambiguous queries. Consider a search term like 'Bill': the results will probably consist of multiple images depicting Bill Clinton, Bill Cosby and money bills. Given that the user is only interested in pictures of money bills, most of the results are irrelevant. We built a demo application that exploits EEG and eye-tracking data to disambiguate between two possible interpretations of an ambiguous search term. The demo exhibits the integration of sensor input into a modern web application.


International Workshop on Symbiotic Interaction | 2015

Developing a Symbiotic System for Scientific Information Seeking: The MindSee Project

Luciano Gamberini; Anna Spagnolli; Benjamin Blankertz; Samuel Kaski; Jonathan Freeman; Laura Acqualagna; Oswald Barral; Maura Bellio; Luca Chech; Manuel J. A. Eugster; Eva Ferrari; Paolo Negri; Valeria Orso; Patrik Pluchino; Filippo Minelle; Baris Serim; Markus A. Wenzel; Giulio Jacucci

This paper describes an approach for improving the current systems supporting the exploration and research of scientific literature, which generally adopt a query-based information-seeking paradigm. Our approach is to use a symbiotic system paradigm, exploiting central and peripheral physiological data along with eye-tracking data to adapt to users’ ongoing subjective relevance and satisfaction with search results. The system described, along with the interdisciplinary theoretical work underpinning it, could serve as a stepping stone for the development and diffusion of next-generation symbiotic systems, enabling a productive interdependence between humans and machines. After introducing the concept and evidence informing the development of symbiotic systems over a wide range of application domains, we describe the rationale of the MindSee project, emphasizing its BCI component and pinpointing the criteria around which users’ evaluations can gravitate. We conclude by summarizing the main contribution that MindSee is expected to make.


International Workshop on Symbiotic Interaction | 2015

Neural Responses to Abstract and Linguistic Stimuli with Variable Recognition Latency

Markus A. Wenzel; Carlos Moreira; Iulia-Alexandra Lungu; Mihail Bogojeski; Benjamin Blankertz

Electroencephalography (EEG) can provide information about which words or items are relevant for a computer user. This implicit information is potentially useful for applications that adapt to the current interest of the individual user. EEG data were used to estimate whether a linguistic or abstract stimulus belonged to a target category that a person was looking for. The complex stimuli went beyond the basic symbols commonly used in brain-computer interfacing and required a variable assessment duration or gaze shifts. Accordingly, neural processes related to recognition occurred with a variable latency after stimulus onset. Decisions involving not only shapes but also semantic linguistic information could be detected well from the EEG data. Discriminative information could be extracted better when the EEG data were aligned to the response rather than to the stimulus onset.
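
The final point, alignment to the response versus to the stimulus onset, amounts to cutting the same continuous signal at different reference times. The sketch below uses purely hypothetical event times and sampling parameters; only the alignment reference differs between the two sets of epochs.

```python
# Minimal sketch with hypothetical numbers: cut epochs from continuous EEG
# aligned either to stimulus onset or to the (variable-latency) response.
import numpy as np

fs = 100                                     # hypothetical sampling rate (Hz)
eeg = np.random.default_rng(0).standard_normal((64, 60 * fs))  # channels x samples
stim_onsets = np.array([5.0, 12.0, 20.0])    # stimulus onsets in seconds
responses   = np.array([5.9, 13.2, 20.6])    # responses; latency varies per trial

def epoch(signal, events_s, tmin=-0.2, tmax=0.8):
    """Return epochs (trials x channels x samples) around event times."""
    length = int((tmax - tmin) * fs)
    starts = ((events_s + tmin) * fs).astype(int)
    return np.stack([signal[:, s:s + length] for s in starts])

stim_locked = epoch(eeg, stim_onsets)   # recognition jitters within each epoch
resp_locked = epoch(eeg, responses)     # activity re-aligned to each decision
print(stim_locked.shape, resp_locked.shape)  # (3, 64, 100) each
```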


Journal of Neural Engineering | 2017

Real-time inference of word relevance from electroencephalogram and eye gaze

Markus A. Wenzel; Mihail Bogojeski; Benjamin Blankertz

OBJECTIVE Brain-computer interfaces can potentially map the subjective relevance of the visual surroundings, based on neural activity and eye movements, in order to infer the interest of a person in real-time. APPROACH Readers looked for words belonging to one out of five semantic categories, while a stream of words passed at different locations on the screen. Which words, and thus which semantic category, interested each reader was estimated in real-time based on the electroencephalogram (EEG) and the eye gaze. MAIN RESULTS Words that were subjectively relevant could be decoded online from the signals. The estimation resulted in an average rank of 1.62 for the category of interest among the five categories after a hundred words had been read. SIGNIFICANCE It was demonstrated that the interest of a reader can be inferred online from EEG and eye tracking signals, which can potentially be used in novel types of adaptive software that enrich the explicit interaction with implicit information about the interest of the user. The study is characterised by the following novelties. Interpretation with respect to the word meaning was necessary, in contrast to the usual practice in brain-computer interfacing, where stimulus recognition is sufficient. The typical counting task was avoided because it would not be sensible for implicit relevance detection. Several words were displayed at the same time, in contrast to the typical sequences of single stimuli. Neural activity was related to the words via eye tracking, without restrictions on the eye movements.
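
The average-rank measure reported above can be made concrete with a small sketch. The following uses invented per-word relevance scores and a hypothetical noise model; it only shows how evidence per category would be accumulated and the rank of the category of interest read off.

```python
# Minimal sketch of the rank measure, with invented per-word relevance scores:
# evidence is accumulated per semantic category and the rank of the true
# category of interest is read off the sorted totals.
import numpy as np

rng = np.random.default_rng(1)
n_words, n_categories, target = 100, 5, 2        # hypothetical setup
category_of_word = rng.integers(0, n_categories, n_words)

# Per-word relevance scores, e.g. classifier outputs from EEG and eye gaze;
# words of the target category score slightly higher (illustrative noise model)
scores = rng.standard_normal(n_words) + 0.5 * (category_of_word == target)

evidence = np.zeros(n_categories)
np.add.at(evidence, category_of_word, scores)    # accumulate per category

order = np.argsort(-evidence)                    # rank 1 = most evidence
rank_of_target = int(np.where(order == target)[0][0]) + 1
print("rank of the category of interest:", rank_of_target)
```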


Journal of the Association for Information Science and Technology | 2018

Integrating Neurophysiological Relevance Feedback in Intent Modeling for Information Retrieval

Giulio Jacucci; Oswald Barral; Pedram Daee; Markus A. Wenzel; Baris Serim; Tuukka Ruotsalo; Patrik Pluchino; Jonathan Freeman; Luciano Gamberini; Samuel Kaski; Benjamin Blankertz

The use of implicit relevance feedback from neurophysiology could deliver effortless information retrieval. However, both computing neurophysiological responses and retrieving documents are characterized by uncertainty because of noisy signals and incomplete or inconsistent representations of the data. We present the first-of-its-kind, fully integrated information retrieval system that makes use of online implicit relevance feedback generated from brain activity, as measured through electroencephalography (EEG), and eye movements. The findings of the evaluation experiment (N = 16) show that we are able to compute online neurophysiology-based relevance feedback with performance significantly better than chance in complex data domains and realistic search tasks. We contribute by demonstrating how to integrate this inherently noisy implicit relevance feedback, combined with scarce explicit feedback, into interactive intent modeling. Although experimental measures of task performance did not allow us to demonstrate how the classification outcomes translated into search task performance, the experiment proved that our approach is able to generate relevance feedback from brain signals and eye movements in a realistic scenario, thus providing promising implications for future work in neuroadaptive information retrieval (IR).
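
One simple way such noisy implicit feedback could be merged with scarce explicit feedback, not necessarily the paper's model, is confidence weighting. The sketch below uses invented scores, weights, and document indices purely to make the idea concrete.

```python
# Minimal sketch, not the paper's model: merge scarce explicit judgements with
# noisy implicit neurophysiological scores by simple confidence weighting.
# All scores, weights, and indices are invented for illustration.
import numpy as np

rng = np.random.default_rng(4)
n_docs = 8
implicit = rng.uniform(0, 1, n_docs)     # noisy EEG/eye-based relevance in [0, 1]
explicit = np.full(n_docs, np.nan)       # explicit feedback is scarce
explicit[[1, 5]] = [1.0, 0.0]            # the user rated only two documents

w_exp, w_imp = 0.8, 0.2                  # trust explicit feedback more
combined = np.where(
    np.isnan(explicit),
    implicit,                            # implicit evidence only
    w_exp * explicit + w_imp * implicit, # weighted blend where both exist
)
print(np.round(combined, 2))
```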


Journal of Neural Engineering | 2017

Implicit relevance feedback from electroencephalography and eye tracking in image search

Jan-Eike Golenia; Markus A. Wenzel; Mihail Bogojeski; Benjamin Blankertz

OBJECTIVE Methods from brain-computer interfacing (BCI) open direct access to the mental processes of computer users, which offers particular benefits in comparison to standard methods for inferring user-related information. The signals can be recorded unobtrusively in the background, which circumvents the time-consuming and distracting need for users to give explicit feedback to questions concerning their individual interest. The obtained implicit information makes it possible to create dynamic user interest profiles in real time that can be taken into account by novel types of adaptive, personalised software. In the present study, the potential of implicit relevance feedback from electroencephalography (EEG) and eye tracking was explored with a demonstrator application that simulated an image search engine. APPROACH The participants of the study queried for ambiguous search terms, having in mind one of the two possible interpretations of the respective term. Subsequently, they viewed different images arranged in a grid that were related to the query. The ambiguity of the underspecified search term was resolved with implicit information present in the recorded signals. For this purpose, feature vectors were extracted from the signals and used by multivariate classifiers that estimated the intended interpretation of the ambiguous query. MAIN RESULT The intended interpretation was inferred correctly from a combination of EEG and eye tracking signals in 86% of the cases on average. Information provided by the two measurement modalities turned out to be complementary. SIGNIFICANCE It was demonstrated that BCI methods can extract implicit user-related information in a setting of human-computer interaction. Novelties of the study are the implicit online feedback from EEG and eye tracking, the approximation of a realistic use case in a simulation, and the presentation of a large set of photographs that had to be interpreted with respect to their content.
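
A minimal late-fusion sketch of the kind of estimation described, with entirely synthetic features and a hypothetical class structure: one classifier per modality, probabilities averaged per image, and the interpretation with the higher mean fused relevance selected. This is an illustration, not the study's implementation.

```python
# Minimal late-fusion sketch, not the study's implementation: one classifier
# per modality, probabilities averaged per image, and the query interpretation
# with the higher mean fused relevance selected. Data are entirely synthetic.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(2)
n_images = 40                                    # hypothetical result grid
interpretation = rng.integers(0, 2, n_images)    # which sense each image shows
relevant = (interpretation == 1).astype(int)     # suppose sense 1 was intended

# Hypothetical per-image features from the two modalities (weak class signal)
eeg_feats = rng.standard_normal((n_images, 10)) + 0.4 * relevant[:, None]
eye_feats = rng.standard_normal((n_images, 4)) + 0.4 * relevant[:, None]

# Fit one classifier per modality (trained and applied on the same synthetic
# data purely for brevity; a real system would use held-out calibration data)
eeg_clf = LogisticRegression().fit(eeg_feats, relevant)
eye_clf = LogisticRegression().fit(eye_feats, relevant)

# Late fusion: average the two probability estimates per image
p = 0.5 * (eeg_clf.predict_proba(eeg_feats)[:, 1]
           + eye_clf.predict_proba(eye_feats)[:, 1])

# Decide between the two interpretations by their mean fused relevance
means = [p[interpretation == k].mean() for k in (0, 1)]
print("estimated intended interpretation:", int(np.argmax(means)))
```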


PLOS ONE | 2016

Is Neural Activity Detected by ERP-Based Brain-Computer Interfaces Task Specific?

Markus A. Wenzel; Inês Almeida; Benjamin Blankertz

Objective Brain-computer interfaces (BCIs) that are based on event-related potentials (ERPs) can estimate to which stimulus a user pays particular attention. In typical BCIs, the user silently counts the selected stimulus (which is repeatedly presented among other stimuli) in order to focus the attention. The stimulus of interest is then inferred from the electroencephalogram (EEG). Detecting attention allocation implicitly could also be beneficial for human-computer interaction (HCI), because it would allow software to adapt to the user's interest. However, a counting task would be inappropriate for the envisaged implicit application in HCI. Therefore, the question was addressed whether the detectable neural activity is specific to silent counting, or whether it can also be evoked by other tasks that direct the attention to certain stimuli. Approach Thirteen people performed a silent counting, an arithmetic and a memory task. The tasks required the subjects to pay particular attention to target stimuli of a random color. The stimulus presentation was the same in all three tasks, which allowed a direct comparison of the experimental conditions. Results Classifiers that were trained to detect the targets in one task, according to patterns present in the EEG signal, could detect targets in all other tasks (irrespective of some task-related differences in the EEG). Significance The neural activity detected by the classifiers is not strictly task specific but generalizes over tasks, and is presumably a result of the attention allocation or of the augmented workload. The results may hold promise for the transfer of classification algorithms from BCI research to implicit relevance detection in HCI.
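
The cross-task transfer test at the core of this study can be sketched as follows, with synthetic ERP features in place of real EEG: a target-vs-nontarget classifier is fitted on data from one (hypothetical) task and evaluated on the others. All functions and numbers are invented for illustration.

```python
# Minimal sketch of the cross-task transfer test with synthetic ERP features:
# fit a target-vs-nontarget classifier on one task and evaluate it on epochs
# from the other tasks. All names and numbers are invented for illustration.
import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis
from sklearn.metrics import roc_auc_score

rng = np.random.default_rng(3)

def make_task(n=200, d=30, shift=0.0):
    """Synthetic ERP features; targets share a common pattern across tasks."""
    y = rng.integers(0, 2, n)
    X = rng.standard_normal((n, d)) + 0.6 * y[:, None] + shift
    return X, y

X_count, y_count = make_task()             # silent counting (training task)
X_arith, y_arith = make_task(shift=0.2)    # arithmetic task, slightly shifted
X_mem,   y_mem   = make_task(shift=-0.2)   # memory task

clf = LinearDiscriminantAnalysis(solver="lsqr", shrinkage="auto")
clf.fit(X_count, y_count)
for name, (X, y) in [("arithmetic", (X_arith, y_arith)),
                     ("memory", (X_mem, y_mem))]:
    auc = roc_auc_score(y, clf.decision_function(X))
    print(f"counting -> {name}: AUC = {auc:.2f}")
```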


Collaboration


Dive into Markus A. Wenzel's collaborations.

Top Co-Authors

Benjamin Blankertz (Technical University of Berlin)
Laura Acqualagna (Technical University of Berlin)
Mihail Bogojeski (Technical University of Berlin)
Baris Serim (University of Helsinki)