| Catherine G. Wolf | |
|---|---|
| Born | May 25, 1947 |
| Died | February 7, 2018 (aged 70) |
| Nationality | American |
| Alma mater | Tufts University, Brown University |
| Known for | Human-computer interaction, ALS research |
| Awards | Tufts University Distinguished Service Award |
| Scientific career | |
| Fields | Psychology, Computer science |
| Institutions | IBM |
| Doctoral advisor | Peter D. Eimas |
Catherine Gody Wolf (May 25, 1947 – February 7, 2018) was an American psychologist and expert in human-computer interaction. She was the author of more than 100 research articles and held six patents in the areas of human-computer interaction, artificial intelligence, and collaboration. [1] Wolf was known for her work at IBM's Thomas J. Watson Research Center in Yorktown Heights, NY, where she was a staff researcher for 19 years. [2]
In the late 1990s, Wolf was diagnosed with amyotrophic lateral sclerosis (ALS), better known as Lou Gehrig's disease. Despite rapid physical deterioration, Wolf was still able to communicate with the world via electronic sensory equipment, including a sophisticated brain-computer interface. [3] Remarkably, with almost no voluntary physical functions remaining, she published novel research into the fine-scale abilities of ALS patients. [4]
Wolf completed her undergraduate degree at Tufts University, where she majored in psychology. In 1967 she met her future husband, Joel Wolf, then a student at the Massachusetts Institute of Technology. Both continued on to graduate school at Brown University, where Catherine focused her research on the way that children perceive speech. [5] After Brown, Wolf completed additional postgraduate work at MIT before entering the workforce as a full-time researcher. [6]
Wolf's career focused on human-computer interaction. In 1977, she joined Bell Labs, where she became a human factors manager. Eight years later, she began her tenure as a research psychologist at the Thomas J. Watson Research Center, IBM's research headquarters. During her time at IBM, Wolf was particularly interested in learning how people interact with software in the workplace. In response to behaviors she observed, she designed and tested new interface systems in which speech and handwritten words could be converted to digital information. Among other technologies, Wolf worked on a system known as the Conversation Machine, which was the precursor of today's phone banking systems: users could access their accounts by conversing with an automated voice system. [2] She also published papers on the sharing of information in the workplace and search in the context of technical support. [7]
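The conversational flow of such a system can be illustrated with a minimal slot-filling dialog loop. This is a hedged sketch of the general technique, not a description of IBM's actual Conversation Machine; all intents, prompts, and account data below are invented for illustration.

```python
# Minimal sketch of a slot-filling banking dialog, in the spirit of
# conversational phone-banking systems. All intents and data are invented.

PROMPTS = {
    "intent": "Would you like your balance or recent transactions?",
    "account": "Which account: checking or savings?",
}

ACCOUNTS = {"checking": 1520.75, "savings": 8300.00}  # hypothetical data

def parse(utterance, slot):
    """Crude keyword spotting, standing in for speech recognition."""
    words = utterance.lower().split()
    if slot == "intent":
        if "balance" in words:
            return "balance"
        if "transactions" in words:
            return "transactions"
    if slot == "account":
        for name in ACCOUNTS:
            if name in words:
                return name
    return None  # not understood; the caller will be re-prompted

def dialog(utterances):
    """Fill the 'intent' and 'account' slots from a list of user turns."""
    slots = {"intent": None, "account": None}
    transcript = []
    turns = iter(utterances)
    for slot in slots:
        while slots[slot] is None:
            transcript.append(PROMPTS[slot])
            slots[slot] = parse(next(turns), slot)
    if slots["intent"] == "balance":
        balance = ACCOUNTS[slots["account"]]
        transcript.append(f"Your {slots['account']} balance is ${balance:.2f}.")
    else:
        transcript.append(f"Here are your recent {slots['account']} transactions.")
    return transcript

print(dialog(["my balance please", "checking"])[-1])
# -> Your checking balance is $1520.75.
```

The key design idea is that the system keeps prompting until each required slot is filled, so a misrecognized utterance simply triggers a re-prompt rather than a failure.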
In all, Wolf held six patents and authored more than 100 research articles. In 1997, she was diagnosed with ALS, which eventually prevented her from performing her normal work duties. Wolf went on long-term disability leave in 2004 [2] and officially retired from IBM in 2012. Even after losing almost all muscle function, however, Wolf still contributed to research on human-computer interaction. She also did work with the Wadsworth Center, part of the New York State Department of Health, as a tester of various systems. In 2009, Wolf published a research article extending a scale commonly used to assess the progression of ALS (known as the ALSFRS-R) to more finely assess the abilities of people with advanced ALS. This paper added significantly to the understanding of what ALS patients might be capable of even after most of their muscle function has been lost. [4]
Wolf first felt symptoms of ALS in 1996, when her foot wouldn't flex properly. She was positively diagnosed with ALS a year later. [8]
In 2001, Wolf decided to have a tracheotomy, a surgical procedure that created a permanent opening in her neck for a breathing tube, allowing her to breathe without using her nose or mouth.
Wolf eventually lost the use of all of her muscles except a few in her face and eyes. To communicate, she used a computer system which translated movement of her eyebrows into text. She was adept at communicating in this way, even though she could only "type" out one or two words a minute. She wrote poetry, [9] sent emails, conducted occasional interviews, [10] and wrote articles for such outlets as Neurology Now. [11] [12] She was even able to stay active on Facebook. [2] [13]
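Single-switch communication systems of this kind commonly use "scanning": the interface highlights rows and then columns of a letter grid, and one muscle movement acts as the select button. The sketch below illustrates that general technique; the grid layout and timing model are assumptions, not the specifics of Wolf's actual device.

```python
# Sketch of row/column scanning with a single binary switch (e.g., an
# eyebrow-movement sensor). The grid and scan order are illustrative.

GRID = [
    "ABCDEF",
    "GHIJKL",
    "MNOPQR",
    "STUVWX",
    "YZ_.,?",
]

def select(switch_events):
    """Pick one character from two switch activations.

    switch_events: (row_steps, col_steps) -- how many highlight steps
    the user waits before activating the switch each time.
    """
    row_steps, col_steps = switch_events
    row = GRID[row_steps % len(GRID)]   # first activation chooses a row
    return row[col_steps % len(row)]    # second activation chooses a column

def spell(events):
    """Translate a sequence of (row, col) switch timings into text."""
    return "".join(select(e) for e in events)

# 'H' is row 1, column 1; 'I' is row 1, column 2.
print(spell([(1, 1), (1, 2)]))  # -> HI
```

Because each character costs two timed activations, throughput is inherently slow, which is consistent with a rate of only one or two words per minute.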
Concurrent with the loss of her muscle control, Wolf became increasingly expert in brain-computer interface (BCI) systems, [14] and helped other researchers learn more about how such systems can work. She was aware that she might lose the ability to communicate with her eyebrows, so she worked with scientists on an EEG-based interface system for herself in case that day came. EEG (electroencephalography) measures voltage fluctuations along the scalp that result from neuron activity in the brain. With such a setup in place, Wolf hoped to communicate words simply by focusing her thoughts on one letter at a time. Wolf provided researchers with important feedback about BCIs, since the systems did not work flawlessly. [3]
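A common EEG speller design built on this "focus on one letter" principle is the P300-style matrix speller, used here purely as an illustrative stand-in (the source does not specify which paradigm Wolf's system used): letters flash repeatedly, the responses to each letter's flashes are averaged, and the letter whose flashes evoke the largest response is selected.

```python
# Sketch of P300-style letter selection from synthetic EEG epochs.
# The evoked-response timing and amplitudes are illustrative assumptions.
import numpy as np

rng = np.random.default_rng(0)
LETTERS = list("ABCDEF")
TARGET = "D"        # the letter the user is focusing on
N_FLASHES = 20      # flash repetitions per letter
N_SAMPLES = 50      # EEG samples per post-flash epoch

def simulate_epoch(letter):
    """Synthetic single-trial EEG: noise, plus a bump for the target."""
    epoch = rng.normal(0.0, 1.0, N_SAMPLES)
    if letter == TARGET:
        epoch[20:30] += 1.5  # evoked response ~300 ms after the flash
    return epoch

def decode(epochs_by_letter):
    """Average each letter's epochs and pick the strongest response."""
    scores = {
        letter: np.mean(epochs, axis=0)[20:30].mean()
        for letter, epochs in epochs_by_letter.items()
    }
    return max(scores, key=scores.get)

epochs = {l: [simulate_epoch(l) for _ in range(N_FLASHES)] for l in LETTERS}
print(decode(epochs))  # averaging across flashes suppresses the noise
```

Single flashes are buried in noise, which is why the design relies on averaging many repetitions; this also explains why such spellers are slow and why tester feedback like Wolf's mattered.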
Wolf was married to Joel Wolf, a mathematician at IBM's TJ Watson Research Center. They had two daughters, Laura and Erika, and several grandchildren. [2]
On April 26, 2003, Wolf was honored with a Distinguished Service Award from her alma mater, Tufts University, for "the ideal of citizenship and public service." [15]
On February 7, 2018, Wolf died at her home in Katonah, New York at the age of 70. [13] [16]
Neuroscience is the scientific study of the nervous system, its functions and disorders. It is a multidisciplinary science that combines physiology, anatomy, molecular biology, developmental biology, cytology, psychology, physics, computer science, chemistry, medicine, statistics, and mathematical modeling to understand the fundamental and emergent properties of neurons, glia and neural circuits. The understanding of the biological basis of learning, memory, behavior, perception, and consciousness has been described by Eric Kandel as the "epic challenge" of the biological sciences.
In the industrial design field of human–computer interaction, a user interface (UI) is the space where interactions between humans and machines occur. The goal of this interaction is to allow effective operation and control of the machine from the human end, while the machine simultaneously feeds back information that aids the operators' decision-making process. Examples of this broad concept of user interfaces include the interactive aspects of computer operating systems, hand tools, heavy machinery operator controls and process controls. The design considerations applicable when creating user interfaces are related to, or involve such disciplines as, ergonomics and psychology.
Locked-in syndrome (LIS), also known as pseudocoma, is a condition in which a patient is aware but cannot move or communicate verbally due to complete paralysis of nearly all voluntary muscles in the body except for vertical eye movements and blinking. The individual is conscious and sufficiently intact cognitively to be able to communicate with eye movements. Electroencephalography results are normal in locked-in syndrome. Total locked-in syndrome, or completely locked-in state (CLIS), is a version of locked-in syndrome wherein the eyes are paralyzed as well. Fred Plum and Jerome B. Posner coined the term for this disorder in 1966.
A brain–computer interface (BCI), sometimes called a brain–machine interface (BMI) or smartbrain, is a direct communication pathway between the brain's electrical activity and an external device, most commonly a computer or robotic limb. BCIs are often directed at researching, mapping, assisting, augmenting, or repairing human cognitive or sensory-motor functions. They are often conceptualized as a human–machine interface that skips the intermediary component of the physical movement of body parts, although they also raise the possibility of the erasure of the discreteness of brain and machine. Implementations of BCIs range from non-invasive and partially invasive to invasive, based on how close electrodes get to brain tissue.
BrainGate is a brain implant system, currently under development and in clinical trials, designed to help those who have lost control of their limbs, or other bodily functions, such as patients with amyotrophic lateral sclerosis (ALS) or spinal cord injury. The BrainGate technology and related Cyberkinetics assets are now owned by the privately held Braingate, Co. The sensor, which is implanted into the brain, monitors brain activity in the patient and converts the intention of the user into computer commands.
Brain implants, often referred to as neural implants, are technological devices that connect directly to a biological subject's brain – usually placed on the surface of the brain, or attached to the brain's cortex. A common purpose of modern brain implants and the focus of much current research is establishing a biomedical prosthesis circumventing areas in the brain that have become dysfunctional after a stroke or other head injuries. This includes sensory substitution, e.g., in vision. Other brain implants are used in animal experiments simply to record brain activity for scientific reasons. Some brain implants involve creating interfaces between neural systems and computer chips. This work is part of a wider research field called brain–computer interfaces.
Martha Julia Farah is a cognitive neuroscience researcher at the University of Pennsylvania. She has worked on an unusually wide range of topics; the citation for her lifetime achievement award from the Association for Psychological Science states that “Her studies on the topics of mental imagery, face recognition, semantic memory, reading, attention, and executive functioning have become classics in the field.”
Neuroinformatics is the emergent field that combines informatics and neuroscience. It is concerned with neuroscience data and with information processing by artificial neural networks. There are three main directions in which neuroinformatics is applied.
Neuroergonomics is the application of neuroscience to ergonomics. Traditional ergonomic studies rely predominantly on psychological explanations to address human factors issues such as: work performance, operational safety, and workplace-related risks. Neuroergonomics, in contrast, addresses the biological substrates of ergonomic concerns, with an emphasis on the role of the human nervous system.
The sensorimotor mu rhythm, also known as the mu wave, comb or wicket rhythm, or arciform rhythm, is a synchronized pattern of electrical activity involving large numbers of neurons, probably of the pyramidal type, in the part of the brain that controls voluntary movement. These patterns, as measured by electroencephalography (EEG), magnetoencephalography (MEG), or electrocorticography (ECoG), repeat at a frequency of 7.5–12.5 Hz, and are most prominent when the body is physically at rest. Unlike the alpha wave, which occurs at a similar frequency over the resting visual cortex at the back of the scalp, the mu rhythm is found over the motor cortex, in a band approximately from ear to ear. People suppress mu rhythms when they perform motor actions or, with practice, when they visualize performing motor actions. This suppression is called desynchronization of the wave because EEG wave forms are caused by large numbers of neurons firing in synchrony. The mu rhythm is even suppressed when one observes another person performing a motor action or an abstract motion with biological characteristics. Researchers such as V. S. Ramachandran and colleagues have suggested that this is a sign that the mirror neuron system is involved in mu rhythm suppression, although others disagree.
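Mu-rhythm desynchronization can be quantified by comparing power in the 7.5–12.5 Hz band at rest versus during movement. The following is a minimal sketch using synthetic signals (the sampling rate and amplitudes are illustrative assumptions, not measured values):

```python
# Estimate mu-band (7.5-12.5 Hz) power via the FFT, on synthetic EEG.
import numpy as np

FS = 250  # sampling rate in Hz (a typical EEG value, assumed here)

def mu_band_power(signal, fs=FS, band=(7.5, 12.5)):
    """Total spectral power inside the mu band."""
    freqs = np.fft.rfftfreq(len(signal), d=1.0 / fs)
    spectrum = np.abs(np.fft.rfft(signal)) ** 2
    mask = (freqs >= band[0]) & (freqs <= band[1])
    return spectrum[mask].sum()

rng = np.random.default_rng(1)
t = np.arange(0, 2.0, 1.0 / FS)  # two seconds of data

# At rest: a strong 10 Hz mu oscillation plus background noise.
rest = np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)
# During movement: the mu amplitude is suppressed (desynchronization).
move = 0.2 * np.sin(2 * np.pi * 10 * t) + rng.normal(0, 0.5, t.size)

erd = 1.0 - mu_band_power(move) / mu_band_power(rest)
print(f"mu power drop: {erd:.0%}")  # a large drop signals desynchronization
```

This rest-versus-movement power ratio is the basic quantity behind motor-imagery BCIs, which detect intended movement from mu suppression alone.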
Fernanda Bertini Viégas is a Brazilian computer scientist and graphical designer, whose work focuses on the social, collaborative and artistic aspects of information visualization.
The Human–Computer Interaction Institute (HCII) is a department within the School of Computer Science at Carnegie Mellon University (CMU) in Pittsburgh, Pennsylvania. It is considered one of the leading centers of human–computer interaction research, and was named one of the top ten most innovative schools in information technology by Computer World in 2008. For the past three decades, the institute has been the predominant publishing force at leading HCI venues, most notably ACM CHI, where it regularly contributes more than 10% of the papers. Research at the institute aims to understand and create technology that harmonizes with and improves human capabilities by integrating aspects of computer science, design, social science, and learning science.
Elizabeth D. "Beth" Mynatt is the Dean of the Khoury College of Computer Sciences at Northeastern University. She is former executive director of the Institute for People and Technology, director of the GVU Center at Georgia Tech, and Regents' and Distinguished Professor in the School of Interactive Computing, all at the Georgia Institute of Technology.
Silent speech interface is a device that allows speech communication without using the sound made when people vocalize their speech sounds. As such it is a type of electronic lip reading. It works by the computer identifying the phonemes that an individual pronounces from nonauditory sources of information about their speech movements. These are then used to recreate the speech using speech synthesis.
The BCI Research Award is an annual award for innovative research in the field of brain-computer interfaces. It is organized by the BCI Award Foundation. The prize is $3000 for first, $2000 for second, and $1000 for third place. The prizes are provided by g.tec medical engineering, Cortec, Intheon, and IEEE Brain. Christoph Guger and Dean Krusienski are the chairmen of the Foundation.
Brain technology, or self-learning know-how systems, defines a technology that employs the latest findings in neuroscience. The term was first introduced by the Artificial Intelligence Laboratory in Zurich, Switzerland, in the context of the Roboy project. Brain technology can be employed in robots, know-how management systems, and any other application with self-learning capabilities. In particular, brain technology applications allow the visualization of the underlying learning architecture, often called "know-how maps".
Catherine Plaisant is a French/American Research Scientist Emerita at the University of Maryland, College Park and assistant director of research of the University of Maryland Human–Computer Interaction Lab.
Thomas J. Oxley is the chief executive officer of Synchron and a neurointerventionist at Mount Sinai Hospital in New York City. Trained as a vascular and interventional neurologist, he established the Vascular Bionics Laboratory at the University of Melbourne and currently co-heads this lab. Oxley is best known for founding Synchron, a company building next-generation brain-computer interface solutions, which recently announced the first clinical data on a novel stent-electrode (Stentrode) neural interface that is inserted through blood vessels. The company was initiated sometime after his cold call to DARPA for funding, and has received substantial funding from the U.S. Defense Advanced Research Projects Agency (DARPA) and the Australian government to research this minimally invasive neural interface technology.
Thorsten O. Zander is a German scientist who introduced the concept of passive brain-computer interface. He co-founded Zander Labs, a German-Dutch company in the field of passive brain computer interface (pBCI) and neuro-adaptive technology (NAT).
Niels Birbaumer is an Austrian academic who served as a professor at the University of Tübingen until 2019.