Neuroethics
Paul Ford, PhD explores the evolving field of neuroethics, examining how ethical frameworks guide clinical decision‑making and responsible innovation.
Neuroethics
Podcast Transcript
Neuro Pathways Podcast Series
Release Date: March 1, 2026
Expiration Date: February 28, 2027
Estimated Time of Completion: 30 minutes
Neuroethics
Paul Ford, PhD
Description
Each podcast in the Neurological Institute series provides a brief review of management strategies related to the topic.
Learning Objectives
- Review up to date and clinically pertinent topics related to neurological disease
- Discuss advances in the field of neurological diseases
- Describe options for the treatment and care of various neurological diseases
Target Audience
Physicians and Advanced Practice providers in Family Practice, Internal Medicine & Subspecialties, Neurology, Nursing, Pediatrics, Psychology/Psychiatry, Radiology as well as Professors, Researchers, and Students.
ACCREDITATION
In support of improving patient care, Cleveland Clinic Center for Continuing Education is jointly accredited by the Accreditation Council for Continuing Medical Education (ACCME), the Accreditation Council for Pharmacy Education (ACPE), and the American Nurses Credentialing Center (ANCC), to provide continuing education for the healthcare team.
CREDIT DESIGNATION
- American Medical Association (AMA)
Cleveland Clinic Center for Continuing Education designates this enduring material for a maximum of 0.50 AMA PRA Category 1 Credits™. Physicians should claim only the credit commensurate with the extent of their participation in the activity.
Participants claiming CME credit from this activity may submit the credit hours to the American Osteopathic Association for Category 2 credit.
- American Nurses Credentialing Center (ANCC)
Cleveland Clinic Center for Continuing Education designates this enduring material for a maximum of 0.50 ANCC contact hours.
- Certificate of Participation
A certificate of participation will be provided to other health care professionals requesting credits in accordance with their professional boards and/or associations.
- American Board of Surgery (ABS)
Successful completion of this CME activity enables the learner to earn credit toward the CME requirements of the American Board of Surgery’s Continuous Certification program. It is the CME activity provider's responsibility to submit learner completion information to ACCME for the purpose of granting ABS credit.
Credit will be reported within 30 days of claiming credit.
Podcast Series Director
Andreas Alexopoulos, MD, MPH
Epilepsy Center
Additional Planner/Reviewer
Ari Newman, BSN
Faculty
Paul Ford, PhD
Center for Neurological Restoration
Host
Glen Stevens, DO, PhD
Cleveland Clinic Brain Tumor and Neuro-Oncology Center
Agenda
Neuroethics
Paul Ford, PhD
Disclosures
In accordance with the Standards for Integrity and Independence issued by the Accreditation Council for Continuing Medical Education (ACCME), The Cleveland Clinic Center for Continuing Education mitigates all relevant conflicts of interest to ensure CME activities are free of commercial bias.
The following faculty have indicated that they may have a relationship, which in the context of their presentation(s), could be perceived as a potential conflict of interest:
Paul Ford, PhD
Glen Stevens, DO, PhD
All other individuals have indicated no relationship which, in the context of their involvement, could be perceived as a potential conflict of interest.
CME Disclaimer
The information in this educational activity is provided for general medical education purposes only and is not meant to substitute for the independent medical judgment of a physician relative to diagnostic and treatment options of a specific patient's medical condition. The viewpoints expressed in this CME activity are those of the authors/faculty. They do not represent an endorsement by The Cleveland Clinic Foundation. In no event will The Cleveland Clinic Foundation be liable for any decision made or action taken in reliance upon the information provided through this CME activity.
HOW TO OBTAIN AMA PRA Category 1 Credits™, ANCC Contact Hours, OR CERTIFICATE OF PARTICIPATION:
Go to: Neuro Pathways Podcast March 1, 2026 to log into myCME and begin the activity evaluation and print your certificate. If you need assistance, contact the CME office at myCME@ccf.org.
Copyright ©2026 The Cleveland Clinic Foundation. All Rights Reserved.
Introduction: Neuro Pathways, a Cleveland Clinic podcast, exploring the latest research discoveries and clinical advances in the fields of neurology, neurosurgery, neurorehab, and psychiatry.
Glen Stevens, DO, PhD: Neuroethics is at the forefront of modern neuroscience, helping clinicians and researchers navigate the complex choices that arise as groundbreaking technologies reshape brain care.
In this episode, we'll unpack the core principles of neuroethics, explore the dilemmas sparked by innovations like brain-computer interfaces, and highlight how ethical guidance ensures these advances truly benefit patients and society.
I'm your host, Glen Stevens, Neurologist, Neuro-oncologist in Cleveland Clinic's Neurological Institute, and joining me for today's conversation is Dr. Paul Ford. Dr. Ford is the Director of the Neuroethics Program at the Neurological Institute at Cleveland Clinic. Paul, welcome to Neuro Pathways.
Paul Ford, PhD: Dr. Stevens, it's always a delight to talk with you. Over the years, we've had a chance to discuss many kinds of challenges, and I look forward to today.
Glen Stevens, DO, PhD: Very good. Hopefully it won't be too challenging. As an introduction, why don't you tell our listeners a little bit about yourself, your background, how you came to the Cleveland Clinic, and what you do here on a regular basis?
Paul Ford, PhD: Sure. I started my education doing a math and computer science undergraduate degree at Walla Walla University in Walla Walla, Washington. I was interested in virtual reality, and I found that the engineers developing technology for artificial intelligence, and artificial intelligence is not new, were unconcerned with the human ethics aspects. And when I went to the humanities people, they didn't care about the technology at the time. Engineering has come a long way. They're now very interested in these things.
But I went to get a PhD in philosophy to try to approach these technological issues, and found medical ethics, and came to the Cleveland Clinic 25 years ago or so, and restarted my interest in neuroethics when I got a call from the operating room, and a neurosurgeon who was putting in a deep brain stimulator said, "Dr. Ford, I need your help. The patient's telling me to stop the surgery four hours into the surgery. What do I do?" I said, "Well, you put him back to sleep like he's supposed to be." He said, "No, no, no, you don't understand. It's this brain device stimulator and they have to be awake to help participate in the surgery by doing things." And really from that time forward, I got involved in all things neurologic at the Cleveland Clinic, psychiatry, neurology, neurosurgery, to help them with some of the really tough challenges in the neurosciences.
Glen Stevens, DO, PhD: It's a fascinating story. I don't think you had shared that with me before. I wrote a little note on the top of my sheet under neuroethics, and I sort of put "watchdog and conscience," which is kind of what it is, right, in some ways.
Paul Ford, PhD: Often it is, and it's not a conscience in the kind of policing conscience. But oftentimes the best consults are when a physician calls me and says, "Let me just run this by you, because I want to make sure I'm doing the justified thing. Is it ethically permissible, really, to offer this, even?" And I help them think it through. Or a patient who says, "The doctor is offering me surgery, or I can do medication. I'm having a tough time deciding between them. How do other people think through these things?"
Glen Stevens, DO, PhD: Well, your introduction sounded like a tough one, so everything gets a little easier after that. I was doing a little reading and I read this quote by William Safire in 2002, and he said, in terms of what is neuroethics, he said, "It's the examination of what is right and wrong, good and bad about the treatment of, perfection of, or unwelcome invasion of, and worrisome manipulation of, the human brain." I mean, that's a mouthful.
Paul Ford, PhD: It is a mouthful, and Safire has an incredibly strong influence on how neuroethics was developed and created, connected with the Dana Foundation, and he really gets at the heart of much of what is interesting in neuroethics, really, that I've come to think of the brain differently now that I've talked to a lot of neuroscientists. I think neuroscientists talk about circuits of the brain that are involved in functions, though not one particular part of the brain. And when you have circuits of brains, you have them overlapping and intersecting, and so I think one of the most pressing things that is underneath the quote you gave is that if we control one pathway or alter one pathway, we're probably going to influence other circuits and pathways.
Glen Stevens, DO, PhD: And then there was the International Neuroethics Society, which in '06 described neuroethics as looking at the social, legal, ethical, and policy implications of advances in neuroscience. So really, we're going to be doing things, and we need to look at all the ways they can affect people. So, how do you specifically look at neuroethics?
Paul Ford, PhD: There are all of those components, and I think that we look at them in each instance a little differently. So, you think about the clinical care people receive and the choices they have. We think about the values. I don't talk about principles as much as things that are important to people, because an ethical dilemma is always when you have to sacrifice something that's important in order to preserve something else.
I just gave a talk to the local community on brain-computer interfaces and neurostimulation, and one of the examples I gave is in deep brain stimulation: with a stimulator that is implanted deep into the basal ganglia in the brain, we want to make it as convenient as possible for patients to adjust the settings, or have the settings adjusted. So, there are some systems that will allow adjustment at a distance, a remote adjustment. Convenience is important, privacy is important, and safety is important. Well, if I can control your stimulator at a distance, then perhaps it can be hacked and be controlled by somebody else at a distance, right? So, by having something able to be controlled at a distance, you gain power, but you put yourself at risk. You try to mitigate those risks, and that's part of what neuroethics can help with: to say, "Now that we want to do something new, what are the off-target effects that may happen, either in the body, or to some of the things that are important to us?"
Glen Stevens, DO, PhD: I guess it seems an obvious question, but where do you see neuroethics adding the most value today?
Paul Ford, PhD: I think that it is probably adding the most value today in the development of new technologies that may allow people to change themselves in new ways, to interact with the world in new ways. You think about the brain-computer interfaces that are being implanted experimentally right now by a variety of different people, companies and institutions and researchers of all types, that are intended to be brain chips that control the outer world, or communicate with the outer world. There are some really interesting things being done with some implants where thoughts are being converted into speech, and the speech is unlike a letter board where you do a single letter at a time. You actually get sentences that are about as fast as some people type, right?
Now, it's not 100 words a minute. It's in the 20, 30, 40, 50 words a minute, which sounds fast, but if you're talking to one another, it's not so fast. But AI and these things have allowed us to have unprecedented applications. Now, I think one nice thing is that people are starting to anticipate some of the downfalls. If the device is reading off my language, there's at least one study where participants chose a password they have to think in order for the computer to start reading it off, to keep privacy internally, right?
Glen Stevens, DO, PhD: You know, you really then start to wonder, and we won't really get into this, but there's the whole aspect of consent, because it's hard to know what the future might be with something you might implant in somebody, right? You implant a deep brain stimulator, and the applications five years from now may be different than now, right, or how it's being utilized.
Paul Ford, PhD: Absolutely, and I think another broad theme that's important is that we need to make sure that we don't unduly bias ourselves against certain disease processes. What do I mean by that? Particularly for informed consent, you think of somebody who has had a small stroke that has left them with a challenge in the way they speak, dysarthria, or maybe a word-finding problem. On the surface, because they can't communicate quickly, the average person may assume they can't give good informed consent. And of course, just because I'm having trouble finding a word doesn't mean that I don't have it there; it's just going to take a little longer. So, we need to make sure that we don't unduly keep people from either research or innovation or clinical practice because of some of our bias.
The other real bias that I have challenged researchers on when I review grants is things like excluding, as a category, anyone who has a mental health diagnosis. Well, there are lots of people with mental health diagnoses who have full capacity. Now, there may be a few things, psychosis or hallucinations or others, that you'd want to exclude, but you have to be very careful even in those to say, "Does it really affect either the research or the clinical care, or is it just my own bias?" And having a person external to the work raise these questions, to say where our biases are in clinical practice or in research, can be very valuable.
Glen Stevens, DO, PhD: Yeah. It sort of goes back to the first thing I mentioned where I put the word "watchdog" down.
Paul Ford, PhD: Yeah, and rather than watchdog, I think of it as another kind of consultant, somebody who can provide insight. And my colleagues are well-meaning and sometimes they just don't have the perspective or haven't paid attention to one aspect, and once I raise it, they immediately say, "Wow. We need to change this aspect and make it better."
Glen Stevens, DO, PhD: So, you've touched on this a little bit, but how should neuroethics guide the responsible implementation and evaluation of new treatments, so maybe even beyond what we're doing currently?
Paul Ford, PhD: Yeah. The BRAIN Initiative at NIH convened a lot of neuroethics research over the last number of years, and it came up with some guiding principles that have some use. There are eight of them. Safety is the very first thing. Make sure that these things we're doing are safe, and safety includes those kinds of off-target effects that we need to be looking for. If we're looking at movement disorders, then we also need to make sure that it's not affecting depression or obsessions, or other kinds of things. We need to pay attention to these consent issues, capacity, autonomy; how are these technologies altering that? And what's the whole picture of the person, not just the momentary expression? We need to avoid malign uses, right? Realize that there are going to be dual uses for everything, and we need to be careful how we safeguard ourselves in that way.
Privacy is always a concern with neural data; that's another principle. With neural data particularly, if you have patients who are getting devices or other kinds of recordings where the neural data is being stored, you want to make sure you advise them about what's going to be done with that stored data. You think about EEG. If there are massive amounts of EEG data being recorded, there are things we know, like seizures and other kinds of patterns, but I have a strong suspicion that we will be able to tell more about those patterns later, kind of like genetics. We do a whole genome and we don't know what the genes mean, by and large, but little by little we're finding out that some of them have significance. Well, we have massive amounts of EEG data and other neuroimaging data that we may be able to reanalyze and find out things that we didn't know before. So, we need to advise people that neural data may have more in it than what you think right now, so you need to know what's going to happen to it.
Glen Stevens, DO, PhD: Yeah. I guess we can take a little sidebar here. I was reading very recently about what was going on in Abu Dhabi, and in Abu Dhabi, they were doing the whole genome sequencing of the Emirate population, over 700,000 individuals. And of course, their interest is looking for disease that will be coming, what we can prevent, and those types of things, but it raises all the privacy concerns that you just mentioned, and that is, what really is going to be the use of my genome? How is it going to be used? How isn't it going to be used? Do I own it? Can somebody sell it? Can't they sell it? I don't know the answers to these things.
Paul Ford, PhD: Those are absolutely the important questions, and if something new is found, what obligation does the researcher have, and what right does the research participant have, to that information? We struggled with exactly these questions as we developed the Cleveland Clinic Brain Study, a longitudinal study for people, by and large, who are 50 years and older and don't have a neurological diagnosis, or, if you have a relative with multiple sclerosis, 24 or older. The idea is to follow these individuals for a long time, like Framingham did for heart, and see which people develop a neurological disease.
But we're going to scan and we're going to collect data on a lot of healthy people, many of whom will have incidental findings. You do a brain scan on someone who seems healthy, and you may find something. That something may be obviously clinically actionable, but it may also be just a normal variation, or it may be something that should be watched, right? So, how do we discharge our duty to make sure we warn people, but don't worry them unnecessarily or create a whole bunch of false positives? But to your point, we're also going to be collecting all this data, and if we find there is a marker that we didn't expect that is predictive of developing Alzheimer's, a modifiable one, what's our obligation to then go back and notify these individuals and provide them guidance on the next steps? And there is one.
And so, we've thought through these things in the Cleveland Clinic Brain Study exactly as they should be thought through in the genetic ones. I think the genetic pool in Iceland is an example where they did anticipate a lot of these kinds of things. So, in neurotechnology, we need to learn from genetics and hold ourselves to the same level, from a neuroethics viewpoint.
Glen Stevens, DO, PhD: Yeah. I was reading some interesting sidebars on that, where they were talking about how, in populations that are very insular, the genetic expression is not as broad, and you have individuals with closer genes, and you could potentially use some of the coding to determine whether your genetic code is too close to somebody else's, and the potential problems that could create, so it's interesting, right? But then, as you say, what do you do with that data? And do you have a priority situation where we're going to make this decision if we find an abnormality? I've certainly seen some patients from the Brain Study in whom tumors have shown up, and you clearly can't just leave a tumor to sit there. Someone needs to address it.
Paul Ford, PhD: Well-
Glen Stevens, DO, PhD: Or do they?
Paul Ford, PhD: You're the expert, but how many of us have meningiomas sitting in our heads that we will never know about? Isn't there a rate of that-
Glen Stevens, DO, PhD: Yeah, 20%.
Paul Ford, PhD: 20%. So, if you see a meningioma that is small, and in certain instances, you're not going to try to do a surgery on it. Am I right?
Glen Stevens, DO, PhD: Correct.
Paul Ford, PhD: So, those folks would have been in some ways better off not having known.
Glen Stevens, DO, PhD: Yeah, and I think there's no question about that, right, that I will hear people sometimes say they have an accident or something, they got a scan done and it showed something, it's an incidental, and they'll often say to me, "I wish I never knew I had this."
Paul Ford, PhD: Yeah, and so we don't want to avoid those situations, but if there's a big ugly glioblastoma multiforme or something that you should debulk, or that will have a meaningful ... or another tumor that ... You can name a tumor that's more easily addressed better than I can, but it would be unconscionable not to tell them, right?
Glen Stevens, DO, PhD: Right.
Paul Ford, PhD: It would be against our conscience to say, "Hey, there is something that we can go in, and your life will be much better."
Glen Stevens, DO, PhD: Yeah. I guess it's difficult to anticipate every scenario when you do these large population studies. I guess there needs to be some parameter at which point you pull the trigger and tell somebody something. I have patients that would come to me over the years and just want to do whole body scans. And I've clearly seen people that have had procedures done because of something that, in retrospect, turned out to be incidental, although maybe you didn't know it at the time, but things went wrong, and then they developed a complication from something that maybe didn't need to be addressed. It was all done with the right intentions, but ... there are a lot of ramifications that become very difficult, so we're glad that you're around to deal with these things.
Paul Ford, PhD: With all of these things, too often ethicists don't think about the cost of a false positive to people, the emotional or real problems that arise from diagnostic tests that carry a risk. I ran across a clinician who said they're going to give every person an Alzheimer's blood test.
Glen Stevens, DO, PhD: Well, I'm glad that you mentioned that, because I'm going to do a podcast on this with someone else later, but I was going to ask you that exact question, so thoughts about that. Tell me.
Paul Ford, PhD: In general, I think testing belongs in a clinical milieu, right? You need to have some reason to test, because with some of these tests, we don't know; some people may walk around for a very long time with just a simple high level of amyloid or tau and never develop a disease.
Glen Stevens, DO, PhD: And that's really what the answer is, right? Per guidelines, for these tests you need to be symptomatic. But that symptomatic line can be really gray sometimes, or people may not be honest with you because they want the test.
Paul Ford, PhD: Yeah.
Glen Stevens, DO, PhD: And it may be easier for them to get the test out of a primary care physician than out of a specialist, and then the implications of it could be massive, right? If your testing suggests you have a high likelihood of developing a problem down the line, how does that affect you for the rest of your life?
Paul Ford, PhD: And as you can hear in my accent here, I still have my Canadian accent, so whenever I say out or about ... The Canadian system has a different approach when you start thinking about rationing, but we still have a responsibility as individuals to say that patients can't demand anything from us, particularly when continuing to do it might have a negative impact on the system and a negative impact on that population of patients, right? Having a bunch of well people undergo massive amounts of testing just because they demand it, with no clinical indication, can be both bad for the patient and bad for an overburdened system.
Glen Stevens, DO, PhD: Well, as you know, I'm Canadian as well, so we've got two Canadians talking about a lot of testing. It's just not in our DNA.
Paul Ford, PhD: Yes, absolutely.
Glen Stevens, DO, PhD: We use Epic as our medical record here. I remember when it was implemented, initially everything would come to you first; you would generally see the patient back and then discuss results with them. Then some things started to be released automatically, right? You have some sort of procedure done and the results show up. You haven't talked to your physician about them. They just show up. There's still some variance in exactly what is released, but if history is teaching us something, more and more will just be released, and then you'll have patients receiving records before they've talked to a physician, which I always have an issue with, because they can't really interpret the information that's there. But I'm curious about your thoughts on that, on releasing information to patients outside of their physician discussion.
Paul Ford, PhD: I have a little bit of a libertarian bent on this issue, so having said that ... the devil's in the details. I want to give people access to good information as soon as you can. But on the other hand, I think there are things that we do purely for risk management in reports, where we feel obligated to denote something that is of no clinical significance, and given no clinical signs, it means nothing, and yet we put in a troubling "We need to rule out this terrible thing," and there's really no reason. So, how we deliver it is different from whether we should. And if, when the result comes out, I get a call from my clinician, great. But in the past, in those old ways, sometimes two or three weeks later I'd finally get a call from my clinician ... Some of the simple things should be able to be released very quickly. You know, "I had a blood test for the iron level in my blood." That's not so hard to understand, right? But I did need a follow-up from the doctor telling me what the next step was.
Cleveland Clinic, I think, does a really nice job. People complain about Dr. Google, but oftentimes when I google, I get the Cleveland Clinic pages up there, and they do a good job for an informed consumer; that's where my libertarian bent comes in, an informed consumer. When I pulled up my iron tests, I pulled up the Cleveland Clinic pages and got reliable information. We just need to make sure that people get to reliable information, and that these reports are written in a way that makes it easy for me to look things up on the Cleveland Clinic pages, or sometimes I use the British NHS. I think they have a very nice system.
Glen Stevens, DO, PhD: I remember a story from when records first started being released. There was a patient admitted in the hospital at the Clinic who was on their computer looking at the results, and an imaging study came up and they were reading it. They were a little bit confused about it, and they saw who the person was that read the imaging study, so they got on the phone, called the switchboard operator, and asked them to page the person that read the report. So, the person comes on, assuming it's a physician calling them, and hears, "Hey, I'm Bob, and you read my MRI." And they're like, "What? What? What?"
But I always tell trainees that patients will read what's there, so you better make sure you're not only looking at everything imaging-wise, hard copy, but you better read the reports and all the details, because the patients will, and they'll ask you questions. And every day, every day that I'm in clinic, patients will ask me something about an MRI report that has nothing to do with why I'm seeing them; they just want clarification on something. They've googled it with their good computer doctor, but they want some other information, so they definitely look. And I agree with that as well, right? I think they should have access. It's just a question of what information should be given by the physician, right?
Paul Ford, PhD: And I think neuroradiology is a place where AI is and can continue to be a tremendous tool for them so that they can focus their expertise on the most difficult scans, and reinterpret and check scans that have been flagged. People are afraid of AI, and it should be used properly, but it also can be an extension of tools that will let doctors do what they're best at doing.
Glen Stevens, DO, PhD: Mm-hmm. Let's just pivot and talk about research for a minute, and your involvement with research projects. I imagine this would be very important now, but talk about that a little bit.
Paul Ford, PhD: Great. So there are really two kinds of things related to research. One, I do some primary research where I do interviews with patients or clinicians about some aspect of the research. For instance, for the Brain Study, again, we're just about to publish a paper on returning results, like we talked about. So, what we did is we asked people, "Which kinds of results would you like, under what conditions-"
Glen Stevens, DO, PhD: So perfect, right, from what we were just discussing. Timely.
Paul Ford, PhD: And rather than just guessing what people want, what's the breadth of responses we might get, and how would people most want it? So that's one.
I do some research myself, but I also help researchers design trials that fulfill our duties in the best way. You think about a trial that has a placebo-blinded phase. How do we make sure that there's the best understanding, or that we really do it in the right way? It used to be that for brain surgeries they would do a placebo burr hole or a partial burr hole, or put a non-functioning device like a deep brain stimulator in half the population. Those days are long gone; that's not thought to be ethically supportable now. Usually everyone gets implanted with a working device, and then some are off and some are on, and then they usually cross over the other way. But it's an example where we can think of alternate ways that respect the individual and don't put them at undue risk for no potential benefit.
Glen Stevens, DO, PhD: And an institution that doesn't have you, a dedicated neuroethicist, what do they do?
Paul Ford, PhD: A lot of times they look to reflective senior people in their field. Just because you have experience, I've learned, doesn't make you wise, but oftentimes if you have experience and you reflect, these are resources that can help. And rely on and ask people in fields adjacent to yours. These multidisciplinary meetings, where you have a neurologist, a neurosurgeon, a neuropsychologist, a psychiatrist, a psychologist, a social worker; often convening a group of people like that to say, "Let's think through this carefully. What are the values that are at stake, the things that might be lost, and who might lose them?"
Glen Stevens, DO, PhD: So, what single principle or checklist items should every neurologist keep in mind when considering novel neurotechnologies?
Paul Ford, PhD: Yeah. How do we get this person to the maximal human thriving that they can reach? And human thriving is relative to their goals. So, in epilepsy, for instance, it's not really about eliminating the seizures, right? It's about a functional goal. How can we get them to their functional goal? How can we get them so they're not falling and breaking their bones? How do we get them so that they don't have to be watched while they're in the shower? These are goals that we need to keep focused on, and doing it just for the reduction of seizures is shorthand for saying, "I want to get you to your goal." So, will this get them to the goal, or is a lower-tech solution more likely to get them to their goals, or maybe a higher-tech one? But human flourishing is really the goal, and we can lose sight of that if we focus too much on the physical symptoms rather than what patients want to do.
Glen Stevens, DO, PhD: Research coming down the line that you're excited about?
Paul Ford, PhD: Yeah. There are a few things that I think are exciting. I'm working with the biomedical engineers, and there's one engineer, Paul Marasco, who has a lab where they can convince the non-cognitive part of your body that you have ownership, that that prosthetic is you, your hand, kind of like the rubber hand experiments you see online, and people have a sense of ownership. And when you take it away, they can't explain it, but they feel like they have been re-traumatized, because they emotionally felt ownership of it. Dr. Machado and Dr. Baker are doing this great brain stimulation of the cerebellum. Usually stimulation is done in the basal ganglia, so doing it in the cerebellum for stroke rehab or for certain kinds of tremor is a whole new ... I'm learning all about the cerebellum now. I didn't really think it was that important.
Glen Stevens, DO, PhD: Much more complicated than we ever thought.
Paul Ford, PhD: I had no idea. So, I'm very excited about the cerebellar, the ataxias, those kind of things.
Glen Stevens, DO, PhD: Good. And final takeaways for our audience?
Paul Ford, PhD: Reflect carefully on what biases you might have when you are introducing a patient to a therapy, and just have an open mind of different ways people have of thriving.
Glen Stevens, DO, PhD: Well, Paul, I'd like to thank you for joining me today. It's been a pleasure as always, and we appreciate what you do here.
Paul Ford, PhD: Thank you, Dr. Stevens. Appreciate it.
Closing: This concludes this episode of Neuro Pathways. You can find additional podcast episodes on our website, clevelandclinic.org/neuropodcast, or subscribe to the podcast on iTunes, Google Play, Spotify, or wherever you get your podcasts. And don't forget, you can access real-time updates from experts in Cleveland Clinic's Neurological Institute on our Consult QD website. That's @CleClinicMD, all one word. And thank you for listening.
Neuro Pathways
A Cleveland Clinic podcast for medical professionals exploring the latest research discoveries and clinical advances in the fields of neurology, neurosurgery, neurorehab and psychiatry. Learn how the landscape for treating conditions of the brain, spine and nervous system is changing from experts in Cleveland Clinic's Neurological Institute.
These activities have been approved for AMA PRA Category 1 Credits™ and ANCC contact hours.