The company has just received Breakthrough Device Designation from the FDA for its "Blindsight" brain-computer interface technology, fast-tracking its progress toward testing in human subjects.
Thank you for highlighting Riz Virk. I was trying to remember the term I had heard for this "sci-fi feedback loop" phenomenon that seems so prevalent on the public-facing side of tech development. I had lovingly referred to it as the "Torment Nexus" effect, which is definitely a more pessimistic outlook, and a perspective on the harms created by innovation selection that I am not certain the former framing completely covers.
I also appreciate you pointing out the potential savior complex inherent in this sort of dynamic. I often feel like I have categorized and reduced a description of the factors driving an innovation and its communications, but then I see something like this and am reminded of a factor I've temporarily forgotten. For example, I had reacted with the ideas that expectations were perhaps too high, that even products that succeed in delivering on those expectations carry unexplored risks, that tech companies nowadays need to generate hype for a prospective product, and that much of the vision behind those products exists for the value that hype generates, but I missed the "savior" element. It makes conversations around these technologies much more important.
To that effect, it makes me wonder further about regulation for enterprises planning to "fix" things such as disabilities. Others have commented similarly, but I wonder, particularly in Elon Musk's case, what the outcomes would be when someone with mind-boggling money and influence decides that disabilities of any kind are flaws that ought to be corrected, or, worse still, that ordinary facets of people's identities are diseases, disabilities, or otherwise things that can and should be fixed. The line is blurred with many disabilities: some may feel a disability is enough a part of who they are that they would not "cure" it even if they had the chance, though obviously this comes down to the individual.
It makes it all the more concerning that, in this and other cases, the foundation of Musk's beliefs is a fundamental misunderstanding of the science he's working with. Someone with that degree of influence can spread and reinforce this scientific misinformation, and alter society in a large enough way that, when the time comes, the choice of whether or not to "cure" a disability may not come down to the informed consent of the individual, especially if the change is irreversible.
Speaking of which, this makes me feel that fundamental changes would have to occur in the way the technology industry operates, especially if we have the same people making decisions about medical technology as consumer technology. The idea of "orphaned technologies" in the eye or brain doesn't just make me think about the brain being too complex to deal with, but also about the ways the cycle of technological hype has created bubble after bubble. It's one thing to have a VR device that is no longer supported by the tech it interacts with, or that you can no longer repair when it malfunctions, or a software system that no longer receives patches; it's another thing entirely if that device is in your brain. A few years ago I wrote about pacemaker tech, which has advanced a lot even in the subsequent years, but as we arrive in a technological theater where these systems require more interactivity and more support from other technology, I wonder whether we will see ever-worse risks from carrying what will, ever more quickly, come to be known as a legacy system. Regardless of whether transistors continue to get faster, the public discussion around technological development certainly seems to.
I agree with your concern about 'fixing' people. It's absurd to let the blind see and the deaf hear, and even more absurd that we've accepted the broad use of prosthetics to fix those missing limbs. If we let this keep going, someday someone is going to 'cure' cancer and heart disease. Worse, what happens if they 'fix' dementia? That's certainly hubris against the will of nature.
I say this mostly in jest, but we've turned a corner over the last 10 years where this has become an oddly quasi-religious movement that treats medical advancement as a problem. One of the big places this is happening is in mental health and autism research. Certain online communities are calling it 'neurodivergent' and claiming special status, while many who actually suffer from bipolar disorder, schizophrenia, or autism would love nothing more than to be 'fixed.'
I've always loved the idea of Geordi's VISOR. It's an aspiration that a man could be 'fixed' and become a crucial member of a crew when he'd be unable to do so without that technology. I applaud Musk for investing money in the technology, hyperbole aside... but hyperbole is also the basis for all of the technology we enjoy today. Musk is already making headway with Neuralink in ways that people denigrated just 5 years ago.
The self-fulfilling prophecies and the desire to fill a void as a species. Don't we love to operate within the triangle of collective wisdom, collective stupidity, and self-sabotage 😅
Are we on the brink of a breakthrough, or blindly following our own techno-optimistic illusions?
"Within a responsible and ethical framework": what would such a framework actually look like?