Frankenstein: the monster was not the most evil creature; its creator was the most evil and dangerous devil. Plus: Elon wants to put a chip in your brain, and why you shouldn’t trust AI search engines

February 15, 2023

Hello, Insiders! This week, I’m exploring the new realities of artificial intelligence. Up today is a topic that strikes close to home: AI and journalism. From the printing press to the whole dang internet, journalism has benefited and suffered mightily thanks to technology. AI is no exception. Some will see it as a job-stealing threat. ChatGPT, for example, can summarize and repackage information but doesn’t give credit. That’s not good.

More reasons to be wary: Sports Illustrated’s publisher tried using AI instead of human writers. The first article had some significant errors. CNET had to issue a string of corrections on an AI article. And some AI tools have the bad habit of stealing other people’s work.

As the editor of a large news organization, I plan to have lots of conversations with the team about AI’s benefits and drawbacks. I’d love to include you in that. So please send your thoughts: insidertoday@insider.com. Stay tuned for more on this.

— Nicholas Carlson

In today’s edition: Brain chips could change you, Salesforce applies pressure on some employees, and the Ohio train derailment released potentially toxic chemicals. If this was forwarded to you, sign up here.
THE LATEST
Russian tanks, Tesla crash, & Amazon leak

Russia’s tanks may face “disadvantageous conditions” in a battle with US-made Abrams tanks, a defense expert at a Russian think tank says. Read why.

The wife of the man accused of intentionally driving his family off a 250-foot cliff doesn’t want him prosecuted. More here.

A leaked recording shows Amazon CEO Andy Jassy delivering a brutally honest rallying cry for employees to “redefine” the company. Our scoop here.
THE BIG STORY
Brain-chip dangers

Arif Qazi / Insider 
Elon Musk wants to put a computer chip in your brain. Neuralink, Musk’s neurotech startup, has been working toward this goal since its founding in 2016, and it recently announced plans to start human trials in the next several months. The idea is to translate brain signals into digital outputs — like being able to send a text message or type with just a thought. And Musk wants it to be mainstream. A “Fitbit in your skull,” as he once put it. But there are real dangers and unique ethical pitfalls to neural implants. Compelling evidence suggests the devices can cause cognitive changes beyond the scope of their intended applications — and there’s a risk of growing overly dependent on them as well. For some people, the devices have even changed their personality.
https://www.businessinsider.com/brain-chips-elon-musk-neuralink-change-personality-behavior-computer-tech-2023-2

How brain chips can change you

Studies show that Elon Musk’s new tech can bend your mind in strange and troubling ways

Research has found that brain chips can warp your sense of self. Arif Qazi / Insider

Evan Malmgren

Feb 15, 2023, 2:02 AM

Elon Musk wants to put a computer chip in your brain. Well, maybe not in your brain, but in the brain of some human somewhere. 

Musk’s neurotech startup, Neuralink, has been working toward implanting its skull-embedded brain chip in a human since it was founded in 2016. After years of testing on animal subjects, Musk announced in December that the company planned to initiate human trials within six months (though this wasn’t the first time he’d said these trials were on the horizon).

Neuralink has spent over half a decade figuring out how to translate brain signals into digital outputs — imagine being able to move a cursor, send a text message, or type in a word processor with just a thought. While the initial focus is on medical use cases, such as helping paralyzed people communicate, Musk has aspired to take Neuralink’s chips mainstream — to, as he’s said, put a “Fitbit in your skull.” 

Musk’s company is far from the only group working on brain-computer interfaces, or systems to facilitate direct communication between human brains and external computers. Other researchers have been looking into using BCIs to restore lost senses and control prosthetic limbs, among other applications. While these technologies are still in their infancy, they’ve been around long enough for researchers to increasingly get a sense of how neural implants interact with our minds. As Anna Wexler, an assistant professor of philosophy in the Department of Medical Ethics and Health Policy at the University of Pennsylvania, put it: “Of course it causes changes. The question is what kinds of changes does it cause, and how much do those changes matter?”

Intervening in the delicate operation of a human brain is a sticky business, and the effects are not always desirable or intended. People using BCIs can feel a profound sense of dependency on the devices, or as though their sense of self has been altered. Before we reach the point where people are lining up to get a smartphone implanted in their brain, it’s important to grapple with their dangers and unique ethical pitfalls.

From science fiction to a billion-dollar industry

In the 1974 film “The Terminal Man,” a man gets an invasive brain implant to help with his seizures. While the operation initially seems to be a success, things go awry when sustained exposure to the chip sends him on a psychotic rampage. As is typically the case in sci-fi movies, a scientist warns of the disaster early in the story by comparing the implants to the lobotomies of the 1940s and 1950s. “They created an unknown number of human vegetables,” he says. “Those operations were carried out by physicians who were too eager to act.”

While humans have yet to produce flying cars, mount crewed missions to Mars, or engineer convincing replicants, BCIs may be the most significant technology to not only catch up to but in some cases surpass their early sci-fi depictions. More than 200,000 people around the world already use some kind of BCI, mostly for medical reasons. Perhaps the best-known use case is cochlear implants, which enable deaf people to, in a sense, hear. Another preeminent use case is epileptic-seizure prevention: Existing devices can monitor brain-signal activity to predict seizures and warn the person so that they can avoid certain activities or take preventive medication. Some researchers have proposed systems that would not only detect but preempt seizures with electrical stimulation, almost exactly the mechanism depicted in “The Terminal Man.” Implants for people with Parkinson’s disease, depression, OCD, and epilepsy have been in human trials for years.
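The seizure-warning mechanism described above, monitoring brain-signal activity and warning when it crosses a danger pattern, can be caricatured as a rolling-power threshold. The sample values, window size, and threshold here are all assumptions for illustration; clinical devices use validated prediction models rather than a bare threshold:

```python
# Hedged sketch of a seizure-warning monitor: watch a stream of
# brain-signal samples and flag moments when the short-term mean
# amplitude exceeds a threshold. Values are invented for illustration.

WINDOW = 4        # number of recent samples to average (assumed)
THRESHOLD = 2.0   # mean absolute amplitude that triggers a warning (assumed)

def seizure_warnings(samples):
    """Yield sample indices where the rolling mean amplitude crosses THRESHOLD."""
    for i in range(WINDOW, len(samples) + 1):
        window = samples[i - WINDOW:i]
        power = sum(abs(x) for x in window) / WINDOW
        if power > THRESHOLD:
            yield i - 1  # index of the sample that completed the window

# A quiet signal with a burst of high-amplitude activity in the middle.
signal = [0.2, 0.5, -0.3, 0.4, 3.1, 2.8, -3.5, 2.9, 0.3, 0.1]
print(list(seizure_warnings(signal)))  # [6, 7, 8]
```

A real device would act on such a flag by warning the wearer, or, in the proposed detect-and-preempt systems mentioned above, by delivering electrical stimulation.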

Recent improvements in artificial intelligence and neural-probing materials have made the devices less invasive and more scalable, which has naturally attracted a wave of private and military funding. Paradromics, Blackrock Neurotech, and Synchron are just a few venture-backed competitors working on devices for paralyzed people. Last November, a startup called Science unveiled a concept for a bioelectric interface to help treat blindness. And last September, Magnus Medical got approval from the Food and Drug Administration for a targeted brain-stimulation therapy for major depressive disorder. 

Neuralink, meanwhile, has been dogged by a history of overhyped promises — failing to deliver on timelines, for example, and reportedly triggering a federal investigation into claims of animal-welfare violations. The market-intelligence firm Grand View Research valued the global brain-implants market at $4.9 billion in 2021, and other firms have projected that the figure could double by 2030. 

For now, BCIs are constrained to the medical domain, but a vast array of nonmedical uses has been proposed for the technology. Research published in 2018 described participants using BCIs to interface with numerous apps on an Android tablet, including typing, messaging, and searching the web just by imagining relevant movements. More speculative applications include playing video games, manipulating virtual reality, or even receiving data inputs like text messages or videos directly, bypassing the need for a monitor. These may sound like science fiction, but the reality is that we’ve reached a point where the cultural and ethical barriers to this kind of tech have begun to outpace technical ones. And despite the fictional nature of “The Terminal Man,” its disastrous turn raises real questions about unintentional effects of BCIs.

A changed mind

There have been no confirmed cases of “Terminal Man”-style violent rampages caused by BCIs, but compelling evidence suggests the devices can cause cognitive changes beyond the scope of their intended applications.

Some of these changes have been positive; after all, BCIs are intended to change certain things about their users. Wexler, the University of Pennsylvania philosophy professor, interviewed people with Parkinson’s who were undergoing deep-brain stimulation, a surgical treatment that involves implanting thin metal wires that send electrical pulses to the brain to help abate motor symptoms, and found that many had lost their sense of self before undergoing treatment. “Many felt that the disease had robbed them, in some ways, of who they were,” she told me. “It really impacts your identity, your sense of self, if you can’t do the things that you think of yourself as being able to do.” In these instances, BCIs helped the people feel like they were returning to themselves by helping treat the underlying disease.

A woman undergoes deep-brain-stimulation surgery — a type of brain-computer interface that can help people with Parkinson’s disease.

Eran Klein and Sara Goering, researchers at the University of Washington, have similarly noticed positive changes in personality and self-perception among people using BCIs. In a 2016 paper on attitudes and ethical considerations surrounding DBS, they reported that study participants often felt that the treatment helped them recapture an “authentic” self that had been worn away by depression or obsessive-compulsive disorder. “I’ve begun to wonder what’s me, and what’s the depression, and what’s the stimulator,” one patient said. In a talk in late 2022 on similar research, the neuropsychologist Cynthia Kubu described a heightened sense of control and autonomy among patients she’d interviewed.

But not all the changes that researchers have found are beneficial. In interviews with people who’ve had BCIs, Frederic Gilbert, a philosophy professor at the University of Tasmania specializing in applied neuroethics, has noticed some odd effects. “The notions of personality, identity, agency, authenticity, autonomy, and self — these are very compact, obscure, and opaque dimensions,” Gilbert told me. “Nobody really agrees on what they mean, but we have cases where it’s clear that BCIs have induced changes in personality or expression of sexuality.”

Across numerous interview studies, Gilbert has noticed patients report feelings of not recognizing themselves, or what is typically referred to as “estrangement” in the research. “They know that they are themselves, but it’s not like it was prior to the implantation,” he said. Some expressed feelings of having new capacities unrelated to their implants, such as a woman in her late 50s who hurt herself while attempting to lift a pool table she’d thought she could move on her own. While some estrangement could be beneficial — if it results in a healthy sense of self-esteem, for example — negative instances, known as deteriorative estrangement, can be quite vexing. “It has led to extreme cases where there has been attempted suicide,” Gilbert said.

For people using BCIs to help with a significant medical limitation, it makes sense that the treatment would have a positive psychological effect. But when it comes to considering brain chips for popular use, there’s much more concern about downsides.

A smartphone in your brain

As the technology improves, we get closer to Musk’s “Fitbit in your skull” vision. But there’s reason to be cautious. After all, if it’s easy to get addicted to your phone, just think how much more addictive it could be if it were wired directly into your brain.

Gilbert told me about one patient he had interviewed who developed a kind of decision paralysis, eventually feeling as if they couldn’t go out or decide what to eat without first consulting the device that showed what was going on in their brain. “There is nothing wrong with having a device that is completing a decision,” Gilbert said, “but at the end, the device was kind of supplanting the person in the decision, kicking them out of the loop.”

Sometimes a patient can come to rely so much on their device that they feel like they can’t function without it. Gilbert has encountered many study participants who have fallen into depression upon losing support for their devices and having them removed, often simply because a given trial expired or ran out of funding. “You grow gradually into it and get used to it,” an anonymous study participant who’d received a device to detect signs of epileptic activity said in an interview. “It became me.”

This kind of dependence is further complicated by the fact that BCIs are difficult to support financially and maintain, often requiring invasive brain surgery to remove and reimplant them. Since BCIs are largely still in the trial phase, there’s a lack of universal standards or stable financial support, and many devices are at risk of abruptly losing funding. Early adopters could have their sense of self disrupted by supply-chain issues, hardware updates, or a company’s bankruptcy.

There are also privacy concerns that come with a computer getting access to your brain waves. “If you get a device to help you move your prosthetic arm, for instance, that device will pick up other sources of noise that you may not want to be out of your brain,” Gilbert said. “There is a lot of background noise, and that background noise can be deciphered. That noise is necessarily converted, sitting somewhere on the cloud.” Someone could learn a lot by studying your brain waves, and if a hacker managed to access your data, they could read your mind, in a sense, by looking for specific expressions of brain-signal activity.

Since BCIs are still mainly constrained to the medical field, most early adopters are happy to make these kinds of trade-offs. “If someone has a disability that makes it so that they can’t communicate,” Wexler said, they’re “generally pretty happy if there’s a technology that then allows them to do so.” But, putting aside the idea that nonmedical BCIs would likely introduce a host of new problems, it’s less clear that the trade-offs would be worth it just to have a Fitbit in your head.

While we’re still a long way away from the cyborgian future of electronically interconnected minds prophesied by people like Elon Musk, the industry’s accelerating growth compounds the urgency of ethical considerations once constrained to science fiction. If a brain chip can change key parts of your personality, companies should not be rushing to put them in people’s heads. Wexler told me that while most people in the industry aren’t that open to using BCIs as a consumer product, they still think it’s likely to happen. If it does, she said, “the whole risk-benefit trade-off changes.”


Evan Malmgren is a writer who covers power and infrastructure and is currently working on a book about American off-gridders.

_________________________________________________________________

https://en.wikipedia.org/wiki/Frankenstein

Frankenstein; or, The Modern Prometheus (full text available at Wikisource)

Volume I, first edition
Author: Mary Shelley
Country: United Kingdom
Language: English
Genre: Gothic novel, literary fiction, horror fiction, science fiction
Set in: England, Ireland, Italy, France, Scotland, Switzerland, Russia, Germany; late 18th century
Published: 1 January 1818
Publisher: Lackington, Hughes, Harding, Mavor & Jones
Pages: 280
Dewey Decimal: 823.7
LC Class: PR5397 .F7
Preceded by: History of a Six Weeks’ Tour
Followed by: Valperga

The real-life evil-doers and greedy, wealthy humans like this Elon Musk are the creators: they want to control “the monsters,” and also to make you their monsters, controlled by “chipping.” As if the current AI and electronic-zombie controls were not good enough!

Well, who’s monitoring the monitors that are doing the monitoring?

Enemy of the State (1998), Regina King as Carla Dean: “Well, who’s gonna monitor the monitors of the monitors?”

The Warrior

I am Honored to be Your Friend: we "HONOR" WOMEN & MOMS, and MILITARY Females with our NEW, EXCITING "G.i.J.i.M.O.M." Series: http://thesiborg.com/ http://familymediasite.com/ http://tdmcomics.com/

We are ®Reece ENTERPRISES/©REECENETRICS™/®FAMILY MEDIA COMPANY™/©TDM Comics International; a small but slowly, strategically growing group of companies creating comics and entertainment products and “brands” geared toward the world’s diverse peoples of many cultures and nations, to “spread the love of positive images for peoples of all colors, worldwide!”

Our comic books have different strategic designs, as our own special ways of supporting literacy, reading, the arts, and libraries of education.

Terry Reece, aka “the Warrior” Super Hero
Founder/Chairman/CEO
Writer/Copywriter/Creator of The Closet Cove and the L.A.Z.E.R.U.S. project, and the "G.i.J.i.M.O.M." Series Brand
warrior_75210@yahoo.com