
How to Upload Your Consciousness to a Computer

Hypothetical process of digitally emulating a brain

Mind uploading, also known as whole brain emulation (WBE), is the theoretical futuristic process of scanning the physical structure of a brain accurately enough to create an emulation of the mental state (including long-term memory and "self") and transferring or copying it to a computer in a digital form. The computer would then run a simulation of the brain's information processing, such that it would respond in essentially the same way as the original brain and experience having a sentient conscious mind.[1] [2] [3]

Substantial mainstream research in related areas is being conducted in animal brain mapping and simulation, development of faster supercomputers, virtual reality, brain–computer interfaces, connectomics, and information extraction from dynamically functioning brains.[4] According to supporters, many of the tools and ideas needed to achieve mind uploading already exist or are currently under active development; however, they admit that others are, as yet, very speculative, but say they are still in the realm of engineering possibility.

Mind uploading may potentially be accomplished by either of two methods: copy-and-upload, or copy-and-delete by gradual replacement of neurons (which can be considered a gradual destructive uploading) until the original organic brain no longer exists and a computer program emulating the brain takes control of the body. In the case of the former method, mind uploading would be achieved by scanning and mapping the salient features of a biological brain, and then by storing and copying that information state into a computer system or another computational device. The biological brain may not survive the copying process or may be deliberately destroyed during it in some variants of uploading. The emulated mind could exist within a virtual reality or simulated world, supported by an anatomic 3D body simulation model. Alternatively, the simulated mind could reside in a computer inside (or either connected to or remotely controlled by) a (not necessarily humanoid) robot or a biological or cybernetic body.[5]

Among some futurists and within part of the transhumanist movement, mind uploading is treated as an important proposed life extension technology. Some believe mind uploading is humanity's current best option for preserving the identity of the species, as opposed to cryonics. Another aim of mind uploading is to provide a permanent backup to our "mind-file", to enable interstellar space travel, and a means for human civilization to survive a global disaster by making a functional copy of a human society in a computing device. Whole-brain emulation is discussed by some futurists as a "logical endpoint"[5] of the topical computational neuroscience and neuroinformatics fields, both about brain simulation for medical research purposes. It is discussed in artificial intelligence research publications as an approach to strong AI (artificial general intelligence) and to at least weak superintelligence. Another approach is seed AI, which would not be based on existing brains. Computer-based intelligence such as an upload could think much faster than a biological human even if it were no more intelligent. A large-scale society of uploads might, according to futurists, give rise to a technological singularity, meaning a sudden time constant decrease in the exponential development of technology.[6] Mind uploading is a central conceptual feature of numerous science fiction novels, films, and games.

Overview

The established neuroscientific consensus is that the human mind is largely an emergent property of the information processing of its neuronal network.[7]

Neuroscientists have stated that important functions performed by the mind, such as learning, memory, and consciousness, are due to purely physical and electrochemical processes in the brain and are governed by applicable laws. For example, Christof Koch and Giulio Tononi wrote in IEEE Spectrum:

Consciousness is part of the natural world. It depends, we believe, only on mathematics and logic and on the imperfectly known laws of physics, chemistry, and biology; it does not arise from some magical or otherworldly quality.[8]

The concept of mind uploading is based on this mechanistic view of the mind, and denies the vitalist view of human life and consciousness.[9]

Eminent computer scientists and neuroscientists have predicted that advanced computers will be capable of thought and may even attain consciousness, including Koch and Tononi,[8] Douglas Hofstadter,[10] Jeff Hawkins,[10] Marvin Minsky,[11] Randal A. Koene, and Rodolfo Llinás.[12]

Many theorists have presented models of the brain and have established a range of estimates of the amount of computing power needed for partial and complete simulations.[5] [citation needed] Using these models, some have estimated that uploading may become possible within decades if trends such as Moore's law continue.[13]

Theoretical benefits and applications

"Immortality" or backup

In theory, if the information and processes of the mind can be disassociated from the biological body, they are no longer tied to the individual limits and lifespan of that body. Furthermore, information within a brain could be partly or wholly copied or transferred to one or more other substrates (including digital storage or another brain), thereby – from a purely mechanistic perspective – reducing or eliminating the "mortality risk" of such information. This general proposal was discussed in 1971 by biogerontologist George M. Martin of the University of Washington.[14]

Space exploration

An "uploaded astronaut" could be used instead of a "live" astronaut in human spaceflight, avoiding the perils of zero gravity, the vacuum of space, and cosmic radiation to the human body. It would allow for the use of smaller spacecraft, such as the proposed StarChip, and it would enable virtually unlimited interstellar travel distances.[15]

Relevant technologies and techniques

The focus of mind uploading, in the case of copy-and-transfer, is on data acquisition, rather than data maintenance, of the brain. A set of approaches known as loosely coupled off-loading (LCOL) may be used in the attempt to characterize and copy the mental contents of a brain.[16] The LCOL approach may take advantage of self-reports, life-logs and video recordings that can be analyzed by artificial intelligence. A bottom-up approach may instead focus on the specific resolution and morphology of neurons and the spike times of neurons (the times at which neurons produce action potential responses).

Computational complexity

Estimates of how much processing power is needed to emulate a human brain at various levels, along with the fastest and slowest supercomputers from TOP500 and a $1,000 PC. Note the logarithmic scale. The (exponential) trend line for the fastest supercomputer reflects a doubling every 14 months. Kurzweil believes that mind uploading will be possible at the level of neural simulation, while the Sandberg & Bostrom report is less certain about where consciousness arises.[17]

Advocates of mind uploading point to Moore's law to support the notion that the necessary computing power is expected to become available within a few decades. However, the actual computational requirements for running an uploaded human mind are very difficult to quantify, potentially rendering such an argument specious.

Regardless of the techniques used to capture or recreate the function of a human mind, the processing demands are likely to be immense, due to the large number of neurons in the human brain along with the considerable complexity of each neuron.

In 2004, Henry Markram, lead researcher of the Blue Brain Project, stated that "it is not [their] goal to build an intelligent neural network", based solely on the computational demands such a project would have.[18]

It will be very difficult because, in the brain, every molecule is a powerful computer and we would need to simulate the structure and function of trillions upon trillions of molecules as well as all the rules that govern how they interact. You would literally need computers that are trillions of times bigger and faster than anything existing today.[19]

Five years later, after the successful simulation of part of a rat brain, Markram was much more bold and optimistic. In 2009, as director of the Blue Brain Project, he claimed that "a detailed, functional artificial human brain can be built within the next ten years".[20] Less than two years into it, the project was recognised to be mismanaged and its claims overblown, and Markram was asked to step down.[21] [22]

Required computational capacity strongly depends on the chosen level of simulation model scale:[5]

Level                                      CPU demand (FLOPS)    Memory demand (TB)    $1 million supercomputer (earliest year of making)
Analog network population model            10^15                 10^2                  2008
Spiking neural network                     10^18                 10^4                  2019
Electrophysiology                          10^22                 10^4                  2033
Metabolome                                 10^25                 10^6                  2044
Proteome                                   10^26                 10^7                  2048
States of protein complexes                10^27                 10^8                  2052
Distribution of complexes                  10^30                 10^9                  2063
Stochastic behavior of single molecules    10^43                 10^14                 2111

Estimates from Sandberg & Bostrom, 2008
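
The "earliest year" column follows from extrapolating exponential growth in computing power per dollar. As a rough illustration only (not a calculation from the Sandberg & Bostrom report), the sketch below assumes a $1 million machine delivering about 10^15 FLOPS in 2008 and a doubling time of roughly 14 months, the trend line mentioned in the chart caption above; with these assumed constants it reproduces the table's years to within a few years.

```python
# Illustrative extrapolation only, not a calculation from the Sandberg & Bostrom report.
# Assumptions: a $1 million machine delivers ~1e15 FLOPS around 2008, and FLOPS per
# dollar doubles every ~14 months (the trend line described in the chart caption above).
import math

BASE_YEAR = 2008        # assumed anchor year
BASE_FLOPS = 1e15       # assumed $1M performance at the anchor year
DOUBLING_MONTHS = 14    # assumed doubling time

def earliest_year(required_flops: float) -> float:
    """Estimate when a $1M computer reaches required_flops under the assumptions above."""
    doublings = math.log2(required_flops / BASE_FLOPS)
    return BASE_YEAR + doublings * DOUBLING_MONTHS / 12.0

for level, flops in [
    ("Spiking neural network", 1e18),
    ("Electrophysiology", 1e22),
    ("Stochastic behavior of single molecules", 1e43),
]:
    print(f"{level}: ~{earliest_year(flops):.0f}")
# Prints roughly 2020, 2035, and 2117 - close to the table's 2019, 2033, and 2111.
```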

Scanning and mapping scale of an individual

When modelling and simulating the brain of a specific individual, a brain map or connectivity database showing the connections between the neurons must be extracted from an anatomic model of the brain. For whole brain simulation, this network map should show the connectivity of the whole nervous system, including the spinal cord, sensory receptors, and muscle cells. Destructive scanning of a small sample of tissue from a mouse brain, including synaptic details, is possible as of 2010.[23]

However, if short-term memory and working memory include prolonged or repeated firing of neurons, as well as intra-neural dynamic processes, the electrical and chemical signal state of the synapses and neurons may be difficult to extract. The uploaded mind may then perceive a memory loss of the events and mental processes immediately before the time of brain scanning.[5]

A full brain map has been estimated to occupy less than 2 × 10^16 bytes (20,000 TB) and would store the addresses of the connected neurons, the synapse type and the synapse "weight" for each of the brain's 10^15 synapses.[5] [failed verification] However, the biological complexities of true brain function (e.g. the epigenetic states of neurons, protein components with multiple functional states, etc.) may preclude an accurate prediction of the volume of binary data required to faithfully represent a functioning human mind.
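
The figure of roughly 2 × 10^16 bytes is consistent with a simple back-of-envelope calculation: about 20 bytes of address, type, and weight data per synapse across 10^15 synapses. The sketch below illustrates the arithmetic; the 20-byte record size is an assumption chosen to match the cited estimate, not a value from the source.

```python
# Back-of-envelope check of the storage estimate above. The per-synapse record size is
# an assumption chosen to be consistent with the cited ~2 x 10^16 byte figure: e.g. an
# 8-byte address of the connected neuron, a type code, and a weight, padded to 20 bytes.
NUM_SYNAPSES = 1e15        # approximate number of synapses in a human brain (from the text)
BYTES_PER_SYNAPSE = 20     # assumed record size per synapse

total_bytes = NUM_SYNAPSES * BYTES_PER_SYNAPSE
print(f"{total_bytes:.1e} bytes = {total_bytes / 1e12:,.0f} TB")   # 2.0e+16 bytes = 20,000 TB
```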

Serial sectioning

Serial sectioning of a brain

A possible method for mind uploading is serial sectioning, in which the brain tissue and perhaps other parts of the nervous system are frozen and then scanned and analyzed layer by layer, which for frozen samples at nano-scale requires a cryo-ultramicrotome, thus capturing the structure of the neurons and their interconnections.[24] The exposed surface of frozen nerve tissue would be scanned and recorded, and then the surface layer of tissue removed. While this would be a very slow and labor-intensive process, research is currently underway to automate the collection and microscopy of serial sections.[25] The scans would then be analyzed, and a model of the neural net recreated in the system that the mind was being uploaded into.

There are uncertainties with this approach using current microscopy techniques. If it is possible to replicate neuron function from its visible structure alone, then the resolution afforded by a scanning electron microscope would suffice for such a technique.[25] However, as the function of brain tissue is partially determined by molecular events (particularly at synapses, but also at other places on the neuron's cell membrane), this may not suffice for capturing and simulating neuron functions. It may be possible to extend the techniques of serial sectioning and to capture the internal molecular makeup of neurons, through the use of sophisticated immunohistochemistry staining methods that could then be read via confocal laser scanning microscopy. However, as the physiological genesis of 'mind' is not currently known, this method may not be able to access all of the necessary biochemical information to recreate a human brain with sufficient fidelity.

Brain imaging

Process from MRI acquisition to whole brain structural network[26]

It may be possible to create functional 3D maps of brain activity, using advanced neuroimaging technology, such as functional MRI (fMRI, for mapping change in blood flow), magnetoencephalography (MEG, for mapping of electrical currents), or combinations of multiple methods, to build a detailed three-dimensional model of the brain using non-invasive and non-destructive methods. Today, fMRI is often combined with MEG for creating functional maps of human cortex during more complex cognitive tasks, as the methods complement each other. Even though current imaging technology lacks the spatial resolution needed to gather the information needed for such a scan, important recent and future developments are predicted to substantially improve both the spatial and temporal resolutions of existing technologies.[27]

Brain simulation

There is ongoing work in the field of brain simulation, including partial and whole simulations of some animals. For example, the C. elegans roundworm, the Drosophila fruit fly, and the mouse have all been simulated to various degrees.[citation needed]

The Blue Brain Project by the Brain and Mind Institute of the École Polytechnique Fédérale de Lausanne, Switzerland, is an attempt to create a synthetic brain by reverse-engineering mammalian brain circuitry.

Issues

Practical issues

Kenneth D. Miller, a professor of neuroscience at Columbia and a co-director of the Center for Theoretical Neuroscience, raised doubts about the practicality of mind uploading. His major argument is that reconstructing neurons and their connections is in itself a formidable task, but it is far from being sufficient. Operation of the brain depends on the dynamics of electrical and biochemical signal exchange between neurons; therefore, capturing them in a single "frozen" state may prove insufficient. In addition, the nature of these signals may require modeling down to the molecular level and beyond. Therefore, while not rejecting the idea in principle, Miller believes that the complexity of the "absolute" duplication of an individual mind is insurmountable for the nearest hundreds of years.[28]

Philosophical issues

Underlying the concept of "mind uploading" (more accurately "mind transferring") is the broad philosophy that consciousness lies within the brain's information processing and is in essence an emergent feature that arises from large neural network high-level patterns of organization, and that the same patterns of organization can be realized in other processing devices. Mind uploading also relies on the idea that the human mind (the "self" and the long-term memory), just like non-human minds, is represented by the current neural network paths and the weights of the brain's synapses rather than by a dualistic and mystic soul and spirit. The mind or "soul" can be defined as the information state of the brain, and is immaterial only in the same sense as the information content of a data file or the state of computer software currently residing in the work-space memory of the computer. Data specifying the information state of the neural network can be captured and copied as a "computer file" from the brain and re-implemented into a different physical form.[29] This is not to deny that minds are richly adapted to their substrates.[30] An analogy to the idea of mind uploading is to copy the temporary information state (the variable values) of a computer program from the computer memory to another computer and continue its execution. The other computer may perhaps have a different hardware architecture but emulates the hardware of the first computer.
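
As a loose illustration of the program-state analogy above (and only of the analogy, not of brain emulation itself), the sketch below saves the variable values of a running computation on one machine and resumes it on another; the file name and the toy computation are purely hypothetical.

```python
# A loose illustration of the program-state analogy only, not of brain emulation itself:
# the "information state" of a computation (its variable values) is saved on one machine
# and execution continues on another, possibly with different hardware. The file name
# and the toy computation are hypothetical; pickle is Python's standard serialization module.
import pickle

def step(state: dict) -> dict:
    """One step of an arbitrary computation: accumulate a running total."""
    state["total"] += state["increment"]
    state["steps"] += 1
    return state

# Run a few steps on "machine A"...
state = {"total": 0, "increment": 3, "steps": 0}
for _ in range(5):
    state = step(state)

# ...capture the information state to a file...
with open("checkpoint.pkl", "wb") as f:
    pickle.dump(state, f)

# ...and on "machine B", load the captured state and continue execution where A left off.
with open("checkpoint.pkl", "rb") as f:
    restored = pickle.load(f)
for _ in range(5):
    restored = step(restored)
print(restored)   # {'total': 30, 'increment': 3, 'steps': 10}
```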

These issues have a long history. In 1775, Thomas Reid wrote:[31] "I would be glad to know... whether when my brain has lost its original structure, and when some hundred years after the same materials are fabricated so curiously as to become an intelligent being, whether, I say that being will be me; or, if, two or three such beings should be formed out of my brain; whether they will all be me, and consequently one and the same intelligent being."

A considerable portion of transhumanists and singularitarians place great hope in the belief that they may become immortal by creating one or many non-biological functional copies of their brains, thereby leaving their "biological shell". However, the philosopher and transhumanist Susan Schneider claims that, at best, uploading would create a copy of the original person's mind.[32] Schneider agrees that consciousness has a computational basis, but this does not mean we can upload and survive. According to her views, "uploading" would probably result in the death of the original person's brain, while only outside observers can maintain the illusion of the original person still being alive. For it is implausible to think that one's consciousness would leave one's brain and travel to a remote location; ordinary physical objects do not behave this way. Ordinary objects (rocks, tables, etc.) are not simultaneously here and elsewhere. At best, a copy of the original mind is created.[32] Neural correlates of consciousness, a sub-branch of neuroscience, states that consciousness may be thought of as a state-dependent property of some undefined complex, adaptive, and highly interconnected biological system.[33]

Others have argued against such conclusions. For example, Buddhist transhumanist James Hughes has pointed out that this consideration only goes so far: if one believes the self is an illusion, worries about survival are not reasons to avoid uploading,[34] and Keith Wiley has presented an argument wherein all resulting minds of an uploading procedure are granted equal primacy in their claim to the original identity, such that survival of the self is determined retroactively from a strictly subjective position.[35] [36] Some have also asserted that consciousness is a part of an extra-biological system that is yet to be discovered; therefore it cannot be fully understood under the present constraints of neurobiology. Without the transference of consciousness, true mind-upload or perpetual immortality cannot be practically achieved.[37]

Another potential consequence of mind uploading is that the decision to "upload" may then create a mindless symbol manipulator instead of a conscious mind (see philosophical zombie).[38] [39] Are we to assume that an upload is conscious if it displays behaviors that are highly indicative of consciousness? Are we to assume that an upload is conscious if it verbally insists that it is conscious?[40] Could there be an absolute upper limit in processing speed above which consciousness cannot be sustained? The mystery of consciousness precludes a definitive answer to this question.[41] Numerous scientists, including Kurzweil, strongly believe that the answer as to whether a separate entity is conscious (with 100% confidence) is fundamentally unknowable, since consciousness is inherently subjective (see solipsism). Regardless, some scientists strongly believe consciousness is the result of computational processes which are substrate-neutral. On the contrary, numerous scientists believe consciousness may be the result of some form of quantum computation dependent on substrate (see quantum mind).[42] [43] [44]

In light of uncertainty about whether to regard uploads as conscious, Sandberg proposes a cautious approach:[45]

Principle of assuming the most (PAM): Assume that any emulated system could have the same mental properties as the original system and treat it correspondingly.

Ethical and legal implications

The process of developing emulation technology raises ethical issues related to animal welfare and artificial consciousness.[45] The neuroscience required to develop brain emulation would require animal experimentation, first on invertebrates and then on small mammals before moving on to humans. Sometimes the animals would just need to be euthanized in order to extract, slice, and scan their brains, but sometimes behavioral and in vivo measures would be required, which might cause pain to living animals.[45]

In addition, the resulting animal emulations themselves might suffer, depending on one's views about consciousness.[45] Bancroft argues for the plausibility of consciousness in brain simulations on the basis of the "fading qualia" thought experiment of David Chalmers. He then concludes:[46] "If, as I argue above, a sufficiently detailed computational simulation of the brain is potentially operationally equivalent to an organic brain, it follows that we must consider extending protections against suffering to simulations."

It might help reduce emulation suffering to develop virtual equivalents of anaesthesia, as well as to omit processing related to pain and/or consciousness. However, some experiments might require a fully functioning and suffering animal emulation. Animals might also suffer by accident due to flaws and lack of insight into what parts of their brains are suffering.[45] Questions also arise regarding the moral status of partial brain emulations, as well as of creating neuromorphic emulations that draw inspiration from biological brains but are built somewhat differently.[46]

Brain emulations could be erased by computer viruses or malware, without the need to destroy the underlying hardware. This may make assassination easier than for physical humans. The attacker might take the computing power for its own use.[47]

Many questions arise regarding the legal personhood of emulations.[48] Would they be given the rights of biological humans? If a person makes an emulated copy of themselves and then dies, does the emulation inherit their property and official positions? Could the emulation ask to "pull the plug" when its biological version was terminally ill or in a coma? Would it help to treat emulations as adolescents for a few years so that the biological creator would maintain temporary control? Would criminal emulations receive the death penalty, or would they be given forced data modification as a form of "rehabilitation"? Could an upload have marriage and child-care rights?[48]

If simulated minds would come true and if they were assigned rights of their own, it may be difficult to ensure the protection of "digital human rights". For example, social science researchers might be tempted to secretly expose simulated minds, or whole isolated societies of simulated minds, to controlled experiments in which many copies of the same minds are exposed (serially or simultaneously) to different test conditions.[citation needed]

Political and economic implications

Emulations could create a number of conditions that might increase the risk of war, including inequality, changes of power dynamics, a possible technological arms race to build emulations first, first-strike advantages, strong loyalty and willingness to "die" among emulations, and triggers for racist, xenophobic, and religious prejudice.[47] If emulations run much faster than humans, there might not be enough time for human leaders to make wise decisions or negotiate. It is possible that humans would react violently against the growing power of emulations, especially if that power depresses human wages. Emulations may not trust each other, and even well-intentioned defensive measures might be interpreted as offense.[47]

Emulation timelines and AI risk

There are very few feasible technologies that humans have refrained from developing. The neuroscience and computer-hardware technologies that may make brain emulation possible are widely desired for other reasons, and logically their development will continue into the future. Assuming that emulation technology will arrive, a question becomes whether we should accelerate or slow its advance.[47]

Arguments for speeding up brain-emulation research:

  • If neuroscience is the bottleneck on brain emulation rather than computing power, emulation advances may be more erratic and unpredictable, depending on when new scientific discoveries happen.[47] [49] [50] Limited computing power would mean the first emulations would run slower and so would be easier to adapt to, and there would be more time for the technology to transition through society.[50]
  • Improvements in manufacturing, 3D printing, and nanotechnology may accelerate hardware production,[47] which could increase the "computing overhang"[51] from excess hardware relative to neuroscience.
  • If one AI-development group had a lead in emulation technology, it would have more subjective time to win an arms race to build the first superhuman AI. Because it would be less rushed, it would have more freedom to consider AI risks.[52] [53]

Arguments for slowing down brain-emulation research:

  • Greater investment in brain emulation and associated cognitive science might enhance the ability of artificial intelligence (AI) researchers to create "neuromorphic" (brain-inspired) algorithms, such as neural networks, reinforcement learning, and hierarchical perception. This could accelerate risks from uncontrolled AI.[47] [53] Participants at a 2011 AI workshop estimated an 85% probability that neuromorphic AI would arrive before brain emulation. This was based on the idea that brain emulation would require understanding some brain components, and it would be easier to tinker with these than to reconstruct the entire brain in its original form. By a very narrow margin, the participants on balance leaned toward the view that accelerating brain emulation would increase expected AI risk.[52]
  • Waiting might give society more time to think about the consequences of brain emulation and develop institutions to improve cooperation.[47] [53]

Emulation research would also speed up neuroscience as a whole, which might accelerate medical advances, cognitive enhancement, lie detectors, and the capability for psychological manipulation.[53]

Emulations might be easier to control than de novo AI because:

  1. Human abilities, behavioral tendencies, and vulnerabilities are more thoroughly understood, thus control measures might be more intuitive and easier to plan for.[52] [53]
  2. Emulations could more easily inherit human motivations.[53]
  3. Emulations are harder to manipulate than de novo AI, because brains are messy and complicated; this could reduce the risks of their rapid takeoff.[47] [53] Also, emulations may be bulkier and require more hardware than AI, which would also slow the speed of a transition.[53] Unlike AI, an emulation wouldn't be able to rapidly expand beyond the size of a human brain.[53] Emulations running at digital speeds would have less intelligence differential vis-à-vis AI and so might more easily control AI.[53]

As a counterpoint to these considerations, Bostrom notes some downsides:

  1. Even if we better understand human behavior, the evolution of emulation behavior under self-improvement might be much less predictable than the evolution of safe de novo AI under self-improvement.[53]
  2. Emulations may not inherit all human motivations. Perhaps they would inherit our darker motivations or would behave abnormally in the unfamiliar environment of cyberspace.[53]
  3. Even if there is a slow takeoff toward emulations, there would still be a second transition to de novo AI later on. Two intelligence explosions may mean more total risk.[53]

Because of the postulated difficulties that a whole brain emulation-generated superintelligence would pose for the control problem, computer scientist Stuart J. Russell, in his book Human Compatible, rejects creating one, simply calling it "so obviously a bad idea".[54]

Advocates

Ray Kurzweil, director of engineering at Google, has long predicted that people will be able to "upload" their entire brains to computers and become "digitally immortal" by 2045. Kurzweil has made this claim for many years, e.g. during his speech in 2013 at the Global Futures 2045 International Congress in New York, which claims to subscribe to a similar set of beliefs.[55] Mind uploading has also been advocated by a number of researchers in neuroscience and artificial intelligence, such as the late Marvin Minsky.[citation needed] In 1993, Joe Strout created a small web site called the Mind Uploading Home Page, and began advocating the idea in cryonics circles and elsewhere on the net. That site has not been actively updated in recent years, but it has spawned other sites including MindUploading.org, run by Randal A. Koene, who also moderates a mailing list on the topic. These advocates see mind uploading as a medical procedure which could eventually save countless lives.

Many transhumanists look forward to the development and deployment of mind uploading technology, with transhumanists such as Nick Bostrom predicting that it will become possible within the 21st century due to technological trends such as Moore's law.[5]

Michio Kaku, in collaboration with Science, hosted a documentary, Sci Fi Science: Physics of the Impossible, based on his book Physics of the Impossible. Episode four, titled "How to Teleport", mentions that mind uploading via techniques such as quantum entanglement and whole brain emulation using an advanced MRI machine may enable people to be transported vast distances at near light-speed.

The book Beyond Humanity: CyberEvolution and Future Minds by Gregory S. Paul and Earl D. Cox is about the eventual (and, to the authors, almost inevitable) evolution of computers into sentient beings, but also deals with human mind transfer. Richard Doyle's Wetwares: Experiments in PostVital Living deals extensively with uploading from the perspective of distributed embodiment, arguing for example that humans are currently part of the "artificial life phenotype". Doyle's vision reverses the polarity on uploading, with artificial life forms such as uploads actively seeking out biological embodiment as part of their reproductive strategy.

See also

  • Mind uploading in fiction
  • BRAIN Initiative
  • Brain transplant
  • Brain-reading
  • Cyborg
  • Cylon (reimagining)
  • Democratic transhumanism
  • Human Brain Project
  • Isolated brain
  • Neuralink
  • Posthumanization
  • Robotoid
  • Ship of Theseus – thought experiment asking whether objects that have had all of their parts replaced remain fundamentally the same object
  • Simulation hypothesis
  • Simulism
  • Technologically enabled telepathy
  • Turing test
  • The Future of Work and Death
  • Chinese room

References

  1. ^ A framework for approaches to transfer of a mind's substrate, Sim Bamford
  2. ^ Goertzel, Ben; Iklé, Matthew (2012). "Introduction". International Journal of Machine Consciousness. 04: 1–3. doi:10.1142/S1793843012020015.
  3. ^ Coalescing minds: brain uploading-related group mind scenarios
  4. ^ Kay KN, Naselaris T, Prenger RJ, Gallant JL (March 2008). "Identifying natural images from human brain activity". Nature. 452 (7185): 352–5. Bibcode:2008Natur.452..352K. doi:10.1038/nature06713. PMC 3556484. PMID 18322462.
  5. ^ a b c d e f g Sandberg, Anders; Bostrom, Nick (2008). Whole Brain Emulation: A Roadmap (PDF). Technical Report #2008-3. Future of Humanity Institute, Oxford University. Retrieved 5 April 2009. The basic idea is to take a particular brain, scan its structure in detail, and construct a software model of it that is so faithful to the original that, when run on appropriate hardware, it will behave in essentially the same way as the original brain.
  6. ^ Goertzel, Ben (December 2007). "Human-level artificial general intelligence and the possibility of a technological singularity: a reaction to Ray Kurzweil's The Singularity Is Near, and McDermott's critique of Kurzweil". Artificial Intelligence. 171 (18, Special Review Issue): 1161–1173. doi:10.1016/j.artint.2007.10.011.
  7. ^ Hopfield, J. J. (1982-04-01). "Neural networks and physical systems with emergent collective computational abilities". Proceedings of the National Academy of Sciences. 79 (8): 2554–2558. Bibcode:1982PNAS...79.2554H. doi:10.1073/pnas.79.8.2554. ISSN 0027-8424. PMC 346238. PMID 6953413.
  8. ^ a b Koch, Christof; Tononi, Giulio (2008). "Can machines be conscious?" (PDF). IEEE Spectrum. 45 (6): 55. doi:10.1109/MSPEC.2008.4531463. S2CID 7226896.
  9. ^ Lindop, Grevel; Whale, John (2020-03-19), "[Bubbling]", The Works of Thomas De Quincey, Routledge, pp. 337–341, doi:10.4324/9780429349119-17, ISBN 978-0-429-34911-9, S2CID 242630082, retrieved 2021-01-22
  10. ^ a b "Tech Luminaries Address Singularity". ieee.org. Archived from the original on 2009-05-01. Retrieved 2009-04-02.
  11. ^ Marvin Minsky, Conscious Machines, in 'Machinery of Consciousness', Proceedings, National Research Council of Canada, 75th Anniversary Symposium on Science in Society, June 1991.
  12. ^ Llinás, R (2001). I of the Vortex: From Neurons to Self. Cambridge: MIT Press. pp. 261–262. ISBN 978-0-262-62163-2.
  13. ^ Ray Kurzweil (February 2000). "Live Forever–Uploading The Human Brain...Closer Than You Think". Psychology Today.
  14. ^ Martin GM (1971). "Brief proposal on immortality: an interim solution". Perspectives in Biology and Medicine. 14 (2): 339–340. doi:10.1353/pbm.1971.0015. PMID 5546258. S2CID 71120068.
  15. ^ Prisco, Giulio (12 December 2012). "Uploaded e-crews for interstellar missions". kurzweilai.net. Retrieved 31 July 2015.
  16. ^ "Substrate-Independent Minds - Carboncopies.org Foundation". carboncopies.org. Archived from the original on 2014-01-03. Retrieved 2014-01-03.
  17. ^ Roadmap p.11 "Given the complexities and conceptual issues of consciousness we will not examine criteria 6abc, but mainly examine achieving criteria 1–5."
  18. ^ "Bluebrain - EPFL". epfl.ch. 19 May 2015.
  19. ^ Blue Brain Project FAQ Archived 2007-01-27 at the Wayback Machine, 2004
  20. ^ Jonathan Fildes (22 July 2009). "Artificial brain '10 years away'". BBC News.
  21. ^ "Your brain does not process information and it is not a computer – Robert Epstein | Aeon Essays". Aeon. Retrieved 2021-04-04.
  22. ^ Theil, Stefan (2015-10-01). "Why the Human Brain Project went wrong – and how to fix it". Scientific American. Retrieved 2021-04-04.
  23. ^ "New imaging method developed at Stanford reveals stunning details of brain connections". Stanford Medicine.
  24. ^ Merkle, R., 1989, Large scale analysis of neural structures, CSL-89-10 November 1989, [P89-00173]
  25. ^ a b ATLUM Project Archived 2008-02-01 at the Wayback Machine
  26. ^ Hagmann, Patric; Cammoun, Leila; Gigandet, Xavier; Meuli, Reto; Honey, Christopher J.; Wedeen, Van J.; Sporns, Olaf; Friston, Karl J. (2008). Friston, Karl J. (ed.). "Mapping the Structural Core of Human Cerebral Cortex". PLOS Biology. 6 (7): e159. doi:10.1371/journal.pbio.0060159. PMC 2443193. PMID 18597554.
  27. ^ Glover, Paul; Bowtell, Richard (2009). "Medical imaging: MRI rides the wave". Nature. 457 (7232): 971–2. Bibcode:2009Natur.457..971G. doi:10.1038/457971a. PMID 19225512. S2CID 205044426.
  28. ^ Will You Ever Be Able to Upload Your Brain?, www.nytimes.com
  29. ^ Franco Cortese (June 17, 2013). "Clearing Up Misconceptions About Mind Uploading". h+ Media.
  30. ^ Yoonsuck Choe; Jaerock Kwon; Ji Ryang Chung (2012). "Time, Consciousness, and Mind Uploading" (PDF). International Journal of Machine Consciousness. 04 (1): 257. doi:10.1142/S179384301240015X.
  31. ^ "The Duplicates Paradox (The Duplicates Problem)". benbest.com.
  32. ^ a b Schneider, Susan (March 2, 2014). "The Philosophy of 'Her'". The New York Times. Retrieved May 7, 2014.
  33. ^ Fundamental neuroscience. Squire, Larry R. (3rd ed.). Amsterdam: Elsevier / Academic Press. 2008. ISBN 9780123740199. OCLC 190867431.
  34. ^ Hughes, James (2013). Transhumanism and Personal Identity. Wiley.
  35. ^ Wiley, Keith (March 20, 2014). "Response to Susan Schneider's "Philosophy of 'Her'"". H+Magazine. Retrieved 7 May 2014.
  36. ^ Wiley, Keith (September 2014). A Taxonomy and Metaphysics of Mind-Uploading (1st ed.). Humanity+ Press and Alautun Press. ISBN 978-0692279847. Retrieved 16 October 2014.
  37. ^ Ruparel, Bhavik (2018-07-30). "On Achieving Immortality". Bhavik Ruparel. Retrieved 2018-07-31.
  38. ^ Michael Hauskeller (2012). "My Brain, my Mind, and I: Some Philosophical Issues of Mind-Uploading". Academia.edu. 04 (1): 187–200.
  39. ^ George Dvorsky. "You Might Never Upload Your Brain Into a Computer". io9.
  40. ^ Brandon Oto (2011), Seeking normative guidelines for novel future forms of consciousness (PDF), University of California, Santa Cruz
  41. ^ Ben Goertzel (2012). "When Should Two Minds Be Considered Versions of One Another?" (PDF).
  42. ^ Sally Morem (April 21, 2013). "Goertzel Contra Dvorsky on Mind Uploading". h+ Media.
  43. ^ Martine Rothblatt (2012). "The Terasem Mind Uploading Experiment" (PDF). International Journal of Machine Consciousness. 4 (1): 141–158. doi:10.1142/S1793843012400070. Archived from the original (PDF) on 2013-08-27.
  44. ^ Patrick D. Hopkins (2012). "Why Uploading Will Not Work, or, the Ghosts Haunting Transhumanism" (PDF). International Journal of Machine Consciousness. 4 (1): 229–243. doi:10.1142/S1793843012400136. Archived from the original (PDF) on 2012-09-06.
  45. ^ a b c d e Anders Sandberg (14 April 2014). "Ethics of brain emulations". Journal of Experimental & Theoretical Artificial Intelligence. 26 (3): 439–457. doi:10.1080/0952813X.2014.895113. S2CID 14545074.
  46. ^ a b Tyler D. Bancroft (August 2013). "Ethical Aspects of Computational Neuroscience". Neuroethics. 6 (2): 415–418. doi:10.1007/s12152-012-9163-7. ISSN 1874-5504. S2CID 145511899.
  47. ^ a b c d e f g h i Peter Eckersley; Anders Sandberg (December 2013). "Is Brain Emulation Dangerous?". Journal of Artificial General Intelligence. 4 (3): 170–194. Bibcode:2013JAGI....4..170E. doi:10.2478/jagi-2013-0011. ISSN 1946-0163.
  48. ^ a b Kamil Muzyka (December 2013). "The Outline of Personhood Law Regarding Artificial Intelligences and Emulated Human Entities". Journal of Artificial General Intelligence. 4 (3): 164–169. Bibcode:2013JAGI....4..164M. doi:10.2478/jagi-2013-0010. ISSN 1946-0163.
  49. ^ Shulman, Carl; Anders Sandberg (2010). Mainzer, Klaus (ed.). "Implications of a Software-Limited Singularity" (PDF). ECAP10: VIII European Conference on Computing and Philosophy. Retrieved 17 May 2014.
  50. ^ a b Hanson, Robin (26 November 2009). "Bad Emulation Advance". Overcoming Bias. Retrieved 28 June 2014.
  51. ^ Muehlhauser, Luke; Anna Salamon (2012). "Intelligence Explosion: Evidence and Import" (PDF). In Amnon Eden; Johnny Søraker; James H. Moor; Eric Steinhart (eds.). Singularity Hypotheses: A Scientific and Philosophical Assessment. Springer.
  52. ^ a b c Anna Salamon; Luke Muehlhauser (2012). "Singularity Summit 2011 Workshop Report" (PDF). Machine Intelligence Research Institute. Retrieved 28 June 2014.
  53. ^ a b c d e f g h i j k l m Bostrom, Nick (2014). "Ch. 14: The strategic picture". Superintelligence: Paths, Dangers, Strategies. Oxford University Press. ISBN 978-0199678112.
  54. ^ Russell, Stuart (2019). Human Compatible: Artificial Intelligence and the Problem of Control. Viking Press. ISBN 978-0-525-55861-3. OCLC 1113410915.
  55. ^ "Mind uploading & digital immortality may be reality by 2045, futurists say - KurzweilAI". kurzweilai.net.


Source: https://en.wikipedia.org/wiki/Mind_uploading