On human genome manipulation and Homo technicus: the legal treatment of non-natural human subjects

  • Original Research
  • Published in AI and Ethics

Abstract

Although legal personality has slowly begun to be granted to non-human entities that have a direct impact on the natural functioning of human societies (given their cultural significance), the same cannot be said for computer-based intelligence systems. While this notion has not had a significantly negative impact on humanity to this point in time, that only remains the case because advanced computerised intelligence systems (ACIS) have not been acknowledged as reaching human-like levels. With the integration of ACIS into medical assistive technologies such as companion robots and bionics, our legal treatment of ACIS must also adapt, lest society face legal challenges that may potentially lead to legally sanctioned discriminatory treatment. For this reason, this article exposes the complexity of normalizing definitions of “natural” human subjects, clarifies how current bioethical discourse has been unable to effectively guide ACIS integration into implanted and external artefacts, and argues for the establishment of legal delineations between various ACIS-human mergers in reference to legal protections and obligations internationally.


Notes

  1. Notwithstanding arguments that advanced computerised intelligence systems (ACIS) are being overly anthropomorphised by humanity, and that too much “human” attribution to these systems may pose other issues. While it may be rational for our species to attribute traits to inanimate objects to make them more relatable, we cannot forget that there is a fine line between empathetic and sympathetic “relatedness,” and that sympathy is an imagined version of empathy (or rather, an imagined relation to a sentiment experienced by another individual that one has also gone through oneself). This general concept digresses from the main substance of this article and shall be left aside for a different forum given the metaphysical nature of the subject.

  2. Defined as such to avoid the bias that some academics and laypeople hold towards the notions of genetically-modified (GM) products in general, which are arguably more produce- or animal-oriented than human-oriented in connotation.

  3. Insofar as it is unrelated to GeMa practices.

  4. Forthcoming work by author under typesetting.

  5. The law as it exists, without consideration for how it should be.

  6. To differentiate from pharmaceutically-enhanced individuals or those utilizing nootropics, as these may technically qualify as technological enhancements under particular interpretations of the phrase’s semantic meaning.

  7. Sparked by the merciless killing of George Floyd and others at the hands of local police officials in the USA during the global health crisis.

  8. Beauchamp and Childress seem only to address ethical concerns of biotechnology insofar as it relates to traditionally recognised concerns in the field—namely, in its use for creating animal-human hybrids to grow organs or other significant hormones/compounds for patients. Their lack of focus on issues of artificial intelligence’s integration into medical technologies in their newest edition of Principles of Biomedical Ethics is highly concerning in this regard, as it displays a lack of attention to current issues being faced by bioethicists, patients, and physicians alike. Out of the three cited, only Furrow et al. [14] addresses the subject of biobanks (pp. 557–563)—which is frightening considering new developments in the field towards biobank regulation in 2016 and beyond, such as the World Medical Association’s Declaration of Taipei (https://www.wma.net/policies-post/wma-declaration-of-taipei-on-ethical-considerations-regarding-health-databases-and-biobanks/, [2020, last accessed 5 February 2021]).

  9. Referring to the structure of the human body, as in cases of amputees or more generally to address deformities of the natural biologic components of the human form.

  10. When regarding access to the Internet of Things through electronic computer systems in a more specific sense. The development of books and reading do not fit into this category given the notion of “preadaptation” to these advances [42].

  11. Given its emphasis on the ethics within the art and practice of medicine, as opposed to life in a broader sense.

  12. Forthcoming work by author under typesetting.

  13. Defined as “the administration of total body irradiation and/or alkylating agents, at doses which will not allow autologous hematologic recovery” [2] or more simply as a severe form of decreased bone marrow activity.

  14. Specifically, whether the patient’s prosthesis or genetically manipulated trait(s) is (or are) visible to other members of the population and identify them as being “enhanced.” And for clarity, this “visibility” does not have to be restricted to what one may be able to perceive when walking down an avenue—much like tattoos may alter a person’s appearance but can be placed on virtually any area of the human body.

  15. Explicitly, the factors the author advocates to protect are those such as skin colour and sex or gender (outside of disorders not related to intersex conditions). Others may exist—such as ethnicity when tied to paternal or maternal ethnic origin, or notions of gender-specific hereditary familial “honour” and other similar gender-specific hereditary notions—but are not explicitly defined, so as to intentionally introduce a degree of ambiguity into such classifications.

  16. Which may furthermore be exacerbated by “gender correcting” procedures conducted on children in their youth regardless of the overall health of “abnormal” sexual organs or “malformed” genitalia. See the Columbia Center for Clinical Medical Ethics’ recent discussion with Katrina Karkazis (2021, last accessed 5 February 2021, https://www.youtube.com/watch?v=ZXIxA9G2ThY&feature=youtu.be) for more details on why gender has not been mentioned as an aspect of “normal” humanity thus far, and as an example of how ACIS allele selection may do more harm than good specific to sex selection in embryoblasts.

  17. Which these authors could feasibly be said to have significant influence over, given how widely their work is used within the field as reference and study material.

  18. Specifically, those with advanced self-learning architectures that either mimic or go beyond the capability of commonly understood notions of human logic.

  19. Forthcoming work by author under typesetting.

  20. These include related works like Case Studies in Biomedical Ethics: Decision-Making, Principles, and Cases, 2nd ed. by Veatch et al. [43].

  21. The use of “bias” here refers to the thoughts and beliefs that are unique to each human subject, though further discussion on this topic requires a thorough examination into phenomenological, epistemological, or metaphysical arguments on the subject. As such, they will not be expanded upon further herein.

  22. Forthcoming work by author under typesetting.

  23. Forthcoming work by author under typesetting.

  24. Explicitly targeted at explaining the presence or absence of ACIS, whether they be biologically- or mechanically-designed. To this end, “smart” devices as they are understood at the time of this writing do not fall into this category.

  25. And subsequently of a child “born” in that same environment.

  26. Specifically referring to a “guilty mind” in legal terms.

  27. The legal phrase for when a party represents themselves in court without the official assistance of an attorney.

  28. Referring to a court order to prevent a party from leaving or being removed from the jurisdiction of a particular court during legal proceedings.

  29. Or rather, as a concept that is unclassifiable in the whole context in which it is used.

  30. Specifically referring to legal guardianship under which a ward is considered to be totally and permanently incapable from a legal stance.

  31. More commonly understood as the superior taking legal responsibility for actions performed by their subordinate(s).

  32. “Common” property, insofar as that property cannot be subjected to individual ownership (like airspace).

  33. Regarding the competency of ACIS when under trial as property, as opposed to a legal person, or the need to consider the system competent.

  34. Implying that ACIS cannot be retained by the users of the system after a trial, as specific portions of it may be intermingled in such a manner that their constituent parts cannot be sufficiently parsed to distinguish which “version” of an ACIS’ code or data belongs to which user. This extends to the ability of a “third” party to question the claims of a given party’s principal ownership of an ACIS that is under legal dispute given the potential complexity of its architecture.

References

  1. Ackerley, R., Kavounoudias, A.: The role of tactile afference in shaping motor behaviour and implications in prosthetic innovation. Neuropsychologia (2015). https://doi.org/10.1016/j.neuropsychologia.2015.06.024


  2. Bacigalupo, A., Ballen, K., Rizzo, D., Giralt, S., Lazarus, H., Ho, V., Apperley, J.A.: Defining the intensity of conditioning regimens: working definitions. Biol. Blood Marrow Transplant. (2009). https://doi.org/10.1016/j.bbmt.2009.07.004


  3. Barfield, W.: Intellectual property rights in virtual environments: considering the rights of owners, programmers and virtual avatars. Akron Law Rev. 39, 649–700 (2006)


  4. Barrat, J.: Our final invention: artificial intelligence and the end of the human era. St. Martin’s, New York (2013)


  5. Beauchamp, T.L., Childress, J.F.: Principles of biomedical ethics, 8th edn. Oxf. Univ. Press, New York (2019)


  6. Björkman, B., Hansson, S.O.: Bodily rights and property rights. J. Med. Ethics 32(4), 209–214 (2006). http://www.jstor.com/stable/27719607

  7. Bostrom, N.: Superintelligence: paths, dangers, strategies. Oxf. Univ. Press, New York (2014)


  8. Cameron, J. (dir.): James Cameron’s avatar. 20th Century Fox, Los Angeles (2009)

  9. De Vignemont, F.: The mark of bodily ownership. Analysis (2013). https://doi.org/10.1093/analys/ant080


  10. Dockrill, P.: AI solves 50-year-old biology ‘grand challenge’ decades before experts predicted. ScienceAlert (2020). https://www.sciencealert.com/ai-solves-50-year-old-biology-grand-challenge-decades-before-experts-predicted/amp. Accessed 5 Feb 2021

  11. Du Preez, A.: Gendered bodies and new technologies: Rethinking embodiment in a cyber-era. Cambridge Scholars Publishing, Newcastle upon Tyne, UK (2009)


  12. Enriquez, P.: Editing humanity: On the precise manipulation of DNA in human embryos. N. C. Law Rev. 97(5), 1147–1240 (2019). https://scholarship.law.unc.edu/nclr/vol97/iss5/12

  13. Frangoul, H., Altshuler, D., Cappellini, M.D., Chen, Y., Domm, J., Eustace, B.K., Foell, J., et al.: CRISPR-Cas9 gene editing for sickle cell disease and β-thalassemia. N. Engl. J. Med. (2020). https://doi.org/10.1056/NEJMoa2031054


  14. Furrow, B.R., Greaney, T.L., Johnson, S.H., Jost, T.S., Schwartz, R.L.: Bioethics: health care law and ethics. West Publ., St. Paul, MN (2013)


  15. Garland, A. (dir.): Ex machina. Universal Pictures, London (2015)


  16. Glenn, L.M.: What is a person? In: Bess, M., Pasulka, D.W. (eds.) Posthumanism: the future of Homo sapiens, 1st edn., pp. 229–246. Macmillan Reference USA, Farmington Hills, MI (2018)

  17. Glenn, L.M.: Case study: ethical and legal issues in human machine mergers (or: The cyborgs cometh). Annals of Health Law 21(1), 175–180 (2012)

  18. Hays, S.A.: Transhumanism. Encyclopædia Britannica, web edn. (2018). https://www.britannica.com/topic/transhumanism. Accessed 5 Feb 2021

  19. Hurley, M.: Q who. In: Bowman, R. (dir.) Star trek: the next generation, season 2, episode 16. Paramount Domestic Television, Los Angeles (1989)


  20. IEEE Global Initiative on Ethics of Autonomous and Intelligent Systems: Ethically aligned design: a vision for prioritizing human well-being with autonomous and intelligent systems, first edn. IEEE (2019). https://standards.ieee.org/content/ieee-standards/en/industry-connections/ec/autonomous-systems.html

  21. Jaynes, T.L.: Legal personhood for artificial intelligence: citizenship as the exception to the rule. AI Soc. (2020a). https://doi.org/10.1007/s00146-019-00897-9


  22. Jaynes, T.L.: Citizenship as the exception to the rule: an addendum. AI Soc. (2020b). https://doi.org/10.1007/s00146-020-01105-9


  23. Jaynes, T.L.: The legal ambiguity of advanced assistive bionic prosthetics: Where to define the limits of ‘enhanced persons’ in medical treatment. Clinical Ethics (2021). https://doi.org/10.1177/1477750921994277

  24. Jumper, J., Evans, R., Pritzel, A., Green, T., Figurnov, M., Tunyasuvunakool, K., Ronneberger, O. et al.: High accuracy protein structure prediction using deep learning. In: 14th round critical assessment of techniques for protein structure prediction (abstract book). Protein Structure Prediction Center, Davis, CA, pp 22–24 (2020). https://predictioncenter.org/casp14/doc/CASP14_Abstracts.pdf

  25. Kass, L.R.: Defending human dignity. Commentary 124(5), 53–61 (2007). https://www.commentarymagazine.com/articles/leon-kass/defending-human-dignity/

  26. Lateef, Z.: Types of artificial intelligence you should know. Edureka (2020). https://www.edureka.co/blog/types-of-artificial-intelligence/. Accessed 5 Feb 2021

  27. Lauret, J.: GPT-3: the first artificial general intelligence? Towards Data Science (2020). https://towardsdatascience.com/gpt-3-the-first-artificial-general-intelligence-b8d9b38557a1. Accessed 5 Feb 2021

  28. Liu, S., Zhou, J., Zhang, X., Liu, Y., Chen, J., Hu, B., Song, J., Zhang, Y.: Strategies to optimize adult stem cell therapy for tissue regeneration. Int. J. Mol. Sci. (2016). https://doi.org/10.3390/ijms17060982


  29. Metz, C.: London A.I. lab claims breakthrough that could accelerate drug discovery. The New York Times (2020). https://www.nytimes.com/2020/11/30/technology/deepmind-ai-protein-folding.html. Accessed 5 Feb 2021

  30. Mizushima, S. (dir.) 「水島 精二、演出家」: Expelled from paradise 「楽園追放」. Toei Animat. 「東映アニメーション株式会社」with Graphinica「グラフィニカ」, Tōkyō-to 「東京都」(2014)「平成26年」

  31. Mostow, J.: Foreword: What is AI? And what does it have to do with software engineering? IEEE Trans. Softw. Eng. (1985). https://doi.org/10.1109/TSE.1985.231876


  32. Niccol, A. (dir.): Gattaca. Columbia Pict., Culver City (1997)


  33. Pazzaglia, M., Molinari, M.: The embodiment of assistive devices—from wheelchair to exoskeleton. Phys. Life Rev. (2016). https://doi.org/10.1016/j.plrev.2015.11.006


  34. Polson, N., Scott, J.: AIQ: how people and machines are smarter together. St. Martin’s, New York (2018)


  35. Primc, N.: Do we have a right to an unmanipulated genome? The human genome as the common heritage of mankind. Bioethics (2020). https://doi.org/10.1111/bioe.12608


  36. Ramachandran, G.: Against the right to bodily integrity: of cyborgs and human rights. Denver Univ. Law Rev. 87(1), 1–57 (2009). https://ssrn.com/abstract=1434712

  37. Rorty, R.: Human rights, rationality, and sentimentality. In: Hayden, P. (ed.) The philosophy of human rights, pp. 241–257. Paragon House, St. Paul (2001)


  38. Ruggiu, D.: Implementing a responsible, research and innovation framework for human enhancement according to human rights: the right to bodily integrity and the rise of ‘enhanced societies.’ Law Innov. Technol. (2018). https://doi.org/10.1080/17579961.2018.1452177


  39. Shook, J.R., Giordano, J.: Neuroethics beyond normal: performance enablement and self-transformative technologies. Camb. Quart. Healthc. Ethics (2016). https://doi.org/10.1017/S0963180115000377


  40. Stein, R.: 1st patients to get CRISPR gene-editing treatment continue to thrive. Shots—Health News from NPR (2020). https://www.npr.org/sections/health-shots/2020/12/15/944184405/1st-patients-to-get-crispr-gene-editing-treatment-continue-to-thrive. Accessed 5 Feb 2021

  41. Tegmark, M.: Life 3.0: Being human in the age of artificial intelligence. Alfred A. Knopf, New York (2017)


  42. Varney, N.R.: How reading works: considerations from prehistory to present. Appl. Neuropsychol. (1), 3–12 (2002)


  43. Veatch, R.M., Haddad, A.M., English, D.C.: Case studies in biomedical ethics: decision-making, principles, and cases, 2nd edn. Oxf. Univ. Press, New York (2015)


  44. Zhuang, K.Z., Sommer, N., Mendez, V., Aryan, S., Formento, E., D’Anna, E., Artoni, F., et al.: Shared human-robot proportional control of a dexterous myoelectric prosthesis. Nat. Mach. Intell. (2019). https://doi.org/10.1038/s42256-019-0093-5



Funding

Not applicable.

Author information

Corresponding author

Correspondence to Tyler L. Jaynes.

Ethics declarations

Conflict of interest

The author of this work is serving as the Chair for the IEEE Nanotechnology Council Standards Committee, and actively serves as the Secretary for the IEEE P2863™—Recommended Practice for Organizational Governance of Artificial Intelligence—Standard Development Group (organized under the IEEE Computer Society Standards Activity Board), which acts as an extension of the IEEE P7000 series of standards under the organisation’s Ethically Aligned Design initiative. All statements made herein are entirely those of the author, and do not reflect the opinions of the IEEE, IEEE Standards Association, or related Councils or Societies under their jurisdiction; nor those of the P2863™ Standard Development Group as an entity or its affiliated members (notwithstanding the author), or the IEEE’s Global Initiative on Ethics of Autonomous and Intelligent Systems and the Standard Development Groups that have arisen as a result of that work. Furthermore, the author’s statements are not reflective of the Alden March Bioethics Institute at Albany Medical College as an institution, its staff, or its curriculum, given the author’s standing as a student enrolled in the Master of Science in Bioethics programme and lack of direct employment by the institution or programme (including through federal work-study tuition subsidies).

Additional information

Publisher's Note

Springer Nature remains neutral with regard to jurisdictional claims in published maps and institutional affiliations.


About this article


Cite this article

Jaynes, T.L. On human genome manipulation and Homo technicus: the legal treatment of non-natural human subjects. AI Ethics 1, 331–345 (2021). https://doi.org/10.1007/s43681-021-00044-5


  • DOI: https://doi.org/10.1007/s43681-021-00044-5

Keywords