
I was reading about corporations and trusts and came to wonder how one could use existing legal tools to allow a robot to be as "autonomous" as possible.

I imagine a foundation or non-profit, owned by an irrevocable trust, holding an AI and the associated data. If an AI were vested with management of most key parts of an entity's operations and a docile board of directors, could the AI itself buy property, get contracts to "work", sell things, etc.? If so, under what conditions? Would it be comparable to the autonomy of a person under guardianship (theoretically and legally, I know it's not the same thing, but in practice)?

I have found some information on other stacks, but nothing that constitutes a complete answer to my queries.

Thank you for your help!

EDIT: I know that an AI is not a legal entity. By saying that the AI "buys, sells, etc." I didn't mean the AI as a legal entity; I meant the AI as a principle of operation of a legal entity, like a non-profit. My question refers to autonomy in practice; this is a very speculative thought exercise. I guess the real question is: what imaginative detours could be taken to get around the legal impossibility of an AI being a legal entity?

I thought the trust might be a good place to start, since in Canada there are data trusts or trusts protecting forests in perpetuity. I thought, why not an AI?

Alexis

2 Answers


Experienced director here.

I don't see a problem with the "AI as corporate CEO" concept.

There's a human Board of Directors, and each director has a duty of care to act in the corporation's best interests. Those "best interests" are to:

  • comply with the law;
  • make money to the satisfaction of its owners; *
  • conform to its stated mission. **

But as long as all of this is on track, the Board's role is largely that of a "Safety Control Rod Axe Man": you let the business run through well-chosen managers, and observe. ***

As long as nobody's screwing up the above to the dissatisfaction of stakeholders, people are going to leave you alone.

So if the AI is programmed with appropriate constraints, or if it has a "morality core" and a sense of prediction and consequence, it is unlikely to allow a situation where the Board would be obliged to intervene.

Obviously, if the AI goes all GLaDOS, then the Board will be obliged to intervene; failing to do so would be a breach of their duty of care, and they can be held liable for that.

* Owners. Corporate owners are allowed, in as many shells as you like... however, there must be human owners at the bottom of this pile of turtles. As long as all the human owners are A-OK with the business conduct, you're fine. An appropriate structure here might be the "not-for-profit" or "not-for-dividend" organization, where the organization is defined by a mission instead of a profit motive. In theory the owners are "The People" via the Attorney General; but in 200 Board meetings I've never seen either one show up LOL.

** The "mission" applies to non-profits or other structures which gain a tax benefit for focusing on a recognized charitable purpose or perform a role of government. In practice you answer to the IRS; they require an informational tax return where you document what you are doing with charity money. If you have dispensed with human owners by operating in a charitable space, then your AI needs to take extra care to conform with the requirements of that space.

*** The Board also picks human "officers", such as President, Secretary and Treasurer as state law requires, who are officially responsible for management. But these are often ceremonial roles if you have a well-established staff; the paper officer only has to step in if the staff has failed to do the job. Once I was President and my sole job was to convene the next general membership meeting. Two people attended, and one was me. State law defining these roles requires that the assignee oversee that the job is done; it does not require that the assignee do the actual job personally.

Harper - Reinstate Monica

There is no such thing as a "docile" director

A director who acts on the instructions of anyone else, be that another person, the author of their horoscope, or the programmer(s) of an AI, is failing in their duty as a director. A director must use their own judgement in the discharge of their duties.

Dale M