# A Sample Policy Skeleton

**Disclaimer:** This is just a thought experiment about what a framework of laws governing technology companies and platforms might look like if it adhered to the Universal Declaration of Human Rights as its starting point. While I think some of the items here will be fairly uncontroversial, I am well aware that others are judgment calls that reflect my own personal perspective on how to interpret or implement a given item in the Declaration. The point is to start a conversation, not to propose a final resolution.

## Discrimination in the Provision of Services (Article 13)

- Tech companies and platforms should be subject to the same rules and regulations regarding discrimination in housing, employment, and the provision of other services as any company providing or facilitating services in the same industry.
- Companies that operate as "platforms," facilitating third-party participation in an industry without directly providing the service, must create certification and inspection processes that ensure their participants adhere to the applicable anti-discrimination rules and regulations governing the provision of their services.

## Protection of Minors and the Rights of Parents (Articles 25 and 26)

- Tech companies and platforms should have a legal responsibility to ensure that content on their properties neither contains nor links to child sexual abuse material or to services supporting or facilitating child sex trafficking.
- Tech companies must create mechanisms for identifying and verifying the identities of minors, including verifying parental/guardian consent to use their product or service. Such mechanisms must meet a "due diligence" threshold (i.e., a simple checkbox is not enough; some more robust form of verification is required).
- Advertising targeting minors must be strictly scrutinized to ensure that it meets all other applicable legal standards for decency and is not promoting illegal substances or activities or soliciting minors for any form of exploitation.

## Employment Rights and Benefits (Article 23)

- Individuals providing a "core service" of a business (meaning a service that accounts for more than 20% of the customer/end-user interactions or 10% of the gross revenues of a business entity) may only be considered "independent contractors" if:
  - they have the ability to set and modify their own rate of compensation (e.g., setting the price of their property listing on Airbnb), and
  - end-users/customers are able to freely choose which individual provider to engage (i.e., the provider is not "dispatched" to the user; the user chooses whom to hire).
- All other individuals providing a "core service" of a business must be treated as employees for the purposes of compensation and benefits.
- Employees providing a "core service" must be allowed to unionize and engage in collective bargaining.
- Tech companies, including platforms that facilitate but do not own the means of service provision, must take care that anyone hired to perform a "core service" meets the standards of safety, licensing, and training required of any employee providing the equivalent service in the equivalent analog industry.
- Tech companies and platforms must report any credible allegation of illegal behavior by an individual providing a "core service" within 48 hours of receiving it, and must cooperate in any subsequent investigation.

## Property Rights of Workers and Users

### Physical Ownership of Property (Article 17)

- Tech companies that do not provide the physical equipment or property necessary for the provision of a service (e.g., ride-sharing or property-listing platforms) may not require providers to install any equipment or software not strictly necessary for the provision of the service or the safety of end-users/customers, unless such equipment or software is given to providers free of charge, does not require any physical modification to the provider's own property, and can be turned off or removed when the provider is not working. (For example, a dispatch app for receiving jobs and routing is fine; requiring drivers to purchase and install a satellite tracking device on their car is not, nor is a permanently mounted camera that cannot be turned off.)

### Copyright and Likeness Protections (Article 27)

- Companies/platforms should have clear, verifiable take-down procedures for content reported to be in violation of an individual's intellectual property rights. This includes violations of a person's right to their own likeness and image: companies/platforms must respect the right of individuals to have their image and likeness permanently removed, regardless of whether the requester or a third party posted the content.

## Human Trafficking and Slavery (Article 4)

- Tech companies should have a legal responsibility to ensure that content on their properties neither contains nor links to any service providing or facilitating human trafficking or other forms of slavery.
- Tech companies/platforms that facilitate any service-for-hire must create mechanisms to verify that those offering services through the platform are doing so voluntarily and are not being compelled to work against their will.

## Government Contracts and Export Rules

### Certifications when Selling to Federal, State, and Local Government Entities or Contractors (Articles 5 and 21)

- All federal, state, and local government entities must certify, when purchasing any technology product/service that will be used to administer services to citizens, that:
  - the service does not produce undue burdens that limit the ability of individuals to gain equal access to services,
  - the service will not be used to infringe on individuals' rights to privacy and due process, and
  - the service will not be used in any act of torture or cruel and inhumane treatment of any person.

### Export Controls to Limit Use for Torture (Article 5)

- Tech companies seeking to sell a product or service to any foreign government entity, or to a company reasonably expected to provide services to a foreign government entity, must receive certification from the federal government either that the government in question is not suspected of engaging in torture or inhumane treatment, or that the product or service is not reasonably expected to facilitate any acts of torture or inhumane treatment.

## The Rights of Users

### Data Privacy (Article 12)

- Tech companies/platforms should never monetize, and should take special care to protect, information pertaining to:
  - health records or data pertaining to health matters,
  - financial, tax, or credit history records,
  - an individual's legal/criminal history,
  - the details of a person's family (such as the names and ages of their children), home (such as their address or phone number), or sexual activities,
  - content posted or sent with a reasonable expectation of privacy, such as direct correspondence (email, texts, or direct/internal messages, for example),
  - information that, in a user's home country or present location, might be the basis for discrimination or persecution (for example, LGBTQ status for an individual located in a country with anti-LGBTQ laws).
- Best practices should be followed to protect all such private data, including the use of encryption and access limitations for staff.
- Companies/platforms may be required to produce some information in relation to legal processes. Beyond such obligations, highly personal information should never be shared with outside parties without explicit user consent for each instance of sharing.
- Users should have the right to review, modify, and remove collected data about themselves in a company's records.

### Libel/Slander (Article 12)

- Tech companies/platforms should be required to comply with any take-down order regarding content adjudicated to be slanderous or libelous.

### Rights of Users Regarding Their Associations (Article 20)

- Users should be allowed to limit the ability of others to "follow" or "friend" them, including by limiting the search criteria/methods by which they can be found, making their profiles "private" or "restricted" from public viewing, or explicitly blocking certain individuals from following them.

### Algorithmic Transparency (Article 20)

- "Recommendation engines," whether for "friend"/"follower" connections or for additional content, should be transparent to and modifiable by the end user. In other words, users should be able to see why content or associations are being recommended to them and choose to opt out of future recommendations based on a given association. These controls should be displayed prominently and clearly so that users are encouraged to use them.

### Election Integrity Responsibilities (Article 21)

- Companies/platforms must ensure that any content (paid or organic) published by them or their users pertaining to an election (whether directly about a candidate or about a political issue) is published in "good faith." This means:
  - paid content may only be published by individuals verified as legal residents of the jurisdiction in which the election is taking place (i.e., no outside actors paying for content),
  - all content must clearly state who produced it, who posted it, and who paid for its amplification (if applicable); additionally, companies/platforms should make available tools that allow others to view what paid content is being promoted by other actors in the political space and how it is being targeted,
  - demonstrably false information intended to suppress voter participation (for example, promoting the wrong day for an election or false information about voter ID requirements) should be removed and the violating accounts banned from further posting, and
  - companies/platforms should observe a "blackout" period on political advertising prior to elections.