Age-Appropriate Design Governance

--

Four Steps to Prepare for AI Governance

1. Understand the Impact of AI Bias:

Board members and their legal advisors should recognize the extent of AI bias across sectors. Biases in healthcare AI have resulted in unequal care for patients of different races, socioeconomic classes, and ages. Product design targeted at older adults often fails because it relies on outdated stereotypes. AAA (Artificially Intelligent, Algorithmic, & Autonomous) Systems used in mortgage approvals have replicated historical housing discrimination. Systems that treat all users the same may deliver content, recommendations, or nudges designed for adults, even when the user is a child. It is crucial to acknowledge these biases and their harmful effects, many of which could have been prevented with sound AI governance.

2. Learn How Bias Infiltrates AI:

Contrary to popular belief, AI is neither neutral nor flawless. Algorithms, like opinions, can be biased. Bias can creep into every stage of the AAA System lifecycle, from problem identification to data collection and testing. Each touchpoint in the process is shaped by human experience and by historical biases in the data. However, every stage also presents an opportunity to identify and eliminate harmful biases, so directors should understand the lifecycle and promote risk management at each stage. If children could be present in your online product, service, or feature, how do you intend to engage with them in an age-appropriate and child-friendly way? A simple disaggregated audit, sketched below, is one place such a review can start.
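To make this concrete, here is a minimal sketch of one such check at the testing stage: a disaggregated audit of what a recommendation system serves to different age bands. The column names, age bands, and the 10% review threshold are illustrative assumptions, not requirements drawn from any specific code or framework.

```python
# Hypothetical sketch: a disaggregated audit of recommender output by age band.
# The column names ("user_age_band", "adult_oriented") and the 10% review
# threshold are illustrative assumptions.
import pandas as pd

def audit_by_age_band(recommendations: pd.DataFrame) -> pd.DataFrame:
    """Summarize, per age band, how often recommended content is adult-oriented."""
    summary = (
        recommendations
        .groupby("user_age_band")["adult_oriented"]
        .agg(share_adult_oriented="mean", sample_size="count")
        .reset_index()
    )
    # Flag any band of minors that receives adult-oriented content above the threshold.
    summary["needs_review"] = (
        summary["user_age_band"].isin(["under_13", "13_17"])
        & (summary["share_adult_oriented"] > 0.10)
    )
    return summary

# Toy example: each row is one recommendation served to one user.
served = pd.DataFrame({
    "user_age_band": ["under_13", "under_13", "13_17", "18_plus", "18_plus"],
    "adult_oriented": [1, 0, 1, 1, 0],
})
print(audit_by_age_band(served))
```

The value of a check like this is not the particular threshold but the fact that the metric is broken out by age band at all; an aggregate engagement or accuracy number would hide exactly the disparity a director should be asking about.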

3. Develop a Game Plan:

Directors must have a plan in place to institute and monitor AI governance. Shareholders may hold them accountable if they fail to address AI-related harms as those harms become increasingly visible; if the shareholders do not, state attorneys general most certainly will. Most executives currently lack the understanding and support needed for responsible AI practices, so boards should ensure that corporate executives quickly become well-versed in responsible AAA System governance. This will help companies navigate the evolving legal landscape and mitigate the risks associated with AI.

4. Understand How Your Business Upholds the Best Interests of the Child:

As businesses integrate AAA Systems into the flow of their online products and services, understanding how user data is processed is critical. Before that integration, the business should assess the risks the system poses to users, especially children. Using a child’s data without proper review and informed consent from the parent, or from the child, could place the business at serious risk, e.g., fines or restricted access to markets. Those relying solely on the protections of COPPA, or assuming they are covered by the legacy FTC Safe Harbor Program, will find that neither covers important aspects of Age-Appropriate Design Codes. Directors should know who in the organization is empowered, and has the agency, to uphold the Best Interests of the Child, and the Board may be required to step in where commercial interests conflict with those of child users.
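As one illustration of where that empowerment shows up in practice, here is a minimal sketch of a consent gate that runs before any AAA System feature processes a user's data. The field names, the age-of-digital-consent parameter, and the simplified consent model are assumptions for illustration only; actual obligations under COPPA and the various Age-Appropriate Design Codes go well beyond a single boolean check.

```python
# Hypothetical sketch: a pre-processing gate checked before an AAA System
# feature uses a user's data. Field names, age thresholds, and the consent
# model are illustrative assumptions, not legal guidance.
from dataclasses import dataclass

@dataclass
class UserRecord:
    age: int
    parental_consent: bool  # verified, informed consent from a parent or guardian
    user_consent: bool      # informed consent given by the user themselves

def may_process_for_personalization(user: UserRecord, digital_consent_age: int = 13) -> bool:
    """Return True only if the business has a basis to process this user's data."""
    if user.age < digital_consent_age:
        # Below the age of digital consent: verified parental consent is required.
        return user.parental_consent
    # Older minors and adults: the user's own informed consent is required;
    # minors should additionally default to the most protective settings.
    return user.user_consent

# Example usage:
child = UserRecord(age=11, parental_consent=False, user_consent=True)
print(may_process_for_personalization(child))  # False: no verified parental consent
```

A gate like this addresses only the consent question; the risk assessment described above still has to happen before the feature ships.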

--


Written by Jeff Kluge, FHCA - AI Governance & Audits

AI governance advocate, bridging business and ethical design of responsible tech, with a resolute focus on a brighter future for children.
