President Biden's Executive Order (EO) released today outlines this administration's approach to harnessing the potential of artificial intelligence (AI) while mitigating risks. After many personal conversations with policymakers throughout this year, I find it interesting to see what’s included.
With its focus on innovation, security, privacy, and fairness in AI applications, the order lays out a multifaceted approach that creates both business opportunities and regulatory challenges in the evolving AI landscape.
The Executive Order covers a wide range of topics, but it specifically addresses several noteworthy points:
➡️ The Department of Commerce will craft watermarking guidelines (watermarking embeds a hidden mark within a digital asset like an image, video, or audio file to verify authenticity or claim ownership). I’ve repeatedly warned that this technology doesn’t yet work reliably, so this one will be challenging.
➡️ The National Institute of Standards and Technology (NIST) will formulate standards for red teaming practices (good guys purposely trying to uncover security vulnerabilities). Establishing standards helps identify and address vulnerabilities consistently and proactively, enhancing resilience against real-world threats.
➡️ Providers of “foundation models” are now mandated to share the results of red teaming exercises with governmental bodies. (Note: this applies to models that pose a “serious risk” to national security, national economic security, or national public health and safety.)
➡️ Privacy is addressed with calls for Congress to enact data privacy legislation and for regulatory bodies to dig into privacy-preserving techniques. If we are to maintain any personal freedoms, this is an absolute must!
➡️ AI applications that could potentially cause discriminatory effects in critical areas like hiring, housing, and judicial sentencing are addressed.
➡️ Shielding workers from job impacts tied to AI advancements is mentioned. This one will be extremely difficult and will likely require weighting the cost of technology solutions more heavily to balance the scales (e.g., taxing software use heavily while lowering the cost for employers to use human labor).
➡️ Fostering innovation, the order backs AI research and particularly emphasizes the efforts by smaller entities. I’m always encouraged to see smaller orgs being supported, but the devil is in the details on this one.
➡️ Tightening requirements for AI technologies deployed within federal governmental operations is also addressed. This will have an outsized impact as all companies seeking federal contracts will design to meet these requirements, creating de facto standards that affect all industries.
➡️ Accelerated hiring of AI professionals is identified as a priority, along with training federal employees about AI and rapid contracting of AI talent.
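To make the watermarking point above concrete, here is a toy sketch of the simplest embedding scheme, least-significant-bit (LSB) watermarking, assuming raw grayscale pixel bytes. Function names are mine, not from any guideline, and real provenance watermarking for AI-generated media is far more sophisticated (and far more fragile in practice, which is exactly why I keep warning about it):

```python
def embed_watermark(pixels: bytes, bits: list) -> bytearray:
    """Hide a bit string in the lowest bit of the first len(bits) pixel bytes."""
    stamped = bytearray(pixels)
    for i, bit in enumerate(bits):
        stamped[i] = (stamped[i] & 0xFE) | bit  # clear the lowest bit, then set it
    return stamped

def extract_watermark(pixels: bytes, n_bits: int) -> list:
    """Read the hidden bits back out of the lowest bit of each byte."""
    return [b & 1 for b in pixels[:n_bits]]

# Usage: a tiny 8-byte grayscale "image" and an 8-bit mark
image = bytes([200, 13, 77, 54, 9, 128, 255, 42])
mark = [1, 0, 1, 1, 0, 0, 1, 0]
stamped = embed_watermark(image, mark)
print(extract_watermark(stamped, len(mark)))  # → [1, 0, 1, 1, 0, 0, 1, 0]
```

Each pixel changes by at most 1, so the mark is invisible to the eye; the flip side is that trivial edits (re-encoding, resizing, cropping) destroy it, which is the core robustness problem the guidelines will have to confront.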
This executive order opens up a range of business opportunities and risks aligned with the items above.
Watermarking Technology Development: Companies could innovate in watermarking technologies to meet the new guidelines.
Security Consultancy: With NIST standardizing red teaming practices, security firms may find opportunities in offering these services.
AI Research and Development: Encouragement for smaller entities to engage in AI research could spur innovation and funding opportunities.
AI Professional Services: Accelerated hiring and training underline a growing market for AI expertise and education.
Privacy Legislation: Pending privacy laws could impose new compliance burdens on businesses.
Foundation Model Regulations: Mandates for sharing red teaming results could affect proprietary or sensitive business information.
AI Employment Transition: Balancing AI with job impacts may introduce operational and financial challenges.
Federal Contract Compliance: Tightened AI requirements for federal operations could necessitate costly adjustments for companies seeking federal contracts.
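On the privacy-preserving techniques the order asks regulators to examine, differential privacy is the leading candidate. A minimal sketch of its Laplace mechanism for a simple counting query (the names and parameters here are mine for illustration, not anything specified in the EO):

```python
import random

def laplace_noise(scale: float) -> float:
    """Laplace(0, scale) noise: the difference of two i.i.d. exponentials."""
    return random.expovariate(1.0 / scale) - random.expovariate(1.0 / scale)

def private_count(true_count: int, epsilon: float) -> float:
    """Release a count with epsilon-differential privacy.

    A counting query changes by at most 1 when any one person is added or
    removed (sensitivity 1), so Laplace noise with scale 1/epsilon suffices.
    """
    return true_count + laplace_noise(1.0 / epsilon)

# Usage: the noisy answer hovers around the true count of 100;
# smaller epsilon means more noise and stronger privacy
print(private_count(100, epsilon=1.0))
```

The appeal for regulators is that epsilon gives a quantifiable privacy guarantee rather than a vague promise of "anonymization."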
Bottom line, this executive order sets a framework that will both stimulate AI sector growth and, undoubtedly, impose new regulatory hurdles.