By Johnathan Josephs, MSL, AEM Regulatory Affairs Manager —
Humans understand “risk” as an organization’s legal, financial and criminal exposure if it does not follow industry laws and regulations.
One of these risk areas is conflicts of interest. In general, we are by nature poor judges of our own conflicts and not motivated to disclose them. However, it’s not our fault! Nuanced motivators such as private gain, hidden ownership, procurement fraud, bid-rigging, service on a board of directors, accepting gifts, or even family and romantic relationships are all pathways to typical conflicts of interest.
This raises a question, though: Can artificial intelligence (AI) understand and disclose these nuanced motivators, as we expect humans to?
On July 26, 2023, a press release from the Securities and Exchange Commission (SEC) announced a proposed rule featuring new requirements to address risks to investors from conflicts of interest associated with the use of predictive data analytics by broker-dealers and investment advisers. The goal is to prohibit predictive data analytics (artificial intelligence) and similar technologies from placing firm interests ahead of investors’ interests. The rule would, in effect, require the identified conflicts of interest to be “eliminated or their effect neutralized” before harm is done to the investor.
By now, we have probably all heard of the risks involved in adopting artificial intelligence. The most famous of these is Skynet, the network-based conscious group mind and artificial general superintelligence from the popular Terminator movie franchise. However, short of hiring Arnold Schwarzenegger to mitigate compliance concerns, the off-road heavy equipment industry is left wondering whether broader AI regulations will start to affect the way we do business. Manufacturers are seasoned and trained enough to discern a conflict of interest and mitigate it appropriately. However, as “the new kid on the block,” AI does not have the same level of training and ethics as its human counterparts.
Adopting AI is a priority because staying competitive is paramount. First, the wheel revolutionized agriculture. Then the screw held together increasingly complex construction projects. Today, assembly lines use robotic machines that have made life as we know it possible. Logic would dictate that we give these machines enough intelligence to operate in our stead. However, innovation often forgets ethics, and in response, industry leaders should consider all the inherent risks of AI:
- What is the overall response rate for monitoring AI risks within the business?
- What are the most common types of conflicts reported?
- What are the consequences for failure to complete a certification/review of AI work products?
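One way to make these questions measurable is to track disclosures as structured records and report on them each period. The sketch below is a minimal, hypothetical illustration in Python (the record fields and function names are assumptions for this example, not part of any AEM or SEC standard): it computes a response rate, the most common conflict types, and who has not completed a certification/review of AI work products.

```python
from collections import Counter
from dataclasses import dataclass

@dataclass
class Disclosure:
    """A single conflict-of-interest disclosure record (hypothetical schema)."""
    employee: str
    conflict_type: str   # e.g. "gift", "board seat", "family relationship"
    certified: bool      # True if the AI work-product certification/review is done

def risk_metrics(disclosures: list[Disclosure], headcount: int):
    """Summarize the three checklist questions for one reporting period."""
    # Response rate: unique employees who disclosed, over total headcount.
    response_rate = len({d.employee for d in disclosures}) / headcount
    # Most common conflict types reported, highest count first.
    common_types = Counter(d.conflict_type for d in disclosures).most_common(3)
    # Employees whose certification/review is still outstanding.
    uncertified = [d.employee for d in disclosures if not d.certified]
    return response_rate, common_types, uncertified
```

For example, with three disclosures from a ten-person team, `risk_metrics` would report a 30% response rate, rank “gift” as the most common conflict type, and flag anyone with an incomplete certification for follow-up.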
The Colonial Pipeline is the largest pipeline system for refined oil products in the U.S. The pipeline, consisting of three tubes, is 5,500 miles (8,850 km) long and can carry 3 million barrels of fuel per day between Texas and New York. On May 7, 2021, Colonial Pipeline proactively took its systems offline in response to a ransomware attack. On May 13, 2021, Colonial Pipeline announced that it had restarted its entire pipeline system and that product delivery had commenced to all markets, after paying a $4.4m (£3.1m) ransom to the cyber-criminal gang responsible for taking the U.S. fuel pipeline offline.
This tipping point for U.S. cybersecurity regulations kicked off the regulatory trends and regimes we see today. Manufacturers of heavy-duty off-road equipment could learn from the Colonial Pipeline case study in terms of cybersecurity, third-party risk management and ransomware attacks. Shoring up enterprise compliance and securing data networks may be a way to protect your business, protect the general public and comply with new regulations like the proposed SEC rule.
For more perspectives from AEM staff, subscribe to the AEM Industry Advisor.