Tuesday, February 6, 2024

SOCIAL MEDIA – Congress and Others Point to Failings in Protecting the Public

The U.S. Senate Judiciary Committee’s hearing on online child sexual exploitation highlighted the need for action to protect children from predators, as well as the committee’s and the public’s frustration with the lack of response from social media companies. Despite assurances from the CEOs of five major social media companies that they were doing all they could to protect children from predatory behavior, committee members pressed them for more action and for their support of a variety of bills that have been proposed in Congress.

As reported in The Seattle Times, Senate Majority Whip and Judiciary Committee Chairman Dick Durbin, Democrat of Illinois, pointed out in his opening remarks:

“They’re responsible for many of the dangers our children face online. Their design choices, their failures to adequately invest in trust and safety, their constant pursuit of engagement and profit over basic safety have all put our kids and grandkids at risk.”

The comments of the chairman and committee members point to their frustration that social media companies such as Meta (Facebook, Instagram), TikTok, X (formerly Twitter), Discord, and Snap have not met the challenge of keeping children safe or made a good-faith effort to work with Congress to draft legislation that would help protect children. Senator Lindsey Graham, Republican of South Carolina and the committee’s ranking Republican, even said that the social media companies have “blood on their hands.”

The hearing pointed to the need for social media companies to devote more resources to their members’ safety, and for legislation that holds them to a set of rules that truly ensures the safety of their users.

Multiple committee members pressed the CEOs on whether they would support several bills that have been introduced in Congress, with few takers. Legislation mentioned during the hearing included:

· STOP CSAM (Child Sexual Abuse Material) Act. Includes mandatory child abuse reporting, expands protections for child victims and witnesses in federal court, facilitates restitution for victims of child exploitation, human trafficking, sexual assault, and crimes of violence, and empowers victims by making it easier to ask tech companies to remove child sexual abuse material from their platforms.

· EARN IT Act. Creates targeted exceptions to Section 230* of the Communications Decency Act of 1996 to remove blanket immunity from civil and criminal liability under child sexual abuse material laws, and establishes a National Commission on Online Child Sexual Exploitation Prevention.

· SHIELD Act. Would establish federal criminal liability for knowingly distributing intimate or sexually explicit images of a person without that person’s consent.

· Cooper Davis Act. Requires social media companies and other communication service providers to report to the DEA when they have actual knowledge that illicit drugs are being distributed on their platforms, or when someone who is not a practitioner or online pharmacy is distributing prescription pain medications or stimulants.

· Protect Act. Would raise the mandatory minimum sentence for possession of child pornography to the same level as for its receipt (that is, five years’ imprisonment).

· Kids Online Safety Act. Requires covered platforms (including social media sites) to design and operate the products or services used by minors in a way that prevents and mitigates certain harms, such as sexual exploitation and online bullying. Platforms must provide minors with safeguards, such as settings that restrict access to minors’ personal data, and give parents or guardians tools to supervise a minor’s use of a platform, such as control of privacy and account settings.

 

*Section 230 of the Communications Decency Act of 1996 states:

“No provider or user of an interactive computer service shall be treated as the publisher or speaker of any information provided by another information content provider.”

 

AARP recently highlighted the issue of account takeovers in its podcast The Perfect Scam. While its focus went beyond the exploitation of children on social media, it pointed to a similar source of the problem: lack of action on the part of social media companies. The two-part episode, “‘National Geographic’ Photographer Paul Nicklen Warns About Social Media Imposters,” describes the challenges that victims of account takeovers face on platforms such as Facebook, with scammers using their accounts to promote all sorts of scams, including romance scams, impostor scams, and celebrity scams.

Victims include celebrities and ordinary people who struggle to regain control of their accounts. Whether or not they regain control, they can also have problems with their reputations and with letting their friends know that they are not perpetrating the scams being run in their names. For example, Paul Nicklen, the National Geographic photographer, constantly takes down fake sites and accounts that impersonate him and reminds his followers that he is not asking for money or sending them individual emails telling them they are his favorite follower.

Kevin Long is the owner of Social Impostor (https://www.socialimpostor.com/), which helps high-profile people remove fake accounts. He relates several of his efforts to convince social media companies to improve their cybersecurity measures on behalf of their users, all without success.

Social media companies such as Facebook respond slowly, if at all, to complaints; lack clear and effective procedures to help account holders regain their accounts; and provide no way to talk to an actual person in a cybersecurity or fraud department.

Comment: Fighting and preventing crime takes all of us, not only for physical crimes but also for cybercrime. In recent years many elements of our society have mobilized to fight one form of cybercrime or another, including government, non-profit and for-profit organizations, and we the public. There is much work to be done to improve what we do now, and we need all elements of our society working on this problem: government, the public, non-profits, and business. Hopefully together.

A consensus may be growing that social media businesses need to step up and do more to protect their users from cybercrime. That consensus certainly exists in the Senate Judiciary Committee and at AARP. Failing action from social media, Congress needs to coalesce around an effective strategy and, if necessary, require social media companies to join the team.


The Seattle Times:

https://www.seattletimes.com/business/meta-tiktok-and-other-social-media-ceos-to-testify-before-senate-committee-on-child-exploitation/

 

CSPAN:

https://www.c-span.org/video/?532641-101/short-take-recap-social-media-executives-testimony-child-sexual-exploitation

https://www.c-span.org/video/?532641-1/social-media-company-ceos-testify-online-child-sexual-exploitation-part-1

 

AARP:

https://www.aarp.org/podcasts/the-perfect-scam/info-2023/paul-nicklen-part-1.html

https://www.aarp.org/podcasts/the-perfect-scam/info-2023/paul-nicklen-part-2.html

 

Ask Leo:

https://askleo.com/what-to-do-when-your-account-is-hacked/

 

Security.org:

https://www.security.org/digital-safety/account-takeover-annual-report/

https://www.security.org/digital-safety/account-takeover-prevention/

 

 
