
tv   Social Media Company CEOs Testify on Online Child Sexual Exploitation -...  CSPAN  March 20, 2024 8:12pm-10:46pm EDT

8:12 pm
8:13 pm
this is just over three and a half hours.
8:14 pm
[gavel bangs] this meeting of the senate judiciary committee will come to order. i want to preface my remarks by saying that i have been in congress a few years. senator graham has as well. if you do not believe this is an idea whose time has come, take a look at the turnout here.
8:15 pm
today, the committee will continue its work on an issue on the mind of most american families -- how to keep our kids safe from sexual exploitation and harm in the internet age. online child sexual exploitation includes the use of online platforms to target and groom children and the production and endless distribution of child sexual abuse material, which can haunt victims for their entire lives and in some cases take their lives. everyone here will agree this conduct is abhorrent. i would like to turn to a brief video to hear directly from the victims and survivors about the impact these crimes have had on them. >> i was sexually exploited on facebook. >> i was sexually exploited on instagram. >> i was sexually exploited on x.
8:16 pm
>> look at how beautiful miriam is. >> my son riley died from suicide after being sexually exploited on facebook. >> a child who has been sexually exploited is never the same ever again. >> i reported this issue numerous times, and it took over a decade before anyone helped me. >> you might be able to tell that i'm using a green screen. why is that? in an internet world, my past abusers can contact me. fans of my abuse material as a child can find me and contact me. >> as a 17-year-old child, i read an impact statement after being extorted for four consecutive years. >> my son was in his room and suicidal. he's only 13 years old. i found out he and a friend had been exploited online.
8:17 pm
we contacted twitter, now x. the response was: thank you for reaching out. we reviewed the content and did not find a violation of our policy, so no action will be taken at this time. >> how many kids like matthew -- >> like olivia -- >> like riley -- how many more kids will suffer and die because of social media? >> they failed to protect my child from sexual exploitation. >> we need congress to do something for our children and protect them. >> it is not too late to do something about it. >> online child sexual exploitation is a crisis in america. in 2013, the national center for missing and exploited children received approximately 1,380 cyber tips per day. by 2023, just 10 years later, the number of cyber tips has
8:18 pm
risen to 100,000 reports a day. that's 100,000 daily reports of child sexual abuse material. in recent years, we have also seen an explosion in so-called financial sextortion, in which a predator uses a fake social media account to trick a minor into sending explicit photos and videos and threatens to release them unless the victim sends money. in 2021, 139 reports of sextortion were made. in 2023, the number skyrocketed to more than 22,000. more than a dozen children have died by suicide after becoming victims of this crime. this disturbing growth in child sexual exploitation is driven by one thing -- changes in technology. in 1996, the world's
8:19 pm
best-selling cell phone was the motorola startac. while groundbreaking at the time, it was not much different than a traditional phone. it allowed users to make and receive calls and even receive text messages, but that was about it. fast-forward to today. smartphones are in the pockets of seemingly every man, woman, and teenager on the planet. today's smartphones allow users to make and receive calls and texts, but they can also take photos and videos, support live-streaming, and offer countless apps. with the touch of your finger, that smartphone can entertain and inform you, but it can also become a back alley where the lives of your children are damaged and destroyed. these apps have changed the ways we live, work, and play, but as investigations have detailed, social media and messaging apps have also given predators powerful new tools to sexually exploit children.
8:20 pm
carefully crafted algorithms can be a more powerful force in the lives of our children than even the best-intentioned parent. discord has been used to groom, abduct, and abuse children, as instagram helped promote a network of pedophiles. snapchat's disappearing messages have been co-opted by criminals to financially extort young victims. tiktok has become a "platform of choice" for predators to access, engage, and groom children for abuse, and the prevalence of csam on x comes as the company has gutted its trust and safety workforce. today, we speak with the ceos of those companies, not only because their companies have contributed to the crisis, but because they are responsible for many of the crimes we see online. their design choices, their
8:21 pm
constant pursuit of engagement and profit over basic safety have put our kids and grandkids at risk. coincidentally, several of these companies implemented common-sense child safety improvements within the last week, days before their ceos would have to justify their lack of action to this committee. but the tech industry alone is not to blame for the situation we are in. those of us in congress need to look in the mirror. in 1996, the same year the motorola startac was flying off shelves, and years before social media went mainstream, we passed section 230 of the communications decency act. this law immunized then-fledgling internet platforms from liability for user-generated content. interestingly, only one other industry in america has immunity from civil liability. we will leave that for another day.
8:22 pm
for the past 30 years, section 230 has remained largely unchanged, allowing big tech to grow into the most profitable industry in the history of capitalism without fear of liability for unsafe practices. that has to change. over the past year, this committee has unanimously reported five bills that finally hold tech companies accountable for child sexual exploitation on their platforms. take a look at the membership of the senate judiciary committee and imagine, if you will, whether there's anything we can agree on unanimously. these five bills achieved agreement. one of these bills, the stop csam act, takes a stand against online child sexual exploitation. it's bipartisan and absolutely
8:23 pm
necessary. let this hearing be a call to action. we need to get kids online safety legislation to the president's desk. >> the republicans will answer the call. all of us. every one of us is ready to work with you and our democratic colleagues on this committee to prove to the american people that while washington is certainly broken, there is a ray of hope, and it is here. after years of working on this issue with you and others, i've come to conclude the following -- social media companies, as they are currently designed and operated, are dangerous products. they are destroying lives, threatening democracy itself. these companies must be reined in, or the worst is yet to come. brandon guffey is a republican
8:24 pm
representative from south carolina in the rock hill area. to all the victims who came and showed us photos of your loved ones, don't quit. it's working. despite the damage your families have been dealt, hopefully we can take your pain and turn it into something positive so nobody else has to hold up a sign. his son got online to instagram and was tricked by a group in nigeria that put up a young lady posing as his girlfriend, and as things go at that stage in life, he gave her some photos, compromising sexual photos, and it turned out that she was part of an extortion group in nigeria.
8:25 pm
they threatened the young man: if you don't give us money, we will expose these photos. he gave them money, but it was not enough. they kept asking for more, and he killed himself. these are bastards by any known definition. mr. zuckerberg, you and the companies before us, i know you don't mean it to be so, but you have blood on your hands. [cheers and applause] you have a product that's killing people. when we had cigarettes killing people, we did something about it. you want to talk about guns, we have the atf. nothing here. there's not a damn thing anybody can do about it. senators blumenthal and blackburn have been like the dynamic duo here. they have found emails from your
8:26 pm
company where they warned you about this stuff, and you decided not to hire 45 people who could do a better job of policing this. the bottom line is you cannot be sued. you should be, and these emails would be great for punitive damages, but the courtroom is closed to every american abused by all of the companies in front of us. it is now time to repeal section 230. this committee is made up of some of the most ideologically different people you could find. we have come together, through your leadership, mr. chairman, to pass five bills to deal with the problem of exploitation of children. i will talk about them in depth in a little bit. the bottom line is all these bills have met the same fate. they go nowhere. they leave the committee and they are done.
8:27 pm
now there's another approach -- what do you do with a dangerous product? you allow lawsuits, you have statutory protections, or you have a commission of sorts to regulate the industry in question. they take your license away if you have a license. they fine you. none of that exists here. we live in america in 2024 where there is no regulatory body dealing with the biggest, most profitable companies in the history of the world. they cannot be sued, and there's not one law on the books that meaningfully protects the american consumer. other than that, we are in a good spot. so here is what i think is going to happen. i think after this hearing today, we are going to put a lot of pressure on our colleagues and on leadership in the republican and democratic senate to let these bills hit the floor for a vote. and i'm going to go down starting in a couple of weeks,
8:28 pm
make unanimous consent requests. do the csam act. do all the bills. become famous. i am going to give you a chance to become famous. now, elizabeth warren and lindsey graham have almost nothing in common. i promised i would say that publicly. [applause] the only thing worse than me doing a bill with elizabeth warren is her doing a bill with me. we have sort of parked that because elizabeth and i see an abuse here that needs to be dealt with. senator durbin and i have different political philosophies, but i appreciate what you have done on this committee. you have been great. so to all my democratic colleagues, thank you very, very much. to all of my republican colleagues, thank you very, very much. save the applause for when we get a result. this is all talk right now. but there will come a day, if we keep pressing, when we get the right answer for the american people. what is that answer? accountability. now, these products have an upside. they have enriched our lives in many ways.
8:29 pm
mr. zuckerberg, you have created a product i use. the idea, i think, when you first came up with it, was to be able to talk to your friends and family and have a place where you can share what's going on in your life. there is an upside here. but the dark side has to be dealt with. it is now time to deal with the dark side, because people have taken your idea and turned it into a nightmare for the american people. they have turned it into a nightmare for the world at large. tiktok. we had a great discussion about
8:30 pm
how maybe larry ellison could protect american data from chinese communist influence. but tiktok's representative in israel quit the company, because tiktok is being used in a way to basically destroy the jewish state. this is not just about individuals. i worry that in 2024, our democracy will be attacked again through these platforms by foreign actors. we are exposed. and a.i. is just starting. so to my colleagues, we are here for a reason. this committee has a history of being tough, but also of doing things that need to be done. this committee has risen to the occasion. there is more that we can do, but to the members of this committee, let's insist that our colleagues rise to the occasion, also. in the 118th congress, we have the votes
8:31 pm
that can fix this problem. at the end of the day, all you can do is cast your vote, but you can urge the system to require others to cast their vote. mr. chairman, i will continue to work with you and everybody on this committee to have a day of reckoning on the floor of the united states senate. >> thank you, senator graham. today, we welcome five witnesses, who i will introduce. jason citron, the ceo of discord, inc. mark zuckerberg, the founder and ceo of meta. evan spiegel, the cofounder and ceo of snap inc. i will note for the record that mr. zuckerberg and mr. chew are appearing voluntarily. i am disappointed that our other witnesses have not done that. the others are here pursuant to subpoenas, and mr. citron only accepted service
8:32 pm
after u.s. marshals were sent to discord's headquarters at taxpayer expense. i hope this is not a sign of your commitment, or lack of commitment, to addressing the serious issue before us. after the witnesses are sworn in, each witness will have five minutes to make an opening statement. senators will then ask questions in rounds of seven minutes each. i expect to take a short break at some point during questioning to allow the witnesses to stretch their legs. if anyone is in need of a break at any time, please let my staff know. before i turn to the witnesses, i would also like to take a moment to acknowledge that this hearing has gathered a lot of attention, as we expected. we have a large audience, the largest i have seen in this room. i want to make clear, as with other judiciary committee hearings, we ask people to behave appropriately. i know there is high emotion in this room, for justifiable reasons, but i ask you to please follow the traditions of the committee. that means no standing, shouting, chanting, or applauding witnesses. disruptions will not be tolerated. anyone who does disrupt the hearing will be asked to leave. the witnesses are here today to address a serious topic. we want to hear what they have to say. i thank you for your cooperation.
8:33 pm
can all the witnesses please stand to be sworn in. do you affirm the testimony you are about to give before the committee will be the truth, the whole truth, and nothing but the truth, so help you god? let the record show that all witnesses have answered in the affirmative. mr. citron, please proceed with your opening statement. >> good morning. my name is jason citron, and i am the cofounder and ceo of discord. we are an american company with about 800 employees living and working in 33 states. today, discord has grown to more than 150 million monthly active users.
8:34 pm
discord is a communications platform where friends hang out and talk online about shared interests, from fantasy sports to writing music to video games. i have been playing video games since i was five years old, and as a kid, that is how i had fun and found friendship. some of my fondest memories are of playing video games with friends. we built discord so that anyone can build friendships playing video games, from minecraft to wordle and everything in between. games have always brought us together, and discord makes that happen today. discord is one of the many services that have revolutionized how we communicate with each other in the different moments of our lives. imessage, zoom, gmail, and on and on. they create communities, accelerate commerce, healthcare, and education. just like with all technology and tools, there are people who exploit and abuse our platforms for immoral and illegal purposes. all of us here on the panel today, and throughout the tech industry, have a solemn and
8:35 pm
urgent responsibility to ensure that everyone who uses our platform is protected from these criminals, both online and off. discord has a special responsibility to do that because a lot of our users are young people. more than 60% of our active users are between the ages of 13 and 24. that is why safety is built into everything we do and is essential to our mission and our business. and, most of all, this is deeply personal. i am a dad with two kids. i want discord to be a product that they use and love. and i want them to be safe on discord. i want them to be proud of me for helping to bring this product to the world. that is why i am pleased to be here today to discuss the important topic of the online safety of minors.
8:36 pm
i have submitted written testimony that provides a comprehensive overview of our safety programs. here are a few examples of how we protect and empower young people. first, we put our money into safety. the tech sector has a reputation of larger companies buying smaller ones to increase user numbers and boost financial results. but the largest acquisition we have ever made at discord was a company called sentropy. it did not help us expand our market share. it uses a.i. to help us identify, ban, and report criminals and bad behavior. it has actually lowered our user count by getting rid of bad actors. second, you have heard of end-to-end encryption that blocks anyone, including the platform itself, from seeing users' communications. it is a feature on dozens of platforms, but not on discord. that is a choice we have made.
8:37 pm
we do not believe we can fulfill our safety obligations if the messages of teens are fully encrypted, because encryption would block our ability to investigate a serious situation and, when appropriate, report to law enforcement. third, we have a zero-tolerance policy on child sexual abuse material, or csam. we scan images uploaded to discord to detect and block the sharing of this abhorrent material. we have also built an innovative tool, teen safety assist, that blocks images and helps block and report unwelcome conversations. we are also developing new semantic hashing technology for detecting novel forms of csam, called clip. finally, we recognize that improving online safety requires all of us to work together, so we partner with nonprofits, law enforcement, and our tech colleagues to stay ahead of the curve in protecting young people online.
8:38 pm
we want to be the platform that empowers our users to have better online experiences. to build true connections, genuine friendships, and to have fun. senators, i sincerely hope today is the beginning of an ongoing dialogue that results in real improvement in online safety. i look forward to your questions and to helping the committee learn more about discord. >> thank you, mr. citron. mr. zuckerberg? >> thank you, members of the committee. every day, teens and young people do amazing things on our services. they use our apps to create new things, express themselves, explore the world around them, and feel more connected to the people they care about. overall, teens tell us this is a positive part of their lives, but some face challenges online,
8:39 pm
so we work hard to reduce potential harms. being a parent is one of the hardest jobs in the world. technology gives us new ways to communicate with our kids and connect to their lives, but it can also make parenting more complicated, and it is important to me that our services are positive for everyone who uses them. we are on the side of parents working hard everywhere to raise their kids. over the last eight years, we have built more than 30 different tools and features so parents can set time limits for their teens using our apps and see who they are following, or report someone for bullying. for teens, we have added nudges to remind them when they have been using instagram for a while or when it is late and they should go to sleep. and we have added ways to hide words or people without those people finding out. we put special restrictions on instagram teen accounts by default. accounts for teens under 16 are set to private and have the most restrictive content settings, and they cannot be messaged by adults they do not follow or are not connected to. with so much of our lives spent on mobile devices and social media, it is important to look into the effects on teen mental health and
8:40 pm
well-being. i take this very seriously. mental health is a complex issue, and the existing body of scientific work has not shown a causal link between using social media and young people having worse mental health outcomes. a recent national academies of sciences report evaluated over 300 studies and found that the research, quote, did not support the conclusion that social media causes changes in adolescent mental health at the population level, end quote. it also suggested that social media can provide significant positive benefits when young people use it to express themselves and connect with others. still, we will use this research to inform our road map. keeping young people safe online has been a challenge since the internet began. as criminals evolve their tactics, we have to evolve our defenses, too. we work closely with law enforcement to find bad actors and help bring them to justice, but the difficult reality is that no matter how much we
8:41 pm
invest or how effective our tools are, there is always more to learn and more improvements to make. we remain ready to work with members of this committee and parents to make the internet safer for everyone. i am proud of the work that our teams do to improve online child safety on our services and across the entire internet. we have around 40,000 people overall working on safety and security, and we have invested more than $20 billion since 2016, including more than $5 billion in the last year alone. we have many teams dedicated to child safety, and we lead the industry in a lot of the areas we are discussing today. we built technology to tackle the worst online risks and share it to help the whole industry get better, like project lantern, which helps companies share data about people who break child safety rules, and we are founding members of take it down, which helps prevent nude images from being spread online. we also go beyond legal requirements and use sophisticated technology to proactively discover abusive material, and as a result, we find and report more inappropriate content than
8:42 pm
anyone else in the industry. as the national center for missing and exploited children put it this week, meta goes above and beyond to make sure that there are no portions of its network where this type of activity occurs. i hope we can have a substantive discussion today about legislation that delivers what parents say they want: a clear system for age verification and control over what apps their kids are using. three out of four parents want app store age verification, and four out of five want parental approval whenever teens download apps. we support this. parents should have the final say on what apps are appropriate for their children and should not have to upload their i.d. every time. that is what app stores are for. we also support setting industry standards on age-appropriate content and limiting signals for advertising to teens to age and location, and not behavior. at the end of the day, we want
8:43 pm
everyone who uses our services to have safe and positive experiences. before i wrap up, i want to recognize the families who are here today who have lost a loved one or lived through some terrible things that no family should have to endure. these issues are important for every parent on every platform, and i am committed to continuing to work in these areas, and i hope we can make progress today. >> thank you. mr. spiegel? >> chairman durbin, ranking member graham, and members of the committee, thank you for moving forward important legislation to protect children online. i am evan spiegel, the cofounder and ceo of snap. we created snapchat, an online
8:44 pm
service used by more than 800 million people to communicate with their friends and families. i know many of you have been working to protect children online since before snapchat was created, and we are grateful for your long-term dedication to this cause and your willingness to work together to help keep our communities safe. i want to acknowledge the survivors of online harms and the families who are here today who have suffered the loss of a loved one. words cannot begin to express the profound sorrow i feel that a service we made to bring people happiness and joy has been used to cause harm. i want to be clear: we understand our responsibility to keep our community safe. i also want to recognize the many families who have worked to raise awareness on these issues, push for change, and collaborated with lawmakers on legislation like the cooper davis act, which can help save lives. i started building snapchat with my cofounder bobby murphy when i was 20 years old. we designed snapchat to solve some of the problems we experienced online when we were teenagers. we did not have an alternative to social media.
8:45 pm
that meant pictures shared online were public, permanent, and subject to popularity metrics, and it was not very good. we built snapchat differently because we wanted a way to communicate that was fast, fun, and private. a picture is worth a thousand words, so people communicate with images and videos. we do not have public likes and comments when you share your story with friends. snapchat is private by default, meaning that people have to opt in and choose who can contact them. when we built snapchat, we made images delete by default. like prior generations who enjoyed the privacy of phone calls, which are not recorded, people have enjoyed moments that may not be picture-perfect but can instead convey emotion without permanence. even though they are deleted by default, we let everyone know
8:46 pm
that images and videos can be saved by the recipient. when we take action, we also retain the evidence for an extended period, which allows us to support law enforcement and hold criminals accountable. to help prevent the spread of harmful content on snapchat, we use a combination of automated processes and human review. we apply our content rules consistently and fairly across all accounts. we sample our enforcement actions for quality assurance to verify that we are getting it right. we also proactively scan for known child sexual abuse material, drug-related content, and other kinds of harmful content, and we deactivate it, preserve the evidence, and report it to the relevant authorities for further action. last year, we made 690,000 reports, leading to more than 1,000 arrests. we also removed 2.2 million pieces of drug-related content and blocked 705,000 associated accounts. even with our strict privacy settings, content moderation efforts, and law enforcement collaboration, bad things can still happen when people use
8:47 pm
online services. that is why we believe that people under the age of 13 are not ready to communicate on snapchat. we strongly encourage parents to use the device-level parental controls on iphone and android. we use them in our own household, and my wife approves every app that our 13-year-old downloads. for parents who want more visibility and control, we built family center, where you can review who your teen is talking to, review privacy settings, and set limits. we have worked for years with members of this committee on the kids online safety act and the cooper davis act, which we are happy to support. i also want to support broader legislation. no legislation is perfect, but some rules of the road are better than none. much of the work that we do to protect people on our service would not be possible without the support of our industry, government, nonprofit organizations and ngos, and in particular law enforcement and first responders who have committed their lives to keeping people safe.
8:48 pm
i am extraordinarily grateful for their efforts to prevent criminals from using online services to perpetrate their crimes. i feel a deep obligation to give back and to make a positive difference, and i am grateful to be here today as part of this vitally important democratic process. members of the committee, i give you my commitment that we will be part of the solution for online safety. we will be honest about our shortcomings, and we will work continuously to improve. thank you, and i look forward to answering your questions. >> thank you, mr. spiegel. mr. chew? >> members of the committee, i appreciate the opportunity to appear before you today. my name is shou chew, and i am the ceo of tiktok, an online community of more than 1 billion people worldwide, including well over 170 million americans who use our app every month to create, to share, and to
8:49 pm
discover. now, although the average age on tiktok in the u.s. is over 30, we recognize that special safeguards are required to protect minors, especially when it comes to combating all forms of csam. as a father of three young children myself, i know that the issues we are discussing today are horrific and the nightmare of every parent. i am proud of our efforts to address the threats to young people online, from our commitment to protecting them, to our industry-leading policies, use of innovative technology, and significant ongoing investment in trust and safety to achieve this goal. tiktok is vigilant about
8:50 pm
enforcing its 13-and-up age policy and offers an experience for teens that is much more restrictive than you and i would have as adults. we make careful product design choices to help make our app inhospitable to those seeking to harm teens. let me give you a few examples of long-standing policies unique to tiktok. we did not do them last week. first, direct messaging is not available to any users under the age of 16. second, accounts for people under 16 are automatically set to private, along with their content. furthermore, their content cannot be downloaded and will not be recommended to people they do not know. third, every teen under 18 has a screen time limit automatically set to 60 minutes. and, fourth, only people 18 and above are allowed to use our live-stream feature. i am proud to say that tiktok
8:51 pm
was among the first to empower parents to supervise their teens on our app with our family-pairing tools. this includes setting screen time limits, filtering out content, and more. we made these choices after consulting doctors and safety experts who understand the unique stages of teenage development, to ensure we have the appropriate safeguards to prevent harm and minimize risk. now, safety is one of the core priorities that defines tiktok under my leadership. we currently have more than 40,000 trust and safety professionals working to protect our community globally, and we expect to invest more than $2 billion in trust and safety efforts this year alone, a significant part of that in our u.s. operations. our robust community guidelines strictly prohibit content or behavior that puts teenagers at risk of exploitation or harm, and we vigorously enforce them.
8:52 pm
our technology helps us quickly identify potential csam and other material that breaks our rules. it automatically removes the content or escalates it to safety professionals for further review. we also moderate direct messages for csam and related material, and we use third-party tools like photodna and take it down to combat csam and prevent that content from being shared. we continually meet with parents, teachers, and teenagers. in fact, i sat down with a group just a few days ago. we use their insights to strengthen protections on our platform, and we also work with leading groups like the technology coalition. the steps we are taking to protect teens are a critical part of our larger trust and safety work as we continue our voluntary and unprecedented efforts to build a safe and secure data environment
8:53 pm
for u.s. users, ensuring that our platform remains free from outside manipulation and implementing safeguards on our content recommendation and moderation tools. keeping teens safe online requires a collaborative effort as well as collective action. we need commitment to protecting people online, and we welcome the opportunity to work with you on legislation to achieve this goal. our commitment is ongoing and unwavering because there is no finish line when it comes to protecting teens. thank you for your time and consideration today. i am happy to answer your questions. >> thank you. >> thank you. >> can you check if your
8:54 pm
microphone is on? >> maybe this is better. my apologies. let me start over. chair durbin, chairman graham, and esteemed ranking members of the committee, thank you for the opportunity to discuss x's work to protect the safety of minors online. today's hearing is titled a crisis. it calls for immediate action. as a mother, this is personal, and i share the sense of urgency. x is an entirely new company, an indispensable platform for the world and for democracy.
8:55 pm
you have my personal commitment that x will be active and a part of this solution. while i joined x only in june of 2023, i bring a history of working together with governments, advocates, and ngos to harness the power of media to protect people. before i joined, i was struck by the leadership steps that this new company was taking to protect children. x is not the platform of choice for children and teens. we do not have a line of business dedicated to children. children under the age of 13 are not allowed to open an account. less than 1% of the u.s. users on x are between the ages of 13 and 17. and those users are automatically set to a private default setting and cannot accept a message from anyone they do not approve.
8:56 pm
in the last 14 months, x has made material changes to protect minors. our policy is clear. x has zero tolerance towards any material that features or promotes child sexual exploitation. my written testimony details x's extensive policies on content or actions that are prohibited, including grooming, blackmail, and other harms to victims of cse. we also have more tools and technology to prevent those bad actors from distributing, searching for, and engaging with cse material. if cse content is posted on x, we remove it. and now we also remove any account that engages with cse content, whether it is real or computer-generated.
8:57 pm
last year, x suspended 12.4 million accounts for violating our cse policies. this is up from 2.3 million accounts that were removed by twitter in 2022. in 2023, 850,000 reports were sent to ncmec, including our first ever auto-generated report. this is eight times more than what was reported by twitter in 2022. we have changed our priorities. we have restructured our trust and safety teams to remain strong and agile.
8:58 pm
8:59 pm
we are building a trust and safety center of excellence in austin, texas, to bring more agents in house to accelerate our impact. we are applying to the technology coalition's project lantern to make further industry-wide impact. we have also opened up our algorithms for increased transparency. we want america to lead in this solution. x commends the senate for passing the report act, and we support the shield act. it is time for a federal standard to criminalize the sharing of nonconsensual intimate material. we need to raise the standards across the entire internet ecosystem, especially for those tech companies that are not here today and not stepping up. x supports the stop csam act. the kids online safety act should continue to progress, and we will continue to engage with it and ensure the protection of the freedom of speech. there are two additional areas that require everyone's attention. first, as the daughter of a police officer, law enforcement must have the critical resources to bring these bad offenders to justice. second, with artificial intelligence, offenders' tactics will continue to grow more sophisticated and evolve. industry collaboration is imperative here. x believes that the freedom of speech and platform safety can
9:00 pm
and must coexist. we agree that now is the time to act with urgency. thank you. i look forward to answering your questions. >> thank you very much, ms. yaccarino. now we will go into rounds of questions, seven minutes each for the members. i would like to make note of your testimony, ms. yaccarino. you were the first social media company to publicly endorse the stop csam act. >> it is our honor, chairman. >> thank you for doing that. i will still be asking some probing questions, but let me get down to the bottom line here. i am going to focus on my stop csam act, which would hold companies civilly liable when they intentionally or knowingly host or store child sexual abuse materials, or, secondly, intentionally or knowingly promote or aid or abet
9:01 pm
a violation of child sexual exploitation laws. is there anyone here who believes you should not be held civilly liable for that kind of conduct? mr. citron? >> good morning, chair. you know, we very much believe that this content is disgusting.
9:02 pm
and that there are many things about the stop csam act that are very encouraging, and we very much encourage adding more resources to the cyber tip line and modernizing it, with getting resources to ncmec, and i am very open to having a conversation with you and your team. >> i would sure like to do that because if you intentionally or knowingly host or store csam, i think you ought to be civilly liable. i cannot imagine anyone who would disagree with it. it is disgusting content. mr. spiegel, i want to tell you, i listened closely to your testimony here, and it has never been a secret that snapchat is used to send sexually explicit images. in 2013, early in your company's history, you admitted this in an interview. do you remember that interview? >> senator, i do not recall the specific interview. >> you said when you were first trying to get people onto the app, you would say you can send disappearing photos. and they would say, oh, for sexting? >> senator, when we first created the application, it was called picaboo, and the feedback we received from people was that they were
9:03 pm
actually using it to talk visually, and we changed the name of the application to snapchat. >> as early as 2017, law enforcement identified snapchat as the pedophiles' go-to sexual exploitation tool. the exploitation of a 12-year-old girl shows the danger. over 2 1/2 years, a predator sexually groomed her, sending her sexually explicit images and videos over snapchat. the man admitted he only used snapchat and not any other platforms because he, quote, "knew the chats would go away." did you and everyone else at snapchat really fail to see that the platform is a perfect tool for sexual predators? >> senator, that behavior is disgusting and reprehensible. we provide reporting tools so that people who have been harassed or have been sent inappropriate content can report it. we typically respond to those reports within 15 minutes so we
9:04 pm
can provide help. >> when the victim sued snapchat, the suit was dismissed under the communications decency act. do you have any doubt that if snap had faced civil liability for facilitating sexual exploitation, the company would have implemented even better safeguards? >> senator, we already work extensively to proactively detect this type of behavior. we make it very difficult for predators to find teens on snapchat. there is no public friends list, no public photos. when we recommend friends for teens, we make sure they have several mutual friends in common before making that recommendation. we believe that is important to preventing predators from misusing our platform. >> mr. citron, according to discord's website, discord takes a proactive and automated approach to safety only for servers with more than 200 members. smaller servers rely on server owners and community moderators
9:05 pm
-- how do you defend an approach to safety that relies on groups of fewer than 200 sexual predators to report themselves for things like grooming, trading csam, or sextortion? >> chair, our goal is to get all of that content off of our platform and ideally prevent it from showing up in the first place, and to prevent people from engaging in these kinds of horrific activities. we deploy a wide array of techniques that work across every surface on discord.
9:06 pm
as i mentioned, we recently launched something called teen safety assist. it is for teen users. it kind of acts like a buddy that lets them know if they are in a situation or talking with someone that may be inappropriate so they can report that to us and block that user. >> mr. citron, if that were working, we would not be here today. >> this is an ongoing challenge for all of us. that is why we are here today. 15% of our company is focused on trust and safety, of which this is one of our top issues. that is more people than we have working on marketing and promoting the company. we take these issues very seriously. it is an ongoing challenge, and i look forward to collaborating with nonprofits to improve our approach. >> i certainly hope so. your business is one of the more popular ones among children. can you explain to us what you're doing particularly and if you have seen any evidence of csam in your business? >> yes, senator. we have a strong commitment to trust and safety. as i said in my opening statement, i intend to invest more than $2 billion in trust and safety this year alone. we have 40,000 safety professionals working on this topic. we have a specialized child safety team to help us identify
9:07 pm
specialized horrific issues like the material you have mentioned. if we identify it on our platform through detection, we will remove it and report it to the authorities. >> why is it that tiktok is allowing children to be exploited into performing commercialized sex acts? >> senator, i respectfully disagree with that characterization. our live streaming product is not for anyone below the age of 18. we have taken action to identify anyone who violates that, and we remove them from using that service. >> at this point, i am going to turn to my ranking member, senator graham. >> thank you. mr. citron, you said we need to start a discussion. to be honest with you, we have been having this discussion for a very long time. we need to get a result, not a discussion.
9:08 pm
do you agree with that? >> ranking member, i agree that this is an issue we have been very focused on since we started our company in 2015. >> are you familiar with the earn it act? >> a little bit, yes. >> do you support that? >> we -- >> like yes or no? >> we are not prepared to support it today. >> but do you support the csam act? >> the stop csam act? >> or the shield act? >> we believe that the cyber tip line -- >> do you support it, yes or no? the project safe childhood act. do you support it? >> we believe that -- >> i will take that as a no. the report act. do you support it? >> ranking member graham, we look very much forward to having conversations with you and your team. >> do you support removing section 230 liability protections for social
9:09 pm
media companies? >> i believe that section 230 is in need of an update. it is a very old law. >> do you support repealing it so people can sue if they are harmed? >> 230 as written, while -- >> thank you very much. so here you are. if you wait on these guys to solve the problem, we are going to die waiting. mr. zuckerberg. i am trying to be respectful here. a state representative from south carolina's son got caught up in a sexual extortion ring in nigeria using instagram. he was shaken down for money. he paid, but it wasn't enough, and
9:10 pm
he killed himself. using instagram. >> it is terrible. no one should have to go through something like that. >> do you think he should be allowed to sue you? >> i think that they can sue us. >> i think they should be able to, and they can't. so the bottom line here, folks, is that this committee is done with talking. we passed five bills unanimously, and they take different approaches. look at who did this. graham-blumenthal. durbin-hawley. blackburn-ossoff. klobuchar. i mean, we have found common ground here that is just astonishing, and we have had hearing after hearing, mr. chairman, and the bottom line is i have come to conclude, gentlemen, that you are not going to support any of this.
9:11 pm
linda -- how do you say your last name? >> yaccarino. >> do you support the earn it act? >> we strongly support the -- the collaboration to raise industry levels -- >> no. in english, do you support the earn it act? yes or no. we do not need doublespeak. >> we look forward to -- >> ok. under the earn it act, you have to earn your liability protection. today you get it no matter what you do. it is now time to make sure the people holding up the signs can sue on behalf of their loved ones. nothing will change until the courtroom door is opened to
9:12 pm
victims of social media. $2 billion, mr. chew, what percentage is that of what you made last year? >> senator, it is a significant and increasing investment. >> you pay taxes. $2 billion -- what percent is that of your revenue? >> senator, we are not ready to share our financials in public. >> it is a lot if you make $100 billion. if you tell us you're going to spend $2 billion, great. how much do you make? it is all about eyeballs. your goal is to get eyeballs on you. and this is not just about children. the damage is being done. do you realize, mr. chew, that your tiktok representative in israel resigned yesterday? >> yes, i am aware. >> he said, "i resigned from tiktok. we are living in a time in which our existence as jews in israel and israel is under
9:13 pm
attack and danger." multiple screenshots taken from tiktok's internal employee platform, known as lark, referenced iranian-backed terror groups, including the houthis in yemen. >> i want to make it very clear, hate speech -- >> why did he resign? why did he quit? >> senator, we -- >> why did he quit? he had a good job. he gave up a good job because he thinks your platform is being used to help people who want to destroy the jewish state. and i'm not saying that you want that. mr. zuckerberg, i do not think you want, as an individual, any of these harms, but i am saying the product you have created, with all the upsides, has a dark
9:14 pm
side. mr. citron, i am tired of talking and having discussions. we all know the answer here. stand by your product. defend your practices. open up the courthouse door. until you do that, nothing will change. until these people can be sued for the damage they are doing, it is all talk. i'm a republican who believes in free enterprise, but i also believe that every american who has been wronged has somebody to go to to complain. there is no commission to go to that can punish you. there is not one law on the books, because you oppose everything we do, and you cannot be sued. that has to stop. how do you expect people in the audience to believe that we are going to help their families if we don't have some system, or combination of systems, to hold these people accountable? because for all the upside, the dark side is too great to live with. we do not need to live this way as americans.
9:15 pm
>> thank you, senator graham. senator klobuchar is next. she has been quite a leader on this subject for a long time. she acted with senator cornyn on the revenge porn bill. >> thank you very much, chairman, and thank you, ranking member, for those words. i could not agree more. for too long, we have been seeing the social media companies turn a blind eye when kids joined these platforms in record numbers. they have used algorithms that push adult content, and they provided a venue, maybe not knowingly at first, for dealers to sell deadly drugs like fentanyl. the head of our drug enforcement
9:16 pm
administration said they basically have been captured by the cartels in mexico and in china. i strongly support, first of all, the stop csam act, and agree with senator graham that nothing is going to change unless we open up the courtroom doors. i think the time for immunity is done, because money talks stronger than we talk. two of the five bills are my bills with senator cornyn. one has passed through the senate and is awaiting action in the house. the other is the shield act, and i do appreciate the support of x. it is about revenge porn. the fbi director testified before this committee that there have been over 20 suicides of kids attributed to online sextortion in the last year.
9:17 pm
but for those parents and those families, this is for them about their own child, but it's also about making sure this doesn't happen to other children. i know because i've talked to these parents. parents like bridget, from hastings, minnesota, who lost her teenage son after he took a fentanyl-laced pill that he purchased on the internet. amy neville, who is also here. her son alexander was only 14 when he died after taking a pill he did not know was actually fentanyl. we are starting a law enforcement campaign, one pill kills, in minnesota, going to schools with the sheriffs and law enforcement. the way to stop it is at the border and points of entry, but we know that 30% of the people getting fentanyl are getting it
9:18 pm
off the platforms. meanwhile, social media platforms generated $11 billion in revenue in 2022 from advertising directed at children and teenagers, including nearly $2 billion in ad profits derived from users age 12 and under. when a boeing airplane lost a door in midflight several weeks ago, nobody questioned the decision to ground a fleet of over 700 airplanes. so why are we not taking the same type of decisive action on the danger of these platforms when we know these kids are dying? we have bills that have passed through this incredibly diverse committee -- when it comes to our political views -- and they should go to the floor. we should do something about liability, and we should turn to
9:19 pm
some of the other issues that a number of us have worked on when it comes to the charges for app stores and when it comes to monopoly behavior and self-preferencing. but i will stick with this today. one third of fentanyl cases investigated over five months had direct ties to social media. that is from the dea. between 2012 and 2022, cyber tip line reports of online child sexual exploitation increased from 415,000 to more than 32 million. as i noted, at least 20 victims committed suicide in sextortion cases. i will start with that with you
9:20 pm
, mr. citron. the shield act includes a provision that would help provide protection and accountability for those who are threatened by these predators. young kids take a picture and send it in and think they have a new girlfriend or boyfriend. it ruins their life and they kill themselves. can you tell me why you're not supporting the shield act? >> we think it's important that teenagers have a safe experience on our platform. i think the portion to strengthen law enforcement and the ability to investigate and hold bad actors accountable -- senator klobuchar: so you are saying you may support it? >> we would like to have conversations with you. we do welcome legislation and regulation. this is an important issue for our country and we've been prioritizing safety -- senator klobuchar: thank you. i'm more interested in whether you support it. there has been so much talk and popcorn throwing. i'm tired of this. it has been 28 years since the internet and we have not passed any of these bills.
9:21 pm
because everyone doubletalks. it is time to actually pass them. and the reason they have not passed is because of the power of your companies. let me be clear about this. what you say matters. your words matter. mr. chew, i am a cosponsor of the stop csam act, along with senator hawley, the lead republican, which among other things empowers victims, making it easier for them to have tech companies remove the material and related imagery from their platforms. why would you not support this bill? >> we largely support it. i think the spirit is aligned with what we want to do. there are questions that some groups have raised, and we look forward to addressing them. i think if this legislation is passed, we will comply. senator klobuchar: mr. spiegel, we talked ahead of time. i appreciate your company's support for the cooper davis
9:22 pm
act. it is a bill that will allow law enforcement to do more when it comes to fentanyl. i think you know what a problem it is. one boy was suffering dental pain, so he tried to purchase a percocet and instead bought a counterfeit drug laced with fentanyl. as his mother said, "all the hopes and dreams we had for him were erased in the blink of an eye, and no mother should have to bury their kid." talk about why you support the cooper davis act. >> we strongly support it, and we believe it will help the dea go after the cartels and get dealers off the streets to save more lives. senator klobuchar: are there others that support that bill?
9:23 pm
ok. no. mr. zuckerberg, in 2021 the wall street journal reported on internal meta research documents asking "why do we care about tweens?" and answering its own question by citing meta's internal emails: they are "a valuable but untapped audience." at a commerce hearing, i asked meta's head of safety why children ages 10 to 12 are so valuable, and she said we do not knowingly attempt to recruit people who are not old enough to use our apps. when the 42 state attorneys general brought their case, they said that statement wasn't accurate. in 2021, ms. davis received an email from the instagram research director saying they are investing in experiences targeting young ages, 10 to 12. in a february 2021 instant
9:24 pm
message, one employee wrote they are working to recruit gen alpha before they reach their teenage years. a 2018 email distributed inside meta said that children under 13 would be critical for increasing the rate of acquisition when users turn 13. square that with what i heard in the testimony at the commerce hearing that they were not being targeted. and i ask again why your company does not support the stop csam act or the shield act? >> we had discussions internally about whether we should build a kids version of instagram. we've not moved forward with that, and we have no plans to do so.
9:25 pm
i cannot speak directly to the exact emails you cited, but it sounds to me like they were deliberations around a project that we thought was important and did not end up moving forward with. on the bills, i agree with the goal of all of them. there are specific things i would probably do differently, and we also have our own legislative proposal that we think would be most effective in helping the internet and various companies give parents control over the experience. i'm happy to go into detail. ultimately, i believe -- senator klobuchar: i think these parents will tell you that this stuff has not worked, to give parents control. they do not know what to do. it is hard, and that's why we are coming up with other solutions that we think are more helpful
9:26 pm
to law enforcement, but this idea of getting something going on liability. i believe, with the resources that you have, that you could do more than you are doing, or these parents would not be sitting behind you in this senate hearing room. >> i don't think parents should have to upload an id or proof that they are a parent in every single app that their children use. the easier place to do this is in the app stores themselves. my understanding is apple and google, or at least apple, already requires parental consent when a child makes a payment within an app. it should be trivial to pass a law that requires parents to have control and offer consent anytime a child downloads an app. the research we have done shows that the vast majority of parents want that, and that's the
9:27 pm
type of legislation that would make this easier. senator klobuchar: i remember one mother telling me, with all these things she can do, that she cannot figure them out. it's like a faucet overflowing, and she is out there with a mop while her kids are being exposed to this material. we have to make it simpler for parents so they can protect their kids, and i don't think this is the way to do it. i think the answer is what senator graham has talked about: opening up the halls of the courtroom. that puts it on you guys to protect these parents and protect these kids. and pass some of these laws that make it easier for law enforcement. >> thank you, senator klobuchar. we will try to stick to the seven-minute rule. it did not work very well, but i'll try to give additional time on the other side as well. senator cornyn. >> but there's no question that your platforms are very popular.
9:28 pm
we know that here in the united states we have an open society and a free exchange of information, but there are authoritarian governments and criminals who will use your platforms for the sale of drugs, for sex, for extortion, and the like. mr. chew, i think your company is unique among the ones represented here today because of its ownership by bytedance, a chinese company, and i know there have been some steps you've taken to wall off the data collected in the united states. the fact of the matter is that under chinese national intelligence laws, all information accumulated by companies in the people's republic of china is required to be shared with the chinese intelligence services.
9:29 pm
bytedance's initial release of tiktok was in 2016. the effort you made with oracle was in 2021, and the data was allegedly walled off in 2023. what happened to the data that tiktok collected before that? >> tiktok is owned by bytedance. we have three americans on the board. you're right in pointing out that over the last three years we've spent billions of dollars on project texas. it is a plan that is unprecedented in our industry to wall off data from the rest of our staff. >> i'm asking about all of the
9:30 pm
data that you collected prior to that. >> we have finished the first phase of [indiscernible] and we are beginning phase two, where we will require a third party to verify the deletion and then go into employees' work laptops to delete that data as well. >> was the data collected by tiktok prior to project texas shared with the chinese government pursuant to the national intelligence laws? >> we have not been asked for any data by the chinese government, and we have never provided it. >> your company is unique among those represented today because you are undergoing review by the committee on foreign investment in the united
9:31 pm
states, correct? >> senator, there are ongoing discussions, and a lot of the project texas work involves discussions with many agencies. >> cfius is designed to review foreign investments for national security risks, correct? >> yes, i believe so. >> and your company is being reviewed by this interagency committee at the treasury department for potential national security risks. >> this review is of the acquisition of musical.ly, which is an acquisition that was done many years ago. >> do you provide information to the treasury department about how your platform operates and any national security risk? >> it has been many years and a lot of discussions around how
9:32 pm
our systems work. we have a lot of robust discussions about a lot of detail. >> 63% of teenagers, i understand, use tiktok. does that sound right? >> senator, i cannot verify that. we know we are popular among many age groups. the average age of our users in the u.s. is over 30. we are aware we are popular. >> you reside in singapore with your family, correct? >> yes, i reside in singapore, and i work here in the u.s. as well. >> do your children have access to tiktok in singapore? >> if they lived in the u.s., i would give them access to the under-13 experience. my children are under age 13. >> my question is, in singapore, do they have access to tiktok? or is it restricted by domestic law? >> we do not have an under-13 experience in singapore.
9:33 pm
we created the under-13 experience in response to that. >> a wall street journal article contradicts what your company is stating publicly. according to the journal, employees under project texas say that user data, including user emails, dates of birth, and ip addresses, continues to be shared with staff of bytedance, a chinese company. do you dispute that? >> yes, senator. there are many things about that article that are not accurate. what it gets right is that it is a voluntary project, and we spent billions of dollars on it. there are thousands of employees involved, and it is very difficult. >> why is it important that the data collected from u.s. users be stored in the united states? >> this project was built in
9:34 pm
response to some of the concerns that were raised by this committee and others. >> that was because of concerns that data stored in china can be accessed by the chinese communist party under the national intelligence laws, correct? >> senator, we are not the only company that does business -- that has chinese employees. we are not the only company in this room that hires chinese nationals. in order to address these concerns, we have moved the data into our own infrastructure and built a 2,000-person team to oversee it. the data is walled off from the rest of the organization, and we opened it up to third parties like oracle to give us third-party validation. this is unprecedented access. i think we are unique in taking more steps to protect user data. >> you disputed the wall street journal story published yesterday. are you going to conduct any
9:35 pm
sort of investigation to see if there is any truth to the allegations made in the article, or are you just going to dismiss it outright? >> we are not going to dismiss them. we have ongoing security inspections, not only by our own personnel but by third parties, to ensure that the system is rigorous and robust. no system anyone can build is perfect. but what we need to do is make sure we are improving it and testing it against people who try to bypass it. if anybody breaks our policies within the organization, we take disciplinary action against them. >> thank you, senator cornyn. senator coons? >> thank you, chair. i would like to start by thanking the families that are here today. all the parents who are here because of a child they have lost. all the families that are here because you want us to see you and to know your concerns. you have contacted each of us in our offices, expressing your
9:36 pm
grief, loss, passion, and concern. the audience that is watching cannot see this. they can see the witnesses, but this room is packed as far as the eye can see. and when this hearing began, many of you held up pictures of your beloved and lost children. i benefit from and participate in social media, as do many in this committee, the nation, and the world. a majority of people on earth now participate in and benefit from one of the platforms you've launched or lead or represent. we have to recognize that there are some real positives to social media. it has transformed modern life, but it has also had a huge impact on families and children and on nations. there's a whole series of bills championed by members of this committee that try to deal with
9:37 pm
the trafficking in illicit drugs and the trafficking in illicit child sexual material, things that are facilitated on your platforms and that may lead to self-harm or suicide. so we've heard from several of the leaders of this committee, the chair and ranking member and the experienced senators. the frame through which we look at this is consumer protection. when there is some new technology, we put in place regulations to make sure it is not overly harmful, as my friend senator klobuchar pointed out. one door flew off of one airplane and no one was hurt, and yet the entire boeing fleet of that type of airplane was grounded and a federal fit-for-purpose agency did an immediate safety review. i am not going to point to the other pieces of legislation that i think are urgent for us to take up and pass, but to the core question of transparency. if you are a company manufacturing a product that is allegedly addictive and harmful,
9:38 pm
one of the first things we look to is safety information. we try to give our constituents, our consumers, warnings and labels that help them understand the consequences of this product and how to use it safely or not. as you've heard from some of my colleagues, if you sell an addictive, defective, harmful product in this country in violation of regulations, you get sued. what is distinct about platforms as an industry is that most of the families who are here are here because there were not sufficient warnings and they cannot effectively sue you. so let me dig in for a moment if i can, because each of your companies voluntarily discloses information about the content and the safety investments you make and the actions you take. there was a question by senator graham earlier about tiktok.
9:39 pm
mr. chew, you said you invest $2 billion in safety. mr. zuckerberg, my background memo says your revenue is $85 billion and you are investing $5 billion in safety. so what matters, what matters is the relative numbers and the absolute numbers. if there's anyone in this world who understands data, it's you guys. i want to walk through whether or not these voluntary measures of disclosure of content and harm are sufficient. i would argue that we are here because they are not. without better information, how can policymakers know whether the protections you testified about, the new initiatives, the starting programs, the monitoring and the takedowns, are actually working? how can we understand meaningfully how big these
9:40 pm
problems are without measuring and reporting data? mr. zuckerberg, you referenced the national academy of sciences study that said at the population level there's no proof of harm to mental health. it may not be at the population level, but i'm looking at a room with hundreds of parents, parents who have lost children. and our challenge is to take the data and make decisions about protecting families and children from harm. let me ask about what your companies do or do not report, and i will particularly focus on your content policies around self-harm and suicide. i will ask a series of yes/no questions, and what i'm getting at is, do you disclose enough. mr. zuckerberg, for your policy prohibiting content about suicide or self-harm, do you report an estimate of the total amount of content, not a percentage of the overall, not a prevalence number, but the total amount of content on your platform that violates this
9:41 pm
policy? and do you report the total number of views that self-harm or suicide-promoting content that violates this policy gets on your platform? >> yes, senator. we pioneered quarterly reporting on community standards enforcement across all these categories of harmful content. we focus on prevalence, which you mentioned, because what we are focused on is what percent of the content that we take down -- >> mr. zuckerberg, you are very talented and i have very little time left. i am trying to get an answer to a question. not as a percentage of the total. it's a huge number, so the percentage is small. but do you report the actual amount of content and views self-harm content receives? >> i believe we focus on prevalence. >> correct. you don't.
9:42 pm
ms. yaccarino, do you report it? >> senator, we have less than 1% of our users that are between the ages of 13 and 17. >> do you report the absolute number? >> we have taken down almost 1 million posts in regard to mental health. >> mr. chew, do you disclose the number of appearances of these types of content and how many are viewed before they are taken down? >> we disclose the number based on each category and how many were taken down before they were reported. >> mr. spiegel? >> yes, senator, we do disclose. >> i have three more questions i would go through if i had unlimited time. i will submit them for the record. platforms need to hand over more data about how the algorithms work, what the content does, and
9:43 pm
what the consequences are. not at the aggregate. not at the population level. the actual numbers of cases, so we can understand the content. in closing, mr. chairman, i have a bipartisan bill, the platform accountability and transparency act, cosponsored by several senators. it is in front of the commerce committee. it would set reasonable standards for disclosure and transparency to make sure we are doing our job based on data. yes, there is a lot of emotion in this field, understandably. but if we are to legislate responsibly about the management of the content on your platforms, we need to have better data. is there any one of you willing to say now that you support this bill?
9:44 pm
mr. chairman, let the record reflect a yawning silence from the leaders of the social media platforms. thank you. >> we are on the first of two roll calls. please understand that there's no disrespect. they are doing their job. >> thank you, mr. chairman. tragically, survivors of sexual abuse are often victimized and re-victimized over and over again by having nonconsensual images of themselves shared on social media platforms. there was a study that pointed out one instance of csam reappeared more than 490,000 times after it had been reported. after it had been reported. we need tools to deal with this. we need laws to mandate standards so this doesn't happen
9:45 pm
, so we have a systematic way of getting rid of this stuff, because there is literally no plausible justification and no way of defending this. one tool that i think would be particularly effective is a bill that i will be introducing later today, and i invite all my committee members to join me. it is called the protect act. it would require websites to verify age and verify that they received the consent of any and all individuals appearing on their site in pornographic images, and it would require platforms to have meaningful processes for an individual seeking to have images of himself or herself removed in a timely manner. based on your understanding of existing laws, what might it take for a person to have those images removed, say, from x? >> senator lee, thank you.
9:46 pm
it sounds like what you are going to introduce, in terms of ecosystem-wide age verification and user consent, sounds exactly like part of the philosophy of why we are supporting the shield act. no one should have to endure nonconsensual images being shared online. >> without laws in place, and it's fantastic anytime a company, as you have described with yours, wants to take those steps, it's very helpful, but it can take a lot longer than it should, and sometimes it does, to the point where someone had images shared 490,000 times after it was reported to the authorities. that is deeply concerning. yes, the protect act would work in tandem with the shield act.
9:47 pm
mr. zuckerberg, let's turn to you next. as you know, i feel strongly about privacy and believe one of the best protections for an individual's privacy online involves end-to-end encryption. we know that a great deal of grooming and sharing of csam happens to occur on end-to-end encrypted systems. does meta allow juvenile accounts, people under age 18, on its platforms to use encrypted services? >> we allow people under age 18 to use whatsapp. >> do you have a bottom-level age below which they are not allowed to use it? >> i don't think we allow people under the age of 13. >> what about you, mr. citron? do you have -- do you allow kids
9:48 pm
to have accounts to access encrypted messaging? >> discord is not allowed for children under the age of 13, and we do not use end-to-end encryption for text messages. we feel it's important to respond to law enforcement requests. we are also working on proactively building technology; we are working with a nonprofit to build a grooming classifier and identify these conversations so we can intervene, give those teenagers tools to get out of the situation, and maybe even report those conversations to law enforcement. >> it can be harmful, especially if you are on a site where children are being groomed and exploited. if you allow children on an end-to-end encryption enabled app, that can prove problematic. let's go back to you, mr. zuckerberg. instagram announced it will
9:49 pm
restrict all teenagers from access to eating disorder material, suicidal ideation, and self-harm content, and that is fantastic. what is odd, what i'm trying to understand, is why instagram is restricting access to sexually explicit content, but only for teenagers ages 13-15. why not restrict it for 16- and 17-year-olds as well? >> my understanding is we don't allow sexually explicit content for people of any age. >> how is that going? [laughter] [applause] >> our prevalence metrics
9:50 pm
suggest 99% of the accounts we removed were identified using ai systems. i think our efforts are industry leading. the other thing you asked about was self-harm content, which is what we recently restricted. we made the shift because, i think, the state of the science is shifting a bit. previously we believed that when people were thinking about self-harm, it was important for them to express that and get support, and now more of the thinking in the field is that it is better to not show that content at all, which is why we recently moved to restrict it from showing up. >> is there a way for parents to make a request on what their kids can or can't see on your site? >> there are a lot of parental controls. i don't think we currently have a control around topics, but we do allow parents to control the
9:51 pm
times that children are on the site, and a lot of it is based on monitoring and understanding what the teenager's experience is. >> mr. citron, let me turn to you. 17% of minors who use discord have had online sexual interactions on your platform. 17%. 10% have had those interactions with someone they believed to be an adult. do you restrict minors from accessing discord servers that host pornographic material? >> senator, we do restrict that. discord does not recommend that content. we allow adults to share it in age-gated communities, but we don't allow teenagers to access it.
9:52 pm
>> i see my time has expired. thank you. >> welcome, everyone. we are in this hearing because, as a collective, your platforms really suck at policing themselves. we hear about it here in congress with drug dealing facilitated across platforms. we see it and we hear about it here in congress with harassment and bullying that takes place across platforms. we see it and hear about it here in congress with respect to child pornography, sexual exploitation, and blackmail, and we are sick of it. it seems to me there's a problem
9:53 pm
with accountability because these conditions continue to persist.
9:54 pm
in my view, section 230, which provides immunity from lawsuits, is a significant part of that problem. if you look at where bullies have been brought to heel, whether it is dominion, the election machine manufacturer, finally getting justice against fox news after a long campaign to try to discredit the election, or whether it is the mothers and fathers of the sandy hook victims finally getting justice against infowars and its campaign of trying to get people to believe that the massacre of their children was a fake put on by them, or even now, more recently, with a writer getting a significant settlement, the courtroom proves to be the place where these things get sorted out. i will just describe one case, if i may. it is called doe v. twitter. the plaintiff in that case was blackmailed in 2017 for a sexually explicit photo and video of himself, then aged 13 to 14. a compilation video of multiple
9:55 pm
csam videos surfaced on twitter in 2019. a concerned citizen reported that video on december 25th, 2019. christmas day. twitter took no action. the plaintiff, then a minor in high school, became aware of this video from his classmates in january of 2020. you are a high school kid, and suddenly there is that. that is a day that is hard to recover from. ultimately, he became suicidal. he and his parents contacted law
9:56 pm
enforcement and twitter to have these videos removed on january 21st and 22nd, 2020, and twitter ultimately took the video down on january 30th, once federal law enforcement got involved. that is a pretty foul set of facts. when the family sued twitter for all of those months of refusing to take down the explicit video of this child, twitter invoked section 230, and the district court ruled that the claim was barred. there's nothing about that set of facts that tells me that section 230 performed any public
9:57 pm
service in that regard. i would like to see very substantial adjustments to section 230 so that the honest courtroom, which brought relief and justice to e. jean carroll after months of defamation, which brought solace, justice, and peace to the parents of the sandy hook children after months of defamation and bullying by infowars and alex jones, and which brought significant justice and an end to the campaign of defamation by fox news to a little company that
9:58 pm
was busy just making election machines. my time is running out. i will turn to -- i guess senator cruz is next, but i would like to have each of your companies put in writing what exemptions from the protections of section 230 you would be willing to accept, bearing in mind the fact situation in doe v. twitter and the damage that was done to that young person and that family by the nonresponsiveness of this enormous platform over months and months and months. again, think of what it is like to be a high school kid and have that stuff in the public domain and have the company that is holding it out there in the
9:59 pm
public domain react with disinterest. will you put that down in writing for me? one, two, three, four, five yeses. done. senator cruz. sen. cruz: thank you. social media is a powerful tool, but we are here because every parent i know, and every parent in america, is terrified about the garbage that is directed at our kids. i have two teenagers at home, and the phones they have are portals to predators, viciousness, bullying, and self-harm, and each of your companies could do more to prevent it. mr. zuckerberg, in june of
10:00 pm
2023 the wall street journal reported that instagram's recommendation systems were actively connecting pedophiles to accounts that were advertising the sale of child sexual abuse material. in many of those cases the accounts appeared to be run by underage children themselves, using code words and emojis to advertise illicit material. in other cases, the accounts indicated the victim was being sex trafficked. i know instagram has a team that works to prevent the abuse and exploitation of children online, but what was particularly concerning about the wall street journal exposé was the degree to which instagram's own algorithm was promoting the discoverability of victims for pedophiles seeking child abuse material. in other words, this material was not just living in the dark corners of instagram. instagram was helping pedophiles
10:01 pm
find it by promoting graphic hashtags, including preteen sex, to potential buyers. instagram showed the following warning screen to individuals who were searching for child abuse material: these results may contain images of child sexual abuse. and then you gave users two choices: get resources or see results anyway. mr. zuckerberg, what the hell were you thinking? >> all right, senator. the basic science behind that is that when people are searching for something that is problematic, it's often helpful, rather than blocking it, to direct
10:02 pm
them to something that could be helpful for getting them to get help. sen. cruz: i understand get resources. in what sane universe is there a link for see results anyway? >> we might be wrong. we try to trigger this warning when we think there's -- sen. cruz: you might be wrong. how many times was this displayed? >> i don't know. sen. cruz: why don't you know? >> i don't know the answer. sen. cruz: you know what? it's interesting you say you don't know off the top of your head, because i asked in a june 2023 oversight letter and your company refused to answer. will you commit right now to answering this question for this committee within five days? >> we will follow up on that. sen. cruz: not "we will follow up."
10:03 pm
i know how lawyers write statements they are not going to answer. will you tell me how many times this was displayed? >> i will personally look into it. sen. cruz: let me ask you this. how many times did an instagram user who got this warning that they may be seeing images of child sexual abuse click on see results anyway? i want to see that. >> i'm not sure if we stored that, but i will look into it. sen. cruz: and what follow-up did instagram do when you had a potential pedophile clicking on i would like to see child pornography? what did you do next when that happened? >> an important piece of context is any content we think -- sen. cruz: i asked you a question. what did you do next? when somebody clicked, you may be getting child sexual abuse images, and they clicked see results anyway, what was your next step? you said you might be wrong. did anyone examine whether it was in
10:04 pm
fact child sexual abuse material? did anyone try to protect that child? what did you do next? >> senator, we take down anything that we think is sexual abuse material on the service -- sen. cruz: did anybody verify whether it was sexual abuse material? >> i don't know if every search result is worth following up on. sen. cruz: did you report the people who clicked see results anyway? >> do you want me to answer the question? we've reported more people and made more reports like this to the national center for missing and exploited children. we go out of our way to do this, and we've made more than 26 million reports, which is more than the rest of the industry combined. sen. cruz: mr. zuckerberg, your company and every
10:05 pm
social media company needs to do much more to protect children. mr. chew, i want to turn to you. are you familiar with china's national intelligence law, which provides that all citizens shall support and cooperate with national intelligence efforts in accordance with the law and shall protect national intelligence work secrets they are aware of? >> yes, i'm familiar with this. sen. cruz: tiktok is owned by bytedance. is bytedance subject to that law? >> tiktok is not available in mainland china, and as we talked about in your office, project texas puts this out of reach. sen. cruz: bytedance is subject to the law. the law that says they shall protect national intelligence work secrets they are aware of compels people subject to it to lie to protect those secrets. is that correct? >> i cannot comment on that.
10:06 pm
what i said -- sen. cruz: because you have to protect secrets? >> tiktok is not available in mainland china. sen. cruz: but it is controlled by bytedance, which is subject to this law. you said earlier, and i wrote this down, we have not been asked for any data by the chinese government, and we have never provided it. i'm going to tell you, and i told you this when we met last week, i don't believe you. and i will tell you the american people do not either. if you look at what is on tiktok in china, you are promoting to kids science and math videos, educational videos, and limiting the amount of time they can be on tiktok. in the united states you are promoting to kids self-harm videos and anti-israel propaganda.
10:07 pm
why is there such a dramatic difference? >> senator, that is not accurate. sen. cruz: in china you have a company that is essentially the same, but it promotes beneficial materials instead of harmful materials. >> that is not true. we have a lot of science and math content here on tiktok. sen. cruz: let me point to this, mr. chew. there was a report recently that compared hashtags on instagram to hashtags on tiktok and what trended. the differences were striking. for something like hashtag taylor swift or hashtag trump, they found 2 on instagram for every 1 on tiktok. that is not a dramatic difference. the difference jumps to 8-1 for the hashtag uyghur,
10:08 pm
and it jumps to 57-1 for tiananmen square and to 174-1 for hong kong protest. why is it that on instagram people can put up the hashtag hong kong protest 174 times as often compared to tiktok? what censorship is tiktok doing? >> senator, fundamentally, a few things happen. not all videos carry hashtags. that's the first thing. the second thing is you cannot selectively choose a few words -- sen. cruz: why the difference between taylor swift and tiananmen square? >> there was a massive protest during the time. what i'm trying to say is --
10:09 pm
sen. cruz: why would there be a minimal difference on taylor swift and a massive difference here? >> senator, can you wrap up? >> our algorithm does not suppress any content. sen. cruz: answer the question. >> i think your analogy is flawed. sen. cruz: there is an obvious difference. >> senator blumenthal. >> thank you, mr. chairman. mr. zuckerberg. thank you, that was good enough. one of your top leaders in september of 2021, she was global head of safety. and you know that she came before a subcommittee on consumer protection,
10:10 pm
correct? >> yes. sen. blumenthal: and she was testifying about safety on behalf of facebook. and she told us, facebook is committed to building better products for young people and to doing everything we can to protect their privacy, safety, and well-being on our platform. and she said kids' safety is an area where we are investing heavily. we know that statement was untrue. we know it from an internal email that we've received. it is an email written by nick clegg. he was meta's president of global affairs, and he wrote the memo to you, which you received, correct?
10:11 pm
it was written to you. >> i cannot see the email, but i will assume that you have it correct. sen. blumenthal: he summarized facebook's policy. he said, we are not on track to succeed for our core well-being topics: problematic use, bullying and harassment, connections, and ssi, meaning suicide and self-injury. he said also in another memo, we need to do more and we are being held back by a lack of investment. this memo has the date of august 28th, just weeks before that testimony from antigone davis. >> i'm not sure what date the
10:12 pm
testimony was. sen. blumenthal: those are the dates on the email. he was asking you, pleading with you, for resources to back up the narrative, to fulfill the commitment. in effect, antigone davis was making promises that nick clegg was trying to fulfill, and you rejected that request for 45 to 84 engineers to do well-being and safety work. we know you rejected it from another memo, from his assistant, tim colburn, who said, nick did email mark to emphasize his support for the package, but it
10:13 pm
lost out to other pressures and priorities. we have done the calculation: those potentially 84 engineers would cost meta $54 million in a quarter when it earned $9.2 billion, and yet it failed to make that commitment. in real terms, you rejected that request because of other pressures and priorities, and that is an example, from your own internal documents, of failing to act. it is the reason why we can no longer trust meta and, frankly, any of the social media companies to, in fact, grade their own
10:14 pm
homework. the public, and particularly the parents in the room, know we can no longer rely on social media to provide the kind of safeguards that children and parents deserve. and that is the reason why passing the kids online safety act is so critically important. mr. zuckerberg, do you believe you have a constitutional right to lie to congress? >> no. but -- let me clarify. sen. blumenthal: let me clarify for you. in a lawsuit brought by hundreds of parents, some in this very room, alleging that you made false and misleading statements concerning the safety of your platform for children, you argued in not just one pleading but two, in december and then in january, that you have a
10:15 pm
constitutional right to lie to congress. do you disavow that filing in court? >> i don't know what filing you are talking about, but i would like the opportunity to respond to the previous things you shared as well. sen. blumenthal: i do have a few more questions. let me ask others who are here, because i think it is important to put you on record: who will support the kids online safety act? yes or no? >> there are parts that -- sen. blumenthal: it is a yes or no question, and i will run out of time, so i am assuming the answer is no if you can't answer yes. >> we very much think the privacy act would be great. >> we strongly support the safety act and we have already implemented many of its core provisions.
10:16 pm
sen. blumenthal: thank you, and i welcome that support along with microsoft's support. >> with some changes, we can support it. sen. blumenthal: in its present form, do you support it? >> we do have some concerns. sen. blumenthal: i will take that as a no. >> we support kosa and we will make sure it accelerates and continues to offer community for those seeking that voice. >> we support the age-appropriate content standards but would have -- sen. blumenthal: yes or no? do you support the kids online safety act? >> these are nuanced questions. sen. blumenthal: i am asking whether you will support it or not. >> the basic spirit and idea is right, and there are some ideas that i would debate. sen. blumenthal: unfortunately, i don't think we can count on social media as a group, or big tech, to support this measure, and
10:17 pm
in the past we know it has been opposed by armies of lawyers and lobbyists who were prepared for this fight. i was very, very glad that we have parents here, because tomorrow we will have an advocacy day, and the folks who really count, the people in this room who support this measure, will be going to their representatives and their senators, and their voices and faces will make a difference. senator schumer has committed that he will work with me to bring this bill to a vote, and then we will have real protection for children and parents online. thank you. >> thank you, senator blumenthal. we do have a vote underway. you have voted; senator hawley has not voted yet.
10:18 pm
you are next, and i don't know how long the vote will be open, but i will turn it over to you. >> thank you. let me start with you, mr. zuckerberg. did i hear you say there is no link between mental health and social media use? >> what i said is i think it is important to look at the science. i know that people widely talk about this as if that is something that has been proven, and i think the bulk of the scientific evidence does not support that. >> really. let me remind you of some of the science of your own company. instagram studied the effect of your platform on teenagers. let me read you some quotes. company researchers found instagram was harmful for a sizable percentage of teenagers, most notably teenage girls. here is a quote from your own study: we make body image issues worse for 1 in 3 girls. and here's
10:19 pm
another one: teens blame instagram for increases in the rate of anxiety and depression, and this reaction was unprompted and consistent across all groups. that is your study. >> we do try to understand the feedback and how people feel about the services so we can improve. sen. hawley: your own study says that you make life worse for one in three teenage girls, that you increase anxiety and depression. that is what it says. and you are testifying to us in public that there is no link, and you have been doing this for years. for years you have been testifying under oath there is absolutely no link, that your product is wonderful, full speed ahead, while internally you know full well your product is a disaster for teenagers, and you keep right on doing what you are doing. >> that is not true. sen. hawley: let me show you some other facts i know you are familiar with. those are facts. it's not a question. >> those are not facts. sen. hawley: here are some more facts, and
10:20 pm
here is some information from the whistleblower who came before the senate and testified under oath, who worked for you as a senior executive. here is what he found when he studied your products. for example, among girls between the ages of 13 and 15, 37% reported that they had been exposed to nudity on the platform in the last seven days, and 24% said they had experienced unwanted sexual advances, that they had been propositioned, in the last seven days. 17% said they encountered self-harm content pushed at them in the last seven days. i know you are familiar with these statistics because he sent you an email where he laid it out, and we have a copy of it here. my question is, who did you fire for this? who got fired for that? >> we studied all of this because it's important and we want to improve our services. sen. hawley: you studied it, and there was no linkage, you just said. >> i said you mischaracterized it.
10:21 pm
sen. hawley: 37% of teenage girls between 13 and 15 were exposed to unwanted nudity in a week on instagram. you knew about it. who did you fire? >> this is why we are building -- sen. hawley: who did you fire? >> senator, i don't think that -- sen. hawley: who did you fire? >> i won't answer that. sen. hawley: you didn't fire anybody? >> it's not appropriate to -- sen. hawley: do you know who is sitting behind you? you have families from across the nation whose children are either severely harmed or gone, and you don't think it is appropriate to talk about steps that you took, or the fact that you didn't fire a single person? have you compensated any of the victims? >> sorry? sen. hawley: these girls. have you compensated them? >> i don't believe so. sen. hawley: why not? don't you think they deserve some compensation for what your
10:22 pm
platform has done? help with counseling services, for the issues that your services caused? >> our job is to make sure we build tools to keep people safe. sen. hawley: will you compensate them? >> our job is to make sure we build industry-leading tools and to build tools that empower parents. sen. hawley: you didn't take any action, fire anybody, or compensate a single victim. let me ask you this. there are families of victims here today. have you apologized to them? would you like to do so now? they are here and you are on national television. would you like now to apologize to the victims who have been harmed? show them the pictures? would you like to apologize for what you have done to these good people? >> [indiscernible]
10:23 pm
sen. hawley: why, mr. zuckerberg, why should your company not be sued for this? why is it that you hide behind a liability shield? shouldn't you be held accountable personally? will you take personal responsibility? >> senator, i think i have already answered this. sen. hawley: will you take responsibility? >> i think my job, and the job of this company, is to build the best tools we can to keep our communities safe. sen. hawley: you are in an industry where your product is killing people. will you personally commit to compensating the victims? you are a billionaire. will you set up a compensation fund with your money?
10:24 pm
this isn't a complicated question. will you set up a victims compensation fund with your money, the money made on these families sitting behind you? yes or no? >> senator, my job -- sen. hawley: it does sound like a no. your job is to be responsible for what your company has done. you have made billions of dollars on the people sitting behind you. you have done nothing to help them, nothing to compensate them, nothing to put it right. you could do so today, and you should. before my time expires, mr. chew, let me ask you: why should your platform not be banned in the united states of america? you are owned by a chinese company, a company based in china. the editor-in-chief of your parent company is a communist party secretary, and your company has been surveilling americans for years.
10:25 pm
according to leaked audio from more than 80 internal tiktok meetings, china-based employees of your company have repeatedly accessed nonpublic data of united states citizens, and they have tracked journalists, improperly gaining access to their ip addresses in an attempt to identify whether they were writing negative stories about you. your platform is basically an espionage arm for the chinese communist party. why shouldn't you be banned in the united states of america? >> senator, i disagree with your characterization. what you said, we have explained in a lot of detail. tiktok is used by 170 million americans. sen. hawley: but every one of those americans is in danger from the fact that you track their keystrokes, app usage, and location data. we know that all of that information can be accessed by chinese employees who are subject to the chinese communist party. why shouldn't you be banned in this country? >> senator, that is not
10:26 pm
accurate. a lot of what you described, we simply do not do. sen. hawley: it is 100% accurate. do you deny that american data has repeatedly been accessed by bytedance employees in china? >> we built a project that cost us billions of dollars to stop that. sen. hawley: according to a wall street journal report from yesterday, it hasn't stopped. even now, workers can access the private information of american citizens without going through official channels, including their birthdate, ip address, and more. that is happening now. >> senator, as you know, the media doesn't always get it right. sen. hawley: but the chinese communist party does? >> we have spent billions of dollars to build this project. it is rigorous and robust and unprecedented, and i am proud of the work the team is doing to protect the data. sen. hawley: it is not protected at all. it is subject to chinese communist party
10:27 pm
inspection and review. your app, unlike any of the others here -- and heaven knows we have problems with everybody here -- is subject to the control and inspection of a hostile foreign government that has actively tried to track the information and whereabouts of every american. it should be banned in the united states of america for the security of this country. thank you, mr. chairman. >> senator hirono. >> thank you, mr. chairman. as we have heard, children face all sorts of dangers on social media, from mental health harms to sexual exploitation and even trafficking. sex trafficking is a serious problem in my home state of hawaii, especially for native hawaiian victims, and the fact that social media platforms are being used to facilitate this trafficking, as well as the creation and distribution of csam, is
10:28 pm
concerning. but it is happening. for example, several years ago a marine stationed in hawaii was sentenced to 15 years in prison for producing csam as part of the online exploitation of a minor female. he began communicating with this 12-year-old girl through instagram, then used snapchat to send her sexually explicit photographs and to solicit such photographs from her, and he later used these to blackmail her. just last month, the fbi arrested a neo-nazi cult leader in hawaii who lured victims to his server, which he used to share images of extremely disturbing child sexual abuse material interspersed with neo-nazi imagery. members of his child exploitation and hate group are also present on instagram, snapchat, x, and tiktok, all of which they used to recruit potential members and
10:29 pm
victims. in many cases, including the ones i mentioned, your companies played a role in law enforcement investigating these crimes, but by that time so much damage had already been done. this hearing is about how to keep children safe online, and we have listened to all of your testimony touting the safeguards you have for young users, whether trying to limit the time they spend or requiring parental consent. you have all of these tools, yet trafficking and exploitation on your platforms continue to be rampant. nearly all of your companies make money through advertising, specifically by selling the attention of your users. your product is your users. as one product designer wrote in an email, young ones are the best ones; you want to bring people to your
10:30 pm
service young and early. in other words, hook them early. research published last month estimates that snap makes an astounding 41% of its ad revenue by advertising to users under 18; for tiktok it is 35%. seven of the 10 largest advertisers attracting young users are for games played primarily by teens, by children. all of this is to say that social media companies, yours and others, make money by attracting kids to your platforms. but ensuring safety doesn't make money; it costs money. if you are going to continue to attract kids to your platforms, you have an obligation to make sure they are safe on those platforms, because the current situation is untenable, and that
10:31 pm
is why we have this hearing. but ensuring safety costs money, and your companies cannot continue to profit off of young users only to look the other way when those users, children, are harmed online. we have had a lot of comments about section 230 protections, and i do think we are definitely heading in the direction of limiting them; some of the bills we have passed out of this committee would limit the liability protections you enjoy. last november, the subcommittee heard testimony in response to a question about how to ensure that social media companies focus more on child safety. the witness said, and i am paraphrasing a bit, he
10:32 pm
said, what will change their behavior is the moment that mark zuckerberg declares earnings -- and these earnings have to be declared to the sec -- and says, last quarter we made $34 billion, and the next thing he has to say is how many teenagers experienced unwanted sexual advances on the platform. mr. zuckerberg, will you commit to reporting measurable child safety data on your quarterly earnings reports and calls? >> senator, it is a good question. we actually already issue a quarterly report, and do a call and answer questions, on how we enforce our community standards, and that includes the child safety issues. >> is that a yes? >> we have a separate call we do this on. >> you have to report your earnings to the sec.
10:33 pm
will you report this kind of data -- and actual numbers, by the way, because percentages don't really tell the full story? will you report the number of teenagers -- and sometimes you don't know whether they are teens, because they claim to be adults -- will you report the number of underage children on your platform who experience unwanted kinds of messaging that harm them? will you commit to citing those numbers when you make your quarterly report? >> i am not sure it would make as much sense to do that in that filing, but we do report it publicly so everybody can see it, and we can follow up and talk about the specifics. on some of the ones you mentioned around underage people: on our services we don't allow people under the
10:34 pm
age of 13, so if we find anyone underage, we remove them from our service. i am not saying that people don't lie. it is hard to count how many there are, because fundamentally, if we identify that somebody is underage, we remove them. >> i think it is important that we get actual numbers, because these are real human beings, and that is why all of these parents are here. each time a young person is exposed to this kind of unwanted material and gets hooked, it is a danger to that individual. so i am hoping you are saying that you do report this kind of information, if not to the sec then publicly, and i think i am hearing that you do. >> senator, i think we report more publicly on our enforcement than any other company in the industry, and we are very supportive of that. >> i am running out of time, mr. zuckerberg, but i will follow up on what exactly it is you do report.
10:35 pm
and again, for you, mr. zuckerberg: you testified that you automatically place accounts of young people on the most restrictive privacy and content-sensitivity settings, and yet teenagers are able to opt out of these safeguards. is that right? it isn't mandatory that they remain on those settings? they can opt out? >> yes, senator. we default teenagers into a private account, but some want to be creators and have content they share broadly, and i don't think that is something that should be banned blanketly. >> why not? i think it should be mandatory that they remain on the more restrictive settings. >> a lot of teenagers create amazing things, and with the right supervision and parenting and controls, i don't think that is the type of thing you want to prevent anyone from being able to do. >> my time is up.
10:36 pm
but i do have to say, you have an argument against everything we are proposing, and i do share the concern about the blanket limitation of liability that we provide all of you. i think that has to change, and that is on us, congress, to make that change. thank you. >> senator cotton. >> let's cut to the chase. is tiktok under the influence of the chinese communist party? >> no. we are a private business. >> bytedance is subject to the 2017 national security law, which requires chinese companies to turn over information to the chinese government. do you concede that? >> senator -- >> there is no question. you conceded that earlier. >> any company has to follow the local laws. >> isn't it the case that
10:37 pm
bytedance also has an internal communist party committee? >> all businesses have to follow local law. >> your parent company is subject to the national security law that requires it to answer to the party, it has its own internal chinese communist party committee, and you answer to that parent company, but you expect us to believe you aren't under their influence? >> i understand this concern, which is why we built -- >> you used to work for them? you were the cfo for them? >> correct. >> in april 2021, while you were the cfo, the chinese communist party's internet investment fund purchased a 1% stake in bytedance's main chinese subsidiary, and in return for that 1% golden share the party took one of three seats on that subsidiary's board. is that correct? >> that is for the chinese business. >> that deal was finalized on april 30, 2021. isn't it true that you were
10:38 pm
-- appointed ceo the very next day, on may 1, 2021? >> it is a coincidence. >> the chinese communist party took its golden share and its board seat, and the very next day you were appointed ceo of tiktok, and that is a coincidence? >> it really is. >> it is. ok. before this, you were at another company? >> i have worked around the world. >> where did you live? >> i lived in beijing, and i worked there for about five years. >> you lived there for five years. isn't it the case that that company was sanctioned by the u.s. government in 2021 for being a communist chinese military company? >> i am here to talk about tiktok -- i can't remember. >> the biden administration
10:39 pm
never reversed those sanctions. it was sanctioned as a chinese communist military company. you said today, as you often say, that you live in singapore. of what nation are you a citizen? >> singapore. >> are you a citizen of any other nation? >> no. >> have you applied for citizenship anywhere else? >> i have not. >> do you have a passport from any nation other than singapore? >> no. >> your wife is an american citizen, and your children? >> correct. >> have you applied for american citizenship? >> not yet. >> have you ever been a member of the chinese communist party? >> senator, i am singaporean. no. >> have you ever been affiliated with the chinese communist party? >> no.
10:40 pm
>> you said earlier that what happened in tiananmen square in june 1989 was a massive protest. did anything else happen in tiananmen square? >> i think it is well documented that it was a massacre. >> it was an indiscriminate slaughter of hundreds of citizens. do you agree with the trump administration and the biden administration that the chinese government is committing genocide against the uyghur people? >> i think it's important that anybody who cares about this topic -- any topic -- >> it is a simple question that unites both parties in our country and governments around the world. is the chinese government committing genocide against the uyghur people? yes or no? >> senator -- >> you are a worldly, well-educated man. is the chinese government committing genocide against the uyghur people? yes or no? >> you are here to give testimony that is truthful and honest and complete. let me ask you this. joe biden said that the president of china was a
10:41 pm
dictator. do you agree? >> senator, i won't comment on world leaders. >> why won't you answer these simple questions? >> it's not appropriate. >> are you scared you will lose your job? >> i disagree. >> are you scared you will be arrested and disappeared the next time you go to china? >> you will find content critical of china freely on tiktok. >> ok. let's look at what tiktok is doing to america's youth. does the name mason edens ring a bell? >> you may have to give me more specifics. >> he was a 16-year-old arkansan. after a breakup, he went on your platform in search of inspirational quotes and positive affirmations. instead, he was served numerous videos glamorizing suicide, until he killed himself with a gun.
10:42 pm
what about the name chase? a 16-year-old who saw more than 1,000 videos on your platform about violence and suicide, until he took his own life by stepping in front of a train. are you aware his parents are suing tiktok for pushing their son to take his own life? >> yes, i am aware of that. >> ok. finally, has the federal trade commission sued tiktok during the biden administration? >> i can't talk about -- >> are you currently being sued by the federal trade commission? >> i can't talk about that. >> are you being sued by the federal trade commission? the answer is no. ms. yaccarino's company is
10:43 pm
being sued. mr. zuckerberg's company is being sued. but the chinese company is not. are you familiar with the name christina? she was a paid advisor for your communist-influenced company, hired by the biden ftc to advise on how to sue mr. zuckerberg's company. >> bytedance is a global company, not a chinese company. >> public reports indicate that your lobbyists have been there more than 40 times. how many times did your company visit last year? >> i don't know that. >> are you aware that the biden campaign and the democratic national committee are on your platform, that they have tiktok accounts? >> we encourage people to come on. >> even though they won't let their staff use it on official devices, they use their personal phones. >> we encourage everyone to
10:44 pm
join. >> all of these companies are being sued, and you aren't; they have a former paid advisor of your parent company talking about how to sue mr. zuckerberg's company; and joe biden's reelection campaign is on your platform. let me ask you: have you or anybody else at tiktok communicated or coordinated with the biden administration, the biden campaign, or the democratic national committee to influence the flow of information on your platform? >> we work with anyone -- any creators -- and it is all the same process. >> we have a company that is a tool of the chinese communist party, poisoning the minds of america's children and in some cases driving them to suicide, and at best the biden administration is taking a pass on it; at worst, it may be in collaboration with it. thank you. >> we are going to take a break now. members can take advantage of it as they wish. the break will last 10 minutes. please do your best to return.
10:45 pm
[indiscernible chatter] [captioning performed by the national captioning institute, which is responsible for its caption content and accuracy. visit ncicap.org] [captions copyright national cable satellite corp. 2024]
