> It would have mandated that platforms like Instagram and TikTok address online dangers affecting children through design changes and allowing young users to opt out of algorithmic recommendations.
> A central – and controversial – component of the bill was its “duty of care” clause, which declared that companies have “a duty to act in the best interests of minors using their platforms” and would be open to interpretation by regulators.
[…]
> Sensitive but important topics such as gun violence and racial justice could be viewed as potentially harmful and subsequently be filtered out by the companies themselves. These censorship concerns were particularly pronounced for the LGBTQ+ community, which, opponents of Kosa said, could be disproportionately affected by conservative regulators, reducing access to vital resources.
Sounds like it was an internet censorship bill. But lawmakers accidentally included some anti-algorithmic profiling stuff which would annoy their bosses, corporate lobbyists. I guess we’ll see a new version that only tramples the rights of normal people eventually.
I'll grant that it's not a direct line. But I won't upload a government ID to a social platform. If everyone had to, it would have a chilling effect across the entire internet. It would be one of the single biggest censorship moves a government has ever taken.
Now where is this centralized database and who has control? How much does it cost for businesses or people to access it and how is their access limited? Are the identification requests being stored by any party?
There are many ways in which this ideal is not in line with a free society.
I understand your reasoning, but on the other hand it seems like any attempt to rein in big tech ends up in "all roads lead to censorship/free speech infringement" territory.
So there's really no way to rein in big tech on any of this?
Sure there are. You could regulate that a variety of algorithms must be available to users, letting them choose. Having some control to fully opt out, or to augment the feed the way you want, would be nice. You could strip Section 230 protections from platforms that use certain types of algorithms or fail to release reports on specific transparency metrics. We could regulate that accounts marked as children upon creation be separated out from the main servers into their own playground. There are many options available besides users showing papers whenever they want to participate in speech on the internet. The issue is nuance, and that never plays well in legislation.
Marked as children in the same way they do today. I'm assuming caring parents here. Don't knowingly allow children into your bar. If a parent marks an account as a child during its creation, it is segregated. YouTube has YouTube Kids. It's not a requirement, but they try to make a safer environment for those who aren't mature enough, or able, to handle the risks of full YouTube. It's not perfect, right? But I like that direction.
I find that headline unintentionally hilarious, because anyone paying attention to online child safety legislation already knows the vast majority of it is either terribly written and destined to trigger unintended consequences, or a naked power grab by law enforcement using 'child safety' as a pretext to get more and/or easier access to user data.
I find most child safety legislation borderline hysterical.
Fear of "algorithms" and such always turns into absurdist laws.
Long ago, my state had a law (quickly suspended and later struck down) that made it a felony to sell a minor a "violent" video game. I don't think making some poor store clerk a felon is solving anything...
I think mrgold was looking for a solution that didn't lock their kid out of the arcade, but just allowed them to minimize exposure to gambling while allowing a social experience. Having control over, or the ability to remove, some algorithms would be appreciated.
what if I want my kid to be a degenerate gambler at an early age (apparently this is all cool in the USA, I mean gambling...) but don't want her to poison her brain with arcade games?
the answer here of course is very simple, you teach your kid what she is or isn't allowed to do "online" just like you do IRL. and you verify by either sharing the account so you can see what is being installed and/or checking the device...
You do you, I guess. But we have decided as a society that children do not have the full set of rights that adults have. And in most cases there is still leniency as well. You might think it's fun getting shitfaced with your 6-year-old, but society has said no. Now, your 16-year-old has a glass of wine under your supervision? Well, no one is going to be locking you up.
Is this method of parenting that you are advocating for working out okay for you or do you not have children?
Myself, I am childless and I do not like KOSA as I am unwilling to give up the anonymous internet to keep the kids safe. At the same time I do think there is room for some regulation here.
I am a parent and strongly believe that being a parent is my greatest pleasure in life and my greatest responsibility. I am advocating a very simple parenting method: I do not want the fucking government to dictate what books my kid can read, what they can do on their devices (if they have them), etc. This is my job, not the government's.
I’m with you on that. I think all of this should be mandated in a way that allows parents the choice, and the tech companies the choice, to operate for children or not. Now, if you are advocating that the govt overreached when it banned cigarette companies from airing commercials during children’s cartoons… that’s a little too far on the libertarian train for me. I do think regulation around the types of algorithms they can use around children is worth investigating. Advertising choices and algorithmic implementation incentives do not seem aligned with childhood development, and I worry the feed-them-to-the-wolves approach won’t work out well for either side.
I do think regulation around the types of algorithms they can use around children is worth investigating.
The problem - especially in the USA - is that whatever "regulation" gets added will be 100% a political thing, shaped by whoever is in power at the time. And say 50% of people will not be happy about it. The power shifts, and we either scrap the "regulations" or put others in place to make the other 50% of people happy - ad infinitum... I am against regulations mostly because they are never about "protecting kids" or whatever the "spirit" should be - they are always political in nature (e.g. the your-kids-should-NOT-read-these-books nonsense...)
Parents have no chance against multi-billion-dollar corporations turning kids into zombies. I see hordes of them on the way to school in the morning, and trying to find their way home afterwards.
Yes, and we should legalize lead paint again while we're at it. It's a parent's responsibility to prevent their child from eating paint chips everywhere they go.
A better analogy is that algorithms are designed to be feedback loops that trigger extremely positive or negative emotions in people. Meaning, they are addiction machines.
Allowing minors access to addiction machines is like allowing them access to tobacco.
Generally, we say “fuck it” when it comes to addiction in adults. Personal choice or whatever - never mind the very definition of addiction undermines choice.
But with children, we at least try to shield them from things that they will be hooked on indefinitely. Try, I say, because we don’t do a great job. Just look at food.
I’m not in favor of child censorship type legislation because it’s almost always a poor excuse for authoritarianism and censorship. But reframing algorithmic feeds as slot machines or cigarettes makes the position much more understandable.
Because there is no knob you can turn to get rid of lead paint. You can use parental controls and block kids' access to sites and apps you find objectionable.
> You can use parental controls and block kids access to sites and apps you find objectionable
You can use parental controls to block kids' access to sites you found objectionable in the past, just as you can keep your kids out of buildings you have previously inspected for lead paint.
OTOH, I suspect people raising this objection (note that I am not, even if I think I understand how they might apply the analogy) would prefer some guarantee that people aren't doing the equivalent of repainting with lead paint after the parents have inspected, and would often even prefer not to have to do the equivalent of inspecting every virtual building their children might enter for the digital equivalent of lead paint.
Well, since the internet is international, how do you propose that you make legislation that will keep your kids “safe” from international sites?
This isn’t hypothetical: right now there is a law in Florida requiring porn sites to verify the age of users. I’m sure you would be shocked to know that kids can still get to porn sites without age verification. Sites not based in the US just ignored the law.
However, you can still use parental controls to block those sites.
It's essentially part of the environment at this point; you can't completely control access. Kids without social media access are second-class citizens - or worse - in their peer groups. The reason the platforms do not offer an option to turn off algorithmic feeds is that they make a lot of money off these kids.
So do you also buy your kids iPhones so they won’t feel like second class citizens with green bubbles?
Would you buy your kids $150 Jordans, or whatever the equivalent is today, so they would be part of the in crowd? Say it’s okay to do drugs, because if they don’t they won’t be part of the crowd?
I'd guess maybe 40% of kids use drugs now and then, less than 5% regularly (at least in my district). But over 90% are on Instagram, and even more on TikTok. It's not really comparable.
And what’s stopping you from using parental controls to keep your kids off of Instagram? It’s comparable because you don’t let your kids do other things I’m sure just because other kids are doing it.
That you don’t want to be a responsible parent and are worried about your kids being ostracized? You never did answer the rest: do you also buy your kids the latest fashion and the newest iPhone, and let them use drugs, so they fit in?
Because I can’t easily block every building that has lead paint.
Let me tell you a little story. Florida requires age verification for porn sites. Guess what? The internet is global. It might surprise you that since some of the biggest porn sites aren’t in the US, they just completely ignored the law.
However, parental controls still work.
Do you not have any control of what your children do?
The parental control is not letting your kid go to the website or download the app on devices you buy them and you control. All consumer OS’s for both mobile and desktop computers already do that.
Are you suggesting that adults should have to upload their ID to websites?
And you still didn’t answer the point about what you do about sites not hosted in the US - like the porn sites that are ignoring age verification laws in Florida.
To be frank, it's both society's and a parent's job. But I want to see solutions that don't make me, a free adult, show my identification to websites. I will not support ham-fisted legislation that deprives adults of privacy and freedom.
Oh, I see the problem. It’s a minefield of issues! I’m saying that allowing kids on large social media platforms to begin with is an issue. Co-mingling them with the rest of the userbase adds to it. If the interface were a different color and the parent had an easy way to know which Instagram (or Instagram Jr) the user was on, the parent could easily understand where they were. It would be easier to regulate algorithms and such in that walled garden. If kids were found on the adult version, the accounts should be banned.
I’m not saying there is a perfect solution. I’m saying the 1st amendment exists and you need to work within that framework. And I refuse to agree that the only choice is giving up being anonymous on the internet. It’s not even clear that KOSA could survive 1A challenges.
Regulate the issues causing the problems. Allow users to bring their own algorithms. Don’t allow algorithmic content/advertising on accounts marked as under 18. There are so many other things to try first. The issue is that Big Tech owns the legislators, so we are stuck with nothing except the whims of the few powerful people at the helm of these ships.
misleading headline, kosa had nothing to do with child safety and everything to do with government censorship, anybody framing it as "child safety" is a rube falling for the oldest conservative christian play in the book.
Because they care more about being contrarians than about actually working together, and it shows. This isn't the only bill with this issue, and it sure won't be the last.
People are downvoting comments about the impact of this bill on LGBTQ+ people, but “protecting minor children from the transgender [sic] in this culture” was literally the stated motivation by the senator who co-introduced KOSA: https://www.nbcnews.com/nbc-out/out-politics-and-policy/sena...
If courts can look at the statements of bills' sponsors when interpreting laws, it's not unreasonable for us to do the same when thinking critically about a bill's likely implementation.
Because it's a dystopian nightmare waiting to happen. The real question is, why the hell did it do so well in the Senate? Also, why is The Guardian apparently shilling for that and not mentioning privacy first and foremost? We know the UK has already passed something just as oppressive.
KOSA was not a child safety bill. It was a political (and particularly LGBTQ) harassment and suppression bill lightly masquerading as a child safety bill (lightly, because right-wing supporters could not stop gloating about its utility for that purpose) that too many on the Democratic side in the Senate willingly went along with, because they lack the courage to argue against even the thinnest layer of pretextual justification (which is a broader problem than just this bill).
The opposition to the bill is simple: KOSA was written from the start as a broad-spectrum censorship bill obviously designed to (among other things) suppress LGBT existence with the excuse of "won't someone think of the children".
For example, the censorship of any discussion of "illegal narcotics" - don't forget, all those state-licensed marijuana businesses out there are still illegal under federal law...
It was a bill that granted overbroad powers to governments who were, and still are, actively salivating over the idea of weaponizing it to suppress speech they don't like. There's no more "oh it's fine, they won't really use it for that" discussion to be had, because the current administration is just saying it out loud and publicly now. Social media sites sympathetic to the administration have already implemented the bits the opposition was worried about. The bill would make it so you can't just go somewhere else.
> Key provisions include requiring platforms to implement default safety settings for users under 17, provide parental oversight tools, and limit data collection and targeted ads for teens aged 14-15, who would also need parental consent to use platforms like TikTok, Instagram, Facebook, and X.
We may have dodged a bullet here. The big platforms would have (mostly) no problem implementing compliance, but smaller sites? Forget it. This would therefore be one more step toward squeezing small sites offline. No.
Not to mention there is no way to guarantee results from this.
> provide parental oversight tools
Good, but see my statement above. Implement this? Yes. MANDATE this? NO.
> limit data collection and targeted ads for teens aged 14-15
Explain to me how you prove that someone is 14-15?
This is a bunch of nonsense fluff and I'm glad it didn't pass.
Your AI hallucinated. It probably confused the KOSA in question (Kids Online Safety Act) with a local children's soccer league called "KOSA 14/15" (https://www.redrosearena.com/team/102420). You're arguing against something nobody proposed.