| ▲ | LoganDark 12 hours ago |
| Monitoring children's DMs is the responsibility of the parents, not megacorps. If a parent wants to install a keylogger or screen recorder on their child's PC, that's their decision. But Google should not be able to. Neither should... literally anyone else except maybe an employer on a work-provided device. |
|
| ▲ | ranyume 12 hours ago | parent | next [-] |
| > Monitoring children's DMs is the responsibility of the parents, not megacorps
|
| Absolutely. But what responsibilities do megacorps have? Right now everyone seems to avoid this question and makes do with megacorps not being responsible. This means: "we'll allow megacorps to be as they are and not take any responsibility for the effects they cause to society". Instead of them taking responsibility, we're collecting everyone's data and calling it a day by banning children from social networks... and this is because there are many interests involved (not related to child development and safety). |
| |
| ▲ | acuozzo 11 hours ago | parent | next [-] |
|
| > But what responsibilities do megacorps have? Right now, everyone seems to avoid this question
|
| Clear, simple, direct: Whatever was required of The Bell Telephone Company and nothing more. |
| ▲ | da_chicken 11 hours ago | parent | next [-] |
|
| So there should be a human operator manually gatekeeping every individual request to connect with another endpoint? It's a good thing those human operators couldn't listen in to whichever conversation they wanted. |
| ▲ | acuozzo 10 hours ago | parent [-] |
|
| Human operators were not required of The Bell Telephone Company by law. Bell switched to mechanical switching stations as soon as doing so was economically advantageous. (Reconsider my post. I'm arguing for no regulation.) |
| ▲ | lmz 7 hours ago | parent [-] |
|
| Sure. And "lawful access" intercept capabilities are also required of telcos. |
|
| |
| ▲ | ranyume 11 hours ago | parent | prev | next [-] |
|
| I'd say that at minimum social networks need to be required to show how their algorithm works and to allow users control over their data. Users must be able to know why content was served to them. Nowadays social networks are so pervasive in society, affecting it and molding it to unknown interests, that this is the bare minimum for a free society. Ideally, users should be able to modify the algorithm so they can get just what they want, while simultaneously maximizing free speech. If something isn't illegal, it shouldn't be hidden or removed. |
| ▲ | drnick1 2 hours ago | parent | next [-] |
|
| > Nowadays social networks are so pervasive in society, affecting it and molding it to unknown interests
|
| I think this is the real issue. We should free ourselves from "social networks" such as TikTok, Facebook, Instagram, and others. Even with truly E2EE direct messages, they create countless other privacy problems. They enable surveillance of people at scale and should be completely shunned for that reason alone. |
| ▲ | acuozzo 11 hours ago | parent | prev [-] |
|
| > social networks need to be required to show how their algorithm works
|
| Hypothetically speaking: what if it's a neural network in which each user has his/her own unique weights which are undergoing frequent retraining? Would it not be an undue burden to necessitate the release of the weights every time they change? Also, what value would the weights have? We haven't yet hit the point of having interpretable neural networks. Wouldn't enforcing algorithmic interpretability additionally be an undue burden?
|
| > They must be able to know why content was served to them.
|
| What if the authors of the code are unable to tell you why? |
| ▲ | BlueTemplar 9 hours ago | parent [-] |
|
| The use of black boxes like neural networks is already effectively illegal in some jurisdictions for this very reason. |
|
| |
| ▲ | techpression 10 hours ago | parent | prev | next [-] |
|
| I don’t remember reading about ads in phone calls, nor the complete mapping of customers’ behaviors for use in contexts other than the phone call. The apples-to-oranges in this comparison is probably top five on HN ever. |
| ▲ | iso1631 6 hours ago | parent | prev [-] |
|
| Whatever was required of the New York Times and nothing more. If the NYT publishes an advert or editorial, it's held accountable for the contents. |
| |
| ▲ | j16sdiz 11 hours ago | parent | prev | next [-] |
|
| > But what responsibilities do megacorps have?
|
| Fake and scam ads. They literally profit from those ads. When an ad distributes malware or runs a scam, they don't take any responsibility. |
| ▲ | LoganDark 11 hours ago | parent | prev [-] |
|
| > But what responsibilities do megacorps have?
|
| They should have a responsibility of transparency, accountability, and empathy towards users. They should work for the user and in the interests of the user. But multiple constraints make this impossible in practice. |
|
|
| ▲ | prmoustache 3 hours ago | parent | prev | next [-] |
| I also think children do/should have a right to privacy and their parents do not have to know everything. Kids should be able to write a journal or talk to friends with total trust that this information will not reach their parents. |
|
| ▲ | KaiserPro 8 hours ago | parent | prev | next [-] |
| > Monitoring children's DMs is the responsibility of the parents, not megacorps.
|
| Yup, but the tools provided make that easy or hard. But putting that emotive bit to one side: megacorps have a vested interest in not being responsible to children. They need children's eyeballs to drive advertising revenue. If that means sending them corrosive shit, then so be it. It's a bigger issue than encryption; it's editorial choice. |
|
| ▲ | gzread 5 hours ago | parent | prev | next [-] |
| The simplest way that can work is for the child account to be linked to a parent account, and the parent account can see the child account's DMs. |
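| A minimal sketch of the linked-account model described above. All names here are hypothetical, invented purely for illustration (no real platform's API is implied); it assumes a one-way parent-to-child link, with the permission check granting DM visibility only to the account's owner and its linked parent:
|
| ```python
| from dataclasses import dataclass, field
|
| @dataclass
| class Account:
|     username: str
|     parent: "Account | None" = None  # set only on child accounts
|     dms: list[str] = field(default_factory=list)
|
| def can_read_dms(viewer: Account, owner: Account) -> bool:
|     """The owner can always read their own DMs; a linked parent can too."""
|     return viewer is owner or owner.parent is viewer
|
| parent = Account("mum")
| child = Account("kid", parent=parent)
| child.dms.append("hi from a friend")
|
| assert can_read_dms(child, child)       # owner sees their own DMs
| assert can_read_dms(parent, child)      # linked parent sees the child's DMs
| assert not can_read_dms(child, parent)  # the link is one-way
| ```
|
| The one-way link matters: the parent reads the child's DMs, never the reverse.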
|
| ▲ | baq 10 hours ago | parent | prev | next [-] |
| Mega corps should be compelled to and rewarded for allowing parents to monitor their children’s dms. |
|
| ▲ | duped 12 hours ago | parent | prev | next [-] |
| Parents shouldn't give their child access to a device that allows DMs. That said, these platforms are making it impossible for parents to monitor anything. They're literally designed to profit off addiction in children. |
| |
| ▲ | greygoo222 12 hours ago | parent [-] |
|
| Why? Plenty of children benefit from talking to other people. Some children need careful monitoring, and some children shouldn't be allowed to use DMs, but it's not universal and should be up to the parents. |
| ▲ | iso1631 4 hours ago | parent [-] |
|
| Control over who they can talk to (if needed), certainly; monitoring of both who they talk to and, in many situations, what the contents are. At some point between the age of 0 and 18 the child has to be fully ready for an independent world. A cliff edge is a terrible idea: allowing 3-year-olds unmonitored, uncontrolled conversations with strangers is a terrible idea, but so is not allowing 15-year-olds to talk to their friends. |
|
|
|
| ▲ | DANmode 9 hours ago | parent | prev | next [-] |
| > maybe an employer on a work-provided device.
|
| The children yearn for the mines(?). |
|
| ▲ | iso1631 6 hours ago | parent | prev [-] |
| I'm all for helping parents to do this. Any site requiring age verification should indicate this as an HTTP header or whatever, and the browser I allow my child to use should respect that, and the parental controls should be easy for me to engage with.
|
| Many parental controls are massive pains to get working. Apple does fairly well (although I don't get a parental PIN to unlock the phone, which is normally fine as my child will tell me, but in some circumstances it wouldn't be), but it does require the parent to be on the Apple ecosystem too. EA and Microsoft, however, are terrible, especially as it's likely the child will be playing Fortnite/Minecraft and the parent won't have ever touched it. I think with Minecraft we had to make something like 5 or 6 accounts across three different sites to allow online Minecraft play from a Nintendo Switch. |
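| The header-based signalling proposed here could look something like the sketch below. Note that "Age-Verification-Required" is a hypothetical header name (no such standard exists today), and the filter function is an invented example of what a browser-side parental control might check:
|
| ```python
| def should_block(headers: dict[str, str], profile_is_child: bool) -> bool:
|     """Block a page on a child profile when the site declares,
|     via a (hypothetical) response header, that it requires age
|     verification."""
|     # HTTP header names are case-insensitive, so normalise before lookup.
|     normalised = {k.lower(): v for k, v in headers.items()}
|     value = normalised.get("age-verification-required", "").lower()
|     return profile_is_child and value in ("1", "true", "adult")
|
| # The flag only matters on a child profile; adult profiles pass through.
| assert should_block({"Age-Verification-Required": "true"}, profile_is_child=True)
| assert not should_block({"Age-Verification-Required": "true"}, profile_is_child=False)
| assert not should_block({"Content-Type": "text/html"}, profile_is_child=True)
| ```
|
| The point of putting the signal in a response header is that any browser or filtering proxy can enforce it uniformly, instead of every site reinventing its own parental-control integration.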