armchairhacker 5 hours ago
> When we have a press release from a university about how researchers can detect thoughts via fMRI, we have no issue with the claim.

Different people. I, for one, have always claimed that fMRI is too coarse-grained for detailed thought detection.

If AI detection "sometimes fails", it doesn't "work". It may work well enough to support a conclusion alongside other evidence, but when there is no other evidence and no attempt to gather any, it has no legitimate use.

What I propose is simple: grade only closed-book exams, and hold students' phones during the exams. Students don't need 1:1 monitoring; it's the same as it was 10-20 years ago.