You have to physically climb into a multimillion-dollar, 10-ton, liquid-helium-cooled machine and spend hours training it on your brain. We're a long way from the end of private thought, and there's no roadmap for getting there either.
If someone is working on a handheld device that can perform MRI-level scans from across the room, I would be worried about the privacy implications of that technology, not AI.
That misses the point. It's about the trajectory. Machines that analyze every orthogonal data source to understand your behavior aren't limited to the single dimension an MRI machine measures.
Facial expression analysis, all of your online conversations: your entire life is being monitored and recorded. There will be enough data to peer beyond the veil into the thoughts of the mind.
It is definitely missing the point - capturing brain waves does not require liquid cooling and the like. In fact, it's something that could theoretically be squeezed into the next Meta Quest just to "capture". The data would then be stored until they're ready with an Airflow pipeline to feed it into the neural network.
During the pandemic, I built a device that records "brain waves". Not with the fidelity that would be needed to be a useful medical instrument -- let alone read minds -- but the fact that I, a fairly average geek, can do so using my own equipment and for less than $100 seems meaningful.
The device itself is very small, battery-powered (for safety) and requires that you attach electrodes to your scalp.
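For context on what a sub-$100 hobby EEG like this actually produces: the raw signal is just a stream of voltage samples, and the classic first processing step is estimating power in the standard frequency bands (alpha, beta, etc.). Here is a minimal sketch in Python; the `band_power` helper, the sampling rate, and the synthetic 10 Hz "alpha rhythm" input are all illustrative assumptions, not the actual device's software:

```python
import math

def band_power(samples, fs, lo, hi):
    """Estimate signal power in the [lo, hi] Hz band via a naive DFT.

    Hypothetical helper -- a real tool would use an FFT, but the idea
    is the same: sum squared magnitudes of the bins inside the band.
    """
    n = len(samples)
    power = 0.0
    for k in range(1, n // 2):
        freq = k * fs / n
        if lo <= freq <= hi:
            re = sum(s * math.cos(2 * math.pi * k * i / n)
                     for i, s in enumerate(samples))
            im = sum(-s * math.sin(2 * math.pi * k * i / n)
                     for i, s in enumerate(samples))
            power += (re * re + im * im) / (n * n)
    return power

fs = 256  # Hz -- a typical hobby-EEG sampling rate (assumed)
# One second of a synthetic "relaxed" signal: a pure 10 Hz alpha rhythm
signal = [math.sin(2 * math.pi * 10 * i / fs) for i in range(fs)]

alpha = band_power(signal, fs, 8, 12)    # alpha band: 8-12 Hz
beta = band_power(signal, fs, 13, 30)    # beta band: 13-30 Hz
print(alpha > beta)  # True for this synthetic signal
```

A real device would of course also contend with amplification, mains-hum filtering, and electrode noise, which is where most of the actual engineering effort goes.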
I suspect that the underlying point here is that when EEGs first came around in the 1920s, they were also extremely expensive and not available for common use. But since then, the technologies involved have become so much more accessible and affordable that an electronics hobbyist can build one for themself, maybe even using parts they already have kicking around.
MRIs may very well follow a similar trajectory over time.
Personally, I don't think this will happen in the very near future -- but history shows us time and time again that it's good to start thinking about these things well before they become practical.
That doesn't make sense from a physics perspective.
MRI machines operate by creating intense magnetic fields. In order to create those magnetic fields, you need superconductors, otherwise the magnets would burn up. Thus the liquid helium.
To make this accessible to the hobbyist, you need a revolution in physics, not informatics or engineering. Not saying that it's impossible, but if someone does develop room temperature superconductors, we're going to be talking about a lot more exciting things than handheld MRI machines.
> That doesn't make sense from a physics perspective.
I'm reminded of what a physicist once told me: if a physicist says something is impossible, give up all hope. If a physicist says something is uneconomical/impractical/infeasible, then there is still hope because economics change over time.
> To make this accessible to the hobbyist, you need a revolution in physics, not informatics or engineering.
The big stumbling block in terms of doing this on an (advanced) hobbyist level is the need for liquid helium and rare metals. That's an economic problem, not a physics problem. The helium is a really big deal -- but it's also the constraint whose economics are most likely to change, because the helium shortage is a serious issue and lots of people are looking into ways to produce and recover it efficiently. If they succeed, helium may end up becoming cheap enough to be within the realm of possibility for advanced hobbyists.
Also, the only reason that helium is needed at all is because MRI machines require superconductivity to work. It's not impossible that an advance in that field could happen such that you don't need to make things as cold as liquid helium in order to achieve it.
> we're going to be talking about a lot more exciting things than handheld MRI machines.
Hand-held? Why do they have to be hand-held? I'm just talking about ordinary people being able to build one at all, not how portable it would be.
What are you worried about? That someone will kidnap you, force you into an MRI machine, force you to train it for hours on your neural firing patterns, and get the password to your bank account this way?
I'm trying to figure out which part of this threat model AI makes a meaningful difference in. If they already have you captive, the xkcd-certified $5 wrench is cheaper.
"End of private thought" doesn't seem to be on this tech tree, unless you posit being able to scan people secretly or against their will.
I'm not worried about any of that at all. None of what I've said has some unstated "therefore, this is bad" clause to it. I'm just pondering the progression of technology here.
If someone comes up with a technology that allows people's minds to be read without their cooperation, then I'd start to worry -- but I see nothing in this that indicates that's where things are going.
Also, the idea of building my own MRI appeals to me, so my mind went on a little tangent about how to make that happen.
Progress isn't linear. The fact that we can cure one disease doesn't mean we're on a trajectory to eliminate all diseases. The fact that today's Camry is faster than last year's Camry doesn't mean we'll be traveling at relativistic speeds anytime soon.
That we can correlate thought with incredibly precise and detailed electrochemical phenomena in your brain should come as absolutely no surprise to anyone with a materialist view of the universe. Your thoughts are, after all, electrochemical phenomena. The problem - still - is measuring them.
The idea that AI is going to somehow read your mind from macro phenomena like facial expressions and the width of your iris is total bullshit made up by people who want to sell TV shows. We already have this kind of "technology" in the form of polygraphs - and they have the same effectiveness as horoscopes.
You might have a relevant point if it weren't for the fact that we are already in a crisis regarding the loss of privacy and its impact on society.
In this respect, any further loss is of significant concern.
You have to lug around a giant bulky laptop and you can't even call people on it! We're a long way from mobile computing.
Anyway, even given current limitations, I'd likely side with the very researchers working on this. They explicitly highlight privacy as a serious concern, going out of their way to flag potential misuse: bypassing the requirement for the subject's cooperation, or intentionally misinterpreting results for nefarious purposes.
One thing that history teaches is that there are always people who will misuse technology for personal gain and to influence or control other people. Always. Human nature hasn't changed.
You do now. But if you are arrested, you can be compelled to do that, and in a few years EEG skullcaps will probably be sufficient.
Honestly, it baffles me that so many people on a site devoted to technology evaluate long-term trends based on current capabilities. Capacity is going to continue to double every ~2 years. If we can't make transistors 2x smaller, we'll find a different architecture to make transistor arrays 2x larger, or [something].
Technical progress compounds. It's often lumpy rather than linear, but it keeps accelerating wherever people see profit in applying it.
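As a toy illustration of what "double every ~2 years" compounds to over time (pure arithmetic, not a forecast; the function name and numbers are illustrative):

```python
def capacity_multiplier(years, doubling_period=2.0):
    """How much capacity grows after `years`, given a fixed doubling period."""
    return 2 ** (years / doubling_period)

print(capacity_multiplier(10))  # 32.0  -- a 32x gain in one decade
print(capacity_multiplier(20))  # 1024.0 -- roughly a thousandfold in two
```

The point of the exponent: even if each individual step looks modest, two decades of steady doubling turns "obviously impractical" into three orders of magnitude of headroom.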
Liquid helium wouldn't be a barrier to use on key individuals (say, leaders of opposition parties in some large authoritarian-ish country). The training requirement would be, if it demands cooperation.
> You have to physically climb into an multimillion dollar, 10 ton, liquid-helium-cooled machine and spend hours training it on your brain
The article suggests it could be accomplished with a neurosurgical implant as well. Honestly, if I could transcribe my thoughts to review later, I'd love to try that out. In that case, the question of security moves to the matter of who can access the implant.