
/sci/ - Science & Math



File: 239 KB, 546x421, 1319905981077.png
No.5388559

I don't have the means to develop or study this. I hope someone else does; that's why I'm posting it here.

They have already used fMRI to look at pictures and videos in a subject's head.

All someone needs to do is design a computer-brain interface that reads the user's intention as the user moves the "mouse" on the screen with his thoughts. This would be far faster than a normal mouse. Another use could be thinking of a letter, which could be like typing.

Thus there would be no need for a keyboard or mouse; we could control computers with our brain alone, just with a little device next to our head reading our intentions.

im surprised noone is working on this. Everyone in the fMRI field seems obsessed with helping patients who have lost an arm or had a stroke, when in fact commercializing this kind of thing as the next step, replacing the mouse and keyboard on ALL computers, could be so much more lucrative,

>> No.5388567

Do you genuinely think you're the first to think of this? Really?
One of many examples:
http://www.emotiv.com/

>> No.5388568

You're definitely the first one retarded enough to think you need an fMRI to do it, though.

>> No.5388574

>>5388567
No, that's using EEG.

EEG is bullshit; it's really full of noise and hard to learn. I'm saying to use fMRI, which is a far superior approach.

>> No.5388577

>>5388568
Not really. People have been trying to do this for a few years already.

>http://cswww.essex.ac.uk/staff/poli/bci/papers/01300789.pdf
>http://www.ncbi.nlm.nih.gov/pubmed/19026717
>http://cds.ismrm.org/ismrm-2004/Files/000733.pdf

>> No.5388579

>>5388574
EEG is not bullshit. The temporal resolution is much, much better than that of fMRI. That makes it well suited for the kinds of real-time analyses that you use for brain-computer interfacing.

>> No.5388584

>>5388559
They already have this.

Quadriplegics have had this for over ten years. And I'm talking about actually moving a cursor on the screen with your thoughts.

>> No.5388593

troll/10

Mount Stupid responds

>> No.5388594
File: 56 KB, 660x525, 1325119311260.jpg

>>5388584
Then why hasn't it been commercialized?

I'd much rather use my thoughts than a mouse.

Also, do you have any links?

>> No.5388596

>>5388593
This is not a troll.

>> No.5388605

>>5388594
Do you know what an MRI scanner costs?

>> No.5388610

>>5388605
Do you know how much a device like a wireless mouse used to cost in 1984?

>> No.5388618

>>5388610
MRI is not going to become much cheaper. Aside from the superconducting magnet, there are ongoing running costs that come from replenishing the liquid helium. Stop and think for a second before you hit submit, Jesus Christ.

>> No.5388646
File: 62 KB, 744x558, old-computer-thumb.jpg

>>5388618
I find your lack of faith disturbing.

Pic related.

>> No.5388680
File: 741 KB, 1003x2266, MRI.png

>>5388646
I can't believe I actually have to explain this, but alright...

It's not a matter of scaling down the circuitry. The fundamental principle of MRI relies on a strong static magnetic field to cause spin alignment (see image; I made a newer version some time ago but I can't find it right now). Field strength decreases with an increase in the diameter of the magnet bore. That's a simple fact that we can't get around, and it means that no matter how fancy the circuitry gets, you'll always be stuck with a room-sized magnet if you want to be able to measure anything.

Like I said earlier, EEG is perfectly fine for what you're describing. It's relatively cheap, and a whole lot more practical. Commercial EEG interfaces are already available.

>> No.5388684

>>5388646
You know that pic is faked, right?

>> No.5388691
File: 2.13 MB, 1680x1705, 1346768665006.png

>>5388680
>I made a newer version some time ago but I can't find it right now
For the sake of completeness, I found the newer version.

>> No.5388703

>>5388680
OK, I'm going to stop arguing with you because it's pointless; you probably know more about fMRI than me.

But going back to what I was asking: if EEG works, then how come I can't buy an EEG mouse/keyboard interface? Or can I? And if I can, then how come it isn't popular yet?

I'm asking you because you seem to know about this stuff.

>> No.5388719

>>5388703
>if EEG works, then how come I can't buy an EEG mouse/keyboard interface? Or can I?
http://www.emotiv.com/
I'm not sure if you can actually use it as a mouse, but the interface setup is there for you to buy.

>And if i can then how come it isnt popular yet?
I don't know how popular these are, but they come with some limitations. EEG systems for research purposes range from 3,000 euros (8 silver/silver-chloride electrodes, including amplifier) to 200,000 euros (256 platinum/iridium electrodes, MRI-compatible amplifier). If you want good signal quality, you're going to have to pay for it. These consumer devices are built to be cheap, and not necessarily with product quality in mind.

>> No.5388726
File: 55 KB, 620x384, tan-le.jpg

>>5388719
Yeah, I was checking out the videos with the hot Asian on that site.

I could not see anything about using the headset to control a keyboard and mouse; they only have like 13 different readings. And it's designed with gaming in mind. Most people will think this is a gimmick like motion controls for the Wii.

I imagine there must be some major difficulties that prevent this technology from replacing a mouse and keyboard, because otherwise they would market it that way, which seems to me a lot more appealing than a small share of the gaming market (which would also require game developers to get on board).

>> No.5388735

>>5388719
If I understand correctly, the interface from Emotiv first needs to be "trained" to associate a brainwave pattern with a command. So if you want to use it in a game, you first have to train the app/interface on different patterns of brainwaves so that it can later interpret them and translate them into commands.

Kind of underwhelming.
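That calibration loop can be sketched in a few lines (a hypothetical illustration, not Emotiv's actual code; the feature extraction is abstracted away into plain numeric vectors):

```python
import numpy as np

def train(examples):
    """examples maps command name -> list of feature vectors recorded
    while the user held the corresponding mental state."""
    return {cmd: np.mean(vecs, axis=0) for cmd, vecs in examples.items()}

def decode(model, features):
    """Classify a new window as the command with the nearest centroid."""
    return min(model, key=lambda cmd: np.linalg.norm(model[cmd] - features))

# Toy calibration session: two mental states with noisy repetitions.
rng = np.random.default_rng(1)
push = [np.array([1.0, 0.0]) + 0.1 * rng.standard_normal(2) for _ in range(20)]
pull = [np.array([0.0, 1.0]) + 0.1 * rng.standard_normal(2) for _ in range(20)]
model = train({"push": push, "pull": pull})
```

A nearest-centroid rule like this is about the simplest possible classifier; real BCI software uses more robust methods, but the train-then-decode structure is the same.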

>> No.5388738

>>5388735
Also, the number of commands is limited. So, from their video, you get a choice of basic commands like Pull, Push, Nudge, etc., and let the computer associate a brainwave pattern with each chosen command.

This is pretty far from what a mouse can do instantly. Not to mention how cheap a mouse is.

>> No.5388760

>>5388735
Exactly. The reason these things haven't replaced the mouse and keyboard is that they are of limited practical use. We already have a perfectly good system to convert intention into action, and it's called the motor cortex. Why reinvent the wheel?

>> No.5388766

>>5388760
OK, that's true, but if you were to use mind-control tasks, you could perform tasks at the speed of thought.

Imagine typing this conversation in your mind. You can think almost one sentence a second. This would necessitate a lot more inputs (one for every character), but if it was done correctly it would be much faster.

What you're saying is the same as: "Why do we need Windows? We can do the same operations in DOS and it's a perfectly good operating system, why reinvent the wheel?"

>> No.5388780

>>5388766
What you're suggesting is a one-to-one decoding of neural activity into the corresponding thought. That's a whole different ballgame than simply moving a cursor up and down a screen. The way these things work is to have a program classify the global neuronal state as corresponding to a certain action state. The key word here is 'global'. When you think, however, you don't actually produce each individual letter belonging to each word in your mind. Every word is an incredibly intricate conjugate of abstract perceptual features, the neural signatures of which are at present too complex for us to decode in real time. It goes beyond the temporal resolution of MRI, and the spatial resolution of EEG.

>> No.5388781

This would be a horrible idea. The brain is used for thinking, the hand is supposed to move the mouse, thus leaving the brain free to think strategies. Talking about some serious shit over here.

>implying the hand works on its own
No dipshit, it's called muscle memory. You're basically already controlling it with your brain, but it has become so familiar that it requires little to no brain function to move the fucking mouse.

>> No.5388788

>>5388781
What if you could think of a strategy and the computer would implement that strategy to the best of its ability, without the need to use a mouse or keyboard, or even to conceptualize a mouse and keyboard?

>> No.5388792

>>5388780
If they have 13 inputs, or whatever the number is, why can't they decode 1300 inputs in the future? Surely the resolution can only improve from here.

>> No.5388802

>>5388792
Sure. But that's different from reading the words spoken by the mind's voice, so to speak. The 'inputs' these devices take are voluntarily induced states, each marked by one particular EEG characteristic that is in principle not directly related to the action the input corresponds to. One would, for instance, vividly imagine a scene on a beach, which increases occipital alpha power. The machine knows that increased occipital alpha power corresponds to 'move the cursor up'.

Thinking a word is different, though. You don't think a word the same way every time you 'think' it.
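A minimal sketch of that alpha-power trick (illustrative only: the sampling rate and the fixed threshold are my own assumptions; a real system would calibrate the threshold per user and per electrode):

```python
import numpy as np

FS = 256  # sampling rate in Hz (assumed)

def alpha_power(window):
    """Mean spectral power in the 8-12 Hz alpha band of one EEG window."""
    spectrum = np.abs(np.fft.rfft(window)) ** 2
    freqs = np.fft.rfftfreq(len(window), d=1.0 / FS)
    band = (freqs >= 8) & (freqs <= 12)
    return spectrum[band].mean()

def classify(window, threshold):
    """High occipital alpha (e.g. vivid beach imagery) -> cursor command."""
    return "cursor up" if alpha_power(window) > threshold else "rest"

# Synthetic demo: a 10 Hz oscillation stands in for eyes-closed imagery.
t = np.arange(FS) / FS
rng = np.random.default_rng(0)
alpha_window = np.sin(2 * np.pi * 10 * t) + 0.1 * rng.standard_normal(FS)
rest_window = 0.1 * rng.standard_normal(FS)
```

The point is that the classifier never reads the imagined beach itself; it only detects one coarse spectral signature that the user has agreed to mean 'up'.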

>> No.5388843

>>5388802
I agree. But if in 20 minutes we can learn to create 13 readable states, then in a few weeks we can learn to create 1300 readable states.

To the point where, when we think of a letter or word, we actually think of that state (like a rewiring, so to say). No?

With a little training (about the same time as training to use a keyboard) it could really pay off in the long run.

>> No.5388851

Actually, come to think of it, this is almost exactly like typing on a keyboard.

When you think of typing the letter "a" you think of your finger hitting the leftmost center key of the keyboard. I don't see why this can't be retrained and read just as well by that headset.

>> No.5389403

x>noone
x>"mouse"
x>im
x>ends with a comma
smells like copypasta

>> No.5389586

>>5388851
>>5388843
I'll reply to your post tomorrow, when I'm sober, and if I remember to do so. I'm too drunk to explain the problems in what you're proposing.

For now, have a great day.

>> No.5389898

>>5389586
Looking forward to your reply!

>> No.5389950
File: 134 KB, 500x333, 1353738903234.jpg

>>5388703
>you probably know more about fMRI than me
AHAHAHHAAHHAHAHA, this faggot is this ignorant.
CNS is a neuroscience grad student or post-grad. No shit he knows more about fMRI than you.

>> No.5389954

>>5388760
CNS... wouldn't that be the premotor cortex that does it, not the motor cortex?

>inb4 you asswreck me with your superior knowledge

>> No.5390031

God dammit, I had to re-write this:
Cunting Neurotic Salesman is right; however, I think he's explaining generalities in terms of specifics.

>>5388781
Brings up an interesting yet overlooked subject: the obfuscation between means and meaning.

>> No.5390035

>>5390031
Cunt'd:

Now then, we have basically had what you describe for a while now: devices which can measure the activity of nerve centres. An instrument must match a certain input to a desired output. For instance, cybernetic prostheses can attach to the back and replace, say, a lost hand. The information that would have gone to said hand is instead carried and read, to some degree of efficacy, by the instrument. This also works with mirror neurons and "think of lifting your hand": certain pathways always fire for a given individual when they think of lifting their hand or do lift their hand. By this means we can measure those pathways and assign them an output. I'm not sure, but I think they did something similar with colour and a kind of Stephen Hawking word-wheel thing.

What you seem to be speaking of is an instrument which translates the origin of any input, the brain, into output. That is essentially a passive instrument: one which necessitates no action, one that would literally be reading your brain and doing what you want. At that point it is no longer an instrument of your will but a part of your will itself. Since we have no "universal translator" of thought or any hyper-intelligent AI to do this, the instrument must instead be active, requiring a narrowly defined, easily readable field, such as "move your hand upwards."

>> No.5390038

>>5390035
Cunt'd'd

To demonstrate the problem, let's look at a scenario:
Amanda Tapping is measuring your brain with a magical alien device that captures every facet of your neural interactions with complete fidelity. You lie down and think of the letter "B", bold and white against a black background, while trying to get the song "Yellow Submarine" out of your head for the third day, self-consciously wondering how to check your spaghetti levels, and trying not to get a boner.
As you may imagine, this reading will be different from any other, including another instance of "imagining the letter 'B'". This is the problem of translation: getting an output out of some input. An objection to this would be: "Like moving your hand up, there must be some activity which exists uniquely when thinking of the letter 'B', for how else could we think of the letter 'B'?" Although I agree, the truth is that language is a clusterfuck already when it's on a 2D page going left-right/top-down. As far as I know, what you might call the "function", the programming, of that problem is still very far from solved.

>> No.5390069

>>5390038

FUCK EYE FUCKING DELTEDFUACKBACKSPACEFUCKNIGGERTITSBITCHLEUAURAEORUAUOEIRAOIERAO.

Anyhow, I had quite a lot more but THE UNIVERSE DOESN'T WANT TO HEAR IT SO IN CONCLUSION:

Basically, what you're desiring would require either an external intelligent AI/universal translator to translate what we want done into action, OR a specific route, which is just as asinine and inefficient: worse than a keyboard, in my opinion, because a keyboard externalizes the instrumentation via muscle memory. You no longer need to think about each individual letter, or even about the operation of the keyboard; the nerves from your brain to your hand pick up this task through practice, which alleviates the burden of micromanaging.

A way around this would be a "brain implant" of sorts, in which the input of your brain is already made readable by the implant, akin to opening Word in your mind, the translation from keyboard to page is already programmed.

This of course doesn't solve the problem of translation, it merely moves it. And neuroscience is as far from hacking the brain as we are from AIs, as far as I know.

Basically, unless you're a quadruple-finger-amputee, a keyboard is your most efficient means of interfacing until we either close neuroscience due to completion or make AIs so intelligent they can read our minds instantaneously.

>> No.5390388

>>5389950
No one gives a shit.
Idiots like you are why they should implement forced anon.

>> No.5390954

>>5389898
Sorry about that. I should remember to disable my internet connection before I go out drinking.

As I mentioned earlier, and as pointed out extensively in those four long posts above me, there are practical considerations that limit the applicability of on-line neuronal decoding, i.e. the translation problem. That leaves us with the option, as you pointed out, of training to induce state-action associations. I guess in principle this is possible, but it would be more difficult than one might expect. There is the issue of interference. 'Thought' is not as inhibited as motor output. For instance, I now ask you not to think of a pink elephant, but what pops up in your mind as you read this nonetheless is the thought of a lovely colorful pink elephant. Perhaps it has a blue bow tied around its trunk, and is wearing ballet shoes. Oh look, now it's dancing. You get the idea. Reading while 'typing' would become a problem. Stopping to type once the sentence is done would as well. There are neural systems in place that inhibit all and any thought from becoming behavior. This would effectively be eliminated by circumventing the motor system.

>>5388851
>When you think of typing the letter "a" you think of your finger hitting the leftmost center key of the key board.
You don't actually do that. Paradoxical as it may seem, typing is predominantly subserved by processes that you are not aware of. Fine motor control is mediated by the cerebellum and spinal chord. Your higher motor system only issues the command of what to type, but how the typing is implemented is beyond your conscious control.

>>5389954
Sure. I didn't mean just the primary motor cortex.

>>5390031
>Cunting Neurotic Salesman
Cute.

>> No.5390959

>>5390954
>chord
cord*

>> No.5390987

>>5390388
What I've said for the last 5 years

>> No.5391023

Could people hack your brain and fuck it up? This has been a fear of mine regarding interfacing technology with our brains for a while.

>> No.5391138
File: 96 KB, 845x698, davidson_diag_lg1.jpg

>>5390954
Thanks for the answer.
>Your higher motor system only issues the command of what to type, but how the typing is implemented is beyond your conscious control.
I thought this was interesting. But even if typing is controlled subconsciously, surely they can design an EEG technique that reads the signals fired in the cerebellum specifically. Yes, you may not be able to train the cerebellum, but perhaps you could type for a while and have the machine read your signals while typing; after a few hours, the signals that arise from the typing could be associated with the letters being typed in real time through some sort of machine learning algorithm.
This would have the benefit of not being interfered with by random thoughts from the frontal cortex (the pink elephant example), because the cerebellum should sit at a more or less constant background level when you are sitting comfortably in your chair. So in the future you could just imagine you are typing and the interface would just sort of take off.

What do you think?
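The self-labelling step that idea hinges on, pairing each real keystroke with the signal window just before it, could look roughly like this (purely hypothetical: `collect`, the window length, and the sampling rate are all my own illustration, and it assumes the relevant signal is measurable at all):

```python
import numpy as np

def collect(signal, keystrokes, fs, window_s=0.25):
    """Pair each keystroke (time_in_seconds, letter) with the signal
    window immediately preceding it, yielding free training labels."""
    half = int(window_s * fs)
    pairs = []
    for t, letter in keystrokes:
        i = int(t * fs)
        if i >= half:
            pairs.append((signal[i - half:i], letter))
    return pairs

# Toy recording at 100 Hz: a distinct signature precedes each keystroke.
fs = 100
signal = np.zeros(300)
signal[25:50] = 1.0     # pattern before the 'a' keystroke at t = 0.5 s
signal[125:150] = -1.0  # pattern before the 'b' keystroke at t = 1.5 s
pairs = collect(signal, [(0.5, "a"), (1.5, "b")], fs)
```

The labelled pairs would then feed whatever classifier you like; the open question is whether any scalp-recorded signal carries a per-letter signature in the first place.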

>> No.5391174

>>5391138
Well, aside from the technical problem that you can't actually measure most of the cerebellum with EEG (it lies too deep, and EEG can only pick up electric fields generated by large populations of neurons oriented perpendicular to the skull), the cerebellum orchestrates motor control based on somatosensory and proprioceptive feedback. That means that the majority of cerebellar activity associated with typing is only present when you are already in the process of actually typing.

>> No.5393688

>>5391174
Well... do you have any interesting ideas as to how this could be implemented in the future? The goal being to remove any need for a mouse, keyboard, or any other mechanical form of interaction, and have everything done directly by the brain.

>> No.5393706

>>5390954
>http://en.wikipedia.org/wiki/Jos%C3%A9_Manuel_Rodriguez_Delgado#Research
The technology to decode basic emotions and motor commands from brain activity seems to have been around for decades. Does this mean it's only a matter of time until algorithms can be developed to decode finer things, like wording and coherent thought?