train your diet to eat concrete?
Anyway, I did the software stuff. If we needed to sync MATLAB timestamps to the timestamps from the EMOTIV headset, I did that. I was also the one responsible for translating the machine learning output into actual movement: it outputs a 1 if it thinks the hand is moving based on a participant's EEG data, and a 0 if it doesn't.
And the code itself was pretty cool, even though it was simple. I wish I could have gushed about it in the presentation, but there was no time. MATLAB sends data to the Arduino, okay, that's simple, but the way the Arduino handled it was really cool.
Every second we got 128 frames of data. We might get false positives, and the way I handled that was really cool.
So there were three data variables: data[0], data[1], data[2]. The Arduino reads the first three frames of data from MATLAB into those slots, then calls a function that adds them all up and outputs a score from 0 to 3. A 3 means it's pretty confident the hand is moving, so for every 3 we move the servo for the hand 60 degrees toward closed, which means three 3's in a row would close it fully. Then it discards data[0], moves everything down, puts the next frame into data[2], and does the same thing. It's just so cool that instead of implementing a counter I implemented a sliding window that gives a fresh score (0-3) every frame, which we can use to make the Arduino do what we want.
A 3 would also set a flag. The next time we get a 0, the 0 resets the flag and makes all future 3's subtract 60 degrees from the current angle, opening the hand. This mimics a hand closing, pausing, then opening. A 1 or 2 is our buffer zone to prevent false positives, since nothing happens then. And if we get a 3 again, it sets the flag so that the next 0 makes the 3's add degrees instead. The flag makes sure the flip only executes once.
It worked pretty well
Oh yeah, our project was taking an EEG headset, putting it on people's heads, and feeding the data it collects to a machine learning algorithm trained to detect hand movement
We take that data into MATLAB, which sends it to the Arduino
We couldn't do live data. I was so upset about that. My part would have been so much cooler with live data, but we couldn't do it because the subscription is $200 a month
I love capitalism
If we had more time and better equipment, we could have made a better hand, trained the AI on each individual finger, etc
can it flip someone off
Guys what machine would you group your orphan shredder machine near?
- Orphan stealer machine
- Cheese shredder machine
- Special alone corner for it
I mean ya gotta put it next to the orphan stealer machine to maximize efficiency.
If you don't put it there, then what's even the point?
I came to this orphan shredder to shred orphans, not to walk back and forth across the room for every goddamn batch of orphans.
IF I WANTED TO WALK BACK AND FORTH ACROSS A ROOM WITH MY HANDS FULL, I'D BE DOING THE LAUNDRY, DAMMIT.
(Wait, do the stolen orphans come with clothes beyond underwear? Because, if so, I can add reselling various articles of clothing (on like eBay or some shit) to the list of things that the orphan stealing/shredding room can be used for.)
Me on my way to make the orphan shredder profitable
Wait, no. Iād have to put the clothes through the laundry. Never mind.
Orphan shredding is hard…
My brain is fried.
Does this look like a wolf to you