Project Black Mirror Hacks Siri to Use Brain Activity to Initiate Calls

Siri is arguably the most efficient hands-free way of handling basic tasks on a smartphone. In the future you may be able to simply think of a task you want your iPhone to perform, and traditional voice recognition could become obsolete.

"Some hobbyist hackers behind the Project Black Mirror have rigged up an iPhone 4S to collect brain wave patterns from some simple ECG pads, translate them into synthesized speech, which is in turn pumped through the 3.5 mm headphone jack, and recognized by Siri as a usable command. Besides pressing the home key to initiate Siri, all you have to do is think your command, and your iPhone 4S will hop to it."

The team recorded brain wave activity with ECG pads, matched the incoming patterns against digital patterns stored on a MacBook, and then fed the matched commands to a speech synthesizer chip that speaks them to Siri.
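The project doesn't spell out how that matching works, but conceptually it amounts to comparing each incoming window of samples against a small library of stored "signature" templates and picking the best match. Here is a minimal sketch of that idea in C++; the window length, template names, and the 0.9 correlation threshold are illustrative assumptions, not values from the project.

```cpp
// Illustrative template matching: compare an incoming sample window against
// stored signature patterns and return the best-scoring command, if any.
#include <cmath>
#include <map>
#include <string>
#include <vector>

// Normalized cross-correlation between two equal-length signals (range -1..1).
double correlate(const std::vector<double>& a, const std::vector<double>& b) {
    double dot = 0.0, na = 0.0, nb = 0.0;
    for (size_t i = 0; i < a.size() && i < b.size(); ++i) {
        dot += a[i] * b[i];
        na  += a[i] * a[i];
        nb  += b[i] * b[i];
    }
    return (na > 0 && nb > 0) ? dot / std::sqrt(na * nb) : 0.0;
}

// Return the name of the best-matching stored pattern, or "" if nothing
// clears the acceptance threshold (0.9 here is a placeholder).
std::string matchCommand(const std::vector<double>& window,
                         const std::map<std::string, std::vector<double>>& templates,
                         double threshold = 0.9) {
    std::string best;
    double bestScore = threshold;
    for (const auto& entry : templates) {
        double score = correlate(window, entry.second);
        if (score > bestScore) {
            bestScore = score;
            best = entry.first;
        }
    }
    return best;  // e.g. "Call", "Set", "Diary" — handed on to the synthesiser
}
```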

The video shows the developers initiating a call, but they say they have linked approximately twenty-five brain wave patterns to various Siri-controlled functions, and they hope to eventually replace the physical home-button press with a fully automated solution.

The team breaks down the intricacies:

  1. ECG pads provide raw skin conductivity / electrical activity as analogue data (0–5 V).
  2. This is plugged into the Arduino board via 4 analogue inputs (no activity = 0 V, high activity = 5 V).
  3. The Arduino has a program burnt to its EPROM chip that filters the signals.
  4. Josh trained the program by thinking of the main Siri commands ("Call", "Set", "Diary" etc.) one at a time while the program captured the signature brain patterns they produce.
  5. The program can detect the signature patterns that indicate a certain word is being thought of. The program will then wait for a natural 'release' in brain waves and assume the chain of commands is now complete and action is required.
  6. The series of commands is fed to a SpeakJet speech synthesiser chip.
  7. The chip's audio output simply plugs into the iPhone's microphone jack (a rough sketch of this loop follows below).
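Taken together, the list describes a simple sense–classify–speak loop running on the Arduino. The sketch below is a hypothetical rendering of that loop, not the project's actual firmware: the pin assignments, thresholds, and SpeakJet phoneme bytes are placeholder assumptions, and the trained pattern matching is reduced to a stub.

```cpp
// Hypothetical Arduino rendering of the loop described above: read four
// analogue channels, wait for a burst of activity, classify it, then send
// phoneme codes to a SpeakJet synthesiser over the serial TX pin.
// Pin numbers, thresholds, and phoneme bytes are placeholder assumptions.

const int SENSOR_PINS[4] = {A0, A1, A2, A3};  // ECG pad channels (0-5 V)
const int ACTIVITY_THRESHOLD = 300;           // ADC counts; "high activity"
const int RELEASE_THRESHOLD  = 100;           // ADC counts; natural "release"

// Placeholder byte sequence for one spoken command (not real SpeakJet codes).
const byte CALL_PHRASE[] = {0xC3, 0x88, 0x95};

void setup() {
  Serial.begin(9600);  // serial link to the SpeakJet's data input
}

// Average the four channels into a single activity level.
int readActivity() {
  long sum = 0;
  for (int i = 0; i < 4; ++i) {
    sum += analogRead(SENSOR_PINS[i]);
  }
  return sum / 4;
}

// Stub for the trained classifier: map a captured burst to a command index.
// The real project matched bursts against ~25 stored signature patterns.
int classifyBurst(int peakLevel) {
  return 0;  // always "Call" in this sketch
}

void loop() {
  int level = readActivity();

  if (level > ACTIVITY_THRESHOLD) {
    // Track the burst until the level "releases" back down, which the team
    // treats as the sign that the thought-out command is complete.
    int peak = level;
    while (level > RELEASE_THRESHOLD) {
      level = readActivity();
      if (level > peak) peak = level;
      delay(10);
    }

    if (classifyBurst(peak) == 0) {
      Serial.write(CALL_PHRASE, sizeof(CALL_PHRASE));  // speak the command to Siri
    }
  }
}
```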

Here's a demo of the Black Mirror in action: