Panther Glove Guns

As a rule I don’t review lethal weapons on scifiinterfaces.com. The Panther Glove Guns appear to be remote-bludgeoning beams, so this one kind of sneaks by. Also, I’ll confess in advance that there’s not a lot here that affords critique.

We first see the glove guns in the 3D printer output with the kimoyo beads for Agent Ross and the Dora Milaje outfit for Nakia. They are thick weapons that fit over Shuri’s hands and wrists. I imagine they would be very useful to block blades and even disarm an opponent in melee combat, but we don’t see them in use this way.

The next time we see them, Shuri is activating them (though we don’t see how). The panther heads thrust forward, their mouths open wide, and the “neck” glows a hot blue. When the door before her opens, she immediately raises them at the guards (who are loyal to the usurper Killmonger) and fires.

A light-blue beam shoots out of the mouths of the weapons, knocking the guards off the platform. Interestingly, one guard is lifted up and thrown to his 4 o’clock. The other is lifted up and thrown to his 7 o’clock. It’s not clear how Shuri instructs the weapons to have different and particular knock-down effects. But we’ve seen all over Black Panther that brain-computer interfaces (BCI) are a thing, so it’s diegetically possible she’s simply imagining where she wants them to be thrown, and then pulling a trigger, clenching her fist around a rod, or just thinking “BAM!” to activate. The force-bolt strikes each guard right where it needs to so that, like a billiard ball, he gets knocked in the desired direction. As with all(?) brain-computer interfaces, there is not an interaction to critique.

After she dispatches the two guards, still wearing the gloves, she throws a control bead onto the Talon. The scene is fast and blurry, so it’s unclear how she holds and releases the bead from the glove. Was it in the panther’s jaw the whole time? It could be another BCI, of course: she just thought about where she wanted it, flung her arm, and let the AI decide when to release it for perfect targeting. The Talon is large, and she doesn’t seem to need a great deal of accuracy with the bead, but for more precise operations, AI targeting would make more sense than, say, having the panther heads disintegrate on command to free her hands.

Later, after Killmonger dispatches the Dora Milaje, Shuri and Nakia confront him by themselves. Nakia gets in a few good hits, but is thrown from the walkway. Shuri throws some more bolts his way, though he doesn’t appear to notice. I note that the panther gloves would be very difficult to aim, since there’s no continuous beam providing feedback and she doesn’t have a gun sight to help her. So, again—and I’m sorry because it feels like cheating—I have to fall back on an AI assist here. Otherwise it doesn’t make sense.

Then Shuri switches from one blast at a time to a continuous beam. It seems to be working, as Killmonger kneels from the onslaught.

This is working! How can I eff it up?

But then for some reason she—with a projectile weapon that is actively subduing the enemy and keeping her safe at a distance—decides to close the distance, allowing Killmonger to knock the glove guns aside with a spear tip, free himself, and destroy the gloves with a clutch of his Panther claws. I mean, I get that she was furious, but I expected better tactics from the chief nerd of Wakanda. Thereafter, the gloves just spark when she tries to fire them. So ends this print of the Panther Glove Guns.

As with all combat gear, it looks cool for it to glow, but we don’t want coolness to help an enemy target the weapon. So if it were possible to suppress the glow, that would be advisable. It might be glowing just for the intimidation factor, but for a projectile weapon that seems strange.

The panther head shapes remind an opponent that she is royalty (note that no other Wakandan combatants have ranged weapons) and fighting in Bast’s name, which, if you’re in the business of theocratic warfare, is fine, I guess.

It’s worked so well in the past. More on this aspect later.

So, if you buy the brain-computer interface interpretation, AI targeting assist, and theocratic design, these are fine, with the cinegenic exception of the attention-drawing glow.


Black History Matters

Each post in the Black Panther review is followed by actions that you can take to support black lives.

When the Watchmen series opened with the Tulsa Race Massacre, many people were shocked to learn that this event was not fiction, reminding us just how much of black history is erased and whitewashed for the comfort of white supremacy (and fuck that). Today marks the beginning of Black History Month, and it’s a good opportunity to look back and (re)learn of the heroic figures and stories of both terror and triumph that fill black struggles to have their citizenship and lives fully recognized.

Library of Congress, American National Red Cross Photograph Collection

There are lots of events across the month. The African American History Month site is a collaboration of several government organizations (and it feels so much safer to share such a thing now that the explicitly racist administration is out of office and facing a second impeachment):

  • The Library of Congress
  • National Archives and Records Administration
  • National Endowment for the Humanities
  • National Gallery of Art
  • National Park Service
  • Smithsonian Institution
  • United States Holocaust Memorial Museum

The site, https://www.africanamericanhistorymonth.gov/, has a number of resources for you, including images, videos, and a calendar of events.

Today we can take a moment to remember and honor the Greensboro Four.

On this day, February 1, 1960: Through careful planning and enlisting the help of a local white businessman named Ralph Johns, four Black college students—Ezell A. Blair, Jr., Franklin E. McCain, Joseph A. McNeil, and David L. Richmond—sat down at a segregated lunch counter at Woolworth’s in Greensboro, North Carolina, and politely asked for service. Their request was refused. When asked to leave, they remained in their seats.

Police arrived on the scene, but were unable to take action due to the lack of provocation. By that time, Ralph Johns had already alerted the local media, who had arrived in full force to cover the events on television. The Greensboro Four stayed put until the store closed, then returned the next day with more students from local colleges.

Their passive resistance and peaceful sit-down demand helped ignite a youth-led movement to challenge racial inequality throughout the South.

A last bit of amazing news to share today is that Black Lives Matter has been nominated for the Nobel Peace Prize! The movement was co-founded by Alicia Garza, Patrisse Cullors, and Opal Tometi in response to the acquittal of Trayvon Martin’s murderer, was boosted by the waves of outrage that followed, and has grown into a global movement working to improve the lives of the entire black diaspora. May it win!

Okoye’s grip shoes

Like so much of the tech in Black Panther, this wearable battle gear is quite subtle, but critical to the scene, and much more than it seems at first. When Okoye and Nakia are chasing Klaue through the streets of Busan, South Korea, she realizes she would be better positioned on top of their car than within it.

She holds one of her spears out of the window, stabs it into the roof, and uses it to pull herself out on top of the swerving, speeding car. Once there, she places her feet into position, and the moment the sole of her foot touches the roof, it glows cyan for a moment.

She then holds onto the stuck spear to stabilize herself, rears back with her other spear, and throws it forward through the rear window and windshield of some minions’ car, where it sticks in the road before them. Their car strikes the spear and gets crushed. It’s a kickass moment in a film of kickass moments. But by all means let’s talk about the footwear.

Now, the effect the shoes have in the world of the story is never made explicit. But we can guess, given the context, that we are meant to believe the shoes grip the car roof, giving her a firm enough anchor to stay on top of the car and not tumble off when it swerves.

She can’t just be stuck

I have never thrown a javelin or a hyper-technological vibranium spear. But Mike Barber, PhD scholar in Biomechanics at Victoria University and the Australian Institute of Sport, wrote this article about the mechanics of javelin throwing, and it seems that throwing force is not achieved by sheer strength of the rotator cuff alone. Rather, the thrower builds force across their entire body and whips the momentum around the shoulder joint.

 Ilgar Jafarov, CC BY-SA 4.0, via Wikimedia Commons

Okoye is a world-class warrior, but doesn’t have superpowers, so…while I understand she does not want the car to yank itself from underneath her with a swerve, it seems that being anchored in place, like some Wakandan air tube dancer, will not help her with her mighty spear throwing. She needs to move.

It can’t just be manual

Imagine being on a mechanical bull jerking side to side—being stuck might help you stay upright. But imagine it jerking forward suddenly, and you’d wind up on your butt. If it jerked backwards, you’d be thrown forward, and it might be much worse. All are possibilities in the car chase scenario.

If those jerking motions happened to Okoye faster than she could react and release her shoes, it could be disastrous. So it can’t be a thing she needs to manually control. Which means it needs to be some blend of manual, agentive, and assistive. Autonomic, maybe, to borrow the term from physiology?

So…

To really be of help, it has to…

  • monitor the car’s motion
  • monitor her center of balance
  • monitor her intentions
  • predict the future motions of the cars
  • handle all the cybernetics math (in the Norbert Wiener sense, not the sci-fi sense)
  • know when it should just hold her feet in place, and when it should signal for her to take action
  • know what action she should ideally take, so it knows what to nudge her to do

These are no mean feats, especially in real time. So, I don’t see any explanation except…

An A.I. did it.

AGI is in the Wakandan arsenal (cf. Griot helping Ross), so this is credible given the diegesis, but I did not expect to find it in shoes.
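
To get a feel for how much work hides in that little list, here is a toy sketch of the decision loop such shoes might run. Everything in it (the sensor readings, the threshold, the three modes) is a hypothetical stand-in of my own, not anything the film shows us:

```python
from dataclasses import dataclass

@dataclass
class Sensors:
    car_accel_g: float       # fore-aft acceleration of the roof, in g (hypothetical IMU)
    balance_offset_m: float  # wearer's center of mass relative to her feet, in meters
    intent_brace: bool       # decoded from a kimoyo-style BCI: does she want to stay planted?

# Hypothetical threshold: past this predicted offset, rigidly holding her feet
# would topple her rather than help her.
TOPPLE_LIMIT_M = 0.25
HORIZON_S = 0.3  # how far ahead the shoes try to predict, in seconds

def predict_offset(s: Sensors) -> float:
    """Crude constant-acceleration guess at her balance offset a beat from now.
    Real cybernetics (in the Wiener sense) would fuse a full vehicle-motion model."""
    return s.balance_offset_m + 0.5 * (s.car_accel_g * 9.81) * HORIZON_S ** 2

def decide(s: Sensors) -> str:
    if not s.intent_brace:
        return "release"  # she intends to move; don't fight her
    if abs(predict_offset(s)) > TOPPLE_LIMIT_M:
        return "nudge"    # holding would topple her, so cue her to step
    return "grip"         # safe to stay anchored through the swerve

# A swerve: anchored feet help, so grip.
print(decide(Sensors(car_accel_g=0.2, balance_offset_m=0.05, intent_brace=True)))   # grip
# A hard brake: anchoring would throw her forward, so nudge her to move.
print(decide(Sensors(car_accel_g=-1.5, balance_offset_m=0.10, intent_brace=True)))  # nudge
```

Even this cartoon version has to predict the future and second-guess her intentions. The real thing would be doing all of that continuously, in milliseconds.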

An interesting design question is how it might deliver warning signals about predicted motions. Is it tangible, like vibration? Or a mild electrical buzz? Or a writing-to-the-brain urge to move? The movie gives us no clues, but if you’re up for a design challenge, give it a speculative design pass.

Wearable heuristics

As part of my 2014 series about wearable technologies in sci-fi, I identified a set of heuristics we can use to evaluate such things. A quick check against those shows that the shoes fare well. They are quite sartorial, and they look like shoes, so they are social as well. As a brain interface, they are supremely easy to access and use. Two of the heuristics raise questions, though.

  1. Wearables must be designed so they are difficult to accidentally activate. It would have been very inconvenient for Okoye to find herself stuck to the surface of Wakanda while trying to chase Killmonger later in the film, for example. It would be safer to ensure deliberateness with some mode-confirming physical gesture, but there’s no evidence of it in the movie.
  2. Wearables should have apposite I/O. The soles glow. Okoye doesn’t need that information. I’d say in a combat situation it’s genuinely bad design to require her to look down to confirm any modes of the shoes. They’re worn. She will immediately feel whether her shoes are fixed in place. I can’t name exactly how an enemy might use the knowledge of whether she is stuck in place, but on general principle, the less information you give to the enemy, the safer you’ll be. So if this were real-world, we would seek to eliminate the glow. That said, we know that undetectable interactions are not cinegenic in the slightest, so for the film this is a nice “throwaway” addition to the cache of amazing Wakandan technology.

Black Georgia Matters and Today is the Day

Each post in the Black Panther review is followed by actions that you can take to support black lives.

Today is the last day in the Georgia runoff elections. It’s hard to overstate how important this is. If Ossoff and Warnock win, the future of the country has a much better likelihood of taking Black Lives Matter (and lots of other issues) more seriously. Actual progress might be made. Without it, the obstructionist and increasingly-frankly-racist Republican party (and Moscow Mitch) will hold much of the Biden-Harris administration back. If you know of any Georgians, please check with them today to see if they voted in the runoff election. If not—and they’re going to vote Democrat—see what encouragement and help you can give them.

Some ideas…

  • Pay for a ride there and back remotely.
  • Buy a meal to be delivered for their family.
  • Make sure they are protected and well-masked.
  • Encourage them to check the status of their absentee ballot, if they cast one, here: https://georgia.ballottrax.net/voter/
  • If their absentee ballot has not been registered, they can go to the polls and tell the workers there that they want to cancel their absentee ballot and vote in person. Help them find their polling place at My Voter Page: https://www.mvp.sos.ga.gov/MVP/mvp.do

This vote matters, matters, matters.

UX of Speculative Brain-Computer Inputs

So much of the technology in Black Panther appears to work by mental command (so far: Panther Suit 2.0, the Royal Talon, and the vibranium sand tables) that…

  • before we get into the Kimoyo beads, or the Cape Shields, or the remote driving systems…
  • before I have to dismiss these interactions as “a wizard did it” style non-designs
  • before I review other brain-computer interfaces in other shows…

…I wanted to check on the state of the art of brain-computer interfaces (BCIs) and see how our understanding has advanced since I wrote the brain interface chapter in the book, back in the halcyon days of 2012.

Note that I am deliberately avoiding the tech side of this question. I’m not going to talk about EEG, PET, MRI, and fMRI. (Though they’re linked in case you want to learn more.) Modern BCI technologies are evolving too rapidly to bother with an overview of them. They’ll have changed in the real world by the time I press “publish,” much less by the time you read this. And sci-fi tech is most often a black box anyway. But the human part of the human-computer interaction model changes much more slowly. We can look to the brain as a relatively unalterable component of the BCI question, leading us to two believability questions of sci-fi BCI.

  1. How can people express intent using their brains?
  2. How do we prevent accidental activation using BCI?

Let’s discuss each.

1. How can people express intent using their brains?

In the see-think-do loop of human-computer interaction…

  • See (perceive) has been a subject of visual, industrial, and auditory design.
  • Think has been a matter of human cognition as informed by system interaction and content design.
  • Do has long been a matter of some muscular movement that the system can detect, to start its matching input-process-output loop. Tap a button. Move a mouse. Touch a screen. Focus on something with your eyes. Hold your breath. These are all ways of “doing” with muscles.

The “bowtie” diagram I developed for my book on agentive tech.

But the first promise of BCI is to let that doing part happen with your brain. The brain isn’t a muscle, so what actions are BCI users able to take in their heads to signal to a BCI system what they want it to do? The answer to this question is partly physiological, about the way the brain changes as it goes about its thinking business.

Ah, the 1800s. Such good art. Such bad science.

Our brains are a dense network of bioelectric signals, chemicals, and blood flow. But it’s not chaos. It’s organized. It’s locally functionalized, meaning that certain parts of the brain are predictably activated when we think about certain things. But it’s not like the Christmas lights in Stranger Things, with one part lighting up discretely at a time. It’s more like an animated proportional symbol map, with lots of places lighting up at the same time to different degrees.

Illustrative composite of a gif and an online map demo.

The sizes and shapes of what’s lighting up may change slightly between people, but a basic map of healthy, undamaged brains will be similar to each other. Lots of work has gone on to map these functional areas, with researchers showing subjects lots of stimuli and noting what areas of the brain light up. Test enough of these subjects and you can build a pretty good functional map of concepts. Thereafter, you can take a “picture” of the brain, and you can cross-reference your maps to reverse-engineer what is being thought.

From Jack Gallant’s semantic maps viewer.
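
To make that cross-referencing step concrete, here is a toy decoder. The three-number “maps” and the concept list are invented for illustration; real semantic decoding, like the Gallant lab’s, fits regression models over tens of thousands of voxels:

```python
import math

# Invented, tiny "functional maps": average activation of three brain regions
# while subjects viewed each concept. Real maps span tens of thousands of voxels.
CONCEPT_MAPS = {
    "panther": [0.9, 0.1, 0.3],
    "water":   [0.2, 0.8, 0.1],
    "talk":    [0.1, 0.2, 0.9],
}

def cosine(a, b):
    """Similarity between two activation patterns, ignoring overall intensity."""
    dot = sum(x * y for x, y in zip(a, b))
    mag = math.sqrt(sum(x * x for x in a)) * math.sqrt(sum(y * y for y in b))
    return dot / mag

def decode(snapshot):
    """Reverse-engineer a thought: which stored map most resembles this 'picture'?"""
    return max(CONCEPT_MAPS, key=lambda c: cosine(CONCEPT_MAPS[c], snapshot))

print(decode([0.85, 0.15, 0.25]))  # -> panther
```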

Right now those pictures are pretty crude and slow, but so were the first actual photographs in the world. In 20–50 years, we may be able to wear baseball caps that provide much higher-resolution, real-time input of the concepts being thought. In the far future (or, say, the alternate history of the MCU) it is conceivable that these things could be read from a distance. (Though there are significant ethical questions involved in such a technology, this post is focused on questions of viability and interaction.)

From Jack Gallant’s semantic map viewer

Similarly, the brain maps we have cover only a small percentage of an average adult vocabulary. Jack Gallant’s semantic map viewer (pictured and linked above) shows the maps for about 140 concepts, and estimates of the average active vocabulary run around 20,000 words, so we’re looking at roughly a tenth of a tenth of what we can imagine (not even counting the infinite composability of language). But in the future we will not only have more concepts mapped, more confidently, but we will also have idiographs for each individual, like the personal dictionary in your smartphone.

All this is to say that our extant real-world technology confirms that thoughts are a believable input for a system. This includes linguistic inputs like “Turn on the light” and “activate the vibranium sand table” and “Sincerely, Chris,” and even imagining the desired change, like a light changing from dark to light. It might even include subconscious thoughts that have yet to be formed into words.

2. How do we prevent accidental activation?

But we know from personal experience that we don’t want all our thoughts to be acted on. Take, for example, the thoughts you have when you’re feeling hangry, or snarky, or dealing with a jerk in authority. Or those texts and emails that you’ve composed in the heat of the moment but wisely deleted before they get you in trouble.

If a speculative BCI is being read by a general artificial intelligence, it can manage that just like a smart human partner would.

He is composing a blog post, reasons the AGI, so I will just disregard his thought that he needs to pee.

And if there’s any doubt, an AGI can ask. “Did you intend me to include the bit about pee in the post?” Me: “Certainly not. Also BRB.” (Readers following the Black Panther reviews will note that AGI is available to Wakandans in the form of Griot.)

If AGI is unavailable to the diegesis (and it would significantly change any diegesis of which it is a part), then we need some way to indicate when a thought is intended as input and when it isn’t. Having that be some mode of thought feels complicated and error-prone, like when programmers have to write regexes that escape escape characters. Better, I think, is to use some secondary channel, like a bodily interaction. Touch forefinger and pinky together, for instance, and the computer understands you intend your thoughts as input.
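
Here is a minimal sketch of that gating idea, with the pinch sensor and the decoded thoughts as purely hypothetical stand-ins:

```python
# Each tuple: (decoded thought, is the forefinger-pinky pinch held?)
# Both streams are simulated; a real system would read a BCI and a glove sensor.
thought_stream = [
    ("ugh, I need to pee",          False),
    ("turn on the light",           True),   # pinch held: deliberate command
    ("this post is taking forever", False),
    ("activate the sand table",     True),   # pinch held: deliberate command
]

def execute(command: str) -> None:
    print(f"EXECUTING: {command}")

for thought, pinch_held in thought_stream:
    if pinch_held:
        execute(thought)  # the secondary channel marks this thought as intended
    # otherwise: stray cognition is observed but never acted on
```

The design work is all in the gate: the thought channel is noisy and continuous, so deliberateness has to come from somewhere else.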

So, for any BCI that appears in sci-fi, we would want to look for the presence or absence of AGI as a reasonableness interpreter, and, barring that, for some alternate-channel mechanism for indicating deliberateness. We would also hope to see some feedback and correction loops to understand the nuances of the edge-case interactions, but these are rare in sci-fi.

Even more future-full

This all points to the question of what seeing/perceiving via a BCI might be. A simple example might be a disembodied voice that only the user can hear.

A woman walks alone at night. Lost in thought, she hears her AI whisper to her thoughts, “Ada, be aware that a man has just left a shadowy doorstep and is following, half a block behind you. Shall I initialize your shock shoes?”

What other than language can be written to the brain in the far future? Images? Movies? Ideas? A suspicion? A compulsion? A hunch? How will people know what are their own thoughts and what has been placed there from the outside? I look forward to the stories and shows that illustrate new ideas, and warn us of the dark pitfalls.

The Royal Talon piloting interface

Since my last post, news broke that Chadwick Boseman has passed away after a four-year battle with cancer. He kept his struggles private, so the news was sudden and hard-hitting. The fandom is still reeling. Black people, especially, have lost a powerful, inspirational figure. The world has also lost a courageous and talented young actor. Rise in Power, Mr. Boseman. Thank you for your integrity, bearing, and strength.

Photo CC BY-SA 2.0, by Gage Skidmore.

Black Panther’s airship is a triangular vertical-takeoff-and-landing vehicle called the Royal Talon. We see its piloting interface twice in the film.

The first time is near the beginning of the movie. Okoye and T’Challa are flying at night over the Sambisa forest in Nigeria. Okoye sits in the pilot’s seat in a meditative posture, facing a large forward-facing bridge window with a heads up display. A horseshoe-shaped shelf around her is filled with unactivated vibranium sand. Around her left wrist, her kimoyo beads glow amber, projecting a volumetric display around her forearm.

She announces to T’Challa, “My prince, we are coming up on them now.” As she disengages from the interface, retracting her hands from the pose, the kimoyo projection shifts and shrinks. (See more detail in the video clip, below.)

The second time we see it is when they pick up Nakia and save the kidnapped girls. On their way back to Wakanda we see Okoye again in the pilot’s seat. No new interactions are seen in this scene, though we linger on the shot from behind, with its glowing seatback looking like some high-tech spine.

Now, these brief glimpses don’t give a review a lot to go on. But for the sake of completeness, let’s talk about that volumetric projection around her wrist. I note that it is a lovely echo of Dr. Strange’s interface for controlling the Eye of Agamotto and its Time Stone.

Wrist projections are going to be all the rage at the next Snap, I predict.

But we never really see Okoye look at this VP or use it. Cross-referencing the Wakandan alphabet, those five symbols at the top translate to 1 2 K R I, which doesn’t tell us much. (It doesn’t match the letters seen on the HUD.) It might be a visual do-not-disturb signal to onlookers, but if there’s other meaning that the letters and petals are meant to convey to Okoye, I can’t figure it out. At worst, I think having the wrist movements of one hand emphasized in your peripheral vision with a glowing display is a dangerous distraction from piloting. Her eyes should be on the “road” ahead of her.

The image has been flipped horizontally to illustrate how Okoye would see the display.

Similarly, we never get a good look at the HUD, or see Okoye interact with it, so I’ve got little to offer other than a mild critique that it looks full of pointless ornamental lines, many of which would obscure things in her peripheral vision, which is where humans need the most help detecting things other than motion. But modern sci-fi interfaces generally (and the MCU in particular) are in a baroque period, and this is partly how audiences recognize sci-fi-ness.

I also think that requiring a pilot to maintain full lotus is a little much, but certainly, if there’s anyone who can handle it, it’s the leader of the Dora Milaje.

One remarkable thing to note is that this is the first brain-input piloting interface in the survey. Okoye thinks what she wants the ship to do, and it does it. I expect, given what we know about kimoyo beads in Wakanda (more on these in a later post), what’s happening is she is sending thoughts to the bracelet, and the beads are conveying the instructions to the ship. As a way to show Okoye’s self-discipline and Wakanda’s incredible technological advancement, this is awesome.

Unfortunately, I don’t have good models for evaluating this interaction. And I have a lot of questions. As with gestural interfaces, how does she avoid a distracted thought from affecting the ship? Why does she not need a tunnel-in-the-sky assist? Is she imagining what the ship should do, or a route, or something more abstract, like her goals? How does the ship grant her its field awareness for a feedback loop? When does the vibranium dashboard get activated? How does it assist her? How does she hand things off to the autopilot? How does she take it back? Since we don’t have good models, and it all happens invisibly, we’ll have to let these questions lie. But that’s part of us, from our less-advanced viewpoint, having to marvel at this highly-advanced culture from the outside.


Black Health Matters

Each post in the Black Panther review is followed by actions that you can take to support black lives.

Thinking back to the terrible loss of Boseman: Fuck cancer. (And not to imply that his death was affected by this, but also:) Fuck the racism that leads to worse medical outcomes for black people.

One thing you can do is to be aware of the diseases that disproportionately affect black people (diabetes, asthma, lung scarring, strokes, high blood pressure, and cancer) and be aware that no small part of these poorer outcomes is racism, systemic and individual. Listen to Dorothy Roberts’ TED talk, calling for an end to race-based medicine.

If you’re the reading sort, check out the books Black Man in a White Coat by Damon Tweedy, or the infuriating history covered in Medical Apartheid by Harriet Washington.

If you are black, in Boseman’s memory, get screened for cancer as often as your doctor recommends it. If you think you cannot afford it and you are in the USA, this CDC website can help you determine your eligibility for free or low-cost screening: https://www.cdc.gov/cancer/nbccedp/screenings.htm. If you live elsewhere, you almost certainly have a better healthcare system than we do, but a quick search should tell you your options.

Cancer treatment is equally successful for all races. Yet black men have a 40% higher cancer death rate than white men and black women have a 20% higher cancer death rate than white women. Your best bet is to detect it early and get therapy started as soon as possible. We can’t always win that fight, but better to try than to find out when it’s too late to intervene. Your health matters. Your life matters.

3 of 3: Brain Hacking

The hospital doesn’t have the equipment to decrypt and download the actual data. But Jane knows that the LoTeks can, so they drive to the ruined bridge that is the LoTek home base. As mentioned earlier under Door Bombs and Safety Catches, the bridge guards nearly kill them due to a poorly designed defensive system. Once again Johnny is not impressed by the people who are supposed to help him.

When Johnny has calmed down, he is introduced to Jones, the LoTek codebreaker who decrypts corporate video broadcasts. Jones is a cyborg dolphin.

Brain Scanning

The second half of the film is all about retrieving the data from Johnny’s implant without the full set of access codes. Johnny needs to get the data downloaded soon or he will die from the “synaptic seepage” caused by squeezing 320G of data into a system with 160G capacity. The bad guys would prefer to remove his head and cryogenically freeze it, allowing them to take their time over retrieval.

1 of 3: Spider’s Scanners

The implant cable interface won’t allow access to the data without the codes. Bypassing this protection requires three increasingly complicated brain scanners, two of them medical systems and the final one a LoTek hacking device. Although the implant stores data, not human memories, all of these brain scanners work in the same way as the non-invasive “reading from the brain” interfaces described in Chapter 7 of Make It So.

The first system is owned by Spider, a Newark body modification specialist. Johnny sits in a chair, with an open metal framework surrounding his head. There’s a bright strobing light, switching on and off several times a second.


Nearby, a monitor shows a large rotating image of his head and skull, and three smaller images on the left labelled Scans 1 to 3.

The Memory Doubler

In Beijing, Johnny steps into a hotel lift and pulls a small package out of his pocket. He unwraps it to reveal the “Pemex MemDoubler.”


Johnny extends the cable from the device and plugs it into the implant in his head. The socket glows red once the connection is made.



Itchy’s SFW Masturbation Chair

With the salacious introduction, “Itchy, I know what you’d like,” Saun Dann reveals himself as a peddler of not just booby-trapped curling irons, but also softcore erotica! The Life Day gift he gives to the old Wookiee is a sexy music video for his immersive media chair.


The chair sits in the family living room, and has a sort of helmet fixed in place such that Itchy can sit down and rest his head within it. On the outside of the helmet are lights that continuously blink out of sync with each other and seem unrelated to the actual function of the chair. Maybe a fairy-lights power indicator?



Dat glaive: Projectile gestures

TRIGGER WARNING: IF YOU ARE PRONE TO SEIZURES, this is not the post for you. In fact, you can just read the text and be quit of it. The more neurologically daring of you can press “MORE,” but you have been forewarned.

If the first use of Loki’s glaive is as a melee weapon, the second is as a projectile weapon. Loki primes it, it glows fiercely blue-white, and then he fires it with usually-deadly accuracy, to the sorrow of his foes.

This blog is not interested in the details of the projectile, but what is interesting is the interface by which he primes and fires it. How does he do it? Let’s look. He fires the thing 8 times over the course of the movie. What do we see there?

Brain interfaces as wearables

There are lots of brain devices, and the book has a whole chapter dedicated to them. Most of these devices are passive, merely needing to be near the brain to have whatever effect they are meant to have. (The chapter discusses, in turn: reading from the brain, writing to the brain, telexperience, telepresence, manifesting thought, virtual sex, piloting a spaceship, and playing an addictive game. It’s a good chapter that never got that much love. Check it out.)


This is a composite rendering of the shapes of most of the wearable brain control devices in the survey. Who can name the “tophat”?

Since the vast majority of these devices are activated by, well, you know, invisible brain waves, the most that can be pulled from them is the sartorial- and social-ness of their industrial design. But there are two with genuine state-change interactions of note for interaction designers.

Star Trek: The Next Generation

The eponymous Game of S05E06 is delivered through a wearable headset. It is a thin band that arcs over the head from ear to ear, with two extensions out in front of the face that project visuals into the wearer’s eyes.


The only physical interaction with the device is activation, which is accomplished by depressing a momentary button located at the top of one of the temples. It’s a nice placement since the temple affords placing a thumb beneath it to provide a brace against which a forefinger can push the button. And even if you didn’t want to brace with the thumb, the friction of the arc across the head provides enough resistance on its own to keep the thing in place against the pressure. Simple, but notable. Contrast this with the buttons on the wearable control panels that are sometimes quite awkward to press into skin.

Minority Report (2002)

The second is the Halo coercion device from Minority Report. This is barely worth mentioning, since the interaction is by the PreCrime cop, and it is only to extend it from a compact shape to one suitable for placing on a PreCriminal’s head. Push the button and pop! it opens. While it’s actually being worn there is no interacting with it…or much of anything, really.



Head: Y U No house interactions?

There is a solid physiological reason why the head isn’t a common place for interactions, and that’s that raising the hands above the heart requires a small bit of cardiac effort, and wouldn’t be suitable for frequent interactions simply because over time it would add up to work. Google Glass faced similar challenges, and my guess is that’s why it uses a blended interface of voice, head gestures, and a few manual gestures. Relying on purely manual interactions would violate the wearable principle of apposite I/O.

At least as far as sci-fi is telling us, the head is not often a fitting place for manual interactions.