Motion Capture Technologies

The future of mocap outside of cinema

Mikaila Weaver
20 min read · Apr 14, 2016

ABSTRACT

Motion-capture, or mocap, technologies have been around nearly as long as cel animation itself. From ‘Snow White and the Seven Dwarfs’ to ‘Dawn of the Planet of the Apes’, the film industry has propelled these technologies forward at a rapid pace. But what are the other applications of mocap technologies outside of Hollywood production studios? Where does their future lie? When Microsoft introduced the Xbox Kinect onto the video-gaming scene in 2010, it unknowingly democratized motion capture by bringing this sophisticated technology into living rooms all over the world. Twenty-four million units later, there are student projects, new consumer products, art installations, and more, all designed around the so-called hacking of the Kinect (Epstein). The Kinect is not the future of motion capture intrinsically; rather, third-party developers are using it as the means to get us there.

BACKGROUND

The idea of mapping animation onto actors is nothing new. Max Fleischer is credited as the first animator to employ animation mapping in his short films. In 1915, he devised the Rotoscope, a technology that allowed animators to easily trace over frames of live-action film in order to give their hand-drawn characters more lifelike movement on screen (Lyttelton). Disney used rotoscoping in ‘Snow White and the Seven Dwarfs’ (1937), the first full-length feature film to use cel animation, and it has been used ever since. The technique was even used in the original ‘Star Wars’ trilogy to create the effect of the glowing lightsabers. Rotoscoping has been around for a long time, but alternative methods of live-action animation have rapidly developed alongside it.

Fast-forward 50 years to the coming of age of modern motion-capture technology: biomechanics labs were beginning to use computers to analyze human motion, and these studies eventually found a place for discussion in computer graphics circles. For example, it is often noted in histories of motion capture that in the early 1980s, Tom Calvert, a professor of kinesiology and computer science at Simon Fraser University, used a mechanical motion-capture system whereby he “attached potentiometers to [a] human and used the output to drive computer animated figures for choreographic studies and clinical assessment of movement abnormalities” (Sturman). Calvert and his team used the potentiometers to study knee joints specifically. The output from the analog system was converted to digital form and sent to a computer animation system, a process that closely approaches modern motion-capture systems.

These biomechanical studies paved the way for cinematic productions to feed motion-capture data directly into computers. Combined with the advances in computer graphics over the next fifteen years, motion-capture technology would finally become a feasible option for full-length feature films and large-scale productions. Nick Romano, an associate editor for the film blog ScreenCrush, summarizes the progression of mocap from medical research to the movies:

“While the video game industry would be one of the first to utilize this tech for entertainment purposes, special effects artist John Dykstra would soon take notice and decide to apply it to the Val Kilmer-led ‘Batman Forever’ (1995) to create a digital double for some of the actor’s stunts. From there, director James Cameron used motion capture in small ways in ‘Titanic’ (1997), Ridley Scott used it for ‘Gladiator’ (2000), and George Lucas used it to create Jar Jar Binks in the ‘Star Wars’ prequels (1999–2005).”

While Jar Jar Binks was widely disliked by Star Wars fans, he was still a pioneering character for both mocap and CGI. Ahmed Best was able to act alongside his fellow cast members on set, while the actual animation of Jar Jar Binks was created after the fact from Best’s performance on a sound stage (Lyttelton). ‘The Lord of the Rings: The Fellowship of the Ring’ (2001) was the first film to realize a well-liked mocap character, in the form of Andy Serkis’s Gollum.

Peter Jackson’s ‘King Kong’ (2005) was the next film to add another layer of innovation to mocap technologies through the inclusion of facial motion capture. Prior to the release of ‘King Kong’, facial animation was done in post-production without the use of markers or mocap. Animators had to rely on live-action footage as a reference for the creation of digital faces (Romano). Facial capture would play a heavy part in films moving forward.

‘Avatar’ (2009) was not just another pioneer in motion-capture cinema but a paradigm of success. James Cameron and his team used head-mounted facial-capture cameras to bring the alien Na’vi race to life while keeping them firmly rooted in humanity (Rivas). ‘Pirates of the Caribbean: Dead Man’s Chest’ (2006) is also notable for using facial capture and for collecting performance-capture data from Bill Nighy and the ensemble cast of cursed sailors on set, as opposed to in a studio later on (Romano). This was an uncommon practice at the time, but it would soon become standard in ‘The Hobbit’ trilogy and the ‘Planet of the Apes’ series.

Motion-capture technologies used to be relegated to the realm of the studio, where actors in tight lycra suits covered with ping-pong balls and surrounded by special cameras would act in isolation. The summer blockbuster ‘Dawn of the Planet of the Apes’ (2014) took performance capture a step further, with “more than 85 percent of the movie shot outside the studio, in forests near Vancouver and in various outdoor locations near New Orleans” (Perry, ‘Motion Capture Technology…’). This film represents a culmination of the recent developments in mocap technologies, and it received widespread praise for both its storyline and its special effects. Fox is even campaigning to get Andy Serkis recognized by the Academy for his performance as Caesar, which would make it the first mocap performance ever considered for an Oscar (Bricken).

The production crew for ‘Dawn of the Planet of the Apes’ used over 50 mocap cameras to capture vast outdoor scenes of simian armies. There had to be enough cameras to capture the scenes from multiple angles to prevent occlusion. Shooting outdoors was another obstacle: “a good gust of wind, or someone bumping into a camera pole, could disturb the position of a camera enough in order to require recalibration” (Perry, ‘Motion Capture Technology…’). The cameras had to be weather-proofed and repositioned each day. The production team also had to develop special body markers that could respond to changing light conditions. The cast of apes also had 48 facial markers painted on each day, which were filmed with the same kind of head-mounted cameras used in ‘Avatar’. All things considered, ‘Dawn of the Planet of the Apes’ is poised to be the benchmark against which all future mocap productions are judged.

DEFINITION

The most common setup for motion-capture systems is optical, i.e. systems that use tracking cameras (with or without markers) to record movement from different angles while location data is triangulated between them (Dent). In addition to optical systems, mocap can also be classified as mechanical, magnetic, sonic, biofeedback, electric-field, or inertial (Furniss). Motion-capture systems can also be described as either active or passive: active systems use anything from magnetic equipment to LED light markers with built-in processors that communicate precise coordinates, while passive systems use reflective markers. Active systems are surpassing passive systems in terms of accuracy and overall adoption by Hollywood, although passive optical capture remains standard for facial capture.
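The triangulation idea behind optical systems can be sketched in a few lines. The toy example below assumes a hypothetical rectified two-camera rig (the function name and all numbers are illustrative, not drawn from any real product): given a marker's pixel coordinates in both views, its depth follows from the disparity between the two images.

```python
# Minimal sketch of optical triangulation with a hypothetical rectified
# stereo pair. Real mocap systems use dozens of fully calibrated cameras;
# this only illustrates the geometric principle.

def triangulate(x_left, x_right, y, focal_px, baseline_m):
    """Return (X, Y, Z) in meters for a marker seen by both cameras.

    x_left / x_right: horizontal pixel offsets from each image center.
    y: vertical pixel offset (identical in both views after rectification).
    focal_px: focal length in pixels; baseline_m: camera separation.
    """
    disparity = x_left - x_right           # shift between views, in pixels
    if disparity <= 0:
        raise ValueError("marker must be in front of both cameras")
    z = focal_px * baseline_m / disparity  # depth shrinks as disparity grows
    x = x_left * z / focal_px              # back-project to world units
    y_world = y * z / focal_px
    return (x, y_world, z)

# A marker 2 m away, cameras 0.5 m apart, 1000 px focal length:
print(triangulate(300, 50, 100, 1000, 0.5))  # -> (0.6, 0.2, 2.0)
```

Adding more cameras at other angles serves the same role: each extra view adds constraints, so a marker occluded in one view can still be located from the others.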

Optical motion capture is the most frequently used system for movies and video games because it allows for a more traditional performance-capture workflow, i.e. actors on a studio set captured via camera. The downside of this system is that it requires a great deal of time-consuming post-production and computer processing power. For industries that do not require post-production to add layers of animation, optical motion capture has developed into an ideal tool for research purposes, such as biomechanics and industrial or military research. Optical motion capture has even evolved to the point of markerless systems, which use algorithms from match-moving software to track distinctive features such as a subject’s clothing or nose (Dent). Markerless systems are not yet accurate enough for most use cases, but they are growing in terms of both popularity and availability in consumer markets.
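The core of markerless tracking, locating a distinctive patch of the image from frame to frame, can be illustrated with a deliberately naive sketch. This exhaustive sum-of-squared-differences search over a tiny grayscale frame is my own toy illustration; production match-moving software relies on far more robust feature descriptors and search strategies.

```python
# Toy version of the feature-tracking step behind markerless capture:
# find where a small template patch (e.g. a distinctive spot on a
# subject's clothing) reappears in the next video frame.

def find_patch(frame, patch):
    """Return (row, col) of the best match of `patch` inside `frame`,
    using sum of squared differences (lower = better match)."""
    ph, pw = len(patch), len(patch[0])
    fh, fw = len(frame), len(frame[0])
    best_score, best_pos = float("inf"), (0, 0)
    for r in range(fh - ph + 1):
        for c in range(fw - pw + 1):
            ssd = sum((frame[r + i][c + j] - patch[i][j]) ** 2
                      for i in range(ph) for j in range(pw))
            if ssd < best_score:
                best_score, best_pos = ssd, (r, c)
    return best_pos

# A 4x4 "frame" containing the 2x2 patch at row 1, column 1:
frame = [[0, 0, 0, 0],
         [0, 9, 8, 0],
         [0, 7, 9, 0],
         [0, 0, 0, 0]]
patch = [[9, 8],
         [7, 9]]
print(find_patch(frame, patch))  # -> (1, 1)
```

Repeating this search on every frame yields a 2D track for the feature; combining tracks from multiple calibrated cameras recovers its motion in 3D.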

From rotoscoping to markerless capture, cinematic advances in mocap technologies help drive progress in other industries and not the other way around. Even video game mocap is at the mercy of mocap for the movies. That being said, motion-capture cinema has rapidly advanced only in the last two decades, in large part because of what is known as the “uncanny valley” theory. Sigmund Freud, and Ernst Jentsch before him, theorized about what it means for something to be “uncanny”. Freud argued that “something has to be added to what is novel and unfamiliar to make it uncanny.” It must diverge just enough from what is familiar and well-known to be disconcerting, if not disturbing.

In the 1970s, roboticist Masahiro Mori coined the phrase “the uncanny valley” to describe “a person’s response to a humanlike robot as it abruptly shifts from empathy to revulsion when it approached, but failed to attain, a lifelike appearance.” The term has been repurposed in computer graphics to describe the problem of depicting real human characters in animation. “When these creations stopped looking cartoonish and started approaching photo-realism, the characters somehow began to seem creepy rather than endearing” (Perry, ‘Digital Actors Go…’). Mori also attempted to explain this experience through a now-famous chart.

To achieve a more visceral understanding of what the uncanny valley feels like in CGI, one should reference either ‘The Lawnmower Man’ (1992), a sci-fi film that dabbles in horror as a design aesthetic more than a plot tool, or ‘Parametric Expression’ (2013), a modern short by artist Mike Pelletier that deliberately flaunts its eeriness (Brownlee). Either of these experimental films conveys the uncanny valley concept quite quickly. Early computer-generated animations built on rudimentary mocap techniques were remarkable only for their leaps in technical progress, not for any broad cultural embrace of the craft. It wasn’t until Gollum made his debut in 2001 in Peter Jackson’s ‘The Lord of the Rings’ trilogy, with visual effects by WETA Digital, that mocap passed the so-called Turing test for computer-animated characters, and the stage was set for a slew of fantastical CGI films and casts.

AUDIENCE AND USE CASES

Beyond the familiar spheres of film production and video game development, motion-capture technologies have diffused into other industries as varied as aerospace, advertising, health care, manufacturing, security, and even sports. According to BusinessWeek Staff Writer Aili McConnon,

“Lockheed Martin, the world’s largest defense contractor, has pushed motion sensing to even more exotic extremes in an effort to reduce design and manufacturing costs. To help engineers, technicians, pilots and Lockheed customers understand how the plane will perform, [Lockheed Martin] equips teams of visitors with VR headsets and suits donned with motion-capture sensors. They enter a darkened 15 ft. by 20 ft. area where 24 cameras track their every move. What the visitors ‘see’ through their head displays are the fighter prototype and lifelike avatars of one another. They can walk through the prototype, crouch down to inspect or change a part, and practice physical routines they will replicate in real-world planes many months later.”

Lockheed Martin is able to save money on large-scale physical prototypes through the use of these sorts of motion capture and virtual reality simulations.

Ford Motor Co. is also making innovative use of mocap technologies to analyze the reach and posture of people on the assembly lines to reduce risk of injury. Manufacturing ergonomics expert Allison Stephens says these mocap simulations have reduced the expected number of disability cases at some plants by as much as 80%, saving Ford tens of millions of dollars a year in disability payments (McConnon). Motion capture, when implemented by automotive companies can save more than dollars or injuries for that matter; it can save lives. “Toyota, Nissan, and others have sponsored research at Stanford investigating the expressions drivers typically have five seconds before they fall asleep. These can be detected with simple cameras installed in a steering wheel or dashboard that trigger an alarm” (McConnon).

Besides these commercial use cases, mocap also has many health-related applications. Motion capture can provide valuable data for research or analysis by doctors in a way that static images from MRI or CAT scans cannot. Bill Gilmour is the founder of BioMotion, a company seeking to spread the use of mocap technology in the medical community. Gilmour explains that when a patient complains of pain in a certain area of the body, it is more difficult to diagnose the problem without seeing the body in motion (Geiger). Analyzing a person’s gait while walking can link lower-back pain to problems in the legs, a critical relationship that analysis of static images may miss. Motion tracking has been used at the Hospital for Special Surgery in New York to help those with disorders ranging from cerebral palsy to arthritis (McConnon).

In addition to the diagnosis and alleviation of pain and disease, motion-capture technologies are also used to train doctors and rehabilitate patients. Like a vast game of ‘Operation’, mocap can facilitate surgical training simulations for several doctors at a time over distance by tracking their hands and tools. Motion tracking and capture systems can also record movements with greater precision than the human eye. The National Pitching Association makes use of this aspect of motion capture to help baseball players improve their game and avoid injuries by closely analyzing the biomechanical efficiency of their movements (McConnon).

Motion-capture technologies have also been used to create art for art’s sake. Zach Lieberman, an artist and a professor at Parsons School of Design, created a program for the Xbox Kinect using FaceTracker software for the music video for “Chase No Face” by Olga Bell. In the video, light dances over Bell’s face in a pattern that may seem random at first, “but in fact [the lights] respond to Bell’s performance, adding a unique visual element to her song” (Schnell). Chris Milk’s “The Treachery of Sanctuary” is another art piece made possible through mocap. It is essentially a large-scale interactive triptych that shapes flocks of birds in response to the movements of its audience, eventually even transforming viewers into birds as a metaphor for transcendence (Holmes). Motion capture in fine arts is a perfect vehicle for channeling human emotion and understanding into design.

STRATEGIC VALUE DRIVERS

Motion-capture technology is only just reaching maturity in terms of its use in large-scale, high-budget productions, and yet it’s been nearly a century since Max Fleischer devised the Rotoscope. What factors are at play that create a barrier to more widespread adoption? Tekla Perry, Senior Editor at IEEE Spectrum, explained,

“Producing a fully realistic digital double is still fantastically expensive and time-consuming. It’s cheaper to hire even George Clooney than it is to use computers to generate his state-of-the-art digital double. However, the expense of creating a digital double is dependent on the costs of compute power and memory, and these costs will inevitably fall.”

This observation, made in 2014, provides a strong argument for why the primary use case for motion-capture systems is still feature film or video game animation. The technologies are largely cost-prohibitive at present. It is no coincidence that some of the most expensive films ever made, e.g. ‘Avatar’, the ‘Pirates of the Caribbean’ films, the ‘Harry Potter’ series, rely heavily on CGI and motion-capture software. Why would production studios go to such great lengths financially? The answer is simple: these movies also represent some of the highest-grossing films, with ‘Avatar’ far ahead of the pack (Lyttelton).

Visual effects studios, the primary forces behind motion capture and computer-generated animation, do not always share equally in the success of high-grossing movies, though. Rhythm & Hues won numerous awards for its work, yet famously filed for bankruptcy 11 days before winning the Oscar for Visual Effects for ‘Life of Pi’ (Pulver). The 2013 Academy Awards were accompanied by a protest on the studio’s behalf and on behalf of all VFX artists. There is also artistic opposition to mocap: some artists see the technology as removing the artistry. “Pixar have never been great fans of the technique, preferring instead to let their animators use instincts to inform their art instead of raw data. The credits for 2007’s Ratatouille proudly featured the claim ‘100% Pure Animation — No Motion Capture!’” (Gray).

Ethics and politics aside, cost and computer processing power remain the greater barriers to widespread use of mocap technologies. When Microsoft released the Kinect for Xbox 360 late in 2010, all of this changed. Upon its original release, the Xbox Kinect “set a Guinness World Record for the fastest-selling consumer device ever” (Walker). The first generation of Kinect sold 24 million units, and the second generation has even more impressive technical specifications that promise to top its predecessor.

The Kinect primarily uses two cameras: an “RGB camera” that detects red, green, and blue light to help differentiate humans from backgrounds, and a camera that “transmits invisible near-infrared light and measures its ‘time of flight’ after it reflects off the objects” (Carmody). The latter is referred to as the “depth camera”, and it functions similarly to sonar. Wired writer Tim Carmody explains, “If you know how long the light takes to return, you know how far away an object is. Cast a big field, with lots of pings going back and forth at the speed of light, and you can know how far away a lot of objects are…. With this tech, Kinect can distinguish objects’ depth within 1 centimeter and their height and width within 3 mm.”
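The time-of-flight arithmetic Carmody describes is simple enough to write down: distance is the speed of light multiplied by half the round-trip time. The numbers below are illustrative back-of-the-envelope values, not actual Kinect specifications.

```python
# Back-of-the-envelope sketch of the time-of-flight principle used by
# depth cameras: a light pulse travels out and back, so distance is
# the speed of light times half the round-trip time.

C = 299_792_458.0  # speed of light, m/s

def distance_m(round_trip_s):
    """Distance to an object, given the round-trip time of a light pulse."""
    return C * round_trip_s / 2  # halved because the pulse goes out and back

def timing_resolution_s(depth_resolution_m):
    """Round-trip timing precision needed to resolve a given depth step."""
    return 2 * depth_resolution_m / C

# An object ~2 m away returns light after roughly 13.3 nanoseconds:
print(distance_m(13.34e-9))       # ≈ 2.0 m
# Resolving depth to 1 cm implies timing on the order of tens of picoseconds:
print(timing_resolution_s(0.01))  # ≈ 6.7e-11 s
```

The second number hints at why this sensing is hard: centimeter-level depth accuracy demands picosecond-scale timing, which is exactly the kind of engineering that made a living-room depth camera remarkable.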

While the Kinect is sold and marketed primarily for the Xbox 360 and now Xbox One video game consoles, its most impressive value driver is its raw hardware and software components and their potential for “hacking”. Microsoft eventually embraced the unintended iterations of the Kinect when Steve Ballmer, Microsoft’s chief executive, announced that the company “would release a version specifically meant for use outside the Xbox context and lay down formal rules permitting commercial uses for the device. A result has been a fresh wave of Kinect-centric experiments aimed squarely at the marketplace: helping Bloomingdale’s shoppers find the right size of clothing; enabling a ‘smart’ shopping cart to scan Whole Foods customers’ purchases in real time; making you better at parallel parking” (Walker). The presence of an affordable mocap technology in consumer living rooms, combined with Microsoft’s blessing for hacking, has ushered in the most important age of mocap yet: one untethered from politics and financial limits on availability, and defined by user-initiated innovation in motion-capture use cases.

PRESENT AND FUTURE BRAND ACTIVATION

Many brands are already using motion-capture technologies in the form of interactive storefronts and billboards, advertising experiences, and technological publicity stunts: Target, Adidas, Diesel, Carnival, and Starbucks are a few that quickly come to mind. A recently viral example is “The Dancing Traffic Light”, a pedestrian crossing signal in Lisbon, Portugal configured with motion capture so that the dance moves of passersby are mimicked in real time by the human figure in the traffic light (Stone). Created by Smart, makers of the Smart Car, the wildly successful stunt was part of a broader campaign to promote safety. While not all of these advertising stunts go viral, the experiences do prove more memorable than traditional, static billboards. John Payne, president of Monster Media, says of the interactive displays, “People don’t ignore the ads — they want to play with them” (McConnon).

While offline interactive experiences are a natural fit for motion-capture technologies, what Microsoft calls “natural user interface ads” are a more innovative experience. Natural user interface ads refer specifically to Kinect-friendly advertising on the Xbox 360 (or most recently the Xbox One) dashboard. NUads, as they are known, launched late in 2012 with interactive polling ads from Subway and Toyota, and they have been a resounding success so far. Ross Honey, general manager of Xbox Live Entertainment and Advertising described NUads as a “real breakthrough in TV advertising” (Yin-Poole).

Xbox’s NUads are not intrinsically important; rather, they are important as proof of concept for future use cases. These ads validate the idea that customizing an ad for a specific user interface (with deeper understanding of and intentional design for that interface) can increase interaction substantially. For example, mobile advertisements have long been of interest to advertisers, but nobody has yet solved the problem of how best to execute, track, and analyze them, with tracking as the most significant obstacle. Natural user interface advertising could use the mobile camera to track and detect motion in order to identify particular demographics of mobile users for more targeted ads. If mobile cameras were equipped with a near-infrared depth sensor in addition to a camera, similar to the Kinect technology, the possible implications for accurate user detection, tracking, and measurement would be far more robust.

Less related to brands and more related to businesses, motion capture made possible through more affordable technologies like the Kinect will allow designers of all types to analyze analog and digital products for feedback that is more accurate and quantitative. For example, designers who operate in the physical space will be able to evaluate prototypes in real time to assess the ergonomics of their designs. According to Iek van Cruyningen, head of securities at Liberates Capital Group, “Any company that creates a product used by people needs to understand how the human body moves. Motion-tracking systems and virtual simulations accelerate product development and boost productivity” (McConnon).

The same principle applies to digital products as our physical interactions with screens evolve to rely more on motion capture. It can be assumed that gesture-based interfaces will soon eclipse the mouse and the television remote as the preferred method of interacting with technology. Swipes and taps on mobile and tablet screens are both pervasive and easily understood, so it is no surprise when people mistakenly try to touch a standard desktop or laptop screen as well. These innocent mistakes are more telling than they may seem at first; they serve as proof that more intuitive interaction is imminent for all devices. How will user experience designers design and measure the usability of gesture-based systems? Motion capture will be the answer for designing and measuring the usability of future interfaces in the digital realm.

One final and important future use case for brands is simply to support innovation in this field. Brands are no longer at the forefront of innovative ideas; rather, they are often late to the so-called party. Brands and businesses do, however, have the resources and reach that individuals with ideas often lack. Independent think tanks and entrepreneurs are hacking Kinects to create robotic guide dogs that assist the blind and shopping carts that follow shoppers who use wheelchairs (Dillow). People are inventing new uses for the sophisticated yet simple technology every day. Brands do not always have to be at the forefront of this, but they can certainly position themselves as businesses that help make ideas like these happen. Jen-Hsun Huang, CEO of Nvidia, states, “In technology, if something is possible now, it will take less than 10 years to make it practical” (Perry, ‘Digital Actors Go…’). Brands are at a pivotal moment in time where they can make a serious impact by enabling entrepreneurs to enact real change in the world.

CONCLUSION

Advances in motion-capture technologies have been propelled forward primarily by Hollywood cinema. These advanced technologies have largely trickled down to other industries only when advances in computer processing power have made them less cost-prohibitive. There are advantages to using motion capture and potential use cases for nearly every business that creates a product used by people. The most significant opportunities for brands and businesses to employ motion-capture technologies in the future are for natural user interface ad tracking, gestural user experience design, and positioning themselves as supporters of innovation, for the field of mocap in particular. After nearly a century of development by remote movie studios, motion capture is finally at a moment where it is both powerful and accessible for consumers and third-party developers; brands simply need to enable innovators from all walks of life.

REFERENCES AND WORKS CITED

Bricken, Rob. “Fox Wants Andy Serkis to Win an Oscar for Planet of the Apes So Bad.” Gawker Media: IO9. 7 Nov. 2014. Web. 9 Nov. 2014. http://io9.com/fox-wants-andy-serkis-to-win-an-oscar-for-planet-of-the-1655951366

Brownlee, John. “This Video Proves the Uncanny Valley is in Hell.” Fast Company. 22 Aug. 2013. Web. 9 Nov. 2014. http://www.fastcodesign.com/1673270/this-video-proves-the-uncanny-valley-is-in-hell

Carmody, Tim. “How Motion Detection Works in Xbox Kinect.” Wired. 3 Nov. 2010. Web. 10 Nov. 2014. http://www.wired.com/2010/11/tonights-release-xbox-kinect-how-does-it-work/

Dawtrey, Adam. “Andy Serkis Plans UK Motion-Capture Studio.” The Guardian. 23 Apr. 2010. Web. 9 Nov. 2014. http://www.theguardian.com/film/2010/apr/23/andy-serkis-motion-capture-studio

Dent, Steve. “What you need to know about 3D motion capture.” Engadget. 14 Jul 2014. Web. 10 Nov. 2014. http://www.engadget.com/2014/07/14/motion-capture-explainer/

Dillow, Clay. “Kinect Provides the Seeing Eye in a Robotic Guide Dog for the Blind.” Popular Science. 9 Nov. 2011. Web. 14 Nov. 2014. http://www.popsci.com/technology/article/2011-11/kinect-provides-seeing-eye-japanese-robotic-guide-dog-blind?dom=PSC&loc=recent&lnk=6&con=video-kinect-provides-the-seeing-eye-in-a-robotic-guide-dog-for-the-blind

Dyer, Mitch. “Xbox One’s Kinect is Legitimately Awesome.” IGN. 21 May 2013. Web. 11 Nov. 2014. http://www.ign.com/articles/2013/05/22/xbox-ones-kinect-is-legitimately-awesome

Epstein, Zach. “Microsoft says Xbox 360 sales have surpassed 76 million units, Kinect sales top 24 million.” BGR. 12 Feb. 2013. Web. 14 Nov. 2014. http://bgr.com/2013/02/12/microsoft-xbox-360-sales-2013-325481/

Freud, Sigmund. “The Uncanny.” 1919. Web. 9 Nov. 2014. http://web.mit.edu/allanmc/www/freud1.pdf

Furniss, Maureen. “Motion Capture.” MIT Communication Forum. Web. 14 Nov. 2014. http://web.mit.edu/comm-forum/papers/furniss.html

Geiger, Jacob. “BioMotion brings motion capture from Hollywood to health care.” The Richmond Times-Dispatch. 26 Jun. 2012. Web. 9 Nov. 2014. http://www.timesdispatch.com/workitrichmond/news/henrico-county/article_e942133e-ca77-5de8-a62f-61775d799e1c.html

Gray, Ali. “A Brief History of Motion-Capture in the Movies.” Imagine Games Network. 11 Jul. 2014. Web. 9 Nov. 2014. http://www.ign.com/articles/2014/07/11/a-brief-history-of-motion-capture-in-the-movies

Holmes, Kevin. “Chris Milk’s ‘The Treachery of Sanctuary’ Unveiled At London’s Digital Revolution”. VICE Media: The Creator’s Project. 2 Jul. 2014. Web. 15 Nov. 2014. http://thecreatorsproject.vice.com/en_uk/blog/chris-milks-the-treachery-of-sanctuary-unveiled-at-londons-digital-revolution

Jentsch, Ernst. “On the Psychology of the Uncanny.” 1906. Trans. Roy Sellars. 1995. Web. 9 Nov. 2013. http://art3idea.psu.edu/metalepsis/texts/Jentsch_uncanny.pdf

Lee, Kevin. “This Kinect Project Uses Your Tongue as the Controller.” TechHive. 4 Apr. 2012. Web. 11 Nov. 2014. http://www.techhive.com/article/253199/this_kinect_project_uses_your_tongue_as_the_controller.html

Lyttelton, Oliver. “A Brief History of Motion-Capture, From Gollum to Caesar.” Yahoo! Movies. 10 Jul. 2014. Web. 9 Nov. 2014. https://www.yahoo.com/movies/a-brief-history-of-motion-capture-in-the-movies-91372735717.html

McConnon, Aili. “Beyond Virtual Reality: Motion-capture technology is busting out of Hollywood and into companies such as Intel, Toyota, and Lockheed Martin.” BusinessWeek. 2 Apr. 2007. Web. 9 Nov. 2014. http://biomotionlabs.com/wp-content/uploads/2011/09/BusWeek04022007.pdf

Mori, Masahiro. “The Uncanny Valley.” Trans. by Karl F. MacDorman and Norri Kageki. IEEE Spectrum. 12 Jun. 2012. Web. 9 Nov. 2014. http://spectrum.ieee.org/automaton/robotics/humanoids/the-uncanny-valley

Nosowitz, Dan. “The Motion-Sensing Banking App We’ve All Been Waiting For.” 18 Oct. 2011. Web. 14 Nov. 2014. http://www.popsci.com/technology/article/2011-10/video-groundbreaking-motion-sensing-banking-app-weve-been-waiting?dom=PSC&loc=recent&lnk=7&con=video-the-motionsensing-banking-app-weve-all-been-waiting-for

Perry, Tekla. “Digital Actors Go Beyond the Uncanny Valley.” IEEE Spectrum. 27 May 2014. Web. 9 Nov. 2014. http://spectrum.ieee.org/computing/software/digital-actors-go-beyond-the-uncanny-valley

Perry, Tekla. “Motion Capture Technology Goes Into the Wild for Dawn of the Planet of the Apes.” IEEE Spectrum. 10 Jul. 2014. Web. 10 Nov. 2014. http://spectrum.ieee.org/tech-talk/computing/software/motion-capture-technology-goes-into-the-wild-for-dawn-of-the-planet-of-the-apes

Pulver, Andrew. “Oscars protest by visual effects workers over Life of Pi.” 24 Feb. 2013. Web. 15 Nov. 2014. http://www.theguardian.com/film/2013/feb/25/oscars-protest-life-of-pi

Rivas, Jorge. “Avatar: Motion Capture Mirrors Emotions.” Discovery News. 24 Dec. 2009. Web. 15 Nov. 2014. https://www.youtube.com/watch?v=1wK1Ixr-UmM

Romano, Nick. “How’d They Do That? A Brief Visual History of Motion-Capture Performance on Film.” Screencrush. 14 Jul. 2014. Web. 9 Nov. 2014. http://screencrush.com/motion-capture-movies/

Savage, Annaliza. “How New Motion-Capture Tech Improved The Hobbit.” Wired. 7 Dec. 2012. Web. 15 Nov. 2014. http://www.wired.com/2012/12/andy-serkis-interview/

Schnell, Joshua. “Hacked Kinect Allows Singer to Really Shine.” TechHive. 27 Apr. 2012. Web. 11 Nov. 2014. http://www.techhive.com/article/254581/hacked_kinect_this_allows_singer_to_really_shine.html

Stinson, Liz. “Where Celebrities Fall in the Uncanny Valley.” Wired. 29 Nov. 2011. Web. 9 Nov. 2014. http://www.wired.com/2011/11/pl_uncanny_valley/

Stone, Avery. “Smart’s Dancing Traffic Light Is The Grooviest Way For Pedestrians To Stay Safe.” The Huffington Post. 18 Sep. 2014. Web. 15 Nov. 2014. http://www.huffingtonpost.com/2014/09/18/smart-creates-dancing-traffic-light_n_5843884.html

Sturman, David J. “A Brief History of Motion Capture for Computer Character Animation.” SIGGRAPH 94: Course 9. 13 Mar. 1999. Web. 14 Nov. 2014. https://www.siggraph.org/education/materials/HyperGraph/animation/character_animation/motion_capture/history1.htm

Walker, Rob. “Freaks, Geeks, and Microsoft: How Kinect Spawned a Commercial Ecosystem.” The New York Times. 31 May 2012. Web. 11 Nov. 2014. http://www.nytimes.com/2012/06/03/magazine/how-kinect-spawned-a-commercial-ecosystem.html?pagewanted=all

Yin-Poole, Wesley. “Microsoft Delighted with Xbox NUads, Vows Increased Investment.” EUROGAMER. 1 Sep. 2013. Web. 11 Nov. 2014. http://www.eurogamer.net/articles/2013-01-09-microsoft-delighted-with-xbox-nuads-vows-increased-investment
