Out of the office, she’s a hockey player, a coach, and a mother of two, but in the office, Lucy Keighley, President of Motion Analysis, is responsible for making sure the engine of our business is running smoothly - from sales to marketing and everything in between.
With a Master’s degree in Sport Biomechanics and early career experience working in television on the BBC production team, Lucy found a job at Motion Analysis to be the perfect match.
“When the company started in 1984, it was looking at medical applications like gait analysis, using them to track how children walk, for example. Now it’s grown into the sports arena, animation and the television world as well. So working here means I get to use both my previous experiences - in television and biomechanics. We moved into the broadcast market in 2007, and funnily enough, one of the first customers I worked with was BBC Sport. They were relocating the entire sports department, which was a huge move, and in the midst of that we installed a large system for one of the longest-running live sports shows.”
Lucy started her career at Motion Analysis back in 2003, first coming on board as an engineer. As the years went by she was able to experience various roles in the business, from demonstrations to customer support to sales, but it was the team and customers at Motion Analysis who played a big role in retaining Lucy for almost two decades.
“We are a relatively small company but it’s the people that make it what it is today. Most of our team have been working here for many years and that’s because despite being small and spread across the world, our values align and that keeps us connected. And I would say our greatest value is the relationship we have with our customers. Whether it’s developers or sales staff, they make an effort to get to know and to prioritise the needs of our customers above everything.
“In fact, if I were to define success, it would probably be based on our customers. The fact that many of our long term customers continue to choose Motion Analysis, and they trust us, is a big indicator of how we’re tracking as a company, and is a big reason for why I love working here.”
So, after 18 years - what is her advice to any new colleagues at Motion Analysis?
“Do what you enjoy doing, and ask as many questions as you can - there is something new to learn every day and there are so many facets of the business, so get involved in as much as you can.”
The mocap tech that goes into an animated performance
When we think of motion capture performances, what comes to mind is actors rolling around in front of a green screen, in a lycra suit, covered in what look like small tennis balls.
Which is not wrong. But there’s more to capturing motion than just a lycra suit.
What many people refer to as a “motion capture suit” is really just a lycra outfit that holds the markers tight to the actor’s skin while letting them move around uninhibited. But the markers attached to these suits are the real star of the show.
These retro-reflective 3D tracking dots are small spheres positioned strategically on the performer to record their real-life movements. Picture the markers as computerised puppet strings - pulling the skeleton of the character through frames that then creates animated motion.
Although traditional markers need to be attached to a suit or directly to the skin, our revolutionary BaSix system uses only six active marker rigs, which are attached to a person’s body using gloves, footstraps, a visor, and a belt, worn over everyday clothes. No suit needed!
The retro-reflective markers are then tracked by motion capture cameras. The more cameras used to do the tracking, the more complete, detailed and accurate the outcome will be.
If you’re picturing a classic SLR camera that someone would use on a photoshoot, think again. Motion Analysis’s cameras, such as the Kestrel, are used to produce marker coordinate data rather than an image. They detect only infrared or near-infrared light and are able to pass information at a much higher frame rate than a typical television camera could.
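For the technically curious, the way multiple camera views combine into a single 3D marker coordinate can be sketched with a toy triangulation. The example below is illustrative only: it assumes each calibrated camera can report a ray (an origin plus a direction) toward the detected marker, and uses just two rays. Production systems solve this across many cameras with full lens and pose calibration.

```python
# Toy triangulation: recover a 3D marker position from two camera rays.
# Each calibrated camera is assumed to report a ray (origin, direction)
# toward the detected marker; treat this purely as an illustration.

def sub(a, b): return [a[i] - b[i] for i in range(3)]
def add(a, b): return [a[i] + b[i] for i in range(3)]
def scale(a, s): return [a[i] * s for i in range(3)]
def dot(a, b): return sum(a[i] * b[i] for i in range(3))

def triangulate(o1, d1, o2, d2):
    """Midpoint of the closest points between two rays o + t*d."""
    w = sub(o1, o2)
    a, b, c = dot(d1, d1), dot(d1, d2), dot(d2, d2)
    d, e = dot(d1, w), dot(d2, w)
    denom = a * c - b * b               # near zero if the rays are parallel
    s = (b * e - c * d) / denom
    t = (a * e - b * d) / denom
    p1 = add(o1, scale(d1, s))          # closest point on ray 1
    p2 = add(o2, scale(d2, t))          # closest point on ray 2
    return scale(add(p1, p2), 0.5)

# Two cameras viewing a marker that sits at (0, 1, 2):
print(triangulate([-3.0, 1.0, 2.0], [1.0, 0.0, 0.0],   # camera looking along +x
                  [0.0, 1.0, -2.0], [0.0, 0.0, 1.0]))  # camera looking along +z
# → [0.0, 1.0, 2.0]
```

With more cameras, the same idea becomes a least-squares fit over all the rays, so noisy or occluded views get averaged out - which is why denser camera setups yield more complete and accurate tracking.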
An animation studio, game maker or filmmaker will use professional 3D animation software – Autodesk’s Maya is one of the more popular options – which provides all the modeling, rendering, simulation, texturing, and animation tools needed once motion is captured.
Before tracking movement for animation, animators need to have a basic skeleton mapped out for the character they are creating. This skeleton will help them to determine how many markers they need to use, and what range of movement they need to track. For example, an acrobatic dancer who is going to be doing backflips will require more markers than a rigid-limbed robot that stomps around.
The cameras and markers capture the motion and the data driving the character’s skeleton-rig is sent back to the animation program where it’s transformed with fur, clothing, or skin.
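To get a feel for how marker coordinates become skeleton data, here is a toy Python sketch that estimates a joint centre as the midpoint of two markers placed on either side of the joint. The marker names and coordinates are hypothetical examples, not an actual Motion Analysis marker set or file format.

```python
# Toy example: a joint centre estimated as the midpoint of two markers
# bracketing the joint (e.g. inner and outer elbow markers).
# Positions are 3D coordinates in metres; all names are hypothetical.

def joint_centre(marker_a, marker_b):
    """Midpoint of two 3D marker positions."""
    return [(marker_a[i] + marker_b[i]) / 2 for i in range(3)]

# One captured frame of a (made-up) elbow marker pair:
frame = {
    "elbow_lateral": [0.5, 1.0, 0.25],
    "elbow_medial":  [0.25, 1.0, 0.0],
}
elbow = joint_centre(frame["elbow_lateral"], frame["elbow_medial"])
print(elbow)  # → [0.375, 1.0, 0.125]
```

Real solvers fit a whole articulated skeleton to all the markers at once, but the principle is the same: every joint’s position in every frame is derived from the surrounding marker coordinates.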
Our Cortex system is capable of solving the skeletons of any structure with any number of segments, including bipeds, quadrupeds, props, facial animation and more.
Because most humanoid characters have similar skeletons and move in similar ways, it’s possible to develop marker sets which can be used on a number of skeletons.
Our BaSix Go software has a built-in, constrained and tracked human skeleton at its core, which works for almost all humanoid characters. The six active markers strapped to the performer’s waist, feet, hands and head are enough to track a human’s motion very accurately and precisely. Then, within our software (or in the receiving package), this rig can be mapped to the creator’s humanoid skeleton.
Having this built-in solver-skeleton that’s ready to be tracked means our BaSix system setup time is minimal compared to other traditional mocap systems. You simply need to walk into the studio once cameras are set up, strap on your six markers, stand in a “T” pose, press “reset skeleton” in the software, and voila - you’re tracking movement and data is being streamed live into your animation package in real-time, ready to be recorded.
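Conceptually, the “reset skeleton” step can be pictured as storing a reference pose and measuring every later frame against it. The sketch below is a hypothetical Python illustration of that idea, not the BaSix Go API; the marker names and data layout are assumptions.

```python
# Hypothetical illustration of T-pose calibration: store the six marker
# positions captured during the T-pose, then report each later frame as
# per-marker offsets from that reference. Not the BaSix Go API.

MARKERS = ["head", "waist", "left_hand", "right_hand", "left_foot", "right_foot"]

class PoseTracker:
    def __init__(self):
        self.reference = None  # filled in by reset_skeleton()

    def reset_skeleton(self, t_pose_frame):
        """Capture the T-pose as the reference pose."""
        self.reference = {m: list(t_pose_frame[m]) for m in MARKERS}

    def offsets(self, frame):
        """Displacement of each marker from its T-pose position."""
        return {m: [frame[m][i] - self.reference[m][i] for i in range(3)]
                for m in MARKERS}

tracker = PoseTracker()
tracker.reset_skeleton({m: [0.0, 0.0, 0.0] for m in MARKERS})  # performer holds a "T"
live = {m: [0.0, 0.0, 0.0] for m in MARKERS}
live["right_hand"] = [0.1, -0.2, 0.0]                          # the hand has moved
print(tracker.offsets(live)["right_hand"])  # → [0.1, -0.2, 0.0]
```

In practice the solver maps those offsets onto the built-in humanoid skeleton and streams the result to the animation package, but the calibration idea is the same.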
A day in the life of a Motion Analysis Software Engineer
There are many great software and computer engineers who have left a mark in history: from Larry Page, Sergey Brin and Eric Schmidt, the brains behind Google; to James Gosling, the creator of the Java programming language.
It’s no surprise then that as technology has progressed, software engineering has become a very sought-after career path. For Greg Hultberg, who has been working as a Software Engineer for Motion Analysis for the past three years, the broad scope of the field is a big drawcard - especially within the motion capture industry.
“There are so many nooks and crannies for us software engineers to explore. From biomechanics to manufacturing to animation, there’s always someone working on a fascinating project and there’s always an opportunity to develop new and advanced features. I specifically work on implementing software features, fixing software bugs, and creating new software products. But at the end of the day, the most important aspect of my job is making sure that the work I do is of the highest quality and meets the needs of our users.”
But how do you respond when a pandemic hits the planet and you’re forced to try to operate at full capacity from home?
“I am very grateful that I am able to work remotely and am enjoying it so far. However, I do believe in work-life balance and so have established a few rules for myself. I make sure to take a break every couple of hours to play with my dogs, I don’t work on the weekends and I set strict start and end times for the day. I am also grateful that Motion Analysis offers a flexible work schedule with an adequate amount of time off.”
While most of us struggled to get a good Wi-Fi connection, or set up something that resembled a home office, Greg was using his computer science skills to set up a mini mocap system in his home and remote into the large Motion Analysis system as needed. This speaks to his dedication to delivering high-quality work and making sure he can continue to grow his knowledge in the field.
“I feel like an important and needed part of the company and I can see how my contributions are appreciated by our clients, but I also know I have a lot more to learn. By setting myself up with a mini mocap system at home, I’m able to keep developing my engineering skills and hope to be as well-versed in motion capture and software engineering as my colleagues one day.”
But in order to do this?
“Be patient”, advises Greg. “I remind myself that I am working with highly complex software and hardware and that although the problems that arise are going to be tough, the real reward lies in solving them.”
Who knows, perhaps that posture of patience and commitment to keep growing his skill set will be what lands Greg in the history pages alongside the likes of Larry Page and James Gosling one day.
Why motion capture actors should get nominated for Academy Awards
When an actor is nominated for an Academy Award, it’s because their performance was noteworthy, a standout, something that would leave a legacy in the world of film. They get recognition for the emotion and realism they bring to their character; the evidence of effort they have put into understanding and becoming their character; and the impact their performance has on both the movie and the audience watching it.
So shouldn’t motion capture actors be eligible for the same recognition? It’s a question that seems to be increasing in volume as Hollywood churns out more and more animated and live-action movies which require actors to don mocap suits in order to bring their characters to life. Sure, these films may get recognized in a “Best Visual Effects” category, but the actors should be given a chance to win in the “Best Actor” and “Best Actress” categories too.
Performance capture is not the future of acting, but just a new form of acting
According to an article written for Deadline by James Franco, who starred opposite Serkis’s Caesar in Rise of the Planet of the Apes, if Oscars are handed out for an actor's “performance”, then his co-star’s brilliant portrayal of the endearing-yet-terrifying primate should have been considered too.
“In acting school I was taught to work off my co-stars, not to act but react and that was how I would achieve unexpected results, not by planning a performance, but by allowing it to arise from the dynamic between actors, and on The Rise of the Planet of the Apes that’s exactly what I was able to do opposite Andy as Caesar. And Andy got to do the same because every gesture, every facial expression, every sound he made was captured, his performance was captured.”
He believes many fear performance capture may make traditional acting obsolete but, in reality, motion capture just gives actors an opportunity to play characters they may have only been able to give voice-overs to in the past.
Motion capture actors are still actors, they’re just wearing “digital makeup”
Cast your mind back to Charlize Theron in Monster (2003) or Heath Ledger in The Dark Knight (2008).
Both actors were decked out in heavy makeup to transform them into the characters they were playing, making them almost unrecognizable from their usual selves. Both actors won an Oscar for their incredible and convincing performances. It was these performances, along with their makeup, that made the audience believe they were their character.
This is no different to what is required by motion capture actors. Instead of blush and eyeliner, their “makeup” is rendered digitally and their performance is captured using 3D cameras and mocap suits. But they still have to make the audience believe in their character, and the digital makeup is not the only thing doing that - their performance plays a pivotal role.
Which is why, despite him being a CGI ape, we the audience, fall in love with Caesar. We’re terrified of him and we root for him. Because we believe in him. And the only reason we believe in Caesar is because of the performance delivered by his mocap actor counterpart - Andy Serkis.
Merriam-Webster’s definition of acting agrees, defining it as “the art or practice of representing a character on a stage or before cameras.”
Some people have argued for the Oscars to develop a separate awards category for motion capture performances. But based on the above definition, the already-existing awards for “Best Actor”, “Best Actress”, “Best Supporting Actor” and “Best Supporting Actress” should apply to motion capture actors too, who are using voice, expression, mannerisms, and movement to bring a character to life - just like traditional actors do.
To explore the cutting-edge Motion Analysis Tech that is used to bring the performances of motion capture actors to the big screen, visit our website here.
Our 10 favourite motion capture performances on the big screen
We often find ourselves geeking out over the motion capture techniques and technology being used in big blockbuster hits (especially when it’s our own systems), but we also know that in order for that tech to contribute to an impressive film, it has to go hand in hand with committed and skilled performers. So, for a change, we’re not going to speak about the mocap systems that go into making a movie, but rather the performances needed in order to make a film’s visual effects successful. Here are our top ten motion capture performances of all time:
Willem Dafoe as Tars Tarkas in John Carter (2012)
The movie itself may not have been a box office success, but we have to give props to the motion capture performance from Willem Dafoe. In order to portray the 7-foot-tall creature known as Tars Tarkas, Dafoe sometimes had to film in his tracking suit whilst balancing on stilts! Now that’s what we call mocap commitment.
Andy Serkis as King Kong in King Kong (2005)
It would be mocap heresy to not include Andy Serkis multiple times on a list of the best motion capture movie performances. In King Kong, Serkis walked away with several film critic awards for his ability to bring to life the fierce, primal nature of Kong, while still conveying the gentler, more human characteristics of the creature, using 132 retro-reflective markers. As he says in an interview with the Guardian, “King Kong was the epiphany. It was like: you can now do anything.”
Ray Winstone as Beowulf in Beowulf (2007)
Robert Zemeckis’ movie The Polar Express is known as one of the first movies ever made entirely with performance capture technology, according to the motion capture history books. But it was the mocap transformation of a 50-something Ray Winstone into the towering, young hero called Beowulf that really impressed audiences. Thanks to mocap tech, Winstone was able to focus on bringing an iconic leading man to life through performance, without being restricted by his physical appearance.
Mark Rylance as the BFG in The BFG (2016)
It must be quite a daunting task to take on such a famous childhood character, but if anyone needed proof that motion capture technology is able to realistically convey humanity, they need only look to Mark Rylance’s performance of everyone’s favourite giant. Rylance has to shift his performance throughout the film to encapsulate the terrifying appearance of a 24-foot being, whilst also evoking the gentle compassion we love so much about the BFG.
Idris Elba as Shere Khan in The Jungle Book (2016)
Taking on a modernized, CGI version of such a beloved tale was a brave feat - as the saying goes, “If it ain’t broke, don’t fix it”. But Director Jon Favreau’s 2016 version of The Jungle Book is a CGI masterpiece that arguably elevated the story to new heights. All the mocap performances were applause-worthy, but the sinister performance conveyed by actor Idris Elba, as the infamous Shere Khan, deserves its own moment in the spotlight.
Bill Nighy as Davy Jones in Pirates of the Caribbean: At World's End (2007)
While many motion capture performances are shot entirely in-studio, Bill Nighy recorded his captivating performance of the fearsome Captain Davy Jones while on set, working alongside his castmates. This was to help him get more into character, and we’ve got to say - it clearly worked - because the tentacle-covered villain looks strikingly real.
Andy Serkis as Caesar in War for the Planet of the Apes (2017)
Once again we look at a legendary motion capture performance from Serkis - this time as the intelligent primate, Caesar, in the final installment of the Planet of the Apes trilogy.
Serkis has done this character more than justice in the previous two films, but in ‘War’, his deeply evocative and emotional portrayal of Caesar was thought by many to be deserving of an Oscar.
Zoe Saldana as Neytiri in Avatar (2009)
Director James Cameron famously waited years to bring his vision for Avatar to life, because he believed the tech was still not where it needed to be to do his film justice. And thank goodness he did wait, because due to a specially-designed mocap stage, and her brilliant acting chops, Zoe Saldana was able to bring us a powerful and touching performance of her Na’vi character, Neytiri.
Benedict Cumberbatch in The Hobbit: The Desolation of Smaug (2013)
Cumberbatch’s performance is one for the motion capture history books. From crawling around on his belly, covered in mocap markers, to growling into the cameras and distorting his face, his commitment to his character’s specific mannerisms is what brought to life the fearsome dragon, Smaug.
Andy Serkis as Gollum in The Lord of the Rings trilogy (2001-2003)
Naturally, our top spot for motion capture performances in the movies goes to Andy Serkis for his portrayal of Gollum (and it’s not just because he’s wearing one of our mocap suits). Prior to Serkis’ performance in the second LOTR movie (The Two Towers), motion capture had always been performed in the studio, but for this movie, Serkis was able to shoot his performance on location, interacting and reacting to the other actors in the scene. This made it one of the most revolutionary mocap performances of all time, winning the film an Oscar for Best Visual Effects.
BaSix, Classic, or Hybrid? Invest in the best motion capture system for you.
As the use of motion capture systems becomes the norm in various industries, from gaming to biomechanics, to film, the software and types of systems available are changing.
Perhaps you’re running a small animation studio and need a system that’s uncomplicated to set up and cost effective. The best system for you may be quite different from the best system for someone working in biomechanics who needs an unlimited range of cameras.
The benefits of motion capture without a complicated setup...or a suit
Our BaSix Motion Capture System is a great lightweight and affordable option for animation studios, game developers, and previz purposes, and the best part is that it’s suitless!
For under $20K you can get your hands on a system that will help you differentiate your studio and create high-quality 3D characters quickly, with the use of only six BaSix active markers per subject. The BaSix Go software is easy-to-use and integrates with all major animation packages, including Maya, Motion Builder, Unreal and Unity.
If you’re looking to simplify and speed up the animation and previz process, with software that’s up and running in under one minute and no mocap suit required, this is the system for you.
A full-spec system for larger-scale projects and precise calibration
Whether you want to:
design digital prototypes;
capture and identify complex movements of subjects and objects;
separate data processing tasks such as tracking, identifying, skeleton solving, and retargeting; or
track broadcast studio cameras in real-time to create live VR or AR sets,
our Classic Motion Capture System, which uses Cortex software, our most advanced mocap software yet, is the right choice for you. This premium system is internationally recognized and trusted by industry leaders - from Apple and Google to Ford and BBC - and will equip you with a complete set of the tools needed for motion tracking and editing.
Incorporate the benefits of BaSix into the classic system you already have
Perhaps you already own the classic system, but the idea of suitless markers, or the effortlessness of BaSix Go software, has piqued your interest.
The great news is that you can also design your own Hybrid Motion Capture System, which can run off either Cortex software with user-defined models or BaSix Go software with BaSix Go models, and works with both the BaSix active markers and the reflective passive markers. This would involve our Kestrel or Raptor-12 cameras being used as part of the BaSix system.
Check out this helpful diagram below which sums up all the features:
So, whether you’re looking for a quick, entry-level mocap system or you need something better equipped for high spec motion capture and analysis, we’ve got something to meet your needs.
James Karageorgiou’s secret to success as the Motion Analysis Sales Manager
Typically when you think of anyone in sales, you may conjure up one of two examples. Either you’re receiving a phone call from someone monotonously rattling off the benefits of a new product, to the point where they sound like they wouldn’t even buy the product themselves, or you’re listening to an over-the-top excited person, who is trying to make funeral cover sound like it’s the birthday gift you’ve always wanted.
Well, James Karageorgiou, our new Sales Manager from Louisville, Kentucky, has a different approach in mind. For him, sales should involve consistency, good communication, follow-up, and bringing a more authentic element to the relationship between him and his clients.
“People want to be treated like people,” says James. “If you can connect with your clients on a human level, and maintain professionalism, success stories start to pile up - not as a result of “sales tactics” but rather as a result of being genuine, transparent, listening well, serving others, and treating others well.”
James’s number one priority as a Sales Manager is understanding what his clients’ immediate needs and goals are so that he can offer them the best customer service. A process he believes starts with the simple act of listening.
“The most important aspect of my job is listening to our clients. Once I can paint a detailed picture of what my clients are trying to accomplish, we can then transition towards the consultation phase; where I apply the tools, resources, and solutions we offer, to meet clients' needs and goals. Motion Capture applications are not a “one size fits all” solution. My job during the consultative process is to ensure the client is getting exactly what they need in order to accomplish their goals.”
James, an outdoorsy sportsman with a love for everything from cooking and photography to hunting and playing video games, joined Motion Analysis in the midst of a global pandemic and a newly-transitioned remote working environment. And, despite having arrived at the company during such a tumultuous time, James has loved his interactions with the Motion Analysis team.
“It’s such a positive and productive environment, and it’s always great to work in an industry that involves all of your favorite things: tech, animation, sports, science, research, and imaging!
One lesson I’ve learned from the pandemic is that, behind every unforeseen circumstance, lies the opportunity for growth. You experience growth as a result of the courage required to push yourself incrementally to new levels, and I’m grateful to work at a company that makes room for that growth to happen.”
This is the second in our “Meet the Team” series, introducing you to all the incredible people who make up the Motion Analysis family.
Motion capture for animation: the fascinating history behind the movies we know today
One of the most memorable cinematic examples of motion capture - or mocap for short - is Andy Serkis’s portrayal of Sméagol ‘Gollum’ in The Lord of the Rings. But the revolutionary technique didn’t begin on the great adventure to Mordor and it certainly didn’t end there. In fact, mocap has been around since the 19th century, predating cinema. According to Goodbye Kansas’ podcast, Yellow Brick Road, motion capture may have even been a catalyst to the creation of cinema.
It all started with a man on a horse
A man by the name of Eadweard Muybridge brought us what could be defined as one of the earliest examples of motion capture. Muybridge, a photographer, bridged the gap between still photography and recorded movement when he came up with a way to take photographs of a moving subject - a horse on a race track - in rapid succession. These images not only enabled viewers to analyze the horse’s movements and position, but actually created a video sequence of the horse’s motion, giving the world the first moving image.
Muybridge’s work may have laid the foundations not only for the motion capture we know today, but also for the birth of cinema, inspiring and influencing generations of others in the industry.
Motion capture for animation was brought to life with a clown
In 1915, animator Max Fleischer (known for shows like Betty Boop and Popeye), invented rotoscoping (a technique that could produce realistic movement of an animated character by using live-action film footage to paint over each frame). He used footage of his brother, dressed in a clown costume, dancing on the roof, and then traced that footage, frame by frame, onto the animation of Koko the Clown.
This would be the start of a new era of motion capture and animation, especially once it caught the eye of Walt Disney.
Enter Snow White - the first full-length cel-animated feature film - which used rotoscoping to bring the characters to life. Many of the recorded movements that were traced onto the animated characters in Snow White were reused on other Disney classics, which is why, if you look really carefully, you may see the same dance routine in a number of Disney animations.
The first motion capture suit is developed
A lot was going down in the world in the 1950s. While America and the Soviet Union were embroiled in the Cold War, both on a mission to get to the moon first, animator Lee Harrison III was in the process of developing the world’s first mocap suit, which could record and animate an actor’s movements in real-time. Potentiometers attached to the bodysuit picked up any movements and translated them into rough animation on a monitor. Within two decades, animators had improved the bodysuits, lining them with active markers and using large cameras to track the movements, which produced digital animations that were far more detailed and accurate.
Sinbad: Beyond The Veil Of Mists was amongst the earliest animated films made exclusively with motion capture, using mocap suits that were able to translate movement into 3D animations.
Revolutionary characters in the history of motion capture
The development and evolution of motion capture for animation brought with it groundbreaking characters, such as the fully computer-generated Jar Jar Binks, played by Ahmed Best, in Star Wars: The Phantom Menace, and, of course, the Lord of the Rings’ Gollum.
Whilst Jar Jar Binks was one of the first main actor-driven CG characters in a feature film, a few years later, Serkis’s Gollum made great strides into new mocap territory too, specifically with his appearance in the second Lord of the Rings movie, The Two Towers.
A technique that surpassed rotoscoping was used to bring this character to life, and that technique is the motion capture - or performance capture - we know today. For this evolved version of mocap to work, Andy Serkis had to be kitted out in a mocap suit, while special cameras recorded not only his crawling, hopping, and twitching movements, but also his wide-eyed and scowling facial expressions.
The motion capture of Gollum in the second movie was so revolutionary because it was the first time they were able to shoot the mocap on location. Serkis was able to interact and respond to other actors in the scene instead of being filmed separately in a studio after the live-action filming.
Waiting a decade for motion capture to catch up
The early 2000s brought more developments in motion capture for movies, inspiring director Robert Zemeckis to use the newly evolved tech to create The Polar Express, one of the first movies ever made entirely with performance capture technology. Although this tech had been used previously in The Lord of the Rings and Star Wars, for example, it had never been used to create an entire film.
While this evolution of motion capture was taking place, and certainly breaking new ground, filmmaker James Cameron was patiently waiting because he believed the tech was still not where it needed to be in order to bring his vision of Avatar to life. You remember the visually mind-blowing world and characters of Avatar right? Well, Cameron felt that the mocap technology of the early 2000s was not going to do that world justice. And so he waited for the tech to catch up to his vision. By the time that it did, he had pioneered a “virtual camera” which enabled him to watch the CGI versions of the actors, as their performance was being captured, streamed live on a monitor, within the digital environment of Pandora. This took motion capture tech to a whole new level, beyond just the suits and the set, and earned Cameron three Academy Awards.
The future of motion capture for animation
Following the success of Avatar, mocap has continued to evolve. Now there are many different kinds of motion capture for filmmakers to use, from marker-based systems that track physical markers on the actors, to markerless systems that use software to track an actor’s movement by identifying specific features on the actor (anything from their mouth to a piece of clothing). Studios such as Centroid Motion Capture and Goodbye Kansas have an impressive portfolio of productions that have successfully used motion capture - from The Walking Dead (Goodbye Kansas), to Pacific Rim: Uprising (Centroid Motion Capture).
But the future of motion capture is markerless, and with the presence of AI and quantum computing, that vision is becoming increasingly possible. This will mean fewer cameras required, greater flexibility in terms of the space that is used, and a much faster process.
And, with the motion capture industry expected to be a $266 million industry by 2025, according to the Global Forecast on Research and Markets, the development of markerless mocap is very much on the near horizon. But perhaps the future of mocap is arriving in other ways.
Lightweight mocap for fast 3D character animation
Motion Analysis has developed a lightweight suitless mocap system called the BaSix system, an innovative next step in the realm of optical motion capture. The system enables users to select their animated character, equip the BaSix active markers, and then stream live animation data directly to their animation package, all in under a minute.
The BaSix motion capture system can track one performer and ten props, two performers and four props, or 16 props.
What does it cost?
The BaSix system costs under $20,000.
If you already have our Kestrel or Raptor cameras and reflective markers, you can purchase the software, BaSix Go, on its own on an annual license plan for $4,000.
How much space does it require?
BaSix is intended for small studios or office spaces. We recommend an area that is around five by five meters in size.
Does Motion Analysis supply lighting and framing rigs?
We do not supply the lighting and framing rigs; however, we will be able to assist in figuring out what you need and the best people to talk to. The system is supplied with ball-joint tripod heads and clamps, which can readily be wall-mounted.
Can I use a custom skeleton rig?
The BaSix Go software contains a range of skeleton rigs, however, custom models are available on request.
Which software tools and engines is it compatible with?
BaSix Go integrates with Maya, MotionBuilder, VizRT, Unity, Unreal Engine, and more. We constantly update our plugins to work with the latest software releases.
Is it compatible with Microsoft Windows and Mac OSX?
The software is compatible with Windows 10.
What are the system requirements?
We recommend a dedicated professional workstation system, for example, the HP Z420.
Does it include support?
Our software is designed with ease in mind – anyone will be able to use it within minutes and a simple Quick Start guide is included. We also offer additional support via telephone and email.
From intern to VP: Two decades of experience at Motion Analysis
When he’s not camping in the Eldorado National Forest, brewing his own beer, or building off-road vehicles, Phil Hagerman serves diligently as the Vice President of Customer Service for Motion Analysis in California - a role he has worked hard to achieve over the past 22 years.
Yes, you heard right - Phil has been working for Motion Analysis for over two decades, and he’s been able to experience invaluable career growth during his time here.
“I started at Motion Analysis Corporation in 1998 as an Electronic Technician with no experience, only schooling”, says Phil. “I was first offered a 4-month-long internship and before long was hired full-time. Within a few years, after demonstrating my aptitude and understanding of the system’s hardware, I got promoted to the role of Customer Service/Applications Engineer, and several years after that I was offered the chance to work in Sales.”
Despite not having any formal sales training, Phil thrived in his sales position but knew that his true calling was in customer support services.
“When the opportunity arose, I threw my name in the ring and was then offered my current role as VP of Customer Service. I truly enjoy this role, the people I work with, and the product and services we provide. The technology keeps me interested, and the relationships I have built with my customers over my many years of working at Motion Analysis are what keep me motivated in my role.”
Even after 22 years in the business, Phil believes that his current role still keeps him on his toes and learning something new every day, while also making him feel that he is a valuable part of the company.
“Every day I do something a little different”, he says. “It’s never the same monotonous task day in and day out. And it’s great to work in an environment where your contributions and input are valued - where you don’t feel like you are just another number in a factory.”
When the pandemic hit the globe, many industries and companies struggled to adjust and transition to a remote working environment, but the customer service team at Motion Analysis was able to streamline the process to make it easier on its employees. For Phil’s team, providing services remotely was not an unfamiliar concept.
“We’ve provided resources for remote troubleshooting and training for many years. So, although it was difficult not being able to meet up with customers in person during certain instances where we usually would have, we are generally quite used to working ‘on the road’.”
His advice for teams who have struggled to adapt to the changes this pandemic has brought?
“Set goals but ensure they are flexible. And listen to others around you for ideas or insights.”
This is the first in our “Meet the Team” series, introducing you to all the incredible people who make up the Motion Analysis family.