General Motors 2016-2020+

In January of 2016, General Motors (GM) teamed up first with Lyft, then Cruise Automation, and later Honda to build the world’s first mass-produced autonomous shuttle.

 

My role

I cross-functionally led, from scratch, the corporate experience strategy for autonomous vehicles (AVs) / shuttles, including the experience vision and product execution of GM’s first AV shuttle, the Cruise Origin. I did this by sifting through the noise of market research and trend analysis, leveraging Design Thinking to find the most relevant signal of problems to solve and to ask the best questions, then concepting, prototype building, and user testing. This allowed us to run many iterations and sprints, informing every square inch of the vehicle inside and out, plus the remote experience. I also coordinated tech-enabler planning for several feature sets, collaborating with 10 technology domain owners.

There are four basic human needs as they relate to a successful autonomous shuttle experience [be in control, be informed, be secure, be safe].
— Co-research with Lextant

Four main Customer Categories

I guided the team to focus on four main use cases that had highly differentiated sub-needs with overlapping core needs. These insights came from customer interviews and immersion research in Michigan and San Francisco. *Be safe is the foundational need everything else was built upon.

Core need breakdown by customer category:

  • Business traveler – Be in control … Ease of cargo placement, in-cabin productivity

  • Young family – Be secure … Remote family communication

  • Socialite – Be secure … Cleanliness, natural socializing

  • Daily commuter – Be informed … Efficient pickup

From problems to solve > to Jobs to be done

In framing the problem, I led a cross-functional team of 35+ (core team of 6) to narrow down the relevant challenges to solve by asking deeper “job to be done” questions. I stayed in close contact with executive champions who upsold our concepts and progress to senior corporate leaders.

I also helped them define the problem from the point of view (POV) of target users leading into solutions.

  • These insights are confidential to GM. I have written on LinkedIn about this process and how to translate insights into the Jobs to Be Done framework.

I led dozens of iterations of the core and peripheral experiences, driving basic and desired functionality for the fused digital and physical feature sets.

Mapping > Concepting > Building > Testing

I co-led 5 vehicle development cycles with dozens of mini concepting iterations focused on the following:

  • Seating arrangement

  • Finding and identifying the vehicle (ride hailing)

  • Entering and exiting the vehicle

  • Initiating the ride

  • Viewing trip status and ways of building trust with riders

  • Feeling safe in and around the vehicle

  • Seating and other comfort features

  • Reconfigurability of interior

Continuous Building to Test & Learn

Notes: A wood-and-foam seating buck was created for each of the seating configurations shown. (Images of the core functionality testing are not available.)

Having a tight relationship with the shops was essential to producing low- to mid-fidelity vehicles & experiences.

The faster we made the prototypes, the faster we learned what worked and didn’t work. This drove richer rounds of ideation. We hit the shops after each testing phase.

Tech mapping & Product Management

I worked with GM’s Advanced Engineering leaders to build a technology mapping framework that would link technology enablers back to customer / human needs. The framework looked similar to the illustration here. I assigned a color code to the time readiness of each enabler:

  • Green for production-ready, ‘off the shelf’ technology enablers

  • Blue for enablers that still need validation

  • Yellow & orange for enablers that are still in R&D phases

This format was adopted corporate-wide and may still be in use 7 years later. It allows everyone from product managers to senior executives to make decisions quickly and communicate clearly to cross-functional teams which technology enablers are available, which need further refinement, and which still need investigation and might not make it into a program.
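A framework like this can be sketched as a simple readiness mapping. This is a hypothetical illustration only; the enabler names, the needs they map to, and the data shape are assumptions, not GM's actual framework:

```python
from enum import Enum

class Readiness(Enum):
    """Color-coded time readiness of a technology enabler."""
    GREEN = "production ready (off the shelf)"
    BLUE = "needs validation"
    YELLOW = "early R&D"
    ORANGE = "exploratory R&D"

# Hypothetical enablers, each linked back to the customer need it serves.
enablers = {
    "exterior digital signage": {"need": "be informed", "readiness": Readiness.GREEN},
    "overhead touch displays": {"need": "be in control", "readiness": Readiness.BLUE},
    "body-panel AR display": {"need": "be informed", "readiness": Readiness.ORANGE},
}

def production_ready(enablers):
    """List the enablers a program can count on today (green)."""
    return [name for name, e in enablers.items()
            if e["readiness"] is Readiness.GREEN]

print(production_ready(enablers))  # ['exterior digital signage']
```

The value of the color code is exactly this kind of filtering: a product manager can instantly separate what ships now from what is still a research bet.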

Configuration based on COVID-19 requirements

Fund Raising through Storytelling

Toward the later end of the program, GM needed to raise billions of dollars to fund the further development of Cruise Automation and the Cruise Origin. Senior leaders looked to me in Advanced Design to tell an effective story behind each of the targeted users, building empathy and framing the market opportunity within the context of the user.
I crafted similar imagery into a story arc, as well as multiple active-participant and passive virtual driving experiences, where investors could sit in a crafted vehicle wearing a VR & MR headset and be totally immersed in the full end-to-end experience (pickup, ride initiation, travel, and drop-off).

  • With my crafted VR & MR and physically immersive experiences, I helped GM raise $2.75 billion for our autonomous vehicle programs.

Validation Research

Product and experience validation is a vital step to understanding if preproduction features still hold value in the eyes of the customer and meet the desired functional output.

There are many steps and supporting features within the end-to-end shared autonomous experience. The team tested each step to find out which elements added value, or confusion/pain for the user.

Each phase of testing allowed us to refine our feature sets and settings, and sometimes pivot direction according to budget and timing.

My team worked closely with Honda Engineering starting before the preproduction phase. We worked together on feature refinement.

  • We found that the touch displays in the upper middle of the vehicle’s headliner still had value if people could easily reach them. The interaction took some effort to raise the arm above the head, yet it allowed for an individual rider experience without a personal device.

  • The simpler experience of this screen (shown in the upper middle of the image above) is driven to the individual rider’s device. Plus, if there is a middle rider on the bench seat, they would not have an upper display with which to interact.

  • ** Many iteration images are not available.

I started looking at the experience prior to picking up the phone, at the moment the customer has the need to move from A to B.

Building trust with a Robotaxi

Exterior Human-Robot Interaction (HRI) and communication was deeply studied and essential to helping riders understand which vehicle pertained to which customer. We explored mirroring movement and AR augmentation for vehicle-to-rider communication and trust building (keeping in mind the effects of the uncanny valley on the human psyche). Many concepts and prototypes were explored using spatial design and multimodal interaction methods; unfortunately, many were too costly to produce at scale. Even though the technology is not there yet for interactive glass or body panels, I can see viable methods to bring these to market in the years to come.

The white digital ‘27’ is a display used for symbols, numbers, and letters.

The cargo concept has a lot of merit, allowing for seamless curbside pickup of parcels, perishable goods, or other items. The items would be accessed in separate secured compartments. This was not slotted for production, yet it could make it back into the program.

Taking the Driver out of the Equation

Opening, closing, starting, and/or stopping the ride were complex parts of the automated process. It seems easy on paper, but when the human driver is taken out of the equation, it is mind-boggling how many subtle interactions and decisions the passenger must make in both solo and mixed social settings.

The locations for the ‘start ride’ button and the ‘open / close’ door buttons were part of this complex problem, solved by simple solutions. They had to be visible and within reach. Placing them on the door initially made the most sense, but with the movement of the open door they became difficult to see and access.

Should these be part of a digital-only experience on one device? Who decides when to start / stop the ride if there are multiple separate riders going to different locations? After many prototyped experiences and interactions, we landed on something that seemed logical and needed only slight adjustments over time.

Individual vs communal experience

How do you personalize the experience and build trust with each rider on board? Yes, push notifications, updates, and a welcome message with a name on a device help. Doing this onboard, in the interior, requires orchestration and triangulation of device location, or reserving a predesignated seat.

The overhead displays played a welcome-sequence role, showing a personalized message to those entering (if the rider gave permission to have their information shared / displayed in the vehicle), along with ride updates, much like ride pooling is handled in an app today.


Final End to End Experience

Building multi-sensory refinement.

After almost endless iterations, we built an end-to-end experience that is clean, simple, and appropriate.

Every inch of this vehicle was thoughtfully crafted to optimize the end-to-end customer experience while balancing efficient packaging of the technology with customer-facing components.

Starting with the rider’s thought of going from A to B, I led or contributed to the experience of:

  • Exterior communication - 3 research sprints to help riders identify the vehicle upon pickup from 150 ft away (digital signage should be as high as possible to be seen over pedestrians’ heads).

  • Rider foot space & headroom height - 5+ seating bucks (2 omni directional human factors), 3 functional interior / exterior prototypes, 2 inner / outer models.

  • In-vehicle communication & interaction - Countless feature & experience prototypes; 1 seating buck with continuous modification of the locations and sizes of inner door panel displays; overhead displays (3 sizes, 4 general locations).

  • Ride initiation - Prototypes on the door display, a hard button on the door / side panel, voice, and on a personal device. A hard button on or near the door made the most sense for human factors and riders’ mental models.

  • End-to-end mobile experience - Rider pickup and drop-off, AR options triangulating vehicle location and device, ‘like Pokémon Go’.

    • Built the experience based on user feedback and team usability testing in the field (with current ride-hailing solutions) or with prototype products.

I guided the cargo bed, cupholders, and grab handle development to be inclusive for a wider range of users with different abilities.
