Vive Occupancy Range Extension

Lutron · Q3 2018 // UX design · mobile + Web

Overview

Occupancy range extension is a feature that enables devices to work beyond their default wireless range, supporting the diverse and growing applications of Vive. In larger spaces, the 30 ft wireless range of an occupancy sensor was often not enough to reach all the lights in the area, so multiple occupancy sensors had to be installed for proper coverage and functionality. However, lights that were out of range of the sensor that detected motion would still fail to turn on or off.

Core Team

UX designer, product manager, engineering lead

Responsibilities

Research, synthesis, design & prototype, handoff

Goal

How might we help Contractors easily & successfully program a Vive system even when devices might be out of range?

Outcome

Our solution guides users through complex logic by optimizing for the main use cases and products used in these types of spaces. Vive started winning more jobs and gained more market share in the retrofit, code-compliant solution space.

The approach

This was a smaller-scope project, with limited changes possible due to hardware constraints and little time for user research, so I sourced internal subject-matter experts, interviewing our product managers and field service representatives. It was also very important that I work closely with the engineering manager to uncover the design constraints and make sure the solutions were in scope. I frequently updated the product manager to stay aligned with business needs and goals.

Top problems

Under-specifying the design for the space

Wanting to squeeze every bit of savings they can from the project due to tight timelines and small budgets, some Electrical Contractors will design the system themselves, or skip designing it altogether and simply place products where past experience suggests. This often leads to too few occupancy sensors being ordered and installed to cover the area.

If a light doesn't turn on, it could be any of several problems

If a light doesn't respond, it isn't always a wireless range issue. ECs have to test the wiring, and even the fixture itself, to troubleshoot why the light isn't responding.

The wireless range can't be seen

There isn't a way to see or test the wireless range in the app or on the devices, so it's difficult to determine when devices are within range or out of range. The only assurance comes from the project having been specified correctly in the first place.

Use cases—what do users need to do?

From interviewing internal subject-matter experts, I identified the top ways a Vive system is set up and how the EC is typically required to program it. These use cases helped keep the solution in scope by serving as the minimum viable experience it had to support.

  • The L-shaped hallway: A large hallway that requires multiple occupancy sensors to cover the entire area. All lights should turn on and off together when occupancy is detected.

  • The large open office: A large open office with multiple entrances that requires multiple occupancy sensors to cover the entire area. All lights should turn on and off together when occupancy is detected.

  • The large open office with lots of sensors: A large open office where every light has a fixture sensor. All lights should turn on and off together when occupancy is detected. The added complication with this layout is programming the sensors to turn all the lights on together, since the default behavior is for each sensor to turn on only its own light.

The solution

Considerations & constraints

  • No changes to the physical device, such as a stronger antenna to increase the wireless range.

  • Extending the wireless range of an occupancy sensor was limited to using the Vive hub as the “virtual sensor”.

  • This would live within the existing Vive app and utilize the current workflows and information architecture as much as possible.

Capture user intent to help program the room

In a large room, such as a large open office, multiple area occupancy sensors may be required to provide enough coverage. I learned from the research that the user typically wants all the lights to turn on and off together, even if only one part of the room is occupied. Otherwise, it can be distracting, even annoying, for the occupants if various lights are turning on and off throughout the room.

The user is only prompted when the second sensor is added to Vive. If they choose to have all the lights work together, every light that was added to the first sensor is automatically added to the second. This is also an improvement over current programming, where every light must be individually added to each controlling device.

Users responded positively to this workflow during user testing and didn't have trouble understanding what was being asked.
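
The auto-assignment rule above can be sketched as simple set logic. This is a minimal, hypothetical illustration; the `Area` class, method names, and string IDs are my own assumptions, not the actual Vive app implementation:

```python
# Illustrative sketch of the "all lights work together" auto-assignment
# rule. Names and structures are hypothetical, not Vive's actual logic.

class Area:
    def __init__(self):
        # Maps each occupancy sensor to the set of lights it controls.
        self.sensor_lights: dict[str, set[str]] = {}

    def add_sensor(self, sensor: str, lights_together: bool = False) -> None:
        if lights_together and self.sensor_lights:
            # User chose "all lights work together": the new sensor
            # inherits every light already assigned to existing sensors.
            self.sensor_lights[sensor] = set().union(*self.sensor_lights.values())
        else:
            self.sensor_lights[sensor] = set()

    def add_light(self, sensor: str, light: str) -> None:
        self.sensor_lights[sensor].add(light)


area = Area()
area.add_sensor("sensor-1")
for light in ["light-1", "light-2", "light-3"]:
    area.add_light("sensor-1", light)

# Second sensor added; the user is prompted and picks "work together",
# so it inherits all three lights without adding each one by hand.
area.add_sensor("sensor-2", lights_together=True)
```

The saved effort is the inner loop: without the prompt, the user would repeat the per-light assignment for every additional sensor.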

Capture user intent for other types of sensors

The main reason fixture sensors are used in place of area sensors is their dual occupancy and daylighting capability. However, as with area sensors, the majority use case for occupancy is to have all the lights turn on and off together. Through the initial research, I learned that the discoverability of this capability in the existing Vive app was very low; most users could only achieve it by calling Lutron customer support.

In the case of fixture sensors, the Vive app detects and adds them to the system all at once. So user intent is captured at the moment they're added.

Guide the user through space reconfigurations

It is important for both the EC and the facility manager (FM) to get their job done quickly and leave an area operable for its end occupants, whether they're setting it up for the first time or making changes to it.

When an area needs to be reconfigured, for example, changing a room's configuration from all lights being controlled together to lights being controlled by separate groups, the app guides the user through reprogramming the sensors that were "broken" by the change. This helps the user focus on getting the system working again as quickly as possible.
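
The guided-reprogramming step above can be illustrated with a small check for sensors "broken" by a regrouping. This is a hypothetical sketch; the function name, data shapes, and IDs are my assumptions, not the app's internals:

```python
# Hypothetical sketch: after a reconfiguration splits a room's lights into
# separate groups, flag any sensor whose assigned lights now straddle more
# than one group, so the app can walk the user through reprogramming it.

def broken_sensors(sensor_lights: dict[str, set[str]],
                   light_group: dict[str, str]) -> list[str]:
    broken = []
    for sensor, lights in sensor_lights.items():
        groups = {light_group[light] for light in lights}
        if len(groups) > 1:
            broken.append(sensor)
    return broken


# Room reconfigured: light l2 moved from group A into a new group B.
sensor_lights = {"sensor-1": {"l1", "l2"}, "sensor-2": {"l3"}}
light_group = {"l1": "group-A", "l2": "group-B", "l3": "group-B"}
print(broken_sensors(sensor_lights, light_group))  # ['sensor-1']
```

Surfacing only the flagged sensors keeps the user focused on the minimal set of repairs needed to get the system working again.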

In user testing, ECs struggled to identify which physical occupancy sensor corresponded with the one on the screen, since the physical sensor doesn't have a way to identify itself. If the user taps "I don't know which sensor is which," the app directs them to go to the physical sensor and press a button, which identifies the sensor and guides them through adding lights to it.

Designed and built in Framer by Jennifer Wong © 2024.