Shutterstock

Improving conversion flows

Shutterstock Pricing Page

Brief

Shutterstock is a global provider of stock content with customers in over 150 countries and support for more than 20 languages as of 2018. As part of the internal product design team since January 2017, I worked on the Footage, Images, and API eCommerce business units.

As the lead designer on Footage, I helped bring the entire Footage site onto a new Shutterstock Design System and was responsible for UX improvements on the two highest-trafficked pages - the search results page and the product page. During this time I worked with the remote Montreal-based Footage team while based at the New York headquarters. After six months of my involvement, another full-time designer was hired in Montreal to work more closely with the Footage team, and my focus shifted to Shutterstock Images - the largest and oldest business unit, generating the highest revenue.

My Role: I was part of the Images product team that focused solely on increasing Image eCommerce customers' conversion rates. The team was composed of one designer (me), one product manager, one engineering manager, and 8-10 developers. During my involvement with eCommerce Images, I also had to balance this work with my role as the lead designer for the API team's Developer site.

1    UX Audit

The first step to increasing customer conversion rates was understanding the current experience. As the new designer for Images eCommerce, I led the design initiative to perform a UX audit of conversion flows for the highest-priority user segments - new users and existing users without paid subscription plans. It had been several years since a similar UX audit had been done, and I felt it was not only a great way to onboard myself to the project, but also a standard that should be set within the product design team for any new involvement. Since this initiative was pertinent to all designers working on eCommerce Images, I pulled in the designer working on Images customer acquisition to make it a joint effort, with support from the senior designer focused on the Images search experience.

Pre-conversion Existing Users (left), New Users (right)

The UX audit documented the main conversion pathways for each user segment, listing out what a user is thinking/feeling, UI/UX issues, relevant metrics, and product ideas page by page. We also documented customers' offline/offsite experience to encompass the entire user journey. After we finished our first pass, we integrated input from Customer Care, Marketing, Product leadership, and senior designers who had worked on Image eCommerce in the past to get cross-departmental buy-in.

Cross-departmental input on sticky notes and handwritten notes on the printed audit

Finally, we presented our findings and recommendations to Product leadership, highlighting the most pressing user experience problems and product opportunities for our target user segments:

  • Site doesn’t show value before asking the user to invest
  • Difficult to understand pricing
  • No personalization
  • Help and support are not a unified experience presented in the context of the user’s journey
  • UX Dark Patterns: Autorenewal
  • UX Dark Patterns: Misleading CTAs

2    Design Sprint

The results of the UX audit were taken into consideration for the 2018 roadmap. In the meantime, the Pricing page was selected as the first key area of the user experience to improve, for these reasons:

  • It was an area of opportunity identified in the UX audit.
  • It had an extremely high bounce rate of 97%.
  • In the past, several minor design changes had been tested but then rolled back because of a dip in conversion. As a result, the page had not changed significantly for several years.
  • Re-designing the page was a massive effort because of the sheer number of stakeholders involved (the page affected nearly every business unit at Shutterstock).

Additionally, qualitative and quantitative data from the past suggested that:

  • Customers do not know all the additional value adds when purchasing a product.
  • Customers do not clearly understand the differences between different products.
  • Customers do not have an easy way to compare pricing between products.

Given the history and context of this page, we thought it was the perfect candidate for a Design Sprint. It also satisfied the criteria for a Design Sprint candidate:

  • High stakes
  • Not enough time
  • Just plain stuck

Typically satisfying even one of these criteria is sufficient reason to do a Design Sprint, but in this case the Pricing page satisfied all three. Design leadership worked with my PM to create the sprint plan and identify all the participants. A selected moderator and assistant then ran a compressed sprint over the course of 2 days that involved all the key stakeholders of the pricing page, including finance, business leads, customer care, product marketing, product leads, and design.

Normally a Design Sprint runs over the course of 5 days, but due to the participants' time constraints and the fact that Design Sprints were relatively new to Shutterstock, the senior designer acting as moderator managed to fit the most collaborative parts of a Design Sprint into just 2 days. Over the course of 8 intense hours, we all worked together to identify the goal for the pricing page re-design:


"Match the right user to the right plan every time through the right channel."


We completed and highlighted problem areas on a customer journey map, learned from experts from different areas of the business to piece together a complete picture of the problem, and gained inspiration from a "Lightning Demo."

Lightning Demo: inspiration from past design concepts, competitors, and outside the industry. (past design concepts not pictured)

Then we collectively sketched solutions and decided on the best. By the end of the two days, everyone was invested and had gained new appreciation for one another. Most importantly — we had two winning sketches to prototype and test.

3    Prototype and Test

I worked with the creators of the two winning sketches to turn their ideas into fully functioning, high-fidelity prototypes that we could test with real users. Both were major re-designs that included new sub-pages as part of the new user flow. After the prototypes were created in InVision, I recruited candidates for moderated user interviews to gather qualitative data on the two prototypes, collect feedback on competitors, and determine which design direction to pursue.

I conducted a total of 12 tests to collect feedback, switching the order of the prototypes to counteract framing bias: 2 in-person moderated sessions, 4 remote moderated sessions, and 6 unmoderated remote sessions via UserTesting. The tests showed that Prototype B was more understandable than A, but both designs had weaknesses. Additionally, the tests revealed common points of confusion for users. Some of the confusion stemmed from copy; much of it stemmed from our product mix. I came up with a few different strategies to address the confusion. However, after speaking with my PM, we decided to leave the product mix as is because:

  • The re-design was already a big change
  • Changing the product mix would require more time and resources than we had
  • Changing the product mix was outside the scope of our pricing page re-design
  • Changing the product mix was more risky than necessary

4    Iterate

The results of the tests showed that neither prototype from the Design Sprint was good enough to ship. By that time, I had acquired a lot of user knowledge and expertise from participating in the Design Sprint and hearing from users during the interviews. Based on those learnings, I iterated once more to create a design backed by a winning design strategy:

The new design takes the best of Prototypes A and B, and tells a story so the user is walked through the information at the right time and with the right context. It doesn't overwhelm them with too much information like Prototype A, or underwhelm and burden them with questions like Prototype B. Instead, the new design:

  • Highlights value props such as the breadth of Shutterstock's library and flexibility of our custom plans, which users found very appealing during testing
  • Paces the information so that after users are presented with our products, we answer their most pressing questions before they even need to reach out and ask
  • Addresses licensing, a grey area that confuses many customers, from the position of an industry authority
  • Conveys a feeling of legitimacy by intentionally showcasing the visual content we're selling

I tested the design again with 12 users via unmoderated remote testing to work within time constraints. This time, we heard unequivocally positive feedback supporting the new design over the previous ones.


“This has a nice flow to it and is much more easy to comprehend and understand...The presentation is very nice. The previous page I definitely would have looked at other sites before signing up, but this page makes me feel like it's a page I can actually trust. It looks much much more trustworthy.”

- UserTesting participant

I also went back to the group of stakeholders involved in the Design Sprint and received particularly positive feedback from Customer Care on the new design. I then presented the new design and its rationale to my product team while keeping design leadership in the loop, and helped my PM prepare communications to the rest of product leadership and the chief executives. From there, I worked with my PM to incorporate feedback from all the stakeholders and business units, arriving at a final production design with copy, images, and presentation that everyone agreed on.

5    Launch and measure

Finally, I worked with developers on my product team to test and launch the designs as an A/B test through Optimizely, making sure they were built to spec. In January 2018, we launched the new pricing page as a full-stack test. After running the test for two weeks, we saw:

  • Fewer people entered the Checkout funnel, but completion rates were higher, especially for Registration (+9%) and Subscription Purchases (+40%)
  • More subscription purchases in general, especially for smaller subscriptions
  • Better engagement on Teams products
  • Significant increase in Custom product purchases

However, we were disappointed to see that conversion and revenue still remained flat, perhaps because:

  • Customers were getting stuck on Checkout and not completing the purchase
  • Some higher revenue products were being purchased less due to lower visibility and cannibalization from other products
  • Some lower-commitment (and lower-revenue) subscription products were being purchased less since we were clearer about cancellation policies

Because of the potential upsides, product leadership decided to productionize the test and see if we could move the needle on revenue and conversion by addressing some of these concerns. Additionally, we all believed that the new design would have higher retention rates and lower cancellation/refunds due to the clearer messaging, which would only be revealed after a longer time period.

What I Learned

Metrics chasing

During my involvement with the Pricing page, I became obsessed with how we would measure the design once it launched. The data pertinent to the user experience was housed in several departments, and it was difficult to see the most important metrics at a glance. Additionally, I felt that focusing only on business-centered metrics was producing imbalanced product design that prioritized business goals to the detriment of the user experience, as evidenced by some of my findings during the UX audit. Thus, I was inspired to bring the HEART framework for measuring UX to Shutterstock, hoping that a flexible framework supporting both UX and business metrics mapped to product goals would give everyone working on the product a shared understanding of, and buy-in to, what we were all working towards.

I was lucky that my PM was very receptive to trying out this proposed framework (he supported the Design Sprint, after all) and worked with me to come up with metrics for the Pricing page as a pilot. The metrics we came up with overlapped quite a bit with the business metrics product leadership was already tracking, but seeing them within the framework made it much clearer which aspects of the user experience we were trying to improve. Even after the Pricing page launched, I continued to update the metrics and evangelize the framework; it was eventually adopted by senior design leadership, who continued bringing it up with other product teams.


Accepting design constraints

Some constraints are harder to swallow than others, and maturing as a designer means picking your battles. I'm very proud of all that the team accomplished and the role I played, but at the end of the day the idealistic designer inside me still thinks...

"We shouldn't have 5-6 default product cards on a page AND a catch-all custom product! We should only have 3 defaults with the catch-all custom product! This product mix needs fixing." and...

"This new design needs more white space and stakeholders are too fixated on keeping things above the fold! *SMH* It's design by committee."

However, the reality is that knowing what a better design looks like doesn't always mean it's what can or should be done. The kind of technical work and collaborative effort involved in rethinking the product mix would have stopped the project dead in its tracks. The same goes for disregarding the input of a majority of stakeholders. In such cases, it's important to keep in mind that getting something out is better than getting nothing out at all!