Tickets Please: Facilitating a Multisystem Digital Transformation
The Opportunity
Designing a seamless digital experience across web and ticketing for the oldest planetarium in North America
The Adler Planetarium, the oldest planetarium in North America, hosts more than half a million visitors annually, which drives roughly 900,000 visits to their website every year. With so many touchpoints, the cultural institution’s leadership understood the need for a seamless customer experience, both online and in person, to ensure guest satisfaction and drive ticket sales.
The team at the Adler recognized that their existing digital systems were not built to meet customer needs in the mobile era: key systems were not mobile-friendly and lacked flexibility for both customers and staff. This rigidity was having a ripple effect across customer experience, operational efficiency, and team satisfaction. They were looking for a new technical solution to help modernize their digital experience.
To define this new solution, the Adler Planetarium partnered with Threespot on a digital transformation project centered on a new ticketing solution.
The Approach
A facilitated selection process with subject matter experts in the lead
Based on discussions with leadership at the Adler Planetarium, we designed a facilitated selection process for the ticketing solution that allowed their subject matter experts to lead the vetting while relieving them of the initial research burden. Threespot’s technical and user experience experts worked through a joint discovery phase, outlining current challenges and user needs across both the existing ticketing system and the website. From there, the work diverged into parallel paths – website definition and ticketing selection – so that both halves of the digital ecosystem could be defined in tandem.
For the ticketing selection process, Threespot led an in-person prioritization workshop with the Adler at their offices in Chicago. Together, we refined the list of about 100 requirements collected by the Adler team in advance and separated them into high-level buckets such as user experience, functionality, data reporting, and integrations. We guided the team through each section of the requirements, reviewing and adding any new requirements that emerged through discussion. Once we solidified initial requirements, Threespot and the Adler worked together to prioritize each requirement into “Must Have”, “Could Have”, “Wishlist”, and “Won’t Have”.
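As a rough illustration only – the buckets and requirement names below are hypothetical, not the Adler’s actual list – a prioritized requirements list like this can be captured in a simple structure that later feeds the vendor scorecards:

```python
from dataclasses import dataclass
from enum import Enum

class Priority(Enum):
    MUST_HAVE = "Must Have"
    COULD_HAVE = "Could Have"
    WISHLIST = "Wishlist"
    WONT_HAVE = "Won't Have"

@dataclass
class Requirement:
    bucket: str        # e.g. "user experience", "data reporting"
    description: str
    priority: Priority

# Hypothetical entries -- not the Adler's actual requirements
requirements = [
    Requirement("user experience", "Mobile-friendly checkout flow", Priority.MUST_HAVE),
    Requirement("integrations", "Syncs member records to the CRM", Priority.MUST_HAVE),
    Requirement("data reporting", "Daily attendance exports", Priority.COULD_HAVE),
]

# The "Must Have" subset becomes the basis for first-round vendor vetting
must_haves = [r for r in requirements if r.priority is Priority.MUST_HAVE]
```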
Going into our first round of vetting, we identified 55 “must-have” requirements, used them to vet 24 ticketing solutions, and set benchmarks for the vendor scorecards. The evaluation process began with an initial review of each vendor’s publicly available information, such as website content, product documentation, and feature lists. This review identified which features and capabilities were clearly supported.
To validate and clarify this information, Threespot conducted direct calls and, where possible, live product demonstrations with each vendor. These sessions provided a deeper understanding of how each vendor met the “must-have” requirements. In validating the requirements with vendors, we used a simple three-point system that assigned a “yes”, “partial”, or “no” to each requirement for each ticketing system, along with any relevant notes captured during the evaluation.
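A minimal sketch of what such a three-point scorecard might look like in practice; the vendor names, requirements, and numeric weights below are assumptions for illustration, not the actual evaluation data:

```python
from enum import Enum

class Rating(Enum):
    YES = 2      # fully meets the requirement
    PARTIAL = 1  # meets it with caveats or workarounds
    NO = 0       # does not meet the requirement

# scorecard[vendor][requirement] = (rating, evaluator notes)
# Vendor names and requirements are placeholders.
scorecard = {
    "Vendor A": {
        "Mobile-friendly checkout flow": (Rating.YES, "Responsive out of the box"),
        "Syncs member records to the CRM": (Rating.PARTIAL, "Needs a third-party connector"),
    },
    "Vendor B": {
        "Mobile-friendly checkout flow": (Rating.NO, "Desktop-only purchase path"),
        "Syncs member records to the CRM": (Rating.YES, "Native integration"),
    },
}

def score(vendor: str) -> float:
    """Average rating across the must-have requirements recorded for one vendor."""
    ratings = [rating.value for rating, _ in scorecard[vendor].values()]
    return sum(ratings) / len(ratings)
```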
To keep the matrix clean and intuitive, we stuck to these simple yes/partial/no ratings for each vendor. However, the high-level ratings could not fully reflect the nuances of each vendor: some systems passed a requirement with flying colors, while others just barely met it. During discussions and demos, we walked through deeper insights and findings for each vendor to give more context to the ratings.
Using this matrix, clear patterns emerged early within the initial list of potential ticketing solutions. Through this preliminary vetting, Threespot shared a shortlist with the Adler team that categorized candidates as “recommended”, “worth considering”, or “not recommended” based on their potential to meet the shared business and customer needs. The thoroughness of this vetting process and the selection matrix enabled the Adler team to narrow the field from 24 vendors to 3 after a week of review.
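Building on the scorecard sketch above, the categorization step could be expressed along these lines; the tier cutoffs are illustrative assumptions rather than the thresholds used on this project:

```python
def categorize(vendor: str,
               recommended_cutoff: float = 1.6,
               worth_considering_cutoff: float = 1.0) -> str:
    """Map a vendor's average must-have score to a recommendation tier.

    Reuses score() and scorecard from the sketch above. The cutoffs are
    illustrative assumptions; in practice a single "no" on a critical
    requirement could disqualify a vendor regardless of its average.
    """
    avg = score(vendor)
    if avg >= recommended_cutoff:
        return "recommended"
    if avg >= worth_considering_cutoff:
        return "worth considering"
    return "not recommended"

shortlist = {vendor: categorize(vendor) for vendor in scorecard}
```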
The Learnings
Trust and agreement are the cornerstones of success
As illustrated above, the vetting process is an iterative one and can be started internally in advance of a larger digital transformation effort. When starting this process, you’ll want to ensure your internal team of subject matter experts is aligned on the purpose of the transformation and on the requirements of the new system. Building consensus early on goals and needs helps ensure your team’s expertise goes toward assessing solutions, rather than toward generating new ideas mid-process or falling back on old systems.
To begin your initial assessment, consolidate any internal list of potential systems during your team’s first-round review. This allows you to put forward only systems that meet your initial requirements and priorities. While we started with a larger list, many vendors likely could have been removed through an internal consensus process, allowing us to concentrate on a more select group with a higher likelihood of meeting the must-have requirements out of the box.
As vetting progresses, it will be important to be realistic about the depth of information available. As part of an iterative process, a selection matrix can help narrow the field of suitable solutions for deeper inquiry and testing. Through this review, the matrix can give teams a high-level overview of the suitability of a solution based on shared requirements. However, teams should expect to do a more intensive, higher-fidelity review of each solution to determine if it can truly meet all requirements.