
Process Model Concept Testing
@ Amazon


This is an exploratory research project that I led as the main UX researcher at Amazon. As part of an embedded team, I collaborated with UX designers, software engineers, UX managers, and product managers to validate and improve the launch of a Process Model UI for training software used by Amazon associates. The high-level objective of the study was to increase the visibility of the various process models used by Amazon content authors and demystify their pain points, thereby standardizing the design direction for new UI features. The broader impact of the project included reducing costs by using concept testing to improve the effectiveness of the process model UI design, surfacing crucial insights through stakeholder collaboration, and refining the product roadmap for future iterations.

Some information has been removed, modified, or blurred for confidentiality.

Role: UX Researcher

Duration: 2 ½ months





I started the project by collaborating with stakeholders to assess the overall problems content authors faced. Content authors had difficulty with the current method of using the process model UI, which is used to view, find, and edit training modules (process models). Because most training modules were outdated, content authors often resorted to ad hoc workarounds to update them. Given this core issue, our group held weekly collaboration meetings with cross-functional teams to discuss the major touchpoints where the product affects content authors, what actions would most efficiently improve the user experience, and what I personally needed to do to test the new concepts. In the first two weeks, the stakeholders and I discussed the vision for the new process model UI: content authors needed to be able to perform the basic tasks of the process model UI within a central system that coordinated a shared understanding among all content authors at Amazon. The product team needed evidence to determine key focus areas during product development, so I chose concept testing: it could capture the UI issues content authors might experience while navigating the new process model UI, keep research methodology costs low, and strengthen the direction of the product.

Recruitment Criteria

To test the early-stage lo-fi prototypes, the concept testing consisted of interviews with content authors who frequently used the current process model UI in their jobs, as well as some who did not. The reasoning behind this target audience is that the new process model UI would be a central, one-stop-shop system for content authors to view, find, and edit process models. Several instruments were used as part of the concept testing, including a protocol that scripted the second-iteration testing of the process model UI prototype. I also collaborated with our team's UX designer to create and improve the lo-fi and med-fi prototypes used to test our concepts with content authors (Figure 1 and Figure 2). For recruiting, I discussed with stakeholders which participants would be appropriate for the interviews. Viable participants were then messaged and emailed about their availability, after which I scheduled each interview with a time and date invitation.


Each interview lasted about an hour, during which participants took part in a concept test remotely via a communication service with screen sharing. I conducted 6 interviews in one week, and another 8 interviews the following week after the second iteration of the prototype was completed.

During the first interview phase, lo-fi prototypes of the process model UI were created and presented as a series of screens showing the view, find, and edit user interfaces mentioned above. I used the concept testing protocol to guide the interviewee and myself through the prototype. Multiple instruments were used as part of each concept testing interview: the prototypes themselves, tasks, think-aloud procedures, and the SUS (System Usability Scale), a post-test survey measuring the product's usability that was given to each participant after the interview. Afterwards, I analyzed the interviews by evaluating the pain points end users hit while navigating the view and find screens of the process model UI and flagging issues that detract from the user experience. These user experience issues were categorized as:

  • Navigation errors - failure to locate functions, excessive keystrokes to complete a function, failure to follow the UI

  • Presentation errors - failure to locate and properly act upon desired information in screens, selection errors due to labeling ambiguities

  • Terminology errors - ambiguity behind the different terms and their definition

    • Terminologies that different content authors use (differentiated by EU and NA content authors)

      • Terms that mean one thing to users but actually mean another

  • Control usage problems - improper toolbar or entry-field usage, e.g., the "Find" search entry

Since I administered the SUS survey post-interview, I also calculated each participant's SUS score to get a quantitative sense of the usability of the design prototype. After the first interview phase, I compiled a list of potential UI issues, written up as notes in a Figma document containing insights on the current prototype and improvements for the second iteration. After establishing those UI issues and improvements, I discussed the main touchpoints with stakeholders during our weekly collaboration meeting, giving them context on the inconsistencies and problem areas within the process model UI. I also shared the overall SUS scores for the first iteration of the prototypes, which informed how the team wanted to move forward (examples can be seen on the right). Given the overwhelmingly positive reception of the first iteration, I analyzed the data and touched base with stakeholders. We decided that, for the second iteration of the prototype and my final concept test, we would assemble a group of new participants along with returning ones to test an improved process model UI with additional edit buttons, better visibility, and an improved categorization experience.
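SUS scoring itself follows a standard rule: each of the ten 1–5 Likert items contributes (response − 1) for odd-numbered items and (5 − response) for even-numbered items, and the sum is multiplied by 2.5 to yield a 0–100 score. A minimal sketch of that calculation (the response values below are illustrative, not actual participant data, which remains confidential):

```python
def sus_score(responses):
    """Compute a SUS score (0-100) from a list of ten 1-5 Likert responses."""
    if len(responses) != 10 or any(r < 1 or r > 5 for r in responses):
        raise ValueError("SUS needs ten responses, each between 1 and 5")
    total = 0
    for i, r in enumerate(responses):
        # Questionnaire items are 1-indexed, so index 0 is item 1 (odd).
        # Odd items are positively worded, even items negatively worded.
        total += (r - 1) if i % 2 == 0 else (5 - r)
    return total * 2.5

# Illustrative example of a fairly positive response set
print(sus_score([4, 2, 4, 1, 5, 2, 4, 2, 4, 3]))  # -> 77.5
```

Per-participant scores computed this way can then be averaged across the cohort, which is how an overall usability score for each prototype iteration is typically reported.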

From there, I collaborated with my UX designer to refine medium-fidelity prototypes in our original Figma document and personally conducted the second round of interviews with that prototype. Because the prototypes were new, I also altered some of the task questions: we wanted additional content to replicate product accessibility insights while gathering new insights for further improvement. The second interviews followed the same protocol as the first-iteration concept tests: I looked for UI violations and for insights that harmed or improved the user experience. Based on these findings, I produced concrete analyses focused on user costs and benefits that would ultimately improve the user experience in future iterations of the process model UI.


Figure 1. Example of process model UI low-fidelity prototype


Figure 2. Example of process model UI medium-fidelity prototype showing active process models


When using the process model UI, content authors typically rely on only the view and find functions. While our team's concepts did help content authors improve their process models, there were also UI deficits that confused users or violated usability heuristics, characterized by ambiguous buttons and a lack of error prevention.

The primary finding on task visibility was that users could effectively see all the crucial functions needed to make a process model, as well as every process model they or their team had created (Figures 3 and 4). While I can't go into detail on the exact functions content authors use, some are essential: content authors use them to build the training modules that teach Amazon associates and to incorporate those modules into process models. In the first lo-fi prototype, the sidebar navigation received positive feedback because it organized users' content automatically, as shown in the figures. While viewing their process models, users could also see exactly which functions they needed, the current stage of each process model, and how developed it was. However, users were confused about whether the edit button would edit just the name of the process model or certain components of it, because the button sat in a general location that read as ambiguous. Some functionalities that users wanted in order to perform their job duties more effectively and efficiently were also missing. Overall, through the first round of interviews, most content authors liked the approach, and the underlying UI issues became iterative insights for the second round of prototypes.

The second round of interviews covered an improved process model UI, clarified CTA buttons, and additional information on certain functionalities to improve accessibility. Users no longer had to guess whether the edit button applied to the entire process model (Figure 5). Overall, the medium-fidelity prototypes opened new avenues for content authors: the main insight was that information processing in the second iteration was less constrained than in the old version, reducing cognitive load.

However, some parts of our design still lacked validation, such as browsing version history and filtering within the system. Users did not know whether they could find models edited only by them, by their team, or company-wide. There was also consistent confusion about what some buttons did, or whether they were buttons at all. For instance, some users thought the arrows were for locating a specific section within the process model, and did not realize the arrows could expand a section of the process model UI showing essential information about the training module. Error prevention was another main issue: one of the added buttons, "Discard," was ambiguous, and users thought it would delete either the whole process model or just its name. In future iterations of the process model UI, I believe that clarifying the interface and improving accessibility are essential to substantially improve the prototypes.

Figure 3. Prototype covering all crucial functions for content authors


Research Impact

Strategic Impact

  • Improve the effectiveness of the product backed by iterations of design research insights

  • Refined product playbook and roadmap by establishing insights for future iterations plus illuminating job responsibilities of content authors

  • Increased ease of learning process model UI by reiterating prototypes focusing on usability and accessibility

Stakeholder Impact

  • Collaborated with UXD and UX managers on our team focusing on iterations of process model UI concepts

  • Presented insights and findings through weekly collaboration meetings between research, product, and engineer functions

Economic Impact

  • Saved development costs by reducing the amount of research workload using concept testing

  • Increased customer satisfaction rates through stakeholder collaboration, concept testing, and calculating usability metrics

Figure 4. Prototype indicating the content inside the crucial functions


Figure 5. Clarification on CTA buttons within edit and discard functions


Concept testing is one of the most widely used front-end research methodologies. It is difficult to execute and master, yet it reduces costs and illuminates the critical foundations of UX accessibility at an early stage. Pragmatically, making and testing concepts with users lets stakeholders and researchers understand what needs to improve in the future, and the concept tests act as a catalyst for understanding users' intentions and what they want, in a customer-centric manner. To fulfill the team's overarching vision, I had stakeholder support in visibly aligning customer and business goals. Through the concept tests, customers came to appreciate prototypes that emphasized the categorization and accessibility characteristics that help their workflow, though they were sometimes confused about which specific functions were being implemented. From this, multiple crucial pain points were established with stakeholders; the challenge then was to iterate and collaborate in order to deliver the best product with a customer-focused mindset. The more I engaged stakeholders in the research process, the more likely product iterations were to follow the insights users provided. This scaffolding from our work provided the foundation for the process model UI's future concepts. One limitation, however, was that there were only two iterations of the prototypes and only two rounds of concept testing. While this is adequate for early front-end concept testing, future iterations should allow for more findings based on user interaction, further reduce economic costs, and increase stakeholder collaboration toward both business and customer goals.


The process model UI is a coaching feature that I worked on with the product team. It was being implemented to create a central system for content authors to create, find, and view process models, which would speed up the launch of training modules and provide a single place to hold the modules used to teach Amazon associates within fulfillment centers. Content authors are users who create training modules through web integration for Amazon associates, specifically to incorporate into their training programs. With an updated Process Model UI, content authors gain better visibility into which process models have been activated or deprecated, and can create, edit, and view modules that they or others have created.


Within the community of content authors, each author has their own way of creating training modules, such as combining the efforts of different teams to speed up launches using Excel sheets and other non-process-model-UI methods. The product team proposed a project that would reduce the resources used by combining the efforts of the engineering, product, and UX teams to build an updated product giving content authors a convenient way to view, create, and edit training modules. With an improved Process Model UI, users would have a seamless experience when using the software to launch training modules.

Research Objectives and Goals

  • Evaluate pain points end users have when navigating the view, find, and edit models, such as the left navigation bar, visualizations, search bar, and filter functions

  • Increase the visibility of the content UI for content authors across various process models, centralizing a one-stop shop and improving efficiency for faster training module deliverables

  • Coordinate with the UX designer, manager, software engineers, and product managers to provide foundational insights that drive design decisions and iterations
