The Michael J. Fox Foundation's mission is to find a cure for Parkinson's. Through a large longitudinal study and a collaboration with USC's Laboratory of Neuro Imaging (LONI), they've amassed one of the world's largest repositories of Parkinson's-related data and biospecimens, with the aim of identifying biomarkers associated with the disease.
The PPMI website houses this open-access data and allows any researcher to apply for access to use the data in their own research, or to request biospecimens.
Client: Michael J. Fox Foundation, through USC's Laboratory of Neuro Imaging
I worked at LONI as a UX team of one. I had a six-month timeline to revamp the decade-old website, while also gaining enough domain knowledge to inform my research.
Stakeholders were certain that issues with the current site were surface level and that only a facelift was needed. After an evaluation, I found deeply rooted usability and content issues, and convinced them that these needed to be addressed.
Furthermore, the site had to be made responsive and migrated to a more secure, modern CMS.
The first step was auditing the current site. I took a full inventory of pages and content, documenting everything in a shareable spreadsheet. I then ran a series of evaluations against key pages to identify gaps and opportunities, which I delivered as a high-level readout and set of recommendations to help stakeholders establish our goals.
Analytics & flows
Information architecture review
The Principal Investigator (PI) is the lead researcher on a laboratory study or clinical trial, and a core user of the site. Because of the importance of their role, I wanted to interview PIs who had used the site to get a sense of their pain points and a better understanding of the nature of their work. I moderated the interviews one-on-one, remotely over video conferencing, and recorded them for note-taking and documentation.
There were several key insights that came out of the interviews:
With the evaluations and user interviews done, I gathered all the key insights to home in on a few problem statements:
Though user experience and visual design were deemed the highest priority by stakeholders, my evaluation made clear that the content also needed to be revamped in order to accomplish our goals.
I took the content inventory from the audit and used it as a shared, high-level way to document progress and set goals and priorities, guided by analytics, page hierarchy, and content quality.
I worked closely with writers to help establish reusable content patterns and modules that could be built up into content templates. These were disseminated to writers as shared Google docs with guidelines such as character counts. These templates would later inform the wireframes.
I ran co-creation working sessions with stakeholders to reorganize the site, adjust nomenclature, and rework the global navigation to aid wayfinding.
Based on the evaluation and content work, I created initial designs as wireframes to visualize and document my hypotheses as to how we might solve some of our problem statements. These early designs were then validated with stakeholders.
Based on the wireframes, I created higher-fidelity designs and built an interactive prototype with Bootstrap for user testing. The beauty of using Bootstrap was that, having landed on Drupal as the CMS, we could eventually import the Bootstrap designs straight in as a theme.
I tested the prototype with users covering our various archetypes (data scientists, PhD candidates, professors, PIs, PPMI committee members, etc.) and with varying degrees of familiarity with the old site. Users were able to interact freely with the prototype while providing feedback. I also had them complete tree-testing tasks to see whether the site taxonomy and new navigation were clear to them.
I synthesized the user feedback into a high-level readout for stakeholders, with recommendations on what actions to take in response. This informed another iteration of the site before jumping into final designs.
The homepage was revamped to be much less visually cluttered, the copy was rewritten to be shorter and more engaging, and prominent callouts were added linking to the most important sections of the site. Globally, the main navigation was cleaned up, and a mega footer was added to aid navigation and wayfinding.
To help new users understand the data before applying for access, basic statistics about the data were surfaced through a new feature: the data dashboard. I built it with Highcharts, with the dynamic data queried from a SQL database.
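As a rough illustration of how such a dashboard can be wired up, the sketch below shapes rows returned from a database query into a Highcharts column-chart configuration. The field names, cohort labels, and container id are hypothetical, not taken from the actual PPMI implementation.

```javascript
// Hypothetical sketch: turn SQL query rows ({ cohort, count }) into a
// Highcharts configuration object. Separating this transform from the
// render call keeps the data-shaping logic easy to test.
function buildCohortChartConfig(rows) {
  return {
    chart: { type: 'column' },
    title: { text: 'Enrolled Subjects by Cohort' },
    xAxis: { categories: rows.map((r) => r.cohort) },
    yAxis: { title: { text: 'Subjects' } },
    series: [{ name: 'Subjects', data: rows.map((r) => r.count) }],
  };
}

// In the page, Highcharts would render the config into a container div:
// Highcharts.chart('data-dashboard', buildCohortChartConfig(rows));
```

Keeping the configuration builder as a pure function means the chart can be refreshed whenever new rows arrive from the backend, without touching the rendering code.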
The specimen request process was placed into a clearer UI pattern, with the process broken into discrete steps and relevant documentation within easy reach.
Cohort information was expanded to include more relevant detail. Furthermore, global content modules, such as resources, were added to certain pages to aid wayfinding.
Publications used to be one long list. The page was redesigned with more robust search functionality, plus pagination to keep results manageable.
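The pagination logic behind such a redesign can be sketched simply: slice the filtered publication list into fixed-size pages and clamp the requested page to a valid range. The function and parameter names here are illustrative, not from the production code.

```javascript
// Hypothetical sketch: paginate a (possibly search-filtered) list of
// publications. Returns the current page of results plus metadata the
// UI needs to render page controls.
function paginate(items, page, pageSize = 10) {
  const totalPages = Math.max(1, Math.ceil(items.length / pageSize));
  // Clamp out-of-range requests instead of returning an empty page.
  const current = Math.min(Math.max(1, page), totalPages);
  const start = (current - 1) * pageSize;
  return {
    page: current,
    totalPages,
    results: items.slice(start, start + pageSize),
  };
}
```

Running search first and paginating the filtered results keeps the two features composable: the same helper serves both the full list and any search view.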
Finally, I documented the design system, covering branding, design tokens, and design patterns, to aid future maintenance and governance.
The website launched to a great response from the community, and in time for the second phase of the study, which was greatly expanded, exposing the site and the study to a much larger audience.