ADTHENA - Guided Analytics
Challenge: How to turn a huge amount of data that is being shared in Adthena into actionable insights?
Role: Senior Product Designer
Team: one product manager, three front-end engineers (based in London), one back-end engineer, one data scientist, and two product designers (including myself)
Contribution:
Wrote the assess canvas collaboratively with the product manager.
Developed the core features strategy in partnership with the product manager, VP of Product, and CTO.
Gathered internal and external knowledge to understand the user needs and potential value.
Created the site map and user journey.
Produced wireframes and interactive prototypes.
Translated user and business needs into UX solutions.
Communicated design intent to teams and stakeholders.
Conducted customer interviews to mitigate risks and validate the value.
Tools Used: Sketch, Adobe Creative Suite (Ai, Ps, and Ae in particular), and InVision.



Overview:
The Adthena platform constantly scans data to identify key opportunities and threats, so advertisers can take action and improve their performance. Smart Monitor is a guided-analytics approach to monitoring a customer's market.
ASSESS:
The purpose of the Assess canvas is to evaluate the potential value of such initiatives. Working collaboratively with the product manager, we created a canvas in which we defined the potential users, potential competitors, and customer problems, explored possible solutions, and documented evidence of value (user and business), impact (user and business), risks, and time to impact. This tool helped us identify the initiative with the greatest potential impact on our current company objectives.
Above: Assess canvas
Our hypothesis:
Users spend a significant amount of time investigating market trends and competitor performance spikes in order to get an advantage. For some clients, the data can be overwhelming. If our product could monitor every single dimension and alert our users when significant movements occur, our product would become an invaluable self-service tool for a search marketer.
Discovery and Research:
Recognized customer problems: Through customer and prospect interviews, sales and CS input, and surveys.
Explored potential ideas: Through ideation sessions with internal stakeholders.
Gathered evidence of buyer impact: Through surveys, customer interviews, marketing and sales input, and LOIs (letters of intent).
Gathered evidence of user impact: Through a prototype, customer interviews, CS input, and an alpha version in a testing environment.
Gathered evidence of feasibility: Through technical and data-science tests.
Key results:
After speaking with more than 20 customers, we learned that there is a strong need for a market-monitoring system to help our customers improve their performance, protect their brand, and allocate their spend.
“We will use this data as a justification for any increase in costs and to take into any planning sessions”
User journey:
Based on all the insights, feedback, and knowledge we gathered, I created a user journey that illustrates the actions our customers take, their goals, and the communication channels they use inside and outside the app to learn about their market movements. This helped me decide which needs to address first and include in the MVP.
Above: User journey
User tests:
I created a prototype representing the MVP. We gathered feedback from 18 customers to test usability and to validate customer needs and business goals, mitigating value and usability risks.
Delivery:
2-week sprints: Designed and delivered progressive versions of the idea, mitigating the greatest risks with each iteration.
Defined functionality: With journey maps and user stories to ensure user needs are met.
Defined user experience: Through wireframes, user experience flows, design assets.
Validated value and usability: Through customer interviews, user testing, and alpha and beta releases.
Above: Examples of the exploration process and one of the specs produced
Optimize:
Captured user feedback: Through customer interviews, surveys, and our customer-facing teams to identify improvements that can be made to provide additional value.
Insights from feedback: The feedback helped me improve the user experience of the features we included in the MVP, making it easier and faster for our users to get value from them.
Evaluated the business performance: Of our current products against our KPIs to identify areas for optimization.
Optimized our business model: By adjusting pricing, packaging, and go-to-market strategy to capture more business value from our target market.
“I can use it to look at increasing spend”
Key results:
The NPS score improved (45).
Retention in this section exceeded that of the most popular area in the app.
Contributed to four renewals in the three months following release.
Two clients provided case studies.