AI Lens - Revolutionizing User Interaction with Contextual AI
At Cisco, I created and led the development of a new AI feature called "Lens," which provides contextual engagement with AI assistants for any UI object on screen.
Lens is a breakthrough feature designed to augment user interaction with any UI object visible on the screen.
It provides an AI assistant that comprehends screen elements in context and responds accordingly, significantly improving user workflows and task-completion rates.
Just as people point at things while discussing them, Lens lets users point at on-screen objects, creating a new way for humans and AI assistants to work through complex data together.

Lens operates on three foundational principles.
Co-Work with Context: The AI should act like a colleague, understanding context across platforms.
Trust but Verify: Users can challenge the AI's responses, building trust through transparency.
Point and Ask: Users can point to any object on the screen to interact with the AI, similar to asking a colleague questions about specific items.
Organization
Cisco
Role
Inventor, Design Lead
Point. Click. Context.
Lens Select
The Lens Select tool activates a context-focused AI assistant.
Contextual Selection
Using the select tool, the user drags over any visual area of the product to focus the AI assistant on that object.

Contextual Prompt
Once an area is selected, the user can ask the AI anything about that object and get a contextual answer.

Contextual Response
The response is directly related to the visual prompt.
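The select-prompt-respond flow above can be sketched in code. This is a minimal illustration, not Cisco's implementation: every name and data structure here (`UIObject`, `objects_in_selection`, `build_contextual_prompt`) is an assumption introduced for the example, and the final prompt would in practice be sent to an AI assistant rather than printed.

```python
from dataclasses import dataclass

@dataclass
class UIObject:
    """A visible on-screen element: its bounding box plus attached metadata."""
    name: str
    x: int
    y: int
    width: int
    height: int
    data: dict

def objects_in_selection(objects, sel_x, sel_y, sel_w, sel_h):
    """Return the UI objects whose bounding boxes intersect the dragged selection."""
    def intersects(o):
        return not (o.x + o.width < sel_x or sel_x + sel_w < o.x or
                    o.y + o.height < sel_y or sel_y + sel_h < o.y)
    return [o for o in objects if intersects(o)]

def build_contextual_prompt(selected, question):
    """Combine the user's question with context from the selected objects,
    so the assistant answers about what was pointed at."""
    context = "; ".join(f"{o.name}: {o.data}" for o in selected)
    return f"Context from selection: {context}\nUser question: {question}"

# Example: the user drags over a chart and asks about it.
screen = [
    UIObject("throughput-chart", 10, 10, 300, 200, {"unit": "Mbps", "peak": 940}),
    UIObject("alerts-panel", 400, 10, 200, 200, {"open_alerts": 3}),
]
selected = objects_in_selection(screen, 0, 0, 320, 220)
prompt = build_contextual_prompt(selected, "Why did throughput spike?")
```

Only the chart falls inside the dragged rectangle, so the prompt carries the chart's metadata and the assistant's answer stays scoped to the object the user pointed at.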
Want to know a little about how it works?
Process
Discovery and Hypothesis:
Collaborated across teams to understand the business drivers, user landscape, and industry challenges. Defined the hypothesis to test.
Prototyping:
Created varying fidelity prototypes to test the hypothesis with users, illustrating the intended narrative.
Validation:
Conducted user interviews and collected feedback continuously to refine the hypothesis and the prototype.
User Story Mapping:
Used workshops to incorporate user feedback and validate the story map with engineering teams, guiding development priorities.
Implementation and Testing:
Integrated prototypes into the real environment, gathering immediate feedback and iterating rapidly.
Final Release:
Launched the feature after rigorous testing and validation to ensure it met user needs and business objectives.





