Twenty years of research. Patented technology.
The Polyopticon emerges from two decades of independent research at the intersection of cognitive science, human-computer interaction, and embodied cognition. The core technology—using motion to navigate timeline objects—is now protected intellectual property, with conceptual foundations extending from Victorian-era optics to contemporary neuroscience.
Patent portfolio
Foundational intellectual property protecting the temporal-haptic interface paradigm.
U.S. Patent No. 12,197,471 B2
Systems and Methods for Displaying and Manipulating Timeline Objects Using Motion
Filed March 3, 2023 • Issued January 14, 2025 • Inventor: Joseph Raimondo
What we have chosen to patent is the use of motion to navigate timeline objects. The patent establishes foundational protection for temporal-haptic interfaces—the core innovation enabling physical manipulation of time-based data structures.
Protected technologies
Geodesic lens algorithm
FPGA-implemented processing that maps physical device motion to navigation through temporal data structures, enabling natural exploration of timeline objects.
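As a conceptual illustration only (the patented geodesic lens is FPGA-implemented and its internals are not described here), the motion-to-navigation idea can be sketched in a few lines: tilt angle drives scrub rate along a timeline, with a quadratic gain curve invented for this example so that small tilts give fine control and large tilts move fast.

```python
import math

# Conceptual sketch only: the real geodesic lens algorithm is FPGA-based
# and proprietary. The gain curve below is invented for illustration.
def motion_to_timeline_delta(tilt_rad: float, dt: float,
                             max_rate: float = 10.0) -> float:
    """Timeline seconds to move for a tilt angle held for dt seconds."""
    # Quadratic gain: small tilts scrub finely, large tilts scrub fast,
    # saturating at max_rate once the device is tilted 45 degrees.
    norm = min(abs(tilt_rad), math.pi / 4) / (math.pi / 4)
    return math.copysign(max_rate * norm ** 2, tilt_rad) * dt
```

Tilting the other way scrubs backward; holding the device level leaves the timeline still.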
4DXML encoding
Four-dimensional extensible markup language for temporal-spatial data representation—enabling persistence and interchange of timeline objects and their relationships.
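The 4DXML schema itself is not published, so the element and attribute names below are invented; the snippet only illustrates the general idea of tagging a piece of content with both temporal and spatial coordinates so it can persist and be interchanged.

```python
import xml.etree.ElementTree as ET

# Hypothetical example: "event", "t", "x/y/z", and "content" are invented
# names for illustration, not the actual 4DXML vocabulary.
event = ET.Element("event", {
    "t": "1969-07-20T20:17:00Z",              # temporal coordinate
    "x": "0.41", "y": "-0.12", "z": "0.88",   # spatial position in the workspace
})
ET.SubElement(event, "content").text = "Lunar landing telemetry"
print(ET.tostring(event, encoding="unicode"))
```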
Timeprint generation
Compressed state capture combining content data, accelerometer input, and system timestamps into reviewable, shareable recordings of interaction sequences.
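As a rough sketch of the idea (not the patented format: the field names and the compression scheme here are placeholders), a timeprint can be pictured as a compressed record bundling content state, motion input, and a timestamp:

```python
import json, time, zlib

# Placeholder sketch of a timeprint record; every name and the use of
# JSON + zlib are illustrative choices, not the patented encoding.
def make_timeprint(content_state: dict, accel_samples: list) -> bytes:
    record = {
        "timestamp": time.time(),   # system clock at capture
        "content": content_state,   # snapshot of the timeline object's state
        "accel": accel_samples,     # raw (x, y, z) motion input
    }
    return zlib.compress(json.dumps(record).encode("utf-8"))

def replay_timeprint(blob: bytes) -> dict:
    # Decompress and decode for review or replay.
    return json.loads(zlib.decompress(blob).decode("utf-8"))
```

A replayed record carries both what was captured and the motion that produced it, which is what makes timeprints reviewable and shareable.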
Truncated icosahedral navigation
32 degrees of navigational freedom derived from geodesic geometry—the mathematical foundation for multi-face display topology and spatial workspace organization.
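The 32 faces come directly from the truncated icosahedron, which has 12 pentagonal and 20 hexagonal faces; a few lines of arithmetic confirm the counts and Euler's polyhedron formula:

```python
# Face inventory of a truncated icosahedron (the "soccer ball" solid).
PENTAGONS, HEXAGONS = 12, 20
faces = PENTAGONS + HEXAGONS                      # 32 faces in total
edges = (PENTAGONS * 5 + HEXAGONS * 6) // 2       # each edge borders two faces
vertices = (PENTAGONS * 5 + HEXAGONS * 6) // 3    # three faces meet at each vertex

assert faces == 32
assert vertices - edges + faces == 2  # Euler's formula: V - E + F = 2
print(faces, edges, vertices)         # 32 90 60
```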
Patent claims: All 15 claims accepted
The USPTO accepted all 15 claims covering the foundational technology, including method claims for timeline object manipulation, apparatus claims for the geodesic lens implementation, and system claims for the integrated temporal-haptic interface architecture.
Key claim coverage includes: motion-based navigation of temporal data, physical verb recognition from accelerometer patterns, timeprint generation and replay mechanisms, and the geodesic compression algorithm for efficient state encoding.
Continuation patent applications
Building on the foundational patent, continuation applications extend protection to specific implementations, industry applications, and technological integrations.
Spherical multi-face display
Extension of geodesic lens algorithm across 32 MicroLED display faces—the physical realization of the truncated icosahedral topology protected in the base patent.
Collaborative timeline sharing
Multi-user artifact sharing enabling researchers to navigate and explore timeline artifacts generated by colleagues—extending the single-user foundation to team environments.
Physical verb recognition
Machine learning integration for recognizing and classifying motion patterns into programmable agent behaviors—the “3-D macro” capability for automated workflows.
Cross-face content flow
Algorithms for maintaining temporal coherence as content moves between display faces, enabling seamless workspace navigation across the spherical surface.
Scientific research systems
Specialized implementation for laboratory data analysis, hypothesis tracking, and experimental timeline management—the “Bloomberg for Scientists” application.
Defense and intelligence
Classified environment implementation for satellite tracking, multi-source intelligence fusion, and collaborative situation awareness in operations centers.
Research lineage
The intellectual foundations span from Victorian optical devices to contemporary cognitive science.
From panorama to Polyopticon
The Polyopticon concept draws on a rich tradition of immersive visual technologies. The panorama (late 18th century), diorama (1820s), and stereoscope (1838) all explored how spatial arrangement of visual information affects comprehension and engagement. These devices recognized something modern computing has largely forgotten: the physical topology of information display shapes how we think.
The Polyopticon’s 32-face geodesic topology is a contemporary answer to the same question these inventors were asking: How do we create interfaces that work with embodied cognition rather than against it?
Embodied cognition
Cognition is not confined to the brain—it emerges from the dynamic interaction of brain, body, and environment. The Polyopticon’s haptic interface leverages this insight, making physical manipulation a pathway to understanding rather than a translation layer.
Maturana & Varela, The Tree of Knowledge (1987)
Enactive perception
Perception is not passive reception—it’s active exploration. We understand the world by acting on it. The temporal-haptic interface embodies this principle: you comprehend timeline data by physically moving through it.
Varela, Thompson & Rosch, The Embodied Mind (1991)
Tacit knowledge
We know more than we can tell. Expertise resists articulation. The timeprint mechanism addresses this directly—capturing the practiced motions and decision sequences of experts in a form that can be replayed and studied.
Michael Polanyi, The Tacit Dimension (1966)
Distributed cognition
Cognitive work is distributed across individuals, artifacts, and representations. The Polyopticon serves as a cognitive artifact that extends individual capability—timeline objects externalize reasoning in reviewable, shareable form.
Edwin Hutchins, Cognition in the Wild (1995)
Enactive heuristics: A new framework
The Polyopticon research program has developed a theoretical framework we call enactive heuristics—the study of how physical interaction patterns become reliable problem-solving methods. This framework bridges embodied cognition theory and practical interface design.
Physical verbs
Recurring motion patterns that carry semantic meaning. A “sweep” means survey; a “tap-hold” means inspect; a “shake” means reset. These emerge from use and become a vocabulary of action.
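A toy classifier makes the idea concrete. The verb names come from the text above, but the features and thresholds below are invented for illustration; the actual system relies on machine-learned recognition of accelerometer patterns.

```python
import statistics

# Toy sketch: map a short accelerometer trace to a physical verb.
# The variance-based features and cutoffs are invented for illustration.
def classify_verb(accel_magnitudes: list, duration_s: float) -> str:
    spread = statistics.pstdev(accel_magnitudes)
    if spread > 5.0:                         # violent, high-variance motion
        return "shake"                       # -> reset
    if spread < 0.2 and duration_s > 1.0:    # steady contact, held
        return "tap-hold"                    # -> inspect
    return "sweep"                           # -> survey
```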
Temporal embedding
The user’s conception of time becomes encoded in the interaction. Timeprints don’t just record what happened—they capture how the user experienced temporal flow during the interaction.
Somatic intuition
Knowledge that lives in the body. Expert users develop felt sense for data states—they know something is wrong before they can articulate why. The haptic interface cultivates this capacity.
Deep dive: From tacit to transmissible
The fundamental challenge in knowledge management is the tacit/explicit gap. Explicit knowledge—facts, procedures, documentation—is readily captured and transmitted. Tacit knowledge—judgment, intuition, embodied skill—resists articulation and typically transfers only through extended apprenticeship.
Traditional approaches attempt to translate tacit knowledge into explicit form: interviews, process documentation, best practice guides. But translation inevitably loses information. The nuances of expert practice don’t survive reduction to text.
The temporal-haptic approach takes a different path. Rather than translating tacit knowledge to explicit form, timeprints preserve it in its native medium—embodied action. A novice replaying an expert’s timeprint doesn’t read about what the expert did; they experience the temporal flow of expert practice. The knowledge transfers without translation.
This is why we say generating haptic agents—3-D macros—is a key enabling technology. Rather than needing to translate a series of activities into a script that mediates action, the user’s own physical motion is stored, embedding their conception of time into the process.
Research domains
The Polyopticon draws on and contributes to multiple research disciplines.
Tangible user interfaces
Physical objects as computing interfaces. The Polyopticon extends TUI research by making the entire device a manipulation surface—every face an interaction zone, every motion meaningful.
Fishkin, K.P., “A Taxonomy for and Analysis of Tangible Interfaces” (2004)
3D user interfaces
Interaction in three-dimensional space. Unlike VR/AR approaches that simulate 3D, the Polyopticon provides genuine physical dimensionality—real objects, real motion, real haptic feedback.
Bowman et al., 3D User Interfaces: Theory and Practice (2004)
Information visualization
Making data visible and comprehensible. The Polyopticon’s contribution is adding temporal and physical dimensions to visualization—data not just seen but navigated, explored, physically manipulated.
Tufte, The Visual Display of Quantitative Information (1983)
Computer-supported cooperative work
Technology for collaboration. Timeprint sharing and workspace synchronization contribute new mechanisms for transmitting method, not just results—enabling teams to build on each other’s approaches.
Brown & Duguid, The Social Life of Information (2000)
Lifelogging and personal informatics
Capturing and reflecting on personal experience. The timeprint concept descends from Vannevar Bush’s memex and finds contemporary expression in lifelog research—but focused on cognitive work rather than general experience.
Gemmell et al., “MyLifeBits: Fulfilling the Memex Vision” (2002)
Haptic interaction
Touch and physical feedback in computing. The Polyopticon’s pressure-sensitive surfaces, accelerometer arrays, and haptic output contribute to understanding how touch channels can carry information.
Harrison et al., “Squeeze Me, Hold Me, Tilt Me!” CHI (1998)
Bibliography
Selected foundational texts informing the Polyopticon research program.
Books
Abbott, Edwin A. Flatland. Dover Publications, 1952. [The original meditation on dimensional perception]
Alexander, Christopher, et al. A Pattern Language. Oxford University Press, 1977. [Structural approach to design]
Bowman, Doug A., et al. 3D User Interfaces: Theory and Practice. Addison-Wesley, 2004. [Comprehensive 3D interaction reference]
Brown, John Seely and Paul Duguid. The Social Life of Information. Harvard Business School Press, 2000. [Information in context]
Fogg, BJ. Persuasive Technology. Morgan Kaufmann, 2003. [Behavioral design]
Fuller, R. Buckminster. Synergetics. Macmillan, 1975; Critical Path. St. Martin’s Press, 1981. [Geodesic geometry and systems thinking]
Gelernter, David. Mirror Worlds. Oxford University Press, 1991. [Software representations of reality]
Hawkins, Jeff. On Intelligence. Owl Books, 2004. [Hierarchical temporal memory]
Huberman, Bernardo A. The Laws of the Web. MIT Press, 2002. [Information ecology]
Huizinga, Johan. Homo Ludens. The Beacon Press, 1955. [Play as fundamental human activity]
Laurel, Brenda. Computers as Theatre. Addison-Wesley, 1993. [Dramatic structure in interface design]
Lieberman, Henry. Your Wish is My Command: Programming by Example. Morgan Kaufmann, 2001. [Demonstration-based automation]
Maturana, Humberto and Francisco Varela. The Tree of Knowledge. Shambhala, 1987. [Autopoiesis and embodied cognition]
McCloud, Scott. Understanding Comics. Harper Perennial, 1993. [Sequential art and visual communication]
Morville, Peter. Ambient Findability. O’Reilly, 2005. [Information architecture and discovery]
Nelson, Theodor Holm. Computer Lib/Dream Machines. Microsoft Press, 1988. [Visionary hypertext]
Norman, Donald A. Things that Make us Smart. Addison-Wesley, 1993. [Cognitive artifacts]
Polanyi, Michael. The Tacit Dimension. Doubleday, 1966. [Implicit knowledge]
Shneiderman, Ben. Leonardo’s Laptop. MIT Press, 2002. [Human-centered computing]
Tufte, Edward. The Visual Display of Quantitative Information. Graphics Press, 1983. [Visualization principles]
Wiener, Norbert. The Human Use of Human Beings. Doubleday Anchor, 1954. [Cybernetics and society]
Winograd, Terry and Fernando Flores. Understanding Computers and Cognition. Addison-Wesley, 1986. [Phenomenological HCI]
Zuboff, Shoshana. In the Age of the Smart Machine. Basic Books, 1988. [Automation and expertise]
Selected papers and demonstrations
Agarawala & Balakrishnan. “Keepin’ it real: pushing the desktop metaphor with physics, piles and the pen.” CHI 2006.
Balakrishnan et al. “The Rockin’Mouse: Integral 3D Manipulation on a Plane.” ACM CHI ’97.
Beaudouin-Lafon. “Instrumental Interaction: An Interaction Model for Designing Post-WIMP User Interfaces.” CHI 2000.
Bush, Vannevar. “As We May Think.” Atlantic Monthly, July 1945. [The memex vision]
Buxton & Myers. “A Study in Two-Handed Input.” CHI 1986.
Gemmell et al. “MyLifeBits: Fulfilling the Memex Vision.” Microsoft Research, ACM Multimedia ’02.
Harrison et al. “Squeeze Me, Hold Me, Tilt Me! An Exploration of Manipulative User Interfaces.” CHI 1998.
Jacob et al. “Reality-based interaction: a framework for post-WIMP interfaces.” CHI 2008.
Poupyrev et al. “D20: Interaction with Multifaceted Display Devices.” CHI 2006. [Prior art on polyhedral displays]
Rekimoto. “SmartSkin: an infrastructure for freehand manipulation on interactive surfaces.” CHI 2002.
Schiphorst. “soft(n): toward a somaesthetics of touch.” CHI 2009.
Shneiderman. “Why Not Make Interfaces Better Than 3D Reality?” IEEE Computer Graphics and Applications, 2003.
Weiser. “The Computer of the 21st Century.” Scientific American, September 1991. [Ubiquitous computing vision]
Development history
Twenty years from concept to patented technology.
Early 2000s: Concept formation
Initial exploration of multi-face display concepts. Early encouragement from Dr. Norman Badler at the University of Pennsylvania, despite skepticism about 3D devices in the broader field. The core insight: physical topology of information display affects collaborative decision-making in high-stakes environments.
2010s: Theoretical refinement
Development of temporal-haptic theory. Integration of embodied cognition research with interface design principles. Articulation of the enactive heuristics framework. Continued independent research while observing technology maturation—flexible displays, FPGA advancement, mobile processing power.
2023: Patent filing
Formal patent application filed with USPTO. Claims covering the core innovation: using motion to navigate timeline objects. The geodesic lens algorithm, 4DXML encoding, and timeprint generation mechanisms documented and submitted for protection.
2025: Patent granted
U.S. Patent No. 12,197,471 B2 issued January 14, 2025. All 15 claims accepted. Foundational intellectual property protection established for the temporal-haptic interface paradigm.
2025–Present: Commercialization
Active development of continuation applications, funding strategies, and partnership opportunities. Exploration of government funding through NSF SBIR programs and defense applications. Engagement with academic institutions and potential corporate partners.
Intellectual property strategy
The most promising path is to license the intellectual property and partner with experienced market players to bring the whole product to market. Licensing the device technology and platform to established players is preferred over original equipment manufacturing.
Technology licensing
Core patents available for licensing to device manufacturers, platform developers, and enterprise software vendors seeking to integrate temporal-haptic capabilities.
Strategic partnerships
Collaboration with established players having deep pockets, targeted market channels, and sophistication in integrating complex IP portfolios into new product offerings.
Research collaboration
Academic partnerships for continued development. PhD-level research opportunities in embodied cognition, spatial computing, and human-computer interaction.
Licensing inquiries welcome
For licensing discussions, partnership opportunities, or research collaboration, contact Design Anticipation LLC.