Eyal Ofek

Professor

Accepting PhD Students

PhD projects: HCI, Sensing, Mixed Reality

Research activity per year: 1997–2024

Personal profile

Research interests

My research focuses on using sensing and scene understanding for Human-Computer Interaction and Mixed Reality (MR), enabling better collaboration and more inclusive workspaces. I have been granted over 120 patents and have published over 100 academic papers at leading venues, cited by more than 17,000 publications.

Prior to joining the University, I was a principal researcher at Microsoft Research (2004–2023).

I formed and led a new MR research group at Microsoft Research. I envision MR applications as woven into the fabric of our lives, unlike PC and mobile apps, which are limited to running on a specific device's screen. Such applications must be aware of users' changing physical and social contexts and be flexible enough to adapt accordingly. We developed systems such as FLARE (Fast Layout for AR Experiences), used by the HoloLens team and the Unity game engine, and the Triton 3D audio simulation, used by Microsoft's game studios and the basis of Microsoft Project Acoustics.

I formed the Bing Maps & Mobile Research Lab. We generated many novel results that impacted the product while producing world-class computer vision and graphics research. Among our results are the influential text-detection technology used by the Bing Mobile app and incorporated into OpenCV, the world's first street-side imagery service, a street-level geometry and texture reconstruction pipeline, a novel texture compression scheme used by Bing Maps, and more.

I oversaw software and algorithm R&D for the world's first time-of-flight video camera (3DV Systems' ZCam). I used our depth cameras for applications such as TV depth keying and camera-based gaming. The depth camera technology later evolved into the sensors included in the Microsoft HoloLens and Magic Leap HMDs. Another company I formed developed the popular PhotonPaint editor for the Amiga (1988). Since 2023, I have worked with DataBlanket (USA), developing AI for autonomous firefighting drones.

I am a senior member of the ACM. I was the paper chair of ACM SIGSPATIAL 2011, the haptics editor of Frontiers in Virtual Reality, and a member of the editorial board of IEEE CG&A. I serve on conference program committees such as CVPR, CHI, UIST, ISMAR, and ISS.

I have released multiple tools and open-source libraries, including the RoomAlive Toolkit [RoomAliveToolkit23], used around the world for multi-projection systems; SeeingVR, which makes VR more usable for people with low vision; the Microsoft Rocketbox avatar library; the MoveBox [MoveBox23] and HeadBox toolkits, which democratize avatar animation; and RemoteLab, for distributed user studies.

More information is available on my research page: www.eyalofek.org.

Keywords

  • Q Science (General)
  • HCI
  • Sensing
  • Mixed reality
  • Interaction
  • Haptics
  • Computer Vision
  • Scene understanding
  • Computer graphics
  • Perception
  • Smart environments
