As of August 2016, Intuitive IP has engaged a new augmented reality/virtual reality intraoperative IP startup, NavLab, a company focused on developing IP to improve surgical workflow – from surgical planning to performance – and ultimately to improve patient outcomes. The work has several components: using artificial intelligence (AI) to assist with diagnosis, image analysis, virtual surgery, and surgical planning; using virtual plans in the operating theater to guide the surgeon through augmented/mixed reality (AR/MR) or virtual reality (VR) systems; and technology to complement robot-assisted surgery. A virtual surgical toolset that augments hand-held surgical instruments and complements MR/VR-assisted surgery will be a mainstay of the surgical theater in the future.
How the surgeon interacts with these systems before, during, and after surgery is being explored so that both the surgeon and the system can learn and improve outcomes for current and future patients. The ability to directly overlay computer models of patient imaging, along with the surgeon's virtual surgical plan, will allow for more accurate, more efficient, and less invasive surgery. Computer software will be used extensively to plan surgery with both computer algorithms and surgeon input, to suggest ideal pathways for surgical intervention based on image analysis, and to let surgeons virtually perform the surgery before entering the operating room. This technology will also augment the surgeon's intraoperative knowledge of the anatomy, so they can adjust the placement of traditional hand-held instruments or of the robotic arm(s) used in robot-assisted surgery.
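Directly overlaying a computer model of patient imaging on the patient's anatomy requires registering the preoperative model to the intraoperative coordinate frame. A minimal sketch of one common approach is paired-point rigid registration (the Kabsch/SVD method), shown below with NumPy; the function name, fiducial coordinates, and transform values are illustrative assumptions, not part of NavLab's actual methods.

```python
import numpy as np

def rigid_registration(model_pts, patient_pts):
    """Estimate the rotation R and translation t that map corresponding
    model points onto patient points (paired-point Kabsch method via SVD)."""
    cm = model_pts.mean(axis=0)           # centroid of model fiducials
    cp = patient_pts.mean(axis=0)         # centroid of patient fiducials
    H = (model_pts - cm).T @ (patient_pts - cp)   # cross-covariance matrix
    U, _, Vt = np.linalg.svd(H)
    # Correct for a possible reflection so R is a proper rotation.
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    t = cp - R @ cm
    return R, t

# Illustrative fiducials: landmarks identified in the preoperative model
# and the same landmarks digitized on the patient intraoperatively (mm).
model = np.array([[0.0, 0.0, 0.0], [10, 0, 0], [0, 10, 0], [0, 0, 10]])
theta = np.radians(30)
R_true = np.array([[np.cos(theta), -np.sin(theta), 0],
                   [np.sin(theta),  np.cos(theta), 0],
                   [0.0,            0.0,           1.0]])
patient = model @ R_true.T + np.array([5.0, -2.0, 3.0])

R, t = rigid_registration(model, patient)
residual = np.linalg.norm(model @ R.T + t - patient)  # ~0 for exact data
```

With the recovered transform, any point of the preoperative model or virtual plan can be mapped into patient coordinates for display in an AR/MR headset.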
THE PROBLEM NAVLAB AIMS TO ADDRESS
CEO Justin Esterberg, MD, founded NavLab in late 2015 after recognizing that the technology available to surgeons before, during, and after surgery for tagging medical information for real-time retrieval and referencing was consistently fragmented and antiquated. This information includes diagnostic imaging tags, aspects of the patient's electronic medical record (EMR), and other patient-specific data.
While performing a procedure, a surgeon does not currently have direct control of the information needed for reference: X-rays, MRI images, CT scans, key aspects of the patient's EMR, or other parameters of interest. A surgeon can generally plan for the surgery about to be performed (reviewing the images, anatomy textbooks, etc.), scroll through images on a desktop computer, and make some drawings, but has no easy access to these plans while scrubbed into surgery. Additionally, there is no systematic method to analyze the placement of surgical implants virtually, intraoperatively, or postoperatively and compare it to the preoperative plan, so that both the system and the surgeon can learn to optimize future surgical interventions. The current methods of performing traditional surgery and evaluating implants are essentially analog.
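Comparing achieved implant placement against the preoperative plan can be reduced to two simple metrics: the offset of the implant tip and the angular deviation of its axis. The sketch below shows this computation in NumPy under assumed inputs; the function name and the example trajectory values are hypothetical, not drawn from NavLab's system.

```python
import numpy as np

def implant_deviation(planned_tip, planned_axis, actual_tip, actual_axis):
    """Quantify how far an achieved implant pose deviates from the plan:
    tip offset (same units as the inputs) and axis deviation in degrees."""
    offset = float(np.linalg.norm(actual_tip - planned_tip))
    a = planned_axis / np.linalg.norm(planned_axis)   # normalize planned axis
    b = actual_axis / np.linalg.norm(actual_axis)     # normalize actual axis
    # Clip the dot product to guard against floating-point overshoot.
    angle_deg = float(np.degrees(np.arccos(np.clip(a @ b, -1.0, 1.0))))
    return offset, angle_deg

# Illustrative values for a planned vs. achieved implant trajectory (mm).
planned_tip = np.array([12.0, 40.0, -5.0])
planned_axis = np.array([0.0, 0.0, 1.0])
actual_tip = np.array([13.0, 41.0, -5.0])
actual_axis = np.array([0.05, 0.0, 1.0])

offset, angle = implant_deviation(planned_tip, planned_axis,
                                  actual_tip, actual_axis)
```

Logged across many cases, metrics like these could feed back into both surgeon review and algorithmic refinement of future plans, which is the learning loop the passage above describes.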
THE NAVLAB SOLUTION
Intellectual property covering various methods of achieving direct, in-line-of-sight overlay of imaging data on patient anatomy, virtual pathways for instruments, and surgical user interfaces (UI) was filed in 2016 by NavLab in partnership with Intuitive IP. This IP forms an extended portfolio of multiple patents covering surgical UI, the use of AI for surgical planning and performance, the interaction of hand-held surgical instruments or robotic arms with virtual surgery, and surgeon interaction with AR/MR/VR and robotic surgical systems.
NavLab is incubating IP in the areas of surgical workflow, surgical UI, surgical AI, and methods in AR/MR/VR and robot-assisted surgery. NavLab now plans to partner with companies under the Intuitive umbrella to further develop the IP and to explore acquisition or licensing options, either for each IP product individually or for the portfolio as a whole.