
dc.rights.license: Restricted to current Rensselaer faculty, staff and students. Access inquiries may be directed to the Rensselaer Libraries.
dc.contributor: Radke, Richard J., 1974-
dc.contributor: Wen, John T.
dc.contributor: Julius, Anak Agung
dc.contributor: Fajen, Brett R.
dc.contributor.author: Jivani, Devavrat Ganesh
dc.date.accessioned: 2021-11-03T09:24:52Z
dc.date.available: 2021-11-03T09:24:52Z
dc.date.created: 2021-07-07T16:14:39Z
dc.date.issued: 2020-12
dc.identifier.uri: https://hdl.handle.net/20.500.13015/2674
dc.description: December 2020
dc.description: School of Engineering
dc.description.abstract: In this thesis, we investigate the application of sensing technology to scenarios including large-scale immersive environments, industrial robot workspaces, and domestic environments. We design frameworks that tackle the individual requirements and challenges of each of these application areas. First, we look at expansive immersive environments, where the addition of multiple 3D sensors enables occupant awareness and gesture-based interactions through the application of image filtering techniques and shape approximations. This not only allows the environment to react to the presence of its users, but also lets the users interact with its vast screens through pointing and dragging. We evaluate our system in an experimental immersive space. Second, we look at industrial robot workcells, where through a combination of well-placed 3D sensors, joint configuration information of the robot, highly accurate models of its links, and 3D shape primitives, we develop a real-time safety solution. Through simulation and a physical testbed, we demonstrate that this system allows human workers to safely cohabit and collaborate with large, powerful industrial robots. Third, we employ robot-mounted sensors and image-based visual servoing on augmented reality markers to enable the flexible assembly of large structures in an industrial manufacturing scenario. We evaluate our approach in a physical testbed and establish that we can exceed metrics achieved by a fully manual process. Lastly, we integrate room- and robot-mounted sensors with an assistive robot through the use of markers to enable an activity of daily living for quadriplegic individuals, introducing the possibility of restoring some independence to their lives.
dc.description.abstract: Humans benefit when the spaces they inhabit and the machines they interact with are smarter. Intelligent machines are transitioning from mere tools to partners in many human endeavours, and hybrid human-machine systems are becoming increasingly popular. There is a need to make existing spaces interactive, collaborative, and more accommodating of human presence. From expansive theatre-like spaces to factory floors to kitchens, each space can be augmented with the addition of sensing. Networked, low-cost, consumer-grade 3D sensors have introduced the possibility of adding accurate occupancy sensing to a host of existing and emergent applications. The number and layout of these sensors can be configured according to the requirements of the application, providing a rich representation of the covered volume and its contents. This information can be processed to augment individual spaces in relevant ways. Specifically, we explore three application areas where the addition of 3D sensors and cameras can help establish a synergy between human beings and their surroundings.
dc.language.iso: ENG
dc.publisher: Rensselaer Polytechnic Institute, Troy, NY
dc.relation.ispartof: Rensselaer Theses and Dissertations Online Collection
dc.subject: Electrical engineering
dc.title: Enabling human-machine coexistence using depth sensors
dc.type: Electronic thesis
dc.type: Thesis
dc.digitool.pid: 180519
dc.digitool.pid: 180520
dc.digitool.pid: 180521
dc.rights.holder: This electronic version is a licensed copy owned by Rensselaer Polytechnic Institute, Troy, NY. Copyright of original work retained by author.
dc.description.degree: PhD
dc.relation.department: Dept. of Electrical, Computer, and Systems Engineering

