Construction Information Technology Laboratory

Tracking Construction Site Resources with Machine Vision

Resource tracking is a vital aspect of construction project management and control systems. Tracking technologies based on radio frequency (RF) communications have dominated the market and have proven reliable and adequate in many typical construction environments. These RF technologies rely on sensors (tags) placed on each resource that are "read" by remote (satellite) or local reader devices. On small construction sites, the labor overhead generated by deploying, maintaining and decommissioning such tags and readers is small. However, on medium- and large-scale sites, where tracking is needed for thousands of materials and hundreds of personnel and equipment, the labor overhead of RF technologies becomes prohibitively costly compared to the anticipated benefits. As a result, automated tracking solutions have so far been limited to small projects and to case studies that cover only small segments of larger projects.

"Remote" tracking, defined here as tracking resources from a distance without the need for tags attached to them, has the potential to reduce this cost to feasible levels. Vision tracking is the most popular and inexpensive form of remote tracking. It finds the location of the tracked object in each frame by comparing it with the object's position in the previous frame, through a process known as image alignment. However, state-of-the-art vision tracking technologies do not have the ability to track resources automatically and in 3D. Yet.
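As a rough illustration of this frame-by-frame comparison, the sketch below follows an object's 2D position through a video using simple template matching in OpenCV. The matchTemplate-based matching, the track_2d function, and the assumption that the object's initial bounding box is already known are illustrative choices only, not the laboratory's actual tracking algorithm.

    import cv2

    def track_2d(video_path, bbox):
        # Illustrative sketch: follow one object through a video by comparing
        # each frame with the object's appearance in the previous frame.
        x, y, w, h = bbox                        # initial box, assumed known
        cap = cv2.VideoCapture(video_path)
        ok, frame = cap.read()
        if not ok:
            return []
        gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
        template = gray[y:y + h, x:x + w]        # appearance in previous frame
        positions = [(x, y)]

        while True:
            ok, frame = cap.read()
            if not ok:
                break
            gray = cv2.cvtColor(frame, cv2.COLOR_BGR2GRAY)
            # Compare the current frame with the stored appearance and take
            # the best-matching location as the object's new 2D position.
            scores = cv2.matchTemplate(gray, template, cv2.TM_CCOEFF_NORMED)
            _, _, _, (x, y) = cv2.minMaxLoc(scores)
            positions.append((x, y))
            template = gray[y:y + h, x:x + w]    # update the appearance model

        cap.release()
        return positions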


Doctoral student Man-Woo Park and CEE Assistant Professor Ioannis Brilakis, with support from the U.S. National Science Foundation (Grant # 0933931), are validating the ability of a novel framework to track multiple resources simultaneously from stationary construction cameras. The framework starts by recognizing the most common project-related resources (materials, personnel and equipment) in each video view, matching them across all views, and using the result to initialize independent tracking sequences. This way, in each subsequent frame, the 2D positions of a matched entity can be used to calculate its location in space. The NSF-funded project aims to measure the performance of each step using established metrics and to compare it to the performance of RF technologies. If comparable or better performance is demonstrated, the cost and time savings from eliminating the tags on each entity and replacing them with a drastically smaller number of cameras will make this work a breakthrough for the construction industry, one that will change the direction of project monitoring and control technologies.
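To give a sense of the final 2D-to-3D step, the sketch below triangulates one entity's matched pixel positions from two stationary, calibrated camera views. The projection matrices and pixel coordinates are placeholder values, and OpenCV's triangulatePoints serves here only as a stand-in for whatever geometric method the framework ultimately employs.

    import cv2
    import numpy as np

    # Placeholder 3x4 projection matrices for two calibrated cameras; in
    # practice these would come from a separate camera-calibration step.
    P1 = np.eye(3, 4)
    P2 = np.hstack([np.eye(3), np.array([[-0.5], [0.0], [0.0]])])

    # The same entity's 2D position as matched in the two views (placeholders).
    pt1 = np.array([[320.0], [240.0]])
    pt2 = np.array([[300.0], [240.0]])

    homogeneous = cv2.triangulatePoints(P1, P2, pt1, pt2)    # 4x1 result
    location_3d = (homogeneous[:3] / homogeneous[3]).ravel()
    print(location_3d)                                        # x, y, z in space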

Ph.D. candidate Man-Woo Park started his Ph.D. program in fall 2008 at Georgia Tech and joined the Construction Information Technology Laboratory (CITL) in spring 2009. Park is interested in sensing technologies for construction engineering and is currently conducting research on vision tracking. His primary research interests lie at the intersection of machine vision and construction engineering. The long-term goal of his research is to create tools that improve construction processes, drawing on his knowledge and experience in structural engineering.