<?xml version="1.0"?>
<feed xmlns="http://www.w3.org/2005/Atom" xml:lang="en">
	<id>https://mw.hh.se/caisr/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=BjornAstrand</id>
	<title>ISLAB/CAISR - User contributions [en]</title>
	<link rel="self" type="application/atom+xml" href="https://mw.hh.se/caisr/api.php?action=feedcontributions&amp;feedformat=atom&amp;user=BjornAstrand"/>
	<link rel="alternate" type="text/html" href="https://mw.hh.se/caisr/index.php?title=Special:Contributions/BjornAstrand"/>
	<updated>2026-04-04T23:52:26Z</updated>
	<subtitle>User contributions</subtitle>
	<generator>MediaWiki 1.35.13</generator>
	<entry>
		<id>https://mw.hh.se/caisr/index.php?title=On_control_of_robots_in_remote_workspaces_using_lasers&amp;diff=3610</id>
		<title>On control of robots in remote workspaces using lasers</title>
		<link rel="alternate" type="text/html" href="https://mw.hh.se/caisr/index.php?title=On_control_of_robots_in_remote_workspaces_using_lasers&amp;diff=3610"/>
		<updated>2017-10-10T07:51:01Z</updated>

		<summary type="html">&lt;p&gt;BjornAstrand: Created page with &amp;quot;{{StudentProjectTemplate |Summary=Supervised autonomy for controlling robots at remote locations. Lasers and ultrasonic ranging to be used. |Keywords=Mechatronics, laser point...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{StudentProjectTemplate&lt;br /&gt;
|Summary=Supervised autonomy for controlling robots at remote locations. Lasers and ultrasonic ranging to be used.&lt;br /&gt;
|Keywords=Mechatronics, laser pointers, ultrasonic ranging, Supervised Autonomy, “man-in-the-feedback-loop”, IMU. &lt;br /&gt;
|TimeFrame=October 2017 to June 2018, with possible extension to September 2018&lt;br /&gt;
|References=In the video clip, http://mobile-robotics.com/ragvald.php , a rate gyro is used to stabilize the heading of a mini UGV prototype.&lt;br /&gt;
&lt;br /&gt;
|Supervisor=Björn Åstrand, &lt;br /&gt;
|Level=Master&lt;br /&gt;
|Status=Open&lt;br /&gt;
}}&lt;br /&gt;
Objective and research questions &lt;br /&gt;
The goal is to build a mobile robot to test how robots can be controlled relative to objects in a remote workspace. Sensors aboard the robot are lasers, rate gyros/accelerometers and an ultrasonic ranger. The driver&amp;#039;s commands to the robot should be issued remotely using a navigation camera and laser pointers. The long-term goal of the project is to “understand fundamental limits” of tele-robotics, through models &amp;amp; tests.&lt;br /&gt;
&lt;br /&gt;
Background &lt;br /&gt;
To illustrate the technology, consider the video sequence: http://www.mobile-robotics.com/ugv/ &lt;br /&gt;
It shows a rate-gyro-stabilized 4-wheel mini-ATV that runs on a surface of gravel and stones. The direction of the vehicle is set by the driver by pointing the navigation camera. Gyro-switched 90° turns and U-turns are also shown. A second video illustrates an instability of the ”phase-windup” type. &lt;br /&gt;
&lt;br /&gt;
The robot is to be controlled using three laser pointers mounted on the robot: &lt;br /&gt;
- a laser fixed on the robot that indicates the robot&amp;#039;s heading, &lt;br /&gt;
- a laser pointer with which the robot indicates where it will drive under the current control; this acts as a virtual steering wheel, &lt;br /&gt;
- a laser pointer that the driver uses to specify where the robot should move. Feedback to the driver is provided via a forward-looking navigation camera.&lt;br /&gt;
 &lt;br /&gt;
The robot&amp;#039;s hardware is two servo motors with wheel encoders. This motor package will be the &amp;quot;driving front axle&amp;quot; of the mobile robot. The motors with wheel encoders and the drive electronics are ready. A Raspberry Pi with a sensor card (SensorHat) is mounted on the robot together with the camera. &lt;br /&gt;
&lt;br /&gt;
Project steps &lt;br /&gt;
&lt;br /&gt;
WP1: Literature review and basic motion – getting started &lt;br /&gt;
The two motors form a “wheelchair” (differential-drive) robot. The first test is to drive along an 8-shaped trajectory, as sketched below. A laser pointer on an RC servo is to be used as a virtual steering wheel, and a caster wheel provides balance.  &lt;br /&gt;
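A minimal sketch of how the 8-shaped trajectory could be commanded with a unicycle/differential-drive model is given below; the wheel base, speed and lobe radius are assumed values for illustration only, not project specifications.&lt;br /&gt;
&lt;pre&gt;
# Minimal sketch (assumed values): wheel-speed commands for an 8-shaped
# trajectory with a differential-drive (wheelchair) robot, unicycle model.
import math

WHEEL_BASE = 0.30   # m, assumed distance between the two drive wheels
V = 0.25            # m/s, assumed constant forward speed
R = 0.50            # m, assumed radius of each lobe of the figure eight

def wheel_speeds(v, omega):
    # convert forward speed v and turn rate omega into left/right wheel speeds
    return v - omega * WHEEL_BASE / 2.0, v + omega * WHEEL_BASE / 2.0

def figure_eight_command(t):
    # one lap of each lobe takes 2*pi*R/V seconds; the turn rate then flips sign,
    # which produces the second lobe of the eight
    lap_time = 2.0 * math.pi * R / V
    omega = V / R if (t % (2.0 * lap_time)) &amp;lt; lap_time else -V / R
    return wheel_speeds(V, omega)
&lt;/pre&gt;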
&lt;br /&gt;
WP2: Car type of robot &lt;br /&gt;
The pair of motors above will be the front axle of a car-type robot. &lt;br /&gt;
&lt;br /&gt;
WP3: Final tests – reversing with a trailer behind the car robot &lt;br /&gt;
The driver uses a laser pointer for control. Is this faster than manual control? The trailer geometry is learned by driving forward, then reversing. Applications include vehicles in mines, forests, construction sites, etc.&lt;/div&gt;</summary>
		<author><name>BjornAstrand</name></author>
	</entry>
	<entry>
		<id>https://mw.hh.se/caisr/index.php?title=Safety_assurance_for_human_in_automated_crane_environment&amp;diff=3577</id>
		<title>Safety assurance for human in automated crane environment</title>
		<link rel="alternate" type="text/html" href="https://mw.hh.se/caisr/index.php?title=Safety_assurance_for_human_in_automated_crane_environment&amp;diff=3577"/>
		<updated>2017-09-28T10:56:13Z</updated>

		<summary type="html">&lt;p&gt;BjornAstrand: Created page with &amp;quot;{{StudentProjectTemplate |Summary=safety assurance for human in automated crane environment |Keywords=Robot perception |TimeFrame=October 2017 to June 2018, with possible exte...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{StudentProjectTemplate&lt;br /&gt;
|Summary=Safety assurance for humans in an automated crane environment&lt;br /&gt;
|Keywords=Robot perception&lt;br /&gt;
|TimeFrame=October 2017 to June 2018, with possible extension to September 2018&lt;br /&gt;
|Supervisor=Björn Åstrand, Josef Bigun&lt;br /&gt;
|Level=Master&lt;br /&gt;
|Status=Open&lt;br /&gt;
}}&lt;br /&gt;
A central issue for automated systems is safety; the system must not harm humans or damage objects in the environment. Safety concerns have become increasingly important as the use of automated systems has spread, and advances in sensor technology, sensor integration, and object detection and avoidance have been more widely adopted.&lt;br /&gt;
&lt;br /&gt;
The aim of the project is to develop algorithms and methods for safety assurance for humans in an automated crane environment.&lt;/div&gt;</summary>
		<author><name>BjornAstrand</name></author>
	</entry>
	<entry>
		<id>https://mw.hh.se/caisr/index.php?title=SAS2&amp;diff=3506</id>
		<title>SAS2</title>
		<link rel="alternate" type="text/html" href="https://mw.hh.se/caisr/index.php?title=SAS2&amp;diff=3506"/>
		<updated>2017-09-22T14:47:26Z</updated>

		<summary type="html">&lt;p&gt;BjornAstrand: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{ResearchProjInfo&lt;br /&gt;
|Title=SAS2 – Situation Aware Safety Systems&lt;br /&gt;
|ContactInformation=Björn Åstrand&lt;br /&gt;
|Description=The goal of this project is to develop a safety system that can better handle the complexity of the environments in which AGV systems operate.&lt;br /&gt;
 &lt;br /&gt;
The approach is to use 3D perception along with methods to detect, track and identify objects in the environment, such that the actions of moving objects can be foreseen and decisions can be made based on object identities (humans and other trucks/AGVs).&lt;br /&gt;
|ProjectResponsible=Björn Åstrand&lt;br /&gt;
|ProjectStart=2016/01/01&lt;br /&gt;
|ProjectEnd=2018/12/31&lt;br /&gt;
|ApplicationArea=Mechatronics&lt;br /&gt;
}}&lt;br /&gt;
{{AssignPublicationProject&lt;br /&gt;
|Publication=Hedenberg, Klas &amp;amp; Åstrand, Björn, 3D Sensors on Driverless Trucks for Detection of Overhanging Objects in the Pathway, Autonomous Industrial Vehicles: From the Laboratory to the Factory Floor, pp. 41-56, 2016&lt;br /&gt;
}}&lt;br /&gt;
{{AssignProjPartner&lt;br /&gt;
|projectpartner=Volvo Group - Trucks Technology - Advanced Technology and Research&lt;br /&gt;
}}&lt;br /&gt;
{{AssignProjPartner&lt;br /&gt;
|projectpartner=Fotonic AB&lt;br /&gt;
}}&lt;br /&gt;
{{AssignProjPartner&lt;br /&gt;
|projectpartner=Toyota Material Handling Europe&lt;br /&gt;
}}&lt;br /&gt;
{{AssignProjPartner&lt;br /&gt;
|projectpartner=Kollmorgen Automation AB&lt;br /&gt;
}}&lt;br /&gt;
Research and development are done in three different areas:&lt;br /&gt;
&lt;br /&gt;
Adaptive Warning Fields: How to automatically construct and represent a volume of interest for safe traversal by the driverless truck? Which sensors to use, and how to integrate information from different sources, e.g. other sensors and maps (multi-layer)?&lt;br /&gt;
&lt;br /&gt;
Tracking, Prediction, Identification and Reasoning: How to improve the performance of the safety system of a driverless truck by raising the system&amp;#039;s awareness through new functionalities such as identification and tracking of objects/agents? &lt;br /&gt;
&lt;br /&gt;
Benchmarking: How to evaluate and benchmark the performance of a safety system solution on a driverless truck for object detection and protection?&lt;/div&gt;</summary>
		<author><name>BjornAstrand</name></author>
	</entry>
	<entry>
		<id>https://mw.hh.se/caisr/index.php?title=SAS2&amp;diff=3505</id>
		<title>SAS2</title>
		<link rel="alternate" type="text/html" href="https://mw.hh.se/caisr/index.php?title=SAS2&amp;diff=3505"/>
		<updated>2017-09-22T14:46:31Z</updated>

		<summary type="html">&lt;p&gt;BjornAstrand: Created page with &amp;quot;{{ResearchProjInfo |Title=SAS2 – Situation Aware Safety Systems |ContactInformation=Björn Åstrand |Description=The goal with this project is to develop a safety system tha...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{ResearchProjInfo&lt;br /&gt;
|Title=SAS2 – Situation Aware Safety Systems&lt;br /&gt;
|ContactInformation=Björn Åstrand&lt;br /&gt;
|Description=The goal of this project is to develop a safety system that can better handle the complexity of the environments in which AGV systems operate.&lt;br /&gt;
 &lt;br /&gt;
The approach is to use 3D perception along with methods to detect, track and identify objects in the environment, such that the actions of moving objects can be foreseen and decisions can be made based on object identities (humans and other trucks/AGVs).&lt;br /&gt;
&lt;br /&gt;
|ProjectResponsible=Björn Åstrand&lt;br /&gt;
|ApplicationArea=Mechatronics&lt;br /&gt;
}}&lt;br /&gt;
{{AssignPublicationProject&lt;br /&gt;
|Publication=Hedenberg, Klas &amp;amp; Åstrand, Björn, 3D Sensors on Driverless Trucks for Detection of Overhanging Objects in the Pathway, Autonomous Industrial Vehicles: From the Laboratory to the Factory Floor, pp. 41-56, 2016&lt;br /&gt;
}}&lt;br /&gt;
{{AssignProjPartner&lt;br /&gt;
|projectpartner=Volvo Group - Trucks Technology - Advanced Technology and Research&lt;br /&gt;
}}&lt;br /&gt;
{{AssignProjPartner&lt;br /&gt;
|projectpartner=Fotonic AB&lt;br /&gt;
}}&lt;br /&gt;
{{AssignProjPartner&lt;br /&gt;
|projectpartner=Toyota Material Handling Europe&lt;br /&gt;
}}&lt;br /&gt;
{{AssignProjPartner&lt;br /&gt;
|projectpartner=Kollmorgen Automation AB&lt;br /&gt;
}}&lt;br /&gt;
Research and development are done in three different areas:&lt;br /&gt;
&lt;br /&gt;
Adaptive Warning Fields: How to automatically construct and represent a volume of interest for safe traversal by the driverless truck? Which sensors to use, and how to integrate information from different sources, e.g. other sensors and maps (multi-layer)?&lt;br /&gt;
&lt;br /&gt;
Tracking, Prediction, Identification and Reasoning: How to improve the performance of the safety system of a driverless truck by raising the system&amp;#039;s awareness through new functionalities such as identification and tracking of objects/agents? &lt;br /&gt;
&lt;br /&gt;
Benchmarking: How to evaluate and benchmark the performance of a safety system solution on a driverless truck for object detection and protection?&lt;/div&gt;</summary>
		<author><name>BjornAstrand</name></author>
	</entry>
	<entry>
		<id>https://mw.hh.se/caisr/index.php?title=Adaptive_warning_field_system&amp;diff=3504</id>
		<title>Adaptive warning field system</title>
		<link rel="alternate" type="text/html" href="https://mw.hh.se/caisr/index.php?title=Adaptive_warning_field_system&amp;diff=3504"/>
		<updated>2017-09-22T14:30:03Z</updated>

		<summary type="html">&lt;p&gt;BjornAstrand: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{StudentProjectTemplate&lt;br /&gt;
|Summary=Adaptive warning field system&lt;br /&gt;
|Programme=Mobile and Autonomous Systems&lt;br /&gt;
|Keywords=3D perception, mapping,&lt;br /&gt;
|TimeFrame=January 2017 until June 2017, with possible extension until September 2017&lt;br /&gt;
|References=SAS2-project, http://islab.hh.se/mediawiki/SAS2&lt;br /&gt;
ROS - Robot Operating System, http://www.ros.org/&lt;br /&gt;
OpenCv - http://opencv.org/ &lt;br /&gt;
&lt;br /&gt;
Nemati, Hassan, Åstrand, Björn (2014). Tracking of People in Paper Mill Warehouse Using Laser Range Sensor. 2014 UKSim-AMSS 8th European Modelling Symposium, EMS 2014, Pisa, Italy, 21-23 October, 2014.&lt;br /&gt;
&lt;br /&gt;
Power, P. Wayne, and Johann A. Schoonees. &amp;quot;Understanding background mixture models for foreground segmentation.&amp;quot; Proceedings image and vision computing New Zealand. Vol. 2002. 2002.&lt;br /&gt;
|Prerequisites=Image analysis, programming skills (preferably C++ or Python)&lt;br /&gt;
|Supervisor=Björn Åstrand,&lt;br /&gt;
|Level=Master&lt;br /&gt;
|Status=Finished&lt;br /&gt;
}}&lt;br /&gt;
A central issue for robots and automated guided vehicles (AGVs) is safety; the robot or AGV must not harm humans or damage objects in the environment. Safety concerns have become increasingly important as the use of AGVs has spread, and advances in sensor technology, sensor integration, and object detection and avoidance have been more widely adopted. Today’s safety systems only work in 2D and consist of a protection field of static size and a speed-adaptive warning field – the higher the speed, the larger the field. However, the size of the warning field is usually hard-coded and heavily tied to the AGV route. Setting up such a system is costly, and adjusting it for proper and efficient operation takes a long time.&lt;br /&gt;
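As a rough illustration of the speed dependence only (a minimal sketch; the reaction time, deceleration and safety margin below are assumed values, not parameters of any specific AGV safety system):&lt;br /&gt;
&lt;pre&gt;
# Minimal sketch (assumed values): length of a speed-adaptive warning field
# as reaction distance plus braking distance plus a fixed safety margin.
def warning_field_length(v, t_react=0.5, decel=1.0, margin=0.5):
    # v in m/s, t_react in s, decel in m/s^2, margin in m (all assumed)
    return v * t_react + v * v / (2.0 * decel) + margin

for v in (0.5, 1.0, 2.0):
    print(v, round(warning_field_length(v), 2))  # higher speed, larger field
&lt;/pre&gt;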
&lt;br /&gt;
The goal of this project [a subset of the SAS2 project] is to develop a safety system based on an adaptive warning field that autonomously learns a foreground model (static and dynamic obstacles) and a background model (static objects). The approach is to use 3D perception and to combine a method that continuously segments foreground from background (e.g. an optical flow approach or Gaussian mixture models) with a method that uses a geometric map to filter out the foreground. A challenge is how to update/learn the geometric map. &lt;br /&gt;
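A minimal sketch of the segmentation idea, assuming OpenCV&amp;#039;s Gaussian-mixture background subtractor and a hypothetical mask derived from the geometric map; the parameter values and input format are illustrative assumptions, not the prescribed method:&lt;br /&gt;
&lt;pre&gt;
# Minimal sketch (assumptions noted): per-frame foreground/background split
# with a Gaussian mixture model, then filtering with a geometric (map) mask.
import cv2

subtractor = cv2.createBackgroundSubtractorMOG2(history=500, varThreshold=16)

def segment_foreground(frame, static_map_mask):
    # frame: 8-bit image from the sensor (e.g. a depth image rendered to 8-bit);
    # static_map_mask: hypothetical binary mask of structure known to be static
    # according to the geometric map
    fg = subtractor.apply(frame)   # 255 = foreground, 127 = shadow, 0 = background
    fg = cv2.medianBlur(fg, 5)     # remove speckle noise
    fg[static_map_mask &amp;gt; 0] = 0    # suppress known static structure
    return fg
&lt;/pre&gt;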
&lt;br /&gt;
Preferably, the solutions are designed as ROS packages (or as C++, Python or MATLAB code).&lt;br /&gt;
&lt;br /&gt;
Resources: facilities for data logging, cameras, depth sensors, data logging equipment, a data set from a warehouse, and collaboration with industrial partners.&lt;br /&gt;
&lt;br /&gt;
RQ: How to automatically construct and represent a volume of interest for safe traversal by the driverless truck? Which sensors to use, and how to integrate information from different sources, e.g. other sensors and maps? How to distinguish between the foreground (static and dynamic obstacles) and the background model (static objects)?&lt;br /&gt;
&lt;br /&gt;
WP1: Literature review and construction of a dataset.&lt;br /&gt;
WP2: Develop methods for estimation of background /foreground model and an adaptive warning field system.&lt;br /&gt;
WP3: Evaluation of the feasibility of the system and comparison with existing systems.&lt;br /&gt;
WP4: [bonus] conference publication (ETFA, ECMR, TAROS)&lt;br /&gt;
&lt;br /&gt;
Deliverable: an implementation and demonstration of the developed adaptive warning field system using data acquired in a real warehouse or mine.&lt;/div&gt;</summary>
		<author><name>BjornAstrand</name></author>
	</entry>
	<entry>
		<id>https://mw.hh.se/caisr/index.php?title=Agent_and_object_detection_and_classification_in_a_warehouse_setting&amp;diff=3503</id>
		<title>Agent and object detection and classification in a warehouse setting</title>
		<link rel="alternate" type="text/html" href="https://mw.hh.se/caisr/index.php?title=Agent_and_object_detection_and_classification_in_a_warehouse_setting&amp;diff=3503"/>
		<updated>2017-09-22T14:29:31Z</updated>

		<summary type="html">&lt;p&gt;BjornAstrand: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{StudentProjectTemplate&lt;br /&gt;
|Summary=Detection and classification of different agents (manually driven forklift trucks, other robots, humans) and objects (such as pallets) in a warehouse environment&lt;br /&gt;
|Keywords=Robot perception, feature extraction, machine learning, classification &lt;br /&gt;
|TimeFrame=October 2017 to June 2018, with possible extension to September 2018&lt;br /&gt;
|References=SAS2-project, http://islab.hh.se/mediawiki/SAS2&lt;br /&gt;
&lt;br /&gt;
ROS - Robot Operating System, http://www.ros.org/&lt;br /&gt;
 &lt;br /&gt;
OpenCv - http://opencv.org/&lt;br /&gt;
&lt;br /&gt;
Lalonde, Jean-Francois; Vandapel, Nicolas; Huber, Daniel; Hebert, Martial; Natural terrain classification using three-dimensional ladar data for ground robot mobility, Journal of Field Robotics, Vol. 23, No. 10, pp. 839 - 861, November, 2006&lt;br /&gt;
&lt;br /&gt;
Mosberger, Rafael; Vision-based human detection from mobile machinery in industrial environments, Thesis, Örebro University, Sweden, 2016&lt;br /&gt;
&lt;br /&gt;
Saarinen, Jari P.; Andreasson, Henrik; Stoyanov, Todor; Lilienthal, Achim J.; 3D normal distributions transform occupancy maps: An efficient representation for mapping in dynamic environments, The International Journal of Robotics Research, Vol 32, Issue 14, pp. 1627 – 1644, 2013&lt;br /&gt;
&lt;br /&gt;
|Prerequisites=Programming skills (preferably C++ or Python)&lt;br /&gt;
|Supervisor=Björn Åstrand, Naveed Muhammad&lt;br /&gt;
|Level=Master&lt;br /&gt;
|Status=Open&lt;br /&gt;
}}&lt;br /&gt;
A central issue for robots and automated guided vehicles (AGVs) is safety; the robot or AGV must not harm humans or damage objects in the environment. Safety concerns have become increasingly important as the use of AGVs has spread, and advances in sensor technology, sensor integration, and object detection and avoidance have been more widely adopted. Today’s safety systems don’t consider the identity of different agents in close proximity to the robot or AGV.&lt;br /&gt;
&lt;br /&gt;
The goal of this project [a subset of the SAS2 project] is to develop a method for detection and identification of different agents and objects (such as other AGVs, manually driven forklift trucks, humans, pallets, etc.) present in a warehouse environment. The idea is to investigate whether a robot, using perception data it acquires through its exteroceptive sensors such as cameras and lidars, can detect and identify different categories of agents and objects present in its environment. This includes segmentation of perception data, extraction of features, and implementation of classification techniques. Preferably, the solutions are designed as ROS packages (or as C++, Python or MATLAB code).&lt;br /&gt;
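A minimal sketch of such a pipeline (segment, extract features, classify), assuming clusters have already been segmented from the point cloud; the simple geometric features and the random-forest classifier are illustrative assumptions, not the required method:&lt;br /&gt;
&lt;pre&gt;
# Minimal sketch (assumptions noted): classify pre-segmented point-cloud
# clusters into categories such as human, forklift truck, AGV or pallet.
import numpy as np
from sklearn.ensemble import RandomForestClassifier

def cluster_features(points):
    # points: (N, 3) array of one segmented cluster; simple geometric features:
    # bounding-box extents, maximum height and point count
    extents = points.max(axis=0) - points.min(axis=0)
    return np.hstack([extents, points[:, 2].max(), len(points)])

def train_classifier(clusters, labels):
    # clusters: list of (N, 3) arrays, labels: category per cluster
    # (both assumed to come from an annotated warehouse dataset)
    X = np.vstack([cluster_features(c) for c in clusters])
    return RandomForestClassifier(n_estimators=100).fit(X, labels)
&lt;/pre&gt;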
&lt;br /&gt;
Resources: facilities for data logging, cameras, depth sensors, data logging equipment, data sets from warehouses, and collaboration with industrial partners.&lt;br /&gt;
&lt;br /&gt;
Research Question: How to detect different agents and objects present in a warehouse environment and classify them into different categories?&lt;br /&gt;
&lt;br /&gt;
WP1: Literature review. &lt;br /&gt;
WP2: Segmentation and feature extraction from perception data. &lt;br /&gt;
WP3: Classification of different agents/objects using supervised and/or unsupervised learning methods. &lt;br /&gt;
WP4: [bonus] conference publication (ICRA, IROS, ETFA, ECMR, TAROS)&lt;br /&gt;
&lt;br /&gt;
Deliverable: an implementation and demonstration of the developed system for detection and classification of agents/objects using data from warehouse environments or mines.&lt;/div&gt;</summary>
		<author><name>BjornAstrand</name></author>
	</entry>
	<entry>
		<id>https://mw.hh.se/caisr/index.php?title=Agent_and_object_detection_and_classification_in_a_warehouse_setting&amp;diff=3502</id>
		<title>Agent and object detection and classification in a warehouse setting</title>
		<link rel="alternate" type="text/html" href="https://mw.hh.se/caisr/index.php?title=Agent_and_object_detection_and_classification_in_a_warehouse_setting&amp;diff=3502"/>
		<updated>2017-09-22T14:28:59Z</updated>

		<summary type="html">&lt;p&gt;BjornAstrand: Created page with &amp;quot;{{StudentProjectTemplate |Summary=Detection, and classification of different agents (manual driven forklift trucks, other robots, humans) and objects (such as pallets) in a wa...&amp;quot;&lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{StudentProjectTemplate&lt;br /&gt;
|Summary=Detection and classification of different agents (manually driven forklift trucks, other robots, humans) and objects (such as pallets) in a warehouse environment&lt;br /&gt;
|Keywords=Robot perception, feature extraction, machine learning, classification &lt;br /&gt;
|TimeFrame=October 2017 to June 2018, with possible extension to September 2018&lt;br /&gt;
|References=SAS2-project, http://islab.hh.se/mediawiki/SAS2&lt;br /&gt;
ROS - Robot Operating System, http://www.ros.org/ &lt;br /&gt;
OpenCv - http://opencv.org/&lt;br /&gt;
Lalonde, Jean-Francois; Vandapel, Nicolas; Huber, Daniel; Hebert, Martial; Natural terrain classification using three-dimensional ladar data for ground robot mobility, Journal of Field Robotics, Vol. 23, No. 10, pp. 839 - 861, November, 2006&lt;br /&gt;
Mosberger, Rafael; Vision-based human detection from mobile machinery in industrial environments, Thesis, Örebro University, Sweden, 2016&lt;br /&gt;
Saarinen, Jari P.; Andreasson, Henrik; Stoyanov, Todor; Lilienthal, Achim J.; 3D normal distributions transform occupancy maps: An efficient representation for mapping in dynamic environments, The International Journal of Robotics Research, Vol 32, Issue 14, pp. 1627 – 1644, 2013&lt;br /&gt;
&lt;br /&gt;
|Prerequisites=Programming skills (preferably C++ or Python)&lt;br /&gt;
|Supervisor=Björn Åstrand, Naveed Muhammad&lt;br /&gt;
|Level=Master&lt;br /&gt;
|Status=Open&lt;br /&gt;
}}&lt;br /&gt;
A central issue for robots and automated guided vehicles (AGVs) is safety; the robot or AGV must not harm humans or damage objects in the environment. Safety concerns have become increasingly important as the use of AGVs has spread, and advances in sensor technology, sensor integration, and object detection and avoidance have been more widely adopted. Today’s safety systems don’t consider the identity of different agents in close proximity to the robot or AGV.&lt;br /&gt;
&lt;br /&gt;
The goal of this project [a subset of the SAS2 project] is to develop a method for detection and identification of different agents and objects (such as other AGVs, manually driven forklift trucks, humans, pallets, etc.) present in a warehouse environment. The idea is to investigate whether a robot, using perception data it acquires through its exteroceptive sensors such as cameras and lidars, can detect and identify different categories of agents and objects present in its environment. This includes segmentation of perception data, extraction of features, and implementation of classification techniques. Preferably, the solutions are designed as ROS packages (or as C++, Python or MATLAB code).&lt;br /&gt;
&lt;br /&gt;
Resources: facilities for data logging, cameras, depth sensors, data logging equipment, data sets from warehouses, and collaboration with industrial partners.&lt;br /&gt;
&lt;br /&gt;
Research Question: How to detect different agents and objects present in a warehouse environment and classify them into different categories?&lt;br /&gt;
&lt;br /&gt;
WP1: Literature review. &lt;br /&gt;
WP2: Segmentation and feature extraction from perception data. &lt;br /&gt;
WP3: Classification of different agents/objects using supervised and/or unsupervised learning methods. &lt;br /&gt;
WP4: [bonus] conference publication (ICRA, IROS, ETFA, ECMR, TAROS)&lt;br /&gt;
&lt;br /&gt;
Deliverable: an implementation and demonstration of the developed system for detection and classification of agents/objects using data from warehouse environments or mines.&lt;/div&gt;</summary>
		<author><name>BjornAstrand</name></author>
	</entry>
	<entry>
		<id>https://mw.hh.se/caisr/index.php?title=Adaptive_warning_field_system&amp;diff=3501</id>
		<title>Adaptive warning field system</title>
		<link rel="alternate" type="text/html" href="https://mw.hh.se/caisr/index.php?title=Adaptive_warning_field_system&amp;diff=3501"/>
		<updated>2017-09-22T14:25:17Z</updated>

		<summary type="html">&lt;p&gt;BjornAstrand: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{StudentProjectTemplate&lt;br /&gt;
|Summary=Adaptive warning field system&lt;br /&gt;
|Programme=Mobile and Autonomous Systems&lt;br /&gt;
|Keywords=3D perception, mapping, &lt;br /&gt;
|TimeFrame=January 2017 until June 2017, with possible extension until September 2017&lt;br /&gt;
|References=SAS2-project, http://islab.hh.se/mediawiki/SAS2&lt;br /&gt;
ROS - Robot Operating System, http://www.ros.org/&lt;br /&gt;
OpenCv - http://opencv.org/ &lt;br /&gt;
&lt;br /&gt;
Nemati, Hassan, Åstrand, Björn (2014). Tracking of People in Paper Mill Warehouse Using Laser Range Sensor. 2014 UKSim-AMSS 8th European Modelling Symposium, EMS 2014, Pisa, Italy, 21-23 October, 2014.&lt;br /&gt;
&lt;br /&gt;
Power, P. Wayne, and Johann A. Schoonees. &amp;quot;Understanding background mixture models for foreground segmentation.&amp;quot; Proceedings image and vision computing New Zealand. Vol. 2002. 2002.&lt;br /&gt;
|Prerequisites=Image analysis, programming skills (preferably C++ or Python)&lt;br /&gt;
|Supervisor=Björn Åstrand, &lt;br /&gt;
|Level=Master&lt;br /&gt;
|Status=Closed&lt;br /&gt;
}}&lt;br /&gt;
A central issue for robots and automated guided vehicles (AGVs) is safety; the robot or AGV must not harm humans or damage objects in the environment. Safety concerns have become increasingly important as the use of AGVs has spread, and advances in sensor technology, sensor integration, and object detection and avoidance have been more widely adopted. Today’s safety systems only work in 2D and consist of a protection field of static size and a speed-adaptive warning field – the higher the speed, the larger the field. However, the size of the warning field is usually hard-coded and heavily tied to the AGV route. Setting up such a system is costly, and adjusting it for proper and efficient operation takes a long time.&lt;br /&gt;
&lt;br /&gt;
The goal of this project [a subset of the SAS2 project] is to develop a safety system based on an adaptive warning field that autonomously learns a foreground model (static and dynamic obstacles) and a background model (static objects). The approach is to use 3D perception and to combine a method that continuously segments foreground from background (e.g. an optical flow approach or Gaussian mixture models) with a method that uses a geometric map to filter out the foreground. A challenge is how to update/learn the geometric map. &lt;br /&gt;
&lt;br /&gt;
Preferably, the solutions are designed as ROS packages (or as C++, Python or MATLAB code).&lt;br /&gt;
&lt;br /&gt;
Resources: facilities for data logging, cameras, depth sensors, data logging equipment, a data set from a warehouse, and collaboration with industrial partners.&lt;br /&gt;
&lt;br /&gt;
RQ: How to automatically construct and represent a volume of interest for safe traversal by the driverless truck? Which sensors to use, and how to integrate information from different sources, e.g. other sensors and maps? How to distinguish between the foreground (static and dynamic obstacles) and the background model (static objects)?&lt;br /&gt;
&lt;br /&gt;
WP1: Literature review and construction of a dataset.&lt;br /&gt;
WP2: Develop methods for estimation of background /foreground model and an adaptive warning field system.&lt;br /&gt;
WP3: Evaluation of the feasibility of the system and comparison with existing systems.&lt;br /&gt;
WP4: [bonus] conference publication (ETFA, ECMR, TAROS)&lt;br /&gt;
&lt;br /&gt;
Deliverable: an implementation and demonstration of the developed adaptive warning field system using data acquired in a real warehouse or mine.&lt;/div&gt;</summary>
		<author><name>BjornAstrand</name></author>
	</entry>
	<entry>
		<id>https://mw.hh.se/caisr/index.php?title=Data_Mining_In_a_Warehouse_Inventory&amp;diff=3500</id>
		<title>Data Mining In a Warehouse Inventory</title>
		<link rel="alternate" type="text/html" href="https://mw.hh.se/caisr/index.php?title=Data_Mining_In_a_Warehouse_Inventory&amp;diff=3500"/>
		<updated>2017-09-22T14:24:11Z</updated>

		<summary type="html">&lt;p&gt;BjornAstrand: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{StudentProjectTemplate&lt;br /&gt;
|Summary=A study of feature selection and distance measures for clustering a large number of categories (&amp;gt;1000) and for novelty detection in a warehouse environment.&lt;br /&gt;
|Programme=Mobile and Autonomous Systems&lt;br /&gt;
|Keywords=object recognition, signal processing, feature selection, unsupervised clustering, large scale many class classification, data mining.&lt;br /&gt;
|TimeFrame= October 2017 to June 2018, with possible extension to September 2018&lt;br /&gt;
|References=Zeynep Akata, Florent Perronnin, Zaid Harchaoui, Cordelia Schmid. Good Practice in Large-Scale Learning for Image Classification. IEEE Transactions on Pattern Analysis and Machine Intelligence, Institute of Electrical and Electronics Engineers, 2014, 36 (3), pp. 507-520. &amp;lt;10.1109/TPAMI.2013.146&amp;gt;. &amp;lt;hal-00835810&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Florent Perronnin, Zeynep Akata, Zaid Harchaoui, Cordelia Schmid. Towards Good Practice in Large-Scale Learning for Image Classification. CVPR 2012 - IEEE Computer Vision and Pattern Recognition, Jun 2012, Providence (RI), United States. IEEE, pp. 3482-3489, 2012. &amp;lt;10.1109/CVPR.2012.6248090&amp;gt;. &amp;lt;hal-00690014&amp;gt;&lt;br /&gt;
&lt;br /&gt;
Raphael Puget, Nicolas Baskiotis, Patrick Gallinari. Sequential Dynamic Classification for Large Scale Multi-class Problems. Extreme Classification Workshop at ICML, Jul 2015, Lille, France. 2015. &amp;lt;hal-01207428&amp;gt;&lt;br /&gt;
|Prerequisites=Programming skills, Machine Learning, Computer Vision, Data Mining.&lt;br /&gt;
|Supervisor= Björn Åstrand, &lt;br /&gt;
|Level=Master&lt;br /&gt;
|Status=Open&lt;br /&gt;
}}&lt;br /&gt;
;Background&lt;br /&gt;
: Object recognition in problems entailing many classes is a challenging task. One example of such a problem is the inventory list of a warehouse: the inventory of a typical warehouse often contains up to 10K different classes of objects. In this project we intend to develop an inventory list maintenance method that is able to learn the number of object classes and to train a classifier from the data. Towards this objective, we employ background knowledge (e.g. from the Warehouse Management System - WMS) to constrain the complexity of the problem.&lt;br /&gt;
&lt;br /&gt;
;Objectives&lt;br /&gt;
: To develop an incremental clustering algorithm that learns new object classes through novelty detection. Background knowledge (e.g. from the WMS), which is an important source of information for constraining the problem, should be exploited towards a more robust system design.&lt;br /&gt;
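A minimal sketch of incremental clustering with distance-based novelty detection; the feature descriptor and the novelty threshold are illustrative assumptions, and in practice WMS knowledge would further constrain which classes are plausible:&lt;br /&gt;
&lt;pre&gt;
# Minimal sketch (assumptions noted): keep one running-mean centroid per known
# class and flag observations far from every centroid as novel (new class).
import numpy as np

class IncrementalInventoryClusterer:
    def __init__(self, novelty_threshold=2.0):
        self.centroids = []           # one centroid per learned class
        self.counts = []
        self.tau = novelty_threshold  # assumed distance threshold for novelty

    def update(self, feature_vec):
        # feature_vec: descriptor of one observed item (assumed to be given)
        feature_vec = np.asarray(feature_vec, float)
        if not self.centroids:
            self.centroids.append(feature_vec.copy())
            self.counts.append(1)
            return 0, True
        dists = [np.linalg.norm(feature_vec - c) for c in self.centroids]
        k = int(np.argmin(dists))
        if dists[k] &amp;gt; self.tau:      # novelty detected: start a new class
            self.centroids.append(feature_vec.copy())
            self.counts.append(1)
            return len(self.centroids) - 1, True
        self.counts[k] += 1           # otherwise update the matched class
        self.centroids[k] += (feature_vec - self.centroids[k]) / self.counts[k]
        return k, False
&lt;/pre&gt;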
&lt;br /&gt;
;Research Questions&lt;br /&gt;
: What is the optimal feature space and clustering technique for object identification with a large number of classes? How to use background knowledge as clustering cues? How to employ novelty detection for learning new classes incrementally?&lt;br /&gt;
&lt;br /&gt;
;Setup&lt;br /&gt;
: datasets from real-world warehouses.&lt;/div&gt;</summary>
		<author><name>BjornAstrand</name></author>
	</entry>
	<entry>
		<id>https://mw.hh.se/caisr/index.php?title=Dynamic_Objects_Detection_and_Tracking&amp;diff=3499</id>
		<title>Dynamic Objects Detection and Tracking</title>
		<link rel="alternate" type="text/html" href="https://mw.hh.se/caisr/index.php?title=Dynamic_Objects_Detection_and_Tracking&amp;diff=3499"/>
		<updated>2017-09-22T14:22:48Z</updated>

		<summary type="html">&lt;p&gt;BjornAstrand: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{StudentProjectTemplate&lt;br /&gt;
|Summary=Dynamic Objects Detection and Tracking in Warehouses, Using 3D Sensors.&lt;br /&gt;
|Programme=Mobile and Autonomous Systems&lt;br /&gt;
|Keywords=3D Sensor, Point Cloud, Obstacle Detection, Obstacle Tracking, Obstacle Avoidance.&lt;br /&gt;
|TimeFrame=October 2017 to June 2018, with possible extension to September 2018&lt;br /&gt;
|References=Petrovskaya, Anna, and Sebastian Thrun. &amp;quot;Model based vehicle detection and tracking for autonomous urban driving.&amp;quot; Autonomous Robots 26.2-3 (2009): 123-139.&lt;br /&gt;
&lt;br /&gt;
Wojke, N.; Haselich, M., &amp;quot;Moving vehicle detection and tracking in unstructured environments,&amp;quot; Robotics and Automation (ICRA), 2012 IEEE International Conference on , vol., no., pp.3082,3087, 14-18 May 2012.&lt;br /&gt;
&lt;br /&gt;
Moras, J.; Cherfaoui, V.; Bonnifait, P., &amp;quot;A lidar perception scheme for intelligent vehicle navigation,&amp;quot; Control Automation Robotics &amp;amp; Vision (ICARCV), 2010 11th International Conference on , vol., no., pp.1809,1814, 7-10 Dec. 2010&lt;br /&gt;
&lt;br /&gt;
Golovinskiy, Aleksey, Vladimir G. Kim, and Thomas Funkhouser. &amp;quot;Shape-based recognition of 3D point clouds in urban environments.&amp;quot; Computer Vision, 2009 IEEE 12th International Conference on. IEEE, 2009.&lt;br /&gt;
&lt;br /&gt;
Granstrom, K.; Lundquist, C.; Gustafsson, F.; Orguner, U., &amp;quot;Random Set Methods: Estimation of Multiple Extended Objects,&amp;quot; Robotics &amp;amp; Automation Magazine, IEEE , vol.21, no.2, pp.73,82, June 2014&lt;br /&gt;
&lt;br /&gt;
Data Association and Tracking: a survey. RoboEarth.&lt;br /&gt;
&lt;br /&gt;
Rusu, Radu Bogdan, and Steve Cousins. &amp;quot;3d is here: Point cloud library (pcl).&amp;quot; Robotics and Automation (ICRA), 2011 IEEE International Conference on. IEEE, 2011.&lt;br /&gt;
&lt;br /&gt;
Brostow, Gabriel J., et al. &amp;quot;Segmentation and recognition using structure from motion point clouds.&amp;quot; Computer Vision–ECCV 2008. Springer Berlin Heidelberg, 2008. 44-57.&lt;br /&gt;
&lt;br /&gt;
Drost, Bertram, et al. &amp;quot;Model globally, match locally: Efficient and robust 3D object recognition.&amp;quot; Computer Vision and Pattern Recognition (CVPR), 2010 IEEE Conference on. IEEE, 2010.&lt;br /&gt;
&lt;br /&gt;
Biasotti, S. ; Falcidieno, B. ; Giorgi, D. ; Spagnuolo, M. “Mathematical Tools for Shape Analysis and Description”, 2014, Publisher :Morgan &amp;amp; Claypool, Edition:1, ISBN:1627053646&lt;br /&gt;
&lt;br /&gt;
Börcs, Attila, et al. &amp;quot;A Model-based Approach for Fast Vehicle Detection in Continuously Streamed Urban LIDAR Point Clouds.&amp;quot; (2014).&lt;br /&gt;
|Prerequisites=Familiarity with filtering techniques (e.g. EKF) for mobile robot localization, image analysis, programming skills (preferably C++ or Python).&lt;br /&gt;
|Supervisor=Björn Åstrand, Naveed Muhammad,&lt;br /&gt;
|Level=Master&lt;br /&gt;
|Status=Open&lt;br /&gt;
}}&lt;br /&gt;
; Concise Description&lt;br /&gt;
: This project [a subset of the AIMS project] targets the automation of lift trucks in warehouse environments. Operating automated guided vehicles in this particular environment is challenging due to the high expected throughput and the consequently high traffic. Lift trucks are heavy vehicles operating at relatively high speed in an environment where neither the trucks nor the humans are as well protected as in regular urban traffic. This calls for extra safety measures and cautious decisions. Collision avoidance is an essential skill for mobile robots to guarantee safe operation in a workspace shared with humans. This project focuses on detection and tracking of dynamic objects in order to avoid collisions.&lt;br /&gt;
&lt;br /&gt;
; Objective&lt;br /&gt;
: To reliably detect, segment, and track dynamic objects (e.g. humans and lift trucks) from a 3D point cloud, acquired by means of a 3D sensor mounted on a mobile robot, in a highly structured environment (warehouse).&lt;br /&gt;
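A minimal sketch of the tracking part, assuming object centroids have already been detected and associated in each scan; the constant-velocity model and noise values are illustrative assumptions, not the prescribed method:&lt;br /&gt;
&lt;pre&gt;
# Minimal sketch (assumptions noted): constant-velocity Kalman filter tracking
# one detected object centroid (x, y) from scan to scan.
import numpy as np

class CentroidTrack:
    def __init__(self, xy, dt=0.1):
        self.x = np.array([xy[0], xy[1], 0.0, 0.0])  # state: x, y, vx, vy
        self.P = np.eye(4)
        self.F = np.array([[1, 0, dt, 0], [0, 1, 0, dt],
                           [0, 0, 1, 0], [0, 0, 0, 1]], float)
        self.H = np.array([[1, 0, 0, 0], [0, 1, 0, 0]], float)
        self.Q = np.eye(4) * 0.01                    # assumed process noise
        self.R = np.eye(2) * 0.05                    # assumed measurement noise

    def predict(self):
        self.x = self.F @ self.x
        self.P = self.F @ self.P @ self.F.T + self.Q
        return self.x[:2]                            # predicted centroid

    def update(self, z):
        # z: measured centroid (x, y) of the associated segment
        y = np.asarray(z, float) - self.H @ self.x
        S = self.H @ self.P @ self.H.T + self.R
        K = self.P @ self.H.T @ np.linalg.inv(S)
        self.x = self.x + K @ y
        self.P = (np.eye(4) - K @ self.H) @ self.P
&lt;/pre&gt;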
&lt;br /&gt;
; Research Questions&lt;br /&gt;
: What is the optimal sensor configuration to minimize blind spots and data losses due to sensor deficiencies, and consequently to improve detection accuracy?&lt;br /&gt;
: How to exploit the assumption of structured environment to improve tracking?&lt;br /&gt;
: How could background knowledge of agent types (humans, manually driven trucks and auto-guided trucks) and their behaviour models improve the tracking?&lt;br /&gt;
&lt;br /&gt;
; Preliminary Plan&lt;br /&gt;
* startup: literature review and data acquisition&lt;br /&gt;
* point cloud manipulation, object segmentation, scene understanding.&lt;br /&gt;
* filtering and tracking.&lt;br /&gt;
* [bonus] object recognition&lt;br /&gt;
&lt;br /&gt;
;Deliverable&lt;br /&gt;
: An implementation and demonstration of the developed method for detection and tracking of moving obstacles on real data acquired in a real warehouse.&lt;br /&gt;
&lt;br /&gt;
;Bonus&lt;br /&gt;
: conference publication (ETFA, ECMR, TAROS)&lt;/div&gt;</summary>
		<author><name>BjornAstrand</name></author>
	</entry>
	<entry>
		<id>https://mw.hh.se/caisr/index.php?title=Model_behaviour_of_agents_in_a_warehouse_setting&amp;diff=3498</id>
		<title>Model behaviour of agents in a warehouse setting</title>
		<link rel="alternate" type="text/html" href="https://mw.hh.se/caisr/index.php?title=Model_behaviour_of_agents_in_a_warehouse_setting&amp;diff=3498"/>
		<updated>2017-09-22T14:21:06Z</updated>

		<summary type="html">&lt;p&gt;BjornAstrand: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{StudentProjectTemplate&lt;br /&gt;
|Summary=Modelling the behaviour of agents (manually driven forklift trucks, other robots, humans etc.) in a warehouse environment&lt;br /&gt;
|Programme=Mobile and Autonomous Systems&lt;br /&gt;
|Keywords=Machine learning, robot perception, modelling, simulation &lt;br /&gt;
|TimeFrame=October 2017 to June 2018, with possible extension to September 2018&lt;br /&gt;
|References=SAS2-project, http://islab.hh.se/mediawiki/SAS2&lt;br /&gt;
&lt;br /&gt;
ROS - Robot Operating System, http://www.ros.org/&lt;br /&gt;
&lt;br /&gt;
OpenCv - http://opencv.org/&lt;br /&gt;
&lt;br /&gt;
Lidström, Kristoffer; Situation-Aware vehicles – supporting the next generation of cooperative traffic system, PhD thesis, Örebro university, 2012.&lt;br /&gt;
&lt;br /&gt;
Lundström, Jens; Järpe, Eric; Verikas, Antanas; Detecting and exploring deviating behaviour of smart home residents, Expert Systems with Applications, 55, pp. 429-440, 2016&lt;br /&gt;
&lt;br /&gt;
Lidström, Kristoffer; Larsson, Tony; Act normal: using uncertainty about driver intentions as a warning criterion, 16th World Congress on Intelligent Transportation Systems (ITS WC), 21-25 September, 2009, Stockholm, Sweden&lt;br /&gt;
&lt;br /&gt;
Lidström, Kristoffer; Model-based Estimation of Driver Intentions Using Particle Filtering, Proceedings of the 11th International IEEE Conference on Intelligent Transportation Systems Beijing, China, October 12-15, 2008&lt;br /&gt;
&lt;br /&gt;
|Prerequisites=Programming skills (preferably C++ or Python)&lt;br /&gt;
|Supervisor=Björn Åstrand, Naveed Muhammad &lt;br /&gt;
|Level=Master&lt;br /&gt;
|Status=Open&lt;br /&gt;
}}&lt;br /&gt;
A central issue for robots and automated guided vehicles (AGVs) is safety; the robot or AGV must not harm humans or damage objects in the environment. Safety concerns have become increasingly important as the use of AGVs has spread, and advances in sensor technology, sensor integration, and object detection and avoidance have been more widely adopted. Today’s safety systems don’t consider the behaviour or the identity of different agents in close proximity to the robot or AGV.&lt;br /&gt;
&lt;br /&gt;
The goal of this project [a subset of the SAS2 project] is to develop a method to model the behaviour of different agents (manually driven forklift trucks, AGVs, humans) in a warehouse setting and to use it for predicting behaviour in different scenarios. The idea is to investigate whether different agents can be automatically divided into categories depending on their behaviour, and how that information can be used to foresee the actions of different agents.&lt;br /&gt;
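A minimal sketch of one way such categories could be found, assuming agent trajectories (position sequences) have already been extracted by a tracker; the motion features and the k-means clustering are illustrative assumptions, not the prescribed method:&lt;br /&gt;
&lt;pre&gt;
# Minimal sketch (assumptions noted): summarise each tracked trajectory with a
# few motion features and cluster agents into behaviour categories.
import numpy as np
from sklearn.cluster import KMeans

def trajectory_features(traj, dt=0.1):
    # traj: (N, 2) array of positions sampled every dt seconds (assumed input)
    vel = np.diff(traj, axis=0) / dt
    speed = np.linalg.norm(vel, axis=1)
    heading = np.arctan2(vel[:, 1], vel[:, 0])
    turn_rate = np.abs(np.diff(heading)) / dt   # angle wrap-around ignored for brevity
    return [speed.mean(), speed.std(), speed.max(), turn_rate.mean()]

def cluster_agents(trajectories, n_categories=3):
    # trajectories: list of (N, 2) arrays, one per tracked agent
    X = np.array([trajectory_features(t) for t in trajectories])
    return KMeans(n_clusters=n_categories, n_init=10).fit_predict(X)
&lt;/pre&gt;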
&lt;br /&gt;
The study also includes the construction of a simulator in which the validity of the developed behaviour modelling method is evaluated. Real data can also be used to verify the system. Preferably, the solutions are designed as ROS packages (or as C++, Python or MATLAB code).&lt;br /&gt;
&lt;br /&gt;
Resources: facilities for data logging, cameras, depth sensors, data logging equipment, data sets from warehouses, and collaboration with industrial partners.&lt;br /&gt;
&lt;br /&gt;
Research Question: How to learn the behaviour of different categories of agents (e.g. manually driven trucks, humans, AGVs), especially if they are only partially observed in time? How to represent the behaviour of an agent?&lt;br /&gt;
&lt;br /&gt;
WP1: Literature review and construction of a dataset. &lt;br /&gt;
WP2: Develop methods for modelling behaviour of agents in a warehouse setting. &lt;br /&gt;
WP3: Comparison study and development of improvements of the different systems. &lt;br /&gt;
WP4: [bonus] conference publication (ICRA, IROS, ETFA, ECMR, TAROS)&lt;br /&gt;
&lt;br /&gt;
Deliverable: an implementation and demonstration of the developed system for modelling the behaviour of agents, using simulated data and data acquired in real warehouse environments or mines.&lt;/div&gt;</summary>
		<author><name>BjornAstrand</name></author>
	</entry>
	<entry>
		<id>https://mw.hh.se/caisr/index.php?title=Model_behaviour_of_agents_in_a_warehouse_setting&amp;diff=3497</id>
		<title>Model behaviour of agents in a warehouse setting</title>
		<link rel="alternate" type="text/html" href="https://mw.hh.se/caisr/index.php?title=Model_behaviour_of_agents_in_a_warehouse_setting&amp;diff=3497"/>
		<updated>2017-09-22T14:19:45Z</updated>

		<summary type="html">&lt;p&gt;BjornAstrand: &lt;/p&gt;
&lt;hr /&gt;
&lt;div&gt;{{StudentProjectTemplate&lt;br /&gt;
|Summary=Modelling the behaviour of agents (manually driven forklift trucks, other robots, humans etc.) in a warehouse environment&lt;br /&gt;
|Programme=Mobile and Autonomous Systems&lt;br /&gt;
|Keywords=Machine learning, robot perception, modelling, simulation &lt;br /&gt;
|TimeFrame=October 2017 to June 2018, with possible extension to September 2018&lt;br /&gt;
|References=SAS2-project, http://islab.hh.se/mediawiki/SAS2&lt;br /&gt;
ROS - Robot Operating System, http://www.ros.org/ &lt;br /&gt;
OpenCv - http://opencv.org/&lt;br /&gt;
Lidström, Kristoffer; Situation-Aware vehicles – supporting the next generation of cooperative traffic system, PhD thesis, Örebro university, 2012.&lt;br /&gt;
Lundström, Jens; Järpe, Eric; Verikas, Antanas; Detecting and exploring deviating behaviour of smart home residents, Expert Systems with Applications, 55, pp. 429-440, 2016&lt;br /&gt;
Lidström, Kristoffer; Larsson, Tony; Act normal: using uncertainty about driver intentions as a warning criterion, 16th World Congress on Intelligent Transportation Systems (ITS WC), 21-25 September, 2009, Stockholm, Sweden&lt;br /&gt;
Lidström, Kristoffer; Model-based Estimation of Driver Intentions Using Particle Filtering, Proceedings of the 11th International IEEE Conference on Intelligent Transportation Systems Beijing, China, October 12-15, 2008&lt;br /&gt;
&lt;br /&gt;
|Prerequisites=Programming skills (preferably C++ or Python)&lt;br /&gt;
|Supervisor=Björn Åstrand, Naveed Muhammad &lt;br /&gt;
|Level=Master&lt;br /&gt;
|Status=Open&lt;br /&gt;
}}&lt;br /&gt;
A central issue for robots and automated guided vehicles (AGVs) is safety; the robot or AGV must not harm humans or damage objects in the environment. Safety concerns have become increasingly important as the use of AGVs has spread, and advances in sensor technology, sensor integration, and object detection and avoidance have been more widely adopted. Today’s safety systems don’t consider the behaviour or the identity of different agents in close proximity to the robot or AGV.&lt;br /&gt;
&lt;br /&gt;
The goal of this project [a subset of the SAS2 project] is to develop a method to model the behaviour of different agents (manually driven forklift trucks, AGVs, humans) in a warehouse setting and to use it for predicting behaviour in different scenarios. The idea is to investigate whether different agents can be automatically divided into categories depending on their behaviour, and how that information can be used to foresee the actions of different agents.&lt;br /&gt;
&lt;br /&gt;
The study also includes the construction of a simulator in which the validity of the developed behaviour modelling method is evaluated. Real data can also be used to verify the system. Preferably, the solutions are designed as ROS packages (or as C++, Python or MATLAB code).&lt;br /&gt;
&lt;br /&gt;
Resources: facilities for data logging, cameras, depth sensors, data logging equipment, data sets from warehouses, and collaboration with industrial partners.&lt;br /&gt;
&lt;br /&gt;
Research Question: How to learn the behaviour of different categories of agents (e.g. manually driven trucks, humans, AGVs), especially if they are only partially observed in time? How to represent the behaviour of an agent?&lt;br /&gt;
&lt;br /&gt;
WP1: Literature review and construction of a dataset. &lt;br /&gt;
WP2: Develop methods for modelling behaviour of agents in a warehouse setting. &lt;br /&gt;
WP3: Comparison study and development of improvements of the different systems. &lt;br /&gt;
WP4: [bonus] conference publication (ICRA, IROS, ETFA, ECMR, TAROS)&lt;br /&gt;
&lt;br /&gt;
Deliverable: an implementation and demonstration of the developed system for modelling the behaviour of agents, using simulated data and data acquired in real warehouse environments or mines.&lt;/div&gt;</summary>
		<author><name>BjornAstrand</name></author>
	</entry>
</feed>