Midterm Cheat Sheet
School: Louisiana State University
Course: 4200 (Information Systems)
Date: Apr 3, 2024
Lec Week 1 - Introduction to Autonomous Vehicles (Types, Key concepts & components of AVs; safety frameworks, regulations, ethics)
1.
[5pts] What does 'Connected AV' mean?
Relies on inputs from other vehicles, devices, or smart infrastructure (AV truck convoy)
2.
[4pts] An autonomous mobile robot used to make room deliveries in a hotel is called what type of robot?
Service Robots – operating generally indoors
3.
[4pts] What kind of autonomous mobile robot is this? (name and acronym)
Unmanned Surface Vehicle (USV) – water mobile robot on surface
4.
[4pts] Adaptive speed control falls at what level in the SAE driving autonomy scale?
SAE Level 1 – provide steering OR brake/acceleration support
5.
[4pts] A military drone takes off and flies independently to a target waypoint but requires a human to remotely target and release weapons. What level of autonomy is this drone?
Level 4 (High Autonomy) – pilot can monitor flight remotely
6.
[4pts] The process of a robot determining/tracking its position is known as localization
7.
[4pts] Which standard details how to demonstrate an autonomous vehicle can operate without human supervision?
Evaluation of Autonomous Products (ANSI/UL4600) – proves safety of autonomous road vehicles operating w/out human
8.
[4pts] Which standard details how to design and evaluate AVs to be safe against external hacking and tricking attacks?
Cybersecurity engineering (ISO/SAE 21434) – protect safety functions from compromise, attacks, unauthorized access, damage
9.
[6pts] Briefly describe what ASIL is and what it describes
Automotive Safety Integrity Levels – a way of describing the hazards associated with vehicle functions and assigning a level based on their severity
10.
[4pts] National standards/regulations govern the use of autonomous vehicles on public roadways (TRUE | FALSE).
False – NHTSA has issued guidelines but no formal regulations for AV
11.
[7pts] I work at a large manufacturing facility with multiple buildings at different sites (fairly close together) on either side of a public road. I am planning to use AV drones to fly maintenance spare parts (under 10lbs) between buildings; the drone itself is 20 lbs. It is planned for the drones to be fully autonomous and line of sight monitoring is not possible. What needs to be done to be in compliance with federal regulations?
12.
[10pts] An AV is coming up on a bicyclist on a one lane each direction road (as shown in figure); the bike is sharing the lane. What actions are available to the AV? Discuss in context of the ethical frameworks we discussed.
The vehicle should follow behind the cyclist at a safe distance until there is no oncoming traffic from the other lane. When it is safe, the vehicle
can then cross the centerline to avoid hitting the cyclist and pass him.
Lec Week 2-3 – Mobile Kinematics (3 DOF and 6 DOF space mobile robots; odometry and dead reckoning)
13.
[6pts] How many controlled degrees of freedom does a differential robot have? And what are they?
2 DOF – translation (forward/backward) and rotation (heading); it cannot translate sideways directly
14.
[4pts] Is a differential robot non-holonomic, holonomic, or redundant?
Non-holonomic – the number of controllable DOF of the robot is smaller than the number of DOF of the robot
15.
[20pts] A differential robot starts a move at base frame pose [X=140", Y=300", θ=-25°]. The robot has 1.5" radius wheels and a 5" track width between wheels. The robot moves at ω_L = 6 radians/sec, ω_R = 8 radians/sec for 20 seconds. Assuming instantaneous acceleration and ideal driving conditions, what is the robot's pose in the base frame at the end of the 20 seconds? Show all work.
V = (r/2)(ω_R + ω_L) = (1.5/2)(8 + 6) = 10.5 in/s ; ω = (r/L)(ω_R - ω_L) = (1.5/5)(8 - 6) = 0.6 rad/s
θ(t) = θ_0 + ∫ω dt → θ(20) = -25° + (0.6 rad/s)(20 s) = -0.436 + 12 = 11.564 rad ≡ -57.4° (mod 360°)
Because ω_R ≠ ω_L the heading changes during the move, so θ(t) must stay inside the integrals:
x(t) = x_0 + ∫V(t)cos[θ(t)]dt = x_0 + (V/ω)[sin θ(t) - sin θ_0] → x(20) = 140 + 17.5[sin(11.564) - sin(-0.436)] ≈ 132.6"
y(t) = y_0 + ∫V(t)sin[θ(t)]dt = y_0 - (V/ω)[cos θ(t) - cos θ_0] → y(20) = 300 - 17.5[cos(11.564) - cos(-0.436)] ≈ 306.4"
Final pose ≈ [X = 132.6", Y = 306.4", θ ≈ -57.4°]
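A quick numerical check of the final pose, integrating the odometry equations above with simple Euler steps (a sketch; the step size is chosen arbitrarily):

```python
import math

r, L = 1.5, 5.0          # wheel radius and track width (inches)
wL, wR = 6.0, 8.0        # wheel angular speeds (rad/s)
x, y, th = 140.0, 300.0, math.radians(-25)

V = r * (wR + wL) / 2.0  # linear velocity = 10.5 in/s
w = r * (wR - wL) / L    # angular velocity = 0.6 rad/s

dt = 1e-4                # Euler integration step (s)
for _ in range(int(20.0 / dt)):
    x += V * math.cos(th) * dt
    y += V * math.sin(th) * dt
    th += w * dt

print(round(x, 1), round(y, 1), round(math.degrees(th) % 360, 1))
# prints approximately: 132.6 306.4 302.5  (302.5° is equivalent to -57.5°)
```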
16.
[10pts] Differential robot with base frame pose [X=10", Y=-45", θ=-0.2 radians]. The robot has 1" radius wheels and a 6" track width between wheels. The robot moves at ω_L = 6 radians/sec, ω_R = 8 radians/sec. Location of ICC in the base frame and radius of curvature?
v_L = rω_L = (1)(6) = 6 in/s ; v_R = rω_R = (1)(8) = 8 in/s
ω = (v_R - v_L)/L = (8 - 6)/6 = 1/3 rad/s
R = (L/2)(v_R + v_L)/(v_R - v_L) = (6/2)(8 + 6)/(8 - 6) = 21"
ICC = [x - R sin(θ), y + R cos(θ)] = [10 - 21 sin(-0.2), -45 + 21 cos(-0.2)] ≈ [14.17", -24.42"]
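The ICC computation can be verified numerically (a sketch of the same formulas):

```python
import math

r, L = 1.0, 6.0               # wheel radius and track width (inches)
wL, wR = 6.0, 8.0             # wheel angular speeds (rad/s)
x, y, th = 10.0, -45.0, -0.2  # base frame pose

vL, vR = r * wL, r * wR                 # wheel linear speeds (in/s)
w = (vR - vL) / L                       # angular velocity = 1/3 rad/s
R = (L / 2) * (vR + vL) / (vR - vL)     # radius of curvature
icc = (x - R * math.sin(th), y + R * math.cos(th))

print(round(R, 1), round(icc[0], 2), round(icc[1], 2))
# prints: 21.0 14.17 -24.42
```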
17.
[10pts] Differential robot at base frame pose [X=140", Y=300", θ=-25°]. Target (objective) is [X=200", Y=400"] (any orientation). How would you move the robot to get there? (define wheel speeds and time for each move; can be more than one move)
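One simple plan is to turn in place until the robot faces the target, then drive straight. A sketch of the computation (assuming the robot geometry from the previous problem, 1.5" wheels and 5" track; the wheel speeds are arbitrary choices, and other plans are equally valid):

```python
import math

r, L = 1.5, 5.0
x0, y0, th0 = 140.0, 300.0, math.radians(-25)
xt, yt = 200.0, 400.0

# Move 1: rotate in place (wR = -wL) until facing the target
heading = math.atan2(yt - y0, xt - x0)  # bearing to the target
dth = heading - th0                     # required heading change (CCW positive)
wR, wL = 2.0, -2.0                      # chosen spin speeds (rad/s)
t1 = dth / (r * (wR - wL) / L)          # time to rotate

# Move 2: drive straight (wR = wL) the remaining distance
dist = math.hypot(xt - x0, yt - y0)
wS = 4.0                                # chosen straight-line wheel speed (rad/s)
t2 = dist / (r * wS)                    # time to drive

print(round(math.degrees(dth), 1), round(t1, 2), round(dist, 1), round(t2, 2))
# prints: 84.0 1.22 116.6 19.44
```

So: turn about 84.0° in place (about 1.22 s at ω_R = +2, ω_L = -2 rad/s), then drive about 116.6" straight (about 19.44 s at ω_R = ω_L = 4 rad/s).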
18.
[20pts] Ackermann drive mobile robot. A function cmdVel() ( vel, steering_angle = cmdVel() ) is available to get the current linear velocity and steering angle. There are also some global variables defined curX, curY, curYaw, lastUpdate, defining the current pose estimate and time odometry was last updated. Write a Python function that will update these 4 global variables each time called.
import math
import time

L = 1.0  # assumed wheelbase (front-to-rear axle distance); not given in the problem

def cmdVel():  # Sample function cmdVel() returning linear velocity and steering angle
    return 1.0, 0.2

# Global variables defining the current pose estimate and the time odometry was last updated
curX = 0.0
curY = 0.0
curYaw = 0.0
lastUpdate = time.time()

def updateGlobalVariables():  # Update the pose estimate from the current velocity and steering angle
    global curX, curY, curYaw, lastUpdate
    vel, steering_angle = cmdVel()  # Get current linear velocity and steering angle from cmdVel()
    currentTime = time.time()
    deltaTime = currentTime - lastUpdate  # Time since the last update
    # Bicycle-model odometry update for an Ackermann drive
    curX += vel * math.cos(curYaw) * deltaTime
    curY += vel * math.sin(curYaw) * deltaTime
    curYaw += (vel / L) * math.tan(steering_angle) * deltaTime
    lastUpdate = currentTime  # Update the last update time

updateGlobalVariables()  # Call updateGlobalVariables() whenever the pose should be updated
19.
[6pts] Explain what dead reckoning is and why it is not, by itself, useful for navigation
Dead reckoning uses wheel and heading sensors to update the position. It is the process of calculating the current position of a moving object by using a previously determined position, or fix, and by using estimations of speed and heading direction over elapsed time. It is not useful by itself for navigation because there are uncertainties associated with this process such as numerical integration errors, unequal wheel diameters, variation in the contact point of the wheel/unequal floor contact, and sensor errors.
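The error growth can be illustrated with a quick simulation (a sketch; the robot geometry is borrowed from question 15, and the 1% wheel-size error is an arbitrary assumption). The robot is commanded to drive straight for 20 s, but the right wheel is actually 1% larger than the odometry model assumes, so the dead-reckoned pose steadily diverges from the true pose:

```python
import math

r, L = 1.5, 5.0        # modeled wheel radius and track width (inches)
err = 1.01             # right wheel is actually 1% larger than modeled
wL = wR = 4.0          # commanded wheel speeds: drive straight (rad/s)
dt = 0.01              # integration step (s)

est = [0.0, 0.0, 0.0]   # dead-reckoned pose [x, y, theta]
true = [0.0, 0.0, 0.0]  # actual pose
for _ in range(2000):   # 20 seconds
    for pose, rR in ((est, r), (true, r * err)):
        V = (r * wL + rR * wR) / 2    # linear velocity
        w = (rR * wR - r * wL) / L    # angular velocity (0 for the model)
        pose[0] += V * math.cos(pose[2]) * dt
        pose[1] += V * math.sin(pose[2]) * dt
        pose[2] += w * dt

drift = math.hypot(true[0] - est[0], true[1] - est[1])
print(round(drift, 1))  # roughly 14 inches of position error after only 20 s
```

Even a 1% calibration error produces position error that grows without bound, which is why dead reckoning is fused with exteroceptive sensing (GPS, landmarks) in practice.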
Lec Weeks 4,6,7 – Perception (Range Sensors [infrared, ultrasound, LIDAR, RADAR], Computer Vision)
20.
[5pts] Give an example of a proprioceptive sensor on a mobile robot:
Proprioceptive sensors measure the internal state of the robot (movements, position, internal parameters). Examples include encoders, accelerometers, gyroscopes, IMUs, tilt/inclinometers, and compasses
21.
[6pts] What impact might sensor bandwidth have on control of an AV?
The sensor bandwidth is the rate at which sensors can acquire and process data. It influences the control performance of the AV (contributes to responsiveness, accuracy, stability, and adaptability) and can impact response time, accuracy and precision, control stability, and object detection & tracking.
22.
[6pts] Explain the difference between accuracy and precision in sensor readings.
Accuracy is how close a measurement is to the target value; precision is the repeatability/consistency of measurements.
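The distinction can be illustrated with two hypothetical sensors measuring a known true value of 10.0: the bias of the mean reflects (in)accuracy, while the standard deviation reflects (im)precision.

```python
import statistics

true_value = 10.0
sensor_a = [10.9, 11.1, 11.0, 10.9, 11.1]  # tightly grouped but biased
sensor_b = [9.2, 10.7, 10.1, 9.4, 10.6]    # centered on truth but scattered

results = {}
for name, readings in (("A", sensor_a), ("B", sensor_b)):
    bias = abs(statistics.mean(readings) - true_value)  # accuracy: closeness to truth
    spread = statistics.stdev(readings)                 # precision: repeatability
    results[name] = (round(bias, 2), round(spread, 2))

print(results)  # sensor A is precise but inaccurate; B is accurate but imprecise
```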
23.
[4pts] A gyroscope is used to measure linear velocities along the X, Y, and Z axes ( TRUE | FALSE ) FALSE – it measures orientation & angular velocity
24.
[4pts] What kind of sensor unit do I need to track both linear and angular velocities? An Inertial Measurement Unit (IMU), INS, or IRU
25.
[4pts] Name the ROS2 message type you would use for a 2D laser scanner. (or for GPS position, LIDAR, odometry, etc.)
2D laser scanner: sensor_msgs/LaserScan; GPS: sensor_msgs/NavSatFix; LIDAR: sensor_msgs/LaserScan or sensor_msgs/PointCloud2; odometry: nav_msgs/Odometry
26.
[5pts] Latitude -30.4°, Longitude +94.0° is approximately where in the world? Southern & Eastern Hemisphere (negative latitude = S; positive longitude = E)
27.
[6pts] Briefly explain trilateration and triangulation and the difference between them.
Trilateration is based only on the robot's distances to beacons; triangulation is based on signal angles to beacons.
28.
[6pts] What are the shortcomings of radar devices?
Lower accuracy, higher power consumption, more prone to deception/jamming, difficulty in object classification, limited sensing range for small objects
29.
[6pts] What is a PointCloud?
A 3D coordinate system representing the external surface of an object, often generated by 3D scanning devices like LiDAR sensors or depth cameras.
30.
[6pts] How does 3D and 4D LIDAR differ?
3D captures spatial information; 4D also captures time information. 3D is used for mapping, object detection, and modeling; 4D is used for motion-tracking tasks like navigation, surveillance, and monitoring moving objects
31.
[4pts] What vision geometry issue is illustrated in this picture (1)? Color (RGB)
32.
[4pts] And this image (2)? Grayscale/intensity
33.
[6pts] How is a color image represented in memory?
A matrix representation of the image where each pixel stores values that represent the color information at that point (typically three channels, e.g. 8 bits each for R, G, and B).
34.
[6pts] What factors determine the size of a color image in bytes?
Resolution (width/height in pixels); Pixel array size (# of pixels); Color mode (binary [b/w], grayscale, color); Color depth (# of bits to represent each pixel's color info [2^bits = # of colors]); Frame rate (# of frames per second, for video)
Image size (bytes) = width × height × (color depth in bits ÷ 8)
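A quick check of the formula for raw (uncompressed) images:

```python
def image_size_bytes(width, height, bits_per_pixel):
    """Raw (uncompressed) image size: pixel count times bytes per pixel."""
    return width * height * bits_per_pixel // 8

print(image_size_bytes(1920, 1080, 24))  # 24-bit RGB: 6220800 bytes (~5.9 MB)
print(image_size_bytes(1920, 1080, 8))   # 8-bit grayscale: 2073600 bytes
```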
35.
[6pts] The field type std_msgs/Header is included in most ROS message types. What is its purpose?
Provides a way to include temporal & spatial context info in ROS messages. Ensures proper synchronization & interpretation of data.
36.
[7pts] Briefly describe how the Canny edge detector works, what its inputs are, and what it outputs.
An image processing algorithm used to detect edges in an image while minimizing noise. Works by first smoothing out noise to reduce the impact of high-frequency noise on the process. Then the gradient of the image is calculated which identifies the intensity gradients (rate of change) in both
the horizontal and vertical directions. The magnitude (represents the strength of the intensity change at a location) and orientation (indicates the direction of the change) of the gradient are computed for each pixel. Non-maximum suppression is applied to the thin edges (only the pixel with the maximum gradient magnitude in the direction of the gradient is preserved for each pixel). Two thresholds, a high (strong edge) and low (weak edge) are applied to classify pixels as strong, weak, or non-edges (helps highlight the most significant edges [double thresholding]). Weak edges are
considered potential edges, and the algorithm tracks & connects them with strong edges (edge tracking by hysteresis). The primary input is a grayscale image (color often converted to grayscale). The output is a binary image where pixels are classified as edges or non-edges.
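The double-thresholding and hysteresis steps can be sketched on a toy gradient-magnitude grid (pure Python, no OpenCV; the array and thresholds are made up for illustration):

```python
from collections import deque

# Toy gradient-magnitude image (imagine this is post-smoothing and
# post-non-maximum-suppression output)
mag = [
    [0, 0, 40, 0],
    [0, 60, 0, 0],
    [90, 0, 0, 10],
    [0, 0, 0, 0],
]
low, high = 50, 80  # weak/strong thresholds (arbitrary for this sketch)

rows, cols = len(mag), len(mag[0])
strong = {(r, c) for r in range(rows) for c in range(cols) if mag[r][c] >= high}
weak = {(r, c) for r in range(rows) for c in range(cols) if low <= mag[r][c] < high}

# Hysteresis: keep weak pixels 8-connected (directly or transitively) to a strong pixel
edges, queue = set(strong), deque(strong)
while queue:
    r, c = queue.popleft()
    for dr in (-1, 0, 1):
        for dc in (-1, 0, 1):
            n = (r + dr, c + dc)
            if n in weak and n not in edges:
                edges.add(n)
                queue.append(n)

print(sorted(edges))  # the 40 and 10 pixels are rejected; the 60 survives via the 90
```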
37.
[6pts] What is a Gaussian filter used for in computer vision?
Used for image smoothing or blurring (preprocessing to enhance the performance of algorithms). Effective in reducing noise in an image, constructing scale-space representation of images, gradient computation, corner detection, and image blending.
38.
[7pts] In Python, an image has been read in and assigned to variable "img". Write the line of Python code to apply an OpenCV 5x5 Gaussian mask filter to the image and store the result in a variable "img2"
import cv2
img2 = cv2.GaussianBlur(img, (5, 5), 0)
39.
[6pts] What does Semantic Segmentation mean?
Computer vision task that involves classifying & labeling each pixel in an image with a corresponding class or category. Partitions an image into semantically meaningful regions, where each region is assigned a label that represents the category of the object/scene.
40.
[7pts] I have determined an object contour (an array of points), stored in Python variable 'pts'. I want to place a bounding box around the object in the image. How would I do this in OpenCV? (describe, identify function(s) – no code required).
You can calculate the bounding box using the cv2.boundingRect() function, which returns the parameters (x, y, width, height) of the bounding box. Then you can use these parameters to draw the rectangle on the image using the cv2.rectangle() function.
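For illustration, cv2.boundingRect() essentially finds the min/max extents of the contour points; a pure-Python sketch on a hypothetical point list (for integer pixel coordinates, OpenCV reports width/height as max minus min plus 1):

```python
pts = [(12, 30), (45, 22), (30, 60), (18, 41)]  # hypothetical contour points (x, y)

xs = [p[0] for p in pts]
ys = [p[1] for p in pts]
x, y = min(xs), min(ys)                  # top-left corner of the bounding box
w, h = max(xs) - x + 1, max(ys) - y + 1  # width and height (inclusive of end pixels)

print(x, y, w, h)  # prints: 12 22 34 39
```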
41.
[4pts] I want to find straight lines in an image. What technique might I use for this? Hough Line Transform
42.
[6pts] How does a Haar Cascade Classifier differ from using SIFT or ORB techniques?
Haar is used for object detection & SIFT is for key point detection, feature extraction, & matching. Haar requires training on positive & negative samples for a specific object; SIFT doesn’t require training for feature extraction. Haar features are not scale-invariant; SIFT is designed to be scale & rotation invariant. Haar does not provide descriptors (focuses on object presence or absence); SIFT provides descriptors that capture local image information around key points.
43.
[6pts] What pros/cons does MobileNet+SSD have over standard DNN object detection models w.r.t. mobile robots?
Pros – computational efficiency, speed, lower memory footprint, real-time object detection; Cons – lower accuracy
Labs 1 & 2 – Basics of Linux, ROS 2, git and GitHub
44.
[6pts] The current Linux terminal windows prompt is "gk@rosbotpc:~/ws/src$". What information is the prompt imparting?
The username is "gk" (the user currently logged in), the hostname is "rosbotpc" (name of the computer on the local network), the current working directory is "~/ws/src" (the ws/src folder under the user's home directory), and "$" is the shell prompt (terminal is ready to accept commands)
45.
[6pts] Write the Linux command(s) to use in a Linux terminal to view the names of all files and folders in and below the current directory.
tree -a
46.
[6pts] In a Linux terminal, you are currently in your home directory ("~"). Give the Linux command(s) to:
Delete file "xyz.py" in directory "~/ws/src/pkg1/pkg1": rm ~/ws/src/pkg1/pkg1/xyz.py
Delete directory "tmp" in your home directory and any of the folders and files in it: rm -r ~/tmp
Copy all the files in the directory "~/ws/src/pkg1/" to the directory "~/repo/pkg1": cp -r ~/ws/src/pkg1/* ~/repo/pkg1/
In the workspace src directory "~/ws/src", make a directory "pkg2" with a "models" directory under it: mkdir -p ~/ws/src/pkg2/models
47.
[5pts] Give the Linux terminal command(s) to view all uncommitted files in a git repo: git status -u
48.
[5pts] You checkout, edit and save a file in a git repo project. What needs to be done to commit it to the project repo?
Add changes to staging area: git add modified_file.txt or git add . (adds all changes); Commit the changes: git commit -m "message"; Push changes: git push
49.
[8pts] You are creating a git repo for a new project in Linux. Give the git commands needed to initialize it.
Navigate to directory: cd /path; Initialize git repository: git init; Create README file: nano README.md; Add & commit README file: git add README.md, git commit -m "Initial commit with README"
50.
[5pts] What is the purpose of the SSH key we created and added to your Linux and GitHub site?
Provides a secure & authenticated communication channel between local machine and the GitHub server. Steps: Generate SSH key pair; Add public key to GitHub; Use SSH for Git operations
51.
[7pts] You want to download a public ROS2 examples package (
https://github.com/ros2/examples.git
) to a local ROS2 workspace directory ("~/ws/src/examples"). Provide the Linux command(s) to accomplish this.
git clone https://github.com/ros2/examples.git ~/ws/src/examples (clones the ROS2 examples repository into the workspace)
52.
[5pts] From the command prompt in a Linux terminal, provide the command to:
View all active ROS 2 topics: ros2 topic list
View all active ROS 2 nodes: ros2 node list
View all active ROS 2 services: ros2 service list
View the structure of message type "sensor_msgs/LaserScan": ros2 interface show sensor_msgs/msg/LaserScan
Subscribe to topic "/lab02/ControllerInfo": ros2 topic echo /lab02/ControllerInfo
Launch package "labX" with launch file "world1.launch.py": ros2 launch labX world1.launch.py
53.
[7pts] In ROS2, how do topics, services, and actions differ?
Topics are suitable for broadcasting data to multiple nodes asynchronously; Services are used for synchronous one-to-one communication with a request & reply; Actions are employed for more complex, asynchronous one-to-one communication with goals, feedback, and results
54.
[5pts] In ROS2, what are Parameters?
Mechanism for storing & retrieving configuration data at runtime. Provides a way to dynamically configure & tune the behavior of nodes in a ROS2 system. Offers a centralized & standardized way to manage configuration settings.
55.
[5pts] What is a ROS bag?
A file format in ROS used to store & replay ROS message data. A way to record data generated by ROS nodes during the runtime of a robotic system.
56.
[5pts] What is a ROS message type?
Structured data type used for communication between different nodes.
57.
[5pts] What is the purpose of a launch file in a ROS2 package?
Used to specify & launch multiple ROS nodes & associated parameters with a single command. Node configuration, Node grouping, Parameterization, Namespace management, Remapping, Conditionals & loops, Composability, Launch description language (LDL)
58.
[5pts] What is the purpose of setup.py in a Python ROS2 package?
Enables the build, installation, & distribution of the ROS package. Key purposes are: package metadata, dependencies specification, package installation, package entry points, test configuration, and distribution.
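A minimal setup.py for a hypothetical ament_python package (all names here are illustrative, not from the course materials):

```python
from setuptools import setup

package_name = 'my_pkg'  # hypothetical package name

setup(
    name=package_name,
    version='0.0.1',
    packages=[package_name],          # Python modules to install
    data_files=[
        # install package.xml so ROS 2 can index the package
        ('share/' + package_name, ['package.xml']),
    ],
    install_requires=['setuptools'],  # dependency specification
    entry_points={
        # executables exposed to "ros2 run <pkg> <name>"
        'console_scripts': [
            'navigator = my_pkg.navigator:main',  # hypothetical node entry point
        ],
    },
)
```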
59.
[5pts] What is a ROS 2 Package?
Encapsulates & organizes the code, configuration, & resources related to a specific piece of functionality or component within a system. Used to structure & modularize code in a way that facilitates the development, deployment, & reuse of robotic software.
60.
[5pts] What does an SDF file define?
Defines the structure & properties of elements in a robot/simulation environment. Provides a way to describe various aspects of the systems, including models, sensors, actuators, physics, & environment enabling interoperability across frameworks.
61.
You are writing the code for a Python node file, which contains a class Navigator(Node). Write the Python code to:
[6pts] In the __init__ function, initiate a new class property "targetOrientation" to 0.0 (only write the line of code to accomplish this, not the function)
[7pts] In the __init__ function, register a subscription for topic name "MyTopics/cmd_vel" of type Twist, which will callback a function defined elsewhere in your class called "velocityCallback", and a queue length of 15 (only write the line of code to accomplish this, not the __init__ function or callback function)
[10pts] Define a callback function "stringCallback" for a topic with message type string. The function should extract the string data from the message and print it to the terminal.
# Initiating a new class property "targetOrientation" in the __init__ function
self.targetOrientation = 0.0

# Registering a subscription for topic "MyTopics/cmd_vel" of type Twist,
# with callback function "velocityCallback" and a queue length of 15
self.create_subscription(Twist, 'MyTopics/cmd_vel', self.velocityCallback, 15)

# Defining the callback function "stringCallback" for a topic with message type String
def stringCallback(self, msg):
    string_data = msg.data  # Extract the string data from the message
    print(f"Received string message: {string_data}")
62.
[5pts] What build system is used in ROS 2? And what build type is used for Python-only packages?
The default build tool in ROS 2 is colcon (it builds the packages of a workspace; the underlying build system is ament). For Python-only packages in ROS 2, the build type is "ament_python" (specifically designed for building ROS 2 packages that consist entirely of Python code).