Computer Vision in FIRST
Sam Carlberg
Brad Miller
Agenda
• Overview
• What was new in 2018
• Camera hardware options
• cscore
• CameraServer (NetworkTable integration)
• GRIP
• Code generation
• WPILib - VisionThread/VisionRunner
• End-to-end Demo with Raspberry Pi 3
Overview
• What is computer vision?
• Photography + Computer Processing
• Start with good photos
• Exposure & lighting
• Composition & focal length
• Then process digitized values to measure, compare, and rate
Overview
• Math... not magic
• Example – What makes the target unique?
• Describe it to a friend on the phone/email
• Quantify that info by measuring
• Color, size, area/CH (convex hull) area
• Aspect ratio, moment of inertia
• Limit tests
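The same idea in code: a minimal sketch (Java, to match the WPILib examples later in the deck) that thresholds on color, measures each contour, and applies limit tests. The HSV bounds and the area/aspect-ratio limits are placeholder values, not tuned numbers from these slides.

import java.util.ArrayList;
import java.util.List;
import org.opencv.core.*;
import org.opencv.imgproc.Imgproc;

public class TargetFinder {
    public static List<Rect> findTargets(Mat bgrFrame) {
        // Threshold on the target color in HSV space (bounds are placeholders)
        Mat hsv = new Mat();
        Imgproc.cvtColor(bgrFrame, hsv, Imgproc.COLOR_BGR2HSV);
        Mat mask = new Mat();
        Core.inRange(hsv, new Scalar(60, 100, 100), new Scalar(90, 255, 255), mask);

        // Find candidate blobs
        List<MatOfPoint> contours = new ArrayList<>();
        Imgproc.findContours(mask, contours, new Mat(),
                Imgproc.RETR_EXTERNAL, Imgproc.CHAIN_APPROX_SIMPLE);

        // Measure each contour, then limit-test: reject anything too small or the wrong shape
        List<Rect> targets = new ArrayList<>();
        for (MatOfPoint contour : contours) {
            double area = Imgproc.contourArea(contour);
            Rect box = Imgproc.boundingRect(contour);
            double aspect = (double) box.width / box.height;
            if (area > 100 && aspect > 1.5 && aspect < 4.0) {
                targets.add(box);
            }
        }
        return targets;
    }
}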
FRC Workflow
• Goals on the field have often been marked with retro-reflective tape
• Make use of the images provided on day one to design vision algorithms
• Create a vision program based on the design
• Base robot control loops on gyros or other fast sensors – usually not on camera frames (see the sketch below)
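A minimal sketch of that split, with illustrative names not taken from the slides: the slow vision result only updates a heading setpoint, while the fast gyro loop does the actual steering.

public final class TurnToTarget {
    private static final double kP = 0.02;   // proportional gain, tuned on the robot
    private double targetHeading;            // degrees, updated whenever a frame is processed

    /** Called from the (slow) vision thread when a new frame result arrives. */
    public void onVisionResult(double currentGyroAngle, double targetOffsetDegrees) {
        targetHeading = currentGyroAngle + targetOffsetDegrees;
    }

    /** Called from the (fast) robot control loop, e.g. every 20 ms. */
    public double steeringOutput(double currentGyroAngle) {
        double error = targetHeading - currentGyroAngle;
        return kP * error;                    // feed to arcadeDrive(speed, turn)
    }
}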
OpenCV
• Popular computer vision library
• C, C#, C++, Java, Python, and LabVIEW interfaces
• Runs on Linux, Windows, Mac, iOS, and Android
• Libraries included with WPILib for the roboRIO
• Designed for computational efficiency and can take advantage of accelerated hardware (GPUs)
• User community of 47,000 people and over 9M downloads
Cameras
• IP camera – $$
• USB webcam – $
• Pixy Cam – $
• LimeLight – $$$
IP Cameras
• Typically used for security
• Axis cameras have low latency and a good API
• Support multiple clients
• Easiest to stream to web page or other viewers
• Fixed lens, basic configuration
PixyCam
• Camera with built-in processor
• Limited capabilities, but has been used successfully by some teams (blob and color code tracking)
• Lots of output types (SPI, I2C, UART, analog/digital)
• Includes a cable to connect directly to an Arduino along with some libraries
• Very fast – 50 updates/sec
Basler Dart
• Sensitive sensor & Global Shutter
• 54 fps color, 120 fps monochrome (USB 3, full frame)
• Area scan – lower bandwidth, even higher fps
• Hundreds of properties
• Digital lines for exposure control
• Lenses available from companies like Edmund Optics
Hardware processing
• roboRIO – in an independent thread
• Driver station computer
• Onboard coprocessor
• Raspberry Pi
• Kangaroo
• NVIDIA TK1
• Cell phone
• LimeLight (Pi Compute Module 3)
• ….
CameraServer
cscore
• Makes cameras easy to use
• Source→sink model (see the sketch after this list)
• Can switch sinks between sources
• Support for USB and IP cameras
• Multiplatform, except USB camera support is Linux only
• USB camera support for Windows and Mac planned for 2019
• Set camera properties (if supported by the camera)
• Brightness
• Exposure
• Frame rate
• Frame size
• MJPEG streaming webserver (looks like an IP camera)
• Webpage access to camera properties for easy experimentation
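A minimal sketch of the source→sink model using the raw cscore classes rather than the CameraServer wrapper (Java; the device index, names, and port below are illustrative assumptions):

import edu.wpi.cscore.MjpegServer;
import edu.wpi.cscore.UsbCamera;

UsbCamera camera = new UsbCamera("Front", 0);                // source: USB device 0 (/dev/video0)
camera.setResolution(320, 240);
camera.setFPS(30);

MjpegServer server = new MjpegServer("Front stream", 1181);  // sink: MJPEG webserver on port 1181
server.setSource(camera);                                    // sinks can be switched between sources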
CameraServer
• Wrapper around cscore
• Automatically creates MJPEG server when camera is created
• Publishes information about created cameras to NetworkTables
• Restreams IP cameras via roboRIO USB port
Code examples
• Single USB camera
CameraServer.getInstance().startAutomaticCapture();
Code examples
• Camera Settings
UsbCamera frontCamera = CameraServer.getInstance().startAutomaticCapture(
    "Front",
    "/dev/v4l/by-path/platform-3f980000.usb-usb-0:1.2:1.0-video-index0");
frontCamera.setVideoMode(PixelFormat.kMJPEG, 320, 240, 30);
frontCamera.setBrightness(50);
frontCamera.setWhiteBalanceHoldCurrent();
frontCamera.setExposureManual(15);
GRIP
GRIP
• Developed as a WPI RBE/CS senior capstone project
• Designed to make vision processing easier
• FRC teams and researchers
• Based on OpenCV
• Works equally well with any robot programming language
GRIP Workflows
• Select a source – camera, video, or saved images
• Create the processing pipeline while viewing intermediate results
• Set outputs via NetworkTables variables
• Robot program gets the values and drives/turns/aims at the target (see the sketch after this list)
Three workflows:
1. GRIP on DS laptop
2. GRIP generated code on roboRIO
3. GRIP generated code on co-processor
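For workflow 1, a minimal sketch of the robot-side read (Java). GRIP's Publish ContoursReport step posts arrays under a "GRIP/<name>" table; the publish name "myContoursReport" and the 320-pixel image width below are assumptions that depend on your pipeline and camera settings.

import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

NetworkTable grip = NetworkTableInstance.getDefault().getTable("GRIP/myContoursReport");
double[] centerX = grip.getEntry("centerX").getDoubleArray(new double[0]);
double[] area    = grip.getEntry("area").getDoubleArray(new double[0]);

if (centerX.length > 0) {
    // Offset of the first contour from the image center; feed this to the drive/turn code
    double turnError = centerX[0] - 160.0;
}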
GRIP Results
• What happened in 2017
• 16,000 downloads!
• Code generation
• HTTP REST API
• Many teams running on roboRIO, driver station and co-processors
• Future features
• GPU acceleration
• Improved FRC usage
• LimeLight will support GRIP in 2019
GRIP Demo
LimeLight camera
Simple to start
● Camera system designed for FRC that works with Java, C++, LabVIEW, and Python
● Includes a camera, LED lights, a microprocessor with NetworkTables, and a web server
● Easy to mount, easy to power, built in vision-pipeline
● Tune the pipeline parameters using a web browser
● Robot reads data from the “LimeLight” NetworkTables table
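A minimal sketch of that read (Java). The table name "limelight" and the tv/tx/ty/ta keys follow the LimeLight documentation rather than these slides:

import edu.wpi.first.networktables.NetworkTable;
import edu.wpi.first.networktables.NetworkTableInstance;

NetworkTable limelight = NetworkTableInstance.getDefault().getTable("limelight");
double tv = limelight.getEntry("tv").getDouble(0.0);  // 1.0 when a target is in view
double tx = limelight.getEntry("tx").getDouble(0.0);  // horizontal offset to target, degrees
double ty = limelight.getEntry("ty").getDouble(0.0);  // vertical offset to target, degrees
double ta = limelight.getEntry("ta").getDouble(0.0);  // target area, percent of the image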
LimeLight camera 2
● Has built-in “Fixed Function” OpenCV vision pipeline
○ Threshold
○ Erode
○ Dilate
○ Filter contours
○ Targeting logic
● Up to 10 pipeline configs
● Second USB camera can be added to the feed
Raspberry Pi 3 Demo
• Two cameras hooked up to the Pi 3 (both USB)
• The Pi Camera is also supported but not demoed
• Pi configured with a read-only file system
• For demo purposes, building with an Eclipse project
• Could use Gradle, etc.
• Deploy a GRIP pipeline with generated code (see the sketch below)
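On the roboRIO side, a minimal sketch of the VisionThread/VisionRunner approach from the agenda, running a GRIP-generated pipeline class. "GripPipeline" and the filterContoursOutput() accessor are assumptions; use whatever names GRIP generated for your pipeline (generated with the WPILib VisionPipeline option).

import org.opencv.core.Rect;
import org.opencv.imgproc.Imgproc;
import edu.wpi.cscore.UsbCamera;
import edu.wpi.first.wpilibj.CameraServer;
import edu.wpi.first.wpilibj.vision.VisionThread;

UsbCamera camera = CameraServer.getInstance().startAutomaticCapture();
camera.setResolution(320, 240);

// Runs the pipeline in its own thread; the lambda is called with each frame's results
VisionThread visionThread = new VisionThread(camera, new GripPipeline(), pipeline -> {
    if (!pipeline.filterContoursOutput().isEmpty()) {
        Rect r = Imgproc.boundingRect(pipeline.filterContoursOutput().get(0));
        double centerX = r.x + r.width / 2.0;  // hand off to the drive code, e.g. via a volatile field
    }
});
visionThread.start();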
Ultimate Simplicity
• Like Jeff Goldblum said in Jurassic Park, “life finds a way”
• No camera – just a flashlight for aiming, with the vision sensor being the drivers at the driver station