Meeting agenda: (just some thoughts, halfway through working on the doc → this is a lot; I will work through the software with you)
System-by-system discussion
Project Comms:
What is the current status of the Comms system?
Definition:
The communication system mainly links the ground station to the rover
Is the current comms system fully decoupled (i.e., swapping in a new comms module only requires a settings change; no code change needed)?
Multiple radio links and switches in between
Do we use RSSI to evaluate radio signal strength?
What types of messages travel between the rover and the ground station, and which are not being relayed back?
If we want to debug, can the comms system help us retrieve the data we are interested in?
How should we partition the system:
By the amount of information being communicated
By having different links do different things (easier)
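To make the RSSI question above concrete, here is a minimal sketch of choosing among multiple radio links by signal strength. The link names and the -85 dBm cutoff are assumptions for discussion, not the actual rover configuration:

```python
# Hypothetical sketch: pick the strongest usable radio link by RSSI (dBm).
# Link names and the usable-signal threshold are placeholder assumptions.

RSSI_USABLE_DBM = -85  # assumed cutoff below which a link is unusable

def best_link(rssi_by_link):
    """Return the name of the link with the strongest RSSI, or None if all are too weak."""
    usable = {name: rssi for name, rssi in rssi_by_link.items()
              if rssi >= RSSI_USABLE_DBM}
    if not usable:
        return None
    return max(usable, key=usable.get)
```

Whether link selection lives in the comms module or stays hardcoded is exactly the decoupling question above.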
Project Drivetrain:
Which modules are involved after the ROS Joy message is received?
What data is being received by the CAN wrapper (fused)?
How can the autonomy module and sensor feedback interact with the current drivetrain system?
How does the E-stop come into play in the current system?
Do we want to enable a brake feature?
Motor calibration feature support?
How frequently do we update over CAN (at a regular rate, or only upon new data)?
How do we want to implement our own closed-loop control? Velocity-based? Position-based? Just motor-percentage based?
Do we want to have a sensorless mode?
What are the modes inside the drivetrain?
GPS location controlled?
Encoder position controlled?
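As one point of reference for the closed-loop question, here is a minimal velocity-mode PI loop that outputs a motor percentage. Gains, units, and the clamped percent-output interface are assumptions for discussion, not a proposed implementation:

```python
# Sketch of one closed-loop option: velocity control via a PI loop whose
# output is a motor percentage in [-1, 1]. Gains and dt are placeholders.

class VelocityPI:
    def __init__(self, kp, ki, dt, out_limit=1.0):
        self.kp, self.ki, self.dt = kp, ki, dt
        self.out_limit = out_limit  # clamp so we never exceed 100% output
        self.integral = 0.0

    def step(self, target_vel, measured_vel):
        """One control tick: return motor output in [-out_limit, out_limit]."""
        error = target_vel - measured_vel
        self.integral += error * self.dt
        out = self.kp * error + self.ki * self.integral
        return max(-self.out_limit, min(self.out_limit, out))
```

Percentage mode would skip the loop entirely; position mode would swap the error source to encoder counts (or GPS, per the modes above).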
Project Sensory:
What’s the status of the VN-300?
What data does the module return?
How can we utilize the data?
Is sensor fusion needed?
Which other systems need to listen to the sensor data?
How do we update them: upon request, or regularly?
Ignore the localization board for now?
How do we define the duty of each of the four cameras, and what is the communication logic among them?
For the environment-sensing camera, which algorithms do we use?
How can those algorithms flag obstacles and guide the drivetrain to avoid them when needed?
Are the other two cameras for photos only? Do they just bypass the Jetson?
Encoder integration plan? → any sensor fusion needed?
What about the gimbal controller?
How do we initiate control, or is there just a fixed, hardcoded set of gimbal operations?
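The "upon request vs. regularly" question above could go either way; a hub that caches the latest reading supports both at once. This is a placeholder sketch, not the actual sensor architecture:

```python
# Sketch of a sensor hub supporting both update models:
#  - push: subscribers are called whenever a driver publishes new data
#  - pull: other systems poll the cached latest value on request
# Topic names and the callback shape are placeholder assumptions.

class SensorHub:
    def __init__(self):
        self.latest = {}        # topic -> most recent value
        self.subscribers = []   # callbacks taking (topic, value)

    def subscribe(self, callback):
        self.subscribers.append(callback)

    def publish(self, topic, value):
        """Push model: the sensor driver calls this on every new reading."""
        self.latest[topic] = value
        for cb in self.subscribers:
            cb(topic, value)

    def get(self, topic):
        """Pull model: consumers poll the cached value when they need it."""
        return self.latest.get(topic)
```

If sensor fusion is needed, the fused estimate could publish through the same hub as just another topic.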
Project GUI
Same question: what are the inputs and what are the outputs?
What user-defined inputs are available via joystick plus an emulated interface, or via command line?
Do we process data on the GUI end or not?
Project ARM
What is the model we want to use?
Are we treating the encoder, controller, and motor as a closed-box module that the model simply interfaces with?
What are the available control modes for the arm?
How can we do pipelining for the arm motion?
E.g., if the user commands the arm to pick something up at point A and follow a path to point B, do we feed it one point at a time, or the whole path?
This question should really be worded as: how do we interface with the model? Do we have a single layer around it?
(Note: there isn’t much for the arm; if the model is good we don’t need to worry, but we need to understand the system’s capabilities first.)
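One way to frame the pipelining question: queue the whole path up front and feed the controller one waypoint at a time as each is reached. The interface below is hypothetical, since the real model/controller API is exactly what we need to learn first:

```python
# Hypothetical sketch of arm motion pipelining: the user supplies a full
# path (A -> B), and we advance through it one setpoint at a time as the
# arm reports each waypoint reached. Waypoint format is a placeholder.

from collections import deque

class ArmPipeline:
    def __init__(self, path):
        self.queue = deque(path)  # e.g. (x, y, z) waypoints from A to B

    def next_setpoint(self, reached):
        """Return the current target; advance when the arm reports 'reached'."""
        if reached and self.queue:
            self.queue.popleft()
        return self.queue[0] if self.queue else None
```

The alternative, handing the model the entire path at once, depends on whether the closed-box module accepts trajectories or only single setpoints.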
AUX interface
How does the Jetson handle communication with the PDB?
How does the Jetson handle communication with a general AUX board?
Can we define a generic custom message scheme?
What IPC is used between the different threads, and what are the message and thread priorities?
Are there any hard timing requirements?
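For the generic-message-scheme question, one common shape is a fixed header (board ID, message ID, payload length) plus payload and checksum. The field widths and layout below are assumptions to seed the discussion, not a spec:

```python
# Sketch of a generic Jetson <-> PDB/AUX framing scheme.
# Header: board_id (u8), msg_id (u8), payload length (u16, little-endian),
# then the payload, then a 1-byte checksum. All widths are assumptions;
# a real link would likely use CRC-8 instead of an additive checksum.

import struct

HEADER = struct.Struct("<BBH")  # board_id, msg_id, payload length

def encode(board_id, msg_id, payload):
    frame = HEADER.pack(board_id, msg_id, len(payload)) + payload
    checksum = sum(frame) & 0xFF  # simple additive checksum for illustration
    return frame + bytes([checksum])

def decode(frame):
    body, checksum = frame[:-1], frame[-1]
    if sum(body) & 0xFF != checksum:
        raise ValueError("checksum mismatch")
    board_id, msg_id, length = HEADER.unpack(body[:HEADER.size])
    return board_id, msg_id, body[HEADER.size:HEADER.size + length]
```

A scheme like this keeps the PDB and all AUX boards on one parser, with `msg_id` dispatching to per-board handlers.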
What do I want to get out of this?
A system architecture diagram showing all the involved systems and how they interact
A detailed document covering the details of each system not captured by the diagram