Video 1.3 - On the Need of Mathematical Models
#Physics #Engineering #Control_Theory #Robotics
B) On the Need of Mathematical Models
We have seen that "controls" deals with "dynamic systems" in general, and robotics is one facet of this.
Now, we will make sense of what this means with a precise language like mathematics. And to do this, we are going to need "models".
Models are an approximation, an abstraction of what the actual system is doing. "Control design" is done in relation to that model and then deployed on the "real system".
The main question in "Control Theory" is:
- How do we pick the input signal "u" in order to match the system output "y" with the reference "r"?
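The question above can be sketched in code. This is a minimal, illustrative feedback loop, not from the video itself: the plant model (a simple integrator) and the gain `kp` are assumptions chosen just to show a controller picking "u" so that "y" approaches "r".

```python
# Minimal sketch of the control question: pick the input u so that the
# output y tracks the reference r. The proportional gain kp and the toy
# integrator plant are illustrative assumptions, not from the video.

def simulate(r=1.0, kp=0.8, steps=30, dt=0.1):
    y = 0.0                      # system output, starts at rest
    for _ in range(steps):
        e = r - y                # tracking error
        u = kp * e               # controller picks the input from the error
        y = y + dt * u           # toy plant: the output integrates the input
    return y

print(simulate())  # y ends up close to the reference r = 1.0
```

Even this toy loop shows the pattern every controller in the course follows: measure the output, compare it to the reference, and compute the next input from the error.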
B.1) Main "Performance Objectives" in Control Design
There are many objectives we can achieve when choosing a control signal (listed here from most critical to least critical).
- Stability Objective: loosely speaking, the system doesn't blow up. If you design a controller that makes the system go unstable, then no other objective matters because you have failed.
- Tracking Objective: we can make the system output "y" be equal to the reference "r". This means we can make robots follow paths and self-driving cars follow roads.
- Robustness Objective: our controller (which was designed with a model) also works with the real system.
Remember that models are never going to be perfect, so we can't overcommit to a particular model and we can't have our controller be too highly dependent on what the particular parameters of a model are.
We need to design controllers that are "immune" to variations across parameters in the model.
- Disturbance Rejection Objective: similar to robustness, the real world brings more complications. The controller has to be able to overcome at least a reasonable level of disturbance to be effective.
These disturbances can be:
- measurement noise
- external loads (e.g. "wind")
- friction
- Optimality Objective: this answers the question: how do we do something the best possible way? And "best" can mean different things, like being the fastest or using the least amount of fuel.
B.2) Dynamical Models
So… What do these models look like?
B.2.1) Models in Discrete Time
Let's start in "Discrete time".
We can describe the next state as a function of the current state (and input):
x[k+1] = f(x[k], u[k])
We can understand this with the simplest example of a "Discrete Time System", a clock:
x[k+1] = x[k] + 1
We can also plot this system.
It is a trivial system, but it is a difference equation nonetheless, and it changes over time.
Dynamics = Change Over Time
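The clock example above can be written as a few lines of code. This is just a sketch of the difference equation from the notes; the function and variable names are mine:

```python
# The clock as a discrete-time system: the next state is the current
# state plus one tick. A trivial difference equation, x[k+1] = x[k] + 1.

def clock_step(x):
    return x + 1  # next state = current state + 1

x = 0            # initial condition: the clock starts at zero
states = [x]
for _ in range(5):
    x = clock_step(x)
    states.append(x)

print(states)  # [0, 1, 2, 3, 4, 5] — the state changes over time
```

The point is not the clock itself, but the shape of the model: a rule that maps the current state to the next one.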
B.2.2) Models in Continuous Time
But we have a problem...
The laws of physics are all in "Continuous time".
Instead of describing the "next" state, we need derivatives with respect to time.
We can now describe the instantaneous rate of change of the state:
ẋ(t) = f(x(t), u(t))
Now, what would be the "Differential Equation" of our clock example?
It would be very simple: the rate of change of a clock is 1 second each second, or with Newton's dot notation:
ẋ = 1
And we can also plot "one solution" of this differential equation: if the initial condition of the clock is zero, the solution is x(t) = t.
When we work with dynamic models, we will almost always need a continuous-time model, because nature is continuous.
But our implementations on robots and computers run in "discrete time", so we need to know how to go from continuous time to discrete time.
B.3) From Continuous to Discrete
We need to sample our continuous model at some time interval δt.
In other words, the k-th discrete state is the continuous state at the sampling instant: x[k] = x(k·δt).
So the main question is: what is x(t + δt) in terms of x(t)?
We can approximate x(t + δt) with a Taylor series expansion around t:
x(t + δt) = x(t) + δt·ẋ(t) + higher-order terms
Read more about the Taylor series here.
And if we keep only the first-order term, we get:
x(t + δt) ≈ x(t) + δt·f(x(t), u(t))
This is a way of getting a discrete-time model from a continuous-time model (the Euler discretization).
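The sampling idea above can be sketched as code. This is an illustrative Euler step, with the clock (f = 1) as the test system; the sampling interval of 0.5 s is an assumed value:

```python
# Euler discretization sketch: turn a continuous-time model
# x_dot = f(x, u) into a discrete-time update by keeping only the
# first-order Taylor term. The clock (f = 1) is used as the example.

def euler_step(f, x, u, dt):
    # x(t + dt) ≈ x(t) + dt * f(x(t), u(t))
    return x + dt * f(x, u)

def clock_dynamics(x, u):
    return 1.0  # the clock's rate of change: one second per second

x, dt = 0.0, 0.5   # initial condition and sampling interval (assumed values)
for _ in range(4):
    x = euler_step(clock_dynamics, x, 0.0, dt)

print(x)  # after 4 steps of 0.5 s, the clock reads 2.0
```

For the clock this discretization is exact, but in general the higher-order terms we dropped introduce an error that shrinks as δt gets smaller.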
And this is how we take the things we do in continuous time and map them onto the actual implementations on computers, which ultimately run in discrete time.
Z) 🗃️ Glossary
| File | Definition |
| --- | --- |