EdgeLab Digital Twin - Connected Car Use Case
Our simulation starts from a quiet, static state. The dashboard shows zero attached UEs and no data exchange, while the console shows the initial definitions and set-up taking place.
Note that the cars are not yet connected to the simulation, so we connect them one by one: first our UE under test - the "real" car - followed by 29 additional simulated UEs.
We provide full network connectivity over a simulated wireless network, mimicking the links between cars and between each car and the road infrastructure.
On the dashboard, we see the number of connected UEs rising from zero to 30 and data beginning to flow. Once the cars are connected, we are able to interact with them throughout the run.
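The attach sequence described above can be sketched as a simple loop. This is a minimal stand-in, not the real EdgeLab API: the `Simulation` class, `attach` method, and UE names are all illustrative assumptions.

```python
from dataclasses import dataclass, field
from typing import List

@dataclass
class Simulation:
    """Minimal stand-in for the EdgeLab attach flow; names are illustrative."""
    attached_ues: List[str] = field(default_factory=list)

    def attach(self, ue_id: str) -> None:
        # Register the UE; the dashboard counter corresponds to len(attached_ues).
        self.attached_ues.append(ue_id)
        print(f"attached {ue_id}; total UEs = {len(self.attached_ues)}")

sim = Simulation()
sim.attach("ue-under-test")            # the "real" car goes first
for i in range(1, 30):                 # followed by 29 simulated UEs
    sim.attach(f"sim-car-{i:02d}")
```

The one-by-one loop is what produces the slowly rising UE count seen on the dashboard.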
On the console (above) we see the UEs being created and connected. We also see a traffic light broadcasting its state as it turns from red to green.
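The traffic light behaves like a small publish/subscribe actor: whenever its state changes, it notifies interested UEs. The sketch below shows this pattern under assumed names (`TrafficLight`, `subscribe`, `set_state`); the real platform's interface may differ.

```python
from typing import Callable, List

class TrafficLight:
    """Illustrative traffic-light actor; in the demo its state changes show up on the console."""

    def __init__(self) -> None:
        self.state = "red"
        self._subscribers: List[Callable[[str], None]] = []

    def subscribe(self, callback: Callable[[str], None]) -> None:
        # Nearby UEs register to be told about every state change.
        self._subscribers.append(callback)

    def set_state(self, new_state: str) -> None:
        self.state = new_state
        for callback in self._subscribers:
            callback(new_state)

received = []
light = TrafficLight()
light.subscribe(received.append)       # the test subject listens to the light
light.set_state("green")               # the red-to-green transition seen in the console
```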
With a multitude of cars moving through the urban environment, our test subject has the opportunity to interact with many of them, with traffic lights, and with other actors in the virtual scenery. Data is collected along the way and processed by the test subject to make decisions about how to continue its ride along the simulated streets.
In the visualization screen, we can see the test subject waiting for its green light before carrying on, while two cars coming from the opposite direction turn right and left in front of it. The small green-and-brown diagram shows the current position of each UE connected to the platform.
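The situation at the junction - wait for green, then let the turning cars clear - can be captured by a tiny decision rule. This is an illustrative simplification of the test subject's logic, not the actual algorithm running on the platform.

```python
def decide(light_state: str, crossing_cars: int) -> str:
    """Illustrative decision rule for the test subject (assumed, not EdgeLab's real logic):
    proceed only on green and only when no car is crossing its path."""
    if light_state != "green":
        return "wait"          # hold at the red light
    if crossing_cars > 0:
        return "yield"         # let the turning cars clear the junction
    return "proceed"

decisions = [
    decide("red", 0),      # waiting for the green light
    decide("green", 2),    # two cars still turning in front of the test subject
    decide("green", 0),    # clear to continue along the simulated streets
]
```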
At the end of the run, we observe the console wrapping up the activity. We ping our test subject and verify that it is still connected to the platform. The dashboard shows a stable count of 30 UEs and substantial data activity in and out of the test subject and the system:
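The final connectivity check can be modelled as a membership test against the set of attached UEs. The helper name and registry shape below are assumptions for illustration, not the platform's real ping mechanism.

```python
def ping(attached: set, ue_id: str) -> bool:
    """Toy connectivity check: a UE answers a ping iff it is still attached."""
    return ue_id in attached

# 1 test subject + 29 simulated cars = the stable 30-UE count on the dashboard
attached = {"ue-under-test", *(f"sim-car-{i:02d}" for i in range(1, 30))}
still_connected = ping(attached, "ue-under-test")   # True: the test subject answers
```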
And now for a real-life case that could be addressed by using a platform such as EdgeLab: in a recent Tesla accident, the car's automated driving system apparently failed to correctly interpret a parked car whose hood was open, and drove into it. The driver eventually realized something was wrong, took control, and hit the brakes, but it was too late and a crash occurred. When such situations happen, the Tesla black box could be retrieved and the last drive replayed on the EdgeLab platform. If the root cause is a vision-processor fault, it would become apparent soon enough, and the vision processor could be retrained to cover the gap.
Note that this is a deliberately simple demonstration; the platform's capabilities can readily be extended to match the specific needs of any use case.