EdgeLab Digital Twin - Lidar Development Use Case
In this use case, we start with a real-life or simulated lidar unit mounted on a simulated UE (User Equipment, here a car) that travels around a city environment, interacting with other urban elements such as cars, traffic lights, pedestrians, motorcycles, road obstacles and so on. Along the way, the lidar produces outputs that are sent to the on-chip processor. The device driver processes the raw data and delivers it in a "down-to-earth" format to the application software.
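To make the raw-to-"down-to-earth" step concrete, here is a minimal sketch of what such a driver-level conversion might look like. The frame layout (per-sample angle in degrees plus range in metres) and the function name are illustrative assumptions, not the platform's actual format:

```python
import math

def driver_process(raw_frame):
    """Convert raw lidar samples (angle_deg, range_m) into Cartesian
    (x, y) points -- a 'down-to-earth' form an application could use.
    The raw layout here is an assumed, simplified format."""
    points = []
    for angle_deg, range_m in raw_frame:
        theta = math.radians(angle_deg)
        points.append((range_m * math.cos(theta), range_m * math.sin(theta)))
    return points

# A raw frame as it might arrive from the sensor:
frame = [(0.0, 10.0), (90.0, 5.0)]
print(driver_process(frame))
```

A real driver would also handle timestamps, intensity values and error flags; the point is only that the application never sees the sensor's raw encoding.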
So here, it is all about the processor. The SW modules involved are the device drivers and the application software. "Default" SW stacks for ADAS, ported to various ISAs, can be provided to the SoC developer by the platform. Note that one ISA may perform better than another in this case, given its ability to process this type of signal.
Various types of activities can be carried out in this use case, with the help of EdgeLab:
at the chip level, we can develop and clean up our Verilog with the help of, e.g., Mentor's emulator on the cloud. Our EdgeLab platform is capable of connecting to the emulator and sharing the synthetic or real data with it.
at the device driver level, the driver can intercept the lidar signals from the platform (either synthetic signals created by the platform or signals produced by a real lidar connected to the system), so driver development and debug may be based on the lidar raw data that EdgeLab produces while the car is circulating in its urban environment.
at the application level, application code development could be based on the synthetic data that comes from EdgeLab and is intercepted at the application input, or on the output of the real device drivers, which in turn have consumed the synthetic or real data produced for them by the EdgeLab platform.
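The last two items amount to choosing where the platform data enters the stack. A small sketch of that idea, with all names (`run_pipeline`, `application_step`) being illustrative assumptions rather than EdgeLab APIs:

```python
# Collected application-level output (placeholder for real app logic).
results = []

def application_step(points):
    results.append(points)

def run_pipeline(source, driver=None):
    """Feed frames from any source -- synthetic, recorded, or a real
    sensor -- into the stack.  With a driver supplied, frames are
    intercepted at the driver input; without one, they are assumed to
    be already processed and enter directly at the application level."""
    for frame in source:
        application_step(driver(frame) if driver else frame)

# Intercept at the application level (no driver yet):
run_pipeline([[(1.0, 2.0)], [(3.0, 4.0)]])
# Intercept at the driver level, with a trivial stand-in driver:
run_pipeline([[(5.0, 6.0)]], driver=lambda f: [(x * 2, y * 2) for x, y in f])
```

The same application code runs in both cases; only the injection point changes.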
Thus, with the help of the EdgeLab platform, you may achieve several development or validation goals on your design: you may use the platform data to assess the performance of the ISA you have chosen, to develop and validate your device drivers, to develop and validate your application code, or even to feed post-deployment issues that require the lidar developers' attention back into the system.
Let's take a look at a few screenshots and see how this is done:
Our simulation starts with an urban environment in which our Lidar-equipped car will circulate.
In this specific run, we have chosen to have 29 fully connected UEs (vehicles) in the near vicinity of the UE under test, which will interact with us at different times. There are also a number of crossings with traffic lights that open and close on their own timing and need to be properly understood by the lidar system.
Throughout the whole driving session, the lidar is active and produces raw data that is forwarded to the car:
In the diagram above, the left side shows the output from the lidar. The upper right section shows the raw data as it is received by the software device drivers, and the lower right section shows the processed data as it is delivered to the application software.
As a first step, the device drivers may be developed with the help of the lidar raw data intercepted from the simulation environment. Note that the lidar raw data may also be recorded into a file and replayed into the software environment from that recording, making the run repeatable and easier to handle.
Once the device drivers are capable of delivering information to the application, the application may be developed with the help of the platform by feeding the device drivers' output into the application level. The same may be achieved even before the drivers are fully developed, by intercepting the simulated environment's and software drivers' output data directly at the application level. By doing so, software development becomes more of a parallel effort and project schedules may be shortened.
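One common way to enable this parallel effort is to code the application against a driver interface and substitute a stub that serves simulation data until the real driver is ready. The class and method names below are hypothetical, chosen only to illustrate the pattern:

```python
class LidarDriver:
    """The interface the application is coded against (illustrative)."""
    def read_points(self):
        raise NotImplementedError

class StubDriver(LidarDriver):
    """Stand-in that serves pre-processed frames taken directly from
    the simulation, letting application work start before the real
    device driver exists."""
    def __init__(self, frames):
        self._frames = iter(frames)

    def read_points(self):
        return next(self._frames, None)

def application_loop(driver):
    # The application depends only on the interface, so the stub and
    # the eventual real driver are interchangeable.
    frame_sizes = []
    while (points := driver.read_points()) is not None:
        frame_sizes.append(len(points))
    return frame_sizes

print(application_loop(StubDriver([[(1, 2)], [(3, 4), (5, 6)]])))
```

When the real driver lands, it implements the same interface and the application code is untouched.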