Hack2023 1st Prize
<p class="left" style="font-size:26px;"> ←''[[ETSI_-_LF_-_OCP_-_MEC_Hackathon_2023|MEC Hackathon 2023]]''</p>
<br>
{{DISPLAYTITLE:<span style="position: absolute; clip: rect(1px 1px 1px 1px); clip: rect(1px, 1px, 1px, 1px);">{{FULLPAGENAME}}</span>}}
1st Prize Award

'''Managing natural resources using AI, Edge Computing and Advanced Communication'''

= Team =
Team Green Cyclops from Optare Solutions
* Xose Ramon Sousa Vazquez
* Santiago Rodriguez Garcia
* Fernando Lamela Nieto


<div class="panel-body">
[[File:Optare1.png|300px|center|top|class=img-responsive]]
</div>
  <div class="panel-footer">From left to right, Santiago Rodriguez & Fernando Lamela</div>
<!--  <div class="panel-footer">Footer text 1</div>  -->
</div><!-- End of pan -->
= Introduction =
<p>
A computer vision platform, or ecosystem, that uses 5G, edge computing, and AI to handle multiple video stream sources from various devices such as drones, fixed cameras, and IoT sensors.
</p>
<p>
The platform aims to enable the application of different AI detection and analytics models based on user selection.
</p>
<p>
The inference process takes place at the edge, where computational resources are shared to execute AI models relevant to different scenarios, including forest fire and smoke detection, wildlife control, and deforestation monitoring.
</p>




= Architecture =
[[File:Green-cyclops.png|800px|center|top|class=img-responsive]]
<br>
The key components and functionalities of this computer vision platform can be described as follows:
<p>
• Video Stream Management: The platform receives and manages multiple video streams and information from diverse sources, such as drones, fixed cameras, and IoT sensors. These streams are transmitted via the high-speed and low-latency capabilities of a 5G network.
</p>
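As an illustration of stream management, a minimal source registry along these lines could track what the platform is ingesting. This is a sketch; every name here (<code>StreamSource</code>, the RTSP URLs) is hypothetical and not taken from the project repository.

```python
from dataclasses import dataclass, field

@dataclass
class StreamSource:
    """One registered video/telemetry source (drone, fixed camera, IoT sensor)."""
    name: str
    kind: str   # "drone" | "fixed_camera" | "iot_sensor"
    url: str    # e.g. an RTSP endpoint reachable over the 5G network (assumed)

@dataclass
class StreamManager:
    """Keeps track of every source the platform is currently ingesting."""
    sources: dict = field(default_factory=dict)

    def register(self, src: StreamSource) -> None:
        self.sources[src.name] = src

    def by_kind(self, kind: str) -> list:
        return [s for s in self.sources.values() if s.kind == kind]

mgr = StreamManager()
mgr.register(StreamSource("drone-1", "drone", "rtsp://10.0.0.5/cam"))
mgr.register(StreamSource("tower-cam", "fixed_camera", "rtsp://10.0.0.9/cam"))
print(len(mgr.by_kind("drone")))  # 1
```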
<p>
• IoT Events Management: The platform receives and manages multiple events from devices placed on the ground, which add information to the use cases in the described scenario.
</p>
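Event handling could be as simple as parsing each device message and filing it under the scenario it enriches. This sketch assumes a JSON payload with a <code>scenario</code> field; the field names and device IDs are illustrative, not the project's actual schema.

```python
import json
from collections import defaultdict

# Events arriving from ground devices, grouped by the use case they enrich.
events_by_scenario = defaultdict(list)

def ingest(raw: str) -> None:
    """Parse one device event and file it under its scenario."""
    evt = json.loads(raw)
    events_by_scenario[evt["scenario"]].append(evt)

ingest('{"device": "soil-7", "scenario": "forest_fire", "temp_c": 41.5}')
ingest('{"device": "trap-2", "scenario": "wildlife", "species": "boar"}')
print(sorted(events_by_scenario))  # ['forest_fire', 'wildlife']
```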
<p>
• Edge Computing: The platform leverages edge computing infrastructure, which brings computational resources closer to the video sources. This allows for real-time analysis and decision-making at the network edge, reducing the need for data transmission to a central server or cloud. Edge computing facilitates faster response times and more efficient resource utilization.</p>
<p>
• AI Detection and Analytic Models: The platform supports the selection and deployment of different AI detection and analytic models based on specific requirements. For example, models can be chosen for forest fire and smoke detection, wildlife monitoring, or deforestation analysis. These models use computer vision techniques, such as object detection, image classification, and semantic segmentation, to extract meaningful information from the video streams.</p>
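Model selection by scenario can be sketched as a simple dispatch table. The scenario keys and detector functions below are placeholders for whatever models the platform actually deploys; none of these names come from the repository.

```python
# Hypothetical registry: scenario name -> callable that analyses one frame.
def detect_fire_smoke(frame): return {"model": "fire_smoke", "alerts": []}
def monitor_wildlife(frame): return {"model": "wildlife", "alerts": []}
def analyse_deforestation(frame): return {"model": "deforestation", "alerts": []}

MODELS = {
    "forest_fire": detect_fire_smoke,
    "wildlife": monitor_wildlife,
    "deforestation": analyse_deforestation,
}

def run_selected(scenario: str, frame) -> dict:
    """Dispatch a frame to the model the user selected for this scenario."""
    try:
        model = MODELS[scenario]
    except KeyError:
        raise ValueError(f"no model registered for scenario {scenario!r}")
    return model(frame)

print(run_selected("wildlife", frame=None)["model"])  # wildlife
```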
<p>
• Model Inference at the Edge: The selected AI models are executed at the edge computing nodes, where the necessary computational resources are available. This eliminates the need to transmit large volumes of video data to a centralized server or cloud for inference. By performing inference at the edge, the platform achieves real-time analysis and enables a quick response to detected events or anomalies.</p>
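The bandwidth argument behind edge inference can be made concrete with back-of-the-envelope numbers: sending only small detection events uplink instead of raw video. All figures below are assumed for illustration, not measurements from the project.

```python
# Illustrative comparison: streaming raw video to a central cloud vs sending
# only the detection events produced by inference at the edge.
fps = 25
frame_bytes = 1280 * 720 * 3 // 20     # ~138 KB per compressed frame (assumed)
events_per_min = 2                     # alerts per minute (assumed)
event_bytes = 512                      # one small JSON alert (assumed)

video_bytes_per_min = fps * 60 * frame_bytes
event_bytes_per_min = events_per_min * event_bytes
print(video_bytes_per_min // event_bytes_per_min)  # uplink reduction factor
```

Even with generous compression, the uplink saving is several orders of magnitude, which is what makes constrained 5G cells and battery-powered drones viable sources.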
<p>
• Resource Sharing: Edge resources are shared, allowing the simultaneous execution of multiple AI models on different video streams. This flexibility enables efficient resource allocation and scalability to handle varying workloads.</p>
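One minimal way to share an edge node's compute across several (model, stream) pairs is a common worker pool, sketched here with Python's standard library; the job list and model names are hypothetical.

```python
from concurrent.futures import ThreadPoolExecutor

def run_model(model: str, stream: str) -> str:
    # Placeholder for real inference on one frame of `stream`.
    return f"{model}@{stream}"

# One shared pool on the edge node executes several (model, stream) pairs
# concurrently instead of dedicating hardware to each model.
jobs = [("fire_smoke", "tower-cam"), ("wildlife", "drone-1"),
        ("deforestation", "drone-2")]
with ThreadPoolExecutor(max_workers=2) as pool:
    results = list(pool.map(lambda job: run_model(*job), jobs))
print(results)
```

Capping <code>max_workers</code> below the number of streams is what turns the pool into a sharing mechanism: jobs queue up rather than each claiming dedicated resources.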
<p>
• Scenario-Specific Applications: The platform is adaptable and customizable to address a wide range of use cases.</p>
<p>
• Cloud Management Applications: The platform delivers cloud components for high-level supervision and management, as well as for configuring the use cases deployed on the edge nodes.</p>
<p>
• Sustainable Applications: Every AI application is measured in terms of energy consumption, enabling smart decisions about how and where to run it so that the overall deployment stays sustainable.</p>
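Energy-aware scheduling of this kind could, for instance, pick the most accurate model variant that fits a per-inference energy budget. The variant names, energy costs, and accuracies below are invented for the sketch.

```python
# Hypothetical per-inference energy estimates for model variants:
# scenario -> [(variant name, joules per inference, accuracy)].
VARIANTS = {
    "fire_smoke": [("large", 9.0, 0.95), ("small", 2.5, 0.88)],
}

def pick_variant(scenario: str, budget_j: float):
    """Pick the most accurate variant whose energy cost fits the budget."""
    fitting = [v for v in VARIANTS[scenario] if v[1] <= budget_j]
    if not fitting:
        return None  # nothing runnable within this energy budget
    return max(fitting, key=lambda v: v[2])

print(pick_variant("fire_smoke", budget_j=3.0)[0])  # small
```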
<p>
• Offloading AI inference to the edge reduces on-board power consumption on the drones, improving the autonomy of the UAVs.
</p>
 
= Software resources =
• '''Project repository'''

https://github.com/flamela/green-cyclops-etsi-hack23


Web app for managing the Green Cyclops edge app and model ecosystem, as well as dashboard representations.


It needs to be complemented with a backend API that is not available in this repo; the repo is uploaded only as inspiration and for reuse by other teams.
[[File:Github-optare.png|600px|center|class=img-responsive]]
= Project Videos =
{{#evu:https://www.youtube.com/watch?v=zmEYpDqo0FU
|alignment=inline
|dimensions="120"
}}

Latest revision as of 09:04, 28 March 2024