Layered Realities - Smart City Safety
End-to-end security solution in cities / Safety and security of citizens / City management / Mobile edge computing / Network function virtualisation / Law enforcement / Proactive and quick response to emergencies
Sign up to find out more and be the first to receive all the programme details and secure your spot to take part in this weekend of FREE showcases and demos.
Over the weekend of 17th - 18th March 2018, the University of Bristol's Smart Internet Lab welcomes you to the world's first 5G public showcase in Millennium Square, Bristol. Demonstrations, talks and artistic experimentation combine in an exciting blend of technology and expression. This event is running in conjunction with Watershed & We The Curious.
Photo credit: Kaleider's 'From The Light Of Fire, Our Dancing Shadows'
All our demonstrations will take place in a large marquee on Millennium Square.
Given the critical importance of security in cities, innovative advances in wireless communications systems are increasingly improving the safety of city inhabitants. New services, such as audio and video monitoring of public areas and automated detection of municipal rule infractions, enable a quicker response to threats and anomalies and help prevent their recurrence. The University of Bristol has deployed a smart city safety case study, as a proof of concept, to identify suspicious activities in the city. The basic components of this case study are listed below; they are connected to the Internet through a WiFi interface.
- Bike rider helmet
- Raspberry PI
- 360 degree camera and audio
Figure 1 shows a high-level architecture of the smart city safety use case. The bike rider wears a helmet to which the Raspberry PI is attached with a 360-degree camera. The bike rider's journey is captured as video and audio and sent via WiFi to the Mobile Edge Computing (MEC) node or the cloud to be processed. Once the audio and video have been processed and any suspicious activity has been detected, a notification is generated and sent to the relevant security agents.
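The capture-process-notify flow described above can be sketched in a few lines. This is an illustrative outline only: the message fields, the placeholder analysis function, and the overall shape of the pipeline are assumptions for the sketch, not the deployed implementation.

```python
import json
from dataclasses import dataclass

@dataclass
class Frame:
    """One uplinked chunk of the rider's 360-degree video and audio."""
    timestamp: float
    data: bytes

def detect_suspicious(frame: Frame) -> bool:
    """Stand-in for the MEC-side analysis (e.g. face detection).
    The real decision logic is not specified in the source."""
    return len(frame.data) > 0  # placeholder decision

def build_notification(frame: Frame) -> str:
    """Build the JSON alert that would be sent to security agents.
    Field names here are assumptions."""
    return json.dumps({"event": "suspicious_activity",
                       "timestamp": frame.timestamp})

def process(frames):
    """Run each uplinked frame through analysis; collect alerts."""
    return [build_notification(f) for f in frames if detect_suspicious(f)]
```

In the deployed system the `process` step runs inside the MEC or cloud, and the resulting notifications are pushed to the agents' devices.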
Many of today's municipalities are becoming test beds for smart city experimentation, where technological capabilities address daily needs ranging from parking to water treatment and city security. The University of Bristol is working to provide, through the 5GinFIRE platform, a smart city safety use case, which has been deployed according to the architecture shown in Figure 2.
Figure 2 below shows the main building blocks that make the smart city safety case study a reality. Note that only open-source frameworks (OpenStack, OpenDaylight, etc.) are being used to deploy the case study.
For the video processing, two key technologies have been used:
1- Network Function Virtualisation: a virtual network function (VNF) video transcoder was designed, specified and deployed at Bristol's network function virtualisation infrastructure (NFVI) via the OSM MANO orchestrator.
2- Machine Learning: a face detection program has been trained and deployed at the NFVI.
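The source does not name the face detection model or framework, so the sketch below shows only the surrounding logic one would need either way: filtering raw detector output (bounding boxes with confidence scores) before an alert is raised. The box format, threshold values and field order are assumptions.

```python
def filter_detections(boxes, min_conf=0.8, min_size=20):
    """Keep face detections that are confident enough and large
    enough (in pixels) to be worth raising an alert for.

    boxes: iterable of (x, y, width, height, confidence) tuples,
    an assumed output format for whichever detector is deployed.
    """
    kept = []
    for (x, y, w, h, conf) in boxes:
        if conf >= min_conf and w >= min_size and h >= min_size:
            kept.append((x, y, w, h, conf))
    return kept
```

A post-filter like this keeps low-confidence or tiny detections from flooding security agents with false alarms; the thresholds would be tuned against the trained model's actual behaviour.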
Both the VNF video transcoder and the face detection program can be moved between the network edge and the remote cloud to improve response time. In this case, the solution brings the computation closer to the users.
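One simple way to decide where such a function should run is to compare measured round-trip times against the service's latency budget. The policy below is a minimal sketch under assumed numbers, not the orchestrator's actual placement algorithm: it prefers the remote cloud (typically more capacity) when the cloud can still meet the deadline, and otherwise falls back to the lowest-latency edge site.

```python
def place_function(rtts_ms, deadline_ms):
    """Choose where to run a VNF given measured round-trip times.

    rtts_ms: dict mapping site name -> measured RTT in milliseconds,
    e.g. {"cloud": 45.0, "edge": 8.0} (illustrative values).
    Prefer the cloud when it meets the deadline; otherwise pick
    the lowest-latency site (i.e. move computation to the edge).
    """
    if rtts_ms["cloud"] <= deadline_ms:
        return "cloud"
    return min(rtts_ms, key=rtts_ms.get)
```

Under this policy a 20 ms response-time target with a 45 ms cloud RTT pushes the function to the edge, which is exactly the "computation close to the users" case described above.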
Note that the video transcoder functionality is necessary to provide the right video format for the different security agents' mobile devices, and also to facilitate the face detection procedure.
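Per-device transcoding of this kind usually amounts to selecting an output profile per device class. The device classes, resolutions and default below are assumptions for illustration; the actual profiles supported by the deployed VNF are not listed in the source.

```python
# Hypothetical output profiles per device class (assumed values).
PROFILES = {
    "phone":   {"codec": "libx264", "resolution": "640x360"},
    "tablet":  {"codec": "libx264", "resolution": "1280x720"},
    "desktop": {"codec": "libx264", "resolution": "1920x1080"},
}

def transcode_args(device, src="input.mp4"):
    """Build an ffmpeg command line for the target device class.
    Unknown devices fall back to the lowest (phone) profile."""
    p = PROFILES.get(device, PROFILES["phone"])
    return ["ffmpeg", "-i", src,
            "-c:v", p["codec"],
            "-s", p["resolution"],
            "out_{}.mp4".format(device)]
```

The transcoder VNF would run a command like this (or the equivalent library call) on each stream before delivery, which also yields a consistent frame size for the face detection stage.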
This is ongoing work: as face recognition improves, an audio transcoder and audio recognition are being designed, and the implementation of a smart, safe city service gradually comes into view.