Infrastructure

The Aveiro Tech City Living Lab (ATCLL) is divided into the following components that support the scenario: (i) sensing, comprising devices that collect various forms of data and services performing data fusion; (ii) connected vehicles, including vehicles with autonomous driving capabilities as well as regular connected public buses; (iii) access and edge, supported by a 5G private network, edge computing, Software-Defined Networks (SDNs), and a vehicular network (V2X) infrastructure; (iv) backhaul and core, a mesh network that connects the access network to the core network devices and servers through fibre, millimetre-wave, and satellite communications; and (v) a backend and data platform, supported by a platform core and data processing services built on current standards such as NGSI-LD. The platform is also federated within Fed4Fire+ (https://www.fed4fire.eu/testbeds/).
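As a concrete illustration of the NGSI-LD standard mentioned above, the sketch below creates a single sensor-observation entity and pushes it to a context broker over the standard /ngsi-ld/v1/entities endpoint. The broker URL, entity identifier, and attribute set are illustrative assumptions, not the actual ATCLL data model.

```python
import requests

# Hypothetical NGSI-LD context broker endpoint; the actual broker URL and
# entity model used in the ATCLL data platform are not specified here.
BROKER = "http://broker.example.org:1026/ngsi-ld/v1/entities"

# Minimal NGSI-LD entity for an air-quality observation from a lamp-post sensor.
entity = {
    "id": "urn:ngsi-ld:AirQualityObserved:aveiro-lamppost-042",
    "type": "AirQualityObserved",
    "temperature": {"type": "Property", "value": 18.4, "unitCode": "CEL"},
    "relativeHumidity": {"type": "Property", "value": 0.71},
    "location": {
        "type": "GeoProperty",
        "value": {"type": "Point", "coordinates": [-8.6538, 40.6405]},
    },
    "@context": "https://uri.etsi.org/ngsi-ld/v1/ngsi-ld-core-context.jsonld",
}

# The @context is embedded in the payload, hence the application/ld+json media type.
resp = requests.post(
    BROKER,
    json=entity,
    headers={"Content-Type": "application/ld+json"},
    timeout=10,
)
resp.raise_for_status()
```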

Sensing

The ATCLL contains mobile sensing devices and other geolocation sensors installed on vehicles (buses and garbage collection vehicles), lamp posts, and bicycles. The available mobile sensing information comprises GPS location, speed and heading, temperature, humidity, and air pressure, which enables a complete mobility map of the city. The Wi-Fi access points, both the static ones and the mobile onboard units in the vehicles, also gather data about city occupancy through people's smartphones. In terms of static sensors, the edge nodes of the infrastructure have traffic radars, LiDARs, video cameras, and computing units. Information coming from these devices is aggregated using data fusion techniques and correlated, both among the devices and with the information carried in ITS messages, to give insights into people's flow, providing concrete elements for new public transportation solutions, for safety-critical and autonomous driving systems, and for identifying problems and optimising mobility in the city.

A set of Unmanned Aerial Vehicles (UAVs) equipped with a 5G base station, video cameras, and environment sensors is also part of the extended infrastructure, acting as mobile sensing units that gather data from the city and support patrolling and traffic management. Communication equipment is also installed in the vehicles to transmit mobility and environment data through the static access points, building a complete live map of these parameters in the city and providing the required data for environment sensing, traffic monitoring, and safe driving systems. The mobile equipment is composed of a Data-Collection Unit (DCU), which integrates Wi-Fi and LoRa communication. The vehicles also include an OnBoard Unit (OBU) with Wi-Fi, ITS-G5, and 5G communication to establish the connection with the RoadSide Units (RSUs), as well as with other vehicles.
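To make the aggregation step more concrete, the sketch below shows one simple way readings from different sensors (radar, Wi-Fi probe counts, camera detections) could be bucketed into common time windows per edge node and merged into a single flow estimate. The field names and the fusion rule are assumptions made for the example, not the ATCLL pipeline.

```python
from dataclasses import dataclass
from collections import defaultdict

@dataclass
class Reading:
    node_id: str      # edge node (lamp post / RSU) that produced the reading
    source: str       # "radar", "wifi", or "camera"
    timestamp: float  # seconds since epoch
    count: int        # detected vehicles or devices in this sample

WINDOW = 60.0  # one-minute aggregation windows

def fuse(readings: list[Reading]) -> dict[tuple[str, int], dict[str, float]]:
    """Group readings by (node, time window) and average each source's counts."""
    buckets: dict[tuple[str, int], dict[str, list[int]]] = defaultdict(lambda: defaultdict(list))
    for r in readings:
        window = int(r.timestamp // WINDOW)
        buckets[(r.node_id, window)][r.source].append(r.count)
    return {
        key: {src: sum(vals) / len(vals) for src, vals in per_source.items()}
        for key, per_source in buckets.items()
    }

if __name__ == "__main__":
    sample = [
        Reading("lamppost-07", "radar", 960.0, 12),
        Reading("lamppost-07", "radar", 990.0, 14),
        Reading("lamppost-07", "wifi", 1000.0, 40),
    ]
    print(fuse(sample))  # {('lamppost-07', 16): {'radar': 13.0, 'wifi': 40.0}}
```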

Connected Vehicles

Beyond the regular connected buses, the infrastructure includes autonomous driving development capabilities through the PIXKIT (https://www.pixmoving.com/pixkit). It is a vehicle with a chassis length similar to that of a light vehicle, and it offers high flexibility for experimenting with sensors, computing, and communications. It is also an open platform, aimed at research and teaching institutions, with direct support from Autoware, one of the most recognised Autonomous Driving (AD) platforms, which controls the vehicle through a rich, open Controller Area Network (CAN) interface. PIXKIT includes a drive-by-wire hardware platform with a set of sensors to achieve basic autonomous driving functions, such as path planning, path following, vector map building, obstacle avoidance, and traffic light recognition. The sensor package includes LiDARs, cameras, and GNSS. The vehicle integrates with the road infrastructure and core network, creating a computing and communications vehicle-edge-cloud continuum that allows end-to-end distribution and orchestration of microservices and supports deterministic, low-latency use cases.
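As an illustration of drive-by-wire control over a CAN interface, the sketch below sends a target-speed command the way an AD stack such as Autoware ultimately would. The arbitration ID, payload layout, and SocketCAN channel name are hypothetical; the actual PIXKIT CAN protocol is defined by its drive-by-wire platform and is not reproduced here.

```python
import can  # python-can

def send_speed_command(bus: can.BusABC, speed_kmh: float) -> None:
    """Encode a target speed (0.1 km/h resolution, little-endian) and send it."""
    raw = max(0, min(0xFFFF, int(speed_kmh * 10)))
    payload = raw.to_bytes(2, "little") + bytes(6)  # pad to an 8-byte frame
    # 0x123 is a placeholder arbitration ID, not the real drive-by-wire ID.
    msg = can.Message(arbitration_id=0x123, data=payload, is_extended_id=False)
    bus.send(msg)

if __name__ == "__main__":
    # SocketCAN interface name is an assumption (e.g. "can0" on Linux).
    with can.Bus(interface="socketcan", channel="can0") as bus:
        send_speed_command(bus, 15.0)  # request ~15 km/h
```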

Access and Edge

The access network consists of several technologies that support different use-case scenarios/verticals in the city. As presented, the smart lampposts/wall boxes installed throughout the city combine different equipment for several purposes: sensing devices, communication access points, and Multi-access Edge Computing (MEC). In terms of communication technologies, several devices provide wireless access across different technologies and domains. The 5G RAN combines a commercial solution and a research solution built on Software Defined Radio (SDR), compatible with the O-RAN network architecture and supported by a RAN Intelligent Controller (RIC), using FPGA-based units or USRPs. This access infrastructure provides a complete 5G private network, composed of UEs, CPEs, and gNodeBs. Additionally, for Wi-Fi and V2X communications, the Road Side Units (RSUs) combine two wireless modules, Wi-Fi and ITS-G5, and are ready to integrate Cellular Vehicle-to-Everything (C-V2X). Some of the edge points integrate LoRa and/or LoRaWAN gateways. Finally, for MEC, every point provides a set of NVIDIA Jetson Orin or Jetson Xavier units with powerful GPUs for graphics-intensive processing.
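For a flavour of the kind of workload these MEC nodes can host close to the cameras, the sketch below counts pedestrians in video frames with a classical OpenCV detector. The camera source and the HOG detector are assumptions made for the example; an actual deployment on the Jetson units would more likely run a GPU-accelerated model.

```python
import cv2

# Classical HOG-based people detector shipped with OpenCV.
hog = cv2.HOGDescriptor()
hog.setSVMDetector(cv2.HOGDescriptor_getDefaultPeopleDetector())

cap = cv2.VideoCapture(0)  # local camera; an RSU camera stream (e.g. RTSP) could be used instead
try:
    while True:
        ok, frame = cap.read()
        if not ok:
            break
        frame = cv2.resize(frame, (640, 360))  # keep per-frame cost low at the edge
        boxes, _ = hog.detectMultiScale(frame, winStride=(8, 8))
        print(f"pedestrians detected: {len(boxes)}")
finally:
    cap.release()
```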

Backhaul and Core

The backhaul infrastructure is based on Single-Mode optical Fibre (SMF) link technology (G.652), spanning a length of 16 km. It interconnects the 44 edge nodes covering the urban area of Aveiro, deployed on smart lamp posts or wall boxes on building facades. The physical infrastructure of the access network comprises one-to-one fibre links aggregated by a switch supporting 10 GbE connections and SDN functionality through Open vSwitch (OVS) or P4. The uplink to the core is established with 40 GbE ports. The backhaul also relies on millimetre-wave and satellite communications: the mesh network includes 5 millimetre-wave antennas linking smart lamp posts directly for SDN experiments (in the University of Aveiro Campus, and 3 nodes in the city centre), while the Non-Terrestrial Network (NTN) domain comprises 2 Starlink business antennas on the rooftop of IT, giving the network three Internet connections.

SDN support enables network control through software, utilising open-source controllers such as Ryu and ONOS, along with the control-plane protocols OpenFlow and P4Runtime. These SDN controllers are deployed in the core of the network and dynamically react to changes in demand and network conditions. Moreover, network algorithms and Machine Learning (ML) techniques are incorporated to monitor and act within the network; for instance, in the vehicular network (V2X), algorithms within the SDN controller manage Internet connectivity and handovers of the mobile nodes. This platform also supports the automated deployment of applications using containerisation technologies, microservices, and Kubernetes for end-to-end orchestration. Additionally, services in different nodes can be personalised through dynamic allocation and orchestration based on Quality of Service (QoS) and dynamic slicing management. Time-Sensitive Networking (TSN) functions are also supported on the backhaul: to assist critical services with bounded-latency or bandwidth requirements, the network is synchronised with nanosecond precision, and the network switches can be configured with different types of shapers (e.g., TAS, CBS).

At the core, the infrastructure is complemented by a network and data processing centre. The datacenter consists of multiple servers managed by Proxmox VE, hosting all virtual machines with services, the data platform, applications, the 5G core (Open5GS), and network functions running in the cloud, such as Kubernetes, OSM, and ChirpStack, among others. An important feature of the infrastructure is the ability to provide third parties with access to sensors, network devices, or services, facilitated by OpenVPN or WireGuard.
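To make the software-based network control more tangible, the sketch below installs a flow rule on a backhaul switch through an SDN controller's northbound REST interface, using Ryu's ofctl_rest application (POST /stats/flowentry/add). The controller address, datapath ID, ports, and destination address are assumptions made for the example, not the ATCLL configuration.

```python
import requests

# Hypothetical Ryu controller running the ofctl_rest application.
CONTROLLER = "http://sdn-controller.example.org:8080"

flow = {
    "dpid": 1,                 # datapath ID of the target OVS switch (placeholder)
    "priority": 100,
    "match": {
        "in_port": 1,
        "eth_type": 0x0800,    # match IPv4 so the ipv4_dst field is valid
        "ipv4_dst": "10.0.0.42",
    },
    "actions": [
        {"type": "OUTPUT", "port": 2},   # steer the matched traffic towards a given edge node
    ],
}

resp = requests.post(f"{CONTROLLER}/stats/flowentry/add", json=flow, timeout=5)
resp.raise_for_status()
```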