
Feature Articles

The hidden privacy-breaching possibilities of smart cities

Inside Telecom Staff


Smart cities already exist in different parts of the world. They use a confluence of sensors, data, and algorithms to improve the lives of city residents.

The technology used by a smart city presents numerous benefits, from the transport sector to law enforcement. Sensors at bus stops can relay information to the smart city’s system to determine how many buses to send. Smart parking can help drivers find parking spaces faster.

Connected police stations and body cameras can also help the police make the city safer. Air quality sensors can pinpoint areas of pollution, identify their causes, and inform remedies. Above all, these capabilities can boost a city’s gross domestic product by attracting new businesses.

Data collection and its privacy concerns

The issue with smart cities is that they need data to run efficiently. That data can be collected by monitoring human interactions, through facial recognition, from residents’ smartphones, and so on. Thus, the line between legitimate data collection and privacy becomes blurred, and the encroachment on human rights begins.

Massive data collections create further security risks. Hackers can attack the systems storing the information of smart city inhabitants, and it is highly likely that smart cities will keep all of the collected information in one place.

Because smart city systems will be online at all times, they are left vulnerable to cyber threats. Centralized data storage also creates a single point of failure that cybercriminals could exploit. We’ve seen the effects of these threats in countries like Sweden, where Distributed Denial of Service (DDoS) attacks on smart systems halted train transportation.

There is also a real possibility of user data harvested from smart cities being sold on the dark web. In May 2020, attacks perpetrated by a cybercriminal against eleven firms resulted in user data being sold on the dark web for over $23,000. Imagine the personal data of smart city residents going up for sale: targeted attacks, burglaries, and stalking would all rise.

Solving these privacy problems

Before global smart city initiatives are established, the rules on data collection need to be set out. An independent panel selected by the people should critically analyze the guidelines and confer with the city’s inhabitants.

Preventive measures, such as using VPNs to mask internet traffic, should also be implemented. A VPN secures your online activity by creating an encrypted tunnel between your device and the server you connect to. Wearing a face mask to defeat facial recognition can also be an effective way to stay private.



IoT: The hidden costs of the always connected age

Inside Telecom Staff


By: Hannes Gredler, founder and CTO of RtBrick.

Homes are overflowing with connected devices. Everything from “smart” televisions, speakers, and light bulbs to smart microwaves has surged in popularity in the past several years – with over 10 billion IoT devices expected to be in use worldwide this year. Many of these devices are always on and always connected, ready to respond at the utterance of a simple voice command. This means that even when they aren’t “on,” the devices are still consuming data. This explosion in data consumption comes at a cost, however, and so far, telecommunications companies have borne the brunt of it.

Data isn’t free. Increased network capacity takes power, and power costs money. Multiplied across the millions of customers in a company’s network, the effects become significant. Most broadband plans offered by telecoms operators allow unlimited data access for a set monthly price, with packages priced by bandwidth rather than total data usage. This is great for consumers who continue to stream more and add new connected devices to their homes, but still pay the same price for internet. For the companies providing the network, every additional bit of data consumed comes at a cost.

Most consumers are not prepared to pay more for this increased data that fuels their IoT devices and allows Netflix to be binged for hours on end. This leaves telecom companies in a predicament. How can they keep customers happy without raising prices, while not becoming overwhelmed by the large increases in data being consumed thanks to the influx of connected devices?

The complications of growing data needs

Network service providers can typically expect a 35 to 40 percent growth in their network traffic each year, a number that will only continue to climb as connected devices become more prevalent. This traffic is mainly delivered using CDNs (Content Delivery Networks), which hold copies of popular content in servers embedded throughout the network. Although this mitigates investment in the core of the network, it doesn’t help reduce costs in the access networks, which are consuming an increasing amount of data and power.
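To see how quickly that compounds, here is a back-of-the-envelope sketch, assuming traffic simply compounds year over year at the quoted rate (the function name and five-year horizon are our own illustration):

```python
def traffic_multiplier(annual_growth: float, years: int) -> float:
    """How many times larger traffic becomes after `years` of compound growth."""
    return (1 + annual_growth) ** years

# At 35-40 percent annual growth, network traffic multiplies roughly
# 4.5x to 5.4x in just five years.
low = traffic_multiplier(0.35, 5)   # ~4.48
high = traffic_multiplier(0.40, 5)  # ~5.38
print(f"Traffic after 5 years: {low:.2f}x to {high:.2f}x")
```

In other words, a network sized for today’s peak load would need to carry four to five times as much traffic within five years.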

When it comes to data consumption, not all connected devices are created equal, either. Some only consume data in short bursts when being used – like a laptop used for web browsing. Smart TVs, on the other hand, are some of the biggest offenders: an HD stream uses 5 Mbps and a 4K stream 10 Mbps, constantly, whenever the television is in use, so the total data sent mounts quickly as consumers stream episode after episode.
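Converting those bitrates into transferred data shows why the totals mount so quickly; a minimal sketch, assuming a constant bitrate for the whole viewing session (the helper function is our own, not from the article):

```python
def gb_per_hour(mbps: float) -> float:
    """Convert a constant stream bitrate (megabits per second) to gigabytes per hour."""
    return mbps * 3600 / 8 / 1000  # 3600 s/hour, bits -> bytes, MB -> GB

hd = gb_per_hour(5)    # HD stream: 2.25 GB per hour
uhd = gb_per_hour(10)  # 4K stream: 4.5 GB per hour
print(f"Three-episode binge: {3 * hd:.2f} GB in HD, {3 * uhd:.1f} GB in 4K")
```

A single evening of 4K streaming can therefore move more data than a laptop does in days of ordinary web browsing.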

Other devices that come in high on the data usage list include video chat services like Skype and Zoom, gaming, a CCTV security camera backing up to the cloud, and smart home devices that are always on and connected, even when no one is home. This last group is often overlooked when it comes to data consumption. While it’s easy to understand how streaming HD TV for hours can consume a lot of data, many people don’t realize that their smart speakers, lightbulbs, doorbells and other home IoT devices are constantly pinging data to their respective servers. This is why they’re always ready to use at a moment’s notice, but also what makes the addition of each new device to a home so costly from a network perspective.

The social distancing measures being used to mitigate the current Covid-19 pandemic are an example of what happens when this growth in data consumption occurs quickly. People are staying home and working remotely in high numbers – and using their home networks much more than usual. In Europe, networks haven’t been able to handle this influx, forcing content providers to allow video to be streamed only in standard definition. While this heightened network load is caused by a serious situation that requires sacrifices, it also provides a glimpse of what can happen as the number of devices in use surges. Remote working was already a benefit gaining popularity before the current crisis, and it’s proving that networks will need to increase their efficiency and ability to scale to cope.

Overcoming the growth barrier

With data needs only set to increase as connected devices rise in popularity, telecom companies need to be able to scale their networks to accommodate the growing data load while cutting costs. The power behind networks will always come with a price, but the systems distributing data across the network can be made more effective. Restructuring, reorganization and efficiency measures can only go so far in solving the issue, however.

Telecom companies have been following the same network-building playbook for years, but it no longer works in the age of connected devices. To adapt networks so they can handle modern technologies, telecom companies should follow the path of their more modern, cloud-native counterparts. These companies have excelled by building their infrastructure on off-the-shelf hardware, running independent software, and deploying it with zero-touch-provisioning systems. Just think how easily Facebook can roll out a new feature or scale its platform to handle influxes of use.

There’s no reason telecom companies can’t emulate this and build their networks in a similar way. Formally, this is known as network disaggregation. Simply put, it means that networking software and hardware are purchased separately. In the past, networking infrastructure and software have been sold only as a package, but as networks modernize and model themselves after internet natives, that’s changing. Aside from the ability to scale and update with ease, the main benefit of the disaggregated model is greatly reduced cost. This will be vital as the number of connected devices surges.

Network service providers also need to plan better to ensure they aren’t caught in a situation where they can’t keep up with rising data consumption. Passing costs onto consumers will not be an option as competition and innovation in the space continue to progress. Knowing that data usage will only continue to surge, telecom companies need to plan for the future and adopt a network structure that allows them to upgrade for more data and increased bandwidth where devices require it. This means looking beyond the typical network upgrade cycle that happens every few years and shifting to a model that allows scaling and software upgrades to happen more quickly and cost-effectively.

Connected devices don’t have to mean the downfall of traditional telecoms companies and their networks, but their near exponential growth will require a new mindset when it comes to building access networks.


Growing use of cloud storage, but a hybrid approach remains

Yehia El Amine


Technological trends and developments are changing the way data is stored, along with the way it needs to be protected. That is why data security has been placed at the heart of organizations’ concerns as they edge closer toward complete digitization.

As data becomes not only more valuable but also more essential to how organizations operate, it becomes an ever more attractive target for cybercriminals far and wide.

In parallel, high-profile data breaches are making the public more distrustful of organizations that store personal data – and prompting governmental entities to create stringent compliance regulations.

According to numbers from U.S.-based market intelligence firm IDC, the total volume of digital data people and businesses have created worldwide reached approximately 50 zettabytes in 2020 and is projected to triple by 2025.
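Tripling over five years implies a steep annual rate; a quick calculation (our own arithmetic, not IDC’s) makes the pace concrete:

```python
def implied_cagr(multiple: float, years: int) -> float:
    """Annual growth rate that turns 1x into `multiple`x over `years` years."""
    return multiple ** (1 / years) - 1

# Tripling 50 zettabytes between 2020 and 2025 (to roughly 150 ZB)
# implies about 24.6 percent growth every single year.
rate = implied_cagr(3, 5)
print(f"Implied annual growth: {rate:.1%}")
```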

Whether companies have opted for complete cloud storage or on-premises storage, the risk remains the same.

“Attacks on cloud services have more than doubled, while ransomware demands are now the leading form of incidents during data breaches,” a report by U.S.-based cybersecurity firm, Trustwave stated.

The report added that the global annual cost of cybercrime, when factoring in regulatory fines, is projected to run into the trillions in the near future.

Trustwave also recently published a survey examining how enterprises are protecting and storing their data.

Small in numbers, big in responsibilities 

According to the survey, over 75 percent of respondents were part of organizations with over 500 employees and 40 percent were part of organizations with over 1,000 employees.

Yet, of those polled, the largest share (24 percent) had a security team of between 16 and 20 people; just 20 percent had a team of 21 or more. Singapore had the smallest security teams on average, with 47 percent reporting between one and 10 members.

Thus, with the average cost of data breaches rising, the stakes for organizations couldn’t be higher.

“Cybercrime is becoming so prevalent, organizations are now putting as much emphasis on detection and response as they are prevention— meaning that small cybersecurity teams are compelled to use managed security services providers and automation tools and services to supplement efforts,” the survey highlighted.

Increased use of cloud storage

When asked where their data currently resides, most respondents (55 percent) said they use both public cloud and on-premises databases to store data, while only 17 percent use the public cloud alone.

Singapore organizations use the hybrid model most frequently, at 73 percent – 18 points above the overall figure – while US organizations use it least, at 45 percent, and are the most likely, at 35 percent, to store data on-premises only.

As many organizations across industries move their sensitive data into some form of cloud storage – such as multi-cloud environments and cloud-based services, including cloud-based applications – the focus on security intensifies.

“There’s a common misconception that cloud service providers (CSPs) are responsible for security of the platform or share liability for breaches. That’s almost always not true – meaning your team is still responsible for finding security solutions that can work seamlessly no matter where your data resides, whether it’s on-premises, in the cloud(s) or split across a combination of both,” the survey emphasized.

More than one service

When asked how many third-party cloud services their organizations use, most respondents (70 percent) use between two and four public cloud services, followed by 18 percent using zero or one, and 12 percent using five or more.

At 14 percent, the United States had the most instances of using five or more public cloud services, followed by the United Kingdom at 13 percent, while Australia and Singapore were tied at 9 percent.

“What’s interesting here is the added level of complexity that security teams now face, with cybersecurity teams tasked with managing multiple vendors – as well as needing to account for the previously mentioned additional security risks that cloud services create,” the survey reported.

The prevalent approach among enterprises, however, has been a hybrid model of both cloud and on-premises storage, avoiding a centralized database. This acts as another line of defense against hackers by keeping one piece of the puzzle away from their digital fingertips.


O2 expands 5G rollout in 100 UK towns & cities

Yehia El Amine


UK operator O2 announced a fifth-generation network expansion plan that will cover over 100 UK towns and cities, a year after it initially kicked off its 5G rollout.

Partnering with Nordic vendors Ericsson and Nokia for its 5G infrastructure, O2, which is owned by Spanish telco Telefonica, will offer existing customers a tariff refresh so they can move onto 5G at their convenience.

“When we launched 5G last October, we said it was the first step on a journey. One year on and we have made some incredible progress, not just in terms of our roll-out but in bringing about new capabilities that will make real changes to people’s everyday lives,” Derek McManus, Chief Operating Officer at O2, said in a statement.

With continuous investment in new technologies, O2 has built approximately 10,000 LTE-M sites that are now live across the country.

“We firmly believe 5G has a role in helping to rebuild Britain, unlocking huge possibilities for our economy and society. We’re excited to keep pushing ahead with our rollout along with our partners Ericsson and Nokia, to keep supporting our customers, businesses and society,” McManus added.

The new network will cover 57 percent of premises and 58 percent of the population while allowing billions of devices to be connected to the internet to collect and share data.

5G and UK businesses

Over the last year, O2 has increased its focus on working alongside local businesses and consortia to test and build 5G use cases that enhance the way we work and live and help rebuild Britain.

Among those initiatives, the UK operator has partnered up with Northumbrian Water Group to enable experienced technicians to remotely guide on-the-ground teams through complex tasks by relaying real-time data and instructions using augmented reality (AR) technology, creating ‘Remote Experts’.

“No one could have predicted the way this year has turned out, and that almost overnight, customers would turn to their networks more than ever before to keep them connected to loved ones, colleagues and suppliers. Connectivity has never been more important, and we want our network to continue to raise the bar,” O2’s COO highlighted.

In parallel, O2 launched the first commercial laboratory for 5G and satellite communications in the UK as part of Project Darwin, a four-year trial program supported by the European Space Agency and based in the Harwell Science and Innovation Campus in Oxfordshire.

The laboratory is now open to companies looking to test proof of concept for connected and autonomous vehicles (CAVs) using both 5G and satellite communications.

The operator has also teamed up with the University of Glasgow to work with the NHS on a pilot of a fully connected, 5G-powered clinic-on-wheels, tracking COVID-19 at six care homes in Glasgow.

In addition, O2 is working with Deloitte, Wayra and Digital Catapult to build 5G accelerators in Birmingham, Wolverhampton, and Coventry.

These facilities will comprise centrally located office and demonstration spaces with access to a private 5G network, allowing businesses and public sector organizations to experiment with 5G features that aren’t yet commercially released.

The competition

Previously, tensions spiked between O2 and rival Three UK after the former criticized Ofcom’s recent proposal to defragment the 5G-friendly 3.4-3.8GHz radio spectrum bands after next year’s auction.

The Office of Communications is the government-approved regulatory and competition authority for the broadcasting, telecommunications, and postal industries of the United Kingdom.

The operator complained that the decision favors Three UK (H3G) by allowing them to establish a “kingmaker” position from where they can obstruct rivals.                                     

This was partially due to Ofcom’s attempt to rebalance the market and make the differences in spectrum ownership between network operators less dramatic.

The amount and type of spectrum that an operator has is crucial for all sorts of reasons, not least because it can have an impact upon network coverage, rollout costs and particularly data performance (mobile broadband speed).

However, Three UK is in a different position when it comes to 5G because it already holds 140MHz of relevant spectrum, including a single 100MHz contiguous block (rivals tend to have 50MHz or 40MHz blocks).
