Ethical scandals have become the norm in the tech industry; every week, a new story surfaces about how the titans of social media have breached the trust of their users.
However, we need to ask ourselves: what does “ethical tech” actually mean?
“Technologies have a clear moral dimension—that is to say, a fundamental aspect that relates to values, ethics, and norms. Technologies reflect the interests, behaviors, and desires of their creators, and shape how the people using them can realize their potential, identities, relationships, and goals,” The World Economic Forum said in a report about clean tech.
Nobody has expressed the meaning of ethical tech as accurately as DigitalAgenda, a UK-based clean tech think tank, which believes that “ethical tech is, at its heart, a conversation focused on the relationship between technology and human values, the decisions we make toward technological advances, and the impacts they can have.”
According to a report by the think tank, ethical tech refers to a set of values governing an organization’s approach to its use of technologies as a whole, and the ways in which workers at all levels deploy those technologies to drive business strategy and operations.
In addition, leaders’ biggest social and ethical concerns around digital innovation, apart from privacy, relate to cybersecurity risks, job displacement, and the use of data.
With that influence comes an immense responsibility to construct a more just, free, and prosperous online space than the one we currently have, and this awareness is starting to show within the ranks of the world’s biggest companies.
How many times have we seen employees from Google, Facebook, Pinterest, Amazon and the like publicly protest and stand against the policies and behaviors of their employers on ethical grounds?
This was brought further to the forefront after Netflix aired its documentary “The Social Dilemma,” which featured former employees of these titans of tech.
With very minimal protection guaranteed by the industry, regular consumers are left to constantly stay informed and wary about their online behavior, and about how the algorithms at play shape what they feel, think, see, hear, and experience.
Let’s jump into the basic red flags that people should be aware of.
In 2020, anyone who’s remotely tech-savvy and keeps up with the news is aware that private companies, such as social media platforms and mobile service operators, are collecting massive amounts of data on your every online move.
From real-time location tracking to your communications, what you post, what you like, what you ignore, and how long you linger on content before making a decision – all of that information is collected and sold to a handful of other entities, including but not limited to law enforcement, the intelligence community, advertisers, political campaigns, and more.
And all of this is done without the proper consent of their user base.
People might not realize what’s at stake.
It’s not only about collecting your data to perfectly place which ad you’re going to see next when you’re mindlessly scrolling down your preferred social media platforms; it runs much deeper than that.
This information can be used against users in a plethora of ways; law enforcement in some countries can access the data and surveillance technology to track and keep tabs on protestors, journalists, persons of interest, and the like – a complete breach of their most basic human rights.
The trickery of deepfakes
A deepfake takes a piece of media – a photo, audio, or video recording of someone – and manipulates what that person appears to say or do by swapping out their likeness for another person’s.
A perfect example was seen back in April of this year, when State Farm aired a controversial TV commercial that appeared to show a 1998 clip of an ESPN analyst making shockingly accurate predictions about the year 2020.
The fact that this is becoming a new trend is legitimately scary.
Another deepfake video surfaced in which Belgium’s Prime Minister Sophie Wilmès appears to link COVID-19 to climate change. In one particularly frightening example, rumors that a video of the president of a small African country was a deepfake helped instigate a failed coup.
Fake news is still in its prime
Fake news is alive and kicking.
We’ve seen it meddle with elections far and wide, start trade wars, and cause many other real-world repercussions that society has failed to flag most of the time.
Between 2015 and 2017 Russian operatives posing as Americans successfully organized in-person rallies and demonstrations using Facebook. In one instance, Muslim civil rights activists counter-protested anti-Muslim Texas secessionists in Houston who waved Confederate flags and held “White Lives Matter” banners.
Russian disinformation operatives organized both rallies, and cybersecurity experts predict more to come in the run-up to the 2020 elections.
It has now become the norm for product managers, designers, tech marketers and start-up founders to tirelessly create user experiences that would be physically and psychologically impossible to put down.
While the people behind the building blocks of these platforms see dollar signs in the distance, we need to also weigh the matter of their long-term effects on the end user.
This kind of tech is being labelled “habit-forming products”; while such products are not all bad, people need to be able to assess for themselves when a habit is becoming toxic.
It isn’t a stretch to say that social media has become a common trigger for psychological conditions such as anxiety and depression – the studies speak for themselves.
In these times, digital media literacy needs to be improved through education, since people aren’t well informed enough to fully understand the level of influence companies have on our personal decisions – from which brand of shoes we buy to which president we vote for.
IoT: The hidden costs of the always connected age
By: Hannes Gredler, founder and CTO of RtBrick.
Homes are overflowing with connected devices. Everything from “smart” televisions, speakers, light bulbs, and even smart microwaves have surged in popularity in the past several years – with over 10 billion IoT devices expected to be in use worldwide this year. Many of these devices are always-on and always-connected, ready to be used at the utterance of a simple voice command. This means that even when they aren’t “on,” the devices are still consuming data. This explosion in data consumption comes at a cost, however, and so far, telecommunication companies have borne the brunt of it.
Data isn’t free. Increased network capacity takes power, and power costs money. Multiplied across the millions of customers in a company’s network, the effects become significant. Most broadband plans offered by telecoms operators allow unlimited data access for a set monthly price, with packages priced by bandwidth rather than total data usage. This is great for consumers who continue to stream more and add new connected devices to their homes, but still pay the same price for internet. For the companies providing the network, every additional bit of data consumed comes at a cost.
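To make that flat-rate predicament concrete, here is a minimal sketch of the economics; the plan price and per-gigabyte delivery cost are purely hypothetical illustrations, not figures from the article:

```python
def monthly_margin(flat_price: float, cost_per_gb: float, usage_gb: float) -> float:
    """Operator margin on a flat-rate plan: revenue is fixed, cost scales with usage."""
    return flat_price - cost_per_gb * usage_gb

# Same hypothetical $40 plan, but the operator's margin shrinks as usage grows:
light_household = monthly_margin(40.0, 0.05, 200)  # ~ 30.0
heavy_household = monthly_margin(40.0, 0.05, 600)  # ~ 10.0
```

The point is simply that with bandwidth-priced plans, every connected device a customer adds eats into the operator's fixed revenue.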
Most consumers are not prepared to pay more for this increased data that fuels their IoT devices and allows Netflix to be binged for hours on end. This leaves telecom companies in a predicament. How can they keep customers happy without raising prices, while not becoming overwhelmed by the large increases in data being consumed thanks to the influx of connected devices?
The complications of growing data needs
Network service providers can typically expect a 35 to 40 percent growth in their network traffic each year, a number that will only continue to climb as connected devices become more prevalent. This traffic is mainly delivered using CDNs (Content Delivery Networks), which hold copies of popular content in servers embedded throughout the network. Although this mitigates investment in the core of the network, it doesn’t help reduce costs in the access networks, which are consuming an increasing amount of data and power.
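The compounding effect of that 35 to 40 percent annual growth can be sketched with a back-of-the-envelope calculation, using only the growth rates cited above:

```python
def traffic_multiple(years: int, annual_growth: float) -> float:
    """How many times larger network traffic is after compounding for `years` years."""
    return (1.0 + annual_growth) ** years

low = traffic_multiple(5, 0.35)   # ~4.5x traffic in five years at 35% annual growth
high = traffic_multiple(5, 0.40)  # ~5.4x traffic in five years at 40% annual growth
```

In other words, an access network carrying 1 Tbps today should expect to carry roughly 4.5 to 5.4 Tbps within five years if nothing else changes.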
When it comes to data consumption, not all connected devices are created equal either. Some only consume data in short bursts when being used – like a laptop used for web browsing. Smart TVs, on the other hand, are some of the biggest offenders when it comes to data consumption, with an HD TV stream using 5 Mbps and a 4k stream using 10 Mbps. This is constant whenever the television is in use, making the total data sent quickly amass as consumers stream episode after episode.
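The bitrates above translate into total data with simple arithmetic; this sketch converts a constant-bitrate stream into gigabytes consumed:

```python
def stream_gb(bitrate_mbps: float, hours: float) -> float:
    """Data consumed by a constant-bitrate stream, in gigabytes.
    Mbps * seconds gives megabits; /8 converts to megabytes; /1000 to gigabytes."""
    return bitrate_mbps * hours * 3600 / 8 / 1000

hd_per_hour = stream_gb(5, 1)    # 2.25 GB per hour of HD at 5 Mbps
uhd_per_hour = stream_gb(10, 1)  # 4.5 GB per hour of 4K at 10 Mbps
hd_binge = stream_gb(5, 4)       # 9.0 GB for a four-hour HD binge
```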
Other devices that come in high on the data usage list include video chat services like Skype and Zoom, gaming, a CCTV security camera backing up to the cloud, and smart home devices that are always on and connected, even when no one is home. This last group is often overlooked when it comes to data consumption. While it’s easy to understand how streaming HD TV for hours can consume a lot of data, many people don’t realize that their smart speakers, lightbulbs, doorbells and other home IoT devices are constantly pinging data to their respective servers. This is why they’re always ready to use at a moment’s notice, but also what makes the addition of each new device to a home so costly from a network perspective.
The social distancing measures being used to mitigate the current COVID-19 pandemic are an example of what happens when this growth in data consumption occurs quickly. People are staying home and working remotely in high numbers – and using their home networks much more than usual. In Europe, networks haven’t been able to handle this influx, forcing content providers to allow video to be streamed only in standard definition. While this heightened network load is being caused by a serious situation that requires sacrifices, it also provides a look into the future at what can happen as the number of devices in use surges. Remote working was already a benefit gaining popularity before the current crisis, and it is proving that networks will need to increase their efficiency and ability to scale to cope.
Overcoming the growth barrier
With data needs only set to increase as connected devices rise in popularity, telecom companies need to be able to scale their networks to accommodate the increasing data load while cutting costs. The power behind networks is always going to come at a price, but the systems distributing data across the network can be made more effective. Restructuring, reorganization, and efficiency measures have their limits in solving the issue, however.
Telecom companies have been following the same network-building playbook for years, but it unfortunately no longer works in the age of connected devices. To adapt networks so they can handle modern technologies, telecom companies should follow the path of their more modern, cloud-native counterparts. These companies have excelled by building their infrastructure on off-the-shelf hardware, running independent software, and deploying it with zero-touch-provisioning systems. Just think how easily Facebook can roll out a new feature or scale its platform to handle influxes of use.
There’s no reason telecom companies can’t emulate this and build their networks in a similar way. Formally, this is known as network disaggregation. Simply put, it means that networking software and hardware are purchased separately. In the past, networking infrastructure and software have been sold only as a package, but that is changing as networks modernize and model themselves after internet natives. Aside from the ability to scale and update with ease, the main benefit of the disaggregated model is that its costs are greatly reduced. This will be vital as the number of connected devices surges.
Network service providers also need to plan better to ensure they aren’t caught in a situation where they can’t keep up with rising data consumption. Passing costs onto consumers will not be an option as competition and innovation in the space continues to progress. Knowing that data usage will only continue to surge, telecom companies need to plan for the future and adopt a network structure that will allow them to make upgrades to provide more data and increased bandwidth for devices that require it. This means looking beyond the typical network upgrade cycle that happens every few years and shifting to a model that allows scaling and software upgrades to happen in a quicker and more cost-effective manner.
Connected devices don’t have to mean the downfall of traditional telecoms companies and their networks, but their near exponential growth will require a new mindset when it comes to building access networks.
Growing use of cloud storage, but a hybrid approach remains
Technological trends and developments are changing the way data is stored, along with the way it needs to be protected, which is why data security has been placed at the heart of organizations’ concerns as they edge closer toward complete digitization.
As data continues to become not only more valuable but also more essential to how organizations operate, it increasingly becomes a target for cybercriminals far and wide.
In parallel, high-profile data breaches are making the public more distrustful of organizations who store personal data – and causing governmental entities to create stringent compliance regulations.
According to figures from U.S.-based market intelligence firm IDC, the total volume of digital data that people and businesses have created worldwide stands at approximately 50 zettabytes in 2020 and is projected to triple by 2025.
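As a quick sanity check on those IDC figures, tripling over the five years from 2020 to 2025 implies a compound annual growth rate of roughly 25 percent:

```python
# Implied compound annual growth rate if ~50 ZB of data in 2020 triples by 2025
# (both figures cited above; everything else here follows from that arithmetic).
cagr = 3 ** (1 / 5) - 1             # ~0.246, i.e. roughly 25% growth per year
volume_2025 = 50 * (1 + cagr) ** 5  # ~150 zettabytes by 2025
```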
Whether companies have opted for complete cloud storage or on-premises storage, the risk remains the same.
“Attacks on cloud services have more than doubled, while ransomware demands are now the leading form of incidents during data breaches,” a report by U.S.-based cybersecurity firm, Trustwave stated.
The report added that the global annual cost of cybercrime, when factoring in regulatory fines, is projected to run into the trillions in the near future.
Trustwave has also recently published a survey examining how enterprises protect and store their data.
Small in numbers, big in responsibilities
According to the survey, over 75 percent of respondents were part of organizations with over 500 employees and 40 percent were part of organizations with over 1,000 employees.
Yet, of those polled, the largest share (24 percent) had a security team of between 16 and 20 people. Just 20 percent had a team of 21 or more. Singapore had the smallest security teams on average, with 47 percent reporting between 1 and 10 members.
Thus, with the average cost of data breaches rising, the stakes for organizations couldn’t be higher.
“Cybercrime is becoming so prevalent, organizations are now putting as much emphasis on detection and response as they are prevention— meaning that small cybersecurity teams are compelled to use managed security services providers and automation tools and services to supplement efforts,” the survey highlighted.
Increased use of cloud storage
When asked where their data currently resides, most respondents (55 percent) said they use both public cloud and on-premises databases to store data, while only 17 percent use public cloud alone.
Singapore organizations use the hybrid model most frequently, at 73 percent – 18 points above the overall figure – while US organizations use it least frequently, at 45 percent, and store data on-premises only the most, at 35 percent.
As organizations across industries move their sensitive data into some form of cloud storage – multi-cloud environments, cloud-based services, and cloud-based applications – the focus on security intensifies.
“There’s a common misconception that cloud service providers (CSPs) are responsible for security of the platform or share liability for breaches. That’s almost always not true – meaning your team is still responsible for finding security solutions that can work seamlessly no matter where your data resides, whether it’s on-premises, in the cloud(s) or split across a combination of both,” the survey emphasized.
More than one service
When asked how many third-party cloud services their organizations use, most respondents (70 percent) reported using between two and four public cloud services, followed by 18 percent using one or none, and 12 percent using five or more.
At 14 percent, the United States had the most instances of using 5 or more public cloud services followed by the United Kingdom at 13 percent, while Australia and Singapore were tied at 9 percent.
“What’s interesting here is the added level of complexity that security teams now face, with cybersecurity teams tasked with managing multiple vendors – as well as needing to account for the previously mentioned additional security risks that cloud services create,” the survey reported.
However, the prevalent approach among enterprises has been a hybrid model of both cloud and on-premises data storage, avoiding a single centralized database; this acts as another line of defense by keeping one piece of the puzzle out of hackers’ reach.
O2 expands 5G rollout in 100 UK towns & cities
UK operator O2 announced their fifth-generation network expansion plan that will cover over 100 UK towns and cities, a year after they had initially kicked off their 5G rollout.
Partnering with Nordic vendors Ericsson and Nokia for its 5G infrastructure, O2, which is owned by Spanish telco Telefonica, will allow existing customers to refresh their tariff and move onto 5G at their convenience.
“When we launched 5G last October, we said it was the first step on a journey. One year on and we have made some incredible progress, not just in terms of our roll-out but in bringing about new capabilities that will make real changes to people’s everyday lives,” Derek McManus, Chief Operating Officer at O2, said in a statement.
With continuous investment into new technologies, O2 has successfully built up approximately 10,000 LTE-M sites that are now live across the country.
“We firmly believe 5G has a role in helping to rebuild Britain, unlocking huge possibilities for our economy and society. We’re excited to keep pushing ahead with our rollout along with our partners Ericsson and Nokia, to keep supporting our customers, businesses and society,” McManus added.
The new network will cover 57 percent of premises and 58 percent of the population while allowing billions of devices to be connected to the internet to collect and share data.
5G and UK businesses
Over the last year, O2 has increased their focus on working alongside local businesses and consortia to test and build 5G use cases to enhance the way we work and live and help rebuild Britain.
Among those initiatives, the UK operator has partnered up with Northumbrian Water Group to enable experienced technicians to remotely guide on-the-ground teams through complex tasks by relaying real-time data and instructions using augmented reality (AR) technology, creating ‘Remote Experts’.
“No one could have predicted the way this year has turned out, and that almost overnight, customers would turn to their networks more than ever before to keep them connected to loved ones, colleagues and suppliers. Connectivity has never been more important, and we want our network to continue to raise the bar,” McManus highlighted.
In parallel, O2 launched the first commercial laboratory for 5G and satellite communications in the UK as part of Project Darwin, a four-year trial program supported by the European Space Agency and based in the Harwell Science and Innovation Campus in Oxfordshire.
The laboratory is now open to companies looking to test proof of concept for connected and autonomous vehicles (CAVs) using both 5G and satellite communications.
The operator has also teamed up with the University of Glasgow to work with the NHS on a pilot trial of a fully connected, 5G-powered clinic-on-wheels, in order to track COVID-19 at six care homes in Glasgow.
In addition, O2 is working with Deloitte, Wayra and Digital Catapult to build 5G accelerators in Birmingham, Wolverhampton, and Coventry.
These facilities will comprise centrally located office and demonstration spaces with access to a private 5G network, allowing businesses and public sector organizations to experiment with 5G features that aren’t yet commercially released.
Previously, tensions spiked between O2 and rival Three UK after the former criticized Ofcom’s recent proposal to defragment the 3.4-3.8GHz radio spectrum bands, favored for ultrafast 5G mobile broadband, after next year’s auction.
The Office of Communications is the government-approved regulatory and competition authority for the broadcasting, telecommunications, and postal industries of the United Kingdom.
The operator complained that the decision favors Three UK (H3G) by allowing them to establish a “kingmaker” position from where they can obstruct rivals.
This was partially due to Ofcom’s attempt to rebalance the market to make the differences in spectrum ownership between network operators less dramatic.
The amount and type of spectrum that an operator has is crucial for all sorts of reasons, not least because it can have an impact upon network coverage, rollout costs and particularly data performance (mobile broadband speed).
However, Three UK is in a different position when it comes to 5G because they already harbor 140MHz of related spectrum, including a single 100MHz contiguous block (rivals tend to have 50MHz or 40MHz blocks).