It should come as news to no one that data is a great asset in today's world. Companies try to anticipate your needs to sell you products and services, governments canvass your views to draft policy based on popular opinion, and scientists observe trends to predict demographic, climate, or economic developments. All of this rests on data, although only data that is systemised, connected, and put into context yields valuable output. But how much data can the different parties interested in it actually obtain, and who decides its collection, distribution, and amount?
Do I really need to worry about it?
According to the GDPR, 'the protection of natural persons in relation to the processing of personal data is a fundamental right.' To uphold this right, we have delegated responsibility to public entities such as national registry keepers, since maintaining a comprehensive database of one's personal data would be far beyond a regular citizen's capability. Just as, in a representative democracy, people mandate a handful of trusted representatives to govern on their behalf, data collection and management are concentrated in national registries. Simply put, civil servants act as your data keepers, providing you with public services based on your personal data. For the most part, the state is granted a credit of trust to treat that data appropriately and with the protection of the person in mind.
Does this mean I can wash my hands of responsibility, knowledge, and awareness? Far from it. Information literacy and data awareness are required these days not only of specialists who can write code but of you, your grandma, and every civil servant, so that we can all treat personal data critically and objectively.
Are we regulating enough?
Alongside the GDPR, which lays out many guidelines for data usage, national regulations also support personal data protection and describe clear cases in which acquiring such data is justified. To illustrate this more practically, let's take the Population Register, the main artery of Estonia's data bloodstream, together with its corresponding Act, which declares the purposes for which data collected in the Register may be used. One such purpose is granting state and local government agencies access to data for the performance of public duties; other natural or legal persons may access data on the basis of a justified interest.
However, regulations and standards alone are insufficient and do not guarantee complete safety, because it is people who enforce them. And human error is a flaw written into every system, no matter how complex and technologically impressive.
What if operations fail?
It's no coincidence that I chose the Population Register as my example here. Recently, a considerable whirlwind swept through Estonian society when the press revealed an unethical use of personal data in a survey on reproduction and why women choose not to have children. One of the lessons Estonia has learned in building a strong digital state has been to share such lessons openly, even when they are unpleasant. It would make me a hypocrite not to comment on such a sensitive data-use case that struck the Estonian public so painfully.
Estonia is a highly digitised country, where data acquisition can be all the simpler and faster if you choose to overlook the ethical and legal processes. So it was incredibly disturbing for thousands of Estonian women to receive a very personalised survey asking about their sexual habits, political views and preferences, and religious beliefs, all tied to the fact that they do not have children. Recall my earlier point about the importance of context? The problem is not so much researching demographic trends and the factors potentially influencing them, but the precise profiling this particular spectrum of data could enable.
“It was indicated that all information was obtained from the population register, and that was a major problem for me, as I see it as a clear invasion of privacy,” said Maria, one of the survey's recipients.
Imagine getting a survey invitation so tailor-made that it made you question its ethical tone, only to discover it was commissioned for population policy by a think tank backed by Isamaa, a national-conservative political party in Estonia. Estonians love our pre-filled income tax declarations, which can be submitted in a matter of seconds, but we are definitely taken aback by pre-filled surveys asking why women choose not to give birth and how that relates to their political views.
Should we restrict access to data further?
Following the news of this unethical data usage, thousands of people decided to restrict access to their data in the Population Register for commercial use.
Even though restricting access to one's data is a person's full right, it might not be beneficial in the long run. Societal studies are still very much needed, and data-backed decision-making is a sound strategy.
Where do we draw the line of tolerance?
We usually accept it when a waiter messes up an order in a restaurant: you might be annoyed, but you let it slide. We tend to forgive even bigger mistakes, because even with strict protocols and high standards to uphold, to err is human. That benefit of the doubt we do not extend to technology, as systems are expected to be flawless. But how sensitive is our social nerve to technology failures? Do we need to start tolerating more technological errors in this digital era?
In a more digital world, people are less anonymous and must actively engage in moderating the data available about them. By no means am I suggesting this is inevitable and that we should simply throw in the towel. It is unfortunate that the unethical retrieval of data from the Population Register happened at all, but that is all the more reason to talk about it and ensure that everyone involved gains better knowledge and a clearer understanding of what not to repeat. Raising awareness is a vital step in deterrence.
Are we eager to learn?
In Estonia, around 150,000 people participate in lifelong learning every year. Although as a society we understood the importance of ICT competencies very early, prioritising them in the national curriculum in the 1990s, systematic upskilling of civil servants is fairly recent. The Digital State Academy, a single e-learning platform, was created so that civil servants could keep up with the rapid development of the digital state and maintain their skills at the level it demands.
Programmes for other target groups have been running for decades: Tiger Leap, introduced in 1996 to give students the necessary ICT competencies; the Look@World series of training courses, helping adults and the elderly gain the skills they lacked to engage with digital services; and ongoing awareness campaigns aimed at the wider population to prevent people from falling victim to cybercrime (phishing, online fraud, malware, or identity theft).
Will the Estonian model of fragmented upskilling programmes help to raise and maintain a cyber-savvy nation? Time will tell. Meanwhile, we can look at what other digitally minded countries are doing to offer their populations an ample, up-to-date set of skills. To help me along, I refer to the findings of Siddhi Pal, a Master of Public Policy student at the University of Oxford, who was kind enough to share her comparisons. Her research examines upskilling programmes across the globe through a behavioural-insights and inclusion lens, aiming to bridge gaps and promote equal access to upskilling opportunities for all.
France – creating a learning culture where everyone gets equal opportunities, because labelling people as “low skilled” can be counterproductive for motivation, suggesting they somehow deviate from the norm. Instead, people get a personal account from which to choose among a wide range of skills training. No less important is showing how much the government spends on their upskilling: when people do not know the cost of a free programme, they tend to attribute a lower value to subsidised, free-of-charge training.
A central portal also makes it easier to promote particular trainings or skills, giving the government a practical tool to align with market needs or respond more quickly to a shortage.
Germany – focusing on SMEs' employees, offering upskilling to support specific industries. A similar trend appeared in Estonia, where in the early 2000s many private companies began offering free-of-charge ICT courses to enhance their employees' skills and equip them with the tools their work tasks required.
Denmark – cited as a supportive labour-policy model, where various social security measures back reskilling and a swift re-entry into the labour market, lowering the country's unemployment rate. In Estonia, the unemployment insurance fund likewise prioritises upskilling, reskilling, and a quick return to the labour market after job loss, making it a proactive strategy.
Siddhi Pal also points out that in Estonia's case, policies that incentivise businesses, such as wage subsidies, supervision reimbursement, and tax exemptions, emphasise the critical role businesses play in fostering a lifelong learning culture. Other countries seem to follow suit. Digitisation is foremost a behavioural change that needs proper support through upskilling and reskilling, so that people feel knowledgeable and comfortable adjusting to changing needs.
What can we learn from here?
First, digitisation is merely a tool for efficiency and comfort. Technology does not make decisions; it does exactly what it is built to do.
Also, technology, people, or both will fail. We need constant revision, adjustment, and sharing of best practices, and we must ensure that all parties involved maintain adequate knowledge, skills, and cyber hygiene.
Lastly, when an error, mishap, or inaccuracy occurs, we should discuss it openly and honestly, take responsibility for the (possible) damage caused, and then search for ways to improve so that the mistake is not repeated.
Estonia's digital state runs strong. The occasional miscalculations only reaffirm that it is actually functioning, not operating in a vacuum. One who never acts can never err.