Personal Digital Autonomy part 1
Feb. 4th, 2025 03:14 pm

Around the same time I was thinking about this, I came across the term “digital autonomy” and this paper by Mayer and Lu of CASSIS at the University of Bonn. The authors discuss digital autonomy for entire countries, focusing on the EU, noting that “Decades of neoliberal deregulation, trade, and technology-driven globalization created far-reaching dependencies that cannot be reversed overnight” (1). I'm not thinking at that scale, just the individual level. But maybe I can borrow the term to express myself. I'm thinking about what I'm going to call personal digital autonomy. Systemic, cultural, and certainly legal changes are necessary for the continued health of any internet-using population. But until that can happen, something has to start somewhere.
Why is there not more focus on encouraging personal digital autonomy by sharing the rudiments of device use and maintenance, and by encouraging “non-techy” people to learn bits of code and make things (more on what I mean by that below)? Why are we not encouraged to curate and maintain our own data? Why are we physically separated not only from the inner workings of our own devices but also from the ways in which we make, store, and retrieve the data we generate? Why do so many of us learn things about our computers, phones, apps, and platforms at random instead of through organized, concerted effort?
These questions are rhetorical. I know why.
It's pretty profitable to make devices and processes seem like magic. Instead of fixing an issue, people feel they have no option but to buy a new device. Planned obsolescence also works like gangbusters here, keeping us buying. It's also amazingly profitable to have tons and tons and tons of data attached to each user. Everything we make online as private users is not private at all: it is scanned, sold, traded, scattered to the far reaches of companies' boardrooms. It's far easier to keep us throwing more and more of our lives and time into that well.
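Before moving on, here is what I mean by “bits of code” above: something truly small. This is a minimal sketch in Python, and the folder names in it are placeholders I made up, not recommendations; swap them for your own. It copies the files in a pictures folder into a dated backup folder, which is one tiny act of curating your own data.

    # A tiny, hypothetical example of curating your own data:
    # copy the files in one folder into a dated backup folder.
    # The folder names below are placeholders -- change them for your machine.
    import shutil
    from datetime import date
    from pathlib import Path

    source = Path.home() / "Pictures"                # folder to back up (placeholder)
    backup = Path.home() / f"backup-{date.today()}"  # e.g. ~/backup-2025-02-04
    backup.mkdir(exist_ok=True)                      # create the folder if it's not there

    for item in source.iterdir():
        if item.is_file():
            shutil.copy2(item, backup)               # copy the file, keeping its timestamps
            print(f"copied {item.name}")

A dozen lines, no frameworks, nothing magic. That is roughly the scale of skill I wish we handed to everyone.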
Basic knowledge
While personal computing (what a throwback term!) really took off towards the end of the 20th century, culturally (in the US at least), “we” never closed the skill gap between the nerds and the normies. We unleashed cheap machines and cheap access on a population that was, to put it very mildly, unevenly educated in tech. Fast forward to today, where we are still dealing with the depth charge that is the smartphone and nearly ubiquitous internet access.
In my various lines of work, I see people on the daily who are entirely dependent on tech as we know it. At the same time, they do not know how to operate the machines and programs they need for daily life. This is wrong; something is very, very wrong here. In this day and age, I should not see 20-year-olds who are unable to keyboard (throwback term #2!) or grown adults so ignorant of the concept of opening and closing apps that they leave everything open all at once.
“We” used to educate people on basic, almost primitive tech use, from typing to saving and renaming documents. This should not have stopped, but it did, and goodness knows when it's coming back. At roughly the same time, tech interests gradually grafted themselves onto politics, and here we are now. Everything feels out of control, and many of us, myself included, feel utterly helpless.
However, I must acknowledge that formal opportunities for learning about computers are still alive and well. We still have computer training classes, public libraries still offer assistance, and “For Dummies” books are still being published. All of this is out there for anyone who wants it.
So why are we still having problems that make it seem the PC just came out last year? I don't have the time or space in this post to really find out, but I have many hypotheses. In any event, casual, non-tech-trained users (myself included!) have got to get a handle on how we use our machines and our software, especially regarding cloud storage and social media.